2026-03-09T19:17:43.092 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-09T19:17:43.100 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-09T19:17:43.124 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/617
branch: squid
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/no kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{reef} 1-volume/{0-create 1-ranks/2 2-allow_standby_replay/no 3-inline/no 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
email: null
first_in_suite: false
flavor: default
job_id: '617'
last_in_suite: false
machine_type: vps
meta:
- desc: 'setup ceph/reef '
name: kyr-2026-03-09_11:23:05-orch-squid-none-default-vps
no_nested_subset: false
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    cluster-conf:
      mgr:
        client mount timeout: 30
        debug client: 20
        debug mgr: 20
        debug ms: 1
        mon warn on pool no app: false
    conf:
      client:
        client mount timeout: 600
        debug client: 20
        debug ms: 1
        rados mon op timeout: 900
        rados osd op timeout: 900
      global:
        mon pg warn min per osd: 0
      mds:
        debug mds: 20
        debug mds balancer: 20
        debug ms: 1
        mds debug frag: true
        mds debug scatterstat: true
        mds op complaint time: 180
        mds verify scatter: true
        osd op complaint time: 180
        rados mon op timeout: 900
        rados osd op timeout: 900
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
        mon down mkfs grace: 300
        mon op complaint time: 120
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore allocator: bitmap
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd op complaint time: 180
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - FS_DEGRADED
    - filesystem is degraded
    - FS_INLINE_DATA_DEPRECATED
    - FS_WITH_FAILED_MDS
    - MDS_ALL_DOWN
    - filesystem is offline
    - is offline because no MDS
    - MDS_DAMAGE
    - MDS_DEGRADED
    - MDS_FAILED
    - MDS_INSUFFICIENT_STANDBY
    - MDS_UP_LESS_THAN_MAX
    - 'online, but wants'
    - filesystem is online with fewer MDS than max_mds
    - POOL_APP_NOT_ENABLED
    - do not have an application enabled
    - overall HEALTH_
    - Replacing daemon
    - deprecated feature inline_data
    - MGR_MODULE_ERROR
    - OSD_DOWN
    - osds down
    - overall HEALTH_
    - \(OSD_DOWN\)
    - \(OSD_
    - but it is still running
    - is not responding
    - MON_DOWN
    - PG_AVAILABILITY
    - PG_DEGRADED
    - Reduced data availability
    - Degraded data redundancy
    - pg .* is stuck inactive
    - pg .* is .*degraded
    - pg .* is stuck peering
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    bluestore: true
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd objectstore: bluestore
    fs: xfs
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  kclient:
    syntax: v1
  selinux:
    allowlist:
    - scontext=system_u:system_r:logrotate_t:s0
    - scontext=system_u:system_r:getty_t:s0
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-squid
    sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - host.a
  - client.0
  - osd.0
  - osd.1
  - osd.2
- - host.b
  - client.1
  - osd.3
  - osd.4
  - osd.5
seed: 3443
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
subset: 1/64
suite: orch
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
targets:
  vm07.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCnuuww/JYvM+rK8gOSiEGkaH4d0HGlvm1jW5kwqmghWt/qY4o1H/p3PHVJskHHdEERVgnkkHHhv122Xh4nww+o=
  vm08.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIHwxnFOeG0FXoZAq4+x8gfUBS9oyLUhsoPYhZdnVdSfqul2Q6fKyVeFuURrT45Cmdq10vq7SI1HD4H7tOM+vVw=
tasks:
- install:
    branch: reef
    exclude_packages:
    - ceph-volume
- print: '**** done install task...'
- cephadm:
    compiled_cephadm_branch: reef
    conf:
      osd:
        osd_class_default_list: '*'
        osd_class_load_list: '*'
    image: quay.ceph.io/ceph-ci/ceph:reef
    roleless: true
- print: '**** done end installing reef cephadm ...'
- cephadm.shell:
    host.a:
    - ceph config set mgr mgr/cephadm/use_repo_digest true --force
- print: '**** done cephadm.shell ceph config set mgr...'
- cephadm.shell:
    host.a:
    - ceph orch status
    - ceph orch ps
    - ceph orch ls
    - ceph orch host ls
    - ceph orch device ls
- cephadm.shell:
    host.a:
    - ceph fs volume create cephfs --placement=4
    - ceph fs dump
- cephadm.shell:
    host.a:
    - ceph fs set cephfs max_mds 2
- cephadm.shell:
    host.a:
    - ceph fs set cephfs allow_standby_replay false
- cephadm.shell:
    host.a:
    - ceph fs set cephfs inline_data false
- cephadm.shell:
    host.a:
    - ceph fs dump
    - ceph --format=json fs dump | jq -e ".filesystems | length == 1"
    - while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done
- fs.pre_upgrade_save: null
- ceph-fuse: null
- print: '**** done client'
- parallel:
  - upgrade-tasks
  - workload-tasks
- cephadm.shell:
    host.a:
    - ceph fs dump
- fs.post_upgrade_checks: null
teuthology:
  fragments_dropped:
  - /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/suites/orch/cephadm/mds_upgrade_sequence/tasks/3-upgrade-mgr-staggered.yaml
  meta: {}
  postmerge:
  - |
    local kernel = py_attrgetter(yaml).get('kernel')
    if kernel ~= nil then
      local branch = py_attrgetter(kernel).get('branch')
      if branch and not kernel.branch:find "-all$" then
        log.debug("removing default kernel specification: %s", kernel)
        py_attrgetter(kernel).pop('branch', nil)
        py_attrgetter(kernel).pop('deb', nil)
        py_attrgetter(kernel).pop('flavor', nil)
        py_attrgetter(kernel).pop('kdb', nil)
        py_attrgetter(kernel).pop('koji', nil)
        py_attrgetter(kernel).pop('koji_task', nil)
        py_attrgetter(kernel).pop('rpm', nil)
        py_attrgetter(kernel).pop('sha1', nil)
        py_attrgetter(kernel).pop('tag', nil)
      end
    end
  variables:
    fail_fs: false
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-09_11:23:05
tube: vps
upgrade-tasks:
  sequential:
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mgr mgr/orchestrator/fail_fs false || true
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force
      - ceph config set global log_to_journald false --force
      - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done
      - ceph orch ps
      - ceph orch upgrade status
      - ceph health detail
      - ceph versions
      - echo "wait for servicemap items w/ changing names to refresh"
      - sleep 60
      - ceph orch ps
      - ceph versions
      - ceph versions | jq -e '.overall | length == 1'
      - ceph versions | jq -e '.overall | keys' | grep $sha1
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
workload-tasks:
  sequential:
  - workunit:
      clients:
        all:
        - suites/fsstress.sh
2026-03-09T19:17:43.124 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa; will attempt to use it
2026-03-09T19:17:43.124 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks
2026-03-09T19:17:43.125 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-09T19:17:43.125 INFO:teuthology.task.internal:Checking packages...
2026-03-09T19:17:43.125 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-09T19:17:43.125 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-09T19:17:43.125 INFO:teuthology.packaging:ref: None
2026-03-09T19:17:43.125 INFO:teuthology.packaging:tag: None
2026-03-09T19:17:43.125 INFO:teuthology.packaging:branch: squid
2026-03-09T19:17:43.125 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T19:17:43.125 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid
2026-03-09T19:17:43.891 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb
2026-03-09T19:17:43.892 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-09T19:17:43.893 INFO:teuthology.task.internal:no buildpackages task found
2026-03-09T19:17:43.893 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-09T19:17:43.893 INFO:teuthology.task.internal:Saving configuration
2026-03-09T19:17:43.901 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-09T19:17:43.902 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-09T19:17:43.908 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm07.local', 'description': '/archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/617', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-09 19:16:29.443381', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:07', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCnuuww/JYvM+rK8gOSiEGkaH4d0HGlvm1jW5kwqmghWt/qY4o1H/p3PHVJskHHdEERVgnkkHHhv122Xh4nww+o='}
2026-03-09T19:17:43.913 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm08.local', 'description': '/archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/617', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-09 19:16:29.442991', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:08', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIHwxnFOeG0FXoZAq4+x8gfUBS9oyLUhsoPYhZdnVdSfqul2Q6fKyVeFuURrT45Cmdq10vq7SI1HD4H7tOM+vVw='}
2026-03-09T19:17:43.913 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-09T19:17:43.914 INFO:teuthology.task.internal:roles: ubuntu@vm07.local - ['host.a', 'client.0', 'osd.0', 'osd.1', 'osd.2']
2026-03-09T19:17:43.914 INFO:teuthology.task.internal:roles: ubuntu@vm08.local - ['host.b', 'client.1', 'osd.3', 'osd.4', 'osd.5']
2026-03-09T19:17:43.914 INFO:teuthology.run_tasks:Running task console_log...
2026-03-09T19:17:43.920 DEBUG:teuthology.task.console_log:vm07 does not support IPMI; excluding
2026-03-09T19:17:43.925 DEBUG:teuthology.task.console_log:vm08 does not support IPMI; excluding
2026-03-09T19:17:43.925 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7ff0c5a7a170>, signals=[15])
2026-03-09T19:17:43.925 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-09T19:17:43.926 INFO:teuthology.task.internal:Opening connections...
2026-03-09T19:17:43.926 DEBUG:teuthology.task.internal:connecting to ubuntu@vm07.local
2026-03-09T19:17:43.926 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm07.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-09T19:17:43.986 DEBUG:teuthology.task.internal:connecting to ubuntu@vm08.local
2026-03-09T19:17:43.987 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm08.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-09T19:17:44.047 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-09T19:17:44.048 DEBUG:teuthology.orchestra.run.vm07:> uname -m
2026-03-09T19:17:44.096 INFO:teuthology.orchestra.run.vm07.stdout:x86_64
2026-03-09T19:17:44.096 DEBUG:teuthology.orchestra.run.vm07:> cat /etc/os-release
2026-03-09T19:17:44.154 INFO:teuthology.orchestra.run.vm07.stdout:NAME="CentOS Stream"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:VERSION="9"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:ID="centos"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:ID_LIKE="rhel fedora"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:VERSION_ID="9"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:PLATFORM_ID="platform:el9"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:ANSI_COLOR="0;31"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:LOGO="fedora-logo-icon"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:HOME_URL="https://centos.org/"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-09T19:17:44.155 INFO:teuthology.orchestra.run.vm07.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-09T19:17:44.155 INFO:teuthology.lock.ops:Updating vm07.local on lock server
2026-03-09T19:17:44.159 DEBUG:teuthology.orchestra.run.vm08:> uname -m
2026-03-09T19:17:44.174 INFO:teuthology.orchestra.run.vm08.stdout:x86_64
2026-03-09T19:17:44.174 DEBUG:teuthology.orchestra.run.vm08:> cat /etc/os-release
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:NAME="CentOS Stream"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:VERSION="9"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:ID="centos"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:ID_LIKE="rhel fedora"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:VERSION_ID="9"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:PLATFORM_ID="platform:el9"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:ANSI_COLOR="0;31"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:LOGO="fedora-logo-icon"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:HOME_URL="https://centos.org/"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-09T19:17:44.228 INFO:teuthology.orchestra.run.vm08.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-09T19:17:44.228 INFO:teuthology.lock.ops:Updating vm08.local on lock server
2026-03-09T19:17:44.232 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-09T19:17:44.234 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-09T19:17:44.235 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-09T19:17:44.235 DEBUG:teuthology.orchestra.run.vm07:> test '!' -e /home/ubuntu/cephtest
2026-03-09T19:17:44.237 DEBUG:teuthology.orchestra.run.vm08:> test '!' -e /home/ubuntu/cephtest
2026-03-09T19:17:44.283 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-09T19:17:44.284 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-09T19:17:44.284 DEBUG:teuthology.orchestra.run.vm07:> test -z $(ls -A /var/lib/ceph)
2026-03-09T19:17:44.293 DEBUG:teuthology.orchestra.run.vm08:> test -z $(ls -A /var/lib/ceph)
2026-03-09T19:17:44.307 INFO:teuthology.orchestra.run.vm07.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-09T19:17:44.338 INFO:teuthology.orchestra.run.vm08.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-09T19:17:44.338 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-09T19:17:44.346 DEBUG:teuthology.orchestra.run.vm07:> test -e /ceph-qa-ready
2026-03-09T19:17:44.361 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T19:17:44.565 DEBUG:teuthology.orchestra.run.vm08:> test -e /ceph-qa-ready
2026-03-09T19:17:44.579 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T19:17:44.764 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-09T19:17:44.765 INFO:teuthology.task.internal:Creating test directory...
2026-03-09T19:17:44.765 DEBUG:teuthology.orchestra.run.vm07:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-09T19:17:44.768 DEBUG:teuthology.orchestra.run.vm08:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-09T19:17:44.785 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-09T19:17:44.786 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-09T19:17:44.787 INFO:teuthology.task.internal:Creating archive directory...
2026-03-09T19:17:44.787 DEBUG:teuthology.orchestra.run.vm07:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-09T19:17:44.828 DEBUG:teuthology.orchestra.run.vm08:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-09T19:17:44.846 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-09T19:17:44.848 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-09T19:17:44.848 DEBUG:teuthology.orchestra.run.vm07:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-09T19:17:44.900 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T19:17:44.900 DEBUG:teuthology.orchestra.run.vm08:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-09T19:17:44.914 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T19:17:44.914 DEBUG:teuthology.orchestra.run.vm07:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-09T19:17:44.942 DEBUG:teuthology.orchestra.run.vm08:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-09T19:17:44.966 INFO:teuthology.orchestra.run.vm07.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-09T19:17:44.978 INFO:teuthology.orchestra.run.vm07.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-09T19:17:44.981 INFO:teuthology.orchestra.run.vm08.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-09T19:17:44.991 INFO:teuthology.orchestra.run.vm08.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-09T19:17:44.992 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-09T19:17:44.993 INFO:teuthology.task.internal:Configuring sudo...
2026-03-09T19:17:44.993 DEBUG:teuthology.orchestra.run.vm07:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-09T19:17:45.022 DEBUG:teuthology.orchestra.run.vm08:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-09T19:17:45.059 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-09T19:17:45.062 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-09T19:17:45.062 DEBUG:teuthology.orchestra.run.vm07:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-09T19:17:45.089 DEBUG:teuthology.orchestra.run.vm08:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-09T19:17:45.123 DEBUG:teuthology.orchestra.run.vm07:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T19:17:45.171 DEBUG:teuthology.orchestra.run.vm07:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T19:17:45.228 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-09T19:17:45.229 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-09T19:17:45.288 DEBUG:teuthology.orchestra.run.vm08:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T19:17:45.313 DEBUG:teuthology.orchestra.run.vm08:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T19:17:45.370 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-09T19:17:45.370 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-09T19:17:45.428 DEBUG:teuthology.orchestra.run.vm07:> sudo service rsyslog restart
2026-03-09T19:17:45.429 DEBUG:teuthology.orchestra.run.vm08:> sudo service rsyslog restart
2026-03-09T19:17:45.458 INFO:teuthology.orchestra.run.vm07.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-09T19:17:45.497 INFO:teuthology.orchestra.run.vm08.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-09T19:17:45.878 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-09T19:17:45.880 INFO:teuthology.task.internal:Starting timer...
2026-03-09T19:17:45.880 INFO:teuthology.run_tasks:Running task pcp...
2026-03-09T19:17:45.882 INFO:teuthology.run_tasks:Running task selinux...
2026-03-09T19:17:45.884 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:logrotate_t:s0', 'scontext=system_u:system_r:getty_t:s0']}
2026-03-09T19:17:45.884 INFO:teuthology.task.selinux:Excluding vm07: VMs are not yet supported
2026-03-09T19:17:45.884 INFO:teuthology.task.selinux:Excluding vm08: VMs are not yet supported
2026-03-09T19:17:45.884 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-09T19:17:45.885 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-09T19:17:45.885 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-09T19:17:45.885 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-09T19:17:45.886 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-09T19:17:45.886 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-09T19:17:45.888 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-09T19:17:46.509 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-09T19:17:46.514 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-09T19:17:46.514 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventory3rz2yoez --limit vm07.local,vm08.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-09T19:19:45.489 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm07.local'), Remote(name='ubuntu@vm08.local')]
2026-03-09T19:19:45.489 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm07.local'
2026-03-09T19:19:45.490 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm07.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-09T19:19:45.566 DEBUG:teuthology.orchestra.run.vm07:> true
2026-03-09T19:19:45.646 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm07.local'
2026-03-09T19:19:45.646 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm08.local'
2026-03-09T19:19:45.646 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm08.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-09T19:19:45.714 DEBUG:teuthology.orchestra.run.vm08:> true
2026-03-09T19:19:45.791 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm08.local'
2026-03-09T19:19:45.791 INFO:teuthology.run_tasks:Running task clock...
2026-03-09T19:19:45.794 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-09T19:19:45.794 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-09T19:19:45.795 DEBUG:teuthology.orchestra.run.vm07:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T19:19:45.796 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-09T19:19:45.797 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T19:19:45.829 INFO:teuthology.orchestra.run.vm07.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-09T19:19:45.843 INFO:teuthology.orchestra.run.vm07.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-09T19:19:45.866 INFO:teuthology.orchestra.run.vm07.stderr:sudo: ntpd: command not found
2026-03-09T19:19:45.869 INFO:teuthology.orchestra.run.vm08.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-09T19:19:45.877 INFO:teuthology.orchestra.run.vm07.stdout:506 Cannot talk to daemon
2026-03-09T19:19:45.882 INFO:teuthology.orchestra.run.vm08.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-09T19:19:45.890 INFO:teuthology.orchestra.run.vm07.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-09T19:19:45.905 INFO:teuthology.orchestra.run.vm07.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-09T19:19:45.913 INFO:teuthology.orchestra.run.vm08.stderr:sudo: ntpd: command not found
2026-03-09T19:19:45.927 INFO:teuthology.orchestra.run.vm08.stdout:506 Cannot talk to daemon
2026-03-09T19:19:45.943 INFO:teuthology.orchestra.run.vm08.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-09T19:19:45.953 INFO:teuthology.orchestra.run.vm07.stderr:bash: line 1: ntpq: command not found
2026-03-09T19:19:45.956 INFO:teuthology.orchestra.run.vm07.stdout:MS Name/IP address         Stratum Poll Reach LastRx Last sample
2026-03-09T19:19:45.957 INFO:teuthology.orchestra.run.vm07.stdout:===============================================================================
2026-03-09T19:19:45.957 INFO:teuthology.orchestra.run.vm07.stdout:^? funky.f5s.de                  0   6     0     -     +0ns[   +0ns] +/-    0ns
2026-03-09T19:19:45.957 INFO:teuthology.orchestra.run.vm07.stdout:^? www.h4x-gamers.top            0   6     0     -     +0ns[   +0ns] +/-    0ns
2026-03-09T19:19:45.957 INFO:teuthology.orchestra.run.vm07.stdout:^? 193.158.22.13                 0   6     0     -     +0ns[   +0ns] +/-    0ns
2026-03-09T19:19:45.957 INFO:teuthology.orchestra.run.vm07.stdout:^? ntp2.noris.net                0   6     0     -     +0ns[   +0ns] +/-    0ns
2026-03-09T19:19:45.960 INFO:teuthology.orchestra.run.vm08.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-09T19:19:46.010 INFO:teuthology.orchestra.run.vm08.stderr:bash: line 1: ntpq: command not found
2026-03-09T19:19:46.012 INFO:teuthology.orchestra.run.vm08.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-09T19:19:46.012 INFO:teuthology.orchestra.run.vm08.stdout:===============================================================================
2026-03-09T19:19:46.012 INFO:teuthology.orchestra.run.vm08.stdout:^? ntp2.noris.net 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T19:19:46.012 INFO:teuthology.orchestra.run.vm08.stdout:^? www.h4x-gamers.top 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T19:19:46.012 INFO:teuthology.orchestra.run.vm08.stdout:^? funky.f5s.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T19:19:46.013 INFO:teuthology.orchestra.run.vm08.stdout:^? 193.158.22.13 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T19:19:46.013 INFO:teuthology.run_tasks:Running task install...
2026-03-09T19:19:46.015 DEBUG:teuthology.task.install:project ceph
2026-03-09T19:19:46.015 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-09T19:19:46.015 DEBUG:teuthology.task.install:config {'branch': 'reef', 'exclude_packages': ['ceph-volume'], 'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-09T19:19:46.015 INFO:teuthology.task.install:Using flavor: default
2026-03-09T19:19:46.017 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-09T19:19:46.017 INFO:teuthology.task.install:extra packages: []
2026-03-09T19:19:46.017 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': 'reef', 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': None, 'wait_for_package': False}
2026-03-09T19:19:46.017 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-09T19:19:46.017 INFO:teuthology.packaging:ref: None
2026-03-09T19:19:46.017 INFO:teuthology.packaging:tag: None
2026-03-09T19:19:46.017 INFO:teuthology.packaging:branch: reef
2026-03-09T19:19:46.017 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T19:19:46.017 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=reef
2026-03-09T19:19:46.018 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': 'reef', 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': None, 'wait_for_package': False}
2026-03-09T19:19:46.018 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-09T19:19:46.018 INFO:teuthology.packaging:ref: None
2026-03-09T19:19:46.018 INFO:teuthology.packaging:tag: None
2026-03-09T19:19:46.018 INFO:teuthology.packaging:branch: reef
2026-03-09T19:19:46.018 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T19:19:46.018 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=reef
2026-03-09T19:19:46.774 INFO:teuthology.task.install.rpm:Pulling from https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/
2026-03-09T19:19:46.775 INFO:teuthology.task.install.rpm:Package version is 18.2.7-1055.gab47f43c
2026-03-09T19:19:46.863 INFO:teuthology.task.install.rpm:Pulling from https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/
2026-03-09T19:19:46.863 INFO:teuthology.task.install.rpm:Package version is 18.2.7-1055.gab47f43c
2026-03-09T19:19:47.356 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-noarch]
name=ceph noarch packages
baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-source]
name=ceph source packages
baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
2026-03-09T19:19:47.356 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-09T19:19:47.356 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/yum.repos.d/ceph.repo
2026-03-09T19:19:47.357 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-noarch]
name=ceph noarch packages
baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-source]
name=ceph source packages
baseurl=https://3.chacra.ceph.com/r/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
2026-03-09T19:19:47.357 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-09T19:19:47.357 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/yum.repos.d/ceph.repo
2026-03-09T19:19:47.384 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64
2026-03-09T19:19:47.384 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-09T19:19:47.384 INFO:teuthology.packaging:ref: None
2026-03-09T19:19:47.384 INFO:teuthology.packaging:tag: None
2026-03-09T19:19:47.384 INFO:teuthology.packaging:branch: reef
2026-03-09T19:19:47.384 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T19:19:47.384 DEBUG:teuthology.orchestra.run.vm07:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/reef/;g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-09T19:19:47.387 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64
2026-03-09T19:19:47.387 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-09T19:19:47.387 INFO:teuthology.packaging:ref: None
2026-03-09T19:19:47.387 INFO:teuthology.packaging:tag: None
2026-03-09T19:19:47.387 INFO:teuthology.packaging:branch: reef
2026-03-09T19:19:47.387 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T19:19:47.387 DEBUG:teuthology.orchestra.run.vm08:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/reef/;g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-09T19:19:47.453 DEBUG:teuthology.orchestra.run.vm07:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-09T19:19:47.463 DEBUG:teuthology.orchestra.run.vm08:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-09T19:19:47.529 DEBUG:teuthology.orchestra.run.vm07:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-09T19:19:47.550 DEBUG:teuthology.orchestra.run.vm08:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-09T19:19:47.582 INFO:teuthology.orchestra.run.vm08.stdout:check_obsoletes = 1
2026-03-09T19:19:47.583 DEBUG:teuthology.orchestra.run.vm08:> sudo yum clean all
2026-03-09T19:19:47.594 INFO:teuthology.orchestra.run.vm07.stdout:check_obsoletes = 1
2026-03-09T19:19:47.595 DEBUG:teuthology.orchestra.run.vm07:> sudo yum clean all
2026-03-09T19:19:47.771 INFO:teuthology.orchestra.run.vm07.stdout:41 files removed
2026-03-09T19:19:47.794 INFO:teuthology.orchestra.run.vm08.stdout:41 files removed
2026-03-09T19:19:47.794 DEBUG:teuthology.orchestra.run.vm07:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath
2026-03-09T19:19:47.817 DEBUG:teuthology.orchestra.run.vm08:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath
2026-03-09T19:19:49.123 INFO:teuthology.orchestra.run.vm07.stdout:ceph packages for x86_64 66 kB/s | 77 kB 00:01
2026-03-09T19:19:49.190 INFO:teuthology.orchestra.run.vm08.stdout:ceph packages for x86_64 65 kB/s | 77 kB 00:01
2026-03-09T19:19:50.129 INFO:teuthology.orchestra.run.vm07.stdout:ceph noarch packages 12 kB/s | 11 kB 00:00
2026-03-09T19:19:50.163 INFO:teuthology.orchestra.run.vm08.stdout:ceph noarch packages 12 kB/s | 11 kB 00:00
2026-03-09T19:19:51.098 INFO:teuthology.orchestra.run.vm07.stdout:ceph source packages 2.0 kB/s | 1.9 kB 00:00
2026-03-09T19:19:51.128 INFO:teuthology.orchestra.run.vm08.stdout:ceph source packages 2.0 kB/s | 1.9 kB 00:00
2026-03-09T19:19:52.275 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - BaseOS 7.9 MB/s | 8.9 MB 00:01
2026-03-09T19:19:53.817 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - BaseOS 3.3 MB/s | 8.9 MB 00:02
2026-03-09T19:19:56.191 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - AppStream 16 MB/s | 27 MB 00:01
2026-03-09T19:19:58.288 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - AppStream 5.2 MB/s | 27 MB 00:05
2026-03-09T19:20:02.838 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - CRB 5.0 MB/s | 8.0 MB 00:01
2026-03-09T19:20:03.825 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - CRB 1.7 MB/s | 8.0 MB 00:04
2026-03-09T19:20:04.126 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - Extras packages 47 kB/s | 20 kB 00:00
2026-03-09T19:20:04.639 INFO:teuthology.orchestra.run.vm08.stdout:Extra Packages for Enterprise Linux 48 MB/s | 20 MB 00:00
2026-03-09T19:20:05.718 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - Extras packages 20 kB/s | 20 kB 00:01
2026-03-09T19:20:06.767 INFO:teuthology.orchestra.run.vm07.stdout:Extra Packages for Enterprise Linux 22 MB/s | 20 MB 00:00
2026-03-09T19:20:09.510 INFO:teuthology.orchestra.run.vm08.stdout:lab-extras 54 kB/s | 50 kB 00:00
2026-03-09T19:20:11.012 INFO:teuthology.orchestra.run.vm08.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-09T19:20:11.013 INFO:teuthology.orchestra.run.vm08.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-09T19:20:11.016 INFO:teuthology.orchestra.run.vm08.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-09T19:20:11.017 INFO:teuthology.orchestra.run.vm08.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-09T19:20:11.044 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:20:11.047 INFO:teuthology.orchestra.run.vm08.stdout:=======================================================================================
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout:=======================================================================================
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout:Installing:
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 6.5 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 5.1 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 850 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 143 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 1.5 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 140 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 3.5 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 7.4 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 49 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 7.8 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 36 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: cephadm noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 226 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs-devel x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 31 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 710 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: librados-devel x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 126 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: python3-cephfs x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 162 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: python3-rados x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 322 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: python3-rbd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 302 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: python3-rgw x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 100 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: rbd-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 87 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.0 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: rbd-nbd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 172 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout:Upgrading:
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: librados2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.3 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: librbd1 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.0 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout:Installing dependencies:
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 18 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 24 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mds x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 2.1 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 248 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 4.7 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-osd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 17 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 17 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 25 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 166 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-09T19:20:11.048 INFO:teuthology.orchestra.run.vm08.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 475 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: librgw2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 4.5 M
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-argparse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 45 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 130 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k
2026-03-09T19:20:11.049 INFO:teuthology.orchestra.run.vm08.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout:=======================================================================================
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout:Install 115 Packages
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout:Upgrade 2 Packages
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout:Total download size: 181 M
2026-03-09T19:20:11.050 INFO:teuthology.orchestra.run.vm08.stdout:Downloading Packages:
2026-03-09T19:20:11.442 INFO:teuthology.orchestra.run.vm07.stdout:lab-extras 64 kB/s | 50 kB 00:00
2026-03-09T19:20:12.302 INFO:teuthology.orchestra.run.vm08.stdout:(1/117): ceph-18.2.7-1055.gab47f43c.el9.x86_64. 14 kB/s | 6.5 kB 00:00
2026-03-09T19:20:12.922 INFO:teuthology.orchestra.run.vm07.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-09T19:20:12.923 INFO:teuthology.orchestra.run.vm07.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-09T19:20:12.928 INFO:teuthology.orchestra.run.vm07.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-09T19:20:12.928 INFO:teuthology.orchestra.run.vm07.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-09T19:20:12.957 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout:=======================================================================================
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout:=======================================================================================
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout:Installing:
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: ceph x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 6.5 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 5.1 M
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 850 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 143 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 1.5 M
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 140 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 3.5 M
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 7.4 M
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 49 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 7.8 M
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 36 M
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: cephadm noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 226 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 31 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 710 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: librados-devel x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 126 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: python3-cephfs x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 162 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: python3-rados x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 322 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: python3-rbd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 302 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: python3-rgw x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 100 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: rbd-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 87 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.0 M
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: rbd-nbd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 172 k
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout:Upgrading:
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: librados2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.3 M
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: librbd1 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 3.0 M
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout:Installing dependencies:
2026-03-09T19:20:12.961 INFO:teuthology.orchestra.run.vm07.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 18 M
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 24 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mds x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 2.1 M
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 248 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 4.7 M
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: ceph-osd x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 17 M
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts noarch 2:18.2.7-1055.gab47f43c.el9 ceph-noarch 17 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 25 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 166 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 475 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: librgw2 x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 4.5 M
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-argparse x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 45 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 ceph 130 k
2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi noarch 
2023.05.07-4.el9 epel 14 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako noarch 
1.1.4-6.el9 appstream 172 k 2026-03-09T19:20:12.962 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa noarch 4.9-2.el9 
epel 59 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout:======================================================================================= 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout:Install 115 Packages 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout:Upgrade 2 Packages 2026-03-09T19:20:12.963 
INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:12.963 INFO:teuthology.orchestra.run.vm07.stdout:Total download size: 181 M 2026-03-09T19:20:12.964 INFO:teuthology.orchestra.run.vm07.stdout:Downloading Packages: 2026-03-09T19:20:14.162 INFO:teuthology.orchestra.run.vm07.stdout:(1/117): ceph-18.2.7-1055.gab47f43c.el9.x86_64. 14 kB/s | 6.5 kB 00:00 2026-03-09T19:20:14.417 INFO:teuthology.orchestra.run.vm08.stdout:(2/117): ceph-fuse-18.2.7-1055.gab47f43c.el9.x8 402 kB/s | 850 kB 00:02 2026-03-09T19:20:14.655 INFO:teuthology.orchestra.run.vm08.stdout:(3/117): ceph-immutable-object-cache-18.2.7-105 601 kB/s | 143 kB 00:00 2026-03-09T19:20:16.210 INFO:teuthology.orchestra.run.vm07.stdout:(2/117): ceph-fuse-18.2.7-1055.gab47f43c.el9.x8 415 kB/s | 850 kB 00:02 2026-03-09T19:20:16.439 INFO:teuthology.orchestra.run.vm07.stdout:(3/117): ceph-immutable-object-cache-18.2.7-105 622 kB/s | 143 kB 00:00 2026-03-09T19:20:16.772 INFO:teuthology.orchestra.run.vm08.stdout:(4/117): ceph-mds-18.2.7-1055.gab47f43c.el9.x86 1.0 MB/s | 2.1 MB 00:02 2026-03-09T19:20:17.613 INFO:teuthology.orchestra.run.vm08.stdout:(5/117): ceph-base-18.2.7-1055.gab47f43c.el9.x8 906 kB/s | 5.1 MB 00:05 2026-03-09T19:20:17.616 INFO:teuthology.orchestra.run.vm08.stdout:(6/117): ceph-mgr-18.2.7-1055.gab47f43c.el9.x86 1.7 MB/s | 1.5 MB 00:00 2026-03-09T19:20:18.492 INFO:teuthology.orchestra.run.vm07.stdout:(4/117): ceph-mds-18.2.7-1055.gab47f43c.el9.x86 1.0 MB/s | 2.1 MB 00:02 2026-03-09T19:20:19.154 INFO:teuthology.orchestra.run.vm08.stdout:(7/117): ceph-mon-18.2.7-1055.gab47f43c.el9.x86 3.0 MB/s | 4.7 MB 00:01 2026-03-09T19:20:19.293 INFO:teuthology.orchestra.run.vm07.stdout:(5/117): ceph-mgr-18.2.7-1055.gab47f43c.el9.x86 1.8 MB/s | 1.5 MB 00:00 2026-03-09T19:20:19.538 INFO:teuthology.orchestra.run.vm07.stdout:(6/117): ceph-base-18.2.7-1055.gab47f43c.el9.x8 898 kB/s | 5.1 MB 00:05 2026-03-09T19:20:20.567 INFO:teuthology.orchestra.run.vm08.stdout:(8/117): ceph-common-18.2.7-1055.gab47f43c.el9. 
2.1 MB/s | 18 MB 00:08 2026-03-09T19:20:20.596 INFO:teuthology.orchestra.run.vm08.stdout:(9/117): ceph-radosgw-18.2.7-1055.gab47f43c.el9 5.4 MB/s | 7.8 MB 00:01 2026-03-09T19:20:20.687 INFO:teuthology.orchestra.run.vm08.stdout:(10/117): ceph-selinux-18.2.7-1055.gab47f43c.el 211 kB/s | 25 kB 00:00 2026-03-09T19:20:20.805 INFO:teuthology.orchestra.run.vm08.stdout:(11/117): libcephfs-devel-18.2.7-1055.gab47f43c 262 kB/s | 31 kB 00:00 2026-03-09T19:20:20.898 INFO:teuthology.orchestra.run.vm07.stdout:(7/117): ceph-mon-18.2.7-1055.gab47f43c.el9.x86 2.9 MB/s | 4.7 MB 00:01 2026-03-09T19:20:20.933 INFO:teuthology.orchestra.run.vm08.stdout:(12/117): libcephfs2-18.2.7-1055.gab47f43c.el9. 5.5 MB/s | 710 kB 00:00 2026-03-09T19:20:21.053 INFO:teuthology.orchestra.run.vm08.stdout:(13/117): libcephsqlite-18.2.7-1055.gab47f43c.e 1.4 MB/s | 166 kB 00:00 2026-03-09T19:20:21.173 INFO:teuthology.orchestra.run.vm08.stdout:(14/117): librados-devel-18.2.7-1055.gab47f43c. 1.0 MB/s | 126 kB 00:00 2026-03-09T19:20:21.297 INFO:teuthology.orchestra.run.vm08.stdout:(15/117): libradosstriper1-18.2.7-1055.gab47f43 3.7 MB/s | 475 kB 00:00 2026-03-09T19:20:21.931 INFO:teuthology.orchestra.run.vm08.stdout:(16/117): librgw2-18.2.7-1055.gab47f43c.el9.x86 7.1 MB/s | 4.5 MB 00:00 2026-03-09T19:20:22.130 INFO:teuthology.orchestra.run.vm08.stdout:(17/117): python3-ceph-argparse-18.2.7-1055.gab 226 kB/s | 45 kB 00:00 2026-03-09T19:20:22.295 INFO:teuthology.orchestra.run.vm08.stdout:(18/117): python3-ceph-common-18.2.7-1055.gab47 788 kB/s | 130 kB 00:00 2026-03-09T19:20:22.378 INFO:teuthology.orchestra.run.vm07.stdout:(8/117): ceph-radosgw-18.2.7-1055.gab47f43c.el9 5.3 MB/s | 7.8 MB 00:01 2026-03-09T19:20:22.419 INFO:teuthology.orchestra.run.vm08.stdout:(19/117): python3-cephfs-18.2.7-1055.gab47f43c. 
1.3 MB/s | 162 kB 00:00 2026-03-09T19:20:22.498 INFO:teuthology.orchestra.run.vm07.stdout:(9/117): ceph-selinux-18.2.7-1055.gab47f43c.el9 209 kB/s | 25 kB 00:00 2026-03-09T19:20:22.543 INFO:teuthology.orchestra.run.vm08.stdout:(20/117): python3-rados-18.2.7-1055.gab47f43c.e 2.6 MB/s | 322 kB 00:00 2026-03-09T19:20:22.576 INFO:teuthology.orchestra.run.vm07.stdout:(10/117): ceph-common-18.2.7-1055.gab47f43c.el9 2.1 MB/s | 18 MB 00:08 2026-03-09T19:20:22.632 INFO:teuthology.orchestra.run.vm08.stdout:(21/117): ceph-osd-18.2.7-1055.gab47f43c.el9.x8 3.3 MB/s | 17 MB 00:05 2026-03-09T19:20:22.667 INFO:teuthology.orchestra.run.vm08.stdout:(22/117): python3-rbd-18.2.7-1055.gab47f43c.el9 2.4 MB/s | 302 kB 00:00 2026-03-09T19:20:22.696 INFO:teuthology.orchestra.run.vm07.stdout:(11/117): libcephfs-devel-18.2.7-1055.gab47f43c 257 kB/s | 31 kB 00:00 2026-03-09T19:20:22.769 INFO:teuthology.orchestra.run.vm08.stdout:(23/117): python3-rgw-18.2.7-1055.gab47f43c.el9 730 kB/s | 100 kB 00:00 2026-03-09T19:20:22.806 INFO:teuthology.orchestra.run.vm08.stdout:(24/117): rbd-fuse-18.2.7-1055.gab47f43c.el9.x8 627 kB/s | 87 kB 00:00 2026-03-09T19:20:22.928 INFO:teuthology.orchestra.run.vm08.stdout:(25/117): rbd-nbd-18.2.7-1055.gab47f43c.el9.x86 1.4 MB/s | 172 kB 00:00 2026-03-09T19:20:23.008 INFO:teuthology.orchestra.run.vm07.stdout:(12/117): ceph-osd-18.2.7-1055.gab47f43c.el9.x8 4.8 MB/s | 17 MB 00:03 2026-03-09T19:20:23.067 INFO:teuthology.orchestra.run.vm08.stdout:(26/117): ceph-grafana-dashboards-18.2.7-1055.g 175 kB/s | 24 kB 00:00 2026-03-09T19:20:23.135 INFO:teuthology.orchestra.run.vm07.stdout:(13/117): libcephsqlite-18.2.7-1055.gab47f43c.e 1.3 MB/s | 166 kB 00:00 2026-03-09T19:20:23.188 INFO:teuthology.orchestra.run.vm08.stdout:(27/117): ceph-mgr-cephadm-18.2.7-1055.gab47f43 1.1 MB/s | 140 kB 00:00 2026-03-09T19:20:23.260 INFO:teuthology.orchestra.run.vm07.stdout:(14/117): librados-devel-18.2.7-1055.gab47f43c. 
1.0 MB/s | 126 kB 00:00 2026-03-09T19:20:23.387 INFO:teuthology.orchestra.run.vm07.stdout:(15/117): libradosstriper1-18.2.7-1055.gab47f43 3.7 MB/s | 475 kB 00:00 2026-03-09T19:20:23.471 INFO:teuthology.orchestra.run.vm07.stdout:(16/117): libcephfs2-18.2.7-1055.gab47f43c.el9. 918 kB/s | 710 kB 00:00 2026-03-09T19:20:23.585 INFO:teuthology.orchestra.run.vm08.stdout:(28/117): ceph-mgr-dashboard-18.2.7-1055.gab47f 8.9 MB/s | 3.5 MB 00:00 2026-03-09T19:20:23.592 INFO:teuthology.orchestra.run.vm07.stdout:(17/117): python3-ceph-argparse-18.2.7-1055.gab 370 kB/s | 45 kB 00:00 2026-03-09T19:20:23.623 INFO:teuthology.orchestra.run.vm08.stdout:(29/117): rbd-mirror-18.2.7-1055.gab47f43c.el9. 3.5 MB/s | 3.0 MB 00:00 2026-03-09T19:20:23.730 INFO:teuthology.orchestra.run.vm07.stdout:(18/117): python3-ceph-common-18.2.7-1055.gab47 945 kB/s | 130 kB 00:00 2026-03-09T19:20:23.748 INFO:teuthology.orchestra.run.vm08.stdout:(30/117): ceph-mgr-modules-core-18.2.7-1055.gab 1.9 MB/s | 248 kB 00:00 2026-03-09T19:20:23.863 INFO:teuthology.orchestra.run.vm07.stdout:(19/117): python3-cephfs-18.2.7-1055.gab47f43c. 
1.2 MB/s | 162 kB 00:00 2026-03-09T19:20:23.868 INFO:teuthology.orchestra.run.vm08.stdout:(31/117): ceph-mgr-rook-18.2.7-1055.gab47f43c.e 413 kB/s | 49 kB 00:00 2026-03-09T19:20:23.991 INFO:teuthology.orchestra.run.vm07.stdout:(20/117): python3-rados-18.2.7-1055.gab47f43c.e 2.5 MB/s | 322 kB 00:00 2026-03-09T19:20:24.088 INFO:teuthology.orchestra.run.vm08.stdout:(32/117): ceph-test-18.2.7-1055.gab47f43c.el9.x 10 MB/s | 36 MB 00:03 2026-03-09T19:20:24.088 INFO:teuthology.orchestra.run.vm08.stdout:(33/117): ceph-prometheus-alerts-18.2.7-1055.ga 76 kB/s | 17 kB 00:00 2026-03-09T19:20:24.122 INFO:teuthology.orchestra.run.vm07.stdout:(21/117): python3-rbd-18.2.7-1055.gab47f43c.el9 2.3 MB/s | 302 kB 00:00 2026-03-09T19:20:24.144 INFO:teuthology.orchestra.run.vm07.stdout:(22/117): librgw2-18.2.7-1055.gab47f43c.el9.x86 5.9 MB/s | 4.5 MB 00:00 2026-03-09T19:20:24.216 INFO:teuthology.orchestra.run.vm08.stdout:(34/117): cephadm-18.2.7-1055.gab47f43c.el9.noa 1.7 MB/s | 226 kB 00:00 2026-03-09T19:20:24.238 INFO:teuthology.orchestra.run.vm08.stdout:(35/117): ledmon-libs-1.1.0-3.el9.x86_64.rpm 271 kB/s | 40 kB 00:00 2026-03-09T19:20:24.249 INFO:teuthology.orchestra.run.vm07.stdout:(23/117): python3-rgw-18.2.7-1055.gab47f43c.el9 788 kB/s | 100 kB 00:00 2026-03-09T19:20:24.269 INFO:teuthology.orchestra.run.vm07.stdout:(24/117): rbd-fuse-18.2.7-1055.gab47f43c.el9.x8 702 kB/s | 87 kB 00:00 2026-03-09T19:20:24.442 INFO:teuthology.orchestra.run.vm08.stdout:(36/117): ceph-mgr-diskprediction-local-18.2.7- 8.6 MB/s | 7.4 MB 00:00 2026-03-09T19:20:24.443 INFO:teuthology.orchestra.run.vm07.stdout:(25/117): rbd-nbd-18.2.7-1055.gab47f43c.el9.x86 992 kB/s | 172 kB 00:00 2026-03-09T19:20:24.492 INFO:teuthology.orchestra.run.vm08.stdout:(37/117): libconfig-1.7.2-9.el9.x86_64.rpm 262 kB/s | 72 kB 00:00 2026-03-09T19:20:24.547 INFO:teuthology.orchestra.run.vm08.stdout:(38/117): mailcap-2.1.49-5.el9.noarch.rpm 596 kB/s | 33 kB 00:00 2026-03-09T19:20:24.563 
INFO:teuthology.orchestra.run.vm08.stdout:(39/117): libgfortran-11.5.0-14.el9.x86_64.rpm 2.4 MB/s | 794 kB 00:00 2026-03-09T19:20:24.563 INFO:teuthology.orchestra.run.vm07.stdout:(26/117): ceph-grafana-dashboards-18.2.7-1055.g 203 kB/s | 24 kB 00:00 2026-03-09T19:20:24.683 INFO:teuthology.orchestra.run.vm08.stdout:(40/117): libquadmath-11.5.0-14.el9.x86_64.rpm 768 kB/s | 184 kB 00:00 2026-03-09T19:20:24.687 INFO:teuthology.orchestra.run.vm07.stdout:(27/117): ceph-mgr-cephadm-18.2.7-1055.gab47f43 1.1 MB/s | 140 kB 00:00 2026-03-09T19:20:24.690 INFO:teuthology.orchestra.run.vm08.stdout:(41/117): python3-cffi-1.14.5-5.el9.x86_64.rpm 1.7 MB/s | 253 kB 00:00 2026-03-09T19:20:24.711 INFO:teuthology.orchestra.run.vm08.stdout:(42/117): python3-cryptography-36.0.1-5.el9.x86 8.4 MB/s | 1.2 MB 00:00 2026-03-09T19:20:24.736 INFO:teuthology.orchestra.run.vm08.stdout:(43/117): python3-ply-3.11-14.el9.noarch.rpm 2.0 MB/s | 106 kB 00:00 2026-03-09T19:20:24.743 INFO:teuthology.orchestra.run.vm08.stdout:(44/117): python3-pycparser-2.20-6.el9.noarch.r 2.5 MB/s | 135 kB 00:00 2026-03-09T19:20:24.763 INFO:teuthology.orchestra.run.vm08.stdout:(45/117): python3-requests-2.25.1-10.el9.noarch 2.4 MB/s | 126 kB 00:00 2026-03-09T19:20:24.788 INFO:teuthology.orchestra.run.vm08.stdout:(46/117): python3-urllib3-1.26.5-7.el9.noarch.r 4.2 MB/s | 218 kB 00:00 2026-03-09T19:20:24.878 INFO:teuthology.orchestra.run.vm07.stdout:(28/117): rbd-mirror-18.2.7-1055.gab47f43c.el9. 
4.8 MB/s | 3.0 MB 00:00 2026-03-09T19:20:24.956 INFO:teuthology.orchestra.run.vm08.stdout:(47/117): flexiblas-3.0.4-9.el9.x86_64.rpm 154 kB/s | 30 kB 00:00 2026-03-09T19:20:25.021 INFO:teuthology.orchestra.run.vm08.stdout:(48/117): flexiblas-openblas-openmp-3.0.4-9.el9 229 kB/s | 15 kB 00:00 2026-03-09T19:20:25.069 INFO:teuthology.orchestra.run.vm08.stdout:(49/117): boost-program-options-1.75.0-13.el9.x 319 kB/s | 104 kB 00:00 2026-03-09T19:20:25.136 INFO:teuthology.orchestra.run.vm08.stdout:(50/117): librabbitmq-0.11.0-7.el9.x86_64.rpm 687 kB/s | 45 kB 00:00 2026-03-09T19:20:25.214 INFO:teuthology.orchestra.run.vm08.stdout:(51/117): libpmemobj-1.12.1-1.el9.x86_64.rpm 831 kB/s | 160 kB 00:00 2026-03-09T19:20:25.297 INFO:teuthology.orchestra.run.vm07.stdout:(29/117): ceph-mgr-dashboard-18.2.7-1055.gab47f 5.8 MB/s | 3.5 MB 00:00 2026-03-09T19:20:25.335 INFO:teuthology.orchestra.run.vm08.stdout:(52/117): librdkafka-1.6.1-102.el9.x86_64.rpm 3.2 MB/s | 662 kB 00:00 2026-03-09T19:20:25.346 INFO:teuthology.orchestra.run.vm08.stdout:(53/117): libstoragemgmt-1.10.1-1.el9.x86_64.rp 1.8 MB/s | 246 kB 00:00 2026-03-09T19:20:25.406 INFO:teuthology.orchestra.run.vm08.stdout:(54/117): libxslt-1.1.34-12.el9.x86_64.rpm 3.2 MB/s | 233 kB 00:00 2026-03-09T19:20:25.417 INFO:teuthology.orchestra.run.vm08.stdout:(55/117): lttng-ust-2.12.0-6.el9.x86_64.rpm 4.1 MB/s | 292 kB 00:00 2026-03-09T19:20:25.420 INFO:teuthology.orchestra.run.vm07.stdout:(30/117): ceph-mgr-modules-core-18.2.7-1055.gab 2.0 MB/s | 248 kB 00:00 2026-03-09T19:20:25.473 INFO:teuthology.orchestra.run.vm08.stdout:(56/117): openblas-0.3.29-1.el9.x86_64.rpm 634 kB/s | 42 kB 00:00 2026-03-09T19:20:25.541 INFO:teuthology.orchestra.run.vm07.stdout:(31/117): ceph-mgr-rook-18.2.7-1055.gab47f43c.e 408 kB/s | 49 kB 00:00 2026-03-09T19:20:25.679 INFO:teuthology.orchestra.run.vm07.stdout:(32/117): ceph-test-18.2.7-1055.gab47f43c.el9.x 11 MB/s | 36 MB 00:03 2026-03-09T19:20:25.680 INFO:teuthology.orchestra.run.vm07.stdout:(33/117): 
ceph-prometheus-alerts-18.2.7-1055.ga 120 kB/s | 17 kB 00:00 2026-03-09T19:20:25.732 INFO:teuthology.orchestra.run.vm07.stdout:(34/117): ledmon-libs-1.1.0-3.el9.x86_64.rpm 782 kB/s | 40 kB 00:00 2026-03-09T19:20:25.779 INFO:teuthology.orchestra.run.vm07.stdout:(35/117): libconfig-1.7.2-9.el9.x86_64.rpm 1.5 MB/s | 72 kB 00:00 2026-03-09T19:20:25.798 INFO:teuthology.orchestra.run.vm07.stdout:(36/117): cephadm-18.2.7-1055.gab47f43c.el9.noa 1.9 MB/s | 226 kB 00:00 2026-03-09T19:20:25.834 INFO:teuthology.orchestra.run.vm08.stdout:(57/117): openblas-openmp-0.3.29-1.el9.x86_64.r 13 MB/s | 5.3 MB 00:00 2026-03-09T19:20:25.849 INFO:teuthology.orchestra.run.vm07.stdout:(37/117): libgfortran-11.5.0-14.el9.x86_64.rpm 11 MB/s | 794 kB 00:00 2026-03-09T19:20:25.867 INFO:teuthology.orchestra.run.vm07.stdout:(38/117): mailcap-2.1.49-5.el9.noarch.rpm 1.8 MB/s | 33 kB 00:00 2026-03-09T19:20:25.902 INFO:teuthology.orchestra.run.vm07.stdout:(39/117): python3-cffi-1.14.5-5.el9.x86_64.rpm 7.2 MB/s | 253 kB 00:00 2026-03-09T19:20:25.904 INFO:teuthology.orchestra.run.vm08.stdout:(58/117): python3-devel-3.9.25-3.el9.x86_64.rpm 3.4 MB/s | 244 kB 00:00 2026-03-09T19:20:25.916 INFO:teuthology.orchestra.run.vm07.stdout:(40/117): libquadmath-11.5.0-14.el9.x86_64.rpm 1.5 MB/s | 184 kB 00:00 2026-03-09T19:20:25.936 INFO:teuthology.orchestra.run.vm08.stdout:(59/117): python3-babel-2.9.1-2.el9.noarch.rpm 13 MB/s | 6.0 MB 00:00 2026-03-09T19:20:25.947 INFO:teuthology.orchestra.run.vm07.stdout:(41/117): python3-ply-3.11-14.el9.noarch.rpm 3.4 MB/s | 106 kB 00:00 2026-03-09T19:20:25.976 INFO:teuthology.orchestra.run.vm07.stdout:(42/117): python3-cryptography-36.0.1-5.el9.x86 17 MB/s | 1.2 MB 00:00 2026-03-09T19:20:25.977 INFO:teuthology.orchestra.run.vm08.stdout:(60/117): python3-jinja2-2.11.3-8.el9.noarch.rp 3.3 MB/s | 249 kB 00:00 2026-03-09T19:20:25.986 INFO:teuthology.orchestra.run.vm07.stdout:(43/117): python3-pycparser-2.20-6.el9.noarch.r 3.4 MB/s | 135 kB 00:00 2026-03-09T19:20:26.003 
INFO:teuthology.orchestra.run.vm08.stdout:(61/117): python3-jmespath-1.0.1-1.el9.noarch.r 710 kB/s | 48 kB 00:00 2026-03-09T19:20:26.014 INFO:teuthology.orchestra.run.vm07.stdout:(44/117): python3-requests-2.25.1-10.el9.noarch 3.3 MB/s | 126 kB 00:00 2026-03-09T19:20:26.031 INFO:teuthology.orchestra.run.vm07.stdout:(45/117): python3-urllib3-1.26.5-7.el9.noarch.r 4.8 MB/s | 218 kB 00:00 2026-03-09T19:20:26.047 INFO:teuthology.orchestra.run.vm08.stdout:(62/117): python3-libstoragemgmt-1.10.1-1.el9.x 2.5 MB/s | 177 kB 00:00 2026-03-09T19:20:26.090 INFO:teuthology.orchestra.run.vm08.stdout:(63/117): flexiblas-netlib-3.0.4-9.el9.x86_64.r 2.3 MB/s | 3.0 MB 00:01 2026-03-09T19:20:26.112 INFO:teuthology.orchestra.run.vm08.stdout:(64/117): python3-markupsafe-1.1.1-12.el9.x86_6 531 kB/s | 35 kB 00:00 2026-03-09T19:20:26.134 INFO:teuthology.orchestra.run.vm08.stdout:(65/117): python3-mako-1.1.4-6.el9.noarch.rpm 1.3 MB/s | 172 kB 00:00 2026-03-09T19:20:26.180 INFO:teuthology.orchestra.run.vm07.stdout:(46/117): flexiblas-3.0.4-9.el9.x86_64.rpm 199 kB/s | 30 kB 00:00 2026-03-09T19:20:26.186 INFO:teuthology.orchestra.run.vm08.stdout:(66/117): python3-numpy-f2py-1.23.5-2.el9.x86_6 5.9 MB/s | 442 kB 00:00 2026-03-09T19:20:26.203 INFO:teuthology.orchestra.run.vm08.stdout:(67/117): python3-pyasn1-0.4.8-7.el9.noarch.rpm 2.3 MB/s | 157 kB 00:00 2026-03-09T19:20:26.257 INFO:teuthology.orchestra.run.vm08.stdout:(68/117): python3-pyasn1-modules-0.4.8-7.el9.no 3.8 MB/s | 277 kB 00:00 2026-03-09T19:20:26.264 INFO:teuthology.orchestra.run.vm07.stdout:(47/117): boost-program-options-1.75.0-13.el9.x 417 kB/s | 104 kB 00:00 2026-03-09T19:20:26.270 INFO:teuthology.orchestra.run.vm08.stdout:(69/117): python3-requests-oauthlib-1.3.0-12.el 807 kB/s | 54 kB 00:00 2026-03-09T19:20:26.311 INFO:teuthology.orchestra.run.vm07.stdout:(48/117): flexiblas-openblas-openmp-3.0.4-9.el9 318 kB/s | 15 kB 00:00 2026-03-09T19:20:26.350 INFO:teuthology.orchestra.run.vm08.stdout:(70/117): 
python3-toml-0.10.2-6.el9.noarch.rpm 519 kB/s | 42 kB 00:00 2026-03-09T19:20:26.414 INFO:teuthology.orchestra.run.vm07.stdout:(49/117): libpmemobj-1.12.1-1.el9.x86_64.rpm 1.5 MB/s | 160 kB 00:00 2026-03-09T19:20:26.425 INFO:teuthology.orchestra.run.vm08.stdout:(71/117): socat-1.7.4.1-8.el9.x86_64.rpm 4.0 MB/s | 303 kB 00:00 2026-03-09T19:20:26.480 INFO:teuthology.orchestra.run.vm07.stdout:(50/117): librabbitmq-0.11.0-7.el9.x86_64.rpm 693 kB/s | 45 kB 00:00 2026-03-09T19:20:26.492 INFO:teuthology.orchestra.run.vm08.stdout:(72/117): xmlstarlet-1.6.1-20.el9.x86_64.rpm 953 kB/s | 64 kB 00:00 2026-03-09T19:20:26.501 INFO:teuthology.orchestra.run.vm08.stdout:(73/117): fmt-8.1.1-5.el9.x86_64.rpm 12 MB/s | 111 kB 00:00 2026-03-09T19:20:26.512 INFO:teuthology.orchestra.run.vm08.stdout:(74/117): gperftools-libs-2.9.1-3.el9.x86_64.rp 28 MB/s | 308 kB 00:00 2026-03-09T19:20:26.608 INFO:teuthology.orchestra.run.vm07.stdout:(51/117): librdkafka-1.6.1-102.el9.x86_64.rpm 5.0 MB/s | 662 kB 00:00 2026-03-09T19:20:26.625 INFO:teuthology.orchestra.run.vm08.stdout:(75/117): libarrow-9.0.0-15.el9.x86_64.rpm 39 MB/s | 4.4 MB 00:00 2026-03-09T19:20:26.642 INFO:teuthology.orchestra.run.vm08.stdout:(76/117): python3-numpy-1.23.5-2.el9.x86_64.rpm 11 MB/s | 6.1 MB 00:00 2026-03-09T19:20:26.643 INFO:teuthology.orchestra.run.vm08.stdout:(77/117): libarrow-doc-9.0.0-15.el9.noarch.rpm 1.4 MB/s | 25 kB 00:00 2026-03-09T19:20:26.646 INFO:teuthology.orchestra.run.vm08.stdout:(78/117): libunwind-1.6.2-1.el9.x86_64.rpm 23 MB/s | 67 kB 00:00 2026-03-09T19:20:26.649 INFO:teuthology.orchestra.run.vm08.stdout:(79/117): liboath-2.6.12-1.el9.x86_64.rpm 7.3 MB/s | 49 kB 00:00 2026-03-09T19:20:26.667 INFO:teuthology.orchestra.run.vm08.stdout:(80/117): python3-asyncssh-2.13.2-5.el9.noarch. 
31 MB/s | 548 kB 00:00 2026-03-09T19:20:26.670 INFO:teuthology.orchestra.run.vm08.stdout:(81/117): python3-autocommand-2.2.2-8.el9.noarc 11 MB/s | 29 kB 00:00 2026-03-09T19:20:26.674 INFO:teuthology.orchestra.run.vm07.stdout:(52/117): libstoragemgmt-1.10.1-1.el9.x86_64.rp 3.7 MB/s | 246 kB 00:00 2026-03-09T19:20:26.675 INFO:teuthology.orchestra.run.vm08.stdout:(82/117): parquet-libs-9.0.0-15.el9.x86_64.rpm 29 MB/s | 838 kB 00:00 2026-03-09T19:20:26.675 INFO:teuthology.orchestra.run.vm08.stdout:(83/117): python3-backports-tarfile-1.2.0-1.el9 10 MB/s | 60 kB 00:00 2026-03-09T19:20:26.678 INFO:teuthology.orchestra.run.vm08.stdout:(84/117): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 16 MB/s | 43 kB 00:00 2026-03-09T19:20:26.678 INFO:teuthology.orchestra.run.vm08.stdout:(85/117): python3-cachetools-4.2.4-1.el9.noarch 12 MB/s | 32 kB 00:00 2026-03-09T19:20:26.680 INFO:teuthology.orchestra.run.vm08.stdout:(86/117): python3-certifi-2023.05.07-4.el9.noar 5.7 MB/s | 14 kB 00:00 2026-03-09T19:20:26.683 INFO:teuthology.orchestra.run.vm08.stdout:(87/117): python3-cheroot-10.0.1-4.el9.noarch.r 37 MB/s | 173 kB 00:00 2026-03-09T19:20:26.690 INFO:teuthology.orchestra.run.vm08.stdout:(88/117): python3-google-auth-2.45.0-1.el9.noar 41 MB/s | 254 kB 00:00 2026-03-09T19:20:26.691 INFO:teuthology.orchestra.run.vm08.stdout:(89/117): python3-cherrypy-18.6.1-2.el9.noarch. 
33 MB/s | 358 kB 00:00 2026-03-09T19:20:26.692 INFO:teuthology.orchestra.run.vm08.stdout:(90/117): python3-jaraco-8.2.1-3.el9.noarch.rpm 4.1 MB/s | 11 kB 00:00 2026-03-09T19:20:26.694 INFO:teuthology.orchestra.run.vm08.stdout:(91/117): python3-jaraco-classes-3.2.1-5.el9.no 7.0 MB/s | 18 kB 00:00 2026-03-09T19:20:26.695 INFO:teuthology.orchestra.run.vm08.stdout:(92/117): python3-jaraco-collections-3.0.0-8.el 9.4 MB/s | 23 kB 00:00 2026-03-09T19:20:26.696 INFO:teuthology.orchestra.run.vm08.stdout:(93/117): python3-jaraco-context-6.0.1-3.el9.no 7.9 MB/s | 20 kB 00:00 2026-03-09T19:20:26.697 INFO:teuthology.orchestra.run.vm08.stdout:(94/117): python3-jaraco-functools-3.5.0-2.el9. 9.1 MB/s | 19 kB 00:00 2026-03-09T19:20:26.699 INFO:teuthology.orchestra.run.vm08.stdout:(95/117): python3-jaraco-text-4.0.0-2.el9.noarc 9.5 MB/s | 26 kB 00:00 2026-03-09T19:20:26.704 INFO:teuthology.orchestra.run.vm08.stdout:(96/117): python3-logutils-0.3.5-21.el9.noarch. 12 MB/s | 46 kB 00:00 2026-03-09T19:20:26.708 INFO:teuthology.orchestra.run.vm08.stdout:(97/117): python3-more-itertools-8.12.0-2.el9.n 17 MB/s | 79 kB 00:00 2026-03-09T19:20:26.713 INFO:teuthology.orchestra.run.vm08.stdout:(98/117): python3-kubernetes-26.1.0-3.el9.noarc 67 MB/s | 1.0 MB 00:00 2026-03-09T19:20:26.714 INFO:teuthology.orchestra.run.vm08.stdout:(99/117): python3-natsort-7.1.1-5.el9.noarch.rp 11 MB/s | 58 kB 00:00 2026-03-09T19:20:26.718 INFO:teuthology.orchestra.run.vm08.stdout:(100/117): python3-pecan-1.4.2-3.el9.noarch.rpm 50 MB/s | 272 kB 00:00 2026-03-09T19:20:26.719 INFO:teuthology.orchestra.run.vm08.stdout:(101/117): python3-portend-3.1.0-2.el9.noarch.r 3.1 MB/s | 16 kB 00:00 2026-03-09T19:20:26.722 INFO:teuthology.orchestra.run.vm08.stdout:(102/117): python3-pyOpenSSL-21.0.0-1.el9.noarc 26 MB/s | 90 kB 00:00 2026-03-09T19:20:26.724 INFO:teuthology.orchestra.run.vm08.stdout:(103/117): python3-repoze-lru-0.7-16.el9.noarch 6.4 MB/s | 31 kB 00:00 2026-03-09T19:20:26.728 
INFO:teuthology.orchestra.run.vm08.stdout:(104/117): python3-routes-2.5.1-5.el9.noarch.rp 36 MB/s | 188 kB 00:00 2026-03-09T19:20:26.729 INFO:teuthology.orchestra.run.vm08.stdout:(105/117): python3-rsa-4.9-2.el9.noarch.rpm 12 MB/s | 59 kB 00:00 2026-03-09T19:20:26.733 INFO:teuthology.orchestra.run.vm08.stdout:(106/117): python3-tempora-5.0.0-2.el9.noarch.r 7.4 MB/s | 36 kB 00:00 2026-03-09T19:20:26.734 INFO:teuthology.orchestra.run.vm08.stdout:(107/117): python3-typing-extensions-4.15.0-1.e 17 MB/s | 86 kB 00:00 2026-03-09T19:20:26.739 INFO:teuthology.orchestra.run.vm08.stdout:(108/117): python3-webob-1.8.8-2.el9.noarch.rpm 42 MB/s | 230 kB 00:00 2026-03-09T19:20:26.739 INFO:teuthology.orchestra.run.vm08.stdout:(109/117): python3-websocket-client-1.2.3-2.el9 17 MB/s | 90 kB 00:00 2026-03-09T19:20:26.744 INFO:teuthology.orchestra.run.vm08.stdout:(110/117): python3-xmltodict-0.12.0-15.el9.noar 4.5 MB/s | 22 kB 00:00 2026-03-09T19:20:26.746 INFO:teuthology.orchestra.run.vm08.stdout:(111/117): python3-werkzeug-2.0.3-3.el9.1.noarc 59 MB/s | 427 kB 00:00 2026-03-09T19:20:26.747 INFO:teuthology.orchestra.run.vm08.stdout:(112/117): python3-zc-lockfile-2.0-10.el9.noarc 7.2 MB/s | 20 kB 00:00 2026-03-09T19:20:26.747 INFO:teuthology.orchestra.run.vm07.stdout:(53/117): libxslt-1.1.34-12.el9.x86_64.rpm 3.1 MB/s | 233 kB 00:00 2026-03-09T19:20:26.751 INFO:teuthology.orchestra.run.vm08.stdout:(113/117): re2-20211101-20.el9.x86_64.rpm 40 MB/s | 191 kB 00:00 2026-03-09T19:20:26.776 INFO:teuthology.orchestra.run.vm08.stdout:(114/117): thrift-0.15.0-4.el9.x86_64.rpm 55 MB/s | 1.6 MB 00:00 2026-03-09T19:20:26.860 INFO:teuthology.orchestra.run.vm07.stdout:(54/117): lttng-ust-2.12.0-6.el9.x86_64.rpm 2.6 MB/s | 292 kB 00:00 2026-03-09T19:20:26.917 INFO:teuthology.orchestra.run.vm07.stdout:(55/117): openblas-0.3.29-1.el9.x86_64.rpm 733 kB/s | 42 kB 00:00 2026-03-09T19:20:27.087 INFO:teuthology.orchestra.run.vm07.stdout:(56/117): flexiblas-netlib-3.0.4-9.el9.x86_64.r 3.3 MB/s | 3.0 MB 
00:00 2026-03-09T19:20:27.149 INFO:teuthology.orchestra.run.vm07.stdout:(57/117): openblas-openmp-0.3.29-1.el9.x86_64.r 23 MB/s | 5.3 MB 00:00 2026-03-09T19:20:27.224 INFO:teuthology.orchestra.run.vm07.stdout:(58/117): ceph-mgr-diskprediction-local-18.2.7- 3.1 MB/s | 7.4 MB 00:02 2026-03-09T19:20:27.227 INFO:teuthology.orchestra.run.vm07.stdout:(59/117): python3-devel-3.9.25-3.el9.x86_64.rpm 3.1 MB/s | 244 kB 00:00 2026-03-09T19:20:27.284 INFO:teuthology.orchestra.run.vm07.stdout:(60/117): python3-jmespath-1.0.1-1.el9.noarch.r 846 kB/s | 48 kB 00:00 2026-03-09T19:20:27.368 INFO:teuthology.orchestra.run.vm07.stdout:(61/117): python3-libstoragemgmt-1.10.1-1.el9.x 2.1 MB/s | 177 kB 00:00 2026-03-09T19:20:27.437 INFO:teuthology.orchestra.run.vm07.stdout:(62/117): python3-mako-1.1.4-6.el9.noarch.rpm 2.5 MB/s | 172 kB 00:00 2026-03-09T19:20:27.527 INFO:teuthology.orchestra.run.vm07.stdout:(63/117): python3-markupsafe-1.1.1-12.el9.x86_6 386 kB/s | 35 kB 00:00 2026-03-09T19:20:27.529 INFO:teuthology.orchestra.run.vm07.stdout:(64/117): python3-jinja2-2.11.3-8.el9.noarch.rp 815 kB/s | 249 kB 00:00 2026-03-09T19:20:27.646 INFO:teuthology.orchestra.run.vm07.stdout:(65/117): python3-numpy-f2py-1.23.5-2.el9.x86_6 3.7 MB/s | 442 kB 00:00 2026-03-09T19:20:27.742 INFO:teuthology.orchestra.run.vm07.stdout:(66/117): python3-pyasn1-0.4.8-7.el9.noarch.rpm 1.6 MB/s | 157 kB 00:00 2026-03-09T19:20:27.869 INFO:teuthology.orchestra.run.vm07.stdout:(67/117): python3-numpy-1.23.5-2.el9.x86_64.rpm 18 MB/s | 6.1 MB 00:00 2026-03-09T19:20:27.872 INFO:teuthology.orchestra.run.vm07.stdout:(68/117): python3-pyasn1-modules-0.4.8-7.el9.no 2.1 MB/s | 277 kB 00:00 2026-03-09T19:20:27.929 INFO:teuthology.orchestra.run.vm07.stdout:(69/117): python3-requests-oauthlib-1.3.0-12.el 896 kB/s | 54 kB 00:00 2026-03-09T19:20:27.991 INFO:teuthology.orchestra.run.vm07.stdout:(70/117): python3-toml-0.10.2-6.el9.noarch.rpm 667 kB/s | 42 kB 00:00 2026-03-09T19:20:28.060 
INFO:teuthology.orchestra.run.vm08.stdout:(115/117): librados2-18.2.7-1055.gab47f43c.el9. 2.5 MB/s | 3.3 MB 00:01 2026-03-09T19:20:28.095 INFO:teuthology.orchestra.run.vm07.stdout:(71/117): socat-1.7.4.1-8.el9.x86_64.rpm 2.9 MB/s | 303 kB 00:00 2026-03-09T19:20:28.132 INFO:teuthology.orchestra.run.vm08.stdout:(116/117): python3-scipy-1.9.3-2.el9.x86_64.rpm 10 MB/s | 19 MB 00:01 2026-03-09T19:20:28.143 INFO:teuthology.orchestra.run.vm07.stdout:(72/117): xmlstarlet-1.6.1-20.el9.x86_64.rpm 1.3 MB/s | 64 kB 00:00 2026-03-09T19:20:28.153 INFO:teuthology.orchestra.run.vm07.stdout:(73/117): fmt-8.1.1-5.el9.x86_64.rpm 11 MB/s | 111 kB 00:00 2026-03-09T19:20:28.162 INFO:teuthology.orchestra.run.vm07.stdout:(74/117): gperftools-libs-2.9.1-3.el9.x86_64.rp 33 MB/s | 308 kB 00:00 2026-03-09T19:20:28.248 INFO:teuthology.orchestra.run.vm07.stdout:(75/117): libarrow-9.0.0-15.el9.x86_64.rpm 52 MB/s | 4.4 MB 00:00 2026-03-09T19:20:28.251 INFO:teuthology.orchestra.run.vm07.stdout:(76/117): libarrow-doc-9.0.0-15.el9.noarch.rpm 11 MB/s | 25 kB 00:00 2026-03-09T19:20:28.254 INFO:teuthology.orchestra.run.vm07.stdout:(77/117): liboath-2.6.12-1.el9.x86_64.rpm 15 MB/s | 49 kB 00:00 2026-03-09T19:20:28.257 INFO:teuthology.orchestra.run.vm07.stdout:(78/117): libunwind-1.6.2-1.el9.x86_64.rpm 25 MB/s | 67 kB 00:00 2026-03-09T19:20:28.275 INFO:teuthology.orchestra.run.vm07.stdout:(79/117): parquet-libs-9.0.0-15.el9.x86_64.rpm 47 MB/s | 838 kB 00:00 2026-03-09T19:20:28.284 INFO:teuthology.orchestra.run.vm07.stdout:(80/117): python3-asyncssh-2.13.2-5.el9.noarch. 
61 MB/s | 548 kB 00:00 2026-03-09T19:20:28.286 INFO:teuthology.orchestra.run.vm07.stdout:(81/117): python3-autocommand-2.2.2-8.el9.noarc 13 MB/s | 29 kB 00:00 2026-03-09T19:20:28.289 INFO:teuthology.orchestra.run.vm07.stdout:(82/117): python3-backports-tarfile-1.2.0-1.el9 23 MB/s | 60 kB 00:00 2026-03-09T19:20:28.291 INFO:teuthology.orchestra.run.vm07.stdout:(83/117): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 19 MB/s | 43 kB 00:00 2026-03-09T19:20:28.295 INFO:teuthology.orchestra.run.vm07.stdout:(84/117): python3-cachetools-4.2.4-1.el9.noarch 11 MB/s | 32 kB 00:00 2026-03-09T19:20:28.297 INFO:teuthology.orchestra.run.vm07.stdout:(85/117): python3-certifi-2023.05.07-4.el9.noar 6.4 MB/s | 14 kB 00:00 2026-03-09T19:20:28.302 INFO:teuthology.orchestra.run.vm07.stdout:(86/117): python3-cheroot-10.0.1-4.el9.noarch.r 37 MB/s | 173 kB 00:00 2026-03-09T19:20:28.309 INFO:teuthology.orchestra.run.vm07.stdout:(87/117): python3-cherrypy-18.6.1-2.el9.noarch. 50 MB/s | 358 kB 00:00 2026-03-09T19:20:28.314 INFO:teuthology.orchestra.run.vm07.stdout:(88/117): python3-google-auth-2.45.0-1.el9.noar 55 MB/s | 254 kB 00:00 2026-03-09T19:20:28.317 INFO:teuthology.orchestra.run.vm07.stdout:(89/117): python3-jaraco-8.2.1-3.el9.noarch.rpm 3.8 MB/s | 11 kB 00:00 2026-03-09T19:20:28.320 INFO:teuthology.orchestra.run.vm07.stdout:(90/117): python3-jaraco-classes-3.2.1-5.el9.no 7.8 MB/s | 18 kB 00:00 2026-03-09T19:20:28.323 INFO:teuthology.orchestra.run.vm07.stdout:(91/117): python3-jaraco-collections-3.0.0-8.el 8.6 MB/s | 23 kB 00:00 2026-03-09T19:20:28.325 INFO:teuthology.orchestra.run.vm07.stdout:(92/117): python3-jaraco-context-6.0.1-3.el9.no 9.2 MB/s | 20 kB 00:00 2026-03-09T19:20:28.327 INFO:teuthology.orchestra.run.vm07.stdout:(93/117): python3-jaraco-functools-3.5.0-2.el9. 
9.3 MB/s | 19 kB 00:00 2026-03-09T19:20:28.330 INFO:teuthology.orchestra.run.vm07.stdout:(94/117): python3-jaraco-text-4.0.0-2.el9.noarc 9.4 MB/s | 26 kB 00:00 2026-03-09T19:20:28.345 INFO:teuthology.orchestra.run.vm07.stdout:(95/117): python3-kubernetes-26.1.0-3.el9.noarc 71 MB/s | 1.0 MB 00:00 2026-03-09T19:20:28.348 INFO:teuthology.orchestra.run.vm07.stdout:(96/117): python3-logutils-0.3.5-21.el9.noarch. 16 MB/s | 46 kB 00:00 2026-03-09T19:20:28.351 INFO:teuthology.orchestra.run.vm07.stdout:(97/117): python3-more-itertools-8.12.0-2.el9.n 24 MB/s | 79 kB 00:00 2026-03-09T19:20:28.354 INFO:teuthology.orchestra.run.vm07.stdout:(98/117): python3-natsort-7.1.1-5.el9.noarch.rp 23 MB/s | 58 kB 00:00 2026-03-09T19:20:28.360 INFO:teuthology.orchestra.run.vm07.stdout:(99/117): python3-pecan-1.4.2-3.el9.noarch.rpm 48 MB/s | 272 kB 00:00 2026-03-09T19:20:28.363 INFO:teuthology.orchestra.run.vm07.stdout:(100/117): python3-portend-3.1.0-2.el9.noarch.r 6.2 MB/s | 16 kB 00:00 2026-03-09T19:20:28.367 INFO:teuthology.orchestra.run.vm07.stdout:(101/117): python3-pyOpenSSL-21.0.0-1.el9.noarc 23 MB/s | 90 kB 00:00 2026-03-09T19:20:28.372 INFO:teuthology.orchestra.run.vm07.stdout:(102/117): python3-repoze-lru-0.7-16.el9.noarch 6.7 MB/s | 31 kB 00:00 2026-03-09T19:20:28.373 INFO:teuthology.orchestra.run.vm08.stdout:(117/117): librbd1-18.2.7-1055.gab47f43c.el9.x8 1.9 MB/s | 3.0 MB 00:01 2026-03-09T19:20:28.375 INFO:teuthology.orchestra.run.vm08.stdout:-------------------------------------------------------------------------------- 2026-03-09T19:20:28.376 INFO:teuthology.orchestra.run.vm08.stdout:Total 10 MB/s | 181 MB 00:17 2026-03-09T19:20:28.376 INFO:teuthology.orchestra.run.vm07.stdout:(103/117): python3-routes-2.5.1-5.el9.noarch.rp 46 MB/s | 188 kB 00:00 2026-03-09T19:20:28.382 INFO:teuthology.orchestra.run.vm07.stdout:(104/117): python3-rsa-4.9-2.el9.noarch.rpm 11 MB/s | 59 kB 00:00 2026-03-09T19:20:28.385 INFO:teuthology.orchestra.run.vm07.stdout:(105/117): 
python3-tempora-5.0.0-2.el9.noarch.r 13 MB/s | 36 kB 00:00 2026-03-09T19:20:28.388 INFO:teuthology.orchestra.run.vm07.stdout:(106/117): python3-typing-extensions-4.15.0-1.e 27 MB/s | 86 kB 00:00 2026-03-09T19:20:28.394 INFO:teuthology.orchestra.run.vm07.stdout:(107/117): python3-webob-1.8.8-2.el9.noarch.rpm 39 MB/s | 230 kB 00:00 2026-03-09T19:20:28.400 INFO:teuthology.orchestra.run.vm07.stdout:(108/117): python3-websocket-client-1.2.3-2.el9 16 MB/s | 90 kB 00:00 2026-03-09T19:20:28.409 INFO:teuthology.orchestra.run.vm07.stdout:(109/117): python3-werkzeug-2.0.3-3.el9.1.noarc 48 MB/s | 427 kB 00:00 2026-03-09T19:20:28.412 INFO:teuthology.orchestra.run.vm07.stdout:(110/117): python3-xmltodict-0.12.0-15.el9.noar 7.6 MB/s | 22 kB 00:00 2026-03-09T19:20:28.416 INFO:teuthology.orchestra.run.vm07.stdout:(111/117): python3-zc-lockfile-2.0-10.el9.noarc 6.8 MB/s | 20 kB 00:00 2026-03-09T19:20:28.420 INFO:teuthology.orchestra.run.vm07.stdout:(112/117): re2-20211101-20.el9.x86_64.rpm 45 MB/s | 191 kB 00:00 2026-03-09T19:20:28.441 INFO:teuthology.orchestra.run.vm07.stdout:(113/117): thrift-0.15.0-4.el9.x86_64.rpm 77 MB/s | 1.6 MB 00:00 2026-03-09T19:20:28.503 INFO:teuthology.orchestra.run.vm07.stdout:(114/117): python3-scipy-1.9.3-2.el9.x86_64.rpm 31 MB/s | 19 MB 00:00 2026-03-09T19:20:28.586 INFO:teuthology.orchestra.run.vm07.stdout:(115/117): python3-babel-2.9.1-2.el9.noarch.rpm 4.0 MB/s | 6.0 MB 00:01 2026-03-09T19:20:28.861 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check 2026-03-09T19:20:28.906 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded. 2026-03-09T19:20:28.906 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test 2026-03-09T19:20:29.518 INFO:teuthology.orchestra.run.vm07.stdout:(116/117): librados2-18.2.7-1055.gab47f43c.el9. 
3.1 MB/s | 3.3 MB 00:01 2026-03-09T19:20:29.587 INFO:teuthology.orchestra.run.vm07.stdout:(117/117): librbd1-18.2.7-1055.gab47f43c.el9.x8 2.8 MB/s | 3.0 MB 00:01 2026-03-09T19:20:29.590 INFO:teuthology.orchestra.run.vm07.stdout:-------------------------------------------------------------------------------- 2026-03-09T19:20:29.590 INFO:teuthology.orchestra.run.vm07.stdout:Total 11 MB/s | 181 MB 00:16 2026-03-09T19:20:29.669 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded. 2026-03-09T19:20:29.669 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction 2026-03-09T19:20:30.107 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check 2026-03-09T19:20:30.156 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded. 2026-03-09T19:20:30.156 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test 2026-03-09T19:20:30.525 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1 2026-03-09T19:20:30.541 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 1/119 2026-03-09T19:20:30.555 INFO:teuthology.orchestra.run.vm08.stdout: Installing : thrift-0.15.0-4.el9.x86_64 2/119 2026-03-09T19:20:30.736 INFO:teuthology.orchestra.run.vm08.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/119 2026-03-09T19:20:30.739 INFO:teuthology.orchestra.run.vm08.stdout: Upgrading : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119 2026-03-09T19:20:30.795 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119 2026-03-09T19:20:30.797 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 5/119 2026-03-09T19:20:30.835 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 5/119 2026-03-09T19:20:30.847 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 6/119 2026-03-09T19:20:30.853 
INFO:teuthology.orchestra.run.vm08.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/119 2026-03-09T19:20:30.855 INFO:teuthology.orchestra.run.vm08.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/119 2026-03-09T19:20:30.865 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/119 2026-03-09T19:20:30.866 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119 2026-03-09T19:20:30.903 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119 2026-03-09T19:20:30.905 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 11/119 2026-03-09T19:20:30.907 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded. 2026-03-09T19:20:30.907 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction 2026-03-09T19:20:30.957 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 11/119 2026-03-09T19:20:30.964 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/119 2026-03-09T19:20:30.992 INFO:teuthology.orchestra.run.vm08.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/119 2026-03-09T19:20:31.001 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/119 2026-03-09T19:20:31.006 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/119 2026-03-09T19:20:31.038 INFO:teuthology.orchestra.run.vm08.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/119 2026-03-09T19:20:31.058 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/119 2026-03-09T19:20:31.064 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/119 2026-03-09T19:20:31.076 INFO:teuthology.orchestra.run.vm08.stdout: Installing : 
libquadmath-11.5.0-14.el9.x86_64 19/119 2026-03-09T19:20:31.080 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/119 2026-03-09T19:20:31.089 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/119 2026-03-09T19:20:31.100 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el 22/119 2026-03-09T19:20:31.116 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_6 23/119 2026-03-09T19:20:31.149 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/119 2026-03-09T19:20:31.219 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/119 2026-03-09T19:20:31.239 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/119 2026-03-09T19:20:31.250 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/119 2026-03-09T19:20:31.261 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/119 2026-03-09T19:20:31.266 INFO:teuthology.orchestra.run.vm08.stdout: Installing : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_6 29/119 2026-03-09T19:20:31.311 INFO:teuthology.orchestra.run.vm08.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/119 2026-03-09T19:20:31.318 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/119 2026-03-09T19:20:31.337 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/119 2026-03-09T19:20:31.366 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/119 2026-03-09T19:20:31.374 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/119 2026-03-09T19:20:31.381 INFO:teuthology.orchestra.run.vm08.stdout: Installing : 
python3-typing-extensions-4.15.0-1.el9.noarch 35/119 2026-03-09T19:20:31.397 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/119 2026-03-09T19:20:31.410 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/119 2026-03-09T19:20:31.422 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/119 2026-03-09T19:20:31.499 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/119 2026-03-09T19:20:31.508 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/119 2026-03-09T19:20:31.518 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/119 2026-03-09T19:20:31.575 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/119 2026-03-09T19:20:31.703 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1 2026-03-09T19:20:31.719 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 1/119 2026-03-09T19:20:31.733 INFO:teuthology.orchestra.run.vm07.stdout: Installing : thrift-0.15.0-4.el9.x86_64 2/119 2026-03-09T19:20:31.906 INFO:teuthology.orchestra.run.vm07.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/119 2026-03-09T19:20:31.908 INFO:teuthology.orchestra.run.vm07.stdout: Upgrading : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119 2026-03-09T19:20:31.959 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119 2026-03-09T19:20:31.960 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 5/119 2026-03-09T19:20:31.991 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 5/119 2026-03-09T19:20:32.001 INFO:teuthology.orchestra.run.vm07.stdout: Installing : 
python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 6/119 2026-03-09T19:20:32.005 INFO:teuthology.orchestra.run.vm07.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/119 2026-03-09T19:20:32.007 INFO:teuthology.orchestra.run.vm07.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/119 2026-03-09T19:20:32.012 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/119 2026-03-09T19:20:32.017 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/119 2026-03-09T19:20:32.019 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119 2026-03-09T19:20:32.030 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/119 2026-03-09T19:20:32.037 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/119 2026-03-09T19:20:32.045 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/119 2026-03-09T19:20:32.052 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/119 2026-03-09T19:20:32.055 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119 2026-03-09T19:20:32.057 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 11/119 2026-03-09T19:20:32.060 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/119 2026-03-09T19:20:32.064 INFO:teuthology.orchestra.run.vm08.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/119 2026-03-09T19:20:32.067 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/119 2026-03-09T19:20:32.081 INFO:teuthology.orchestra.run.vm08.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/119 2026-03-09T19:20:32.089 
INFO:teuthology.orchestra.run.vm08.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/119 2026-03-09T19:20:32.095 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/119 2026-03-09T19:20:32.103 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/119 2026-03-09T19:20:32.108 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/119 2026-03-09T19:20:32.110 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 11/119 2026-03-09T19:20:32.115 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/119 2026-03-09T19:20:32.124 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/119 2026-03-09T19:20:32.129 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/119 2026-03-09T19:20:32.142 INFO:teuthology.orchestra.run.vm07.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/119 2026-03-09T19:20:32.152 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/119 2026-03-09T19:20:32.156 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/119 2026-03-09T19:20:32.183 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/119 2026-03-09T19:20:32.184 INFO:teuthology.orchestra.run.vm07.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/119 2026-03-09T19:20:32.202 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/119 2026-03-09T19:20:32.207 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/119 2026-03-09T19:20:32.215 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/119 2026-03-09T19:20:32.218 
INFO:teuthology.orchestra.run.vm07.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/119 2026-03-09T19:20:32.224 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/119 2026-03-09T19:20:32.236 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el 22/119 2026-03-09T19:20:32.251 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_6 23/119 2026-03-09T19:20:32.284 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/119 2026-03-09T19:20:32.349 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/119 2026-03-09T19:20:32.366 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/119 2026-03-09T19:20:32.377 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/119 2026-03-09T19:20:32.387 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/119 2026-03-09T19:20:32.392 INFO:teuthology.orchestra.run.vm07.stdout: Installing : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_6 29/119 2026-03-09T19:20:32.431 INFO:teuthology.orchestra.run.vm07.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/119 2026-03-09T19:20:32.437 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/119 2026-03-09T19:20:32.455 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/119 2026-03-09T19:20:32.482 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/119 2026-03-09T19:20:32.489 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/119 2026-03-09T19:20:32.496 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/119 
2026-03-09T19:20:32.511 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/119 2026-03-09T19:20:32.513 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/119 2026-03-09T19:20:32.525 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/119 2026-03-09T19:20:32.540 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/119 2026-03-09T19:20:32.550 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/119 2026-03-09T19:20:32.557 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/119 2026-03-09T19:20:32.608 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/119 2026-03-09T19:20:32.618 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/119 2026-03-09T19:20:32.629 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/119 2026-03-09T19:20:32.630 INFO:teuthology.orchestra.run.vm08.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/119 2026-03-09T19:20:32.634 INFO:teuthology.orchestra.run.vm08.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/119 2026-03-09T19:20:32.663 INFO:teuthology.orchestra.run.vm08.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/119 2026-03-09T19:20:32.679 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/119 2026-03-09T19:20:33.070 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/119 2026-03-09T19:20:33.075 INFO:teuthology.orchestra.run.vm08.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/119 2026-03-09T19:20:33.088 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/119 
2026-03-09T19:20:33.093 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/119 2026-03-09T19:20:33.102 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/119 2026-03-09T19:20:33.107 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/119 2026-03-09T19:20:33.116 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/119 2026-03-09T19:20:33.120 INFO:teuthology.orchestra.run.vm07.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/119 2026-03-09T19:20:33.124 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/119 2026-03-09T19:20:33.135 INFO:teuthology.orchestra.run.vm07.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/119 2026-03-09T19:20:33.144 INFO:teuthology.orchestra.run.vm07.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/119 2026-03-09T19:20:33.150 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/119 2026-03-09T19:20:33.158 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/119 2026-03-09T19:20:33.165 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/119 2026-03-09T19:20:33.168 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/119 2026-03-09T19:20:33.177 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/119 2026-03-09T19:20:33.183 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/119 2026-03-09T19:20:33.225 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/119 2026-03-09T19:20:33.512 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/119 2026-03-09T19:20:33.548 
INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/119 2026-03-09T19:20:33.556 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/119 2026-03-09T19:20:33.620 INFO:teuthology.orchestra.run.vm07.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/119 2026-03-09T19:20:33.623 INFO:teuthology.orchestra.run.vm07.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/119 2026-03-09T19:20:33.649 INFO:teuthology.orchestra.run.vm07.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/119 2026-03-09T19:20:34.049 INFO:teuthology.orchestra.run.vm07.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/119 2026-03-09T19:20:34.076 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/119 2026-03-09T19:20:34.105 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/119 2026-03-09T19:20:34.112 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/119 2026-03-09T19:20:34.117 INFO:teuthology.orchestra.run.vm08.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/119 2026-03-09T19:20:34.144 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/119 2026-03-09T19:20:34.283 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/119 2026-03-09T19:20:34.285 INFO:teuthology.orchestra.run.vm08.stdout: Upgrading : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 72/119 2026-03-09T19:20:34.318 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 72/119 2026-03-09T19:20:34.322 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 73/119 2026-03-09T19:20:34.331 INFO:teuthology.orchestra.run.vm08.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/119 2026-03-09T19:20:34.565 
INFO:teuthology.orchestra.run.vm08.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/119 2026-03-09T19:20:34.615 INFO:teuthology.orchestra.run.vm08.stdout: Installing : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 76/119 2026-03-09T19:20:34.636 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 76/119 2026-03-09T19:20:34.646 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 77/119 2026-03-09T19:20:34.665 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/119 2026-03-09T19:20:34.688 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/119 2026-03-09T19:20:34.808 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/119 2026-03-09T19:20:34.823 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/119 2026-03-09T19:20:34.854 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/119 2026-03-09T19:20:34.895 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/119 2026-03-09T19:20:34.961 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/119 2026-03-09T19:20:34.975 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/119 2026-03-09T19:20:34.981 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 86/119 2026-03-09T19:20:34.986 INFO:teuthology.orchestra.run.vm08.stdout: Installing : mailcap-2.1.49-5.el9.noarch 87/119 2026-03-09T19:20:34.989 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 88/119 2026-03-09T19:20:34.994 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/119 2026-03-09T19:20:35.010 
INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T19:20:35.010 INFO:teuthology.orchestra.run.vm08.stdout:Creating group 'libstoragemgmt' with GID 994. 2026-03-09T19:20:35.010 INFO:teuthology.orchestra.run.vm08.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994. 2026-03-09T19:20:35.010 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:20:35.021 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T19:20:35.025 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/119 2026-03-09T19:20:35.032 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/119 2026-03-09T19:20:35.038 INFO:teuthology.orchestra.run.vm07.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/119 2026-03-09T19:20:35.051 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T19:20:35.051 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-09T19:20:35.051 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:20:35.070 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 90/119 2026-03-09T19:20:35.130 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 91/119 2026-03-09T19:20:35.133 INFO:teuthology.orchestra.run.vm08.stdout: Installing : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 91/119 2026-03-09T19:20:35.139 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.e 92/119 2026-03-09T19:20:35.172 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c. 
93/119 2026-03-09T19:20:35.177 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9. 94/119 2026-03-09T19:20:35.194 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/119 2026-03-09T19:20:35.196 INFO:teuthology.orchestra.run.vm07.stdout: Upgrading : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 72/119 2026-03-09T19:20:35.229 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 72/119 2026-03-09T19:20:35.233 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 73/119 2026-03-09T19:20:35.241 INFO:teuthology.orchestra.run.vm07.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/119 2026-03-09T19:20:35.464 INFO:teuthology.orchestra.run.vm07.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/119 2026-03-09T19:20:35.466 INFO:teuthology.orchestra.run.vm07.stdout: Installing : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 76/119 2026-03-09T19:20:35.487 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 76/119 2026-03-09T19:20:35.497 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 77/119 2026-03-09T19:20:35.515 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/119 2026-03-09T19:20:35.537 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/119 2026-03-09T19:20:35.631 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/119 2026-03-09T19:20:35.646 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/119 2026-03-09T19:20:35.674 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/119 2026-03-09T19:20:35.713 INFO:teuthology.orchestra.run.vm07.stdout: 
Installing : python3-cheroot-10.0.1-4.el9.noarch 83/119 2026-03-09T19:20:35.781 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/119 2026-03-09T19:20:35.791 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/119 2026-03-09T19:20:35.796 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 86/119 2026-03-09T19:20:35.801 INFO:teuthology.orchestra.run.vm07.stdout: Installing : mailcap-2.1.49-5.el9.noarch 87/119 2026-03-09T19:20:35.804 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 88/119 2026-03-09T19:20:35.828 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T19:20:35.828 INFO:teuthology.orchestra.run.vm07.stdout:Creating group 'libstoragemgmt' with GID 994. 2026-03-09T19:20:35.828 INFO:teuthology.orchestra.run.vm07.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994. 2026-03-09T19:20:35.828 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:35.840 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T19:20:35.872 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 89/119 2026-03-09T19:20:35.872 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 
2026-03-09T19:20:35.872 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:35.892 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 90/119 2026-03-09T19:20:35.945 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 91/119 2026-03-09T19:20:35.948 INFO:teuthology.orchestra.run.vm07.stdout: Installing : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 91/119 2026-03-09T19:20:35.953 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.e 92/119 2026-03-09T19:20:35.984 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c. 93/119 2026-03-09T19:20:35.989 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9. 94/119 2026-03-09T19:20:36.253 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T19:20:36.319 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T19:20:36.653 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T19:20:36.660 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 96/119 2026-03-09T19:20:36.704 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 96/119 2026-03-09T19:20:36.704 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 2026-03-09T19:20:36.704 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 
2026-03-09T19:20:36.704 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:20:36.710 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 97/119 2026-03-09T19:20:36.979 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T19:20:36.984 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T19:20:37.313 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 95/119 2026-03-09T19:20:37.320 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 96/119 2026-03-09T19:20:37.368 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 96/119 2026-03-09T19:20:37.368 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 2026-03-09T19:20:37.368 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 
2026-03-09T19:20:37.368 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:37.373 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 97/119 2026-03-09T19:20:43.433 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 97/119 2026-03-09T19:20:43.433 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /sys 2026-03-09T19:20:43.433 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /proc 2026-03-09T19:20:43.433 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /mnt 2026-03-09T19:20:43.433 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /var/tmp 2026-03-09T19:20:43.433 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /home 2026-03-09T19:20:43.433 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /root 2026-03-09T19:20:43.433 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /tmp 2026-03-09T19:20:43.433 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:20:43.725 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 98/119 2026-03-09T19:20:44.258 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 97/119 2026-03-09T19:20:44.258 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /sys 2026-03-09T19:20:44.258 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /proc 2026-03-09T19:20:44.258 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /mnt 2026-03-09T19:20:44.258 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /var/tmp 2026-03-09T19:20:44.258 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /home 2026-03-09T19:20:44.258 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /root 2026-03-09T19:20:44.258 INFO:teuthology.orchestra.run.vm07.stdout:skipping 
the directory /tmp 2026-03-09T19:20:44.258 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:44.293 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 98/119 2026-03-09T19:20:44.293 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 98/119 2026-03-09T19:20:44.300 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 99/119 2026-03-09T19:20:44.847 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 98/119 2026-03-09T19:20:44.854 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 99/119 2026-03-09T19:20:44.859 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 99/119 2026-03-09T19:20:44.861 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 100/119 2026-03-09T19:20:44.927 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 100/119 2026-03-09T19:20:45.007 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el 101/119 2026-03-09T19:20:45.010 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 102/119 2026-03-09T19:20:45.040 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 102/119 2026-03-09T19:20:45.040 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:45.040 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 
2026-03-09T19:20:45.040 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-09T19:20:45.040 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-09T19:20:45.040 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:20:45.056 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 103/119 2026-03-09T19:20:45.178 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 103/119 2026-03-09T19:20:45.181 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 104/119 2026-03-09T19:20:45.207 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 104/119 2026-03-09T19:20:45.207 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:45.207 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-09T19:20:45.207 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-09T19:20:45.207 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 
2026-03-09T19:20:45.207 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:20:45.431 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 99/119 2026-03-09T19:20:45.434 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 100/119 2026-03-09T19:20:45.445 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 105/119 2026-03-09T19:20:45.471 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 105/119 2026-03-09T19:20:45.471 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:45.471 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-09T19:20:45.471 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-09T19:20:45.471 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 
2026-03-09T19:20:45.471 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:20:45.500 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 100/119 2026-03-09T19:20:45.577 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el 101/119 2026-03-09T19:20:45.580 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 102/119 2026-03-09T19:20:45.606 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 102/119 2026-03-09T19:20:45.606 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:45.606 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-09T19:20:45.606 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-09T19:20:45.606 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 
2026-03-09T19:20:45.606 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:45.624 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 103/119 2026-03-09T19:20:45.742 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 103/119 2026-03-09T19:20:45.744 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 104/119 2026-03-09T19:20:45.769 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 104/119 2026-03-09T19:20:45.769 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:45.769 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-09T19:20:45.769 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-09T19:20:45.769 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-09T19:20:45.769 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:46.000 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 105/119 2026-03-09T19:20:46.025 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 105/119 2026-03-09T19:20:46.025 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:46.025 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 
2026-03-09T19:20:46.025 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-09T19:20:46.025 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-09T19:20:46.025 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:46.289 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 106/119 2026-03-09T19:20:46.313 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 106/119 2026-03-09T19:20:46.313 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:46.313 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-09T19:20:46.313 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-09T19:20:46.313 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-09T19:20:46.313 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:20:46.716 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 107/119 2026-03-09T19:20:46.720 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 108/119 2026-03-09T19:20:46.750 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 108/119 2026-03-09T19:20:46.751 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-09T19:20:46.751 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-09T19:20:46.751 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-09T19:20:46.751 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-09T19:20:46.751 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:20:46.763 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-immutable-object-cache-2:18.2.7-1055.gab47f 109/119 2026-03-09T19:20:46.791 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f 109/119 2026-03-09T19:20:46.791 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:46.791 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-09T19:20:46.791 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:20:46.864 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 106/119 2026-03-09T19:20:46.897 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 106/119 2026-03-09T19:20:46.897 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:46.897 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-09T19:20:46.897 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 
2026-03-09T19:20:46.897 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-09T19:20:46.897 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:46.957 INFO:teuthology.orchestra.run.vm08.stdout: Installing : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 110/119 2026-03-09T19:20:46.994 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 110/119 2026-03-09T19:20:46.994 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:46.994 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-09T19:20:46.994 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-09T19:20:46.994 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-09T19:20:46.994 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:20:47.335 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 107/119 2026-03-09T19:20:47.339 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 108/119 2026-03-09T19:20:47.364 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 108/119 2026-03-09T19:20:47.364 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:47.364 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 
2026-03-09T19:20:47.364 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-09T19:20:47.364 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-09T19:20:47.364 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:47.375 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-immutable-object-cache-2:18.2.7-1055.gab47f 109/119 2026-03-09T19:20:47.401 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f 109/119 2026-03-09T19:20:47.401 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:47.401 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-09T19:20:47.401 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:47.562 INFO:teuthology.orchestra.run.vm07.stdout: Installing : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 110/119 2026-03-09T19:20:47.587 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 110/119 2026-03-09T19:20:47.587 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:20:47.587 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-09T19:20:47.587 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 
2026-03-09T19:20:47.587 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-09T19:20:47.587 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:20:48.983 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 111/119 2026-03-09T19:20:48.996 INFO:teuthology.orchestra.run.vm08.stdout: Installing : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 112/119 2026-03-09T19:20:49.002 INFO:teuthology.orchestra.run.vm08.stdout: Installing : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 113/119 2026-03-09T19:20:49.050 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_ 114/119 2026-03-09T19:20:49.056 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 115/119 2026-03-09T19:20:49.067 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 116/119 2026-03-09T19:20:49.071 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 117/119 2026-03-09T19:20:49.071 INFO:teuthology.orchestra.run.vm08.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 118/119 2026-03-09T19:20:49.089 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 118/119 2026-03-09T19:20:49.089 INFO:teuthology.orchestra.run.vm08.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 119/119 2026-03-09T19:20:49.666 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 111/119 2026-03-09T19:20:49.719 INFO:teuthology.orchestra.run.vm07.stdout: Installing : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 112/119 2026-03-09T19:20:49.815 INFO:teuthology.orchestra.run.vm07.stdout: Installing : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 113/119 2026-03-09T19:20:49.868 INFO:teuthology.orchestra.run.vm07.stdout: 
Installing : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_ 114/119 2026-03-09T19:20:49.876 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 115/119 2026-03-09T19:20:50.037 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 116/119 2026-03-09T19:20:50.042 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 117/119 2026-03-09T19:20:50.042 INFO:teuthology.orchestra.run.vm07.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 118/119 2026-03-09T19:20:50.059 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 118/119 2026-03-09T19:20:50.059 INFO:teuthology.orchestra.run.vm07.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 119/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 119/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 2/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 3/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-immutable-object-cache-2:18.2.7-1055.gab47f 5/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 6/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 7/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/119 2026-03-09T19:20:50.384 
INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 9/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 11/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 12/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_ 13/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 14/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 15/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_6 16/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 17/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 18/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el 19/119 2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9. 
20/119
2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_6 21/119
2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 22/119
2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 23/119
2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 24/119
2026-03-09T19:20:50.384 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 25/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 26/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 27/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c. 28/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 29/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 30/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 31/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el 32/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 33/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.e 34/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 35/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/119
2026-03-09T19:20:50.386 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/119
2026-03-09T19:20:50.387 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/119
2026-03-09T19:20:50.387 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/119
2026-03-09T19:20:50.387 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/119
2026-03-09T19:20:50.387 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/119
2026-03-09T19:20:50.387 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/119
2026-03-09T19:20:50.387 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/119
2026-03-09T19:20:50.387 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/119
2026-03-09T19:20:50.387 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/119
2026-03-09T19:20:50.387 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 97/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 98/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 99/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 100/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 101/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 102/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 103/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 104/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 105/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 106/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 107/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 108/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 109/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 110/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 111/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 112/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 113/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : re2-1:20211101-20.el9.x86_64 114/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 115/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 116/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 117/119
2026-03-09T19:20:50.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 118/119
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 119/119
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout:Upgraded:
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout:Installed:
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.493 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: mailcap-2.1.49-5.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-jmespath-1.0.1-1.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-09T19:20:50.494 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-ply-3.11-14.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-xmltodict-0.12.0-15.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: re2-1:20211101-20.el9.x86_64
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: socat-1.7.4.1-8.el9.x86_64
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout: xmlstarlet-1.6.1-20.el9.x86_64
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:20:50.495 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:20:50.589 DEBUG:teuthology.parallel:result is None
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 119/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 2/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 3/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 4/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-immutable-object-cache-2:18.2.7-1055.gab47f 5/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 6/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 7/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 9/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 10/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 11/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 12/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_ 13/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 14/119
2026-03-09T19:20:51.345 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 15/119
2026-03-09T19:20:51.346 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_6 16/119
2026-03-09T19:20:51.346 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86 17/119
2026-03-09T19:20:51.346 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 18/119
2026-03-09T19:20:51.346 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el 19/119
2026-03-09T19:20:51.346 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9. 20/119
2026-03-09T19:20:51.346 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_6 21/119
2026-03-09T19:20:51.346 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 22/119
2026-03-09T19:20:51.346 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 23/119
2026-03-09T19:20:51.346 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 24/119
2026-03-09T19:20:51.346 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 25/119
2026-03-09T19:20:51.346 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 26/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 27/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c. 28/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noa 29/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.n 30/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab4 31/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el 32/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 33/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.e 34/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 35/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/119
2026-03-09T19:20:51.347 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/119
2026-03-09T19:20:51.349 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 97/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 98/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 99/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 100/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 101/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 102/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 103/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 104/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 105/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 106/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 107/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 108/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 109/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 110/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 111/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 112/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 113/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : re2-1:20211101-20.el9.x86_64 114/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 115/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 116/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 117/119
2026-03-09T19:20:51.350 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 118/119
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 119/119
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout:Upgraded:
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout:Installed:
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:51.584 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: mailcap-2.1.49-5.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-jmespath-1.0.1-1.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-09T19:20:51.585 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply-3.11-14.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-xmltodict-0.12.0-15.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: re2-1:20211101-20.el9.x86_64
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: socat-1.7.4.1-8.el9.x86_64
2026-03-09T19:20:51.586 INFO:teuthology.orchestra.run.vm07.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-09T19:20:51.587 INFO:teuthology.orchestra.run.vm07.stdout: xmlstarlet-1.6.1-20.el9.x86_64
2026-03-09T19:20:51.587 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:20:51.587 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:20:51.686 DEBUG:teuthology.parallel:result is None
2026-03-09T19:20:51.687 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-09T19:20:51.687 INFO:teuthology.packaging:ref: None
2026-03-09T19:20:51.687 INFO:teuthology.packaging:tag: None
2026-03-09T19:20:51.687 INFO:teuthology.packaging:branch: reef
2026-03-09T19:20:51.687 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T19:20:51.687 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=reef
2026-03-09T19:20:52.448 DEBUG:teuthology.orchestra.run.vm07:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-09T19:20:52.469 INFO:teuthology.orchestra.run.vm07.stdout:18.2.7-1055.gab47f43c.el9
2026-03-09T19:20:52.470 INFO:teuthology.packaging:The installed version of ceph is 18.2.7-1055.gab47f43c.el9
2026-03-09T19:20:52.470 INFO:teuthology.task.install:The correct ceph version 18.2.7-1055.gab47f43c is installed.
2026-03-09T19:20:52.471 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-09T19:20:52.471 INFO:teuthology.packaging:ref: None
2026-03-09T19:20:52.471 INFO:teuthology.packaging:tag: None
2026-03-09T19:20:52.471 INFO:teuthology.packaging:branch: reef
2026-03-09T19:20:52.471 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T19:20:52.471 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=reef
2026-03-09T19:20:53.203 DEBUG:teuthology.orchestra.run.vm08:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-09T19:20:53.225 INFO:teuthology.orchestra.run.vm08.stdout:18.2.7-1055.gab47f43c.el9
2026-03-09T19:20:53.227 INFO:teuthology.packaging:The installed version of ceph is 18.2.7-1055.gab47f43c.el9
2026-03-09T19:20:53.227 INFO:teuthology.task.install:The correct ceph version 18.2.7-1055.gab47f43c is installed.
2026-03-09T19:20:53.228 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-09T19:20:53.228 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-09T19:20:53.228 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-09T19:20:53.257 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-09T19:20:53.257 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-09T19:20:53.297 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-09T19:20:53.297 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-09T19:20:53.297 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/usr/bin/daemon-helper
2026-03-09T19:20:53.328 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-09T19:20:53.396 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-09T19:20:53.396 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/usr/bin/daemon-helper
2026-03-09T19:20:53.423 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-09T19:20:53.486 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-09T19:20:53.486 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-09T19:20:53.486 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-09T19:20:53.516 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-09T19:20:53.583 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-09T19:20:53.583 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-09T19:20:53.610 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-09T19:20:53.677 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-09T19:20:53.677 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-09T19:20:53.677 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/usr/bin/stdin-killer
2026-03-09T19:20:53.707 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-09T19:20:53.776 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-09T19:20:53.776 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/usr/bin/stdin-killer
2026-03-09T19:20:53.805 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-09T19:20:53.873 INFO:teuthology.run_tasks:Running task print...
2026-03-09T19:20:53.879 INFO:teuthology.task.print:**** done install task...
2026-03-09T19:20:53.909 INFO:teuthology.run_tasks:Running task cephadm...
2026-03-09T19:20:53.969 INFO:tasks.cephadm:Config: {'compiled_cephadm_branch': 'reef', 'conf': {'osd': {'osd_class_default_list': '*', 'osd_class_load_list': '*', 'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'bitmap', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd op complaint time': 180}, 'client': {'client mount timeout': 600, 'debug client': 20, 'debug ms': 1, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'global': {'mon pg warn min per osd': 0}, 'mds': {'debug mds': 20, 'debug mds balancer': 20, 'debug ms': 1, 'mds debug frag': True, 'mds debug scatterstat': True, 'mds op complaint time': 180, 'mds verify scatter': True, 'osd op complaint time': 180, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon down mkfs grace': 300, 'mon op complaint time': 120}}, 'image': 'quay.ceph.io/ceph-ci/ceph:reef', 'roleless': True, 'cluster-conf': {'mgr': {'client mount timeout': 30, 'debug client': 20, 'debug mgr': 20, 'debug ms': 1, 'mon warn on pool no app': False}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', 'FS_DEGRADED', 'filesystem is degraded', 'FS_INLINE_DATA_DEPRECATED', 'FS_WITH_FAILED_MDS', 'MDS_ALL_DOWN', 'filesystem is offline', 'is offline because no MDS', 'MDS_DAMAGE', 'MDS_DEGRADED', 'MDS_FAILED', 'MDS_INSUFFICIENT_STANDBY', 'MDS_UP_LESS_THAN_MAX', 'online, but wants', 'filesystem is online with fewer MDS than max_mds', 'POOL_APP_NOT_ENABLED', 'do not have an application enabled', 'overall HEALTH_', 'Replacing daemon', 'deprecated feature inline_data', 'MGR_MODULE_ERROR', 'OSD_DOWN', 'osds down', 'overall HEALTH_', '\\(OSD_DOWN\\)', '\\(OSD_', 'but it is still running', 'is not responding', 'MON_DOWN', 'PG_AVAILABILITY', 'PG_DEGRADED', 'Reduced data availability', 'Degraded data redundancy', 'pg .* is stuck inactive', 'pg .* is .*degraded', 'pg .* is stuck peering'], 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}
2026-03-09T19:20:53.969 INFO:tasks.cephadm:Cluster image is quay.ceph.io/ceph-ci/ceph:reef
2026-03-09T19:20:53.969 INFO:tasks.cephadm:Cluster fsid is 17715774-1bed-11f1-9ad8-1bc9d74ff594
2026-03-09T19:20:53.969 INFO:tasks.cephadm:Choosing monitor IPs and ports...
2026-03-09T19:20:53.970 INFO:tasks.cephadm:No mon roles; fabricating mons
2026-03-09T19:20:53.970 INFO:tasks.cephadm:Monitor IPs: {'mon.vm07': '192.168.123.107', 'mon.vm08': '192.168.123.108'}
2026-03-09T19:20:53.970 INFO:tasks.cephadm:Normalizing hostnames...
2026-03-09T19:20:53.970 DEBUG:teuthology.orchestra.run.vm07:> sudo hostname $(hostname -s)
2026-03-09T19:20:53.995 DEBUG:teuthology.orchestra.run.vm08:> sudo hostname $(hostname -s)
2026-03-09T19:20:54.022 INFO:tasks.cephadm:Downloading "compiled" cephadm from cachra for reef
2026-03-09T19:20:54.023 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T19:20:54.611 INFO:tasks.cephadm:builder_project result: [{'url': 'https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'chacra_url': 'https://3.chacra.ceph.com/repos/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'ref': 'squid', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'distro': 'centos', 'distro_version': '9', 'distro_codename': None, 'modified': '2026-02-25 18:55:15.146628', 'status': 'ready', 'flavor': 'default', 'project': 'ceph', 'archs': ['source', 'x86_64'], 'extra': {'version': '19.2.3-678-ge911bdeb', 'package_manager_version': '19.2.3-678.ge911bdeb', 'build_url': 'https://jenkins.ceph.com/job/ceph-dev-pipeline/3275/', 'root_build_cause': '', 'node_name': '10.20.192.26+soko16', 'job_name': 'ceph-dev-pipeline'}}]
2026-03-09T19:20:55.344 INFO:tasks.util.chacra:got chacra host 3.chacra.ceph.com, ref reef, sha1 ab47f43c099b2cbae6e21342fe673ce251da54d6 from https://shaman.ceph.com/api/search/?project=ceph&distros=centos%2F9%2Fx86_64&flavor=default&ref=reef
2026-03-09T19:20:55.345 INFO:tasks.cephadm:Discovered cachra url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-09T19:20:55.345 INFO:tasks.cephadm:Downloading cephadm from url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-09T19:20:55.345 DEBUG:teuthology.orchestra.run.vm07:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-09T19:20:56.639 INFO:teuthology.orchestra.run.vm07.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 9 19:20 /home/ubuntu/cephtest/cephadm
2026-03-09T19:20:56.639 DEBUG:teuthology.orchestra.run.vm08:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-09T19:20:57.946 INFO:teuthology.orchestra.run.vm08.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 9 19:20 /home/ubuntu/cephtest/cephadm
2026-03-09T19:20:57.947 DEBUG:teuthology.orchestra.run.vm07:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-09T19:20:57.965 DEBUG:teuthology.orchestra.run.vm08:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-09T19:20:57.985 INFO:tasks.cephadm:Pulling image quay.ceph.io/ceph-ci/ceph:reef on all hosts...
2026-03-09T19:20:57.985 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef pull
2026-03-09T19:20:58.007 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef pull
2026-03-09T19:20:58.275 INFO:teuthology.orchestra.run.vm07.stderr:Pulling container image quay.ceph.io/ceph-ci/ceph:reef...
2026-03-09T19:20:58.277 INFO:teuthology.orchestra.run.vm08.stderr:Pulling container image quay.ceph.io/ceph-ci/ceph:reef...
2026-03-09T19:21:35.356 INFO:teuthology.orchestra.run.vm07.stdout:{
2026-03-09T19:21:35.356 INFO:teuthology.orchestra.run.vm07.stdout: "ceph_version": "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)",
2026-03-09T19:21:35.356 INFO:teuthology.orchestra.run.vm07.stdout: "image_id": "b6fe7eb6a9d0f9b143033c43e1cdf7ef0918719fc7cff0dd0e2c113bb482fdd6",
2026-03-09T19:21:35.356 INFO:teuthology.orchestra.run.vm07.stdout: "repo_digests": [
2026-03-09T19:21:35.356 INFO:teuthology.orchestra.run.vm07.stdout: "quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40"
2026-03-09T19:21:35.356 INFO:teuthology.orchestra.run.vm07.stdout: ]
2026-03-09T19:21:35.356 INFO:teuthology.orchestra.run.vm07.stdout:}
2026-03-09T19:21:40.658 INFO:teuthology.orchestra.run.vm08.stdout:{
2026-03-09T19:21:40.658 INFO:teuthology.orchestra.run.vm08.stdout: "ceph_version": "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)",
2026-03-09T19:21:40.658 INFO:teuthology.orchestra.run.vm08.stdout: "image_id": "b6fe7eb6a9d0f9b143033c43e1cdf7ef0918719fc7cff0dd0e2c113bb482fdd6",
2026-03-09T19:21:40.658 INFO:teuthology.orchestra.run.vm08.stdout: "repo_digests": [
2026-03-09T19:21:40.658 INFO:teuthology.orchestra.run.vm08.stdout: "quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40"
2026-03-09T19:21:40.658 INFO:teuthology.orchestra.run.vm08.stdout: ]
2026-03-09T19:21:40.658 INFO:teuthology.orchestra.run.vm08.stdout:}
2026-03-09T19:21:40.674 DEBUG:teuthology.orchestra.run.vm07:> sudo mkdir -p /etc/ceph
2026-03-09T19:21:40.700 DEBUG:teuthology.orchestra.run.vm08:> sudo mkdir -p /etc/ceph
2026-03-09T19:21:40.731 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod 777 /etc/ceph
2026-03-09T19:21:40.764 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod 777 /etc/ceph
2026-03-09T19:21:40.803 INFO:tasks.cephadm:Writing seed config...
2026-03-09T19:21:40.803 INFO:tasks.cephadm: override: [osd] osd_class_default_list = *
2026-03-09T19:21:40.803 INFO:tasks.cephadm: override: [osd] osd_class_load_list = *
2026-03-09T19:21:40.803 INFO:tasks.cephadm: override: [osd] bdev async discard = True
2026-03-09T19:21:40.803 INFO:tasks.cephadm: override: [osd] bdev enable discard = True
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] bluestore allocator = bitmap
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] bluestore block size = 96636764160
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] bluestore fsck on mount = True
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] debug bluefs = 1/20
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] debug bluestore = 1/20
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] debug ms = 1
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] debug osd = 20
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] debug rocksdb = 4/10
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] mon osd backfillfull_ratio = 0.85
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] mon osd full ratio = 0.9
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] mon osd nearfull ratio = 0.8
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] osd failsafe full ratio = 0.95
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] osd mclock iops capacity threshold hdd = 49000
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] osd objectstore = bluestore
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [osd] osd op complaint time = 180
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [client] client mount timeout = 600
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [client] debug client = 20
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [client] debug ms = 1
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [client] rados mon op timeout = 900
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [client] rados osd op timeout = 900
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [global] mon pg warn min per osd = 0
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mds] debug mds = 20
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mds] debug mds balancer = 20
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mds] debug ms = 1
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mds] mds debug frag = True
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mds] mds debug scatterstat = True
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mds] mds op complaint time = 180
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mds] mds verify scatter = True
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mds] osd op complaint time = 180
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mds] rados mon op timeout = 900
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mds] rados osd op timeout = 900
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mgr] debug mgr = 20
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mgr] debug ms = 1
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mon] debug mon = 20
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mon] debug ms = 1
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mon] debug paxos = 20
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mon] mon down mkfs grace = 300
2026-03-09T19:21:40.804 INFO:tasks.cephadm: override: [mon] mon op complaint time = 120
2026-03-09T19:21:40.805 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-09T19:21:40.805 DEBUG:teuthology.orchestra.run.vm07:> dd of=/home/ubuntu/cephtest/seed.ceph.conf
2026-03-09T19:21:40.820 DEBUG:tasks.cephadm:Final config: [global]
# make logging friendly to teuthology
log_to_file = true
log_to_stderr = false
log to journald = false
mon cluster log to file = true
mon cluster log file level = debug
mon clock drift allowed = 1.000
# replicate across OSDs, not hosts
osd crush chooseleaf type = 0
#osd pool default size = 2
osd pool default erasure code profile = plugin=jerasure technique=reed_sol_van k=2 m=1 crush-failure-domain=osd
# enable some debugging
auth debug = true
ms die on old message = true
ms die on bug = true
debug asserts on shutdown = true
# adjust warnings
mon max pg per osd = 10000  # >= luminous
mon pg warn max object skew = 0
mon osd allow primary affinity = true
mon osd allow pg remap = true
mon warn on legacy crush tunables = false
mon warn on crush straw calc version zero = false
mon warn on no sortbitwise = false
mon warn on osd down out interval zero = false
mon warn on too few osds = false
mon_warn_on_pool_pg_num_not_power_of_two = false
# disable pg_autoscaler by default for new pools
osd_pool_default_pg_autoscale_mode = off
# tests delete pools
mon allow pool delete = true
fsid = 17715774-1bed-11f1-9ad8-1bc9d74ff594
mon pg warn min per osd = 0
[osd]
osd scrub load threshold = 5.0
osd scrub max interval = 600
osd mclock profile = high_recovery_ops
osd recover clone overlap = true
osd recovery max chunk = 1048576
osd deep scrub update digest min age = 30
osd map max advance = 10
osd memory target autotune = true
# debugging
osd debug shutdown = true
osd debug op order = true
osd debug verify stray on activate = true
osd debug pg log writeout = true
osd debug verify cached snaps = true
osd debug verify missing on start = true
osd debug misdirected ops = true
osd op queue = debug_random
osd op queue cut off = debug_random
osd shutdown pgref assert = true
bdev debug aio = true
osd sloppy crc = true
osd_class_default_list = *
osd_class_load_list = *
bdev async discard = True
bdev enable discard = True
bluestore allocator = bitmap
bluestore block size = 96636764160
bluestore fsck on mount = True
debug bluefs = 1/20
debug bluestore = 1/20
debug ms = 1
debug osd = 20
debug rocksdb = 4/10
mon osd backfillfull_ratio = 0.85
mon osd full ratio = 0.9
mon osd nearfull ratio = 0.8
osd failsafe full ratio = 0.95
osd mclock iops capacity threshold hdd = 49000
osd objectstore = bluestore
osd op complaint time = 180
[mgr]
mon reweight min pgs per osd = 4
mon reweight min bytes per osd = 10
mgr/telemetry/nag = false
debug mgr = 20
debug ms = 1
[mon]
mon data avail warn = 5
mon mgr mkfs grace = 240
mon reweight min pgs per osd = 4
mon osd reporter subtree level = osd
mon osd prime pg temp = true
mon reweight min bytes per osd = 10
# rotate auth tickets quickly to exercise renewal paths
auth mon ticket ttl = 660  # 11m
auth service ticket ttl = 240  # 4m
# don't complain about global id reclaim
mon_warn_on_insecure_global_id_reclaim = false
mon_warn_on_insecure_global_id_reclaim_allowed = false
debug mon = 20
debug ms = 1
debug paxos = 20
mon down mkfs grace = 300
mon op complaint time = 120
[client.rgw]
rgw cache enabled = true
rgw enable ops log = true
rgw enable usage log = true
[client]
client mount timeout = 600
debug client = 20
debug ms = 1
rados mon op timeout = 900
rados osd op timeout = 900
[mds]
debug mds = 20
debug mds balancer = 20
debug ms = 1
mds debug frag = True
mds debug scatterstat = True
mds op complaint time = 180
mds verify scatter = True
osd op complaint time = 180
rados mon op timeout = 900
rados osd op timeout = 900
2026-03-09T19:21:40.820 DEBUG:teuthology.orchestra.run.vm07:mon.vm07> sudo journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm07.service
2026-03-09T19:21:40.862 INFO:tasks.cephadm:Bootstrapping...
2026-03-09T19:21:40.862 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef -v bootstrap --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 --config /home/ubuntu/cephtest/seed.ceph.conf --output-config /etc/ceph/ceph.conf --output-keyring /etc/ceph/ceph.client.admin.keyring --output-pub-ssh-key /home/ubuntu/cephtest/ceph.pub --mon-ip 192.168.123.107 --skip-admin-label && sudo chmod +r /etc/ceph/ceph.client.admin.keyring 2026-03-09T19:21:40.976 INFO:teuthology.orchestra.run.vm07.stdout:-------------------------------------------------------------------------------- 2026-03-09T19:21:40.976 INFO:teuthology.orchestra.run.vm07.stdout:cephadm ['--image', 'quay.ceph.io/ceph-ci/ceph:reef', '-v', 'bootstrap', '--fsid', '17715774-1bed-11f1-9ad8-1bc9d74ff594', '--config', '/home/ubuntu/cephtest/seed.ceph.conf', '--output-config', '/etc/ceph/ceph.conf', '--output-keyring', '/etc/ceph/ceph.client.admin.keyring', '--output-pub-ssh-key', '/home/ubuntu/cephtest/ceph.pub', '--mon-ip', '192.168.123.107', '--skip-admin-label'] 2026-03-09T19:21:40.998 INFO:teuthology.orchestra.run.vm07.stdout:/bin/podman: stdout 5.8.0 2026-03-09T19:21:40.998 INFO:teuthology.orchestra.run.vm07.stderr:Specifying an fsid for your cluster offers no advantages and may increase the likelihood of fsid conflicts. 2026-03-09T19:21:40.998 INFO:teuthology.orchestra.run.vm07.stdout:Verifying podman|docker is present... 2026-03-09T19:21:41.019 INFO:teuthology.orchestra.run.vm07.stdout:/bin/podman: stdout 5.8.0 2026-03-09T19:21:41.019 INFO:teuthology.orchestra.run.vm07.stdout:Verifying lvm2 is present... 2026-03-09T19:21:41.019 INFO:teuthology.orchestra.run.vm07.stdout:Verifying time synchronization is in place... 
2026-03-09T19:21:41.027 INFO:teuthology.orchestra.run.vm07.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service 2026-03-09T19:21:41.027 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory 2026-03-09T19:21:41.033 INFO:teuthology.orchestra.run.vm07.stdout:Non-zero exit code 3 from systemctl is-active chrony.service 2026-03-09T19:21:41.033 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stdout inactive 2026-03-09T19:21:41.040 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stdout enabled 2026-03-09T19:21:41.048 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stdout active 2026-03-09T19:21:41.048 INFO:teuthology.orchestra.run.vm07.stdout:Unit chronyd.service is enabled and running 2026-03-09T19:21:41.048 INFO:teuthology.orchestra.run.vm07.stdout:Repeating the final host check... 2026-03-09T19:21:41.070 INFO:teuthology.orchestra.run.vm07.stdout:/bin/podman: stdout 5.8.0 2026-03-09T19:21:41.070 INFO:teuthology.orchestra.run.vm07.stdout:podman (/bin/podman) version 5.8.0 is present 2026-03-09T19:21:41.071 INFO:teuthology.orchestra.run.vm07.stdout:systemctl is present 2026-03-09T19:21:41.071 INFO:teuthology.orchestra.run.vm07.stdout:lvcreate is present 2026-03-09T19:21:41.077 INFO:teuthology.orchestra.run.vm07.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service 2026-03-09T19:21:41.077 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory 2026-03-09T19:21:41.084 INFO:teuthology.orchestra.run.vm07.stdout:Non-zero exit code 3 from systemctl is-active chrony.service 2026-03-09T19:21:41.084 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stdout inactive 2026-03-09T19:21:41.091 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stdout enabled 2026-03-09T19:21:41.098 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stdout active 2026-03-09T19:21:41.098 
INFO:teuthology.orchestra.run.vm07.stdout:Unit chronyd.service is enabled and running 2026-03-09T19:21:41.098 INFO:teuthology.orchestra.run.vm07.stdout:Host looks OK 2026-03-09T19:21:41.098 INFO:teuthology.orchestra.run.vm07.stdout:Cluster fsid: 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:21:41.098 INFO:teuthology.orchestra.run.vm07.stdout:Acquiring lock 140342359048736 on /run/cephadm/17715774-1bed-11f1-9ad8-1bc9d74ff594.lock 2026-03-09T19:21:41.098 INFO:teuthology.orchestra.run.vm07.stdout:Lock 140342359048736 acquired on /run/cephadm/17715774-1bed-11f1-9ad8-1bc9d74ff594.lock 2026-03-09T19:21:41.098 INFO:teuthology.orchestra.run.vm07.stdout:Verifying IP 192.168.123.107 port 3300 ... 2026-03-09T19:21:41.099 INFO:teuthology.orchestra.run.vm07.stdout:Verifying IP 192.168.123.107 port 6789 ... 2026-03-09T19:21:41.099 INFO:teuthology.orchestra.run.vm07.stdout:Base mon IP(s) is [192.168.123.107:3300, 192.168.123.107:6789], mon addrv is [v2:192.168.123.107:3300,v1:192.168.123.107:6789] 2026-03-09T19:21:41.103 INFO:teuthology.orchestra.run.vm07.stdout:/sbin/ip: stdout default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.107 metric 100 2026-03-09T19:21:41.103 INFO:teuthology.orchestra.run.vm07.stdout:/sbin/ip: stdout 192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.107 metric 100 2026-03-09T19:21:41.105 INFO:teuthology.orchestra.run.vm07.stdout:/sbin/ip: stdout ::1 dev lo proto kernel metric 256 pref medium 2026-03-09T19:21:41.105 INFO:teuthology.orchestra.run.vm07.stdout:/sbin/ip: stdout fe80::/64 dev eth0 proto kernel metric 1024 pref medium 2026-03-09T19:21:41.108 INFO:teuthology.orchestra.run.vm07.stdout:/sbin/ip: stdout 1: lo: mtu 65536 state UNKNOWN qlen 1000 2026-03-09T19:21:41.108 INFO:teuthology.orchestra.run.vm07.stdout:/sbin/ip: stdout inet6 ::1/128 scope host 2026-03-09T19:21:41.108 INFO:teuthology.orchestra.run.vm07.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever 2026-03-09T19:21:41.108 
INFO:teuthology.orchestra.run.vm07.stdout:/sbin/ip: stdout 2: eth0: mtu 1500 state UP qlen 1000 2026-03-09T19:21:41.108 INFO:teuthology.orchestra.run.vm07.stdout:/sbin/ip: stdout inet6 fe80::5055:ff:fe00:7/64 scope link noprefixroute 2026-03-09T19:21:41.108 INFO:teuthology.orchestra.run.vm07.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever 2026-03-09T19:21:41.108 INFO:teuthology.orchestra.run.vm07.stdout:Mon IP `192.168.123.107` is in CIDR network `192.168.123.0/24` 2026-03-09T19:21:41.109 INFO:teuthology.orchestra.run.vm07.stdout:Mon IP `192.168.123.107` is in CIDR network `192.168.123.0/24` 2026-03-09T19:21:41.109 INFO:teuthology.orchestra.run.vm07.stdout:Inferred mon public CIDR from local network configuration ['192.168.123.0/24', '192.168.123.0/24'] 2026-03-09T19:21:41.109 INFO:teuthology.orchestra.run.vm07.stdout:Internal network (--cluster-network) has not been provided, OSD replication will default to the public_network 2026-03-09T19:21:41.109 INFO:teuthology.orchestra.run.vm07.stdout:Pulling container image quay.ceph.io/ceph-ci/ceph:reef... 2026-03-09T19:21:42.874 INFO:teuthology.orchestra.run.vm07.stdout:/bin/podman: stdout b6fe7eb6a9d0f9b143033c43e1cdf7ef0918719fc7cff0dd0e2c113bb482fdd6 2026-03-09T19:21:42.874 INFO:teuthology.orchestra.run.vm07.stdout:/bin/podman: stderr Trying to pull quay.ceph.io/ceph-ci/ceph:reef... 
2026-03-09T19:21:42.874 INFO:teuthology.orchestra.run.vm07.stdout:/bin/podman: stderr Getting image source signatures 2026-03-09T19:21:42.874 INFO:teuthology.orchestra.run.vm07.stdout:/bin/podman: stderr Copying blob sha256:8c0f38fb8a72d42ac81f075843e5360929f695c9f93c12951e7539b9ed9b1b5f 2026-03-09T19:21:42.874 INFO:teuthology.orchestra.run.vm07.stdout:/bin/podman: stderr Copying blob sha256:8e380faede39ebd4286247457b408d979ab568aafd8389c42ec304b8cfba4e92 2026-03-09T19:21:42.874 INFO:teuthology.orchestra.run.vm07.stdout:/bin/podman: stderr Copying config sha256:b6fe7eb6a9d0f9b143033c43e1cdf7ef0918719fc7cff0dd0e2c113bb482fdd6 2026-03-09T19:21:42.874 INFO:teuthology.orchestra.run.vm07.stdout:/bin/podman: stderr Writing manifest to image destination 2026-03-09T19:21:43.041 INFO:teuthology.orchestra.run.vm07.stdout:ceph: stdout ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable) 2026-03-09T19:21:43.041 INFO:teuthology.orchestra.run.vm07.stdout:Ceph version: ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable) 2026-03-09T19:21:43.041 INFO:teuthology.orchestra.run.vm07.stdout:Extracting ceph user uid/gid from container image... 2026-03-09T19:21:43.132 INFO:teuthology.orchestra.run.vm07.stdout:stat: stdout 167 167 2026-03-09T19:21:43.132 INFO:teuthology.orchestra.run.vm07.stdout:Creating initial keys... 2026-03-09T19:21:43.258 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph-authtool: stdout AQDHHa9pjTP4DBAArXgPgdTtq3ORjbF4yNgEYQ== 2026-03-09T19:21:43.898 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph-authtool: stdout AQDHHa9pH2juExAA9HKDaBO/ZoYZqMv8k4hu9w== 2026-03-09T19:21:44.005 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph-authtool: stdout AQDHHa9pr3N4OhAAOq/K5ejYcxSWXJtJMWFp+A== 2026-03-09T19:21:44.005 INFO:teuthology.orchestra.run.vm07.stdout:Creating initial monmap... 
2026-03-09T19:21:44.136 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: monmap file /tmp/monmap 2026-03-09T19:21:44.136 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/monmaptool: stdout setting min_mon_release = pacific 2026-03-09T19:21:44.136 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: set fsid to 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:21:44.136 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors) 2026-03-09T19:21:44.136 INFO:teuthology.orchestra.run.vm07.stdout:monmaptool for vm07 [v2:192.168.123.107:3300,v1:192.168.123.107:6789] on /usr/bin/monmaptool: monmap file /tmp/monmap 2026-03-09T19:21:44.136 INFO:teuthology.orchestra.run.vm07.stdout:setting min_mon_release = pacific 2026-03-09T19:21:44.136 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/monmaptool: set fsid to 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:21:44.136 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors) 2026-03-09T19:21:44.136 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:21:44.137 INFO:teuthology.orchestra.run.vm07.stdout:Creating mon... 2026-03-09T19:21:44.275 INFO:teuthology.orchestra.run.vm07.stdout:create mon.vm07 on 2026-03-09T19:21:44.433 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stderr Removed "/etc/systemd/system/multi-user.target.wants/ceph.target". 2026-03-09T19:21:44.573 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /etc/systemd/system/ceph.target. 2026-03-09T19:21:44.733 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594.target → /etc/systemd/system/ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594.target. 
2026-03-09T19:21:44.734 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph.target.wants/ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594.target → /etc/systemd/system/ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594.target. 2026-03-09T19:21:44.896 INFO:teuthology.orchestra.run.vm07.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm07 2026-03-09T19:21:44.897 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stderr Failed to reset failed state of unit ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm07.service: Unit ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm07.service not loaded. 2026-03-09T19:21:45.058 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594.target.wants/ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm07.service → /etc/systemd/system/ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@.service. 2026-03-09T19:21:45.263 INFO:teuthology.orchestra.run.vm07.stdout:firewalld does not appear to be present 2026-03-09T19:21:45.263 INFO:teuthology.orchestra.run.vm07.stdout:Not possible to enable service . firewalld.service is not available 2026-03-09T19:21:45.263 INFO:teuthology.orchestra.run.vm07.stdout:Waiting for mon to start... 2026-03-09T19:21:45.263 INFO:teuthology.orchestra.run.vm07.stdout:Waiting for mon... 
2026-03-09T19:21:45.649 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout cluster: 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout id: 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout health: HEALTH_OK 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout services: 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon: 1 daemons, quorum vm07 (age 0.209795s) 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mgr: no daemons active 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout osd: 0 osds: 0 up, 0 in 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout data: 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout pools: 0 pools, 0 pgs 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout objects: 0 objects, 0 B 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout usage: 0 B used, 0 B / 0 B avail 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout pgs: 2026-03-09T19:21:45.650 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.463+0000 7f8f0b04e640 1 Processor -- start 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.465+0000 7f8f0b04e640 1 -- start start 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.465+0000 
7f8f0b04e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f041086d0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.465+0000 7f8f0b04e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f04108ca0 con 0x7f8f041082d0 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.466+0000 7f8f08dc3640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f041086d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.466+0000 7f8f08dc3640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f041086d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35174/0 (socket says 192.168.123.107:35174) 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.466+0000 7f8f08dc3640 1 -- 192.168.123.107:0/3203914044 learned_addr learned my addr 192.168.123.107:0/3203914044 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.466+0000 7f8f08dc3640 1 -- 192.168.123.107:0/3203914044 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8f04109480 con 0x7f8f041082d0 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.468+0000 7f8f08dc3640 1 --2- 192.168.123.107:0/3203914044 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f041086d0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f8ef4009920 tx=0x7f8ef402ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=487345086da69edf server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.468+0000 7f8efb7fe640 1 -- 192.168.123.107:0/3203914044 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ef402f9b0 con 0x7f8f041082d0 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.468+0000 7f8efb7fe640 1 -- 192.168.123.107:0/3203914044 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f8ef402fb10 con 0x7f8f041082d0 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.468+0000 7f8efb7fe640 1 -- 192.168.123.107:0/3203914044 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ef402fde0 con 0x7f8f041082d0 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.468+0000 7f8f0b04e640 1 -- 192.168.123.107:0/3203914044 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 msgr2=0x7f8f041086d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.468+0000 7f8f0b04e640 1 --2- 192.168.123.107:0/3203914044 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f041086d0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f8ef4009920 tx=0x7f8ef402ef20 comp rx=0 tx=0).stop 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.469+0000 7f8f0b04e640 1 -- 192.168.123.107:0/3203914044 
shutdown_connections 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.469+0000 7f8f0b04e640 1 --2- 192.168.123.107:0/3203914044 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f041086d0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.469+0000 7f8f0b04e640 1 -- 192.168.123.107:0/3203914044 >> 192.168.123.107:0/3203914044 conn(0x7f8f0407b8f0 msgr2=0x7f8f041066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.469+0000 7f8f0b04e640 1 -- 192.168.123.107:0/3203914044 shutdown_connections 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.469+0000 7f8f0b04e640 1 -- 192.168.123.107:0/3203914044 wait complete. 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.469+0000 7f8f0b04e640 1 Processor -- start 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.470+0000 7f8f0b04e640 1 -- start start 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.470+0000 7f8f0b04e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f0419d590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.470+0000 7f8f0b04e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f0419dad0 con 0x7f8f041082d0 2026-03-09T19:21:45.651 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.470+0000 7f8f08dc3640 1 
--2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f0419d590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.470+0000 7f8f08dc3640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f0419d590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35188/0 (socket says 192.168.123.107:35188) 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.470+0000 7f8f08dc3640 1 -- 192.168.123.107:0/2507685446 learned_addr learned my addr 192.168.123.107:0/2507685446 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.470+0000 7f8f08dc3640 1 -- 192.168.123.107:0/2507685446 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8ef40095d0 con 0x7f8f041082d0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.471+0000 7f8f08dc3640 1 --2- 192.168.123.107:0/2507685446 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f0419d590 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f8ef4009a50 tx=0x7f8ef40047c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.471+0000 7f8ef9ffb640 1 -- 192.168.123.107:0/2507685446 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ef4041610 con 0x7f8f041082d0 2026-03-09T19:21:45.652 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.471+0000 7f8ef9ffb640 1 -- 192.168.123.107:0/2507685446 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f8ef4041770 con 0x7f8f041082d0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.471+0000 7f8f0b04e640 1 -- 192.168.123.107:0/2507685446 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8f0419dcd0 con 0x7f8f041082d0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.471+0000 7f8ef9ffb640 1 -- 192.168.123.107:0/2507685446 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ef4041920 con 0x7f8f041082d0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.471+0000 7f8f0b04e640 1 -- 192.168.123.107:0/2507685446 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8f0419e170 con 0x7f8f041082d0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.472+0000 7f8ef9ffb640 1 -- 192.168.123.107:0/2507685446 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7f8ef404a450 con 0x7f8f041082d0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.472+0000 7f8ef9ffb640 1 -- 192.168.123.107:0/2507685446 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f8ef40406e0 con 0x7f8f041082d0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.472+0000 7f8f0b04e640 1 -- 192.168.123.107:0/2507685446 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 
-- 0x7f8ed0005350 con 0x7f8f041082d0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.474+0000 7f8ef9ffb640 1 -- 192.168.123.107:0/2507685446 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7f8ef406a450 con 0x7f8f041082d0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.506+0000 7f8f0b04e640 1 -- 192.168.123.107:0/2507685446 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "status"} v 0) v1 -- 0x7f8ed00058d0 con 0x7f8f041082d0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.506+0000 7f8ef9ffb640 1 -- 192.168.123.107:0/2507685446 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "status"}]=0 v0) v1 ==== 54+0+320 (secure 0 0 0) 0x7f8ef4058030 con 0x7f8f041082d0 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.507+0000 7f8f0b04e640 1 -- 192.168.123.107:0/2507685446 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 msgr2=0x7f8f0419d590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.507+0000 7f8f0b04e640 1 --2- 192.168.123.107:0/2507685446 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f0419d590 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f8ef4009a50 tx=0x7f8ef40047c0 comp rx=0 tx=0).stop 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.507+0000 7f8f0b04e640 1 -- 192.168.123.107:0/2507685446 shutdown_connections 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.507+0000 7f8f0b04e640 1 --2- 
192.168.123.107:0/2507685446 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f041082d0 0x7f8f0419d590 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.507+0000 7f8f0b04e640 1 -- 192.168.123.107:0/2507685446 >> 192.168.123.107:0/2507685446 conn(0x7f8f0407b8f0 msgr2=0x7f8f04194470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.508+0000 7f8f0b04e640 1 -- 192.168.123.107:0/2507685446 shutdown_connections 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.508+0000 7f8f0b04e640 1 -- 192.168.123.107:0/2507685446 wait complete. 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:mon is available 2026-03-09T19:21:45.652 INFO:teuthology.orchestra.run.vm07.stdout:Assimilating anything we can from ceph.conf... 
2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout [global] 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout fsid = 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_cluster_log_file_level = debug 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_host = [v2:192.168.123.107:3300,v1:192.168.123.107:6789] 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout [mgr] 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout [osd] 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 
osd_map_max_advance = 10 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.784+0000 7fd818f1c640 1 Processor -- start 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.785+0000 7fd818f1c640 1 -- start start 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.785+0000 7fd818f1c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd81407cc80 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.785+0000 7fd818f1c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd81407d250 con 0x7fd81407c880 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.785+0000 7fd812575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd81407cc80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.785+0000 7fd812575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd81407cc80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35194/0 (socket says 192.168.123.107:35194) 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.785+0000 7fd812575640 1 -- 192.168.123.107:0/2773413037 learned_addr learned my addr 
192.168.123.107:0/2773413037 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.786+0000 7fd812575640 1 -- 192.168.123.107:0/2773413037 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd81407da80 con 0x7fd81407c880 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.787+0000 7fd812575640 1 --2- 192.168.123.107:0/2773413037 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd81407cc80 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7fd808009920 tx=0x7fd80802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=6fe4d838f2cd667f server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.787+0000 7fd811573640 1 -- 192.168.123.107:0/2773413037 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd80802f9b0 con 0x7fd81407c880 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.787+0000 7fd811573640 1 -- 192.168.123.107:0/2773413037 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fd80802fb10 con 0x7fd81407c880 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.787+0000 7fd811573640 1 -- 192.168.123.107:0/2773413037 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd80802fde0 con 0x7fd81407c880 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.787+0000 7fd818f1c640 1 -- 192.168.123.107:0/2773413037 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 msgr2=0x7fd81407cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.787+0000 7fd818f1c640 1 --2- 192.168.123.107:0/2773413037 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd81407cc80 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7fd808009920 tx=0x7fd80802ef20 comp rx=0 tx=0).stop 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.788+0000 7fd818f1c640 1 -- 192.168.123.107:0/2773413037 shutdown_connections 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.788+0000 7fd818f1c640 1 --2- 192.168.123.107:0/2773413037 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd81407cc80 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.788+0000 7fd818f1c640 1 -- 192.168.123.107:0/2773413037 >> 192.168.123.107:0/2773413037 conn(0x7fd81407b8f0 msgr2=0x7fd8141066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.788+0000 7fd818f1c640 1 -- 192.168.123.107:0/2773413037 shutdown_connections 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.788+0000 7fd818f1c640 1 -- 192.168.123.107:0/2773413037 wait complete. 
2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.788+0000 7fd818f1c640 1 Processor -- start 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.789+0000 7fd818f1c640 1 -- start start 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.789+0000 7fd818f1c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd8141aa450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.789+0000 7fd818f1c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8141aa990 con 0x7fd81407c880 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.789+0000 7fd812575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd8141aa450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.789+0000 7fd812575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd8141aa450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35204/0 (socket says 192.168.123.107:35204) 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.789+0000 7fd812575640 1 -- 192.168.123.107:0/4012913955 learned_addr learned my addr 192.168.123.107:0/4012913955 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:45.884 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.789+0000 7fd812575640 1 -- 192.168.123.107:0/4012913955 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd8080095d0 con 0x7fd81407c880 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.790+0000 7fd812575640 1 --2- 192.168.123.107:0/4012913955 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd8141aa450 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7fd808002410 tx=0x7fd8080047c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.790+0000 7fd7ff7fe640 1 -- 192.168.123.107:0/4012913955 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd808040610 con 0x7fd81407c880 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.790+0000 7fd7ff7fe640 1 -- 192.168.123.107:0/4012913955 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fd808040790 con 0x7fd81407c880 2026-03-09T19:21:45.884 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.790+0000 7fd7ff7fe640 1 -- 192.168.123.107:0/4012913955 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd808040940 con 0x7fd81407c880 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.790+0000 7fd818f1c640 1 -- 192.168.123.107:0/4012913955 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd8141aab90 con 0x7fd81407c880 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.790+0000 7fd818f1c640 1 -- 
192.168.123.107:0/4012913955 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd8141ab030 con 0x7fd81407c880 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.791+0000 7fd7ff7fe640 1 -- 192.168.123.107:0/4012913955 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7fd80804c450 con 0x7fd81407c880 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.791+0000 7fd7ff7fe640 1 -- 192.168.123.107:0/4012913955 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fd80803f6a0 con 0x7fd81407c880 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.791+0000 7fd818f1c640 1 -- 192.168.123.107:0/4012913955 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd7d8005350 con 0x7fd81407c880 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.793+0000 7fd7ff7fe640 1 -- 192.168.123.107:0/4012913955 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7fd80806c450 con 0x7fd81407c880 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.823+0000 7fd818f1c640 1 -- 192.168.123.107:0/4012913955 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7fd7d8003c00 con 0x7fd81407c880 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.831+0000 7fd7ff7fe640 1 -- 192.168.123.107:0/4012913955 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v2) v1 ==== 
70+0+471 (secure 0 0 0) 0x7fd80805a020 con 0x7fd81407c880 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.831+0000 7fd7ff7fe640 1 -- 192.168.123.107:0/4012913955 <== mon.0 v2:192.168.123.107:3300/0 8 ==== config(26 keys) v1 ==== 1003+0+0 (secure 0 0 0) 0x7fd80803fd60 con 0x7fd81407c880 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.833+0000 7fd818f1c640 1 -- 192.168.123.107:0/4012913955 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 msgr2=0x7fd8141aa450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.833+0000 7fd818f1c640 1 --2- 192.168.123.107:0/4012913955 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd8141aa450 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7fd808002410 tx=0x7fd8080047c0 comp rx=0 tx=0).stop 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.833+0000 7fd818f1c640 1 -- 192.168.123.107:0/4012913955 shutdown_connections 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.833+0000 7fd818f1c640 1 --2- 192.168.123.107:0/4012913955 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd81407c880 0x7fd8141aa450 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.833+0000 7fd818f1c640 1 -- 192.168.123.107:0/4012913955 >> 192.168.123.107:0/4012913955 conn(0x7fd81407b8f0 msgr2=0x7fd814105e60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.834+0000 7fd818f1c640 1 -- 192.168.123.107:0/4012913955 
shutdown_connections 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:45.834+0000 7fd818f1c640 1 -- 192.168.123.107:0/4012913955 wait complete. 2026-03-09T19:21:45.885 INFO:teuthology.orchestra.run.vm07.stdout:Generating new minimal ceph.conf... 2026-03-09T19:21:46.098 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.005+0000 7fc1204cf640 1 Processor -- start 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.005+0000 7fc1204cf640 1 -- start start 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.006+0000 7fc1204cf640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc1181052a0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.006+0000 7fc1204cf640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc1181057e0 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.006+0000 7fc11e244640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc1181052a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.006+0000 7fc11e244640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc1181052a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35208/0 (socket says 192.168.123.107:35208) 2026-03-09T19:21:46.099 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.006+0000 7fc11e244640 1 -- 192.168.123.107:0/97406461 learned_addr learned my addr 192.168.123.107:0/97406461 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.006+0000 7fc11e244640 1 -- 192.168.123.107:0/97406461 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc118105920 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.007+0000 7fc11e244640 1 --2- 192.168.123.107:0/97406461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc1181052a0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fc10c009920 tx=0x7fc10c0311b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=ae91b5c506251a33 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.007+0000 7fc11d242640 1 -- 192.168.123.107:0/97406461 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc10c031c40 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.007+0000 7fc11d242640 1 -- 192.168.123.107:0/97406461 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(26 keys) v1 ==== 1003+0+0 (secure 0 0 0) 0x7fc10c031da0 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.007+0000 7fc11d242640 1 -- 192.168.123.107:0/97406461 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc10c0376b0 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.008+0000 7fc1204cf640 1 -- 192.168.123.107:0/97406461 
>> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 msgr2=0x7fc1181052a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.008+0000 7fc1204cf640 1 --2- 192.168.123.107:0/97406461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc1181052a0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fc10c009920 tx=0x7fc10c0311b0 comp rx=0 tx=0).stop 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.008+0000 7fc1204cf640 1 -- 192.168.123.107:0/97406461 shutdown_connections 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.008+0000 7fc1204cf640 1 --2- 192.168.123.107:0/97406461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc1181052a0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.008+0000 7fc1204cf640 1 -- 192.168.123.107:0/97406461 >> 192.168.123.107:0/97406461 conn(0x7fc118100680 msgr2=0x7fc118102ac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.008+0000 7fc1204cf640 1 -- 192.168.123.107:0/97406461 shutdown_connections 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.008+0000 7fc1204cf640 1 -- 192.168.123.107:0/97406461 wait complete. 
2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.009+0000 7fc1204cf640 1 Processor -- start 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.009+0000 7fc1204cf640 1 -- start start 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.009+0000 7fc1204cf640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc118199780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.009+0000 7fc1204cf640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc118199cc0 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.009+0000 7fc11e244640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc118199780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.009+0000 7fc11e244640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc118199780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35212/0 (socket says 192.168.123.107:35212) 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.009+0000 7fc11e244640 1 -- 192.168.123.107:0/3944008978 learned_addr learned my addr 192.168.123.107:0/3944008978 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:46.099 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.009+0000 7fc11e244640 1 -- 192.168.123.107:0/3944008978 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc10c0095d0 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.010+0000 7fc11e244640 1 --2- 192.168.123.107:0/3944008978 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc118199780 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fc10c0098f0 tx=0x7fc10c039670 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.010+0000 7fc1077fe640 1 -- 192.168.123.107:0/3944008978 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc10c00e270 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.010+0000 7fc1077fe640 1 -- 192.168.123.107:0/3944008978 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(26 keys) v1 ==== 1003+0+0 (secure 0 0 0) 0x7fc10c037e20 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.011+0000 7fc1077fe640 1 -- 192.168.123.107:0/3944008978 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc10c042360 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.011+0000 7fc1204cf640 1 -- 192.168.123.107:0/3944008978 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc118199f20 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.011+0000 7fc1204cf640 1 -- 
192.168.123.107:0/3944008978 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc11819a3a0 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.012+0000 7fc1077fe640 1 -- 192.168.123.107:0/3944008978 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7fc10c044460 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.012+0000 7fc1077fe640 1 -- 192.168.123.107:0/3944008978 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fc10c042df0 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.012+0000 7fc1204cf640 1 -- 192.168.123.107:0/3944008978 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc1181052a0 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.014+0000 7fc1077fe640 1 -- 192.168.123.107:0/3944008978 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7fc10c049070 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.045+0000 7fc1204cf640 1 -- 192.168.123.107:0/3944008978 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7fc118106c60 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.045+0000 7fc1077fe640 1 -- 192.168.123.107:0/3944008978 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v2) v1 
==== 76+0+181 (secure 0 0 0) 0x7fc10c03e070 con 0x7fc118104ea0 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.047+0000 7fc1204cf640 1 -- 192.168.123.107:0/3944008978 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 msgr2=0x7fc118199780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.047+0000 7fc1204cf640 1 --2- 192.168.123.107:0/3944008978 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc118199780 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fc10c0098f0 tx=0x7fc10c039670 comp rx=0 tx=0).stop 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.047+0000 7fc1204cf640 1 -- 192.168.123.107:0/3944008978 shutdown_connections 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.047+0000 7fc1204cf640 1 --2- 192.168.123.107:0/3944008978 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc118104ea0 0x7fc118199780 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.047+0000 7fc1204cf640 1 -- 192.168.123.107:0/3944008978 >> 192.168.123.107:0/3944008978 conn(0x7fc118100680 msgr2=0x7fc11818ffa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.047+0000 7fc1204cf640 1 -- 192.168.123.107:0/3944008978 shutdown_connections 2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.047+0000 7fc1204cf640 1 -- 192.168.123.107:0/3944008978 wait complete. 
2026-03-09T19:21:46.099 INFO:teuthology.orchestra.run.vm07.stdout:Restarting the monitor... 2026-03-09T19:21:46.193 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 systemd[1]: Stopping Ceph mon.vm07 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:21:46.454 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07[48253]: 2026-03-09T19:21:46.193+0000 7fc5f2cbd640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm07 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:21:46.454 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07[48253]: 2026-03-09T19:21:46.193+0000 7fc5f2cbd640 -1 mon.vm07@0(leader) e1 *** Got Signal Terminated *** 2026-03-09T19:21:46.454 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 podman[48461]: 2026-03-09 19:21:46.239378968 +0000 UTC m=+0.061973546 container died 7c655a9264eaf08d5c18fa0bd26dcc75b794b5597a3d04f48c82536efef3d63d (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_REF=reef, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, 
org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T19:21:46.454 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 podman[48461]: 2026-03-09 19:21:46.258061603 +0000 UTC m=+0.080656181 container remove 7c655a9264eaf08d5c18fa0bd26dcc75b794b5597a3d04f48c82536efef3d63d (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T19:21:46.454 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 bash[48461]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07 2026-03-09T19:21:46.454 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm07.service: Deactivated successfully. 2026-03-09T19:21:46.454 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 systemd[1]: Stopped Ceph mon.vm07 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 2026-03-09T19:21:46.454 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 systemd[1]: Starting Ceph mon.vm07 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 
2026-03-09T19:21:46.454 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 podman[48531]: 2026-03-09 19:21:46.4317393 +0000 UTC m=+0.017782211 container create ccb644205fb3015552c6dee0fd883e9274481ccc83c6a2d83d43307275007c07 (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, CEPH_REF=reef) 2026-03-09T19:21:46.478 INFO:teuthology.orchestra.run.vm07.stdout:Setting public_network to 192.168.123.0/24 in global config section 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 podman[48531]: 2026-03-09 19:21:46.470164359 +0000 UTC m=+0.056207271 container init ccb644205fb3015552c6dee0fd883e9274481ccc83c6a2d83d43307275007c07 (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS) 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 podman[48531]: 2026-03-09 19:21:46.473630854 +0000 UTC m=+0.059673765 container start ccb644205fb3015552c6dee0fd883e9274481ccc83c6a2d83d43307275007c07 (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223) 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 bash[48531]: ccb644205fb3015552c6dee0fd883e9274481ccc83c6a2d83d43307275007c07 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 podman[48531]: 2026-03-09 19:21:46.425307699 +0000 UTC m=+0.011350621 image pull b6fe7eb6a9d0f9b143033c43e1cdf7ef0918719fc7cff0dd0e2c113bb482fdd6 quay.ceph.io/ceph-ci/ceph:reef 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 systemd[1]: Started Ceph mon.vm07 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 
2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: set uid:gid to 167:167 (ceph:ceph) 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable), process ceph-mon, pid 2 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: pidfile_write: ignore empty --pid-file 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: load: jerasure load: lrc 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: RocksDB version: 7.9.2 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Git sha 0 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Compile date 2026-02-26 02:56:47 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: DB SUMMARY 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: DB Session ID: MHDYJ989SEPIT71PMIRZ 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: CURRENT file: CURRENT 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: IDENTITY file: IDENTITY 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: MANIFEST file: MANIFEST-000010 size: 179 Bytes 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm07/store.db dir, Total Num: 1, files: 000008.sst 2026-03-09T19:21:46.720 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm07/store.db: 000009.log size: 88970 ; 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.error_if_exists: 0 2026-03-09T19:21:46.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.create_if_missing: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.paranoid_checks: 1 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.env: 0x55db64682ee0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.fs: PosixFileSystem 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.info_log: 0x55db66a601e0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_file_opening_threads: 16 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.statistics: (nil) 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.use_fsync: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 
vm07 ceph-mon[48545]: rocksdb: Options.max_log_file_size: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.keep_log_file_num: 1000 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.recycle_log_file_num: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.allow_fallocate: 1 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.allow_mmap_reads: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.allow_mmap_writes: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.use_direct_reads: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.create_missing_column_families: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.db_log_dir: 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.wal_dir: 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-09T19:21:46.721 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.advise_random_on_open: 1 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.db_write_buffer_size: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.write_buffer_manager: 0x55db66a703c0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.rate_limiter: (nil) 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-09T19:21:46.721 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.wal_recovery_mode: 2 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.enable_thread_tracking: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.enable_pipelined_write: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.unordered_write: 0 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-09T19:21:46.721 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.row_cache: None 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.wal_filter: None 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.allow_ingest_behind: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.two_write_queues: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: 
Options.manual_wal_flush: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.wal_compression: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.atomic_flush: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.log_readahead_size: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.best_efforts_recovery: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.allow_data_in_errors: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.db_host_id: __hostname__ 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-09T19:21:46.722 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_background_jobs: 2 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_background_compactions: -1 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_subcompactions: 1 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_total_wal_size: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_open_files: -1 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bytes_per_sync: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_readahead_size: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_background_flushes: -1 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Compression algorithms supported: 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: kZSTD supported: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: kXpressCompression supported: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: kBZip2Compression supported: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: kLZ4Compression supported: 1 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: kZlibCompression supported: 1 2026-03-09T19:21:46.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: kLZ4HCCompression supported: 1 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: kSnappyCompression supported: 1 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-09T19:21:46.723 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm07/store.db/MANIFEST-000010 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.merge_operator: 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_filter: None 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_filter_factory: None 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.sst_partitioner_factory: None 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55db66ab1c80) 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: cache_index_and_filter_blocks: 1 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 
cache_index_and_filter_blocks_with_high_priority: 0 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: pin_top_level_index_and_filter: 1 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: index_type: 0 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: data_block_index_type: 0 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: index_shortening: 1 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: checksum: 4 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: no_block_cache: 0 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache: 0x55db66a83350 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache_name: BinnedLRUCache 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache_options: 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: capacity : 536870912 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: num_shard_bits : 4 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: strict_capacity_limit : 0 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: high_pri_pool_ratio: 0.000 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache_compressed: (nil) 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: persistent_cache: (nil) 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_size: 4096 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_size_deviation: 10 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_restart_interval: 16 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: index_block_restart_interval: 1 
2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: metadata_block_size: 4096 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: partition_filters: 0 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: use_delta_encoding: 1 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: filter_policy: bloomfilter 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: whole_key_filtering: 1 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: verify_compression: 0 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: read_amp_bytes_per_bit: 0 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: format_version: 5 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: enable_index_compression: 1 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_align: 0 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: max_auto_readahead_size: 262144 2026-03-09T19:21:46.723 INFO:journalctl@ceph.mon.vm07.vm07.stdout: prepopulate_block_cache: 0 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout: initial_auto_readahead_size: 8192 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout: num_file_reads_for_auto_readahead: 2 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.write_buffer_size: 33554432 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_write_buffer_number: 2 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compression: NoCompression 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bottommost_compression: Disabled 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: 
rocksdb: Options.prefix_extractor: nullptr 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.num_levels: 7 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bottommost_compression_opts.enabled: false 
2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compression_opts.level: 32767 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compression_opts.strategy: 0 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compression_opts.enabled: false 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.target_file_size_base: 67108864 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-09T19:21:46.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.arena_block_size: 1048576 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.disable_auto_compactions: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 
2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.inplace_update_support: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-09T19:21:46.725 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.bloom_locality: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.max_successive_merges: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.paranoid_file_checks: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.force_consistency_checks: 1 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.report_bg_io_stats: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.ttl: 2592000 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.enable_blob_files: false 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.min_blob_size: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: 
Options.blob_file_size: 268435456 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.blob_file_starting_level: 0 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm07/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 2026-03-09T19:21:46.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: b4b8e1cf-65f6-4df8-9145-1b98b7a045ce 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:21:46 vm07 ceph-mon[48545]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773084106501198, "job": 1, "event": "recovery_started", "wal_files": [9]} 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773084106506676, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 84616, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 285, "table_properties": {"data_size": 82738, "index_size": 203, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 645, "raw_key_size": 13151, "raw_average_key_size": 51, "raw_value_size": 75663, "raw_average_value_size": 295, "num_data_blocks": 9, "num_entries": 256, "num_filter_entries": 256, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773084106, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b4b8e1cf-65f6-4df8-9145-1b98b7a045ce", "db_session_id": "MHDYJ989SEPIT71PMIRZ", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}} 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:21:46 vm07 ceph-mon[48545]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773084106506734, "job": 1, "event": "recovery_finished"} 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: [db/version_set.cc:5047] Creating manifest 15 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm07/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55db66a84e00 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: rocksdb: DB pointer 0x55db66b92000 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: starting mon.vm07 rank 0 at public addrs [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] at bind addrs [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] mon_data /var/lib/ceph/mon/ceph-vm07 fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: mon.vm07@-1(???) 
e1 preinit fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: mon.vm07@-1(???).mds e1 new map 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: mon.vm07@-1(???).mds e1 print_map 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout: e1 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout: legacy client fscid: -1 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout: No filesystems configured 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: mon.vm07@-1(???).osd e1 crush map has features 3314932999778484224, adjusting msgr requires 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: mon.vm07@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: mon.vm07@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: mon.vm07@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: 
mon.vm07@-1(???).paxosservice(auth 1..2) refresh upgraded, format 0 -> 3 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: mon.vm07 is new leader, mons vm07 in quorum (ranks 0) 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: monmap e1: 1 mons at {vm07=[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: fsmap 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: osdmap e1: 0 total, 0 up, 0 in 2026-03-09T19:21:46.726 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:46 vm07 ceph-mon[48545]: mgrmap e1: no daemons active 2026-03-09T19:21:46.729 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.612+0000 7f0bb48e2640 1 Processor -- start 2026-03-09T19:21:46.730 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.614+0000 7f0bb48e2640 1 -- start start 2026-03-09T19:21:46.730 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.614+0000 7f0bb48e2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac1086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:46.730 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.614+0000 7f0bb48e2640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0bac108cc0 con 0x7f0bac1082f0 2026-03-09T19:21:46.730 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.615+0000 7f0bb2657640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac1086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:46.730 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.615+0000 7f0bb2657640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac1086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35214/0 (socket says 192.168.123.107:35214) 2026-03-09T19:21:46.730 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.615+0000 7f0bb2657640 1 -- 192.168.123.107:0/3357208565 learned_addr learned my addr 192.168.123.107:0/3357208565 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:46.730 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.615+0000 7f0bb2657640 1 -- 192.168.123.107:0/3357208565 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0bac1094a0 con 0x7f0bac1082f0 2026-03-09T19:21:46.730 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.617+0000 7f0bb2657640 1 --2- 192.168.123.107:0/3357208565 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac1086f0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f0b98009920 tx=0x7f0b9802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=735c4c4193b75b4 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.617+0000 7f0bb1655640 1 -- 192.168.123.107:0/3357208565 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0b9802f9b0 con 0x7f0bac1082f0 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.617+0000 7f0bb1655640 1 -- 192.168.123.107:0/3357208565 <== mon.0 
v2:192.168.123.107:3300/0 2 ==== config(26 keys) v1 ==== 1003+0+0 (secure 0 0 0) 0x7f0b98037440 con 0x7f0bac1082f0 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.617+0000 7f0bb1655640 1 -- 192.168.123.107:0/3357208565 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0b980354e0 con 0x7f0bac1082f0 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.618+0000 7f0bb48e2640 1 -- 192.168.123.107:0/3357208565 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 msgr2=0x7f0bac1086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.618+0000 7f0bb48e2640 1 --2- 192.168.123.107:0/3357208565 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac1086f0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f0b98009920 tx=0x7f0b9802ef20 comp rx=0 tx=0).stop 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.618+0000 7f0bb48e2640 1 -- 192.168.123.107:0/3357208565 shutdown_connections 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.618+0000 7f0bb48e2640 1 --2- 192.168.123.107:0/3357208565 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac1086f0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.618+0000 7f0bb48e2640 1 -- 192.168.123.107:0/3357208565 >> 192.168.123.107:0/3357208565 conn(0x7f0bac07ba00 msgr2=0x7f0bac1066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:21:46.619+0000 7f0bb48e2640 1 -- 192.168.123.107:0/3357208565 shutdown_connections 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.619+0000 7f0bb48e2640 1 -- 192.168.123.107:0/3357208565 wait complete. 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.619+0000 7f0bb48e2640 1 Processor -- start 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.620+0000 7f0bb48e2640 1 -- start start 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.620+0000 7f0bb48e2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac07fc50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.620+0000 7f0bb48e2640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0bac080190 con 0x7f0bac1082f0 2026-03-09T19:21:46.731 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.620+0000 7f0bb2657640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac07fc50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.620+0000 7f0bb2657640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac07fc50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35224/0 (socket says 192.168.123.107:35224) 2026-03-09T19:21:46.732 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.620+0000 7f0bb2657640 1 -- 192.168.123.107:0/2455675245 learned_addr learned my addr 192.168.123.107:0/2455675245 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.621+0000 7f0bb2657640 1 -- 192.168.123.107:0/2455675245 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0b980095d0 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.621+0000 7f0bb2657640 1 --2- 192.168.123.107:0/2455675245 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac07fc50 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f0b980098f0 tx=0x7f0b98035340 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.621+0000 7f0b977fe640 1 -- 192.168.123.107:0/2455675245 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0b98035690 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.621+0000 7f0b977fe640 1 -- 192.168.123.107:0/2455675245 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(26 keys) v1 ==== 1003+0+0 (secure 0 0 0) 0x7f0b98035c40 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.622+0000 7f0b977fe640 1 -- 192.168.123.107:0/2455675245 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0b98040c60 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.622+0000 7f0bb48e2640 1 -- 
192.168.123.107:0/2455675245 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0bac080390 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.622+0000 7f0bb48e2640 1 -- 192.168.123.107:0/2455675245 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0bac07c790 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.623+0000 7f0b977fe640 1 -- 192.168.123.107:0/2455675245 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7f0b98049400 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.623+0000 7f0b977fe640 1 -- 192.168.123.107:0/2455675245 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f0b98035db0 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.623+0000 7f0bb48e2640 1 -- 192.168.123.107:0/2455675245 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0bac108770 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.626+0000 7f0b977fe640 1 -- 192.168.123.107:0/2455675245 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7f0b98044020 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.657+0000 7f0bb48e2640 1 -- 192.168.123.107:0/2455675245 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command([{prefix=config set, name=public_network}] v 0) v1 -- 0x7f0bac07cc30 con 
0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.660+0000 7f0b977fe640 1 -- 192.168.123.107:0/2455675245 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{prefix=config set, name=public_network}]=0 v3) v1 ==== 130+0+0 (secure 0 0 0) 0x7f0b980580c0 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.661+0000 7f0b977fe640 1 -- 192.168.123.107:0/2455675245 <== mon.0 v2:192.168.123.107:3300/0 8 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f0b98049ca0 con 0x7f0bac1082f0 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.662+0000 7f0bb48e2640 1 -- 192.168.123.107:0/2455675245 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 msgr2=0x7f0bac07fc50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.662+0000 7f0bb48e2640 1 --2- 192.168.123.107:0/2455675245 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac07fc50 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f0b980098f0 tx=0x7f0b98035340 comp rx=0 tx=0).stop 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.663+0000 7f0bb48e2640 1 -- 192.168.123.107:0/2455675245 shutdown_connections 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.663+0000 7f0bb48e2640 1 --2- 192.168.123.107:0/2455675245 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bac1082f0 0x7f0bac07fc50 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.663+0000 7f0bb48e2640 1 -- 
192.168.123.107:0/2455675245 >> 192.168.123.107:0/2455675245 conn(0x7f0bac07ba00 msgr2=0x7f0bac105d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.663+0000 7f0bb48e2640 1 -- 192.168.123.107:0/2455675245 shutdown_connections 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:46.663+0000 7f0bb48e2640 1 -- 192.168.123.107:0/2455675245 wait complete. 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:Wrote config to /etc/ceph/ceph.conf 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:Wrote keyring to /etc/ceph/ceph.client.admin.keyring 2026-03-09T19:21:46.732 INFO:teuthology.orchestra.run.vm07.stdout:Creating mgr... 2026-03-09T19:21:46.733 INFO:teuthology.orchestra.run.vm07.stdout:Verifying port 0.0.0.0:9283 ... 2026-03-09T19:21:46.733 INFO:teuthology.orchestra.run.vm07.stdout:Verifying port 0.0.0.0:8765 ... 2026-03-09T19:21:46.733 INFO:teuthology.orchestra.run.vm07.stdout:Verifying port 0.0.0.0:8443 ... 2026-03-09T19:21:46.897 INFO:teuthology.orchestra.run.vm07.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mgr.vm07.xacuym 2026-03-09T19:21:46.897 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stderr Failed to reset failed state of unit ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mgr.vm07.xacuym.service: Unit ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mgr.vm07.xacuym.service not loaded. 2026-03-09T19:21:47.040 INFO:teuthology.orchestra.run.vm07.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594.target.wants/ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mgr.vm07.xacuym.service → /etc/systemd/system/ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@.service. 
2026-03-09T19:21:47.227 INFO:teuthology.orchestra.run.vm07.stdout:firewalld does not appear to be present 2026-03-09T19:21:47.227 INFO:teuthology.orchestra.run.vm07.stdout:Not possible to enable service . firewalld.service is not available 2026-03-09T19:21:47.227 INFO:teuthology.orchestra.run.vm07.stdout:firewalld does not appear to be present 2026-03-09T19:21:47.227 INFO:teuthology.orchestra.run.vm07.stdout:Not possible to open ports <[9283, 8765, 8443]>. firewalld.service is not available 2026-03-09T19:21:47.227 INFO:teuthology.orchestra.run.vm07.stdout:Waiting for mgr to start... 2026-03-09T19:21:47.227 INFO:teuthology.orchestra.run.vm07.stdout:Waiting for mgr... 2026-03-09T19:21:47.504 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:47.504 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout { 2026-03-09T19:21:47.504 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "fsid": "17715774-1bed-11f1-9ad8-1bc9d74ff594", 2026-03-09T19:21:47.504 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "health": { 2026-03-09T19:21:47.504 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-09T19:21:47.504 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-09T19:21:47.504 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-09T19:21:47.504 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:47.504 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-09T19:21:47.504 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 0 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout ], 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "quorum_names": [ 
2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "vm07" 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout ], 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "quorum_age": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-09T19:21:47.505 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 
"restful" 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout ], 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "modified": "2026-03-09T19:21:45.300246+0000", 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout } 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.384+0000 7fb9d195b640 1 Processor -- start 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.385+0000 7fb9d195b640 1 -- start start 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.385+0000 7fb9d195b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc071820 0x7fb9cc071c20 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.385+0000 7fb9d195b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9cc0721f0 con 0x7fb9cc071820 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:21:47.386+0000 7fb9d0959640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc071820 0x7fb9cc071c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.386+0000 7fb9d0959640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc071820 0x7fb9cc071c20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35260/0 (socket says 192.168.123.107:35260) 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.386+0000 7fb9d0959640 1 -- 192.168.123.107:0/3874298051 learned_addr learned my addr 192.168.123.107:0/3874298051 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.387+0000 7fb9d0959640 1 -- 192.168.123.107:0/3874298051 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9cc072330 con 0x7fb9cc071820 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.388+0000 7fb9d0959640 1 --2- 192.168.123.107:0/3874298051 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc071820 0x7fb9cc071c20 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fb9bc0089a0 tx=0x7fb9bc031440 comp rx=0 tx=0).ready entity=mon.0 client_cookie=118256b0c209f5c7 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:47.505 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.388+0000 7fb9cb7fe640 1 -- 192.168.123.107:0/3874298051 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9bc031e50 con 
0x7fb9cc071820 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.388+0000 7fb9cb7fe640 1 -- 192.168.123.107:0/3874298051 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7fb9bc035070 con 0x7fb9cc071820 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.389+0000 7fb9d195b640 1 -- 192.168.123.107:0/3874298051 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc071820 msgr2=0x7fb9cc071c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.389+0000 7fb9d195b640 1 --2- 192.168.123.107:0/3874298051 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc071820 0x7fb9cc071c20 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fb9bc0089a0 tx=0x7fb9bc031440 comp rx=0 tx=0).stop 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.389+0000 7fb9d195b640 1 -- 192.168.123.107:0/3874298051 shutdown_connections 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.389+0000 7fb9d195b640 1 --2- 192.168.123.107:0/3874298051 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc071820 0x7fb9cc071c20 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.389+0000 7fb9d195b640 1 -- 192.168.123.107:0/3874298051 >> 192.168.123.107:0/3874298051 conn(0x7fb9cc06d060 msgr2=0x7fb9cc06f480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.390+0000 7fb9d195b640 1 -- 192.168.123.107:0/3874298051 shutdown_connections 2026-03-09T19:21:47.506 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.390+0000 7fb9d195b640 1 -- 192.168.123.107:0/3874298051 wait complete. 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.390+0000 7fb9d195b640 1 Processor -- start 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.390+0000 7fb9d195b640 1 -- start start 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.390+0000 7fb9d195b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc1a2430 0x7fb9cc1a2850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.390+0000 7fb9d195b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9bc03b720 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.391+0000 7fb9d0959640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc1a2430 0x7fb9cc1a2850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.391+0000 7fb9d0959640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc1a2430 0x7fb9cc1a2850 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35268/0 (socket says 192.168.123.107:35268) 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.391+0000 7fb9d0959640 1 -- 192.168.123.107:0/3900274888 learned_addr 
learned my addr 192.168.123.107:0/3900274888 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.391+0000 7fb9d0959640 1 -- 192.168.123.107:0/3900274888 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9bc008650 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.392+0000 7fb9d0959640 1 --2- 192.168.123.107:0/3900274888 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc1a2430 0x7fb9cc1a2850 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fb9bc008970 tx=0x7fb9bc008c70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.393+0000 7fb9c9ffb640 1 -- 192.168.123.107:0/3900274888 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9bc03c480 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.393+0000 7fb9d195b640 1 -- 192.168.123.107:0/3900274888 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb9cc1a2d90 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.393+0000 7fb9d195b640 1 -- 192.168.123.107:0/3900274888 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb9cc1a5930 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.393+0000 7fb9c9ffb640 1 -- 192.168.123.107:0/3900274888 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7fb9bc035040 con 0x7fb9cc1a2430 
2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.393+0000 7fb9c9ffb640 1 -- 192.168.123.107:0/3900274888 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9bc00b5c0 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.394+0000 7fb9c9ffb640 1 -- 192.168.123.107:0/3900274888 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7fb9bc046400 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.395+0000 7fb9c9ffb640 1 -- 192.168.123.107:0/3900274888 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fb9bc037070 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.395+0000 7fb9d195b640 1 -- 192.168.123.107:0/3900274888 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb994005350 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.396+0000 7fb9c9ffb640 1 -- 192.168.123.107:0/3900274888 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7fb9bc04b050 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.436+0000 7fb9d195b640 1 -- 192.168.123.107:0/3900274888 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7fb9940051c0 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.436+0000 7fb9c9ffb640 1 -- 
192.168.123.107:0/3900274888 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7fb9bc042070 con 0x7fb9cc1a2430 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.440+0000 7fb99b7fe640 1 -- 192.168.123.107:0/3900274888 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc1a2430 msgr2=0x7fb9cc1a2850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.440+0000 7fb99b7fe640 1 --2- 192.168.123.107:0/3900274888 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc1a2430 0x7fb9cc1a2850 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fb9bc008970 tx=0x7fb9bc008c70 comp rx=0 tx=0).stop 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.440+0000 7fb99b7fe640 1 -- 192.168.123.107:0/3900274888 shutdown_connections 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.440+0000 7fb99b7fe640 1 --2- 192.168.123.107:0/3900274888 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9cc1a2430 0x7fb9cc1a2850 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.440+0000 7fb99b7fe640 1 -- 192.168.123.107:0/3900274888 >> 192.168.123.107:0/3900274888 conn(0x7fb9cc06d060 msgr2=0x7fb9cc06e9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.440+0000 7fb99b7fe640 1 -- 192.168.123.107:0/3900274888 shutdown_connections 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:47.440+0000 
7fb99b7fe640 1 -- 192.168.123.107:0/3900274888 wait complete. 2026-03-09T19:21:47.506 INFO:teuthology.orchestra.run.vm07.stdout:mgr not available, waiting (1/15)... 2026-03-09T19:21:47.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:47 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/2455675245' entity='client.admin' 2026-03-09T19:21:47.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:47 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/3900274888' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout { 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "fsid": "17715774-1bed-11f1-9ad8-1bc9d74ff594", 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "health": { 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 0 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout ], 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "vm07" 
2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout ], 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "quorum_age": 3, 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-09T19:21:49.747 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-09T19:21:49.748 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-09T19:21:49.749 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-09T19:21:49.749 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-09T19:21:49.749 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:49.749 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-09T19:21:49.749 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-09T19:21:49.749 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-09T19:21:49.749 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-09T19:21:49.749 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-09T19:21:49.749 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-09T19:21:49.749 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-09T19:21:49.749 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "restful" 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout ], 
2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "modified": "2026-03-09T19:21:45.300246+0000", 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout } 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.669+0000 7f7e74da5640 1 Processor -- start 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.670+0000 7f7e74da5640 1 -- start start 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.670+0000 7f7e74da5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e70071cd0 0x7f7e700720d0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.670+0000 7f7e74da5640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e70072610 con 0x7f7e70071cd0 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.670+0000 7f7e6e575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f7e70071cd0 0x7f7e700720d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.670+0000 7f7e6e575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e70071cd0 0x7f7e700720d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44526/0 (socket says 192.168.123.107:44526) 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.670+0000 7f7e6e575640 1 -- 192.168.123.107:0/2859534379 learned_addr learned my addr 192.168.123.107:0/2859534379 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.670+0000 7f7e6e575640 1 -- 192.168.123.107:0/2859534379 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7e70072750 con 0x7f7e70071cd0 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.671+0000 7f7e6e575640 1 --2- 192.168.123.107:0/2859534379 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e70071cd0 0x7f7e700720d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f7e600089a0 tx=0x7f7e60031440 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3a267e5281c7c93 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.672+0000 7f7e6d573640 1 -- 192.168.123.107:0/2859534379 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7e60031e50 con 0x7f7e70071cd0 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:21:49.672+0000 7f7e6d573640 1 -- 192.168.123.107:0/2859534379 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f7e60035070 con 0x7f7e70071cd0 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.672+0000 7f7e74da5640 1 -- 192.168.123.107:0/2859534379 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e70071cd0 msgr2=0x7f7e700720d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.672+0000 7f7e74da5640 1 --2- 192.168.123.107:0/2859534379 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e70071cd0 0x7f7e700720d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f7e600089a0 tx=0x7f7e60031440 comp rx=0 tx=0).stop 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.673+0000 7f7e74da5640 1 -- 192.168.123.107:0/2859534379 shutdown_connections 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.673+0000 7f7e74da5640 1 --2- 192.168.123.107:0/2859534379 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e70071cd0 0x7f7e700720d0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.673+0000 7f7e74da5640 1 -- 192.168.123.107:0/2859534379 >> 192.168.123.107:0/2859534379 conn(0x7f7e7006d2a0 msgr2=0x7f7e7006f6e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.674+0000 7f7e74da5640 1 -- 192.168.123.107:0/2859534379 shutdown_connections 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.674+0000 7f7e74da5640 
1 -- 192.168.123.107:0/2859534379 wait complete. 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.674+0000 7f7e74da5640 1 Processor -- start 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.674+0000 7f7e74da5640 1 -- start start 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.674+0000 7f7e74da5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e701a2640 0x7f7e701a2a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.674+0000 7f7e74da5640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e6003b720 con 0x7f7e701a2640 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.675+0000 7f7e6e575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e701a2640 0x7f7e701a2a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.675+0000 7f7e6e575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e701a2640 0x7f7e701a2a60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44536/0 (socket says 192.168.123.107:44536) 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.675+0000 7f7e6e575640 1 -- 192.168.123.107:0/1618256251 learned_addr learned my addr 192.168.123.107:0/1618256251 (peer_addr_for_me v2:192.168.123.107:0/0) 
2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.675+0000 7f7e6e575640 1 -- 192.168.123.107:0/1618256251 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7e60008650 con 0x7f7e701a2640 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.675+0000 7f7e6e575640 1 --2- 192.168.123.107:0/1618256251 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e701a2640 0x7f7e701a2a60 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f7e60008970 tx=0x7f7e60008c70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.676+0000 7f7e5f7fe640 1 -- 192.168.123.107:0/1618256251 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7e6003c480 con 0x7f7e701a2640 2026-03-09T19:21:49.750 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.676+0000 7f7e74da5640 1 -- 192.168.123.107:0/1618256251 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7e701a2fa0 con 0x7f7e701a2640 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.676+0000 7f7e74da5640 1 -- 192.168.123.107:0/1618256251 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7e701a3b30 con 0x7f7e701a2640 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.677+0000 7f7e5f7fe640 1 -- 192.168.123.107:0/1618256251 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f7e60035040 con 0x7f7e701a2640 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:21:49.677+0000 7f7e74da5640 1 -- 192.168.123.107:0/1618256251 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7e34005350 con 0x7f7e701a2640 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.679+0000 7f7e5f7fe640 1 -- 192.168.123.107:0/1618256251 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7e6003b8c0 con 0x7f7e701a2640 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.679+0000 7f7e5f7fe640 1 -- 192.168.123.107:0/1618256251 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 1) v1 ==== 672+0+0 (secure 0 0 0) 0x7f7e6003c8e0 con 0x7f7e701a2640 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.679+0000 7f7e5f7fe640 1 -- 192.168.123.107:0/1618256251 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f7e60037070 con 0x7f7e701a2640 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.680+0000 7f7e5f7fe640 1 -- 192.168.123.107:0/1618256251 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+74806 (secure 0 0 0) 0x7f7e6004b050 con 0x7f7e701a2640 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.714+0000 7f7e74da5640 1 -- 192.168.123.107:0/1618256251 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f7e340051c0 con 0x7f7e701a2640 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.715+0000 7f7e5f7fe640 1 -- 192.168.123.107:0/1618256251 <== mon.0 v2:192.168.123.107:3300/0 7 ==== 
mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f7e60042070 con 0x7f7e701a2640 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.717+0000 7f7e5d7fa640 1 -- 192.168.123.107:0/1618256251 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e701a2640 msgr2=0x7f7e701a2a60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.717+0000 7f7e5d7fa640 1 --2- 192.168.123.107:0/1618256251 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e701a2640 0x7f7e701a2a60 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f7e60008970 tx=0x7f7e60008c70 comp rx=0 tx=0).stop 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.717+0000 7f7e5d7fa640 1 -- 192.168.123.107:0/1618256251 shutdown_connections 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.717+0000 7f7e5d7fa640 1 --2- 192.168.123.107:0/1618256251 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7e701a2640 0x7f7e701a2a60 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.717+0000 7f7e5d7fa640 1 -- 192.168.123.107:0/1618256251 >> 192.168.123.107:0/1618256251 conn(0x7f7e7006d2a0 msgr2=0x7f7e7006dab0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.717+0000 7f7e5d7fa640 1 -- 192.168.123.107:0/1618256251 shutdown_connections 2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:49.717+0000 7f7e5d7fa640 1 -- 192.168.123.107:0/1618256251 wait complete. 
2026-03-09T19:21:49.751 INFO:teuthology.orchestra.run.vm07.stdout:mgr not available, waiting (2/15)... 2026-03-09T19:21:49.966 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:49 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/1618256251' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-09T19:21:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: Activating manager daemon vm07.xacuym 2026-03-09T19:21:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: mgrmap e2: vm07.xacuym(active, starting, since 0.00368758s) 2026-03-09T19:21:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: from='mgr.14100 192.168.123.107:0/1455673162' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T19:21:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: from='mgr.14100 192.168.123.107:0/1455673162' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T19:21:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: from='mgr.14100 192.168.123.107:0/1455673162' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T19:21:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: from='mgr.14100 192.168.123.107:0/1455673162' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:21:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: from='mgr.14100 192.168.123.107:0/1455673162' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm07.xacuym", "id": "vm07.xacuym"}]: dispatch 2026-03-09T19:21:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: Manager daemon vm07.xacuym is now available 2026-03-09T19:21:51.229 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: from='mgr.14100 192.168.123.107:0/1455673162' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/mirror_snapshot_schedule"}]: dispatch 2026-03-09T19:21:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: from='mgr.14100 192.168.123.107:0/1455673162' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/trash_purge_schedule"}]: dispatch 2026-03-09T19:21:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: from='mgr.14100 192.168.123.107:0/1455673162' entity='mgr.vm07.xacuym' 2026-03-09T19:21:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: from='mgr.14100 192.168.123.107:0/1455673162' entity='mgr.vm07.xacuym' 2026-03-09T19:21:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:50 vm07 ceph-mon[48545]: from='mgr.14100 192.168.123.107:0/1455673162' entity='mgr.vm07.xacuym' 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout { 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "fsid": "17715774-1bed-11f1-9ad8-1bc9d74ff594", 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "health": { 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 
2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 0 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout ], 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "vm07" 2026-03-09T19:21:52.112 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout ], 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "quorum_age": 5, 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-09T19:21:52.113 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-09T19:21:52.116 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 
"num_standbys": 0, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "restful" 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout ], 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "modified": "2026-03-09T19:21:45.300246+0000", 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout }, 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout } 2026-03-09T19:21:52.116 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.904+0000 7fd50f47b640 1 Processor -- start 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.904+0000 7fd50f47b640 1 -- start start 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.905+0000 7fd50f47b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd508071c20 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.905+0000 7fd50f47b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd508073260 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.905+0000 7fd50d1f0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd508071c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.905+0000 7fd50d1f0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd508071c20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44620/0 (socket says 192.168.123.107:44620) 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.905+0000 7fd50d1f0640 1 -- 192.168.123.107:0/2611599244 learned_addr learned my addr 192.168.123.107:0/2611599244 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.905+0000 7fd50d1f0640 1 -- 192.168.123.107:0/2611599244 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd508072160 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.906+0000 7fd50d1f0640 1 --2- 192.168.123.107:0/2611599244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd508071c20 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fd4f8009b80 
tx=0x7fd4f802f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e7608c50c9b9803d server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.906+0000 7fd4f7fff640 1 -- 192.168.123.107:0/2611599244 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd4f802fa10 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.907+0000 7fd4f7fff640 1 -- 192.168.123.107:0/2611599244 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7fd4f8037440 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.907+0000 7fd4f7fff640 1 -- 192.168.123.107:0/2611599244 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd4f8035540 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.907+0000 7fd50f47b640 1 -- 192.168.123.107:0/2611599244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 msgr2=0x7fd508071c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.907+0000 7fd50f47b640 1 --2- 192.168.123.107:0/2611599244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd508071c20 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fd4f8009b80 tx=0x7fd4f802f190 comp rx=0 tx=0).stop 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.907+0000 7fd50f47b640 1 -- 192.168.123.107:0/2611599244 shutdown_connections 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.907+0000 7fd50f47b640 1 --2- 
192.168.123.107:0/2611599244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd508071c20 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.907+0000 7fd50f47b640 1 -- 192.168.123.107:0/2611599244 >> 192.168.123.107:0/2611599244 conn(0x7fd50806d7d0 msgr2=0x7fd50806fc10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.907+0000 7fd50f47b640 1 -- 192.168.123.107:0/2611599244 shutdown_connections 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.907+0000 7fd50f47b640 1 -- 192.168.123.107:0/2611599244 wait complete. 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.908+0000 7fd50f47b640 1 Processor -- start 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.908+0000 7fd50f47b640 1 -- start start 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.908+0000 7fd50f47b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd5081ab090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.908+0000 7fd50f47b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5081ab5d0 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.908+0000 7fd50d1f0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd5081ab090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.908+0000 7fd50d1f0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd5081ab090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44626/0 (socket says 192.168.123.107:44626) 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.908+0000 7fd50d1f0640 1 -- 192.168.123.107:0/396219139 learned_addr learned my addr 192.168.123.107:0/396219139 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.909+0000 7fd50d1f0640 1 -- 192.168.123.107:0/396219139 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd4f80095d0 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.909+0000 7fd50d1f0640 1 --2- 192.168.123.107:0/396219139 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd5081ab090 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fd4f802f6c0 tx=0x7fd4f8037920 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.909+0000 7fd4f67fc640 1 -- 192.168.123.107:0/396219139 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd4f8037ad0 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.909+0000 7fd4f67fc640 1 -- 192.168.123.107:0/396219139 <== mon.0 v2:192.168.123.107:3300/0 2 ==== 
config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7fd4f8035c80 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.909+0000 7fd50f47b640 1 -- 192.168.123.107:0/396219139 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd5081ab7d0 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.910+0000 7fd4f67fc640 1 -- 192.168.123.107:0/396219139 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd4f8041c80 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.910+0000 7fd50f47b640 1 -- 192.168.123.107:0/396219139 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd5081abc70 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.911+0000 7fd4f67fc640 1 -- 192.168.123.107:0/396219139 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 3) v1 ==== 49253+0+0 (secure 0 0 0) 0x7fd4f803e030 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.911+0000 7fd50f47b640 1 -- 192.168.123.107:0/396219139 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd508071ca0 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.913+0000 7fd4f67fc640 1 --2- 192.168.123.107:0/396219139 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fd4e403cd50 0x7fd4e403f210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: 
stderr 2026-03-09T19:21:51.913+0000 7fd4f67fc640 1 -- 192.168.123.107:0/396219139 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fd4f8079220 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.915+0000 7fd50c9ef640 1 --2- 192.168.123.107:0/396219139 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fd4e403cd50 0x7fd4e403f210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.915+0000 7fd50c9ef640 1 --2- 192.168.123.107:0/396219139 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fd4e403cd50 0x7fd4e403f210 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fd4fc0099c0 tx=0x7fd4fc006eb0 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:51.916+0000 7fd4f67fc640 1 -- 192.168.123.107:0/396219139 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fd4f8035800 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.048+0000 7fd50f47b640 1 -- 192.168.123.107:0/396219139 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7fd508110a90 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.050+0000 7fd4f67fc640 1 -- 192.168.123.107:0/396219139 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": 
"json-pretty"}]=0 v0) v1 ==== 79+0+1240 (secure 0 0 0) 0x7fd508110a90 con 0x7fd508071820 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.053+0000 7fd50f47b640 1 -- 192.168.123.107:0/396219139 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fd4e403cd50 msgr2=0x7fd4e403f210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.053+0000 7fd50f47b640 1 --2- 192.168.123.107:0/396219139 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fd4e403cd50 0x7fd4e403f210 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fd4fc0099c0 tx=0x7fd4fc006eb0 comp rx=0 tx=0).stop 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.053+0000 7fd50f47b640 1 -- 192.168.123.107:0/396219139 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 msgr2=0x7fd5081ab090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.053+0000 7fd50f47b640 1 --2- 192.168.123.107:0/396219139 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd5081ab090 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fd4f802f6c0 tx=0x7fd4f8037920 comp rx=0 tx=0).stop 2026-03-09T19:21:52.117 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.053+0000 7fd50f47b640 1 -- 192.168.123.107:0/396219139 shutdown_connections 2026-03-09T19:21:52.118 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.053+0000 7fd50f47b640 1 --2- 192.168.123.107:0/396219139 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fd4e403cd50 0x7fd4e403f210 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:52.118 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.053+0000 7fd50f47b640 1 --2- 192.168.123.107:0/396219139 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd508071820 0x7fd5081ab090 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:52.118 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.053+0000 7fd50f47b640 1 -- 192.168.123.107:0/396219139 >> 192.168.123.107:0/396219139 conn(0x7fd50806d7d0 msgr2=0x7fd508112da0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:52.118 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.053+0000 7fd50f47b640 1 -- 192.168.123.107:0/396219139 shutdown_connections 2026-03-09T19:21:52.118 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.053+0000 7fd50f47b640 1 -- 192.168.123.107:0/396219139 wait complete. 
2026-03-09T19:21:52.118 INFO:teuthology.orchestra.run.vm07.stdout:mgr is available 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout [global] 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout fsid = 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_cluster_log_file_level = debug 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout [mgr] 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout [osd] 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10 2026-03-09T19:21:52.374 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.239+0000 7f674f574640 1 Processor -- start 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.240+0000 7f674f574640 1 -- start start 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.240+0000 7f674f574640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f67481086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.240+0000 7f674f574640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6748108cc0 con 0x7f67481082f0 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.240+0000 7f674d2e9640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f67481086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.240+0000 7f674d2e9640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f67481086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44632/0 (socket says 192.168.123.107:44632) 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.240+0000 7f674d2e9640 1 -- 192.168.123.107:0/950874696 learned_addr learned my addr 192.168.123.107:0/950874696 (peer_addr_for_me 
v2:192.168.123.107:0/0) 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.240+0000 7f674d2e9640 1 -- 192.168.123.107:0/950874696 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f67481094a0 con 0x7f67481082f0 2026-03-09T19:21:52.374 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.241+0000 7f674d2e9640 1 --2- 192.168.123.107:0/950874696 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f67481086f0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f673c009920 tx=0x7f673c02ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=34dfec0b5e84d75c server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.241+0000 7f6737fff640 1 -- 192.168.123.107:0/950874696 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f673c02f9b0 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.241+0000 7f6737fff640 1 -- 192.168.123.107:0/950874696 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f673c037440 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.242+0000 7f674f574640 1 -- 192.168.123.107:0/950874696 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 msgr2=0x7f67481086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.242+0000 7f674f574640 1 --2- 192.168.123.107:0/950874696 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f67481086f0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f673c009920 
tx=0x7f673c02ef20 comp rx=0 tx=0).stop 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.242+0000 7f674f574640 1 -- 192.168.123.107:0/950874696 shutdown_connections 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.242+0000 7f674f574640 1 --2- 192.168.123.107:0/950874696 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f67481086f0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.242+0000 7f674f574640 1 -- 192.168.123.107:0/950874696 >> 192.168.123.107:0/950874696 conn(0x7f674807ba00 msgr2=0x7f67481066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.242+0000 7f674f574640 1 -- 192.168.123.107:0/950874696 shutdown_connections 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.242+0000 7f674f574640 1 -- 192.168.123.107:0/950874696 wait complete. 
2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.242+0000 7f674f574640 1 Processor -- start 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.243+0000 7f674f574640 1 -- start start 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.243+0000 7f674f574640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f674819e360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.243+0000 7f674f574640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f673c035340 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.243+0000 7f674d2e9640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f674819e360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.243+0000 7f674d2e9640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f674819e360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44646/0 (socket says 192.168.123.107:44646) 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.243+0000 7f674d2e9640 1 -- 192.168.123.107:0/535195978 learned_addr learned my addr 192.168.123.107:0/535195978 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:52.375 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.243+0000 7f674d2e9640 1 -- 192.168.123.107:0/535195978 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f673c0095d0 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.243+0000 7f674d2e9640 1 --2- 192.168.123.107:0/535195978 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f674819e360 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f673c0098f0 tx=0x7f673c035db0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.244+0000 7f67367fc640 1 -- 192.168.123.107:0/535195978 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f673c037710 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.244+0000 7f674f574640 1 -- 192.168.123.107:0/535195978 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f674819e8a0 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.244+0000 7f674f574640 1 -- 192.168.123.107:0/535195978 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f674819ed40 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.244+0000 7f67367fc640 1 -- 192.168.123.107:0/535195978 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f673c040470 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.244+0000 7f67367fc640 1 -- 
192.168.123.107:0/535195978 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f673c03f550 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.245+0000 7f67367fc640 1 -- 192.168.123.107:0/535195978 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 3) v1 ==== 49253+0+0 (secure 0 0 0) 0x7f673c03e050 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.245+0000 7f67367fc640 1 --2- 192.168.123.107:0/535195978 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f671c03d0c0 0x7f671c03f580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.245+0000 7f67367fc640 1 -- 192.168.123.107:0/535195978 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f673c076240 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.245+0000 7f674cae8640 1 --2- 192.168.123.107:0/535195978 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f671c03d0c0 0x7f671c03f580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.246+0000 7f674cae8640 1 --2- 192.168.123.107:0/535195978 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f671c03d0c0 0x7f671c03f580 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f67380099c0 tx=0x7f6738006eb0 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:52.375 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.246+0000 7f674f574640 1 -- 192.168.123.107:0/535195978 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f674819eff0 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.249+0000 7f67367fc640 1 -- 192.168.123.107:0/535195978 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f673c03c070 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.336+0000 7f674f574640 1 -- 192.168.123.107:0/535195978 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7f674810c7f0 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.341+0000 7f67367fc640 1 -- 192.168.123.107:0/535195978 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v3) v1 ==== 70+0+409 (secure 0 0 0) 0x7f673c036de0 con 0x7f67481082f0 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.344+0000 7f674f574640 1 -- 192.168.123.107:0/535195978 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f671c03d0c0 msgr2=0x7f671c03f580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.344+0000 7f674f574640 1 --2- 192.168.123.107:0/535195978 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f671c03d0c0 0x7f671c03f580 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f67380099c0 tx=0x7f6738006eb0 comp rx=0 
tx=0).stop 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.344+0000 7f674f574640 1 -- 192.168.123.107:0/535195978 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 msgr2=0x7f674819e360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.344+0000 7f674f574640 1 --2- 192.168.123.107:0/535195978 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f674819e360 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f673c0098f0 tx=0x7f673c035db0 comp rx=0 tx=0).stop 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.344+0000 7f674f574640 1 -- 192.168.123.107:0/535195978 shutdown_connections 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.344+0000 7f674f574640 1 --2- 192.168.123.107:0/535195978 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f671c03d0c0 0x7f671c03f580 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.344+0000 7f674f574640 1 --2- 192.168.123.107:0/535195978 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67481082f0 0x7f674819e360 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.344+0000 7f674f574640 1 -- 192.168.123.107:0/535195978 >> 192.168.123.107:0/535195978 conn(0x7f674807ba00 msgr2=0x7f6748105df0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.345+0000 7f674f574640 1 -- 
192.168.123.107:0/535195978 shutdown_connections 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.345+0000 7f674f574640 1 -- 192.168.123.107:0/535195978 wait complete. 2026-03-09T19:21:52.375 INFO:teuthology.orchestra.run.vm07.stdout:Enabling cephadm module... 2026-03-09T19:21:52.602 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:52 vm07 ceph-mon[48545]: mgrmap e3: vm07.xacuym(active, since 1.00771s) 2026-03-09T19:21:52.602 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:52 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/396219139' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-09T19:21:52.602 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:52 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/535195978' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch 2026-03-09T19:21:53.937 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.497+0000 7fc4a7668640 1 Processor -- start 2026-03-09T19:21:53.937 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.498+0000 7fc4a7668640 1 -- start start 2026-03-09T19:21:53.937 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.498+0000 7fc4a7668640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 0x7fc4a0111160 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:53.937 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.498+0000 7fc4a7668640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc4a0111730 con 0x7fc4a0110d60 2026-03-09T19:21:53.937 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.498+0000 7fc4a53dd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fc4a0110d60 0x7fc4a0111160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:53.937 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.498+0000 7fc4a53dd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 0x7fc4a0111160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44656/0 (socket says 192.168.123.107:44656) 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.498+0000 7fc4a53dd640 1 -- 192.168.123.107:0/1732537007 learned_addr learned my addr 192.168.123.107:0/1732537007 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.499+0000 7fc4a53dd640 1 -- 192.168.123.107:0/1732537007 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc4a0111f10 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.499+0000 7fc4a53dd640 1 --2- 192.168.123.107:0/1732537007 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 0x7fc4a0111160 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fc494009920 tx=0x7fc49402ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=6d083681b845be04 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.499+0000 7fc48ffff640 1 -- 192.168.123.107:0/1732537007 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc49402f9b0 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:21:52.499+0000 7fc48ffff640 1 -- 192.168.123.107:0/1732537007 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7fc494037440 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.500+0000 7fc48ffff640 1 -- 192.168.123.107:0/1732537007 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc4940354e0 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.500+0000 7fc4a7668640 1 -- 192.168.123.107:0/1732537007 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 msgr2=0x7fc4a0111160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.500+0000 7fc4a7668640 1 --2- 192.168.123.107:0/1732537007 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 0x7fc4a0111160 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fc494009920 tx=0x7fc49402ef20 comp rx=0 tx=0).stop 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.500+0000 7fc4a7668640 1 -- 192.168.123.107:0/1732537007 shutdown_connections 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.500+0000 7fc4a7668640 1 --2- 192.168.123.107:0/1732537007 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 0x7fc4a0111160 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.500+0000 7fc4a7668640 1 -- 192.168.123.107:0/1732537007 >> 192.168.123.107:0/1732537007 conn(0x7fc4a006c1a0 msgr2=0x7fc4a006c5b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:53.938 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.501+0000 7fc4a7668640 1 -- 192.168.123.107:0/1732537007 shutdown_connections 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.501+0000 7fc4a7668640 1 -- 192.168.123.107:0/1732537007 wait complete. 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.501+0000 7fc4a7668640 1 Processor -- start 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.501+0000 7fc4a7668640 1 -- start start 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.502+0000 7fc4a7668640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 0x7fc4a01a6b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.502+0000 7fc4a7668640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc4a01a7080 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.502+0000 7fc4a53dd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 0x7fc4a01a6b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.502+0000 7fc4a53dd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 0x7fc4a01a6b40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44662/0 (socket says 
192.168.123.107:44662) 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.502+0000 7fc4a53dd640 1 -- 192.168.123.107:0/1885153273 learned_addr learned my addr 192.168.123.107:0/1885153273 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.502+0000 7fc4a53dd640 1 -- 192.168.123.107:0/1885153273 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc4940095d0 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.503+0000 7fc4a53dd640 1 --2- 192.168.123.107:0/1885153273 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 0x7fc4a01a6b40 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fc494006fd0 tx=0x7fc4940359d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.503+0000 7fc48e7fc640 1 -- 192.168.123.107:0/1885153273 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc494035da0 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.503+0000 7fc48e7fc640 1 -- 192.168.123.107:0/1885153273 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7fc49402fe30 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.503+0000 7fc48e7fc640 1 -- 192.168.123.107:0/1885153273 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc49403fd70 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:21:52.503+0000 7fc4a7668640 1 -- 192.168.123.107:0/1885153273 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc4a01a7280 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.503+0000 7fc4a7668640 1 -- 192.168.123.107:0/1885153273 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc4a01a7720 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.504+0000 7fc48e7fc640 1 -- 192.168.123.107:0/1885153273 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 3) v1 ==== 49253+0+0 (secure 0 0 0) 0x7fc49403e070 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.504+0000 7fc48e7fc640 1 --2- 192.168.123.107:0/1885153273 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fc47003d070 0x7fc47003f530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.504+0000 7fc48e7fc640 1 -- 192.168.123.107:0/1885153273 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fc494076000 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.505+0000 7fc4a4bdc640 1 --2- 192.168.123.107:0/1885153273 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fc47003d070 0x7fc47003f530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.505+0000 7fc4a4bdc640 1 --2- 
192.168.123.107:0/1885153273 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fc47003d070 0x7fc47003f530 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fc4900099c0 tx=0x7fc490006eb0 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.506+0000 7fc4a7668640 1 -- 192.168.123.107:0/1885153273 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc468005350 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.509+0000 7fc48e7fc640 1 -- 192.168.123.107:0/1885153273 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fc49403c030 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.603+0000 7fc48e7fc640 1 -- 192.168.123.107:0/1885153273 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mgrmap(e 4) v1 ==== 49359+0+0 (secure 0 0 0) 0x7fc494045270 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:52.641+0000 7fc4a7668640 1 -- 192.168.123.107:0/1885153273 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0) v1 -- 0x7fc468005600 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:53.863+0000 7fc48e7fc640 1 -- 192.168.123.107:0/1885153273 <== mon.0 v2:192.168.123.107:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "cephadm"}]=0 v5) v1 ==== 86+0+0 (secure 0 0 0) 0x7fc49403c030 con 0x7fc4a0110d60 2026-03-09T19:21:53.938 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:53.868+0000 7fc4a7668640 1 -- 192.168.123.107:0/1885153273 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fc47003d070 msgr2=0x7fc47003f530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:53.868+0000 7fc4a7668640 1 --2- 192.168.123.107:0/1885153273 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fc47003d070 0x7fc47003f530 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fc4900099c0 tx=0x7fc490006eb0 comp rx=0 tx=0).stop 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:53.868+0000 7fc4a7668640 1 -- 192.168.123.107:0/1885153273 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 msgr2=0x7fc4a01a6b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:53.868+0000 7fc4a7668640 1 --2- 192.168.123.107:0/1885153273 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 0x7fc4a01a6b40 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fc494006fd0 tx=0x7fc4940359d0 comp rx=0 tx=0).stop 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:53.868+0000 7fc4a7668640 1 -- 192.168.123.107:0/1885153273 shutdown_connections 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:53.868+0000 7fc4a7668640 1 --2- 192.168.123.107:0/1885153273 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7fc47003d070 0x7fc47003f530 secure :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fc4900099c0 tx=0x7fc490006eb0 comp rx=0 tx=0).stop 2026-03-09T19:21:53.938 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:53.868+0000 7fc4a7668640 1 --2- 192.168.123.107:0/1885153273 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4a0110d60 0x7fc4a01a6b40 secure :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fc494006fd0 tx=0x7fc4940359d0 comp rx=0 tx=0).stop 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:53.868+0000 7fc4a7668640 1 -- 192.168.123.107:0/1885153273 >> 192.168.123.107:0/1885153273 conn(0x7fc4a006c1a0 msgr2=0x7fc4a010e1b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:53.869+0000 7fc4a7668640 1 -- 192.168.123.107:0/1885153273 shutdown_connections 2026-03-09T19:21:53.938 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:53.869+0000 7fc4a7668640 1 -- 192.168.123.107:0/1885153273 wait complete. 2026-03-09T19:21:54.204 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:53 vm07 ceph-mon[48545]: mgrmap e4: vm07.xacuym(active, since 2s) 2026-03-09T19:21:54.204 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:53 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1885153273' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch 2026-03-09T19:21:54.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout { 2026-03-09T19:21:54.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 5, 2026-03-09T19:21:54.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-09T19:21:54.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "active_name": "vm07.xacuym", 2026-03-09T19:21:54.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-09T19:21:54.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout } 2026-03-09T19:21:54.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.078+0000 7f64f1798640 1 Processor -- start 2026-03-09T19:21:54.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.078+0000 7f64f1798640 1 -- start start 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.078+0000 7f64f1798640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec104030 0x7f64ec104430 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.078+0000 7f64f1798640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f64ec104a00 con 0x7f64ec104030 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.078+0000 7f64ebfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec104030 0x7f64ec104430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:54.237 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.079+0000 7f64ebfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec104030 0x7f64ec104430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44684/0 (socket says 192.168.123.107:44684) 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.079+0000 7f64ebfff640 1 -- 192.168.123.107:0/4081370997 learned_addr learned my addr 192.168.123.107:0/4081370997 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.080+0000 7f64ebfff640 1 -- 192.168.123.107:0/4081370997 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f64ec105230 con 0x7f64ec104030 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.080+0000 7f64ebfff640 1 --2- 192.168.123.107:0/4081370997 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec104030 0x7f64ec104430 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f64d80073f0 tx=0x7f64d8031120 comp rx=0 tx=0).ready entity=mon.0 client_cookie=8155a70f3741478b server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.081+0000 7f64eaffd640 1 -- 192.168.123.107:0/4081370997 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f64d8008d10 con 0x7f64ec104030 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.081+0000 7f64eaffd640 1 -- 192.168.123.107:0/4081370997 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f64d8008e70 con 0x7f64ec104030 
2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.081+0000 7f64f1798640 1 -- 192.168.123.107:0/4081370997 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec104030 msgr2=0x7f64ec104430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.081+0000 7f64f1798640 1 --2- 192.168.123.107:0/4081370997 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec104030 0x7f64ec104430 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f64d80073f0 tx=0x7f64d8031120 comp rx=0 tx=0).stop 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.081+0000 7f64f1798640 1 -- 192.168.123.107:0/4081370997 shutdown_connections 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.081+0000 7f64f1798640 1 --2- 192.168.123.107:0/4081370997 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec104030 0x7f64ec104430 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.081+0000 7f64f1798640 1 -- 192.168.123.107:0/4081370997 >> 192.168.123.107:0/4081370997 conn(0x7f64ec0ff7c0 msgr2=0x7f64ec101c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.082+0000 7f64f1798640 1 -- 192.168.123.107:0/4081370997 shutdown_connections 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.082+0000 7f64f1798640 1 -- 192.168.123.107:0/4081370997 wait complete. 
2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.082+0000 7f64f1798640 1 Processor -- start 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.082+0000 7f64f1798640 1 -- start start 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.082+0000 7f64f1798640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec199a20 0x7f64ec199e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.082+0000 7f64f1798640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f64d800a8c0 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.083+0000 7f64ebfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec199a20 0x7f64ec199e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.083+0000 7f64ebfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec199a20 0x7f64ec199e40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44688/0 (socket says 192.168.123.107:44688) 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.083+0000 7f64ebfff640 1 -- 192.168.123.107:0/2569400049 learned_addr learned my addr 192.168.123.107:0/2569400049 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:54.237 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.083+0000 7f64ebfff640 1 -- 192.168.123.107:0/2569400049 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f64d80070a0 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.083+0000 7f64ebfff640 1 --2- 192.168.123.107:0/2569400049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec199a20 0x7f64ec199e40 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f64d8002410 tx=0x7f64d8038890 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.083+0000 7f64e97fa640 1 -- 192.168.123.107:0/2569400049 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f64d800f150 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.083+0000 7f64f1798640 1 -- 192.168.123.107:0/2569400049 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f64ec19a380 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.083+0000 7f64f1798640 1 -- 192.168.123.107:0/2569400049 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f64ec19af90 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.084+0000 7f64e97fa640 1 -- 192.168.123.107:0/2569400049 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f64d8031da0 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.084+0000 7f64e97fa640 1 
-- 192.168.123.107:0/2569400049 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f64d80427c0 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.085+0000 7f64e97fa640 1 -- 192.168.123.107:0/2569400049 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 5) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f64d8042920 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.085+0000 7f64e97fa640 1 --2- 192.168.123.107:0/2569400049 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f64bc03d1e0 0x7f64bc03f6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.085+0000 7f64eb7fe640 1 -- 192.168.123.107:0/2569400049 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f64bc03d1e0 msgr2=0x7f64bc03f6a0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1456056000 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.085+0000 7f64f1798640 1 -- 192.168.123.107:0/2569400049 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f64ec108750 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.085+0000 7f64eb7fe640 1 --2- 192.168.123.107:0/2569400049 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f64bc03d1e0 0x7f64bc03f6a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:21:54.085+0000 7f64e97fa640 1 -- 192.168.123.107:0/2569400049 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f64d8075f50 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.089+0000 7f64e97fa640 1 -- 192.168.123.107:0/2569400049 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f64ec108750 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.203+0000 7f64f1798640 1 -- 192.168.123.107:0/2569400049 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f64ec105d70 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.206+0000 7f64e97fa640 1 -- 192.168.123.107:0/2569400049 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v5) v1 ==== 56+0+98 (secure 0 0 0) 0x7f64d804a960 con 0x7f64ec199a20 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.208+0000 7f64c6ffd640 1 -- 192.168.123.107:0/2569400049 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f64bc03d1e0 msgr2=0x7f64bc03f6a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.208+0000 7f64c6ffd640 1 --2- 192.168.123.107:0/2569400049 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f64bc03d1e0 0x7f64bc03f6a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.209+0000 7f64c6ffd640 1 -- 
192.168.123.107:0/2569400049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec199a20 msgr2=0x7f64ec199e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.209+0000 7f64c6ffd640 1 --2- 192.168.123.107:0/2569400049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec199a20 0x7f64ec199e40 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f64d8002410 tx=0x7f64d8038890 comp rx=0 tx=0).stop 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.209+0000 7f64c6ffd640 1 -- 192.168.123.107:0/2569400049 shutdown_connections 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.209+0000 7f64c6ffd640 1 --2- 192.168.123.107:0/2569400049 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7f64bc03d1e0 0x7f64bc03f6a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.209+0000 7f64c6ffd640 1 --2- 192.168.123.107:0/2569400049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64ec199a20 0x7f64ec199e40 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.209+0000 7f64c6ffd640 1 -- 192.168.123.107:0/2569400049 >> 192.168.123.107:0/2569400049 conn(0x7f64ec0ff7c0 msgr2=0x7f64ec0fff20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.209+0000 7f64c6ffd640 1 -- 192.168.123.107:0/2569400049 shutdown_connections 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:21:54.209+0000 7f64c6ffd640 1 -- 192.168.123.107:0/2569400049 wait complete. 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:Waiting for the mgr to restart... 2026-03-09T19:21:54.237 INFO:teuthology.orchestra.run.vm07.stdout:Waiting for mgr epoch 5... 2026-03-09T19:21:55.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:54 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/1885153273' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished 2026-03-09T19:21:55.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:54 vm07 ceph-mon[48545]: mgrmap e5: vm07.xacuym(active, since 3s) 2026-03-09T19:21:55.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:54 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/2569400049' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-09T19:21:57.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: Active manager daemon vm07.xacuym restarted 2026-03-09T19:21:57.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: Activating manager daemon vm07.xacuym 2026-03-09T19:21:57.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: osdmap e2: 0 total, 0 up, 0 in 2026-03-09T19:21:57.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: mgrmap e6: vm07.xacuym(active, starting, since 0.00568536s) 2026-03-09T19:21:57.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:21:57.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm07.xacuym", "id": "vm07.xacuym"}]: dispatch 2026-03-09T19:21:57.478 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T19:21:57.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T19:21:57.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T19:21:57.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: Manager daemon vm07.xacuym is now available 2026-03-09T19:21:57.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:21:57.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:21:57.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:21:57.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:21:57.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:57 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/mirror_snapshot_schedule"}]: dispatch 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout { 2026-03-09T19:21:58.523 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 7, 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout } 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.395+0000 7ff6bb577640 1 Processor -- start 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.395+0000 7ff6bb577640 1 -- start start 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.395+0000 7ff6bb577640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc071820 0x7ff6bc071c20 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.395+0000 7ff6bb577640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6bc0721f0 con 0x7ff6bc071820 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.395+0000 7ff6ba575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc071820 0x7ff6bc071c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.395+0000 7ff6ba575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc071820 0x7ff6bc071c20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44698/0 (socket says 192.168.123.107:44698) 2026-03-09T19:21:58.523 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.395+0000 7ff6ba575640 1 -- 192.168.123.107:0/467695116 learned_addr learned my addr 192.168.123.107:0/467695116 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.395+0000 7ff6ba575640 1 -- 192.168.123.107:0/467695116 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6bc072330 con 0x7ff6bc071820 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.396+0000 7ff6ba575640 1 --2- 192.168.123.107:0/467695116 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc071820 0x7ff6bc071c20 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7ff6ac0089a0 tx=0x7ff6ac031440 comp rx=0 tx=0).ready entity=mon.0 client_cookie=4d96692d654e05d6 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.396+0000 7ff6b9573640 1 -- 192.168.123.107:0/467695116 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff6ac031e50 con 0x7ff6bc071820 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.396+0000 7ff6b9573640 1 -- 192.168.123.107:0/467695116 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7ff6ac035070 con 0x7ff6bc071820 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.396+0000 7ff6bb577640 1 -- 192.168.123.107:0/467695116 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc071820 msgr2=0x7ff6bc071c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.396+0000 
7ff6bb577640 1 --2- 192.168.123.107:0/467695116 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc071820 0x7ff6bc071c20 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7ff6ac0089a0 tx=0x7ff6ac031440 comp rx=0 tx=0).stop 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.396+0000 7ff6bb577640 1 -- 192.168.123.107:0/467695116 shutdown_connections 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.396+0000 7ff6bb577640 1 --2- 192.168.123.107:0/467695116 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc071820 0x7ff6bc071c20 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.396+0000 7ff6bb577640 1 -- 192.168.123.107:0/467695116 >> 192.168.123.107:0/467695116 conn(0x7ff6bc06d060 msgr2=0x7ff6bc06f480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.396+0000 7ff6bb577640 1 -- 192.168.123.107:0/467695116 shutdown_connections 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.396+0000 7ff6bb577640 1 -- 192.168.123.107:0/467695116 wait complete. 
2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.397+0000 7ff6bb577640 1 Processor -- start 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.397+0000 7ff6bb577640 1 -- start start 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.397+0000 7ff6bb577640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc1a2430 0x7ff6bc1a2850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.397+0000 7ff6bb577640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6ac03b720 con 0x7ff6bc1a2430 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.397+0000 7ff6ba575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc1a2430 0x7ff6bc1a2850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.397+0000 7ff6ba575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc1a2430 0x7ff6bc1a2850 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:44708/0 (socket says 192.168.123.107:44708) 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.397+0000 7ff6ba575640 1 -- 192.168.123.107:0/506266977 learned_addr learned my addr 192.168.123.107:0/506266977 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:58.523 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.397+0000 7ff6ba575640 1 -- 192.168.123.107:0/506266977 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6ac008650 con 0x7ff6bc1a2430 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.397+0000 7ff6ba575640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc1a2430 0x7ff6bc1a2850 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7ff6ac008970 tx=0x7ff6ac008c70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.397+0000 7ff6ab7fe640 1 -- 192.168.123.107:0/506266977 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff6ac03c480 con 0x7ff6bc1a2430 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.397+0000 7ff6bb577640 1 -- 192.168.123.107:0/506266977 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff6bc1a2d90 con 0x7ff6bc1a2430 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.398+0000 7ff6bb577640 1 -- 192.168.123.107:0/506266977 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff6bc1a5930 con 0x7ff6bc1a2430 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.399+0000 7ff6ab7fe640 1 -- 192.168.123.107:0/506266977 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7ff6ac035040 con 0x7ff6bc1a2430 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.399+0000 7ff6ab7fe640 1 -- 
192.168.123.107:0/506266977 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff6ac00b5c0 con 0x7ff6bc1a2430 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.399+0000 7ff6ab7fe640 1 -- 192.168.123.107:0/506266977 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 5) v1 ==== 49370+0+0 (secure 0 0 0) 0x7ff6ac00b720 con 0x7ff6bc1a2430 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.400+0000 7ff6ab7fe640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 0x7ff6a003f720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.400+0000 7ff6b9d74640 1 -- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 msgr2=0x7ff6a003f720 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1456056000 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.400+0000 7ff6b9d74640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 0x7ff6a003f720 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.400+0000 7ff6ab7fe640 1 -- 192.168.123.107:0/506266977 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7ff6ac037070 con 0x7ff6bc1a2430 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.401+0000 7ff6bb577640 1 -- 
192.168.123.107:0/506266977 --> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7ff6bc071820 con 0x7ff6a003d260 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.601+0000 7ff6b9d74640 1 -- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 msgr2=0x7ff6a003f720 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1456056000 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:54.601+0000 7ff6b9d74640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 0x7ff6a003f720 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:55.002+0000 7ff6b9d74640 1 -- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 msgr2=0x7ff6a003f720 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1456056000 2026-03-09T19:21:58.523 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:55.002+0000 7ff6b9d74640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 0x7ff6a003f720 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:55.803+0000 7ff6b9d74640 1 -- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 
msgr2=0x7ff6a003f720 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1456056000 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:55.803+0000 7ff6b9d74640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 0x7ff6a003f720 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:57.089+0000 7ff6ab7fe640 1 -- 192.168.123.107:0/506266977 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mgrmap(e 6) v1 ==== 49137+0+0 (secure 0 0 0) 0x7ff6ac00ba00 con 0x7ff6bc1a2430 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:57.090+0000 7ff6ab7fe640 1 -- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 msgr2=0x7ff6a003f720 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:57.090+0000 7ff6ab7fe640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 0x7ff6a003f720 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.474+0000 7ff6ab7fe640 1 -- 192.168.123.107:0/506266977 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mgrmap(e 7) v1 ==== 49264+0+0 (secure 0 0 0) 0x7ff6ac03ac60 con 0x7ff6bc1a2430 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.474+0000 7ff6ab7fe640 1 --2- 192.168.123.107:0/506266977 >> 
[v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7ff6a00409d0 0x7ff6a0042dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.474+0000 7ff6ab7fe640 1 -- 192.168.123.107:0/506266977 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7ff6bc071820 con 0x7ff6a00409d0 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.476+0000 7ff6b9d74640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7ff6a00409d0 0x7ff6a0042dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.476+0000 7ff6b9d74640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7ff6a00409d0 0x7ff6a0042dc0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7ff6b4003e00 tx=0x7ff6b40073c0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.477+0000 7ff6ab7fe640 1 -- 192.168.123.107:0/506266977 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+7759 (secure 0 0 0) 0x7ff6bc071820 con 0x7ff6a00409d0 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.481+0000 7ff6bb577640 1 -- 192.168.123.107:0/506266977 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 
0x7ff6bc110810 con 0x7ff6a00409d0 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.481+0000 7ff6ab7fe640 1 -- 192.168.123.107:0/506266977 <== mgr.14118 v2:192.168.123.107:6800/1318262611 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+51 (secure 0 0 0) 0x7ff6bc110810 con 0x7ff6a00409d0 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.481+0000 7ff6a97fa640 1 -- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7ff6a00409d0 msgr2=0x7ff6a0042dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.481+0000 7ff6a97fa640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7ff6a00409d0 0x7ff6a0042dc0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7ff6b4003e00 tx=0x7ff6b40073c0 comp rx=0 tx=0).stop 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.481+0000 7ff6a97fa640 1 -- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc1a2430 msgr2=0x7ff6bc1a2850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.481+0000 7ff6a97fa640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc1a2430 0x7ff6bc1a2850 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7ff6ac008970 tx=0x7ff6ac008c70 comp rx=0 tx=0).stop 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.481+0000 7ff6a97fa640 1 -- 192.168.123.107:0/506266977 shutdown_connections 2026-03-09T19:21:58.524 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.481+0000 7ff6a97fa640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7ff6a00409d0 0x7ff6a0042dc0 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.481+0000 7ff6a97fa640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:6800/1456056000,v1:192.168.123.107:6801/1456056000] conn(0x7ff6a003d260 0x7ff6a003f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.481+0000 7ff6a97fa640 1 --2- 192.168.123.107:0/506266977 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6bc1a2430 0x7ff6bc1a2850 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.481+0000 7ff6a97fa640 1 -- 192.168.123.107:0/506266977 >> 192.168.123.107:0/506266977 conn(0x7ff6bc06d060 msgr2=0x7ff6bc06e9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.482+0000 7ff6a97fa640 1 -- 192.168.123.107:0/506266977 shutdown_connections 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.482+0000 7ff6a97fa640 1 -- 192.168.123.107:0/506266977 wait complete. 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:mgr epoch 5 is available 2026-03-09T19:21:58.524 INFO:teuthology.orchestra.run.vm07.stdout:Setting orchestrator backend to cephadm... 
2026-03-09T19:21:58.730 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:58 vm07 ceph-mon[48545]: Found migration_current of "None". Setting to last migration. 2026-03-09T19:21:58.730 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:58 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/trash_purge_schedule"}]: dispatch 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.664+0000 7f17aca74640 1 Processor -- start 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.664+0000 7f17aca74640 1 -- start start 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.665+0000 7f17aca74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a8071820 0x7f17a8071c20 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.665+0000 7f17aca74640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17a80721f0 con 0x7f17a8071820 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.665+0000 7f17a77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a8071820 0x7f17a8071c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.665+0000 7f17a77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a8071820 0x7f17a8071c20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40470/0 (socket says 192.168.123.107:40470) 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.665+0000 7f17a77fe640 1 -- 192.168.123.107:0/2606855603 learned_addr learned my addr 192.168.123.107:0/2606855603 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.665+0000 7f17a77fe640 1 -- 192.168.123.107:0/2606855603 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f17a8072330 con 0x7f17a8071820 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.666+0000 7f17a77fe640 1 --2- 192.168.123.107:0/2606855603 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a8071820 0x7f17a8071c20 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f17980089a0 tx=0x7f1798031440 comp rx=0 tx=0).ready entity=mon.0 client_cookie=decab8a22317fb7e server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.666+0000 7f17a67fc640 1 -- 192.168.123.107:0/2606855603 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1798031e50 con 0x7f17a8071820 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.666+0000 7f17a67fc640 1 -- 192.168.123.107:0/2606855603 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f1798035070 con 0x7f17a8071820 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.666+0000 7f17aca74640 1 -- 192.168.123.107:0/2606855603 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a8071820 msgr2=0x7f17a8071c20 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.666+0000 7f17aca74640 1 --2- 192.168.123.107:0/2606855603 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a8071820 0x7f17a8071c20 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f17980089a0 tx=0x7f1798031440 comp rx=0 tx=0).stop 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.666+0000 7f17aca74640 1 -- 192.168.123.107:0/2606855603 shutdown_connections 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.666+0000 7f17aca74640 1 --2- 192.168.123.107:0/2606855603 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a8071820 0x7f17a8071c20 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.666+0000 7f17aca74640 1 -- 192.168.123.107:0/2606855603 >> 192.168.123.107:0/2606855603 conn(0x7f17a806d060 msgr2=0x7f17a806f480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.666+0000 7f17aca74640 1 -- 192.168.123.107:0/2606855603 shutdown_connections 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.666+0000 7f17aca74640 1 -- 192.168.123.107:0/2606855603 wait complete. 
2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.667+0000 7f17aca74640 1 Processor -- start 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.667+0000 7f17aca74640 1 -- start start 2026-03-09T19:21:58.816 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.667+0000 7f17aca74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a81aae70 0x7f17a81ab290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.667+0000 7f17aca74640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f179803b720 con 0x7f17a81aae70 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.667+0000 7f17a77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a81aae70 0x7f17a81ab290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.667+0000 7f17a77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a81aae70 0x7f17a81ab290 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40482/0 (socket says 192.168.123.107:40482) 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.667+0000 7f17a77fe640 1 -- 192.168.123.107:0/3288882751 learned_addr learned my addr 192.168.123.107:0/3288882751 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:58.817 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.667+0000 7f17a77fe640 1 -- 192.168.123.107:0/3288882751 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1798008650 con 0x7f17a81aae70 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.667+0000 7f17a77fe640 1 --2- 192.168.123.107:0/3288882751 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a81aae70 0x7f17a81ab290 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f1798008970 tx=0x7f1798008c70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.668+0000 7f17a4ff9640 1 -- 192.168.123.107:0/3288882751 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f179803c480 con 0x7f17a81aae70 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.668+0000 7f17aca74640 1 -- 192.168.123.107:0/3288882751 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f17a81ab7d0 con 0x7f17a81aae70 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.668+0000 7f17aca74640 1 -- 192.168.123.107:0/3288882751 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f17a81ae370 con 0x7f17a81aae70 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.668+0000 7f17a4ff9640 1 -- 192.168.123.107:0/3288882751 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f1798035040 con 0x7f17a81aae70 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.668+0000 7f17a4ff9640 1 
-- 192.168.123.107:0/3288882751 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f179800b5c0 con 0x7f17a81aae70 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.670+0000 7f17a4ff9640 1 -- 192.168.123.107:0/3288882751 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 7) v1 ==== 49264+0+0 (secure 0 0 0) 0x7f179800b720 con 0x7f17a81aae70 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.670+0000 7f17a4ff9640 1 --2- 192.168.123.107:0/3288882751 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f178003d110 0x7f178003f5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.670+0000 7f17a4ff9640 1 -- 192.168.123.107:0/3288882751 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f1798037070 con 0x7f17a81aae70 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.670+0000 7f17a6ffd640 1 --2- 192.168.123.107:0/3288882751 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f178003d110 0x7f178003f5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.671+0000 7f17a6ffd640 1 --2- 192.168.123.107:0/3288882751 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f178003d110 0x7f178003f5d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f17a000ad30 tx=0x7f17a00093f0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:58.817 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.671+0000 7f17aca74640 1 -- 192.168.123.107:0/3288882751 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f17a8111070 con 0x7f17a81aae70 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.674+0000 7f17a4ff9640 1 -- 192.168.123.107:0/3288882751 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f1798042030 con 0x7f17a81aae70 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.776+0000 7f17aca74640 1 -- 192.168.123.107:0/3288882751 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}) v1 -- 0x7f17a806f010 con 0x7f178003d110 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.785+0000 7f17a4ff9640 1 -- 192.168.123.107:0/3288882751 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f17a806f010 con 0x7f178003d110 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.788+0000 7f17867fc640 1 -- 192.168.123.107:0/3288882751 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f178003d110 msgr2=0x7f178003f5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.788+0000 7f17867fc640 1 --2- 192.168.123.107:0/3288882751 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f178003d110 0x7f178003f5d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto 
rx=0x7f17a000ad30 tx=0x7f17a00093f0 comp rx=0 tx=0).stop 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.788+0000 7f17867fc640 1 -- 192.168.123.107:0/3288882751 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a81aae70 msgr2=0x7f17a81ab290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.788+0000 7f17867fc640 1 --2- 192.168.123.107:0/3288882751 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a81aae70 0x7f17a81ab290 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f1798008970 tx=0x7f1798008c70 comp rx=0 tx=0).stop 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.788+0000 7f17867fc640 1 -- 192.168.123.107:0/3288882751 shutdown_connections 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.788+0000 7f17867fc640 1 --2- 192.168.123.107:0/3288882751 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f178003d110 0x7f178003f5d0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.788+0000 7f17867fc640 1 --2- 192.168.123.107:0/3288882751 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17a81aae70 0x7f17a81ab290 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.788+0000 7f17867fc640 1 -- 192.168.123.107:0/3288882751 >> 192.168.123.107:0/3288882751 conn(0x7f17a806d060 msgr2=0x7f17a806e900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:21:58.788+0000 7f17867fc640 1 -- 192.168.123.107:0/3288882751 shutdown_connections 2026-03-09T19:21:58.817 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.788+0000 7f17867fc640 1 -- 192.168.123.107:0/3288882751 wait complete. 2026-03-09T19:21:59.102 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout value unchanged 2026-03-09T19:21:59.102 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.955+0000 7fd942622640 1 Processor -- start 2026-03-09T19:21:59.102 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.955+0000 7fd942622640 1 -- start start 2026-03-09T19:21:59.102 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.955+0000 7fd942622640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c072f10 0x7fd93c0713c0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:59.102 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.955+0000 7fd942622640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd93c071900 con 0x7fd93c072f10 2026-03-09T19:21:59.102 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.955+0000 7fd941620640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c072f10 0x7fd93c0713c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:59.102 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.955+0000 7fd941620640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c072f10 0x7fd93c0713c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:40494/0 (socket says 192.168.123.107:40494) 2026-03-09T19:21:59.102 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.955+0000 7fd941620640 1 -- 192.168.123.107:0/1820116001 learned_addr learned my addr 192.168.123.107:0/1820116001 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:59.103 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.956+0000 7fd941620640 1 -- 192.168.123.107:0/1820116001 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd93c071a40 con 0x7fd93c072f10 2026-03-09T19:21:59.103 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.956+0000 7fd941620640 1 --2- 192.168.123.107:0/1820116001 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c072f10 0x7fd93c0713c0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fd938008970 tx=0x7fd9380312f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=f09e8ddfb0722a0e server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:59.103 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.956+0000 7fd933fff640 1 -- 192.168.123.107:0/1820116001 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd938031d00 con 0x7fd93c072f10 2026-03-09T19:21:59.103 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.957+0000 7fd933fff640 1 -- 192.168.123.107:0/1820116001 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7fd938035070 con 0x7fd93c072f10 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.957+0000 7fd942622640 1 -- 192.168.123.107:0/1820116001 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c072f10 msgr2=0x7fd93c0713c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:59.104 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.957+0000 7fd942622640 1 --2- 192.168.123.107:0/1820116001 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c072f10 0x7fd93c0713c0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fd938008970 tx=0x7fd9380312f0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.957+0000 7fd942622640 1 -- 192.168.123.107:0/1820116001 shutdown_connections 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.957+0000 7fd942622640 1 --2- 192.168.123.107:0/1820116001 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c072f10 0x7fd93c0713c0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.957+0000 7fd942622640 1 -- 192.168.123.107:0/1820116001 >> 192.168.123.107:0/1820116001 conn(0x7fd93c06d060 msgr2=0x7fd93c06f480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.957+0000 7fd942622640 1 -- 192.168.123.107:0/1820116001 shutdown_connections 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.957+0000 7fd942622640 1 -- 192.168.123.107:0/1820116001 wait complete. 
2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.957+0000 7fd942622640 1 Processor -- start 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.958+0000 7fd942622640 1 -- start start 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.958+0000 7fd942622640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c0866f0 0x7fd93c089bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.958+0000 7fd942622640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd93c08a130 con 0x7fd93c0866f0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.958+0000 7fd941620640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c0866f0 0x7fd93c089bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.958+0000 7fd941620640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c0866f0 0x7fd93c089bf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40502/0 (socket says 192.168.123.107:40502) 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.958+0000 7fd941620640 1 -- 192.168.123.107:0/3502430879 learned_addr learned my addr 192.168.123.107:0/3502430879 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:59.104 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.958+0000 7fd941620640 1 -- 192.168.123.107:0/3502430879 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd938008650 con 0x7fd93c0866f0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.958+0000 7fd941620640 1 --2- 192.168.123.107:0/3502430879 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c0866f0 0x7fd93c089bf0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fd938006ef0 tx=0x7fd938003cb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.959+0000 7fd9327fc640 1 -- 192.168.123.107:0/3502430879 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd9380084f0 con 0x7fd93c0866f0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.959+0000 7fd942622640 1 -- 192.168.123.107:0/3502430879 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd93c086b10 con 0x7fd93c0866f0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.959+0000 7fd942622640 1 -- 192.168.123.107:0/3502430879 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd93c086fb0 con 0x7fd93c0866f0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.959+0000 7fd9327fc640 1 -- 192.168.123.107:0/3502430879 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7fd938035040 con 0x7fd93c0866f0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.959+0000 7fd9327fc640 1 
-- 192.168.123.107:0/3502430879 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd938003ea0 con 0x7fd93c0866f0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.960+0000 7fd942622640 1 -- 192.168.123.107:0/3502430879 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd90c005350 con 0x7fd93c0866f0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.961+0000 7fd9327fc640 1 -- 192.168.123.107:0/3502430879 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 7) v1 ==== 49264+0+0 (secure 0 0 0) 0x7fd938039030 con 0x7fd93c0866f0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.961+0000 7fd9327fc640 1 --2- 192.168.123.107:0/3502430879 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fd91c03d0c0 0x7fd91c03f580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.961+0000 7fd9327fc640 1 -- 192.168.123.107:0/3502430879 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fd9380774b0 con 0x7fd93c0866f0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.964+0000 7fd940e1f640 1 --2- 192.168.123.107:0/3502430879 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fd91c03d0c0 0x7fd91c03f580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.964+0000 7fd940e1f640 1 --2- 
192.168.123.107:0/3502430879 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fd91c03d0c0 0x7fd91c03f580 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fd93400ad30 tx=0x7fd9340093f0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:58.967+0000 7fd9327fc640 1 -- 192.168.123.107:0/3502430879 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fd93803aa00 con 0x7fd93c0866f0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.063+0000 7fd942622640 1 -- 192.168.123.107:0/3502430879 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}) v1 -- 0x7fd90c002bf0 con 0x7fd91c03d0c0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.064+0000 7fd9327fc640 1 -- 192.168.123.107:0/3502430879 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+16 (secure 0 0 0) 0x7fd90c002bf0 con 0x7fd91c03d0c0 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.066+0000 7fd907fff640 1 -- 192.168.123.107:0/3502430879 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fd91c03d0c0 msgr2=0x7fd91c03f580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.066+0000 7fd907fff640 1 --2- 192.168.123.107:0/3502430879 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fd91c03d0c0 0x7fd91c03f580 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 
crypto rx=0x7fd93400ad30 tx=0x7fd9340093f0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.066+0000 7fd907fff640 1 -- 192.168.123.107:0/3502430879 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c0866f0 msgr2=0x7fd93c089bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.066+0000 7fd907fff640 1 --2- 192.168.123.107:0/3502430879 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c0866f0 0x7fd93c089bf0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fd938006ef0 tx=0x7fd938003cb0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.066+0000 7fd907fff640 1 -- 192.168.123.107:0/3502430879 shutdown_connections 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.066+0000 7fd907fff640 1 --2- 192.168.123.107:0/3502430879 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fd91c03d0c0 0x7fd91c03f580 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.066+0000 7fd907fff640 1 --2- 192.168.123.107:0/3502430879 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd93c0866f0 0x7fd93c089bf0 secure :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fd938006ef0 tx=0x7fd938003cb0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.066+0000 7fd907fff640 1 -- 192.168.123.107:0/3502430879 >> 192.168.123.107:0/3502430879 conn(0x7fd93c06d060 msgr2=0x7fd93c06e9c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:59.104 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.067+0000 7fd907fff640 1 -- 192.168.123.107:0/3502430879 shutdown_connections 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.067+0000 7fd907fff640 1 -- 192.168.123.107:0/3502430879 wait complete. 2026-03-09T19:21:59.104 INFO:teuthology.orchestra.run.vm07.stdout:Generating ssh key... 2026-03-09T19:21:59.424 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.253+0000 7f6183b73640 1 Processor -- start 2026-03-09T19:21:59.424 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.254+0000 7f6183b73640 1 -- start start 2026-03-09T19:21:59.424 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.254+0000 7f6183b73640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c1086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:59.424 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.254+0000 7f6183b73640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f617c108cc0 con 0x7f617c1082f0 2026-03-09T19:21:59.424 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.254+0000 7f61818e8640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c1086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:59.424 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.254+0000 7f61818e8640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c1086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40508/0 (socket says 192.168.123.107:40508) 2026-03-09T19:21:59.424 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.254+0000 7f61818e8640 1 -- 192.168.123.107:0/3274993454 learned_addr learned my addr 192.168.123.107:0/3274993454 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:59.424 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.254+0000 7f61818e8640 1 -- 192.168.123.107:0/3274993454 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f617c1094a0 con 0x7f617c1082f0 2026-03-09T19:21:59.424 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.255+0000 7f61818e8640 1 --2- 192.168.123.107:0/3274993454 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c1086f0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f6164009920 tx=0x7f616402ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=8dd6992ad06ba477 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:59.424 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.255+0000 7f61808e6640 1 -- 192.168.123.107:0/3274993454 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f616402f9b0 con 0x7f617c1082f0 2026-03-09T19:21:59.424 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.255+0000 7f61808e6640 1 -- 192.168.123.107:0/3274993454 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f6164037440 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.255+0000 7f61808e6640 1 -- 192.168.123.107:0/3274993454 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f61640354e0 con 0x7f617c1082f0 2026-03-09T19:21:59.425 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.256+0000 7f6183b73640 1 -- 192.168.123.107:0/3274993454 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 msgr2=0x7f617c1086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.256+0000 7f6183b73640 1 --2- 192.168.123.107:0/3274993454 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c1086f0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f6164009920 tx=0x7f616402ef20 comp rx=0 tx=0).stop 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.256+0000 7f6183b73640 1 -- 192.168.123.107:0/3274993454 shutdown_connections 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.256+0000 7f6183b73640 1 --2- 192.168.123.107:0/3274993454 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c1086f0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.256+0000 7f6183b73640 1 -- 192.168.123.107:0/3274993454 >> 192.168.123.107:0/3274993454 conn(0x7f617c07ba00 msgr2=0x7f617c1066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.256+0000 7f6183b73640 1 -- 192.168.123.107:0/3274993454 shutdown_connections 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.256+0000 7f6183b73640 1 -- 192.168.123.107:0/3274993454 wait complete. 
2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.257+0000 7f6183b73640 1 Processor -- start 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.257+0000 7f6183b73640 1 -- start start 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.257+0000 7f6183b73640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c19e160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.257+0000 7f6183b73640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f617c19e6a0 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.257+0000 7f61818e8640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c19e160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.257+0000 7f61818e8640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c19e160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40520/0 (socket says 192.168.123.107:40520) 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.257+0000 7f61818e8640 1 -- 192.168.123.107:0/3044745739 learned_addr learned my addr 192.168.123.107:0/3044745739 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:59.425 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.257+0000 7f61818e8640 1 -- 192.168.123.107:0/3044745739 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f61640095d0 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.258+0000 7f61818e8640 1 --2- 192.168.123.107:0/3044745739 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c19e160 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f61640098f0 tx=0x7f61640359b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.258+0000 7f6172ffd640 1 -- 192.168.123.107:0/3044745739 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6164035d00 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.258+0000 7f6172ffd640 1 -- 192.168.123.107:0/3044745739 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f6164035e60 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.258+0000 7f6172ffd640 1 -- 192.168.123.107:0/3044745739 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6164040d70 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.258+0000 7f6183b73640 1 -- 192.168.123.107:0/3044745739 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f617c19e8a0 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.258+0000 7f6183b73640 1 
-- 192.168.123.107:0/3044745739 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f617c19ed40 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.259+0000 7f6172ffd640 1 -- 192.168.123.107:0/3044745739 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 7) v1 ==== 49264+0+0 (secure 0 0 0) 0x7f6164040600 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.260+0000 7f6183b73640 1 -- 192.168.123.107:0/3044745739 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6144005350 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.262+0000 7f6172ffd640 1 --2- 192.168.123.107:0/3044745739 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f615403d110 0x7f615403f5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.262+0000 7f6172ffd640 1 -- 192.168.123.107:0/3044745739 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f61640758f0 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.262+0000 7f6172ffd640 1 -- 192.168.123.107:0/3044745739 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6164036630 con 0x7f617c1082f0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.263+0000 7f61810e7640 1 --2- 192.168.123.107:0/3044745739 >> 
[v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f615403d110 0x7f615403f5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.263+0000 7f61810e7640 1 --2- 192.168.123.107:0/3044745739 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f615403d110 0x7f615403f5d0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f616c0099c0 tx=0x7f616c006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.349+0000 7f6183b73640 1 -- 192.168.123.107:0/3044745739 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f6144002bf0 con 0x7f615403d110 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.378+0000 7f6172ffd640 1 -- 192.168.123.107:0/3044745739 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f6144002bf0 con 0x7f615403d110 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.381+0000 7f6183b73640 1 -- 192.168.123.107:0/3044745739 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f615403d110 msgr2=0x7f615403f5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.381+0000 7f6183b73640 1 --2- 192.168.123.107:0/3044745739 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f615403d110 0x7f615403f5d0 secure :-1 
s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f616c0099c0 tx=0x7f616c006eb0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.381+0000 7f6183b73640 1 -- 192.168.123.107:0/3044745739 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 msgr2=0x7f617c19e160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.381+0000 7f6183b73640 1 --2- 192.168.123.107:0/3044745739 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c19e160 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f61640098f0 tx=0x7f61640359b0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.381+0000 7f6183b73640 1 -- 192.168.123.107:0/3044745739 shutdown_connections 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.381+0000 7f6183b73640 1 --2- 192.168.123.107:0/3044745739 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f615403d110 0x7f615403f5d0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.381+0000 7f6183b73640 1 --2- 192.168.123.107:0/3044745739 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f617c1082f0 0x7f617c19e160 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.381+0000 7f6183b73640 1 -- 192.168.123.107:0/3044745739 >> 192.168.123.107:0/3044745739 conn(0x7f617c07ba00 msgr2=0x7f617c105d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:59.425 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.381+0000 7f6183b73640 1 -- 192.168.123.107:0/3044745739 shutdown_connections 2026-03-09T19:21:59.425 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.381+0000 7f6183b73640 1 -- 192.168.123.107:0/3044745739 wait complete. 2026-03-09T19:21:59.641 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:59 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:21:59.641 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:59 vm07 ceph-mon[48545]: mgrmap e7: vm07.xacuym(active, since 1.3906s) 2026-03-09T19:21:59.641 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:59 vm07 ceph-mon[48545]: from='client.14122 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-09T19:21:59.641 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:59 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:21:59.641 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:59 vm07 ceph-mon[48545]: from='client.14122 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-09T19:21:59.641 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:59 vm07 ceph-mon[48545]: from='client.14130 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:21:59.641 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:59 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:21:59.641 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:59 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:21:59.641 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:59 vm07 ceph-mon[48545]: 
from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:21:59.641 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:59 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:21:59.641 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:21:59 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:21:59.727 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMjqCGWP8Zd5gsOjmAx27Qho/yQRsupkQt5TbUvfXgbc ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:21:59.727 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.544+0000 7f51eddb5640 1 Processor -- start 2026-03-09T19:21:59.727 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.544+0000 7f51eddb5640 1 -- start start 2026-03-09T19:21:59.727 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.544+0000 7f51eddb5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e807c8a0 0x7f51e807cca0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:59.727 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.544+0000 7f51eddb5640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f51e807d270 con 0x7f51e807c8a0 2026-03-09T19:21:59.727 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.544+0000 7f51e77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e807c8a0 0x7f51e807cca0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:59.727 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.544+0000 7f51e77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e807c8a0 0x7f51e807cca0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40526/0 (socket says 192.168.123.107:40526) 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.544+0000 7f51e77fe640 1 -- 192.168.123.107:0/3064627311 learned_addr learned my addr 192.168.123.107:0/3064627311 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.545+0000 7f51e77fe640 1 -- 192.168.123.107:0/3064627311 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f51e807daa0 con 0x7f51e807c8a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.545+0000 7f51e77fe640 1 --2- 192.168.123.107:0/3064627311 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e807c8a0 0x7f51e807cca0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f51cc009920 tx=0x7f51cc02ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3a5c7d6a32ea7f73 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.546+0000 7f51e67fc640 1 -- 192.168.123.107:0/3064627311 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f51cc02f9b0 con 0x7f51e807c8a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.546+0000 7f51e67fc640 1 -- 192.168.123.107:0/3064627311 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f51cc037440 con 0x7f51e807c8a0 
2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.546+0000 7f51eddb5640 1 -- 192.168.123.107:0/3064627311 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e807c8a0 msgr2=0x7f51e807cca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.546+0000 7f51eddb5640 1 --2- 192.168.123.107:0/3064627311 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e807c8a0 0x7f51e807cca0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f51cc009920 tx=0x7f51cc02ef20 comp rx=0 tx=0).stop 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.546+0000 7f51eddb5640 1 -- 192.168.123.107:0/3064627311 shutdown_connections 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.546+0000 7f51eddb5640 1 --2- 192.168.123.107:0/3064627311 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e807c8a0 0x7f51e807cca0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.546+0000 7f51eddb5640 1 -- 192.168.123.107:0/3064627311 >> 192.168.123.107:0/3064627311 conn(0x7f51e807ba00 msgr2=0x7f51e81066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.547+0000 7f51eddb5640 1 -- 192.168.123.107:0/3064627311 shutdown_connections 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.547+0000 7f51eddb5640 1 -- 192.168.123.107:0/3064627311 wait complete. 
2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.547+0000 7f51eddb5640 1 Processor -- start 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.547+0000 7f51eddb5640 1 -- start start 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.547+0000 7f51eddb5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e81a25a0 0x7f51e81a29c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.547+0000 7f51eddb5640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f51cc035340 con 0x7f51e81a25a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.547+0000 7f51e77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e81a25a0 0x7f51e81a29c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.547+0000 7f51e77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e81a25a0 0x7f51e81a29c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40542/0 (socket says 192.168.123.107:40542) 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.548+0000 7f51e77fe640 1 -- 192.168.123.107:0/482416427 learned_addr learned my addr 192.168.123.107:0/482416427 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:21:59.728 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.548+0000 7f51e77fe640 1 -- 192.168.123.107:0/482416427 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f51cc0095d0 con 0x7f51e81a25a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.548+0000 7f51e77fe640 1 --2- 192.168.123.107:0/482416427 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e81a25a0 0x7f51e81a29c0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f51cc02f450 tx=0x7f51cc0379e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.548+0000 7f51e4ff9640 1 -- 192.168.123.107:0/482416427 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f51cc037b60 con 0x7f51e81a25a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.548+0000 7f51e4ff9640 1 -- 192.168.123.107:0/482416427 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f51cc02fe30 con 0x7f51e81a25a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.548+0000 7f51e4ff9640 1 -- 192.168.123.107:0/482416427 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f51cc042dc0 con 0x7f51e81a25a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.548+0000 7f51eddb5640 1 -- 192.168.123.107:0/482416427 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f51e81a2f00 con 0x7f51e81a25a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.548+0000 7f51eddb5640 1 -- 
192.168.123.107:0/482416427 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f51e81a5aa0 con 0x7f51e81a25a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.549+0000 7f51e4ff9640 1 -- 192.168.123.107:0/482416427 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 7) v1 ==== 49264+0+0 (secure 0 0 0) 0x7f51cc0425f0 con 0x7f51e81a25a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.549+0000 7f51eddb5640 1 -- 192.168.123.107:0/482416427 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f51b0005350 con 0x7f51e81a25a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.550+0000 7f51e4ff9640 1 --2- 192.168.123.107:0/482416427 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f51c003d0c0 0x7f51c003f580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.550+0000 7f51e4ff9640 1 -- 192.168.123.107:0/482416427 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f51cc075530 con 0x7f51e81a25a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.551+0000 7f51e6ffd640 1 --2- 192.168.123.107:0/482416427 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f51c003d0c0 0x7f51c003f580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.551+0000 7f51e6ffd640 1 --2- 192.168.123.107:0/482416427 >> 
[v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f51c003d0c0 0x7f51c003f580 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f51d4009a10 tx=0x7f51d4006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.552+0000 7f51e4ff9640 1 -- 192.168.123.107:0/482416427 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f51cc040540 con 0x7f51e81a25a0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.641+0000 7f51eddb5640 1 -- 192.168.123.107:0/482416427 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f51b0002bf0 con 0x7f51c003d0c0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.643+0000 7f51e4ff9640 1 -- 192.168.123.107:0/482416427 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+123 (secure 0 0 0) 0x7f51b0002bf0 con 0x7f51c003d0c0 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.647+0000 7f51eddb5640 1 -- 192.168.123.107:0/482416427 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f51c003d0c0 msgr2=0x7f51c003f580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.647+0000 7f51eddb5640 1 --2- 192.168.123.107:0/482416427 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f51c003d0c0 0x7f51c003f580 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f51d4009a10 tx=0x7f51d4006eb0 comp 
rx=0 tx=0).stop 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.647+0000 7f51eddb5640 1 -- 192.168.123.107:0/482416427 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e81a25a0 msgr2=0x7f51e81a29c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.647+0000 7f51eddb5640 1 --2- 192.168.123.107:0/482416427 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e81a25a0 0x7f51e81a29c0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f51cc02f450 tx=0x7f51cc0379e0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.647+0000 7f51eddb5640 1 -- 192.168.123.107:0/482416427 shutdown_connections 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.647+0000 7f51eddb5640 1 --2- 192.168.123.107:0/482416427 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f51c003d0c0 0x7f51c003f580 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.647+0000 7f51eddb5640 1 --2- 192.168.123.107:0/482416427 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f51e81a25a0 0x7f51e81a29c0 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.647+0000 7f51eddb5640 1 -- 192.168.123.107:0/482416427 >> 192.168.123.107:0/482416427 conn(0x7f51e807ba00 msgr2=0x7f51e8106c10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.648+0000 7f51eddb5640 1 -- 
192.168.123.107:0/482416427 shutdown_connections 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.648+0000 7f51eddb5640 1 -- 192.168.123.107:0/482416427 wait complete. 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:Wrote public SSH key to /home/ubuntu/cephtest/ceph.pub 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:Adding key to root@localhost authorized_keys... 2026-03-09T19:21:59.728 INFO:teuthology.orchestra.run.vm07.stdout:Adding host vm07... 2026-03-09T19:22:01.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:00 vm07 ceph-mon[48545]: from='client.14132 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:01.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:00 vm07 ceph-mon[48545]: [09/Mar/2026:19:21:59] ENGINE Bus STARTING 2026-03-09T19:22:01.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:00 vm07 ceph-mon[48545]: [09/Mar/2026:19:21:59] ENGINE Serving on https://192.168.123.107:7150 2026-03-09T19:22:01.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:00 vm07 ceph-mon[48545]: [09/Mar/2026:19:21:59] ENGINE Client ('192.168.123.107', 57410) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T19:22:01.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:00 vm07 ceph-mon[48545]: from='client.14134 -' entity='client.admin' cmd=[{"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:01.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:00 vm07 ceph-mon[48545]: Generating ssh key... 
2026-03-09T19:22:01.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:00 vm07 ceph-mon[48545]: [09/Mar/2026:19:21:59] ENGINE Serving on http://192.168.123.107:8765 2026-03-09T19:22:01.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:00 vm07 ceph-mon[48545]: [09/Mar/2026:19:21:59] ENGINE Bus STARTED 2026-03-09T19:22:01.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:00 vm07 ceph-mon[48545]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:01.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:00 vm07 ceph-mon[48545]: mgrmap e8: vm07.xacuym(active, since 2s) 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout Added host 'vm07' with addr '192.168.123.107' 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.837+0000 7f010c4f4640 1 Processor -- start 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.838+0000 7f010c4f4640 1 -- start start 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.838+0000 7f010c4f4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f01041086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.838+0000 7f010c4f4640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0104108cc0 con 0x7f01041082f0 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.839+0000 7f010a269640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f01041086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.839+0000 7f010a269640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f01041086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40550/0 (socket says 192.168.123.107:40550) 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.839+0000 7f010a269640 1 -- 192.168.123.107:0/2613267822 learned_addr learned my addr 192.168.123.107:0/2613267822 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.839+0000 7f010a269640 1 -- 192.168.123.107:0/2613267822 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f01041094a0 con 0x7f01041082f0 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.840+0000 7f010a269640 1 --2- 192.168.123.107:0/2613267822 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f01041086f0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f00ec009920 tx=0x7f00ec02ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=de6e3a1133973235 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.840+0000 7f0109267640 1 -- 192.168.123.107:0/2613267822 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f00ec02f9b0 con 0x7f01041082f0 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.840+0000 7f0109267640 1 -- 192.168.123.107:0/2613267822 <== mon.0 v2:192.168.123.107:3300/0 2 ==== 
config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f00ec037440 con 0x7f01041082f0 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.840+0000 7f010c4f4640 1 -- 192.168.123.107:0/2613267822 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 msgr2=0x7f01041086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.840+0000 7f010c4f4640 1 --2- 192.168.123.107:0/2613267822 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f01041086f0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f00ec009920 tx=0x7f00ec02ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.841+0000 7f010c4f4640 1 -- 192.168.123.107:0/2613267822 shutdown_connections 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.841+0000 7f010c4f4640 1 --2- 192.168.123.107:0/2613267822 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f01041086f0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.841+0000 7f010c4f4640 1 -- 192.168.123.107:0/2613267822 >> 192.168.123.107:0/2613267822 conn(0x7f010407ba00 msgr2=0x7f01041066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.841+0000 7f010c4f4640 1 -- 192.168.123.107:0/2613267822 shutdown_connections 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.841+0000 7f010c4f4640 1 -- 192.168.123.107:0/2613267822 wait complete. 
2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.841+0000 7f010c4f4640 1 Processor -- start 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.842+0000 7f010c4f4640 1 -- start start 2026-03-09T19:22:01.364 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.842+0000 7f010c4f4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f010419e070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:01.365 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.842+0000 7f010c4f4640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00ec035340 con 0x7f01041082f0 2026-03-09T19:22:01.365 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.842+0000 7f010a269640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f010419e070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:01.365 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.842+0000 7f010a269640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f010419e070 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40556/0 (socket says 192.168.123.107:40556) 2026-03-09T19:22:01.365 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.842+0000 7f010a269640 1 -- 192.168.123.107:0/2036787213 learned_addr learned my addr 192.168.123.107:0/2036787213 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:01.365 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.842+0000 7f010a269640 1 -- 192.168.123.107:0/2036787213 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00ec0095d0 con 0x7f01041082f0 2026-03-09T19:22:01.367 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.842+0000 7f010a269640 1 --2- 192.168.123.107:0/2036787213 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f010419e070 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f00ec02f4d0 tx=0x7f00ec035d40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:01.367 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.843+0000 7f00fb7fe640 1 -- 192.168.123.107:0/2036787213 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f00ec02fbc0 con 0x7f01041082f0 2026-03-09T19:22:01.367 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.843+0000 7f00fb7fe640 1 -- 192.168.123.107:0/2036787213 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f00ec02fd20 con 0x7f01041082f0 2026-03-09T19:22:01.367 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.843+0000 7f00fb7fe640 1 -- 192.168.123.107:0/2036787213 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f00ec03e4f0 con 0x7f01041082f0 2026-03-09T19:22:01.367 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.843+0000 7f010c4f4640 1 -- 192.168.123.107:0/2036787213 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f010419e5b0 con 0x7f01041082f0 2026-03-09T19:22:01.367 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.843+0000 7f010c4f4640 1 
-- 192.168.123.107:0/2036787213 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f010419e9f0 con 0x7f01041082f0 2026-03-09T19:22:01.367 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.844+0000 7f00fb7fe640 1 -- 192.168.123.107:0/2036787213 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f00ec03e070 con 0x7f01041082f0 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.844+0000 7f010c4f4640 1 -- 192.168.123.107:0/2036787213 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f010410cdd0 con 0x7f01041082f0 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.847+0000 7f00fb7fe640 1 --2- 192.168.123.107:0/2036787213 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f00e003cec0 0x7f00e003f380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.847+0000 7f00fb7fe640 1 -- 192.168.123.107:0/2036787213 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f00ec075300 con 0x7f01041082f0 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.847+0000 7f0109a68640 1 --2- 192.168.123.107:0/2036787213 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f00e003cec0 0x7f00e003f380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.848+0000 7f0109a68640 1 --2- 
192.168.123.107:0/2036787213 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f00e003cec0 0x7f00e003f380 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f00f40099c0 tx=0x7f00f4006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.848+0000 7f00fb7fe640 1 -- 192.168.123.107:0/2036787213 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f00ec0482e0 con 0x7f01041082f0 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:21:59.939+0000 7f010c4f4640 1 -- 192.168.123.107:0/2036787213 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm07", "addr": "192.168.123.107", "target": ["mon-mgr", ""]}) v1 -- 0x7f01041063a0 con 0x7f00e003cec0 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.330+0000 7f00fb7fe640 1 -- 192.168.123.107:0/2036787213 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f01041063a0 con 0x7f00e003cec0 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.334+0000 7f010c4f4640 1 -- 192.168.123.107:0/2036787213 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f00e003cec0 msgr2=0x7f00e003f380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.334+0000 7f010c4f4640 1 --2- 192.168.123.107:0/2036787213 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f00e003cec0 0x7f00e003f380 secure :-1 
s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f00f40099c0 tx=0x7f00f4006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.334+0000 7f010c4f4640 1 -- 192.168.123.107:0/2036787213 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 msgr2=0x7f010419e070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.334+0000 7f010c4f4640 1 --2- 192.168.123.107:0/2036787213 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f010419e070 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f00ec02f4d0 tx=0x7f00ec035d40 comp rx=0 tx=0).stop 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.334+0000 7f010c4f4640 1 -- 192.168.123.107:0/2036787213 shutdown_connections 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.334+0000 7f010c4f4640 1 --2- 192.168.123.107:0/2036787213 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f00e003cec0 0x7f00e003f380 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.334+0000 7f010c4f4640 1 --2- 192.168.123.107:0/2036787213 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01041082f0 0x7f010419e070 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.334+0000 7f010c4f4640 1 -- 192.168.123.107:0/2036787213 >> 192.168.123.107:0/2036787213 conn(0x7f010407ba00 msgr2=0x7f0104105c90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:01.368 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.334+0000 7f010c4f4640 1 -- 192.168.123.107:0/2036787213 shutdown_connections 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.334+0000 7f010c4f4640 1 -- 192.168.123.107:0/2036787213 wait complete. 2026-03-09T19:22:01.368 INFO:teuthology.orchestra.run.vm07.stdout:Deploying mon service with default placement... 2026-03-09T19:22:01.695 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout Scheduled mon update... 2026-03-09T19:22:01.695 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.492+0000 7f93b71f4640 1 Processor -- start 2026-03-09T19:22:01.695 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.492+0000 7f93b71f4640 1 -- start start 2026-03-09T19:22:01.695 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.492+0000 7f93b71f4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 0x7f93b0071440 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.492+0000 7f93b71f4640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93b0071a10 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.496+0000 7f93b61f2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 0x7f93b0071440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.496+0000 7f93b61f2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f93b0072f30 0x7f93b0071440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40562/0 (socket says 192.168.123.107:40562) 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.496+0000 7f93b61f2640 1 -- 192.168.123.107:0/4174500642 learned_addr learned my addr 192.168.123.107:0/4174500642 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.496+0000 7f93b61f2640 1 -- 192.168.123.107:0/4174500642 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f93b0071b50 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.497+0000 7f93b61f2640 1 --2- 192.168.123.107:0/4174500642 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 0x7f93b0071440 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f93a8009e80 tx=0x7f93a802f360 comp rx=0 tx=0).ready entity=mon.0 client_cookie=84e6edfd968bd7a5 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.498+0000 7f93b51f0640 1 -- 192.168.123.107:0/4174500642 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f93a8006ea0 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.498+0000 7f93b51f0640 1 -- 192.168.123.107:0/4174500642 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f93a80373f0 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.498+0000 7f93b51f0640 1 -- 192.168.123.107:0/4174500642 <== mon.0 
v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f93a8007590 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.500+0000 7f93b71f4640 1 -- 192.168.123.107:0/4174500642 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 msgr2=0x7f93b0071440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.500+0000 7f93b71f4640 1 --2- 192.168.123.107:0/4174500642 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 0x7f93b0071440 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f93a8009e80 tx=0x7f93a802f360 comp rx=0 tx=0).stop 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.500+0000 7f93b71f4640 1 -- 192.168.123.107:0/4174500642 shutdown_connections 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.500+0000 7f93b71f4640 1 --2- 192.168.123.107:0/4174500642 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 0x7f93b0071440 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.500+0000 7f93b71f4640 1 -- 192.168.123.107:0/4174500642 >> 192.168.123.107:0/4174500642 conn(0x7f93b006d060 msgr2=0x7f93b006f480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.500+0000 7f93b71f4640 1 -- 192.168.123.107:0/4174500642 shutdown_connections 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.500+0000 7f93b71f4640 1 -- 192.168.123.107:0/4174500642 wait complete. 
2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.501+0000 7f93b71f4640 1 Processor -- start 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.501+0000 7f93b71f4640 1 -- start start 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.501+0000 7f93b71f4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 0x7f93b011cde0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.501+0000 7f93b71f4640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93b011b510 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.501+0000 7f93b61f2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 0x7f93b011cde0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.502+0000 7f93b61f2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 0x7f93b011cde0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40564/0 (socket says 192.168.123.107:40564) 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.502+0000 7f93b61f2640 1 -- 192.168.123.107:0/2902386291 learned_addr learned my addr 192.168.123.107:0/2902386291 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:01.696 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.502+0000 7f93b61f2640 1 -- 192.168.123.107:0/2902386291 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f93a8009b30 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.502+0000 7f93b61f2640 1 --2- 192.168.123.107:0/2902386291 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 0x7f93b011cde0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f93a8007670 tx=0x7f93a80076a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.502+0000 7f939f7fe640 1 -- 192.168.123.107:0/2902386291 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f93a8006ea0 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.502+0000 7f939f7fe640 1 -- 192.168.123.107:0/2902386291 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1041+0+0 (secure 0 0 0) 0x7f93a8007ce0 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.503+0000 7f939f7fe640 1 -- 192.168.123.107:0/2902386291 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f93a8041d90 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.503+0000 7f93b71f4640 1 -- 192.168.123.107:0/2902386291 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f93b011b710 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.503+0000 7f93b71f4640 1 
-- 192.168.123.107:0/2902386291 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f93b011bbb0 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.504+0000 7f939f7fe640 1 -- 192.168.123.107:0/2902386291 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f93a8041430 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.504+0000 7f939f7fe640 1 --2- 192.168.123.107:0/2902386291 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f938c03ce20 0x7f938c03f2e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.504+0000 7f939f7fe640 1 -- 192.168.123.107:0/2902386291 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f93a8076160 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.504+0000 7f93b59f1640 1 --2- 192.168.123.107:0/2902386291 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f938c03ce20 0x7f938c03f2e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.505+0000 7f93b59f1640 1 --2- 192.168.123.107:0/2902386291 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f938c03ce20 0x7f938c03f2e0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f93a40099c0 tx=0x7f93a4006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:01.696 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.505+0000 7f93b71f4640 1 -- 192.168.123.107:0/2902386291 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f93b0111000 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.508+0000 7f939f7fe640 1 -- 192.168.123.107:0/2902386291 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f93a8048310 con 0x7f93b0072f30 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.619+0000 7f93b71f4640 1 -- 192.168.123.107:0/2902386291 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}) v1 -- 0x7f93b011bed0 con 0x7f938c03ce20 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.631+0000 7f939f7fe640 1 -- 192.168.123.107:0/2902386291 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f93b011bed0 con 0x7f938c03ce20 2026-03-09T19:22:01.696 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.633+0000 7f93b71f4640 1 -- 192.168.123.107:0/2902386291 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f938c03ce20 msgr2=0x7f938c03f2e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:01.697 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.633+0000 7f93b71f4640 1 --2- 192.168.123.107:0/2902386291 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f938c03ce20 0x7f938c03f2e0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto 
rx=0x7f93a40099c0 tx=0x7f93a4006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:01.697 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.633+0000 7f93b71f4640 1 -- 192.168.123.107:0/2902386291 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 msgr2=0x7f93b011cde0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:01.697 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.633+0000 7f93b71f4640 1 --2- 192.168.123.107:0/2902386291 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 0x7f93b011cde0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f93a8007670 tx=0x7f93a80076a0 comp rx=0 tx=0).stop 2026-03-09T19:22:01.697 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.633+0000 7f93b71f4640 1 -- 192.168.123.107:0/2902386291 shutdown_connections 2026-03-09T19:22:01.697 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.634+0000 7f93b71f4640 1 --2- 192.168.123.107:0/2902386291 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f938c03ce20 0x7f938c03f2e0 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:01.697 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.634+0000 7f93b71f4640 1 --2- 192.168.123.107:0/2902386291 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b0072f30 0x7f93b011cde0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:01.697 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.634+0000 7f93b71f4640 1 -- 192.168.123.107:0/2902386291 >> 192.168.123.107:0/2902386291 conn(0x7f93b006d060 msgr2=0x7f93b0112990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:01.697 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:22:01.634+0000 7f93b71f4640 1 -- 192.168.123.107:0/2902386291 shutdown_connections 2026-03-09T19:22:01.697 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.634+0000 7f93b71f4640 1 -- 192.168.123.107:0/2902386291 wait complete. 2026-03-09T19:22:01.697 INFO:teuthology.orchestra.run.vm07.stdout:Deploying mgr service with default placement... 2026-03-09T19:22:01.870 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:01 vm07 ceph-mon[48545]: from='client.14138 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm07", "addr": "192.168.123.107", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:01.870 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:01 vm07 ceph-mon[48545]: Deploying cephadm binary to vm07 2026-03-09T19:22:01.870 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:01 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:01.870 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:01 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:22:01.870 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:01 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:01.870 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:01 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:02.030 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout Scheduled mgr update... 
2026-03-09T19:22:02.030 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.870+0000 7fbc824d7640 1 Processor -- start 2026-03-09T19:22:02.030 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.870+0000 7fbc824d7640 1 -- start start 2026-03-09T19:22:02.030 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.871+0000 7fbc824d7640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c0718f0 0x7fbc7c071cf0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:02.030 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.871+0000 7fbc824d7640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc7c0722c0 con 0x7fbc7c0718f0 2026-03-09T19:22:02.030 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.872+0000 7fbc73fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c0718f0 0x7fbc7c071cf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:02.030 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.872+0000 7fbc73fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c0718f0 0x7fbc7c071cf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40578/0 (socket says 192.168.123.107:40578) 2026-03-09T19:22:02.031 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.872+0000 7fbc73fff640 1 -- 192.168.123.107:0/2980571350 learned_addr learned my addr 192.168.123.107:0/2980571350 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:02.031 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.872+0000 7fbc73fff640 1 -- 192.168.123.107:0/2980571350 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbc7c072400 con 0x7fbc7c0718f0 2026-03-09T19:22:02.031 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.873+0000 7fbc73fff640 1 --2- 192.168.123.107:0/2980571350 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c0718f0 0x7fbc7c071cf0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fbc6c00d3a0 tx=0x7fbc6c0316b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b9e69c793ebe3380 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.873+0000 7fbc7b7fe640 1 -- 192.168.123.107:0/2980571350 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbc6c006ea0 con 0x7fbc7c0718f0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.874+0000 7fbc7b7fe640 1 -- 192.168.123.107:0/2980571350 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbc6c034030 con 0x7fbc7c0718f0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.874+0000 7fbc824d7640 1 -- 192.168.123.107:0/2980571350 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c0718f0 msgr2=0x7fbc7c071cf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.874+0000 7fbc824d7640 1 --2- 192.168.123.107:0/2980571350 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c0718f0 0x7fbc7c071cf0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fbc6c00d3a0 tx=0x7fbc6c0316b0 comp rx=0 tx=0).stop 
2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.874+0000 7fbc824d7640 1 -- 192.168.123.107:0/2980571350 shutdown_connections 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.874+0000 7fbc824d7640 1 --2- 192.168.123.107:0/2980571350 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c0718f0 0x7fbc7c071cf0 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.874+0000 7fbc824d7640 1 -- 192.168.123.107:0/2980571350 >> 192.168.123.107:0/2980571350 conn(0x7fbc7c06d080 msgr2=0x7fbc7c06f4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.875+0000 7fbc824d7640 1 -- 192.168.123.107:0/2980571350 shutdown_connections 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.875+0000 7fbc824d7640 1 -- 192.168.123.107:0/2980571350 wait complete. 
2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.875+0000 7fbc824d7640 1 Processor -- start 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.875+0000 7fbc824d7640 1 -- start start 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.876+0000 7fbc824d7640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c1a24d0 0x7fbc7c1a28f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.876+0000 7fbc824d7640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc6c03a800 con 0x7fbc7c1a24d0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.876+0000 7fbc73fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c1a24d0 0x7fbc7c1a28f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.876+0000 7fbc73fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c1a24d0 0x7fbc7c1a28f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40592/0 (socket says 192.168.123.107:40592) 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.876+0000 7fbc73fff640 1 -- 192.168.123.107:0/1569433678 learned_addr learned my addr 192.168.123.107:0/1569433678 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:02.032 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.876+0000 7fbc73fff640 1 -- 192.168.123.107:0/1569433678 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbc6c00d050 con 0x7fbc7c1a24d0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.876+0000 7fbc73fff640 1 --2- 192.168.123.107:0/1569433678 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c1a24d0 0x7fbc7c1a28f0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fbc6c007210 tx=0x7fbc6c007e30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.876+0000 7fbc79ffb640 1 -- 192.168.123.107:0/1569433678 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbc6c046070 con 0x7fbc7c1a24d0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.876+0000 7fbc824d7640 1 -- 192.168.123.107:0/1569433678 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbc7c1a2e30 con 0x7fbc7c1a24d0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.876+0000 7fbc824d7640 1 -- 192.168.123.107:0/1569433678 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbc7c1a39c0 con 0x7fbc7c1a24d0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.877+0000 7fbc824d7640 1 -- 192.168.123.107:0/1569433678 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbc48005350 con 0x7fbc7c1a24d0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:22:01.880+0000 7fbc79ffb640 1 -- 192.168.123.107:0/1569433678 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbc6c034050 con 0x7fbc7c1a24d0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.880+0000 7fbc79ffb640 1 -- 192.168.123.107:0/1569433678 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbc6c037b40 con 0x7fbc7c1a24d0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.880+0000 7fbc79ffb640 1 -- 192.168.123.107:0/1569433678 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7fbc6c041070 con 0x7fbc7c1a24d0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.881+0000 7fbc79ffb640 1 --2- 192.168.123.107:0/1569433678 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbc4403d280 0x7fbc4403f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.881+0000 7fbc79ffb640 1 -- 192.168.123.107:0/1569433678 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fbc6c077120 con 0x7fbc7c1a24d0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.881+0000 7fbc7bfff640 1 --2- 192.168.123.107:0/1569433678 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbc4403d280 0x7fbc4403f740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.881+0000 7fbc7bfff640 1 --2- 
192.168.123.107:0/1569433678 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbc4403d280 0x7fbc4403f740 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fbc640099c0 tx=0x7fbc64006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.881+0000 7fbc79ffb640 1 -- 192.168.123.107:0/1569433678 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fbc6c077550 con 0x7fbc7c1a24d0 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.994+0000 7fbc824d7640 1 -- 192.168.123.107:0/1569433678 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7fbc48002bf0 con 0x7fbc4403d280 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:01.998+0000 7fbc79ffb640 1 -- 192.168.123.107:0/1569433678 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7fbc48002bf0 con 0x7fbc4403d280 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.002+0000 7fbc72ffd640 1 -- 192.168.123.107:0/1569433678 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbc4403d280 msgr2=0x7fbc4403f740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.002+0000 7fbc72ffd640 1 --2- 192.168.123.107:0/1569433678 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbc4403d280 0x7fbc4403f740 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 
crypto rx=0x7fbc640099c0 tx=0x7fbc64006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.002+0000 7fbc72ffd640 1 -- 192.168.123.107:0/1569433678 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c1a24d0 msgr2=0x7fbc7c1a28f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.002+0000 7fbc72ffd640 1 --2- 192.168.123.107:0/1569433678 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c1a24d0 0x7fbc7c1a28f0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fbc6c007210 tx=0x7fbc6c007e30 comp rx=0 tx=0).stop 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.002+0000 7fbc72ffd640 1 -- 192.168.123.107:0/1569433678 shutdown_connections 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.002+0000 7fbc72ffd640 1 --2- 192.168.123.107:0/1569433678 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbc4403d280 0x7fbc4403f740 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.002+0000 7fbc72ffd640 1 --2- 192.168.123.107:0/1569433678 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbc7c1a24d0 0x7fbc7c1a28f0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.002+0000 7fbc72ffd640 1 -- 192.168.123.107:0/1569433678 >> 192.168.123.107:0/1569433678 conn(0x7fbc7c06d080 msgr2=0x7fbc7c06d840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:22:02.003+0000 7fbc72ffd640 1 -- 192.168.123.107:0/1569433678 shutdown_connections 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.004+0000 7fbc72ffd640 1 -- 192.168.123.107:0/1569433678 wait complete. 2026-03-09T19:22:02.032 INFO:teuthology.orchestra.run.vm07.stdout:Deploying crash service with default placement... 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout Scheduled crash update... 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.176+0000 7f9179a3b640 1 Processor -- start 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.176+0000 7f9179a3b640 1 -- start start 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.176+0000 7f9179a3b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f91741062b0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.177+0000 7f9172ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f91741062b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.177+0000 7f9172ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f91741062b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40594/0 (socket says 192.168.123.107:40594) 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: 
stderr 2026-03-09T19:22:02.177+0000 7f9172ffd640 1 -- 192.168.123.107:0/1571090726 learned_addr learned my addr 192.168.123.107:0/1571090726 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.177+0000 7f9179a3b640 1 -- 192.168.123.107:0/1571090726 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91741067f0 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.179+0000 7f9172ffd640 1 -- 192.168.123.107:0/1571090726 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9174106930 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.179+0000 7f9172ffd640 1 --2- 192.168.123.107:0/1571090726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f91741062b0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f9168009920 tx=0x7f916802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=91c59251bcfea2c6 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.180+0000 7f9171ffb640 1 -- 192.168.123.107:0/1571090726 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f916802f9b0 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.180+0000 7f9171ffb640 1 -- 192.168.123.107:0/1571090726 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9168037440 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.180+0000 7f9171ffb640 1 -- 192.168.123.107:0/1571090726 <== mon.0 v2:192.168.123.107:3300/0 3 ==== 
mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9168035560 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.180+0000 7f9179a3b640 1 -- 192.168.123.107:0/1571090726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 msgr2=0x7f91741062b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.180+0000 7f9179a3b640 1 --2- 192.168.123.107:0/1571090726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f91741062b0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f9168009920 tx=0x7f916802ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.181+0000 7f9179a3b640 1 -- 192.168.123.107:0/1571090726 shutdown_connections 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.181+0000 7f9179a3b640 1 --2- 192.168.123.107:0/1571090726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f91741062b0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.181+0000 7f9179a3b640 1 -- 192.168.123.107:0/1571090726 >> 192.168.123.107:0/1571090726 conn(0x7f91741016d0 msgr2=0x7f9174103af0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.182+0000 7f9179a3b640 1 -- 192.168.123.107:0/1571090726 shutdown_connections 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.182+0000 7f9179a3b640 1 -- 192.168.123.107:0/1571090726 wait complete. 
2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.182+0000 7f9179a3b640 1 Processor -- start 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.182+0000 7f9179a3b640 1 -- start start 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.182+0000 7f9179a3b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f917407bbc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.182+0000 7f9179a3b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f917407c100 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.183+0000 7f9172ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f917407bbc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.183+0000 7f9172ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f917407bbc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40610/0 (socket says 192.168.123.107:40610) 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.183+0000 7f9172ffd640 1 -- 192.168.123.107:0/1370919545 learned_addr learned my addr 192.168.123.107:0/1370919545 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:02.344 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.183+0000 7f9172ffd640 1 -- 192.168.123.107:0/1370919545 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f91680095d0 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.183+0000 7f9172ffd640 1 --2- 192.168.123.107:0/1370919545 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f917407bbc0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f9168037d90 tx=0x7f9168037990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.184+0000 7f9178a39640 1 -- 192.168.123.107:0/1370919545 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f916802fe40 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.184+0000 7f9178a39640 1 -- 192.168.123.107:0/1370919545 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9168035ce0 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.184+0000 7f9178a39640 1 -- 192.168.123.107:0/1370919545 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f916803fdd0 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.184+0000 7f9179a3b640 1 -- 192.168.123.107:0/1370919545 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f917407a2f0 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.184+0000 7f9179a3b640 1 
-- 192.168.123.107:0/1370919545 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f917407a7f0 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.186+0000 7f9178a39640 1 -- 192.168.123.107:0/1370919545 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f916803e070 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.186+0000 7f9178a39640 1 --2- 192.168.123.107:0/1370919545 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f915803d230 0x7f915803f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.186+0000 7f9178a39640 1 -- 192.168.123.107:0/1370919545 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f91680764c0 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.186+0000 7f91727fc640 1 --2- 192.168.123.107:0/1370919545 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f915803d230 0x7f915803f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.186+0000 7f9179a3b640 1 -- 192.168.123.107:0/1370919545 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f913c005350 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.189+0000 7f91727fc640 1 --2- 
192.168.123.107:0/1370919545 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f915803d230 0x7f915803f6f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f915c0099c0 tx=0x7f915c006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.189+0000 7f9178a39640 1 -- 192.168.123.107:0/1370919545 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f916803c020 con 0x7f9174105eb0 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.296+0000 7f9179a3b640 1 -- 192.168.123.107:0/1370919545 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}) v1 -- 0x7f913c002bf0 con 0x7f915803d230 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.301+0000 7f9178a39640 1 -- 192.168.123.107:0/1370919545 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+26 (secure 0 0 0) 0x7f913c002bf0 con 0x7f915803d230 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.304+0000 7f91527fc640 1 -- 192.168.123.107:0/1370919545 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f915803d230 msgr2=0x7f915803f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:02.344 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.304+0000 7f91527fc640 1 --2- 192.168.123.107:0/1370919545 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f915803d230 0x7f915803f6f0 secure :-1 s=READY pgs=14 cs=0 l=1 
rev1=1 crypto rx=0x7f915c0099c0 tx=0x7f915c006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:02.345 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.304+0000 7f91527fc640 1 -- 192.168.123.107:0/1370919545 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 msgr2=0x7f917407bbc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:02.345 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.304+0000 7f91527fc640 1 --2- 192.168.123.107:0/1370919545 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f917407bbc0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f9168037d90 tx=0x7f9168037990 comp rx=0 tx=0).stop 2026-03-09T19:22:02.345 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.305+0000 7f91527fc640 1 -- 192.168.123.107:0/1370919545 shutdown_connections 2026-03-09T19:22:02.345 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.305+0000 7f91527fc640 1 --2- 192.168.123.107:0/1370919545 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f915803d230 0x7f915803f6f0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:02.345 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.305+0000 7f91527fc640 1 --2- 192.168.123.107:0/1370919545 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9174105eb0 0x7f917407bbc0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:02.345 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.305+0000 7f91527fc640 1 -- 192.168.123.107:0/1370919545 >> 192.168.123.107:0/1370919545 conn(0x7f91741016d0 msgr2=0x7f91741020d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:02.345 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: 
stderr 2026-03-09T19:22:02.305+0000 7f91527fc640 1 -- 192.168.123.107:0/1370919545 shutdown_connections 2026-03-09T19:22:02.345 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.305+0000 7f91527fc640 1 -- 192.168.123.107:0/1370919545 wait complete. 2026-03-09T19:22:02.345 INFO:teuthology.orchestra.run.vm07.stdout:Deploying ceph-exporter service with default placement... 2026-03-09T19:22:03.004 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:02 vm07 ceph-mon[48545]: Added host vm07 2026-03-09T19:22:03.004 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:02 vm07 ceph-mon[48545]: from='client.14140 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:03.004 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:02 vm07 ceph-mon[48545]: Saving service mon spec with placement count:5 2026-03-09T19:22:03.004 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:02 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:03.004 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:02 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:03.004 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:02 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:03.004 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:02 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:03.004 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:02 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:03.006 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout Scheduled ceph-exporter update... 
2026-03-09T19:22:03.007 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.480+0000 7f6f7c5d4640 1 Processor -- start 2026-03-09T19:22:03.007 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.481+0000 7f6f7c5d4640 1 -- start start 2026-03-09T19:22:03.007 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.481+0000 7f6f7c5d4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f74108530 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.007 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.481+0000 7f6f7c5d4640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f74108b00 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.481+0000 7f6f7a349640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f74108530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.481+0000 7f6f7a349640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f74108530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40618/0 (socket says 192.168.123.107:40618) 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.481+0000 7f6f7a349640 1 -- 192.168.123.107:0/2664383049 learned_addr learned my addr 192.168.123.107:0/2664383049 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:03.008 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.482+0000 7f6f7a349640 1 -- 192.168.123.107:0/2664383049 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6f741092e0 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.482+0000 7f6f7a349640 1 --2- 192.168.123.107:0/2664383049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f74108530 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f6f64009920 tx=0x7f6f6402ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=8b04287ccfedc3a0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.483+0000 7f6f79347640 1 -- 192.168.123.107:0/2664383049 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f6402f9b0 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.483+0000 7f6f79347640 1 -- 192.168.123.107:0/2664383049 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6f64037440 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.483+0000 7f6f79347640 1 -- 192.168.123.107:0/2664383049 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f64035560 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.483+0000 7f6f7c5d4640 1 -- 192.168.123.107:0/2664383049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 msgr2=0x7f6f74108530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:22:02.483+0000 7f6f7c5d4640 1 --2- 192.168.123.107:0/2664383049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f74108530 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f6f64009920 tx=0x7f6f6402ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.483+0000 7f6f7c5d4640 1 -- 192.168.123.107:0/2664383049 shutdown_connections 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.483+0000 7f6f7c5d4640 1 --2- 192.168.123.107:0/2664383049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f74108530 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.483+0000 7f6f7c5d4640 1 -- 192.168.123.107:0/2664383049 >> 192.168.123.107:0/2664383049 conn(0x7f6f7407b6d0 msgr2=0x7f6f7407bae0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.484+0000 7f6f7c5d4640 1 -- 192.168.123.107:0/2664383049 shutdown_connections 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.484+0000 7f6f7c5d4640 1 -- 192.168.123.107:0/2664383049 wait complete. 
2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.484+0000 7f6f7c5d4640 1 Processor -- start 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.484+0000 7f6f7c5d4640 1 -- start start 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.484+0000 7f6f7c5d4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f7407fc50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.484+0000 7f6f7c5d4640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f74080190 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.484+0000 7f6f7a349640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f7407fc50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.484+0000 7f6f7a349640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f7407fc50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40620/0 (socket says 192.168.123.107:40620) 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.484+0000 7f6f7a349640 1 -- 192.168.123.107:0/4083209839 learned_addr learned my addr 192.168.123.107:0/4083209839 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:03.008 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.485+0000 7f6f7a349640 1 -- 192.168.123.107:0/4083209839 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6f640095d0 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.485+0000 7f6f7a349640 1 --2- 192.168.123.107:0/4083209839 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f7407fc50 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f6f64009a50 tx=0x7f6f6402fbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.485+0000 7f6f637fe640 1 -- 192.168.123.107:0/4083209839 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f64035820 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.485+0000 7f6f637fe640 1 -- 192.168.123.107:0/4083209839 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6f64035e40 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.485+0000 7f6f637fe640 1 -- 192.168.123.107:0/4083209839 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f6403fdc0 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.485+0000 7f6f7c5d4640 1 -- 192.168.123.107:0/4083209839 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6f74080390 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.485+0000 7f6f7c5d4640 1 
-- 192.168.123.107:0/4083209839 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6f7407c790 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.486+0000 7f6f7c5d4640 1 -- 192.168.123.107:0/4083209839 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6f3c005350 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.487+0000 7f6f637fe640 1 -- 192.168.123.107:0/4083209839 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f6f6403e070 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.487+0000 7f6f637fe640 1 --2- 192.168.123.107:0/4083209839 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f6f5003d230 0x7f6f5003f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.487+0000 7f6f637fe640 1 -- 192.168.123.107:0/4083209839 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f6f64075f00 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.487+0000 7f6f79b48640 1 --2- 192.168.123.107:0/4083209839 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f6f5003d230 0x7f6f5003f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.487+0000 7f6f79b48640 1 --2- 
192.168.123.107:0/4083209839 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f6f5003d230 0x7f6f5003f6f0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f6f68009a10 tx=0x7f6f68006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.489+0000 7f6f637fe640 1 -- 192.168.123.107:0/4083209839 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6f64036db0 con 0x7f6f74108130 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.604+0000 7f6f7c5d4640 1 -- 192.168.123.107:0/4083209839 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f6f3c002bf0 con 0x7f6f5003d230 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.609+0000 7f6f637fe640 1 -- 192.168.123.107:0/4083209839 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f6f3c002bf0 con 0x7f6f5003d230 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.611+0000 7f6f617fa640 1 -- 192.168.123.107:0/4083209839 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f6f5003d230 msgr2=0x7f6f5003f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.612+0000 7f6f617fa640 1 --2- 192.168.123.107:0/4083209839 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f6f5003d230 0x7f6f5003f6f0 secure :-1 s=READY pgs=15 cs=0 
l=1 rev1=1 crypto rx=0x7f6f68009a10 tx=0x7f6f68006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.612+0000 7f6f617fa640 1 -- 192.168.123.107:0/4083209839 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 msgr2=0x7f6f7407fc50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.612+0000 7f6f617fa640 1 --2- 192.168.123.107:0/4083209839 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f7407fc50 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f6f64009a50 tx=0x7f6f6402fbe0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.612+0000 7f6f617fa640 1 -- 192.168.123.107:0/4083209839 shutdown_connections 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.612+0000 7f6f617fa640 1 --2- 192.168.123.107:0/4083209839 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f6f5003d230 0x7f6f5003f6f0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.612+0000 7f6f617fa640 1 --2- 192.168.123.107:0/4083209839 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f74108130 0x7f6f7407fc50 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.612+0000 7f6f617fa640 1 -- 192.168.123.107:0/4083209839 >> 192.168.123.107:0/4083209839 conn(0x7f6f7407b6d0 msgr2=0x7f6f74105720 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: 
stderr 2026-03-09T19:22:02.612+0000 7f6f617fa640 1 -- 192.168.123.107:0/4083209839 shutdown_connections 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:02.612+0000 7f6f617fa640 1 -- 192.168.123.107:0/4083209839 wait complete. 2026-03-09T19:22:03.008 INFO:teuthology.orchestra.run.vm07.stdout:Deploying prometheus service with default placement... 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout Scheduled prometheus update... 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.135+0000 7fa4e0c9a640 1 Processor -- start 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.135+0000 7fa4e0c9a640 1 -- start start 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.135+0000 7fa4e0c9a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc072f70 0x7fa4dc071480 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.135+0000 7fa4e0c9a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa4dc071a50 con 0x7fa4dc072f70 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.136+0000 7fa4da575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc072f70 0x7fa4dc071480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.136+0000 7fa4da575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc072f70 0x7fa4dc071480 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40626/0 (socket says 192.168.123.107:40626) 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.136+0000 7fa4da575640 1 -- 192.168.123.107:0/2138548278 learned_addr learned my addr 192.168.123.107:0/2138548278 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.136+0000 7fa4da575640 1 -- 192.168.123.107:0/2138548278 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa4dc071b90 con 0x7fa4dc072f70 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.136+0000 7fa4da575640 1 --2- 192.168.123.107:0/2138548278 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc072f70 0x7fa4dc071480 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fa4d0009920 tx=0x7fa4d002ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=34d281a37d30dd3e server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.137+0000 7fa4d9573640 1 -- 192.168.123.107:0/2138548278 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa4d002f9b0 con 0x7fa4dc072f70 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.137+0000 7fa4d9573640 1 -- 192.168.123.107:0/2138548278 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa4d0037440 con 0x7fa4dc072f70 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.137+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2138548278 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fa4dc072f70 msgr2=0x7fa4dc071480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.137+0000 7fa4e0c9a640 1 --2- 192.168.123.107:0/2138548278 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc072f70 0x7fa4dc071480 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fa4d0009920 tx=0x7fa4d002ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.137+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2138548278 shutdown_connections 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.137+0000 7fa4e0c9a640 1 --2- 192.168.123.107:0/2138548278 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc072f70 0x7fa4dc071480 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.137+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2138548278 >> 192.168.123.107:0/2138548278 conn(0x7fa4dc06d080 msgr2=0x7fa4dc06f4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.138+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2138548278 shutdown_connections 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.138+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2138548278 wait complete. 
2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.138+0000 7fa4e0c9a640 1 Processor -- start 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.138+0000 7fa4e0c9a640 1 -- start start 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.138+0000 7fa4e0c9a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc11ced0 0x7fa4dc11b640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.138+0000 7fa4e0c9a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa4d00353c0 con 0x7fa4dc11ced0 2026-03-09T19:22:03.329 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.138+0000 7fa4da575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc11ced0 0x7fa4dc11b640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.138+0000 7fa4da575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc11ced0 0x7fa4dc11b640 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40630/0 (socket says 192.168.123.107:40630) 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.138+0000 7fa4da575640 1 -- 192.168.123.107:0/2349758477 learned_addr learned my addr 192.168.123.107:0/2349758477 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:03.330 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.139+0000 7fa4da575640 1 -- 192.168.123.107:0/2349758477 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa4d00095d0 con 0x7fa4dc11ced0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.139+0000 7fa4da575640 1 --2- 192.168.123.107:0/2349758477 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc11ced0 0x7fa4dc11b640 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fa4d002f450 tx=0x7fa4d00379e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.140+0000 7fa4c77fe640 1 -- 192.168.123.107:0/2349758477 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa4d0037af0 con 0x7fa4dc11ced0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.140+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2349758477 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa4dc11bb80 con 0x7fa4dc11ced0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.140+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2349758477 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa4dc11c080 con 0x7fa4dc11ced0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.141+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2349758477 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa4a0005350 con 0x7fa4dc11ced0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:22:03.140+0000 7fa4c77fe640 1 -- 192.168.123.107:0/2349758477 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa4d002fe70 con 0x7fa4dc11ced0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.141+0000 7fa4c77fe640 1 -- 192.168.123.107:0/2349758477 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa4d0042e00 con 0x7fa4dc11ced0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.141+0000 7fa4c77fe640 1 -- 192.168.123.107:0/2349758477 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7fa4d004c430 con 0x7fa4dc11ced0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.142+0000 7fa4c77fe640 1 --2- 192.168.123.107:0/2349758477 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fa4b403d230 0x7fa4b403f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.142+0000 7fa4c77fe640 1 -- 192.168.123.107:0/2349758477 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fa4d00767b0 con 0x7fa4dc11ced0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.143+0000 7fa4d9d74640 1 --2- 192.168.123.107:0/2349758477 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fa4b403d230 0x7fa4b403f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.143+0000 7fa4d9d74640 1 --2- 
192.168.123.107:0/2349758477 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fa4b403d230 0x7fa4b403f6f0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fa4c80099c0 tx=0x7fa4c8006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.146+0000 7fa4c77fe640 1 -- 192.168.123.107:0/2349758477 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa4d00369a0 con 0x7fa4dc11ced0 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.247+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2349758477 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}) v1 -- 0x7fa4a0002bf0 con 0x7fa4b403d230 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.252+0000 7fa4c77fe640 1 -- 192.168.123.107:0/2349758477 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+31 (secure 0 0 0) 0x7fa4a0002bf0 con 0x7fa4b403d230 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.256+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2349758477 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fa4b403d230 msgr2=0x7fa4b403f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.256+0000 7fa4e0c9a640 1 --2- 192.168.123.107:0/2349758477 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fa4b403d230 0x7fa4b403f6f0 secure :-1 s=READY pgs=16 cs=0 l=1 
rev1=1 crypto rx=0x7fa4c80099c0 tx=0x7fa4c8006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.256+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2349758477 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc11ced0 msgr2=0x7fa4dc11b640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.256+0000 7fa4e0c9a640 1 --2- 192.168.123.107:0/2349758477 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc11ced0 0x7fa4dc11b640 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fa4d002f450 tx=0x7fa4d00379e0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.258+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2349758477 shutdown_connections 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.258+0000 7fa4e0c9a640 1 --2- 192.168.123.107:0/2349758477 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fa4b403d230 0x7fa4b403f6f0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.258+0000 7fa4e0c9a640 1 --2- 192.168.123.107:0/2349758477 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa4dc11ced0 0x7fa4dc11b640 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.258+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2349758477 >> 192.168.123.107:0/2349758477 conn(0x7fa4dc06d080 msgr2=0x7fa4dc06d900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: 
stderr 2026-03-09T19:22:03.259+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2349758477 shutdown_connections 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.259+0000 7fa4e0c9a640 1 -- 192.168.123.107:0/2349758477 wait complete. 2026-03-09T19:22:03.330 INFO:teuthology.orchestra.run.vm07.stdout:Deploying grafana service with default placement... 2026-03-09T19:22:03.636 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout Scheduled grafana update... 2026-03-09T19:22:03.636 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.476+0000 7fac059e7640 1 Processor -- start 2026-03-09T19:22:03.636 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.476+0000 7fac059e7640 1 -- start start 2026-03-09T19:22:03.636 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.476+0000 7fac059e7640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac00071440 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.636 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.476+0000 7fac059e7640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac00071a10 con 0x7fac00072f30 2026-03-09T19:22:03.636 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.477+0000 7fac049e5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac00071440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.637 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.477+0000 7fac049e5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac00071440 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40646/0 (socket says 192.168.123.107:40646) 2026-03-09T19:22:03.637 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.477+0000 7fac049e5640 1 -- 192.168.123.107:0/3021237742 learned_addr learned my addr 192.168.123.107:0/3021237742 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:03.637 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.479+0000 7fac049e5640 1 -- 192.168.123.107:0/3021237742 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fac00071b50 con 0x7fac00072f30 2026-03-09T19:22:03.637 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.479+0000 7fac049e5640 1 --2- 192.168.123.107:0/3021237742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac00071440 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fabf4009e80 tx=0x7fabf402f360 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a56b8b45d95d8975 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.637 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.480+0000 7fabff7fe640 1 -- 192.168.123.107:0/3021237742 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fabf4006ea0 con 0x7fac00072f30 2026-03-09T19:22:03.637 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.480+0000 7fabff7fe640 1 -- 192.168.123.107:0/3021237742 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fabf402fb60 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.480+0000 7fabff7fe640 1 -- 192.168.123.107:0/3021237742 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fabf4007590 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.480+0000 7fac059e7640 1 -- 192.168.123.107:0/3021237742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 msgr2=0x7fac00071440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.480+0000 7fac059e7640 1 --2- 192.168.123.107:0/3021237742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac00071440 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fabf4009e80 tx=0x7fabf402f360 comp rx=0 tx=0).stop 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.480+0000 7fac059e7640 1 -- 192.168.123.107:0/3021237742 shutdown_connections 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.480+0000 7fac059e7640 1 --2- 192.168.123.107:0/3021237742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac00071440 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.480+0000 7fac059e7640 1 -- 192.168.123.107:0/3021237742 >> 192.168.123.107:0/3021237742 conn(0x7fac0006d060 msgr2=0x7fac0006f480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.480+0000 7fac059e7640 1 -- 192.168.123.107:0/3021237742 shutdown_connections 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.481+0000 7fac059e7640 1 -- 192.168.123.107:0/3021237742 wait complete. 
2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.481+0000 7fac059e7640 1 Processor -- start 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.481+0000 7fac059e7640 1 -- start start 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.481+0000 7fac059e7640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac0011ce20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.481+0000 7fac059e7640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac0011b550 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.481+0000 7fac049e5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac0011ce20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.481+0000 7fac049e5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac0011ce20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40658/0 (socket says 192.168.123.107:40658) 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.481+0000 7fac049e5640 1 -- 192.168.123.107:0/964828726 learned_addr learned my addr 192.168.123.107:0/964828726 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:03.638 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.482+0000 7fac049e5640 1 -- 192.168.123.107:0/964828726 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fabf4009b30 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.482+0000 7fac049e5640 1 --2- 192.168.123.107:0/964828726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac0011ce20 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fabf4039040 tx=0x7fabf4007df0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.482+0000 7fabfdffb640 1 -- 192.168.123.107:0/964828726 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fabf4037600 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.482+0000 7fac059e7640 1 -- 192.168.123.107:0/964828726 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fac0011b750 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.482+0000 7fac059e7640 1 -- 192.168.123.107:0/964828726 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fac0011bc50 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.483+0000 7fabfdffb640 1 -- 192.168.123.107:0/964828726 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fabf4037760 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.483+0000 7fabfdffb640 1 -- 
192.168.123.107:0/964828726 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fabf403f3d0 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.484+0000 7fabfdffb640 1 -- 192.168.123.107:0/964828726 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7fabf403f530 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.484+0000 7fabfdffb640 1 --2- 192.168.123.107:0/964828726 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fabcc03d230 0x7fabcc03f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.484+0000 7fabfffff640 1 --2- 192.168.123.107:0/964828726 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fabcc03d230 0x7fabcc03f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.484+0000 7fabfdffb640 1 -- 192.168.123.107:0/964828726 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fabf40769c0 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.485+0000 7fabd37fe640 1 -- 192.168.123.107:0/964828726 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fabc8005350 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.488+0000 7fabfffff640 1 --2- 192.168.123.107:0/964828726 >> 
[v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fabcc03d230 0x7fabcc03f6f0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fabf0009a10 tx=0x7fabf0006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.488+0000 7fabfdffb640 1 -- 192.168.123.107:0/964828726 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fabf4040b00 con 0x7fac00072f30 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.588+0000 7fabd37fe640 1 -- 192.168.123.107:0/964828726 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}) v1 -- 0x7fabc8002bf0 con 0x7fabcc03d230 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.596+0000 7fabfdffb640 1 -- 192.168.123.107:0/964828726 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+28 (secure 0 0 0) 0x7fabc8002bf0 con 0x7fabcc03d230 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.598+0000 7fabd37fe640 1 -- 192.168.123.107:0/964828726 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fabcc03d230 msgr2=0x7fabcc03f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.598+0000 7fabd37fe640 1 --2- 192.168.123.107:0/964828726 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fabcc03d230 0x7fabcc03f6f0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fabf0009a10 
tx=0x7fabf0006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.598+0000 7fabd37fe640 1 -- 192.168.123.107:0/964828726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 msgr2=0x7fac0011ce20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.598+0000 7fabd37fe640 1 --2- 192.168.123.107:0/964828726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac0011ce20 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fabf4039040 tx=0x7fabf4007df0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.598+0000 7fabd37fe640 1 -- 192.168.123.107:0/964828726 shutdown_connections 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.598+0000 7fabd37fe640 1 --2- 192.168.123.107:0/964828726 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fabcc03d230 0x7fabcc03f6f0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.598+0000 7fabd37fe640 1 --2- 192.168.123.107:0/964828726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fac00072f30 0x7fac0011ce20 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.598+0000 7fabd37fe640 1 -- 192.168.123.107:0/964828726 >> 192.168.123.107:0/964828726 conn(0x7fac0006d060 msgr2=0x7fac001129c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.599+0000 
7fabd37fe640 1 -- 192.168.123.107:0/964828726 shutdown_connections 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.599+0000 7fabd37fe640 1 -- 192.168.123.107:0/964828726 wait complete. 2026-03-09T19:22:03.638 INFO:teuthology.orchestra.run.vm07.stdout:Deploying node-exporter service with default placement... 2026-03-09T19:22:03.895 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:03 vm07 ceph-mon[48545]: from='client.14142 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:03.895 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:03 vm07 ceph-mon[48545]: Saving service mgr spec with placement count:2 2026-03-09T19:22:03.895 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:03 vm07 ceph-mon[48545]: from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:03.895 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:03 vm07 ceph-mon[48545]: Saving service crash spec with placement * 2026-03-09T19:22:03.895 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:03 vm07 ceph-mon[48545]: from='client.14146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:03.895 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:03 vm07 ceph-mon[48545]: Saving service ceph-exporter spec with placement * 2026-03-09T19:22:03.895 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:03 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:03.895 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:03 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:03.895 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:03 vm07 
ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:03.934 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout Scheduled node-exporter update... 2026-03-09T19:22:03.934 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.795+0000 7f15db65e640 1 Processor -- start 2026-03-09T19:22:03.934 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.795+0000 7f15db65e640 1 -- start start 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.796+0000 7f15db65e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d00a4cd0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.796+0000 7f15db65e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15d00a52a0 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.796+0000 7f15da65c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d00a4cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.796+0000 7f15da65c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d00a4cd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40672/0 (socket says 192.168.123.107:40672) 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.796+0000 
7f15da65c640 1 -- 192.168.123.107:0/87369236 learned_addr learned my addr 192.168.123.107:0/87369236 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.796+0000 7f15da65c640 1 -- 192.168.123.107:0/87369236 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f15d00a3910 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.797+0000 7f15da65c640 1 --2- 192.168.123.107:0/87369236 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d00a4cd0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f15c8009920 tx=0x7f15c802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=af9223df089c3cbe server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.797+0000 7f15d965a640 1 -- 192.168.123.107:0/87369236 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f15c802f9b0 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.797+0000 7f15d965a640 1 -- 192.168.123.107:0/87369236 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f15c8037440 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.797+0000 7f15d965a640 1 -- 192.168.123.107:0/87369236 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f15c8035560 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.798+0000 7f15db65e640 1 -- 192.168.123.107:0/87369236 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 
msgr2=0x7f15d00a4cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.798+0000 7f15db65e640 1 --2- 192.168.123.107:0/87369236 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d00a4cd0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f15c8009920 tx=0x7f15c802ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.798+0000 7f15db65e640 1 -- 192.168.123.107:0/87369236 shutdown_connections 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.798+0000 7f15db65e640 1 --2- 192.168.123.107:0/87369236 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d00a4cd0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.798+0000 7f15db65e640 1 -- 192.168.123.107:0/87369236 >> 192.168.123.107:0/87369236 conn(0x7f15d009fbe0 msgr2=0x7f15d00a2040 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.799+0000 7f15db65e640 1 -- 192.168.123.107:0/87369236 shutdown_connections 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.799+0000 7f15db65e640 1 -- 192.168.123.107:0/87369236 wait complete. 
2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.799+0000 7f15db65e640 1 Processor -- start 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.799+0000 7f15db65e640 1 -- start start 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.800+0000 7f15db65e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d013e810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.800+0000 7f15db65e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15d013ed50 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.800+0000 7f15da65c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d013e810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.800+0000 7f15da65c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d013e810 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40678/0 (socket says 192.168.123.107:40678) 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.800+0000 7f15da65c640 1 -- 192.168.123.107:0/443462362 learned_addr learned my addr 192.168.123.107:0/443462362 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:03.935 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.800+0000 7f15da65c640 1 -- 192.168.123.107:0/443462362 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f15c80095d0 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.800+0000 7f15da65c640 1 --2- 192.168.123.107:0/443462362 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d013e810 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f15c8037d90 tx=0x7f15c8037920 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.801+0000 7f15bf7fe640 1 -- 192.168.123.107:0/443462362 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f15c8037b30 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.801+0000 7f15bf7fe640 1 -- 192.168.123.107:0/443462362 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f15c803f3f0 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.801+0000 7f15db65e640 1 -- 192.168.123.107:0/443462362 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f15d013ef50 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.801+0000 7f15db65e640 1 -- 192.168.123.107:0/443462362 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f15d013f3f0 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.802+0000 7f15db65e640 1 -- 
192.168.123.107:0/443462362 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f159c005350 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.802+0000 7f15bf7fe640 1 -- 192.168.123.107:0/443462362 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f15c8035bc0 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.803+0000 7f15bf7fe640 1 -- 192.168.123.107:0/443462362 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f15c803e070 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.803+0000 7f15bf7fe640 1 --2- 192.168.123.107:0/443462362 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f15a403d190 0x7f15a403f650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.803+0000 7f15bf7fe640 1 -- 192.168.123.107:0/443462362 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f15c8076100 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.805+0000 7f15d9e5b640 1 --2- 192.168.123.107:0/443462362 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f15a403d190 0x7f15a403f650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.805+0000 7f15d9e5b640 1 --2- 192.168.123.107:0/443462362 >> 
[v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f15a403d190 0x7f15a403f650 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f15cc0099c0 tx=0x7f15cc006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.806+0000 7f15bf7fe640 1 -- 192.168.123.107:0/443462362 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f15c803f560 con 0x7f15d00a48d0 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.903+0000 7f15db65e640 1 -- 192.168.123.107:0/443462362 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f159c002bf0 con 0x7f15a403d190 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.907+0000 7f15bf7fe640 1 -- 192.168.123.107:0/443462362 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f159c002bf0 con 0x7f15a403d190 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.910+0000 7f15db65e640 1 -- 192.168.123.107:0/443462362 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f15a403d190 msgr2=0x7f15a403f650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.910+0000 7f15db65e640 1 --2- 192.168.123.107:0/443462362 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f15a403d190 0x7f15a403f650 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f15cc0099c0 
tx=0x7f15cc006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.910+0000 7f15db65e640 1 -- 192.168.123.107:0/443462362 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 msgr2=0x7f15d013e810 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.910+0000 7f15db65e640 1 --2- 192.168.123.107:0/443462362 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d013e810 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f15c8037d90 tx=0x7f15c8037920 comp rx=0 tx=0).stop 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.910+0000 7f15db65e640 1 -- 192.168.123.107:0/443462362 shutdown_connections 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.910+0000 7f15db65e640 1 --2- 192.168.123.107:0/443462362 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f15a403d190 0x7f15a403f650 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.910+0000 7f15db65e640 1 --2- 192.168.123.107:0/443462362 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15d00a48d0 0x7f15d013e810 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.910+0000 7f15db65e640 1 -- 192.168.123.107:0/443462362 >> 192.168.123.107:0/443462362 conn(0x7f15d009fbe0 msgr2=0x7f15d00a05b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.910+0000 
7f15db65e640 1 -- 192.168.123.107:0/443462362 shutdown_connections 2026-03-09T19:22:03.935 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:03.910+0000 7f15db65e640 1 -- 192.168.123.107:0/443462362 wait complete. 2026-03-09T19:22:03.936 INFO:teuthology.orchestra.run.vm07.stdout:Deploying alertmanager service with default placement... 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout Scheduled alertmanager update... 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.058+0000 7fbcf2405640 1 Processor -- start 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.058+0000 7fbcf2405640 1 -- start start 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.058+0000 7fbcf2405640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec1082f0 0x7fbcec1086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.058+0000 7fbcf2405640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbcec108cc0 con 0x7fbcec1082f0 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.059+0000 7fbcebfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec1082f0 0x7fbcec1086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.059+0000 7fbcebfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec1082f0 0x7fbcec1086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40686/0 (socket says 192.168.123.107:40686) 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.059+0000 7fbcebfff640 1 -- 192.168.123.107:0/3986761842 learned_addr learned my addr 192.168.123.107:0/3986761842 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.059+0000 7fbcebfff640 1 -- 192.168.123.107:0/3986761842 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbcec1094a0 con 0x7fbcec1082f0 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.060+0000 7fbcebfff640 1 --2- 192.168.123.107:0/3986761842 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec1082f0 0x7fbcec1086f0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fbcd8009b80 tx=0x7fbcd802f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=ef135b54035829a6 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.060+0000 7fbceaffd640 1 -- 192.168.123.107:0/3986761842 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbcd802fa10 con 0x7fbcec1082f0 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.060+0000 7fbceaffd640 1 -- 192.168.123.107:0/3986761842 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbcd802fb70 con 0x7fbcec1082f0 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.060+0000 7fbceaffd640 1 -- 192.168.123.107:0/3986761842 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 
0x7fbcd80355b0 con 0x7fbcec1082f0 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.061+0000 7fbcf2405640 1 -- 192.168.123.107:0/3986761842 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec1082f0 msgr2=0x7fbcec1086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.061+0000 7fbcf2405640 1 --2- 192.168.123.107:0/3986761842 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec1082f0 0x7fbcec1086f0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fbcd8009b80 tx=0x7fbcd802f190 comp rx=0 tx=0).stop 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.061+0000 7fbcf2405640 1 -- 192.168.123.107:0/3986761842 shutdown_connections 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.061+0000 7fbcf2405640 1 --2- 192.168.123.107:0/3986761842 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec1082f0 0x7fbcec1086f0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.061+0000 7fbcf2405640 1 -- 192.168.123.107:0/3986761842 >> 192.168.123.107:0/3986761842 conn(0x7fbcec07b850 msgr2=0x7fbcec07bc60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.061+0000 7fbcf2405640 1 -- 192.168.123.107:0/3986761842 shutdown_connections 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.061+0000 7fbcf2405640 1 -- 192.168.123.107:0/3986761842 wait complete. 
2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.062+0000 7fbcf2405640 1 Processor -- start 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.062+0000 7fbcf2405640 1 -- start start 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.062+0000 7fbcf2405640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec19e2d0 0x7fbcec19e6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.062+0000 7fbcf2405640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbcec19ec30 con 0x7fbcec19e2d0 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.062+0000 7fbcebfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec19e2d0 0x7fbcec19e6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:04.194 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.062+0000 7fbcebfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec19e2d0 0x7fbcec19e6f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40696/0 (socket says 192.168.123.107:40696) 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.062+0000 7fbcebfff640 1 -- 192.168.123.107:0/250939619 learned_addr learned my addr 192.168.123.107:0/250939619 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:04.196 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.063+0000 7fbcebfff640 1 -- 192.168.123.107:0/250939619 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbcd80095d0 con 0x7fbcec19e2d0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.063+0000 7fbcebfff640 1 --2- 192.168.123.107:0/250939619 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec19e2d0 0x7fbcec19e6f0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fbcd8009cb0 tx=0x7fbcd8035f50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.063+0000 7fbce97fa640 1 -- 192.168.123.107:0/250939619 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbcd80374c0 con 0x7fbcec19e2d0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.063+0000 7fbce97fa640 1 -- 192.168.123.107:0/250939619 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbcd8037ae0 con 0x7fbcec19e2d0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.063+0000 7fbce97fa640 1 -- 192.168.123.107:0/250939619 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbcd8042da0 con 0x7fbcec19e2d0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.063+0000 7fbcf2405640 1 -- 192.168.123.107:0/250939619 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbcec19ee30 con 0x7fbcec19e2d0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.063+0000 7fbcf2405640 1 -- 
192.168.123.107:0/250939619 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbcec1a19a0 con 0x7fbcec19e2d0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.064+0000 7fbce97fa640 1 -- 192.168.123.107:0/250939619 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7fbcd8042600 con 0x7fbcec19e2d0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.064+0000 7fbcf2405640 1 -- 192.168.123.107:0/250939619 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbcec1086f0 con 0x7fbcec19e2d0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.065+0000 7fbce97fa640 1 --2- 192.168.123.107:0/250939619 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbcc403d230 0x7fbcc403f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.065+0000 7fbce97fa640 1 -- 192.168.123.107:0/250939619 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fbcd80760d0 con 0x7fbcec19e2d0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.065+0000 7fbceb7fe640 1 --2- 192.168.123.107:0/250939619 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbcc403d230 0x7fbcc403f6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.066+0000 7fbceb7fe640 1 --2- 192.168.123.107:0/250939619 >> 
[v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbcc403d230 0x7fbcc403f6f0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fbcd00099c0 tx=0x7fbcd0006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.068+0000 7fbce97fa640 1 -- 192.168.123.107:0/250939619 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fbcd80aa820 con 0x7fbcec19e2d0 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.166+0000 7fbcf2405640 1 -- 192.168.123.107:0/250939619 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}) v1 -- 0x7fbcec10add0 con 0x7fbcc403d230 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.171+0000 7fbce97fa640 1 -- 192.168.123.107:0/250939619 <== mgr.14118 v2:192.168.123.107:6800/1318262611 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+33 (secure 0 0 0) 0x7fbcec10add0 con 0x7fbcc403d230 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.172+0000 7fbcf2405640 1 -- 192.168.123.107:0/250939619 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbcc403d230 msgr2=0x7fbcc403f6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.172+0000 7fbcf2405640 1 --2- 192.168.123.107:0/250939619 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbcc403d230 0x7fbcc403f6f0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fbcd00099c0 
tx=0x7fbcd0006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.173+0000 7fbcf2405640 1 -- 192.168.123.107:0/250939619 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec19e2d0 msgr2=0x7fbcec19e6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.173+0000 7fbcf2405640 1 --2- 192.168.123.107:0/250939619 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec19e2d0 0x7fbcec19e6f0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fbcd8009cb0 tx=0x7fbcd8035f50 comp rx=0 tx=0).stop 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.173+0000 7fbcf2405640 1 -- 192.168.123.107:0/250939619 shutdown_connections 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.173+0000 7fbcf2405640 1 --2- 192.168.123.107:0/250939619 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbcc403d230 0x7fbcc403f6f0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.173+0000 7fbcf2405640 1 --2- 192.168.123.107:0/250939619 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbcec19e2d0 0x7fbcec19e6f0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.173+0000 7fbcf2405640 1 -- 192.168.123.107:0/250939619 >> 192.168.123.107:0/250939619 conn(0x7fbcec07b850 msgr2=0x7fbcec10a6c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.173+0000 
7fbcf2405640 1 -- 192.168.123.107:0/250939619 shutdown_connections 2026-03-09T19:22:04.196 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.173+0000 7fbcf2405640 1 -- 192.168.123.107:0/250939619 wait complete. 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.314+0000 7f25c5b63640 1 Processor -- start 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.314+0000 7f25c5b63640 1 -- start start 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.315+0000 7f25c5b63640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c007cc80 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.315+0000 7f25c5b63640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25c007d250 con 0x7f25c007c880 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.315+0000 7f25bf7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c007cc80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.315+0000 7f25bf7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c007cc80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40702/0 (socket says 192.168.123.107:40702) 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:22:04.315+0000 7f25bf7fe640 1 -- 192.168.123.107:0/1835282135 learned_addr learned my addr 192.168.123.107:0/1835282135 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.315+0000 7f25bf7fe640 1 -- 192.168.123.107:0/1835282135 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f25c007da80 con 0x7f25c007c880 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.316+0000 7f25bf7fe640 1 --2- 192.168.123.107:0/1835282135 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c007cc80 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f25b001c080 tx=0x7f25b0040520 comp rx=0 tx=0).ready entity=mon.0 client_cookie=c8c7c3c842f2d83 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.316+0000 7f25be7fc640 1 -- 192.168.123.107:0/1835282135 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f25b001a0d0 con 0x7f25c007c880 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.316+0000 7f25be7fc640 1 -- 192.168.123.107:0/1835282135 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f25b0043050 con 0x7f25c007c880 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.316+0000 7f25c5b63640 1 -- 192.168.123.107:0/1835282135 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 msgr2=0x7f25c007cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.316+0000 7f25c5b63640 1 --2- 192.168.123.107:0/1835282135 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c007cc80 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f25b001c080 tx=0x7f25b0040520 comp rx=0 tx=0).stop 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.317+0000 7f25c5b63640 1 -- 192.168.123.107:0/1835282135 shutdown_connections 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.317+0000 7f25c5b63640 1 --2- 192.168.123.107:0/1835282135 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c007cc80 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.317+0000 7f25c5b63640 1 -- 192.168.123.107:0/1835282135 >> 192.168.123.107:0/1835282135 conn(0x7f25c007b8f0 msgr2=0x7f25c01066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.317+0000 7f25c5b63640 1 -- 192.168.123.107:0/1835282135 shutdown_connections 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.317+0000 7f25c5b63640 1 -- 192.168.123.107:0/1835282135 wait complete. 
2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.317+0000 7f25c5b63640 1 Processor -- start 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.317+0000 7f25c5b63640 1 -- start start 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.318+0000 7f25c5b63640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c0199eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.318+0000 7f25c5b63640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25b00475a0 con 0x7f25c007c880 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.318+0000 7f25bf7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c0199eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:04.474 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.318+0000 7f25bf7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c0199eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40704/0 (socket says 192.168.123.107:40704) 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.318+0000 7f25bf7fe640 1 -- 192.168.123.107:0/3058559653 learned_addr learned my addr 192.168.123.107:0/3058559653 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:04.475 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.318+0000 7f25bf7fe640 1 -- 192.168.123.107:0/3058559653 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f25b001aa70 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.318+0000 7f25bf7fe640 1 --2- 192.168.123.107:0/3058559653 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c0199eb0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f25b0004270 tx=0x7f25b00042a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.318+0000 7f25bcff9640 1 -- 192.168.123.107:0/3058559653 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f25b0004480 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.319+0000 7f25bcff9640 1 -- 192.168.123.107:0/3058559653 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f25b00045e0 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.319+0000 7f25c5b63640 1 -- 192.168.123.107:0/3058559653 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f25c019a3f0 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.319+0000 7f25bcff9640 1 -- 192.168.123.107:0/3058559653 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f25b0018bb0 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.319+0000 7f25c5b63640 1 
-- 192.168.123.107:0/3058559653 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f25c019a8f0 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.320+0000 7f25bcff9640 1 -- 192.168.123.107:0/3058559653 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f25b0053450 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.320+0000 7f25bcff9640 1 --2- 192.168.123.107:0/3058559653 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f259803ce70 0x7f259803f330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.320+0000 7f25bcff9640 1 -- 192.168.123.107:0/3058559653 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f25b008c580 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.320+0000 7f25beffd640 1 --2- 192.168.123.107:0/3058559653 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f259803ce70 0x7f259803f330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.320+0000 7f25beffd640 1 --2- 192.168.123.107:0/3058559653 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f259803ce70 0x7f259803f330 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f25ac0099c0 tx=0x7f25ac006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:04.475 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.321+0000 7f25c5b63640 1 -- 192.168.123.107:0/3058559653 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f25c007f4b0 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.324+0000 7f25bcff9640 1 -- 192.168.123.107:0/3058559653 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f25b004c080 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.416+0000 7f25c5b63640 1 -- 192.168.123.107:0/3058559653 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0) v1 -- 0x7f25c007cc80 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.421+0000 7f25bcff9640 1 -- 192.168.123.107:0/3058559653 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/container_init}]=0 v7) v1 ==== 142+0+0 (secure 0 0 0) 0x7f25b0057030 con 0x7f25c007c880 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.428+0000 7f25c5b63640 1 -- 192.168.123.107:0/3058559653 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f259803ce70 msgr2=0x7f259803f330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.428+0000 7f25c5b63640 1 --2- 192.168.123.107:0/3058559653 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f259803ce70 0x7f259803f330 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto
rx=0x7f25ac0099c0 tx=0x7f25ac006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.428+0000 7f25c5b63640 1 -- 192.168.123.107:0/3058559653 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 msgr2=0x7f25c0199eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.428+0000 7f25c5b63640 1 --2- 192.168.123.107:0/3058559653 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c0199eb0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f25b0004270 tx=0x7f25b00042a0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.428+0000 7f25c5b63640 1 -- 192.168.123.107:0/3058559653 shutdown_connections 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.428+0000 7f25c5b63640 1 --2- 192.168.123.107:0/3058559653 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f259803ce70 0x7f259803f330 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.428+0000 7f25c5b63640 1 --2- 192.168.123.107:0/3058559653 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25c007c880 0x7f25c0199eb0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.428+0000 7f25c5b63640 1 -- 192.168.123.107:0/3058559653 >> 192.168.123.107:0/3058559653 conn(0x7f25c007b8f0 msgr2=0x7f25c0190c40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:22:04.428+0000 7f25c5b63640 1 -- 192.168.123.107:0/3058559653 shutdown_connections 2026-03-09T19:22:04.475 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.428+0000 7f25c5b63640 1 -- 192.168.123.107:0/3058559653 wait complete. 2026-03-09T19:22:04.743 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.589+0000 7f0e90c3e640 1 Processor -- start 2026-03-09T19:22:04.743 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.590+0000 7f0e90c3e640 1 -- start start 2026-03-09T19:22:04.743 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.590+0000 7f0e90c3e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c1086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:04.743 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.590+0000 7f0e90c3e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e8c108cc0 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.590+0000 7f0e8a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c1086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.590+0000 7f0e8a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c1086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40718/0 (socket says 192.168.123.107:40718) 2026-03-09T19:22:04.744 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.590+0000 7f0e8a575640 1 -- 192.168.123.107:0/3668077006 learned_addr learned my addr 192.168.123.107:0/3668077006 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.591+0000 7f0e8a575640 1 -- 192.168.123.107:0/3668077006 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0e8c109490 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.591+0000 7f0e8a575640 1 --2- 192.168.123.107:0/3668077006 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c1086f0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f0e74009b30 tx=0x7f0e7402f140 comp rx=0 tx=0).ready entity=mon.0 client_cookie=12539ebc16666acb server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.592+0000 7f0e89573640 1 -- 192.168.123.107:0/3668077006 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0e7402fbd0 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.592+0000 7f0e89573640 1 -- 192.168.123.107:0/3668077006 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0e7402fd30 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.592+0000 7f0e89573640 1 -- 192.168.123.107:0/3668077006 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0e74035780 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.592+0000 7f0e90c3e640 1 -- 
192.168.123.107:0/3668077006 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 msgr2=0x7f0e8c1086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.592+0000 7f0e90c3e640 1 --2- 192.168.123.107:0/3668077006 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c1086f0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f0e74009b30 tx=0x7f0e7402f140 comp rx=0 tx=0).stop 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.592+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3668077006 shutdown_connections 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.592+0000 7f0e90c3e640 1 --2- 192.168.123.107:0/3668077006 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c1086f0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.592+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3668077006 >> 192.168.123.107:0/3668077006 conn(0x7f0e8c07b8c0 msgr2=0x7f0e8c1066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.592+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3668077006 shutdown_connections 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.593+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3668077006 wait complete. 
2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.593+0000 7f0e90c3e640 1 Processor -- start 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.593+0000 7f0e90c3e640 1 -- start start 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.593+0000 7f0e90c3e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c19e0c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.593+0000 7f0e90c3e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e8c19e600 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.593+0000 7f0e8a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c19e0c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.594+0000 7f0e8a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c19e0c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40720/0 (socket says 192.168.123.107:40720) 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.594+0000 7f0e8a575640 1 -- 192.168.123.107:0/3576817788 learned_addr learned my addr 192.168.123.107:0/3576817788 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:04.744 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.594+0000 7f0e8a575640 1 -- 192.168.123.107:0/3576817788 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0e740095d0 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.594+0000 7f0e8a575640 1 --2- 192.168.123.107:0/3576817788 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c19e0c0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f0e74009c60 tx=0x7f0e74037670 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.594+0000 7f0e7b7fe640 1 -- 192.168.123.107:0/3576817788 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0e74037ad0 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.594+0000 7f0e7b7fe640 1 -- 192.168.123.107:0/3576817788 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0e74037c30 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.594+0000 7f0e7b7fe640 1 -- 192.168.123.107:0/3576817788 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0e74036680 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.594+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3576817788 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0e8c19e800 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.594+0000 7f0e90c3e640 1 
-- 192.168.123.107:0/3576817788 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0e8c19eca0 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.595+0000 7f0e7b7fe640 1 -- 192.168.123.107:0/3576817788 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7f0e7403e070 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.595+0000 7f0e7b7fe640 1 --2- 192.168.123.107:0/3576817788 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f0e6403d1e0 0x7f0e6403f6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.595+0000 7f0e7b7fe640 1 -- 192.168.123.107:0/3576817788 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f0e740769d0 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.596+0000 7f0e89d74640 1 --2- 192.168.123.107:0/3576817788 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f0e6403d1e0 0x7f0e6403f6a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.596+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3576817788 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0e50005350 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.597+0000 7f0e89d74640 1 --2- 
192.168.123.107:0/3576817788 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f0e6403d1e0 0x7f0e6403f6a0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f0e80009a10 tx=0x7f0e80006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.599+0000 7f0e7b7fe640 1 -- 192.168.123.107:0/3576817788 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f0e7403c030 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.685+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3576817788 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command([{prefix=config set, name=mgr/dashboard/ssl_server_port}] v 0) v1 -- 0x7f0e50005b80 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.692+0000 7f0e7b7fe640 1 -- 192.168.123.107:0/3576817788 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/dashboard/ssl_server_port}]=0 v8) v1 ==== 130+0+0 (secure 0 0 0) 0x7f0e74073020 con 0x7f0e8c1082f0 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.697+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3576817788 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f0e6403d1e0 msgr2=0x7f0e6403f6a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.697+0000 7f0e90c3e640 1 --2- 192.168.123.107:0/3576817788 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f0e6403d1e0 0x7f0e6403f6a0 secure :-1 s=READY pgs=21 cs=0 l=1 
rev1=1 crypto rx=0x7f0e80009a10 tx=0x7f0e80006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.697+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3576817788 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 msgr2=0x7f0e8c19e0c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.697+0000 7f0e90c3e640 1 --2- 192.168.123.107:0/3576817788 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c19e0c0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f0e74009c60 tx=0x7f0e74037670 comp rx=0 tx=0).stop 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.698+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3576817788 shutdown_connections 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.698+0000 7f0e90c3e640 1 --2- 192.168.123.107:0/3576817788 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7f0e6403d1e0 0x7f0e6403f6a0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.698+0000 7f0e90c3e640 1 --2- 192.168.123.107:0/3576817788 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e8c1082f0 0x7f0e8c19e0c0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.698+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3576817788 >> 192.168.123.107:0/3576817788 conn(0x7f0e8c07b8c0 msgr2=0x7f0e8c105ce0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: 
stderr 2026-03-09T19:22:04.698+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3576817788 shutdown_connections 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.698+0000 7f0e90c3e640 1 -- 192.168.123.107:0/3576817788 wait complete. 2026-03-09T19:22:04.744 INFO:teuthology.orchestra.run.vm07.stdout:Enabling the dashboard module... 2026-03-09T19:22:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:04 vm07 ceph-mon[48545]: from='client.14148 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:04 vm07 ceph-mon[48545]: Saving service prometheus spec with placement count:1 2026-03-09T19:22:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:04 vm07 ceph-mon[48545]: from='client.14150 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:04 vm07 ceph-mon[48545]: Saving service grafana spec with placement count:1 2026-03-09T19:22:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:04 vm07 ceph-mon[48545]: from='client.14152 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:04 vm07 ceph-mon[48545]: Saving service node-exporter spec with placement * 2026-03-09T19:22:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:04 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:04 vm07 ceph-mon[48545]: from='mgr.14118 192.168.123.107:0/3670027531' entity='mgr.vm07.xacuym' 2026-03-09T19:22:04.978 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:04 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/3058559653' entity='client.admin' 2026-03-09T19:22:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:04 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/3576817788' entity='client.admin' 2026-03-09T19:22:05.988 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.862+0000 7fe7e4c7d640 1 Processor -- start 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.863+0000 7fe7e4c7d640 1 -- start start 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.863+0000 7fe7e4c7d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e01082f0 0x7fe7e01086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.863+0000 7fe7e4c7d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe7e0108cc0 con 0x7fe7e01082f0 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.863+0000 7fe7de575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e01082f0 0x7fe7e01086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.863+0000 7fe7de575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e01082f0 0x7fe7e01086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40728/0 (socket says 192.168.123.107:40728) 
2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.863+0000 7fe7de575640 1 -- 192.168.123.107:0/2069495701 learned_addr learned my addr 192.168.123.107:0/2069495701 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.864+0000 7fe7de575640 1 -- 192.168.123.107:0/2069495701 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe7e0109490 con 0x7fe7e01082f0 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.864+0000 7fe7de575640 1 --2- 192.168.123.107:0/2069495701 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e01082f0 0x7fe7e01086f0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fe7c8009b80 tx=0x7fe7c802f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=838302fc8e6864ee server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.865+0000 7fe7dd573640 1 -- 192.168.123.107:0/2069495701 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe7c802fc20 con 0x7fe7e01082f0 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.865+0000 7fe7dd573640 1 -- 192.168.123.107:0/2069495701 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe7c802fd80 con 0x7fe7e01082f0 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.865+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/2069495701 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e01082f0 msgr2=0x7fe7e01086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:22:04.865+0000 7fe7e4c7d640 1 --2- 192.168.123.107:0/2069495701 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e01082f0 0x7fe7e01086f0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fe7c8009b80 tx=0x7fe7c802f190 comp rx=0 tx=0).stop 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.865+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/2069495701 shutdown_connections 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.865+0000 7fe7e4c7d640 1 --2- 192.168.123.107:0/2069495701 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e01082f0 0x7fe7e01086f0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.865+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/2069495701 >> 192.168.123.107:0/2069495701 conn(0x7fe7e007b8c0 msgr2=0x7fe7e01066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.866+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/2069495701 shutdown_connections 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.866+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/2069495701 wait complete. 
2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.866+0000 7fe7e4c7d640 1 Processor -- start 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.866+0000 7fe7e4c7d640 1 -- start start 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.866+0000 7fe7e4c7d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e019e140 0x7fe7e019e560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.866+0000 7fe7e4c7d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe7c8035620 con 0x7fe7e019e140 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.867+0000 7fe7de575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e019e140 0x7fe7e019e560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.867+0000 7fe7de575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e019e140 0x7fe7e019e560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40738/0 (socket says 192.168.123.107:40738) 2026-03-09T19:22:05.989 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.867+0000 7fe7de575640 1 -- 192.168.123.107:0/280156849 learned_addr learned my addr 192.168.123.107:0/280156849 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:05.989 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.867+0000 7fe7de575640 1 -- 192.168.123.107:0/280156849 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe7c80095d0 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.867+0000 7fe7de575640 1 --2- 192.168.123.107:0/280156849 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e019e140 0x7fe7e019e560 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fe7c802f6c0 tx=0x7fe7c8003940 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.867+0000 7fe7cf7fe640 1 -- 192.168.123.107:0/280156849 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe7c80354c0 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.867+0000 7fe7cf7fe640 1 -- 192.168.123.107:0/280156849 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe7c8037440 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.867+0000 7fe7cf7fe640 1 -- 192.168.123.107:0/280156849 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe7c803f3f0 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.868+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/280156849 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe7e019eaa0 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.868+0000 7fe7e4c7d640 1 -- 
192.168.123.107:0/280156849 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe7e01a1640 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.869+0000 7fe7cf7fe640 1 -- 192.168.123.107:0/280156849 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 8) v1 ==== 49370+0+0 (secure 0 0 0) 0x7fe7c803e070 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.869+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/280156849 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe7a4005350 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.869+0000 7fe7cf7fe640 1 --2- 192.168.123.107:0/280156849 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fe7b403d1e0 0x7fe7b403f6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.869+0000 7fe7cf7fe640 1 -- 192.168.123.107:0/280156849 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fe7c80757c0 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.871+0000 7fe7ddd74640 1 --2- 192.168.123.107:0/280156849 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fe7b403d1e0 0x7fe7b403f6a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.872+0000 7fe7cf7fe640 1 -- 192.168.123.107:0/280156849 <== 
mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fe7c8046310 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.872+0000 7fe7ddd74640 1 --2- 192.168.123.107:0/280156849 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fe7b403d1e0 0x7fe7b403f6a0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fe7d40099c0 tx=0x7fe7d4006eb0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:04.982+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/280156849 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "dashboard"} v 0) v1 -- 0x7fe7a40051c0 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:05.909+0000 7fe7cf7fe640 1 -- 192.168.123.107:0/280156849 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "dashboard"}]=0 v9) v1 ==== 88+0+0 (secure 0 0 0) 0x7fe7c803c030 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:05.911+0000 7fe7cf7fe640 1 -- 192.168.123.107:0/280156849 <== mon.0 v2:192.168.123.107:3300/0 8 ==== mgrmap(e 9) v1 ==== 49383+0+0 (secure 0 0 0) 0x7fe7c8037a00 con 0x7fe7e019e140 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:05.915+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/280156849 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fe7b403d1e0 msgr2=0x7fe7b403f6a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:05.990 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:05.915+0000 7fe7e4c7d640 1 --2- 192.168.123.107:0/280156849 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fe7b403d1e0 0x7fe7b403f6a0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fe7d40099c0 tx=0x7fe7d4006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:05.915+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/280156849 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e019e140 msgr2=0x7fe7e019e560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:05.915+0000 7fe7e4c7d640 1 --2- 192.168.123.107:0/280156849 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e019e140 0x7fe7e019e560 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fe7c802f6c0 tx=0x7fe7c8003940 comp rx=0 tx=0).stop 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:05.915+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/280156849 shutdown_connections 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:05.915+0000 7fe7e4c7d640 1 --2- 192.168.123.107:0/280156849 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fe7b403d1e0 0x7fe7b403f6a0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:05.915+0000 7fe7e4c7d640 1 --2- 192.168.123.107:0/280156849 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe7e019e140 0x7fe7e019e560 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: 
stderr 2026-03-09T19:22:05.915+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/280156849 >> 192.168.123.107:0/280156849 conn(0x7fe7e007b8c0 msgr2=0x7fe7e0105ae0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:05.915+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/280156849 shutdown_connections 2026-03-09T19:22:05.990 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:05.915+0000 7fe7e4c7d640 1 -- 192.168.123.107:0/280156849 wait complete. 2026-03-09T19:22:06.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:05 vm07 ceph-mon[48545]: from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:06.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:05 vm07 ceph-mon[48545]: Saving service alertmanager spec with placement count:1 2026-03-09T19:22:06.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:05 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/280156849' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch 2026-03-09T19:22:06.290 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout { 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "epoch": 9, 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "active_name": "vm07.xacuym", 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout } 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.117+0000 7fb8dc801640 1 Processor -- start 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.118+0000 7fb8dc801640 1 -- start start 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.118+0000 7fb8dc801640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d8108530 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.118+0000 7fb8dc801640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb8d8108b00 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.118+0000 7fb8d77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d8108530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:06.291 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.118+0000 7fb8d77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d8108530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40760/0 (socket says 192.168.123.107:40760) 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.118+0000 7fb8d77fe640 1 -- 192.168.123.107:0/4064281759 learned_addr learned my addr 192.168.123.107:0/4064281759 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.119+0000 7fb8d77fe640 1 -- 192.168.123.107:0/4064281759 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb8d8109290 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.119+0000 7fb8d77fe640 1 --2- 192.168.123.107:0/4064281759 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d8108530 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fb8c4009920 tx=0x7fb8c402ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a0149818cfcb31f9 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.120+0000 7fb8d67fc640 1 -- 192.168.123.107:0/4064281759 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb8c402f9b0 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.120+0000 7fb8d67fc640 1 -- 192.168.123.107:0/4064281759 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb8c4037440 con 0x7fb8d8108130 
2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.120+0000 7fb8dc801640 1 -- 192.168.123.107:0/4064281759 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 msgr2=0x7fb8d8108530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.120+0000 7fb8dc801640 1 --2- 192.168.123.107:0/4064281759 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d8108530 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fb8c4009920 tx=0x7fb8c402ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.121+0000 7fb8dc801640 1 -- 192.168.123.107:0/4064281759 shutdown_connections 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.121+0000 7fb8dc801640 1 --2- 192.168.123.107:0/4064281759 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d8108530 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.121+0000 7fb8dc801640 1 -- 192.168.123.107:0/4064281759 >> 192.168.123.107:0/4064281759 conn(0x7fb8d807b6f0 msgr2=0x7fb8d807bb20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.121+0000 7fb8dc801640 1 -- 192.168.123.107:0/4064281759 shutdown_connections 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.121+0000 7fb8dc801640 1 -- 192.168.123.107:0/4064281759 wait complete. 
2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.121+0000 7fb8dc801640 1 Processor -- start 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.121+0000 7fb8dc801640 1 -- start start 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.122+0000 7fb8dc801640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d819de60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.122+0000 7fb8dc801640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb8c40353c0 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.122+0000 7fb8d77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d819de60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.122+0000 7fb8d77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d819de60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40772/0 (socket says 192.168.123.107:40772) 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.122+0000 7fb8d77fe640 1 -- 192.168.123.107:0/4103639896 learned_addr learned my addr 192.168.123.107:0/4103639896 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:06.291 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.122+0000 7fb8d77fe640 1 -- 192.168.123.107:0/4103639896 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb8c40095d0 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.122+0000 7fb8d77fe640 1 --2- 192.168.123.107:0/4103639896 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d819de60 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fb8c402f450 tx=0x7fb8c4037c90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.123+0000 7fb8d4ff9640 1 -- 192.168.123.107:0/4103639896 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb8c4035a30 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.123+0000 7fb8d4ff9640 1 -- 192.168.123.107:0/4103639896 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb8c4035b90 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.123+0000 7fb8d4ff9640 1 -- 192.168.123.107:0/4103639896 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb8c4036540 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.123+0000 7fb8dc801640 1 -- 192.168.123.107:0/4103639896 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb8d819e3a0 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.123+0000 7fb8dc801640 1 
-- 192.168.123.107:0/4103639896 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb8d819e840 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.123+0000 7fb8dc801640 1 -- 192.168.123.107:0/4103639896 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb8a4005350 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.127+0000 7fb8d4ff9640 1 -- 192.168.123.107:0/4103639896 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 9) v1 ==== 49383+0+0 (secure 0 0 0) 0x7fb8c403e070 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.127+0000 7fb8d4ff9640 1 --2- 192.168.123.107:0/4103639896 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fb8b003d230 0x7fb8b003f6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.127+0000 7fb8d4ff9640 1 -- 192.168.123.107:0/4103639896 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb8c4076160 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.127+0000 7fb8d6ffd640 1 -- 192.168.123.107:0/4103639896 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fb8b003d230 msgr2=0x7fb8b003f6f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1318262611 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.127+0000 7fb8d6ffd640 1 --2- 192.168.123.107:0/4103639896 >> 
[v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fb8b003d230 0x7fb8b003f6f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.127+0000 7fb8d4ff9640 1 -- 192.168.123.107:0/4103639896 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fb8c4050db0 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.239+0000 7fb8dc801640 1 -- 192.168.123.107:0/4103639896 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7fb8a40058d0 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.240+0000 7fb8d4ff9640 1 -- 192.168.123.107:0/4103639896 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v9) v1 ==== 56+0+98 (secure 0 0 0) 0x7fb8c403c070 con 0x7fb8d8108130 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.246+0000 7fb8dc801640 1 -- 192.168.123.107:0/4103639896 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fb8b003d230 msgr2=0x7fb8b003f6f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.246+0000 7fb8dc801640 1 --2- 192.168.123.107:0/4103639896 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fb8b003d230 0x7fb8b003f6f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.246+0000 7fb8dc801640 1 
-- 192.168.123.107:0/4103639896 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 msgr2=0x7fb8d819de60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.246+0000 7fb8dc801640 1 --2- 192.168.123.107:0/4103639896 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d819de60 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fb8c402f450 tx=0x7fb8c4037c90 comp rx=0 tx=0).stop 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.248+0000 7fb8dc801640 1 -- 192.168.123.107:0/4103639896 shutdown_connections 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.248+0000 7fb8dc801640 1 --2- 192.168.123.107:0/4103639896 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fb8b003d230 0x7fb8b003f6f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.248+0000 7fb8dc801640 1 --2- 192.168.123.107:0/4103639896 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8d8108130 0x7fb8d819de60 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.248+0000 7fb8dc801640 1 -- 192.168.123.107:0/4103639896 >> 192.168.123.107:0/4103639896 conn(0x7fb8d807b6f0 msgr2=0x7fb8d8105530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.250+0000 7fb8dc801640 1 -- 192.168.123.107:0/4103639896 shutdown_connections 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:22:06.250+0000 7fb8dc801640 1 -- 192.168.123.107:0/4103639896 wait complete. 2026-03-09T19:22:06.291 INFO:teuthology.orchestra.run.vm07.stdout:Waiting for the mgr to restart... 2026-03-09T19:22:06.292 INFO:teuthology.orchestra.run.vm07.stdout:Waiting for mgr epoch 9... 2026-03-09T19:22:07.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:06 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/280156849' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished 2026-03-09T19:22:07.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:06 vm07 ceph-mon[48545]: mgrmap e9: vm07.xacuym(active, since 8s) 2026-03-09T19:22:07.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:06 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/4103639896' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-09T19:22:09.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:09 vm07 ceph-mon[48545]: Active manager daemon vm07.xacuym restarted 2026-03-09T19:22:09.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:09 vm07 ceph-mon[48545]: Activating manager daemon vm07.xacuym 2026-03-09T19:22:09.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:09 vm07 ceph-mon[48545]: osdmap e3: 0 total, 0 up, 0 in 2026-03-09T19:22:09.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:09 vm07 ceph-mon[48545]: mgrmap e10: vm07.xacuym(active, starting, since 0.00607774s) 2026-03-09T19:22:09.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:09 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:22:09.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:09 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm07.xacuym", "id": "vm07.xacuym"}]: dispatch 2026-03-09T19:22:09.479 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:09 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T19:22:09.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:09 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T19:22:09.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:09 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T19:22:09.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:09 vm07 ceph-mon[48545]: Manager daemon vm07.xacuym is now available 2026-03-09T19:22:09.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:09 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:22:10.181 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout { 2026-03-09T19:22:10.181 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 11, 2026-03-09T19:22:10.181 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-09T19:22:10.181 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout } 2026-03-09T19:22:10.181 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.422+0000 7fbb848a0640 1 Processor -- start 2026-03-09T19:22:10.181 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.422+0000 7fbb848a0640 1 -- start start 2026-03-09T19:22:10.181 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.422+0000 7fbb848a0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c072f70 0x7fbb7c071480 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).connect 2026-03-09T19:22:10.181 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.422+0000 7fbb848a0640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbb7c071a50 con 0x7fbb7c072f70 2026-03-09T19:22:10.181 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.423+0000 7fbb82615640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c072f70 0x7fbb7c071480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.423+0000 7fbb82615640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c072f70 0x7fbb7c071480 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40782/0 (socket says 192.168.123.107:40782) 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.423+0000 7fbb82615640 1 -- 192.168.123.107:0/1668706167 learned_addr learned my addr 192.168.123.107:0/1668706167 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.423+0000 7fbb82615640 1 -- 192.168.123.107:0/1668706167 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbb7c071b90 con 0x7fbb7c072f70 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.424+0000 7fbb82615640 1 --2- 192.168.123.107:0/1668706167 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c072f70 0x7fbb7c071480 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fbb74009920 tx=0x7fbb7402ef20 comp rx=0 tx=0).ready 
entity=mon.0 client_cookie=f478dffa0591792b server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.424+0000 7fbb81613640 1 -- 192.168.123.107:0/1668706167 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbb7402f9b0 con 0x7fbb7c072f70 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.424+0000 7fbb81613640 1 -- 192.168.123.107:0/1668706167 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbb74037440 con 0x7fbb7c072f70 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.424+0000 7fbb81613640 1 -- 192.168.123.107:0/1668706167 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbb74035560 con 0x7fbb7c072f70 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.425+0000 7fbb848a0640 1 -- 192.168.123.107:0/1668706167 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c072f70 msgr2=0x7fbb7c071480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.425+0000 7fbb848a0640 1 --2- 192.168.123.107:0/1668706167 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c072f70 0x7fbb7c071480 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fbb74009920 tx=0x7fbb7402ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.425+0000 7fbb848a0640 1 -- 192.168.123.107:0/1668706167 shutdown_connections 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.425+0000 7fbb848a0640 1 --2- 192.168.123.107:0/1668706167 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c072f70 0x7fbb7c071480 secure :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fbb74009920 tx=0x7fbb7402ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.425+0000 7fbb848a0640 1 -- 192.168.123.107:0/1668706167 >> 192.168.123.107:0/1668706167 conn(0x7fbb7c06d080 msgr2=0x7fbb7c06f4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.425+0000 7fbb848a0640 1 -- 192.168.123.107:0/1668706167 shutdown_connections 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.426+0000 7fbb848a0640 1 -- 192.168.123.107:0/1668706167 wait complete. 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.426+0000 7fbb848a0640 1 Processor -- start 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.426+0000 7fbb848a0640 1 -- start start 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.426+0000 7fbb848a0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c1ab0f0 0x7fbb7c1ab510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.426+0000 7fbb848a0640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbb7c1aba50 con 0x7fbb7c1ab0f0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.427+0000 7fbb82615640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c1ab0f0 0x7fbb7c1ab510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.427+0000 7fbb82615640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c1ab0f0 0x7fbb7c1ab510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:40794/0 (socket says 192.168.123.107:40794) 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.427+0000 7fbb82615640 1 -- 192.168.123.107:0/3116092072 learned_addr learned my addr 192.168.123.107:0/3116092072 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.427+0000 7fbb82615640 1 -- 192.168.123.107:0/3116092072 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbb740095d0 con 0x7fbb7c1ab0f0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.427+0000 7fbb82615640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c1ab0f0 0x7fbb7c1ab510 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fbb7402f4d0 tx=0x7fbb740365d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.429+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbb74037b60 con 0x7fbb7c1ab0f0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.429+0000 7fbb848a0640 1 -- 192.168.123.107:0/3116092072 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbb7c1abc50 con 0x7fbb7c1ab0f0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.429+0000 7fbb848a0640 1 -- 192.168.123.107:0/3116092072 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbb7c07b240 con 0x7fbb7c1ab0f0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.429+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbb74036e10 con 0x7fbb7c1ab0f0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.429+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbb74041e20 con 0x7fbb7c1ab0f0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.430+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 9) v1 ==== 49383+0+0 (secure 0 0 0) 0x7fbb7403e030 con 0x7fbb7c1ab0f0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.431+0000 7fbb6f7fe640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 0x7fbb5803f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.431+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 --> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7fbb5803fe50 con 0x7fbb5803d280 2026-03-09T19:22:10.182 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.431+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fbb74075a90 con 0x7fbb7c1ab0f0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.431+0000 7fbb81e14640 1 -- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 msgr2=0x7fbb5803f740 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1318262611 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.431+0000 7fbb81e14640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 0x7fbb5803f740 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.631+0000 7fbb81e14640 1 -- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 msgr2=0x7fbb5803f740 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1318262611 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:06.631+0000 7fbb81e14640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 0x7fbb5803f740 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:07.032+0000 7fbb81e14640 1 -- 192.168.123.107:0/3116092072 >> 
[v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 msgr2=0x7fbb5803f740 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1318262611 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:07.032+0000 7fbb81e14640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 0x7fbb5803f740 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:07.833+0000 7fbb81e14640 1 -- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 msgr2=0x7fbb5803f740 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1318262611 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:07.833+0000 7fbb81e14640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 0x7fbb5803f740 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:09.113+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mgrmap(e 10) v1 ==== 49150+0+0 (secure 0 0 0) 0x7fbb740416b0 con 0x7fbb7c1ab0f0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:09.113+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 msgr2=0x7fbb5803f740 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:22:10.182 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:09.113+0000 7fbb6f7fe640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 0x7fbb5803f740 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.116+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mgrmap(e 11) v1 ==== 49277+0+0 (secure 0 0 0) 0x7fbb740486d0 con 0x7fbb7c1ab0f0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.116+0000 7fbb6f7fe640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbb58040d30 0x7fbb58043120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.116+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 --> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7fbb5803fe50 con 0x7fbb58040d30 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.120+0000 7fbb81e14640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbb58040d30 0x7fbb58043120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.123+0000 7fbb81e14640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbb58040d30 
0x7fbb58043120 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fbb68003a80 tx=0x7fbb680092b0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.130+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 <== mgr.14162 v2:192.168.123.107:6800/1021580706 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+7759 (secure 0 0 0) 0x7fbb5803fe50 con 0x7fbb58040d30 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.132+0000 7fbb848a0640 1 -- 192.168.123.107:0/3116092072 --> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7fbb44002670 con 0x7fbb58040d30 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb6f7fe640 1 -- 192.168.123.107:0/3116092072 <== mgr.14162 v2:192.168.123.107:6800/1021580706 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+52 (secure 0 0 0) 0x7fbb44002670 con 0x7fbb58040d30 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb848a0640 1 -- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbb58040d30 msgr2=0x7fbb58043120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb848a0640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbb58040d30 0x7fbb58043120 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fbb68003a80 tx=0x7fbb680092b0 comp rx=0 tx=0).stop 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb848a0640 1 -- 
192.168.123.107:0/3116092072 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c1ab0f0 msgr2=0x7fbb7c1ab510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb848a0640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c1ab0f0 0x7fbb7c1ab510 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fbb7402f4d0 tx=0x7fbb740365d0 comp rx=0 tx=0).stop 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb848a0640 1 -- 192.168.123.107:0/3116092072 shutdown_connections 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb848a0640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbb58040d30 0x7fbb58043120 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb848a0640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:6800/1318262611,v1:192.168.123.107:6801/1318262611] conn(0x7fbb5803d280 0x7fbb5803f740 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:10.182 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb848a0640 1 --2- 192.168.123.107:0/3116092072 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbb7c1ab0f0 0x7fbb7c1ab510 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:10.183 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb848a0640 1 -- 192.168.123.107:0/3116092072 >> 192.168.123.107:0/3116092072 
conn(0x7fbb7c06d080 msgr2=0x7fbb7c06dc10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:10.183 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb848a0640 1 -- 192.168.123.107:0/3116092072 shutdown_connections 2026-03-09T19:22:10.183 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.133+0000 7fbb848a0640 1 -- 192.168.123.107:0/3116092072 wait complete. 2026-03-09T19:22:10.183 INFO:teuthology.orchestra.run.vm07.stdout:mgr epoch 9 is available 2026-03-09T19:22:10.183 INFO:teuthology.orchestra.run.vm07.stdout:Generating a dashboard self-signed certificate... 2026-03-09T19:22:10.373 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:10 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/mirror_snapshot_schedule"}]: dispatch 2026-03-09T19:22:10.373 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:10 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/trash_purge_schedule"}]: dispatch 2026-03-09T19:22:10.373 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:10 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:10.373 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:10 vm07 ceph-mon[48545]: mgrmap e11: vm07.xacuym(active, since 1.01155s) 2026-03-09T19:22:10.552 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout Self-signed certificate created 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.356+0000 7f80e570a640 1 Processor -- start 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.356+0000 7f80e570a640 1 -- start start 2026-03-09T19:22:10.553 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.356+0000 7f80e570a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0071820 0x7f80e0071c20 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.356+0000 7f80e570a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80e00721f0 con 0x7f80e0071820 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.357+0000 7f80dffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0071820 0x7f80e0071c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.357+0000 7f80dffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0071820 0x7f80e0071c20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:50918/0 (socket says 192.168.123.107:50918) 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.357+0000 7f80dffff640 1 -- 192.168.123.107:0/4083903916 learned_addr learned my addr 192.168.123.107:0/4083903916 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.357+0000 7f80dffff640 1 -- 192.168.123.107:0/4083903916 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80e0072330 con 0x7f80e0071820 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:22:10.358+0000 7f80dffff640 1 --2- 192.168.123.107:0/4083903916 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0071820 0x7f80e0071c20 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f80d0009b80 tx=0x7f80d002f190 comp rx=0 tx=0).ready entity=mon.0 client_cookie=50c3a806f78b8077 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.358+0000 7f80deffd640 1 -- 192.168.123.107:0/4083903916 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f80d002fa10 con 0x7f80e0071820 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.358+0000 7f80deffd640 1 -- 192.168.123.107:0/4083903916 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f80d002fb70 con 0x7f80e0071820 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.359+0000 7f80e570a640 1 -- 192.168.123.107:0/4083903916 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0071820 msgr2=0x7f80e0071c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.359+0000 7f80e570a640 1 --2- 192.168.123.107:0/4083903916 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0071820 0x7f80e0071c20 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f80d0009b80 tx=0x7f80d002f190 comp rx=0 tx=0).stop 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.359+0000 7f80e570a640 1 -- 192.168.123.107:0/4083903916 shutdown_connections 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.359+0000 7f80e570a640 1 --2- 192.168.123.107:0/4083903916 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0071820 0x7f80e0071c20 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.359+0000 7f80e570a640 1 -- 192.168.123.107:0/4083903916 >> 192.168.123.107:0/4083903916 conn(0x7f80e006d060 msgr2=0x7f80e006f480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.359+0000 7f80e570a640 1 -- 192.168.123.107:0/4083903916 shutdown_connections 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.359+0000 7f80e570a640 1 -- 192.168.123.107:0/4083903916 wait complete. 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.359+0000 7f80e570a640 1 Processor -- start 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.360+0000 7f80e570a640 1 -- start start 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.360+0000 7f80e570a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0112a50 0x7f80e01a51b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.360+0000 7f80e570a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80d0035410 con 0x7f80e0112a50 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.360+0000 7f80dffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0112a50 0x7f80e01a51b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.360+0000 7f80dffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0112a50 0x7f80e01a51b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:50922/0 (socket says 192.168.123.107:50922) 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.360+0000 7f80dffff640 1 -- 192.168.123.107:0/4141761689 learned_addr learned my addr 192.168.123.107:0/4141761689 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.361+0000 7f80dffff640 1 -- 192.168.123.107:0/4141761689 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80d00095d0 con 0x7f80e0112a50 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.361+0000 7f80dffff640 1 --2- 192.168.123.107:0/4141761689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0112a50 0x7f80e01a51b0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f80d002f6c0 tx=0x7f80d0035a00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.361+0000 7f80dd7fa640 1 -- 192.168.123.107:0/4141761689 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f80d0035ac0 con 0x7f80e0112a50 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.361+0000 7f80e570a640 1 -- 192.168.123.107:0/4141761689 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7f80e0112e70 con 0x7f80e0112a50 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.361+0000 7f80e570a640 1 -- 192.168.123.107:0/4141761689 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f80e01a5940 con 0x7f80e0112a50 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.362+0000 7f80dd7fa640 1 -- 192.168.123.107:0/4141761689 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f80d0035c20 con 0x7f80e0112a50 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.362+0000 7f80dd7fa640 1 -- 192.168.123.107:0/4141761689 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f80d00377e0 con 0x7f80e0112a50 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.363+0000 7f80dd7fa640 1 -- 192.168.123.107:0/4141761689 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 11) v1 ==== 49277+0+0 (secure 0 0 0) 0x7f80d003e070 con 0x7f80e0112a50 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.363+0000 7f80dd7fa640 1 --2- 192.168.123.107:0/4141761689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f80b003d1b0 0x7f80b003f670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.363+0000 7f80dd7fa640 1 -- 192.168.123.107:0/4141761689 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f80d00762c0 con 0x7f80e0112a50 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.363+0000 
7f80e570a640 1 -- 192.168.123.107:0/4141761689 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f80ac005350 con 0x7f80e0112a50 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.364+0000 7f80df7fe640 1 --2- 192.168.123.107:0/4141761689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f80b003d1b0 0x7f80b003f670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.364+0000 7f80df7fe640 1 --2- 192.168.123.107:0/4141761689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f80b003d1b0 0x7f80b003f670 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f80d800ad30 tx=0x7f80d80093f0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.370+0000 7f80dd7fa640 1 -- 192.168.123.107:0/4141761689 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f80d006ee60 con 0x7f80e0112a50 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.473+0000 7f80e570a640 1 -- 192.168.123.107:0/4141761689 --> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] -- mgr_command(tid 0: {"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}) v1 -- 0x7f80ac002bf0 con 0x7f80b003d1b0 2026-03-09T19:22:10.553 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.526+0000 7f80dd7fa640 1 -- 192.168.123.107:0/4141761689 <== mgr.14162 v2:192.168.123.107:6800/1021580706 1 
==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f80ac002bf0 con 0x7f80b003d1b0 2026-03-09T19:22:10.554 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.529+0000 7f80e570a640 1 -- 192.168.123.107:0/4141761689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f80b003d1b0 msgr2=0x7f80b003f670 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:10.554 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.529+0000 7f80e570a640 1 --2- 192.168.123.107:0/4141761689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f80b003d1b0 0x7f80b003f670 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f80d800ad30 tx=0x7f80d80093f0 comp rx=0 tx=0).stop 2026-03-09T19:22:10.554 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.529+0000 7f80e570a640 1 -- 192.168.123.107:0/4141761689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0112a50 msgr2=0x7f80e01a51b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:10.554 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.529+0000 7f80e570a640 1 --2- 192.168.123.107:0/4141761689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0112a50 0x7f80e01a51b0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f80d002f6c0 tx=0x7f80d0035a00 comp rx=0 tx=0).stop 2026-03-09T19:22:10.554 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.529+0000 7f80e570a640 1 -- 192.168.123.107:0/4141761689 shutdown_connections 2026-03-09T19:22:10.554 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.529+0000 7f80e570a640 1 --2- 192.168.123.107:0/4141761689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f80b003d1b0 0x7f80b003f670 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:10.554 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.529+0000 7f80e570a640 1 --2- 192.168.123.107:0/4141761689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e0112a50 0x7f80e01a51b0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:10.554 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.529+0000 7f80e570a640 1 -- 192.168.123.107:0/4141761689 >> 192.168.123.107:0/4141761689 conn(0x7f80e006d060 msgr2=0x7f80e006e4e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:10.554 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.529+0000 7f80e570a640 1 -- 192.168.123.107:0/4141761689 shutdown_connections 2026-03-09T19:22:10.554 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.529+0000 7f80e570a640 1 -- 192.168.123.107:0/4141761689 wait complete. 2026-03-09T19:22:10.554 INFO:teuthology.orchestra.run.vm07.stdout:Creating initial admin user... 
2026-03-09T19:22:10.999 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout {"username": "admin", "password": "$2b$12$Al7rq0hNnm1y5xUoQgsg2u/7zx0L9Lk9QDXbs.OShGX3ScEZY6d1q", "roles": ["administrator"], "name": null, "email": null, "lastUpdate": 1773084130, "enabled": true, "pwdExpirationDate": null, "pwdUpdateRequired": true} 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.695+0000 7f4a0a46e640 1 Processor -- start 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.695+0000 7f4a0a46e640 1 -- start start 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.696+0000 7f4a0a46e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a04106530 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.696+0000 7f4a0a46e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a04106b00 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.696+0000 7f4a03fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a04106530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.696+0000 7f4a03fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a04106530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:50934/0 (socket says 
192.168.123.107:50934) 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.696+0000 7f4a03fff640 1 -- 192.168.123.107:0/3196087941 learned_addr learned my addr 192.168.123.107:0/3196087941 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.696+0000 7f4a03fff640 1 -- 192.168.123.107:0/3196087941 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4a04107290 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.697+0000 7f4a03fff640 1 --2- 192.168.123.107:0/3196087941 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a04106530 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f49f4009920 tx=0x7f49f402ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=4599ce1a52ee2f81 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.697+0000 7f4a02ffd640 1 -- 192.168.123.107:0/3196087941 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f49f402f9b0 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.697+0000 7f4a02ffd640 1 -- 192.168.123.107:0/3196087941 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f49f4037440 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.698+0000 7f4a0a46e640 1 -- 192.168.123.107:0/3196087941 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 msgr2=0x7f4a04106530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.000 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.698+0000 7f4a0a46e640 1 --2- 192.168.123.107:0/3196087941 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a04106530 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f49f4009920 tx=0x7f49f402ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.698+0000 7f4a0a46e640 1 -- 192.168.123.107:0/3196087941 shutdown_connections 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.698+0000 7f4a0a46e640 1 --2- 192.168.123.107:0/3196087941 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a04106530 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.698+0000 7f4a0a46e640 1 -- 192.168.123.107:0/3196087941 >> 192.168.123.107:0/3196087941 conn(0x7f4a041018e0 msgr2=0x7f4a04103d00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.698+0000 7f4a0a46e640 1 -- 192.168.123.107:0/3196087941 shutdown_connections 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.698+0000 7f4a0a46e640 1 -- 192.168.123.107:0/3196087941 wait complete. 
2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.699+0000 7f4a0a46e640 1 Processor -- start 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.699+0000 7f4a0a46e640 1 -- start start 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.699+0000 7f4a0a46e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a0419e100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.699+0000 7f4a0a46e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f49f40353c0 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.699+0000 7f4a03fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a0419e100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.699+0000 7f4a03fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a0419e100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:50948/0 (socket says 192.168.123.107:50948) 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.699+0000 7f4a03fff640 1 -- 192.168.123.107:0/715713773 learned_addr learned my addr 192.168.123.107:0/715713773 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:11.000 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.699+0000 7f4a03fff640 1 -- 192.168.123.107:0/715713773 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f49f40095d0 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.699+0000 7f4a03fff640 1 --2- 192.168.123.107:0/715713773 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a0419e100 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f49f402f450 tx=0x7f49f4035f00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.700+0000 7f4a017fa640 1 -- 192.168.123.107:0/715713773 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f49f4037880 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.700+0000 7f4a017fa640 1 -- 192.168.123.107:0/715713773 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f49f40379e0 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.700+0000 7f4a0a46e640 1 -- 192.168.123.107:0/715713773 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4a0419e640 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.700+0000 7f4a0a46e640 1 -- 192.168.123.107:0/715713773 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4a0419eae0 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.700+0000 7f4a017fa640 1 -- 
192.168.123.107:0/715713773 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f49f4036540 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.701+0000 7f4a017fa640 1 -- 192.168.123.107:0/715713773 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 11) v1 ==== 49277+0+0 (secure 0 0 0) 0x7f49f403e070 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.701+0000 7f4a017fa640 1 --2- 192.168.123.107:0/715713773 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f49d8045b20 0x7f49d8047fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.701+0000 7f4a017fa640 1 -- 192.168.123.107:0/715713773 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f49f40765f0 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.701+0000 7f4a037fe640 1 --2- 192.168.123.107:0/715713773 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f49d8045b20 0x7f49d8047fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.702+0000 7f4a037fe640 1 --2- 192.168.123.107:0/715713773 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f49d8045b20 0x7f49d8047fe0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f49f0009a10 tx=0x7f49f0006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.000 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.702+0000 7f4a0a46e640 1 -- 192.168.123.107:0/715713773 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f49d0005350 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.705+0000 7f4a017fa640 1 -- 192.168.123.107:0/715713773 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f49f403e930 con 0x7f4a04106130 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.797+0000 7f4a0a46e640 1 -- 192.168.123.107:0/715713773 --> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] -- mgr_command(tid 0: {"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}) v1 -- 0x7f49d0003c00 con 0x7f49d8045b20 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.949+0000 7f4a017fa640 1 -- 192.168.123.107:0/715713773 <== mgr.14162 v2:192.168.123.107:6800/1021580706 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+252 (secure 0 0 0) 0x7f49d0003c00 con 0x7f49d8045b20 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.952+0000 7f4a0a46e640 1 -- 192.168.123.107:0/715713773 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f49d8045b20 msgr2=0x7f49d8047fe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.952+0000 7f4a0a46e640 1 --2- 192.168.123.107:0/715713773 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] 
conn(0x7f49d8045b20 0x7f49d8047fe0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f49f0009a10 tx=0x7f49f0006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.952+0000 7f4a0a46e640 1 -- 192.168.123.107:0/715713773 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 msgr2=0x7f4a0419e100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.952+0000 7f4a0a46e640 1 --2- 192.168.123.107:0/715713773 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a0419e100 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f49f402f450 tx=0x7f49f4035f00 comp rx=0 tx=0).stop 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.952+0000 7f4a0a46e640 1 -- 192.168.123.107:0/715713773 shutdown_connections 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.952+0000 7f4a0a46e640 1 --2- 192.168.123.107:0/715713773 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f49d8045b20 0x7f49d8047fe0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.952+0000 7f4a0a46e640 1 --2- 192.168.123.107:0/715713773 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a04106130 0x7f4a0419e100 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.952+0000 7f4a0a46e640 1 -- 192.168.123.107:0/715713773 >> 192.168.123.107:0/715713773 conn(0x7f4a041018e0 msgr2=0x7f4a04102330 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:11.000 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.952+0000 7f4a0a46e640 1 -- 192.168.123.107:0/715713773 shutdown_connections 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:10.952+0000 7f4a0a46e640 1 -- 192.168.123.107:0/715713773 wait complete. 2026-03-09T19:22:11.000 INFO:teuthology.orchestra.run.vm07.stdout:Fetching dashboard port number... 2026-03-09T19:22:11.207 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:11 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:11.207 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:11 vm07 ceph-mon[48545]: [09/Mar/2026:19:22:10] ENGINE Bus STARTING 2026-03-09T19:22:11.207 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:11 vm07 ceph-mon[48545]: [09/Mar/2026:19:22:10] ENGINE Serving on https://192.168.123.107:7150 2026-03-09T19:22:11.207 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:11 vm07 ceph-mon[48545]: [09/Mar/2026:19:22:10] ENGINE Client ('192.168.123.107', 42726) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T19:22:11.207 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:11 vm07 ceph-mon[48545]: from='client.14174 -' entity='client.admin' cmd=[{"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:11.207 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:11 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:11.207 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:11 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:11.207 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:11 vm07 ceph-mon[48545]: [09/Mar/2026:19:22:10] ENGINE Serving on http://192.168.123.107:8765 
2026-03-09T19:22:11.207 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:11 vm07 ceph-mon[48545]: [09/Mar/2026:19:22:10] ENGINE Bus STARTED 2026-03-09T19:22:11.207 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:11 vm07 ceph-mon[48545]: from='client.14176 -' entity='client.admin' cmd=[{"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:11.207 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:11 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:11.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stdout 8443 2026-03-09T19:22:11.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.107+0000 7f3231004640 1 Processor -- start 2026-03-09T19:22:11.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.107+0000 7f3231004640 1 -- start start 2026-03-09T19:22:11.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.107+0000 7f3231004640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c1086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.236 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.108+0000 7f3231004640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f322c108cc0 con 0x7f322c1082f0 2026-03-09T19:22:11.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.108+0000 7f322ad76640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c1086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T19:22:11.237 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.108+0000 7f322ad76640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c1086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:50964/0 (socket says 192.168.123.107:50964) 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.108+0000 7f322ad76640 1 -- 192.168.123.107:0/3051973791 learned_addr learned my addr 192.168.123.107:0/3051973791 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.109+0000 7f322ad76640 1 -- 192.168.123.107:0/3051973791 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f322c1094a0 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.110+0000 7f322ad76640 1 --2- 192.168.123.107:0/3051973791 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c1086f0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f3214009920 tx=0x7f321402ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=5a0aa58579c8cdc server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.110+0000 7f3229d74640 1 -- 192.168.123.107:0/3051973791 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f321402f9b0 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.110+0000 7f3229d74640 1 -- 192.168.123.107:0/3051973791 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3214037440 
con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.110+0000 7f3229d74640 1 -- 192.168.123.107:0/3051973791 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3214035560 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.110+0000 7f3231004640 1 -- 192.168.123.107:0/3051973791 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 msgr2=0x7f322c1086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.110+0000 7f3231004640 1 --2- 192.168.123.107:0/3051973791 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c1086f0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f3214009920 tx=0x7f321402ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.111+0000 7f3231004640 1 -- 192.168.123.107:0/3051973791 shutdown_connections 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.111+0000 7f3231004640 1 --2- 192.168.123.107:0/3051973791 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c1086f0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.111+0000 7f3231004640 1 -- 192.168.123.107:0/3051973791 >> 192.168.123.107:0/3051973791 conn(0x7f322c07ba00 msgr2=0x7f322c1066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.111+0000 7f3231004640 1 -- 192.168.123.107:0/3051973791 shutdown_connections 
2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.111+0000 7f3231004640 1 -- 192.168.123.107:0/3051973791 wait complete. 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.112+0000 7f3231004640 1 Processor -- start 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.112+0000 7f3231004640 1 -- start start 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.112+0000 7f3231004640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c19e130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.112+0000 7f3231004640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f322c19e670 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.112+0000 7f322ad76640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c19e130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.112+0000 7f322ad76640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c19e130 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:50980/0 (socket says 192.168.123.107:50980) 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.112+0000 7f322ad76640 1 -- 
192.168.123.107:0/741466338 learned_addr learned my addr 192.168.123.107:0/741466338 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.113+0000 7f322ad76640 1 -- 192.168.123.107:0/741466338 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f32140095d0 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.113+0000 7f322ad76640 1 --2- 192.168.123.107:0/741466338 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c19e130 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f3214009a50 tx=0x7f321402fbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.113+0000 7f320bfff640 1 -- 192.168.123.107:0/741466338 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3214035820 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.113+0000 7f320bfff640 1 -- 192.168.123.107:0/741466338 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3214035e40 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.113+0000 7f320bfff640 1 -- 192.168.123.107:0/741466338 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f32140363a0 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.113+0000 7f3231004640 1 -- 192.168.123.107:0/741466338 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f322c19e870 con 
0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.113+0000 7f3231004640 1 -- 192.168.123.107:0/741466338 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f322c19ec50 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.114+0000 7f3231004640 1 -- 192.168.123.107:0/741466338 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f31f0005350 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.117+0000 7f320bfff640 1 -- 192.168.123.107:0/741466338 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 11) v1 ==== 49277+0+0 (secure 0 0 0) 0x7f321403e070 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.117+0000 7f320bfff640 1 --2- 192.168.123.107:0/741466338 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f320003d160 0x7f320003f620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.117+0000 7f322a575640 1 --2- 192.168.123.107:0/741466338 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f320003d160 0x7f320003f620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.117+0000 7f322a575640 1 --2- 192.168.123.107:0/741466338 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f320003d160 0x7f320003f620 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 
crypto rx=0x7f3220009a10 tx=0x7f3220006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.117+0000 7f320bfff640 1 -- 192.168.123.107:0/741466338 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f3214075e70 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.117+0000 7f320bfff640 1 -- 192.168.123.107:0/741466338 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f32140762f0 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.209+0000 7f3231004640 1 -- 192.168.123.107:0/741466338 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"} v 0) v1 -- 0x7f31f00051c0 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.211+0000 7f320bfff640 1 -- 192.168.123.107:0/741466338 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]=0 v8) v1 ==== 112+0+5 (secure 0 0 0) 0x7f321403c030 con 0x7f322c1082f0 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.213+0000 7f3231004640 1 -- 192.168.123.107:0/741466338 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f320003d160 msgr2=0x7f320003f620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.213+0000 7f3231004640 1 --2- 
192.168.123.107:0/741466338 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f320003d160 0x7f320003f620 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f3220009a10 tx=0x7f3220006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.214+0000 7f3231004640 1 -- 192.168.123.107:0/741466338 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 msgr2=0x7f322c19e130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.214+0000 7f3231004640 1 --2- 192.168.123.107:0/741466338 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c19e130 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f3214009a50 tx=0x7f321402fbe0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.214+0000 7f3231004640 1 -- 192.168.123.107:0/741466338 shutdown_connections 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.214+0000 7f3231004640 1 --2- 192.168.123.107:0/741466338 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f320003d160 0x7f320003f620 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.214+0000 7f3231004640 1 --2- 192.168.123.107:0/741466338 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f322c1082f0 0x7f322c19e130 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.214+0000 7f3231004640 1 -- 192.168.123.107:0/741466338 >> 192.168.123.107:0/741466338 
conn(0x7f322c07ba00 msgr2=0x7f322c105d80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.214+0000 7f3231004640 1 -- 192.168.123.107:0/741466338 shutdown_connections 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.214+0000 7f3231004640 1 -- 192.168.123.107:0/741466338 wait complete. 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:firewalld does not appear to be present 2026-03-09T19:22:11.238 INFO:teuthology.orchestra.run.vm07.stdout:Not possible to open ports <[8443]>. firewalld.service is not available 2026-03-09T19:22:11.240 INFO:teuthology.orchestra.run.vm07.stdout:Ceph Dashboard is now available at: 2026-03-09T19:22:11.240 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:22:11.240 INFO:teuthology.orchestra.run.vm07.stdout: URL: https://vm07.local:8443/ 2026-03-09T19:22:11.240 INFO:teuthology.orchestra.run.vm07.stdout: User: admin 2026-03-09T19:22:11.240 INFO:teuthology.orchestra.run.vm07.stdout: Password: 77siejwftc 2026-03-09T19:22:11.240 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:22:11.240 INFO:teuthology.orchestra.run.vm07.stdout:Saving cluster configuration to /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config directory 2026-03-09T19:22:11.241 INFO:teuthology.orchestra.run.vm07.stdout:Enabling autotune for osd_memory_target 2026-03-09T19:22:11.479 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.356+0000 7f49e5745640 1 Processor -- start 2026-03-09T19:22:11.479 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.356+0000 7f49e5745640 1 -- start start 2026-03-09T19:22:11.479 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.356+0000 7f49e5745640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e0104220 unknown :-1 
s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.479 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.356+0000 7f49e5745640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f49e01047f0 con 0x7f49e0103e20 2026-03-09T19:22:11.479 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.356+0000 7f49deffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e0104220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:11.479 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.356+0000 7f49deffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e0104220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:50990/0 (socket says 192.168.123.107:50990) 2026-03-09T19:22:11.479 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.356+0000 7f49deffd640 1 -- 192.168.123.107:0/2880328380 learned_addr learned my addr 192.168.123.107:0/2880328380 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:11.480 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.357+0000 7f49deffd640 1 -- 192.168.123.107:0/2880328380 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f49e0104fb0 con 0x7f49e0103e20 2026-03-09T19:22:11.480 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.357+0000 7f49deffd640 1 --2- 192.168.123.107:0/2880328380 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e0104220 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto 
rx=0x7f49c8009920 tx=0x7f49c802ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=7ad5ae7eda14897 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.480 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.357+0000 7f49ddffb640 1 -- 192.168.123.107:0/2880328380 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f49c802f9b0 con 0x7f49e0103e20 2026-03-09T19:22:11.480 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.357+0000 7f49ddffb640 1 -- 192.168.123.107:0/2880328380 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f49c8037440 con 0x7f49e0103e20 2026-03-09T19:22:11.480 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.357+0000 7f49ddffb640 1 -- 192.168.123.107:0/2880328380 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f49c8035560 con 0x7f49e0103e20 2026-03-09T19:22:11.480 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.358+0000 7f49e5745640 1 -- 192.168.123.107:0/2880328380 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 msgr2=0x7f49e0104220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.480 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.358+0000 7f49e5745640 1 --2- 192.168.123.107:0/2880328380 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e0104220 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f49c8009920 tx=0x7f49c802ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:11.480 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.358+0000 7f49e5745640 1 -- 192.168.123.107:0/2880328380 shutdown_connections 2026-03-09T19:22:11.480 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.358+0000 
7f49e5745640 1 --2- 192.168.123.107:0/2880328380 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e0104220 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.480 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.358+0000 7f49e5745640 1 -- 192.168.123.107:0/2880328380 >> 192.168.123.107:0/2880328380 conn(0x7f49e00ff960 msgr2=0x7f49e0101dc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:11.480 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.358+0000 7f49e5745640 1 -- 192.168.123.107:0/2880328380 shutdown_connections 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.358+0000 7f49e5745640 1 -- 192.168.123.107:0/2880328380 wait complete. 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49e5745640 1 Processor -- start 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49e5745640 1 -- start start 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49e5745640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e01a2620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49e5745640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f49e01a2b60 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49deffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e01a2620 unknown :-1 s=BANNER_CONNECTING pgs=0 
cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49deffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e01a2620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:51004/0 (socket says 192.168.123.107:51004) 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49deffd640 1 -- 192.168.123.107:0/2712955325 learned_addr learned my addr 192.168.123.107:0/2712955325 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49deffd640 1 -- 192.168.123.107:0/2712955325 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f49c80095d0 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49deffd640 1 --2- 192.168.123.107:0/2712955325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e01a2620 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f49c8037d90 tx=0x7f49c802fbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49bffff640 1 -- 192.168.123.107:0/2712955325 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f49c8035820 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49bffff640 1 -- 192.168.123.107:0/2712955325 <== mon.0 
v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f49c8035e40 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49e5745640 1 -- 192.168.123.107:0/2712955325 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f49e01a2d60 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.359+0000 7f49e5745640 1 -- 192.168.123.107:0/2712955325 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f49e01a3200 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.360+0000 7f49bffff640 1 -- 192.168.123.107:0/2712955325 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f49c80363a0 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.360+0000 7f49bffff640 1 -- 192.168.123.107:0/2712955325 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 11) v1 ==== 49277+0+0 (secure 0 0 0) 0x7f49c803e070 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.361+0000 7f49bffff640 1 --2- 192.168.123.107:0/2712955325 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f49b403d160 0x7f49b403f620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.361+0000 7f49bffff640 1 -- 192.168.123.107:0/2712955325 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f49c80765b0 con 0x7f49e0103e20 2026-03-09T19:22:11.481 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.361+0000 7f49de7fc640 1 --2- 192.168.123.107:0/2712955325 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f49b403d160 0x7f49b403f620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.362+0000 7f49de7fc640 1 --2- 192.168.123.107:0/2712955325 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f49b403d160 0x7f49b403f620 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f49d40099c0 tx=0x7f49d4006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.362+0000 7f49e5745640 1 -- 192.168.123.107:0/2712955325 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f49e0104220 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.365+0000 7f49bffff640 1 -- 192.168.123.107:0/2712955325 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f49c8036e20 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.452+0000 7f49e5745640 1 -- 192.168.123.107:0/2712955325 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command([{prefix=config set, name=osd_memory_target_autotune}] v 0) v1 -- 0x7f49e0108200 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.452+0000 7f49bffff640 1 -- 
192.168.123.107:0/2712955325 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{prefix=config set, name=osd_memory_target_autotune}]=0 v8) v1 ==== 127+0+0 (secure 0 0 0) 0x7f49c803e350 con 0x7f49e0103e20 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.455+0000 7f49e5745640 1 -- 192.168.123.107:0/2712955325 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f49b403d160 msgr2=0x7f49b403f620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.455+0000 7f49e5745640 1 --2- 192.168.123.107:0/2712955325 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f49b403d160 0x7f49b403f620 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f49d40099c0 tx=0x7f49d4006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.455+0000 7f49e5745640 1 -- 192.168.123.107:0/2712955325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 msgr2=0x7f49e01a2620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.455+0000 7f49e5745640 1 --2- 192.168.123.107:0/2712955325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e01a2620 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f49c8037d90 tx=0x7f49c802fbe0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.455+0000 7f49e5745640 1 -- 192.168.123.107:0/2712955325 shutdown_connections 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.455+0000 7f49e5745640 1 --2- 192.168.123.107:0/2712955325 >> 
[v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f49b403d160 0x7f49b403f620 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.455+0000 7f49e5745640 1 --2- 192.168.123.107:0/2712955325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49e0103e20 0x7f49e01a2620 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.455+0000 7f49e5745640 1 -- 192.168.123.107:0/2712955325 >> 192.168.123.107:0/2712955325 conn(0x7f49e00ff960 msgr2=0x7f49e010ac50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.455+0000 7f49e5745640 1 -- 192.168.123.107:0/2712955325 shutdown_connections 2026-03-09T19:22:11.481 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.455+0000 7f49e5745640 1 -- 192.168.123.107:0/2712955325 wait complete. 
2026-03-09T19:22:11.772 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.593+0000 7fbec88bc640 1 Processor -- start 2026-03-09T19:22:11.772 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.594+0000 7fbec88bc640 1 -- start start 2026-03-09T19:22:11.772 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.594+0000 7fbec88bc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec01082f0 0x7fbec01086f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.772 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.594+0000 7fbec88bc640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbec0108cc0 con 0x7fbec01082f0 2026-03-09T19:22:11.772 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.595+0000 7fbec6631640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec01082f0 0x7fbec01086f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:11.772 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.595+0000 7fbec6631640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec01082f0 0x7fbec01086f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:51020/0 (socket says 192.168.123.107:51020) 2026-03-09T19:22:11.772 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.595+0000 7fbec6631640 1 -- 192.168.123.107:0/3854853029 learned_addr learned my addr 192.168.123.107:0/3854853029 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:11.772 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.595+0000 7fbec6631640 1 -- 192.168.123.107:0/3854853029 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbec01094a0 con 0x7fbec01082f0 2026-03-09T19:22:11.772 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.595+0000 7fbec6631640 1 --2- 192.168.123.107:0/3854853029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec01082f0 0x7fbec01086f0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fbeac009920 tx=0x7fbeac02ef20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=461a1b571f8abaa7 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.596+0000 7fbec562f640 1 -- 192.168.123.107:0/3854853029 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbeac02f9b0 con 0x7fbec01082f0 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.596+0000 7fbec562f640 1 -- 192.168.123.107:0/3854853029 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbeac037440 con 0x7fbec01082f0 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.596+0000 7fbec562f640 1 -- 192.168.123.107:0/3854853029 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbeac035560 con 0x7fbec01082f0 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.596+0000 7fbec88bc640 1 -- 192.168.123.107:0/3854853029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec01082f0 msgr2=0x7fbec01086f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 
2026-03-09T19:22:11.596+0000 7fbec88bc640 1 --2- 192.168.123.107:0/3854853029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec01082f0 0x7fbec01086f0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fbeac009920 tx=0x7fbeac02ef20 comp rx=0 tx=0).stop 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.596+0000 7fbec88bc640 1 -- 192.168.123.107:0/3854853029 shutdown_connections 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.596+0000 7fbec88bc640 1 --2- 192.168.123.107:0/3854853029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec01082f0 0x7fbec01086f0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.596+0000 7fbec88bc640 1 -- 192.168.123.107:0/3854853029 >> 192.168.123.107:0/3854853029 conn(0x7fbec007ba00 msgr2=0x7fbec01066a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.596+0000 7fbec88bc640 1 -- 192.168.123.107:0/3854853029 shutdown_connections 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.597+0000 7fbec88bc640 1 -- 192.168.123.107:0/3854853029 wait complete. 
2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.597+0000 7fbec88bc640 1 Processor -- start 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.597+0000 7fbec88bc640 1 -- start start 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.597+0000 7fbec88bc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec019e390 0x7fbec019e7b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.597+0000 7fbec88bc640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbec019ecf0 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.598+0000 7fbec6631640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec019e390 0x7fbec019e7b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.598+0000 7fbec6631640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec019e390 0x7fbec019e7b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:51024/0 (socket says 192.168.123.107:51024) 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.598+0000 7fbec6631640 1 -- 192.168.123.107:0/1412968938 learned_addr learned my addr 192.168.123.107:0/1412968938 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:11.774 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.598+0000 7fbec6631640 1 -- 192.168.123.107:0/1412968938 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbeac0095d0 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.598+0000 7fbec6631640 1 --2- 192.168.123.107:0/1412968938 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec019e390 0x7fbec019e7b0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fbeac009a50 tx=0x7fbeac037990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.598+0000 7fbeab7fe640 1 -- 192.168.123.107:0/1412968938 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbeac02fdf0 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.598+0000 7fbeab7fe640 1 -- 192.168.123.107:0/1412968938 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbeac035ce0 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.599+0000 7fbeab7fe640 1 -- 192.168.123.107:0/1412968938 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbeac041c60 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.599+0000 7fbec88bc640 1 -- 192.168.123.107:0/1412968938 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbec019eef0 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.599+0000 7fbec88bc640 1 
-- 192.168.123.107:0/1412968938 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbec01a1a60 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.600+0000 7fbeab7fe640 1 -- 192.168.123.107:0/1412968938 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 11) v1 ==== 49277+0+0 (secure 0 0 0) 0x7fbeac03e030 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.600+0000 7fbec88bc640 1 -- 192.168.123.107:0/1412968938 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbec01086f0 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.601+0000 7fbeab7fe640 1 --2- 192.168.123.107:0/1412968938 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbe9403cdf0 0x7fbe9403f2b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.601+0000 7fbeab7fe640 1 -- 192.168.123.107:0/1412968938 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fbeac0758f0 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.602+0000 7fbeab7fe640 1 -- 192.168.123.107:0/1412968938 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fbeac041400 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.603+0000 7fbec5e30640 1 --2- 192.168.123.107:0/1412968938 >> 
[v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbe9403cdf0 0x7fbe9403f2b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.603+0000 7fbec5e30640 1 --2- 192.168.123.107:0/1412968938 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbe9403cdf0 0x7fbe9403f2b0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fbeb8009a10 tx=0x7fbeb8006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.727+0000 7fbec88bc640 1 -- 192.168.123.107:0/1412968938 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0) v1 -- 0x7fbec010c6d0 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.730+0000 7fbeab7fe640 1 -- 192.168.123.107:0/1412968938 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{prefix=config-key set, key=mgr/dashboard/cluster/status}]=0 set mgr/dashboard/cluster/status v34) v1 ==== 153+0+0 (secure 0 0 0) 0x7fbeac03c020 con 0x7fbec019e390 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr set mgr/dashboard/cluster/status 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.732+0000 7fbec88bc640 1 -- 192.168.123.107:0/1412968938 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbe9403cdf0 msgr2=0x7fbe9403f2b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.774 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.732+0000 7fbec88bc640 1 --2- 192.168.123.107:0/1412968938 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbe9403cdf0 0x7fbe9403f2b0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fbeb8009a10 tx=0x7fbeb8006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.732+0000 7fbec88bc640 1 -- 192.168.123.107:0/1412968938 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec019e390 msgr2=0x7fbec019e7b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.732+0000 7fbec88bc640 1 --2- 192.168.123.107:0/1412968938 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec019e390 0x7fbec019e7b0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fbeac009a50 tx=0x7fbeac037990 comp rx=0 tx=0).stop 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.732+0000 7fbec88bc640 1 -- 192.168.123.107:0/1412968938 shutdown_connections 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.732+0000 7fbec88bc640 1 --2- 192.168.123.107:0/1412968938 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fbe9403cdf0 0x7fbe9403f2b0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.732+0000 7fbec88bc640 1 --2- 192.168.123.107:0/1412968938 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbec019e390 0x7fbec019e7b0 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:11.774 
INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.732+0000 7fbec88bc640 1 -- 192.168.123.107:0/1412968938 >> 192.168.123.107:0/1412968938 conn(0x7fbec007ba00 msgr2=0x7fbec0105dc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.733+0000 7fbec88bc640 1 -- 192.168.123.107:0/1412968938 shutdown_connections 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:/usr/bin/ceph: stderr 2026-03-09T19:22:11.733+0000 7fbec88bc640 1 -- 192.168.123.107:0/1412968938 wait complete. 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:You can access the Ceph CLI as following in case of multi-cluster or non-default config: 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout: sudo /home/ubuntu/cephtest/cephadm shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout:Or, if you are only running a single cluster on this host: 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:22:11.774 INFO:teuthology.orchestra.run.vm07.stdout: sudo /home/ubuntu/cephtest/cephadm shell 2026-03-09T19:22:11.775 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:22:11.775 INFO:teuthology.orchestra.run.vm07.stdout:Please consider enabling telemetry to help improve Ceph: 2026-03-09T19:22:11.775 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:22:11.775 INFO:teuthology.orchestra.run.vm07.stdout: ceph telemetry on 2026-03-09T19:22:11.775 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:22:11.775 INFO:teuthology.orchestra.run.vm07.stdout:For more information see: 2026-03-09T19:22:11.775 INFO:teuthology.orchestra.run.vm07.stdout: 
2026-03-09T19:22:11.775 INFO:teuthology.orchestra.run.vm07.stdout: https://docs.ceph.com/en/latest/mgr/telemetry/ 2026-03-09T19:22:11.775 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:22:11.775 INFO:teuthology.orchestra.run.vm07.stdout:Bootstrap complete. 2026-03-09T19:22:11.799 INFO:tasks.cephadm:Fetching config... 2026-03-09T19:22:11.799 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:22:11.799 DEBUG:teuthology.orchestra.run.vm07:> dd if=/etc/ceph/ceph.conf of=/dev/stdout 2026-03-09T19:22:11.873 INFO:tasks.cephadm:Fetching client.admin keyring... 2026-03-09T19:22:11.873 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:22:11.873 DEBUG:teuthology.orchestra.run.vm07:> dd if=/etc/ceph/ceph.client.admin.keyring of=/dev/stdout 2026-03-09T19:22:11.939 INFO:tasks.cephadm:Fetching mon keyring... 2026-03-09T19:22:11.939 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:22:11.939 DEBUG:teuthology.orchestra.run.vm07:> sudo dd if=/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/keyring of=/dev/stdout 2026-03-09T19:22:12.009 INFO:tasks.cephadm:Fetching pub ssh key... 2026-03-09T19:22:12.009 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:22:12.009 DEBUG:teuthology.orchestra.run.vm07:> dd if=/home/ubuntu/cephtest/ceph.pub of=/dev/stdout 2026-03-09T19:22:12.067 INFO:tasks.cephadm:Installing pub ssh key for root users... 
2026-03-09T19:22:12.067 DEBUG:teuthology.orchestra.run.vm07:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMjqCGWP8Zd5gsOjmAx27Qho/yQRsupkQt5TbUvfXgbc ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-09T19:22:12.152 INFO:teuthology.orchestra.run.vm07.stdout:ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMjqCGWP8Zd5gsOjmAx27Qho/yQRsupkQt5TbUvfXgbc ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:22:12.170 DEBUG:teuthology.orchestra.run.vm08:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMjqCGWP8Zd5gsOjmAx27Qho/yQRsupkQt5TbUvfXgbc ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-09T19:22:12.201 INFO:teuthology.orchestra.run.vm08.stdout:ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMjqCGWP8Zd5gsOjmAx27Qho/yQRsupkQt5TbUvfXgbc ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:22:12.210 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph config set mgr mgr/cephadm/allow_ptrace true 2026-03-09T19:22:12.383 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:22:12.410 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:12 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/741466338' entity='client.admin' cmd=[{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]: dispatch 2026-03-09T19:22:12.410 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:12 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1412968938' entity='client.admin' 2026-03-09T19:22:12.410 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:12 vm07 ceph-mon[48545]: mgrmap e12: vm07.xacuym(active, since 2s) 2026-03-09T19:22:12.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.644+0000 7fcec90d6640 1 -- 192.168.123.107:0/324992417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcec4072f10 msgr2=0x7fcec4071430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:12.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.644+0000 7fcec90d6640 1 --2- 192.168.123.107:0/324992417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcec4072f10 0x7fcec4071430 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fceb4009a00 tx=0x7fceb402f310 comp rx=0 tx=0).stop 2026-03-09T19:22:12.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.645+0000 7fcec90d6640 1 -- 192.168.123.107:0/324992417 shutdown_connections 2026-03-09T19:22:12.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.645+0000 7fcec90d6640 1 --2- 192.168.123.107:0/324992417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcec4072f10 0x7fcec4071430 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:12.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.645+0000 7fcec90d6640 1 -- 192.168.123.107:0/324992417 >> 192.168.123.107:0/324992417 conn(0x7fcec406cfb0 msgr2=0x7fcec406f3f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:12.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.646+0000 7fcec90d6640 1 -- 192.168.123.107:0/324992417 shutdown_connections 2026-03-09T19:22:12.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.646+0000 7fcec90d6640 1 -- 192.168.123.107:0/324992417 wait complete. 
2026-03-09T19:22:12.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.646+0000 7fcec90d6640 1 Processor -- start 2026-03-09T19:22:12.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.646+0000 7fcec90d6640 1 -- start start 2026-03-09T19:22:12.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.646+0000 7fcec90d6640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcec4072f10 0x7fcec41b1790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:12.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.646+0000 7fcec90d6640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcec41b1cd0 con 0x7fcec4072f10 2026-03-09T19:22:12.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.647+0000 7fcec3fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcec4072f10 0x7fcec41b1790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:12.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.647+0000 7fcec3fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcec4072f10 0x7fcec41b1790 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:51040/0 (socket says 192.168.123.107:51040) 2026-03-09T19:22:12.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.647+0000 7fcec3fff640 1 -- 192.168.123.107:0/3548129324 learned_addr learned my addr 192.168.123.107:0/3548129324 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:12.646 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.647+0000 7fcec3fff640 1 -- 192.168.123.107:0/3548129324 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fceb4009660 con 0x7fcec4072f10 2026-03-09T19:22:12.646 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.647+0000 7fcec3fff640 1 --2- 192.168.123.107:0/3548129324 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcec4072f10 0x7fcec41b1790 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fceb40059c0 tx=0x7fceb4004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:12.646 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.647+0000 7fcec1ffb640 1 -- 192.168.123.107:0/3548129324 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fceb4004450 con 0x7fcec4072f10 2026-03-09T19:22:12.646 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.647+0000 7fcec90d6640 1 -- 192.168.123.107:0/3548129324 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcec41b1e10 con 0x7fcec4072f10 2026-03-09T19:22:12.646 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.647+0000 7fcec90d6640 1 -- 192.168.123.107:0/3548129324 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcec41b21f0 con 0x7fcec4072f10 2026-03-09T19:22:12.646 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.648+0000 7fcec1ffb640 1 -- 192.168.123.107:0/3548129324 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fceb4038bf0 con 0x7fcec4072f10 2026-03-09T19:22:12.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.649+0000 7fcec1ffb640 1 -- 192.168.123.107:0/3548129324 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fceb4041a30 con 0x7fcec4072f10 2026-03-09T19:22:12.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.649+0000 7fcec1ffb640 1 -- 192.168.123.107:0/3548129324 <== mon.0 
v2:192.168.123.107:3300/0 4 ==== mgrmap(e 12) v1 ==== 49383+0+0 (secure 0 0 0) 0x7fceb4041c50 con 0x7fcec4072f10 2026-03-09T19:22:12.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.649+0000 7fcec1ffb640 1 --2- 192.168.123.107:0/3548129324 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fce9803cf10 0x7fce9803f3d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:12.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.649+0000 7fcec90d6640 1 -- 192.168.123.107:0/3548129324 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fce84005350 con 0x7fcec4072f10 2026-03-09T19:22:12.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.649+0000 7fcebbfff640 1 --2- 192.168.123.107:0/3548129324 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fce9803cf10 0x7fce9803f3d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:12.650 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.649+0000 7fcebbfff640 1 --2- 192.168.123.107:0/3548129324 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fce9803cf10 0x7fce9803f3d0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7fceac009a10 tx=0x7fceac006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:12.650 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.649+0000 7fcec1ffb640 1 -- 192.168.123.107:0/3548129324 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fceb4077290 con 0x7fcec4072f10 2026-03-09T19:22:12.650 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.652+0000 7fcec1ffb640 1 -- 192.168.123.107:0/3548129324 <== mon.0 
v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fceb404b460 con 0x7fcec4072f10 2026-03-09T19:22:12.751 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.753+0000 7fcec90d6640 1 -- 192.168.123.107:0/3548129324 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/allow_ptrace}] v 0) v1 -- 0x7fce84005b80 con 0x7fcec4072f10 2026-03-09T19:22:12.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.756+0000 7fcec1ffb640 1 -- 192.168.123.107:0/3548129324 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/allow_ptrace}]=0 v9) v1 ==== 125+0+0 (secure 0 0 0) 0x7fceb4037840 con 0x7fcec4072f10 2026-03-09T19:22:12.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.761+0000 7fcebaffd640 1 -- 192.168.123.107:0/3548129324 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fce9803cf10 msgr2=0x7fce9803f3d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:12.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.761+0000 7fcebaffd640 1 --2- 192.168.123.107:0/3548129324 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fce9803cf10 0x7fce9803f3d0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7fceac009a10 tx=0x7fceac006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:12.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.761+0000 7fcebaffd640 1 -- 192.168.123.107:0/3548129324 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcec4072f10 msgr2=0x7fcec41b1790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:12.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.761+0000 7fcebaffd640 1 --2- 192.168.123.107:0/3548129324 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fcec4072f10 0x7fcec41b1790 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fceb40059c0 tx=0x7fceb4004290 comp rx=0 tx=0).stop 2026-03-09T19:22:12.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.762+0000 7fcebaffd640 1 -- 192.168.123.107:0/3548129324 shutdown_connections 2026-03-09T19:22:12.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.762+0000 7fcebaffd640 1 --2- 192.168.123.107:0/3548129324 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fce9803cf10 0x7fce9803f3d0 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:12.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.762+0000 7fcebaffd640 1 --2- 192.168.123.107:0/3548129324 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcec4072f10 0x7fcec41b1790 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:12.760 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.762+0000 7fcebaffd640 1 -- 192.168.123.107:0/3548129324 >> 192.168.123.107:0/3548129324 conn(0x7fcec406cfb0 msgr2=0x7fcec406fe30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:12.760 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.762+0000 7fcebaffd640 1 -- 192.168.123.107:0/3548129324 shutdown_connections 2026-03-09T19:22:12.760 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:12.762+0000 7fcebaffd640 1 -- 192.168.123.107:0/3548129324 wait complete. 
2026-03-09T19:22:12.804 INFO:tasks.cephadm:Distributing conf and client.admin keyring to all hosts + 0755 2026-03-09T19:22:12.804 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph orch client-keyring set client.admin '*' --mode 0755 2026-03-09T19:22:13.010 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:22:13.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.279+0000 7f61d587f640 1 -- 192.168.123.107:0/1148915993 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f61d0071680 msgr2=0x7f61d0071a80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:13.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.279+0000 7f61d587f640 1 --2- 192.168.123.107:0/1148915993 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f61d0071680 0x7f61d0071a80 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f61c4009a00 tx=0x7f61c402f310 comp rx=0 tx=0).stop 2026-03-09T19:22:13.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.279+0000 7f61d587f640 1 -- 192.168.123.107:0/1148915993 shutdown_connections 2026-03-09T19:22:13.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.279+0000 7f61d587f640 1 --2- 192.168.123.107:0/1148915993 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f61d0071680 0x7f61d0071a80 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:13.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.279+0000 7f61d587f640 1 -- 192.168.123.107:0/1148915993 >> 192.168.123.107:0/1148915993 conn(0x7f61d006d1c0 msgr2=0x7f61d006f600 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:13.278 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.279+0000 7f61d587f640 1 -- 192.168.123.107:0/1148915993 shutdown_connections 2026-03-09T19:22:13.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.279+0000 7f61d587f640 1 -- 192.168.123.107:0/1148915993 wait complete. 2026-03-09T19:22:13.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.279+0000 7f61d587f640 1 Processor -- start 2026-03-09T19:22:13.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.280+0000 7f61d587f640 1 -- start start 2026-03-09T19:22:13.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.280+0000 7f61d587f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f61d0071680 0x7f61d01b1960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:13.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.280+0000 7f61d587f640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f61c4002e30 con 0x7f61d0071680 2026-03-09T19:22:13.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.280+0000 7f61ceffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f61d0071680 0x7f61d01b1960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:13.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.280+0000 7f61ceffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f61d0071680 0x7f61d01b1960 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:51054/0 (socket says 192.168.123.107:51054) 2026-03-09T19:22:13.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.280+0000 7f61ceffd640 1 -- 192.168.123.107:0/1190953411 learned_addr learned my addr 
192.168.123.107:0/1190953411 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:13.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.280+0000 7f61ceffd640 1 -- 192.168.123.107:0/1190953411 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f61c4009660 con 0x7f61d0071680 2026-03-09T19:22:13.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.280+0000 7f61ceffd640 1 --2- 192.168.123.107:0/1190953411 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f61d0071680 0x7f61d01b1960 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f61c4002410 tx=0x7f61c4039970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:13.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.281+0000 7f61d487d640 1 -- 192.168.123.107:0/1190953411 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f61c40043d0 con 0x7f61d0071680 2026-03-09T19:22:13.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.281+0000 7f61d487d640 1 -- 192.168.123.107:0/1190953411 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f61c4039ed0 con 0x7f61d0071680 2026-03-09T19:22:13.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.281+0000 7f61d487d640 1 -- 192.168.123.107:0/1190953411 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f61c40403d0 con 0x7f61d0071680 2026-03-09T19:22:13.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.282+0000 7f61d587f640 1 -- 192.168.123.107:0/1190953411 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f61d01b1ea0 con 0x7f61d0071680 2026-03-09T19:22:13.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.282+0000 7f61d587f640 1 -- 192.168.123.107:0/1190953411 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f61d01b2280 con 0x7f61d0071680 2026-03-09T19:22:13.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.283+0000 7f61d487d640 1 -- 192.168.123.107:0/1190953411 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 12) v1 ==== 49383+0+0 (secure 0 0 0) 0x7f61c4041470 con 0x7f61d0071680 2026-03-09T19:22:13.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.283+0000 7f61d487d640 1 --2- 192.168.123.107:0/1190953411 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f61a403d280 0x7f61a403f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:13.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.283+0000 7f61d487d640 1 -- 192.168.123.107:0/1190953411 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f61c40760f0 con 0x7f61d0071680 2026-03-09T19:22:13.282 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.283+0000 7f61d587f640 1 -- 192.168.123.107:0/1190953411 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f619c005350 con 0x7f61d0071680 2026-03-09T19:22:13.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.284+0000 7f61ce7fc640 1 --2- 192.168.123.107:0/1190953411 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f61a403d280 0x7f61a403f740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:13.286 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.286+0000 7f61ce7fc640 1 --2- 192.168.123.107:0/1190953411 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f61a403d280 0x7f61a403f740 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto 
rx=0x7f61b80099c0 tx=0x7f61b8006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:13.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.289+0000 7f61d487d640 1 -- 192.168.123.107:0/1190953411 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f61c403d030 con 0x7f61d0071680 2026-03-09T19:22:13.382 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.384+0000 7f61d587f640 1 -- 192.168.123.107:0/1190953411 --> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] -- mgr_command(tid 0: {"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}) v1 -- 0x7f619c002bf0 con 0x7f61a403d280 2026-03-09T19:22:13.388 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.390+0000 7f61d487d640 1 -- 192.168.123.107:0/1190953411 <== mgr.14162 v2:192.168.123.107:6800/1021580706 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f619c002bf0 con 0x7f61a403d280 2026-03-09T19:22:13.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.393+0000 7f61d587f640 1 -- 192.168.123.107:0/1190953411 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f61a403d280 msgr2=0x7f61a403f740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:13.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.393+0000 7f61d587f640 1 --2- 192.168.123.107:0/1190953411 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f61a403d280 0x7f61a403f740 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f61b80099c0 tx=0x7f61b8006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:13.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.393+0000 7f61d587f640 1 -- 192.168.123.107:0/1190953411 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f61d0071680 msgr2=0x7f61d01b1960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:13.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.393+0000 7f61d587f640 1 --2- 192.168.123.107:0/1190953411 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f61d0071680 0x7f61d01b1960 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f61c4002410 tx=0x7f61c4039970 comp rx=0 tx=0).stop 2026-03-09T19:22:13.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.393+0000 7f61d587f640 1 -- 192.168.123.107:0/1190953411 shutdown_connections 2026-03-09T19:22:13.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.393+0000 7f61d587f640 1 --2- 192.168.123.107:0/1190953411 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f61a403d280 0x7f61a403f740 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:13.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.393+0000 7f61d587f640 1 --2- 192.168.123.107:0/1190953411 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f61d0071680 0x7f61d01b1960 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:13.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.393+0000 7f61d587f640 1 -- 192.168.123.107:0/1190953411 >> 192.168.123.107:0/1190953411 conn(0x7f61d006d1c0 msgr2=0x7f61d006ff60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:13.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.394+0000 7f61d587f640 1 -- 192.168.123.107:0/1190953411 shutdown_connections 2026-03-09T19:22:13.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.394+0000 7f61d587f640 1 -- 192.168.123.107:0/1190953411 wait complete. 
2026-03-09T19:22:13.453 INFO:tasks.cephadm:Writing (initial) conf and keyring to vm08 2026-03-09T19:22:13.453 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-09T19:22:13.453 DEBUG:teuthology.orchestra.run.vm08:> dd of=/etc/ceph/ceph.conf 2026-03-09T19:22:13.468 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-09T19:22:13.468 DEBUG:teuthology.orchestra.run.vm08:> dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:22:13.521 INFO:tasks.cephadm:Adding host vm08 to orchestrator... 2026-03-09T19:22:13.521 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph orch host add vm08 2026-03-09T19:22:13.684 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:22:13.807 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:13 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/3548129324' entity='client.admin' 2026-03-09T19:22:13.807 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:13 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:13.992 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.993+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/1083462420 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5b38102450 msgr2=0x7f5b38102850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:13.992 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.993+0000 7f5b3d2e3640 1 --2- 192.168.123.107:0/1083462420 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5b38102450 0x7f5b38102850 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f5b24009a00 tx=0x7f5b2402f310 comp rx=0 tx=0).stop 2026-03-09T19:22:13.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.994+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/1083462420 shutdown_connections 2026-03-09T19:22:13.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.994+0000 7f5b3d2e3640 1 --2- 192.168.123.107:0/1083462420 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5b38102450 0x7f5b38102850 secure :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f5b24009a00 tx=0x7f5b2402f310 comp rx=0 tx=0).stop 2026-03-09T19:22:13.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.994+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/1083462420 >> 192.168.123.107:0/1083462420 conn(0x7f5b380fdca0 msgr2=0x7f5b38100090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:13.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.994+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/1083462420 shutdown_connections 2026-03-09T19:22:13.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.994+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/1083462420 wait complete. 
2026-03-09T19:22:13.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.995+0000 7f5b3d2e3640 1 Processor -- start 2026-03-09T19:22:13.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.995+0000 7f5b3d2e3640 1 -- start start 2026-03-09T19:22:13.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.995+0000 7f5b3d2e3640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5b380759c0 0x7f5b38075de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:13.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.995+0000 7f5b3d2e3640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5b24002e30 con 0x7f5b380759c0 2026-03-09T19:22:13.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.996+0000 7f5b37fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5b380759c0 0x7f5b38075de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:13.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.996+0000 7f5b37fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5b380759c0 0x7f5b38075de0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:51086/0 (socket says 192.168.123.107:51086) 2026-03-09T19:22:13.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.996+0000 7f5b37fff640 1 -- 192.168.123.107:0/315035815 learned_addr learned my addr 192.168.123.107:0/315035815 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:13.997 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:13.997+0000 7f5b37fff640 1 -- 192.168.123.107:0/315035815 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5b24009660 con 0x7f5b380759c0 2026-03-09T19:22:13.999 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.000+0000 7f5b37fff640 1 --2- 192.168.123.107:0/315035815 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5b380759c0 0x7f5b38075de0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f5b2402f840 tx=0x7f5b240043c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:13.999 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.001+0000 7f5b357fa640 1 -- 192.168.123.107:0/315035815 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5b24038690 con 0x7f5b380759c0 2026-03-09T19:22:13.999 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.001+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/315035815 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5b38076320 con 0x7f5b380759c0 2026-03-09T19:22:13.999 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.001+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/315035815 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5b38078ec0 con 0x7f5b380759c0 2026-03-09T19:22:14.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.001+0000 7f5b357fa640 1 -- 192.168.123.107:0/315035815 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5b24038cb0 con 0x7f5b380759c0 2026-03-09T19:22:14.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.002+0000 7f5b357fa640 1 -- 192.168.123.107:0/315035815 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5b24041ac0 con 0x7f5b380759c0 2026-03-09T19:22:14.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.003+0000 7f5b357fa640 1 -- 192.168.123.107:0/315035815 <== mon.0 v2:192.168.123.107:3300/0 4 
==== mgrmap(e 12) v1 ==== 49383+0+0 (secure 0 0 0) 0x7f5b24041ce0 con 0x7f5b380759c0 2026-03-09T19:22:14.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.003+0000 7f5b357fa640 1 --2- 192.168.123.107:0/315035815 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5b0803d2d0 0x7f5b0803f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:14.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.003+0000 7f5b357fa640 1 -- 192.168.123.107:0/315035815 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f5b24077020 con 0x7f5b380759c0 2026-03-09T19:22:14.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.003+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/315035815 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5afc005350 con 0x7f5b380759c0 2026-03-09T19:22:14.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.003+0000 7f5b377fe640 1 --2- 192.168.123.107:0/315035815 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5b0803d2d0 0x7f5b0803f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:14.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.004+0000 7f5b377fe640 1 --2- 192.168.123.107:0/315035815 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5b0803d2d0 0x7f5b0803f790 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f5b28009a10 tx=0x7f5b28006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:14.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.007+0000 7f5b357fa640 1 -- 192.168.123.107:0/315035815 <== mon.0 v2:192.168.123.107:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f5b24036b50 con 0x7f5b380759c0 2026-03-09T19:22:14.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:14.099+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/315035815 --> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm08", "target": ["mon-mgr", ""]}) v1 -- 0x7f5afc002bf0 con 0x7f5b0803d2d0 2026-03-09T19:22:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:14 vm07 ceph-mon[48545]: from='client.14186 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:14 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:14 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:14 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:22:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:14 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:22:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:14 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:22:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:14 vm07 ceph-mon[48545]: Updating 
vm07:/etc/ceph/ceph.conf 2026-03-09T19:22:15.420 INFO:teuthology.orchestra.run.vm07.stdout:Added host 'vm08' with addr '192.168.123.108' 2026-03-09T19:22:15.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:15.421+0000 7f5b357fa640 1 -- 192.168.123.107:0/315035815 <== mgr.14162 v2:192.168.123.107:6800/1021580706 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f5afc002bf0 con 0x7f5b0803d2d0 2026-03-09T19:22:15.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:15.424+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/315035815 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5b0803d2d0 msgr2=0x7f5b0803f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:15.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:15.424+0000 7f5b3d2e3640 1 --2- 192.168.123.107:0/315035815 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5b0803d2d0 0x7f5b0803f790 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f5b28009a10 tx=0x7f5b28006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:15.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:15.425+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/315035815 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5b380759c0 msgr2=0x7f5b38075de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:15.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:15.425+0000 7f5b3d2e3640 1 --2- 192.168.123.107:0/315035815 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5b380759c0 0x7f5b38075de0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f5b2402f840 tx=0x7f5b240043c0 comp rx=0 tx=0).stop 2026-03-09T19:22:15.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:15.425+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/315035815 shutdown_connections 2026-03-09T19:22:15.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:15.425+0000 
7f5b3d2e3640 1 --2- 192.168.123.107:0/315035815 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5b0803d2d0 0x7f5b0803f790 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:15.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:15.425+0000 7f5b3d2e3640 1 --2- 192.168.123.107:0/315035815 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5b380759c0 0x7f5b38075de0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:15.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:15.425+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/315035815 >> 192.168.123.107:0/315035815 conn(0x7f5b380fdca0 msgr2=0x7f5b380fe8a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:15.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:15.425+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/315035815 shutdown_connections 2026-03-09T19:22:15.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:15.425+0000 7f5b3d2e3640 1 -- 192.168.123.107:0/315035815 wait complete. 
2026-03-09T19:22:15.522 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph orch host ls --format=json 2026-03-09T19:22:15.839 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='client.14188 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm08", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: Deploying cephadm binary to vm08 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 
192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: Deploying daemon ceph-exporter.vm07 on vm07 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-09T19:22:16.034 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:15 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:22:16.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.141+0000 7fa819cdc640 1 -- 192.168.123.107:0/1952386200 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa814071990 msgr2=0x7fa814071d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:16.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.141+0000 7fa819cdc640 1 --2- 192.168.123.107:0/1952386200 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa814071990 0x7fa814071d70 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fa8080099b0 tx=0x7fa80802f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:16.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.143+0000 7fa819cdc640 1 -- 192.168.123.107:0/1952386200 shutdown_connections 2026-03-09T19:22:16.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.143+0000 7fa819cdc640 1 --2- 192.168.123.107:0/1952386200 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa814071990 0x7fa814071d70 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:16.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.143+0000 7fa819cdc640 1 -- 192.168.123.107:0/1952386200 >> 
192.168.123.107:0/1952386200 conn(0x7fa81406b190 msgr2=0x7fa81406b5a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:16.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.144+0000 7fa819cdc640 1 -- 192.168.123.107:0/1952386200 shutdown_connections 2026-03-09T19:22:16.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.144+0000 7fa819cdc640 1 -- 192.168.123.107:0/1952386200 wait complete. 2026-03-09T19:22:16.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.144+0000 7fa819cdc640 1 Processor -- start 2026-03-09T19:22:16.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.144+0000 7fa819cdc640 1 -- start start 2026-03-09T19:22:16.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.144+0000 7fa819cdc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa814071990 0x7fa814115d10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:16.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.144+0000 7fa819cdc640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa814116250 con 0x7fa814071990 2026-03-09T19:22:16.143 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.144+0000 7fa818cda640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa814071990 0x7fa814115d10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:16.143 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.144+0000 7fa818cda640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa814071990 0x7fa814115d10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:51114/0 (socket says 192.168.123.107:51114) 2026-03-09T19:22:16.143 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.144+0000 7fa818cda640 1 -- 192.168.123.107:0/261267312 learned_addr learned my addr 192.168.123.107:0/261267312 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:16.144 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.146+0000 7fa818cda640 1 -- 192.168.123.107:0/261267312 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa808009660 con 0x7fa814071990 2026-03-09T19:22:16.145 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.146+0000 7fa818cda640 1 --2- 192.168.123.107:0/261267312 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa814071990 0x7fa814115d10 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fa80802f860 tx=0x7fa808004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:16.145 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.147+0000 7fa811ffb640 1 -- 192.168.123.107:0/261267312 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa808002a50 con 0x7fa814071990 2026-03-09T19:22:16.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.147+0000 7fa819cdc640 1 -- 192.168.123.107:0/261267312 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa814116450 con 0x7fa814071990 2026-03-09T19:22:16.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.147+0000 7fa819cdc640 1 -- 192.168.123.107:0/261267312 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa814116870 con 0x7fa814071990 2026-03-09T19:22:16.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.147+0000 7fa811ffb640 1 -- 192.168.123.107:0/261267312 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa808038930 con 0x7fa814071990 2026-03-09T19:22:16.146 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.147+0000 7fa811ffb640 1 -- 192.168.123.107:0/261267312 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa808041810 con 0x7fa814071990 2026-03-09T19:22:16.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.147+0000 7fa811ffb640 1 -- 192.168.123.107:0/261267312 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fa808041970 con 0x7fa814071990 2026-03-09T19:22:16.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.148+0000 7fa811ffb640 1 --2- 192.168.123.107:0/261267312 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa7e403d320 0x7fa7e403f7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:16.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.148+0000 7fa811ffb640 1 -- 192.168.123.107:0/261267312 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fa808076f90 con 0x7fa814071990 2026-03-09T19:22:16.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.148+0000 7fa813fff640 1 --2- 192.168.123.107:0/261267312 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa7e403d320 0x7fa7e403f7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:16.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.148+0000 7fa813fff640 1 --2- 192.168.123.107:0/261267312 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa7e403d320 0x7fa7e403f7e0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fa7fc0099c0 tx=0x7fa7fc006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:16.149 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.150+0000 7fa819cdc640 1 -- 192.168.123.107:0/261267312 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa814072e20 con 0x7fa814071990 2026-03-09T19:22:16.154 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.156+0000 7fa811ffb640 1 -- 192.168.123.107:0/261267312 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa808041ca0 con 0x7fa814071990 2026-03-09T19:22:16.267 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.269+0000 7fa819cdc640 1 -- 192.168.123.107:0/261267312 --> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7fa814117210 con 0x7fa7e403d320 2026-03-09T19:22:16.270 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:22:16.270 INFO:teuthology.orchestra.run.vm07.stdout:[{"addr": "192.168.123.107", "hostname": "vm07", "labels": [], "status": ""}, {"addr": "192.168.123.108", "hostname": "vm08", "labels": [], "status": ""}] 2026-03-09T19:22:16.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.272+0000 7fa811ffb640 1 -- 192.168.123.107:0/261267312 <== mgr.14162 v2:192.168.123.107:6800/1021580706 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+155 (secure 0 0 0) 0x7fa814117210 con 0x7fa7e403d320 2026-03-09T19:22:16.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.281+0000 7fa819cdc640 1 -- 192.168.123.107:0/261267312 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa7e403d320 msgr2=0x7fa7e403f7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:16.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.281+0000 7fa819cdc640 1 --2- 192.168.123.107:0/261267312 >> 
[v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa7e403d320 0x7fa7e403f7e0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fa7fc0099c0 tx=0x7fa7fc006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:16.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.281+0000 7fa819cdc640 1 -- 192.168.123.107:0/261267312 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa814071990 msgr2=0x7fa814115d10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:16.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.281+0000 7fa819cdc640 1 --2- 192.168.123.107:0/261267312 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa814071990 0x7fa814115d10 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fa80802f860 tx=0x7fa808004290 comp rx=0 tx=0).stop 2026-03-09T19:22:16.280 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.282+0000 7fa819cdc640 1 -- 192.168.123.107:0/261267312 shutdown_connections 2026-03-09T19:22:16.280 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.282+0000 7fa819cdc640 1 --2- 192.168.123.107:0/261267312 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa7e403d320 0x7fa7e403f7e0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:16.280 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.282+0000 7fa819cdc640 1 --2- 192.168.123.107:0/261267312 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa814071990 0x7fa814115d10 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:16.280 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.282+0000 7fa819cdc640 1 -- 192.168.123.107:0/261267312 >> 192.168.123.107:0/261267312 conn(0x7fa81406b190 msgr2=0x7fa81406ec50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:16.281 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.283+0000 7fa819cdc640 1 -- 192.168.123.107:0/261267312 shutdown_connections 2026-03-09T19:22:16.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:16.283+0000 7fa819cdc640 1 -- 192.168.123.107:0/261267312 wait complete. 2026-03-09T19:22:16.458 INFO:tasks.cephadm:Setting crush tunables to default 2026-03-09T19:22:16.458 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd crush tunables default 2026-03-09T19:22:16.772 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:22:17.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:16 vm07 ceph-mon[48545]: Added host vm08 2026-03-09T19:22:17.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:16 vm07 ceph-mon[48545]: Deploying daemon crash.vm07 on vm07 2026-03-09T19:22:17.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:16 vm07 ceph-mon[48545]: mgrmap e13: vm07.xacuym(active, since 6s) 2026-03-09T19:22:17.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:16 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:17.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:16 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:17.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:16 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:17.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:16 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:17.132 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.125+0000 7efd74884640 1 -- 192.168.123.107:0/3768037586 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd680a4650 msgr2=0x7efd680a4a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:17.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.125+0000 7efd74884640 1 --2- 192.168.123.107:0/3768037586 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd680a4650 0x7efd680a4a30 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7efd64009a00 tx=0x7efd6402f3a0 comp rx=0 tx=0).stop 2026-03-09T19:22:17.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.129+0000 7efd74884640 1 -- 192.168.123.107:0/3768037586 shutdown_connections 2026-03-09T19:22:17.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.129+0000 7efd74884640 1 --2- 192.168.123.107:0/3768037586 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd680a4650 0x7efd680a4a30 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:17.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.129+0000 7efd74884640 1 -- 192.168.123.107:0/3768037586 >> 192.168.123.107:0/3768037586 conn(0x7efd6801a150 msgr2=0x7efd6801a560 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:17.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.129+0000 7efd74884640 1 -- 192.168.123.107:0/3768037586 shutdown_connections 2026-03-09T19:22:17.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.129+0000 7efd74884640 1 -- 192.168.123.107:0/3768037586 wait complete. 
2026-03-09T19:22:17.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.129+0000 7efd74884640 1 Processor -- start 2026-03-09T19:22:17.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.129+0000 7efd74884640 1 -- start start 2026-03-09T19:22:17.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.129+0000 7efd74884640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd680a4650 0x7efd680b4c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:17.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.129+0000 7efd74884640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd680b1c80 con 0x7efd680a4650 2026-03-09T19:22:17.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.130+0000 7efd6ed76640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd680a4650 0x7efd680b4c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:17.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.130+0000 7efd6ed76640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd680a4650 0x7efd680b4c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:51140/0 (socket says 192.168.123.107:51140) 2026-03-09T19:22:17.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.130+0000 7efd6ed76640 1 -- 192.168.123.107:0/2411280689 learned_addr learned my addr 192.168.123.107:0/2411280689 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:22:17.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.130+0000 7efd6ed76640 1 -- 192.168.123.107:0/2411280689 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efd64009660 con 0x7efd680a4650 2026-03-09T19:22:17.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.130+0000 7efd6ed76640 1 --2- 192.168.123.107:0/2411280689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd680a4650 0x7efd680b4c10 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7efd64009b30 tx=0x7efd64002c80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:17.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.131+0000 7efd4ffff640 1 -- 192.168.123.107:0/2411280689 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efd64005260 con 0x7efd680a4650 2026-03-09T19:22:17.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.131+0000 7efd4ffff640 1 -- 192.168.123.107:0/2411280689 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7efd640047c0 con 0x7efd680a4650 2026-03-09T19:22:17.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.131+0000 7efd4ffff640 1 -- 192.168.123.107:0/2411280689 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efd64038820 con 0x7efd680a4650 2026-03-09T19:22:17.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.137+0000 7efd74884640 1 -- 192.168.123.107:0/2411280689 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efd680b1e80 con 0x7efd680a4650 2026-03-09T19:22:17.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.137+0000 7efd74884640 1 -- 192.168.123.107:0/2411280689 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efd680b22a0 con 0x7efd680a4650 2026-03-09T19:22:17.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.138+0000 7efd74884640 1 -- 192.168.123.107:0/2411280689 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efd3c005350 con 0x7efd680a4650 2026-03-09T19:22:17.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.142+0000 7efd4ffff640 1 -- 192.168.123.107:0/2411280689 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7efd6403f070 con 0x7efd680a4650 2026-03-09T19:22:17.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.143+0000 7efd4ffff640 1 --2- 192.168.123.107:0/2411280689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7efd5003d3b0 0x7efd5003f870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:17.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.143+0000 7efd4ffff640 1 -- 192.168.123.107:0/2411280689 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1155+0+0 (secure 0 0 0) 0x7efd64050080 con 0x7efd680a4650 2026-03-09T19:22:17.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.143+0000 7efd6e575640 1 --2- 192.168.123.107:0/2411280689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7efd5003d3b0 0x7efd5003f870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:17.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.144+0000 7efd6e575640 1 --2- 192.168.123.107:0/2411280689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7efd5003d3b0 0x7efd5003f870 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7efd580099c0 tx=0x7efd58006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:17.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.144+0000 7efd4ffff640 1 -- 192.168.123.107:0/2411280689 <== 
mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7efd640a38c0 con 0x7efd680a4650 2026-03-09T19:22:17.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.264+0000 7efd74884640 1 -- 192.168.123.107:0/2411280689 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd crush tunables", "profile": "default"} v 0) v1 -- 0x7efd3c0051c0 con 0x7efd680a4650 2026-03-09T19:22:17.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.949+0000 7efd4ffff640 1 -- 192.168.123.107:0/2411280689 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd crush tunables", "profile": "default"}]=0 adjusted tunables profile to default v4) v1 ==== 124+0+0 (secure 0 0 0) 0x7efd6403d090 con 0x7efd680a4650 2026-03-09T19:22:17.949 INFO:teuthology.orchestra.run.vm07.stderr:adjusted tunables profile to default 2026-03-09T19:22:17.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.951+0000 7efd74884640 1 -- 192.168.123.107:0/2411280689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7efd5003d3b0 msgr2=0x7efd5003f870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:17.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.951+0000 7efd74884640 1 --2- 192.168.123.107:0/2411280689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7efd5003d3b0 0x7efd5003f870 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7efd580099c0 tx=0x7efd58006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:17.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.952+0000 7efd74884640 1 -- 192.168.123.107:0/2411280689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd680a4650 msgr2=0x7efd680b4c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:17.950 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.952+0000 7efd74884640 1 --2- 192.168.123.107:0/2411280689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd680a4650 0x7efd680b4c10 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7efd64009b30 tx=0x7efd64002c80 comp rx=0 tx=0).stop 2026-03-09T19:22:17.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.952+0000 7efd74884640 1 -- 192.168.123.107:0/2411280689 shutdown_connections 2026-03-09T19:22:17.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.952+0000 7efd74884640 1 --2- 192.168.123.107:0/2411280689 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7efd5003d3b0 0x7efd5003f870 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:17.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.952+0000 7efd74884640 1 --2- 192.168.123.107:0/2411280689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd680a4650 0x7efd680b4c10 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:17.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.952+0000 7efd74884640 1 -- 192.168.123.107:0/2411280689 >> 192.168.123.107:0/2411280689 conn(0x7efd6801a150 msgr2=0x7efd680a1910 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:17.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.952+0000 7efd74884640 1 -- 192.168.123.107:0/2411280689 shutdown_connections 2026-03-09T19:22:17.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:22:17.952+0000 7efd74884640 1 -- 192.168.123.107:0/2411280689 wait complete. 
2026-03-09T19:22:17.996 INFO:tasks.cephadm:Adding mon.vm07 on vm07 2026-03-09T19:22:17.996 INFO:tasks.cephadm:Adding mon.vm08 on vm08 2026-03-09T19:22:17.996 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph orch apply mon '2;vm07:192.168.123.107=vm07;vm08:192.168.123.108=vm08' 2026-03-09T19:22:18.145 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:18.184 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:18.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:17 vm07 ceph-mon[48545]: from='client.14191 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T19:22:18.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:17 vm07 ceph-mon[48545]: Deploying daemon node-exporter.vm07 on vm07 2026-03-09T19:22:18.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:17 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/2411280689' entity='client.admin' cmd=[{"prefix": "osd crush tunables", "profile": "default"}]: dispatch 2026-03-09T19:22:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:18 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/2411280689' entity='client.admin' cmd='[{"prefix": "osd crush tunables", "profile": "default"}]': finished 2026-03-09T19:22:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:18 vm07 ceph-mon[48545]: osdmap e4: 0 total, 0 up, 0 in 2026-03-09T19:22:19.284 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.284+0000 7fe881608640 1 -- 192.168.123.108:0/2103608956 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe87c0ff860 msgr2=0x7fe87c101c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:19.284 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.284+0000 7fe881608640 1 --2- 192.168.123.108:0/2103608956 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe87c0ff860 0x7fe87c101c50 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fe86c0099b0 tx=0x7fe86c02f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:19.284 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.285+0000 7fe881608640 1 -- 192.168.123.108:0/2103608956 shutdown_connections 2026-03-09T19:22:19.284 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.285+0000 7fe881608640 1 --2- 192.168.123.108:0/2103608956 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe87c0ff860 0x7fe87c101c50 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:19.284 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.285+0000 7fe881608640 1 -- 192.168.123.108:0/2103608956 >> 192.168.123.108:0/2103608956 conn(0x7fe87c0f9b60 msgr2=0x7fe87c0fbf80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:19.284 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.285+0000 7fe881608640 1 -- 192.168.123.108:0/2103608956 shutdown_connections 2026-03-09T19:22:19.284 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.286+0000 7fe881608640 1 -- 192.168.123.108:0/2103608956 wait complete. 
2026-03-09T19:22:19.284 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.286+0000 7fe881608640 1 Processor -- start 2026-03-09T19:22:19.284 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.286+0000 7fe881608640 1 -- start start 2026-03-09T19:22:19.284 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.286+0000 7fe881608640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe87c0ff860 0x7fe87c199890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:19.285 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.286+0000 7fe881608640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe87c199dd0 con 0x7fe87c0ff860 2026-03-09T19:22:19.285 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.286+0000 7fe87bfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe87c0ff860 0x7fe87c199890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:19.285 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.286+0000 7fe87bfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe87c0ff860 0x7fe87c199890 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:59048/0 (socket says 192.168.123.108:59048) 2026-03-09T19:22:19.285 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.286+0000 7fe87bfff640 1 -- 192.168.123.108:0/890527666 learned_addr learned my addr 192.168.123.108:0/890527666 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:19.285 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.287+0000 7fe87bfff640 1 -- 192.168.123.108:0/890527666 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe86c009660 con 0x7fe87c0ff860 2026-03-09T19:22:19.285 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.287+0000 7fe87bfff640 1 --2- 192.168.123.108:0/890527666 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe87c0ff860 0x7fe87c199890 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fe86c005ec0 tx=0x7fe86c004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:19.285 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.287+0000 7fe8797fa640 1 -- 192.168.123.108:0/890527666 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe86c004450 con 0x7fe87c0ff860 2026-03-09T19:22:19.285 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.287+0000 7fe881608640 1 -- 192.168.123.108:0/890527666 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe87c199fd0 con 0x7fe87c0ff860 2026-03-09T19:22:19.285 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.287+0000 7fe8797fa640 1 -- 192.168.123.108:0/890527666 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe86c038b40 con 0x7fe87c0ff860 2026-03-09T19:22:19.286 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.287+0000 7fe8797fa640 1 -- 192.168.123.108:0/890527666 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe86c041a00 con 0x7fe87c0ff860 2026-03-09T19:22:19.286 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.287+0000 7fe881608640 1 -- 192.168.123.108:0/890527666 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe87c19a470 con 0x7fe87c0ff860 2026-03-09T19:22:19.287 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.288+0000 7fe8797fa640 1 -- 192.168.123.108:0/890527666 <== mon.0 v2:192.168.123.107:3300/0 4 
==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fe86c038cb0 con 0x7fe87c0ff860 2026-03-09T19:22:19.287 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.288+0000 7fe881608640 1 -- 192.168.123.108:0/890527666 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe87c0ffcc0 con 0x7fe87c0ff860 2026-03-09T19:22:19.287 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.288+0000 7fe8797fa640 1 --2- 192.168.123.108:0/890527666 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fe85403cf10 0x7fe85403f3d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:19.287 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.289+0000 7fe8797fa640 1 -- 192.168.123.108:0/890527666 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fe86c076500 con 0x7fe87c0ff860 2026-03-09T19:22:19.287 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.289+0000 7fe87b7fe640 1 --2- 192.168.123.108:0/890527666 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fe85403cf10 0x7fe85403f3d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:19.288 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.290+0000 7fe87b7fe640 1 --2- 192.168.123.108:0/890527666 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fe85403cf10 0x7fe85403f3d0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fe8680099c0 tx=0x7fe868006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:19.290 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.291+0000 7fe8797fa640 1 -- 192.168.123.108:0/890527666 <== mon.0 v2:192.168.123.107:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fe86c035d20 con 0x7fe87c0ff860 2026-03-09T19:22:19.386 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.387+0000 7fe881608640 1 -- 192.168.123.108:0/890527666 --> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "placement": "2;vm07:192.168.123.107=vm07;vm08:192.168.123.108=vm08", "target": ["mon-mgr", ""]}) v1 -- 0x7fe87c102b10 con 0x7fe85403cf10 2026-03-09T19:22:19.391 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.392+0000 7fe8797fa640 1 -- 192.168.123.108:0/890527666 <== mgr.14162 v2:192.168.123.107:6800/1021580706 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7fe87c102b10 con 0x7fe85403cf10 2026-03-09T19:22:19.391 INFO:teuthology.orchestra.run.vm08.stdout:Scheduled mon update... 2026-03-09T19:22:19.393 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.395+0000 7fe881608640 1 -- 192.168.123.108:0/890527666 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fe85403cf10 msgr2=0x7fe85403f3d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:19.393 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.395+0000 7fe881608640 1 --2- 192.168.123.108:0/890527666 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fe85403cf10 0x7fe85403f3d0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fe8680099c0 tx=0x7fe868006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:19.393 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.395+0000 7fe881608640 1 -- 192.168.123.108:0/890527666 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe87c0ff860 msgr2=0x7fe87c199890 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:19.393 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.395+0000 7fe881608640 1 --2- 192.168.123.108:0/890527666 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe87c0ff860 0x7fe87c199890 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fe86c005ec0 tx=0x7fe86c004290 comp rx=0 tx=0).stop 2026-03-09T19:22:19.393 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.395+0000 7fe881608640 1 -- 192.168.123.108:0/890527666 shutdown_connections 2026-03-09T19:22:19.393 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.395+0000 7fe881608640 1 --2- 192.168.123.108:0/890527666 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fe85403cf10 0x7fe85403f3d0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:19.394 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.395+0000 7fe881608640 1 --2- 192.168.123.108:0/890527666 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe87c0ff860 0x7fe87c199890 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:19.394 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.395+0000 7fe881608640 1 -- 192.168.123.108:0/890527666 >> 192.168.123.108:0/890527666 conn(0x7fe87c0f9b60 msgr2=0x7fe87c0fbf50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:19.394 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.395+0000 7fe881608640 1 -- 192.168.123.108:0/890527666 shutdown_connections 2026-03-09T19:22:19.394 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.395+0000 7fe881608640 1 -- 192.168.123.108:0/890527666 wait complete. 2026-03-09T19:22:19.457 DEBUG:teuthology.orchestra.run.vm08:mon.vm08> sudo journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm08.service 2026-03-09T19:22:19.458 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:19.458 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:19.639 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:19.678 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:19.928 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.929+0000 7ff9f3710640 1 -- 192.168.123.108:0/65585417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9ec102640 msgr2=0x7ff9ec102a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:19.928 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.929+0000 7ff9f3710640 1 --2- 192.168.123.108:0/65585417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9ec102640 0x7ff9ec102a40 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7ff9dc0099b0 tx=0x7ff9dc02f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:19.928 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.930+0000 7ff9f3710640 1 -- 192.168.123.108:0/65585417 shutdown_connections 2026-03-09T19:22:19.928 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.930+0000 7ff9f3710640 1 --2- 192.168.123.108:0/65585417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9ec102640 0x7ff9ec102a40 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:19.928 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.930+0000 7ff9f3710640 1 -- 192.168.123.108:0/65585417 >> 192.168.123.108:0/65585417 conn(0x7ff9ec0fde70 msgr2=0x7ff9ec100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:19.928 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.930+0000 7ff9f3710640 1 -- 192.168.123.108:0/65585417 
shutdown_connections 2026-03-09T19:22:19.928 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.930+0000 7ff9f3710640 1 -- 192.168.123.108:0/65585417 wait complete. 2026-03-09T19:22:19.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.930+0000 7ff9f3710640 1 Processor -- start 2026-03-09T19:22:19.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.930+0000 7ff9f3710640 1 -- start start 2026-03-09T19:22:19.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.931+0000 7ff9f3710640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9ec102640 0x7ff9ec199950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:19.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.931+0000 7ff9f3710640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9ec199e90 con 0x7ff9ec102640 2026-03-09T19:22:19.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.931+0000 7ff9f1485640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9ec102640 0x7ff9ec199950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:19.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.931+0000 7ff9f1485640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9ec102640 0x7ff9ec199950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:59062/0 (socket says 192.168.123.108:59062) 2026-03-09T19:22:19.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.931+0000 7ff9f1485640 1 -- 192.168.123.108:0/3505487809 learned_addr learned my addr 192.168.123.108:0/3505487809 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:19.930 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.931+0000 7ff9f1485640 1 -- 192.168.123.108:0/3505487809 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff9dc009660 con 0x7ff9ec102640 2026-03-09T19:22:19.930 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.932+0000 7ff9f1485640 1 --2- 192.168.123.108:0/3505487809 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9ec102640 0x7ff9ec199950 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7ff9dc02f860 tx=0x7ff9dc004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:19.930 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.932+0000 7ff9da7fc640 1 -- 192.168.123.108:0/3505487809 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff9dc0043b0 con 0x7ff9ec102640 2026-03-09T19:22:19.930 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.932+0000 7ff9da7fc640 1 -- 192.168.123.108:0/3505487809 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff9dc038b40 con 0x7ff9ec102640 2026-03-09T19:22:19.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.932+0000 7ff9f3710640 1 -- 192.168.123.108:0/3505487809 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff9ec19a090 con 0x7ff9ec102640 2026-03-09T19:22:19.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.932+0000 7ff9da7fc640 1 -- 192.168.123.108:0/3505487809 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff9dc0418f0 con 0x7ff9ec102640 2026-03-09T19:22:19.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.932+0000 7ff9f3710640 1 -- 192.168.123.108:0/3505487809 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff9ec19a530 con 
0x7ff9ec102640 2026-03-09T19:22:19.932 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.933+0000 7ff9da7fc640 1 -- 192.168.123.108:0/3505487809 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7ff9dc041a50 con 0x7ff9ec102640 2026-03-09T19:22:19.932 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.933+0000 7ff9f3710640 1 -- 192.168.123.108:0/3505487809 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff9b8005350 con 0x7ff9ec102640 2026-03-09T19:22:19.932 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.933+0000 7ff9da7fc640 1 --2- 192.168.123.108:0/3505487809 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff9bc03d2d0 0x7ff9bc03f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:19.932 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.933+0000 7ff9da7fc640 1 -- 192.168.123.108:0/3505487809 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7ff9dc0771e0 con 0x7ff9ec102640 2026-03-09T19:22:19.932 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.934+0000 7ff9f0c84640 1 --2- 192.168.123.108:0/3505487809 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff9bc03d2d0 0x7ff9bc03f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:19.932 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.934+0000 7ff9f0c84640 1 --2- 192.168.123.108:0/3505487809 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff9bc03d2d0 0x7ff9bc03f790 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7ff9e00099c0 tx=0x7ff9e0006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-09T19:22:19.935 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:19.936+0000 7ff9da7fc640 1 -- 192.168.123.108:0/3505487809 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff9dc04a3c0 con 0x7ff9ec102640 2026-03-09T19:22:20.061 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.063+0000 7ff9f3710640 1 -- 192.168.123.108:0/3505487809 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7ff9b80051c0 con 0x7ff9ec102640 2026-03-09T19:22:20.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.065+0000 7ff9da7fc640 1 -- 192.168.123.108:0/3505487809 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7ff9dc0373d0 con 0x7ff9ec102640 2026-03-09T19:22:20.063 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:20.063 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:20.064 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:20.066 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.067+0000 7ff9f3710640 1 -- 192.168.123.108:0/3505487809 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff9bc03d2d0 msgr2=0x7ff9bc03f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:20.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.067+0000 7ff9f3710640 1 --2- 192.168.123.108:0/3505487809 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff9bc03d2d0 0x7ff9bc03f790 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7ff9e00099c0 tx=0x7ff9e0006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:20.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.067+0000 7ff9f3710640 1 -- 192.168.123.108:0/3505487809 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9ec102640 msgr2=0x7ff9ec199950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:20.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.067+0000 7ff9f3710640 1 --2- 192.168.123.108:0/3505487809 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9ec102640 0x7ff9ec199950 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7ff9dc02f860 tx=0x7ff9dc004270 comp rx=0 tx=0).stop 2026-03-09T19:22:20.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.068+0000 7ff9f3710640 1 -- 192.168.123.108:0/3505487809 shutdown_connections 2026-03-09T19:22:20.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.068+0000 7ff9f3710640 1 --2- 192.168.123.108:0/3505487809 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff9bc03d2d0 0x7ff9bc03f790 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:20.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.068+0000 7ff9f3710640 1 --2- 192.168.123.108:0/3505487809 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7ff9ec102640 0x7ff9ec199950 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:20.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.068+0000 7ff9f3710640 1 -- 192.168.123.108:0/3505487809 >> 192.168.123.108:0/3505487809 conn(0x7ff9ec0fde70 msgr2=0x7ff9ec0fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:20.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.068+0000 7ff9f3710640 1 -- 192.168.123.108:0/3505487809 shutdown_connections 2026-03-09T19:22:20.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:20.068+0000 7ff9f3710640 1 -- 192.168.123.108:0/3505487809 wait complete. 2026-03-09T19:22:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:20 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:20 vm07 ceph-mon[48545]: from='client.14195 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "placement": "2;vm07:192.168.123.107=vm07;vm08:192.168.123.108=vm08", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:22:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:20 vm07 ceph-mon[48545]: Saving service mon spec with placement vm07:192.168.123.107=vm07;vm08:192.168.123.108=vm08;count:2 2026-03-09T19:22:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:20 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:20 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:20 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:20 vm07 
ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:20 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:20 vm07 ceph-mon[48545]: Deploying daemon alertmanager.vm07 on vm07 2026-03-09T19:22:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:20 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/3505487809' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:21.134 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T19:22:21.134 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:21.278 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:21.318 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:21.661 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.662+0000 7f40e3787640 1 -- 192.168.123.108:0/1953493561 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40dc102620 msgr2=0x7f40dc102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:21.661 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.662+0000 7f40e3787640 1 --2- 192.168.123.108:0/1953493561 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40dc102620 0x7f40dc102a20 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f40c40099b0 tx=0x7f40c402f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:21.661 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.663+0000 7f40e3787640 1 -- 192.168.123.108:0/1953493561 shutdown_connections 
2026-03-09T19:22:21.661 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.663+0000 7f40e3787640 1 --2- 192.168.123.108:0/1953493561 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40dc102620 0x7f40dc102a20 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:21.661 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.663+0000 7f40e3787640 1 -- 192.168.123.108:0/1953493561 >> 192.168.123.108:0/1953493561 conn(0x7f40dc0fde70 msgr2=0x7f40dc100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:21.661 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.663+0000 7f40e3787640 1 -- 192.168.123.108:0/1953493561 shutdown_connections 2026-03-09T19:22:21.661 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.663+0000 7f40e3787640 1 -- 192.168.123.108:0/1953493561 wait complete. 2026-03-09T19:22:21.662 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.664+0000 7f40e3787640 1 Processor -- start 2026-03-09T19:22:21.662 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.664+0000 7f40e3787640 1 -- start start 2026-03-09T19:22:21.662 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.664+0000 7f40e3787640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40dc102620 0x7f40dc199990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:21.662 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.664+0000 7f40e3787640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40dc199ed0 con 0x7f40dc102620 2026-03-09T19:22:21.662 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.664+0000 7f40e14fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40dc102620 0x7f40dc199990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-09T19:22:21.663 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.664+0000 7f40e14fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40dc102620 0x7f40dc199990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:59076/0 (socket says 192.168.123.108:59076) 2026-03-09T19:22:21.663 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.664+0000 7f40e14fc640 1 -- 192.168.123.108:0/641049277 learned_addr learned my addr 192.168.123.108:0/641049277 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:21.663 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.664+0000 7f40e14fc640 1 -- 192.168.123.108:0/641049277 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f40c4009660 con 0x7f40dc102620 2026-03-09T19:22:21.663 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.664+0000 7f40e14fc640 1 --2- 192.168.123.108:0/641049277 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40dc102620 0x7f40dc199990 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f40c40042c0 tx=0x7f40c40042f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:21.663 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.665+0000 7f40d27fc640 1 -- 192.168.123.108:0/641049277 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f40c4038680 con 0x7f40dc102620 2026-03-09T19:22:21.663 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.665+0000 7f40d27fc640 1 -- 192.168.123.108:0/641049277 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f40c4038ca0 con 0x7f40dc102620 2026-03-09T19:22:21.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.665+0000 7f40e3787640 
1 -- 192.168.123.108:0/641049277 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f40dc19a0d0 con 0x7f40dc102620 2026-03-09T19:22:21.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.665+0000 7f40d27fc640 1 -- 192.168.123.108:0/641049277 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f40c40419c0 con 0x7f40dc102620 2026-03-09T19:22:21.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.665+0000 7f40e3787640 1 -- 192.168.123.108:0/641049277 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f40dc19a570 con 0x7f40dc102620 2026-03-09T19:22:21.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.666+0000 7f40e3787640 1 -- 192.168.123.108:0/641049277 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f40dc102aa0 con 0x7f40dc102620 2026-03-09T19:22:21.665 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.666+0000 7f40d27fc640 1 -- 192.168.123.108:0/641049277 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f40c40387e0 con 0x7f40dc102620 2026-03-09T19:22:21.665 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.666+0000 7f40d27fc640 1 --2- 192.168.123.108:0/641049277 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f40b803cfb0 0x7f40b803f470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:21.665 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.667+0000 7f40d27fc640 1 -- 192.168.123.108:0/641049277 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f40c4075d10 con 0x7f40dc102620 2026-03-09T19:22:21.665 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.667+0000 7f40e0cfb640 1 --2- 
192.168.123.108:0/641049277 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f40b803cfb0 0x7f40b803f470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:21.666 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.667+0000 7f40e0cfb640 1 --2- 192.168.123.108:0/641049277 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f40b803cfb0 0x7f40b803f470 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f40cc0099c0 tx=0x7f40cc006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:21.667 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.669+0000 7f40d27fc640 1 -- 192.168.123.108:0/641049277 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f40c404acb0 con 0x7f40dc102620 2026-03-09T19:22:21.793 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.794+0000 7f40e3787640 1 -- 192.168.123.108:0/641049277 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f40dc108530 con 0x7f40dc102620 2026-03-09T19:22:21.795 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.796+0000 7f40d27fc640 1 -- 192.168.123.108:0/641049277 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f40c4035320 con 0x7f40dc102620 2026-03-09T19:22:21.795 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:21.795 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:21.795 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:21.798 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.799+0000 7f40e3787640 1 -- 192.168.123.108:0/641049277 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f40b803cfb0 msgr2=0x7f40b803f470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:21.798 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.799+0000 7f40e3787640 1 --2- 192.168.123.108:0/641049277 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f40b803cfb0 0x7f40b803f470 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f40cc0099c0 tx=0x7f40cc006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:21.798 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.799+0000 7f40e3787640 1 -- 192.168.123.108:0/641049277 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40dc102620 msgr2=0x7f40dc199990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:21.798 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.799+0000 7f40e3787640 1 --2- 192.168.123.108:0/641049277 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40dc102620 0x7f40dc199990 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f40c40042c0 tx=0x7f40c40042f0 comp rx=0 tx=0).stop 2026-03-09T19:22:21.798 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.799+0000 7f40e3787640 1 -- 192.168.123.108:0/641049277 shutdown_connections 2026-03-09T19:22:21.798 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.800+0000 7f40e3787640 1 --2- 192.168.123.108:0/641049277 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f40b803cfb0 0x7f40b803f470 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:21.798 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.800+0000 7f40e3787640 1 --2- 192.168.123.108:0/641049277 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40dc102620 0x7f40dc199990 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:21.798 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.800+0000 7f40e3787640 1 -- 192.168.123.108:0/641049277 >> 192.168.123.108:0/641049277 conn(0x7f40dc0fde70 msgr2=0x7f40dc0fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:21.798 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.800+0000 7f40e3787640 1 -- 192.168.123.108:0/641049277 shutdown_connections 2026-03-09T19:22:21.798 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:21.800+0000 7f40e3787640 1 -- 192.168.123.108:0/641049277 wait complete. 2026-03-09T19:22:22.230 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:21 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/641049277' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:22.864 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:22.864 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:23.010 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:23.048 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:23.350 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.351+0000 7fc45c457640 1 -- 192.168.123.108:0/1195334484 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4540ff260 msgr2=0x7fc4540ff660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:23.350 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.351+0000 7fc45c457640 1 --2- 192.168.123.108:0/1195334484 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4540ff260 0x7fc4540ff660 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7fc4440099b0 tx=0x7fc44402f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:23.350 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.352+0000 7fc45c457640 1 -- 192.168.123.108:0/1195334484 shutdown_connections 2026-03-09T19:22:23.350 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.352+0000 7fc45c457640 1 --2- 192.168.123.108:0/1195334484 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4540ff260 0x7fc4540ff660 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:23.350 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.352+0000 7fc45c457640 1 -- 192.168.123.108:0/1195334484 >> 192.168.123.108:0/1195334484 conn(0x7fc4540faa10 msgr2=0x7fc4540fce30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:23.351 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.352+0000 7fc45c457640 1 -- 192.168.123.108:0/1195334484 
shutdown_connections 2026-03-09T19:22:23.351 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.352+0000 7fc45c457640 1 -- 192.168.123.108:0/1195334484 wait complete. 2026-03-09T19:22:23.351 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.353+0000 7fc45c457640 1 Processor -- start 2026-03-09T19:22:23.351 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.353+0000 7fc45c457640 1 -- start start 2026-03-09T19:22:23.352 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.353+0000 7fc45c457640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4540ff260 0x7fc454195430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:23.352 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.353+0000 7fc45c457640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc454195970 con 0x7fc4540ff260 2026-03-09T19:22:23.352 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.354+0000 7fc45a1cc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4540ff260 0x7fc454195430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:23.352 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.354+0000 7fc45a1cc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4540ff260 0x7fc454195430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:59100/0 (socket says 192.168.123.108:59100) 2026-03-09T19:22:23.352 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.354+0000 7fc45a1cc640 1 -- 192.168.123.108:0/2915425023 learned_addr learned my addr 192.168.123.108:0/2915425023 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:23.352 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.354+0000 7fc45a1cc640 1 -- 192.168.123.108:0/2915425023 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc444009660 con 0x7fc4540ff260 2026-03-09T19:22:23.353 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.354+0000 7fc45a1cc640 1 --2- 192.168.123.108:0/2915425023 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4540ff260 0x7fc454195430 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fc44402f860 tx=0x7fc444004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:23.353 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.354+0000 7fc4437fe640 1 -- 192.168.123.108:0/2915425023 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc4440043b0 con 0x7fc4540ff260 2026-03-09T19:22:23.354 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.354+0000 7fc4437fe640 1 -- 192.168.123.108:0/2915425023 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc444038b40 con 0x7fc4540ff260 2026-03-09T19:22:23.354 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.354+0000 7fc4437fe640 1 -- 192.168.123.108:0/2915425023 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc4440418f0 con 0x7fc4540ff260 2026-03-09T19:22:23.354 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.354+0000 7fc45c457640 1 -- 192.168.123.108:0/2915425023 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc454195b70 con 0x7fc4540ff260 2026-03-09T19:22:23.354 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.355+0000 7fc45c457640 1 -- 192.168.123.108:0/2915425023 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc454196010 con 
0x7fc4540ff260 2026-03-09T19:22:23.354 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.356+0000 7fc45c457640 1 -- 192.168.123.108:0/2915425023 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc424005350 con 0x7fc4540ff260 2026-03-09T19:22:23.355 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.357+0000 7fc4437fe640 1 -- 192.168.123.108:0/2915425023 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fc444038cb0 con 0x7fc4540ff260 2026-03-09T19:22:23.355 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.357+0000 7fc4437fe640 1 --2- 192.168.123.108:0/2915425023 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fc43403d2d0 0x7fc43403f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:23.355 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.357+0000 7fc4437fe640 1 -- 192.168.123.108:0/2915425023 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fc444076470 con 0x7fc4540ff260 2026-03-09T19:22:23.356 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.357+0000 7fc4599cb640 1 --2- 192.168.123.108:0/2915425023 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fc43403d2d0 0x7fc43403f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:23.356 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.358+0000 7fc4599cb640 1 --2- 192.168.123.108:0/2915425023 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fc43403d2d0 0x7fc43403f790 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fc4480099c0 tx=0x7fc448006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-09T19:22:23.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.360+0000 7fc4437fe640 1 -- 192.168.123.108:0/2915425023 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fc444048310 con 0x7fc4540ff260 2026-03-09T19:22:23.479 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.481+0000 7fc45c457640 1 -- 192.168.123.108:0/2915425023 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fc4240051c0 con 0x7fc4540ff260 2026-03-09T19:22:23.482 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.483+0000 7fc4437fe640 1 -- 192.168.123.108:0/2915425023 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fc444051310 con 0x7fc4540ff260 2026-03-09T19:22:23.482 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:23.482 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:23.482 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:23.484 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.486+0000 7fc45c457640 1 -- 192.168.123.108:0/2915425023 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fc43403d2d0 msgr2=0x7fc43403f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:23.484 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.486+0000 7fc45c457640 1 --2- 192.168.123.108:0/2915425023 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fc43403d2d0 0x7fc43403f790 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fc4480099c0 tx=0x7fc448006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:23.484 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.486+0000 7fc45c457640 1 -- 192.168.123.108:0/2915425023 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4540ff260 msgr2=0x7fc454195430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:23.484 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.486+0000 7fc45c457640 1 --2- 192.168.123.108:0/2915425023 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4540ff260 0x7fc454195430 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fc44402f860 tx=0x7fc444004270 comp rx=0 tx=0).stop 2026-03-09T19:22:23.484 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.486+0000 7fc45c457640 1 -- 192.168.123.108:0/2915425023 shutdown_connections 2026-03-09T19:22:23.485 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.486+0000 7fc45c457640 1 --2- 192.168.123.108:0/2915425023 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fc43403d2d0 0x7fc43403f790 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:23.485 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.486+0000 7fc45c457640 1 --2- 192.168.123.108:0/2915425023 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fc4540ff260 0x7fc454195430 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:23.485 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.486+0000 7fc45c457640 1 -- 192.168.123.108:0/2915425023 >> 192.168.123.108:0/2915425023 conn(0x7fc4540faa10 msgr2=0x7fc4540fb610 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:23.485 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.486+0000 7fc45c457640 1 -- 192.168.123.108:0/2915425023 shutdown_connections 2026-03-09T19:22:23.485 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:23.486+0000 7fc45c457640 1 -- 192.168.123.108:0/2915425023 wait complete. 2026-03-09T19:22:24.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:24.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:24.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:24.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:24.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:24.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:24.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: Regenerating cephadm self-signed grafana TLS certificates 2026-03-09T19:22:24.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 
ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:24.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:24.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-09T19:22:24.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-09T19:22:24.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:24.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: Deploying daemon grafana.vm07 on vm07 2026-03-09T19:22:24.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/2915425023' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:24.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:24 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:24.545 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:24.545 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:24.687 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:24.724 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:24.954 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.955+0000 7f3b2c40f640 1 -- 192.168.123.108:0/4109965418 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b24100390 msgr2=0x7f3b24100790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:24.954 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.955+0000 7f3b2c40f640 1 --2- 192.168.123.108:0/4109965418 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b24100390 0x7f3b24100790 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f3b140099b0 tx=0x7f3b1402f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:24.954 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.955+0000 7f3b2c40f640 1 -- 192.168.123.108:0/4109965418 shutdown_connections 2026-03-09T19:22:24.954 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.955+0000 7f3b2c40f640 1 --2- 192.168.123.108:0/4109965418 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b24100390 0x7f3b24100790 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:24.954 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.955+0000 7f3b2c40f640 1 -- 192.168.123.108:0/4109965418 >> 192.168.123.108:0/4109965418 conn(0x7f3b240fbb60 msgr2=0x7f3b240fdf80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:24.954 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.956+0000 7f3b2c40f640 1 -- 192.168.123.108:0/4109965418 
shutdown_connections 2026-03-09T19:22:24.954 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.956+0000 7f3b2c40f640 1 -- 192.168.123.108:0/4109965418 wait complete. 2026-03-09T19:22:24.955 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.956+0000 7f3b2c40f640 1 Processor -- start 2026-03-09T19:22:24.955 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.956+0000 7f3b2c40f640 1 -- start start 2026-03-09T19:22:24.955 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.957+0000 7f3b2c40f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b24100390 0x7f3b241976c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:24.955 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.957+0000 7f3b2c40f640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b240753b0 con 0x7f3b24100390 2026-03-09T19:22:24.955 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.957+0000 7f3b2a184640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b24100390 0x7f3b241976c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:24.955 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.957+0000 7f3b2a184640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b24100390 0x7f3b241976c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:59122/0 (socket says 192.168.123.108:59122) 2026-03-09T19:22:24.955 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.957+0000 7f3b2a184640 1 -- 192.168.123.108:0/3316261478 learned_addr learned my addr 192.168.123.108:0/3316261478 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:24.955 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.957+0000 7f3b2a184640 1 -- 192.168.123.108:0/3316261478 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3b14009660 con 0x7f3b24100390 2026-03-09T19:22:24.956 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.957+0000 7f3b2a184640 1 --2- 192.168.123.108:0/3316261478 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b24100390 0x7f3b241976c0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f3b1402f860 tx=0x7f3b14004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:24.957 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.957+0000 7f3b137fe640 1 -- 192.168.123.108:0/3316261478 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3b140043b0 con 0x7f3b24100390 2026-03-09T19:22:24.957 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.957+0000 7f3b137fe640 1 -- 192.168.123.108:0/3316261478 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3b14038b40 con 0x7f3b24100390 2026-03-09T19:22:24.957 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.958+0000 7f3b137fe640 1 -- 192.168.123.108:0/3316261478 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3b140418f0 con 0x7f3b24100390 2026-03-09T19:22:24.957 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.958+0000 7f3b2c40f640 1 -- 192.168.123.108:0/3316261478 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3b24197c00 con 0x7f3b24100390 2026-03-09T19:22:24.957 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.958+0000 7f3b2c40f640 1 -- 192.168.123.108:0/3316261478 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3b24197fe0 con 
0x7f3b24100390 2026-03-09T19:22:24.957 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.959+0000 7f3b137fe640 1 -- 192.168.123.108:0/3316261478 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f3b14038cb0 con 0x7f3b24100390 2026-03-09T19:22:24.958 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.959+0000 7f3b2c40f640 1 -- 192.168.123.108:0/3316261478 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3b24100810 con 0x7f3b24100390 2026-03-09T19:22:24.958 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.959+0000 7f3b137fe640 1 --2- 192.168.123.108:0/3316261478 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f3af803d2d0 0x7f3af803f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:24.958 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.959+0000 7f3b137fe640 1 -- 192.168.123.108:0/3316261478 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f3b14076250 con 0x7f3b24100390 2026-03-09T19:22:24.958 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.959+0000 7f3b29983640 1 --2- 192.168.123.108:0/3316261478 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f3af803d2d0 0x7f3af803f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:24.958 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.960+0000 7f3b29983640 1 --2- 192.168.123.108:0/3316261478 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f3af803d2d0 0x7f3af803f790 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f3b180099c0 tx=0x7f3b18006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-09T19:22:24.963 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:24.964+0000 7f3b137fe640 1 -- 192.168.123.108:0/3316261478 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f3b14035320 con 0x7f3b24100390 2026-03-09T19:22:25.084 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.085+0000 7f3b2c40f640 1 -- 192.168.123.108:0/3316261478 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f3b241983c0 con 0x7f3b24100390 2026-03-09T19:22:25.084 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.086+0000 7f3b137fe640 1 -- 192.168.123.108:0/3316261478 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f3b140373d0 con 0x7f3b24100390 2026-03-09T19:22:25.084 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:25.084 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:25.084 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:25.087 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.088+0000 7f3b2c40f640 1 -- 192.168.123.108:0/3316261478 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f3af803d2d0 msgr2=0x7f3af803f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:25.087 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.088+0000 7f3b2c40f640 1 --2- 192.168.123.108:0/3316261478 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f3af803d2d0 0x7f3af803f790 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f3b180099c0 tx=0x7f3b18006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:25.087 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.088+0000 7f3b2c40f640 1 -- 192.168.123.108:0/3316261478 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b24100390 msgr2=0x7f3b241976c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:25.087 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.088+0000 7f3b2c40f640 1 --2- 192.168.123.108:0/3316261478 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b24100390 0x7f3b241976c0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f3b1402f860 tx=0x7f3b14004270 comp rx=0 tx=0).stop 2026-03-09T19:22:25.087 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.088+0000 7f3b2c40f640 1 -- 192.168.123.108:0/3316261478 shutdown_connections 2026-03-09T19:22:25.087 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.088+0000 7f3b2c40f640 1 --2- 192.168.123.108:0/3316261478 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f3af803d2d0 0x7f3af803f790 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:25.087 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.088+0000 7f3b2c40f640 1 --2- 192.168.123.108:0/3316261478 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f3b24100390 0x7f3b241976c0 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:25.087 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.088+0000 7f3b2c40f640 1 -- 192.168.123.108:0/3316261478 >> 192.168.123.108:0/3316261478 conn(0x7f3b240fbb60 msgr2=0x7f3b240fc760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:25.087 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.089+0000 7f3b2c40f640 1 -- 192.168.123.108:0/3316261478 shutdown_connections 2026-03-09T19:22:25.087 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:25.089+0000 7f3b2c40f640 1 -- 192.168.123.108:0/3316261478 wait complete. 2026-03-09T19:22:25.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:25 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/3316261478' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:26.222 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T19:22:26.222 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:26.381 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:26.422 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:26.773 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.773+0000 7f4dd0997640 1 -- 192.168.123.108:0/2964537216 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4dc8102620 msgr2=0x7f4dc8102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:26.773 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.773+0000 7f4dd0997640 1 --2- 192.168.123.108:0/2964537216 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4dc8102620 
0x7f4dc8102a20 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f4dc00099b0 tx=0x7f4dc002f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:26.773 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.775+0000 7f4dd0997640 1 -- 192.168.123.108:0/2964537216 shutdown_connections 2026-03-09T19:22:26.774 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.775+0000 7f4dd0997640 1 --2- 192.168.123.108:0/2964537216 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4dc8102620 0x7f4dc8102a20 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:26.774 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.775+0000 7f4dd0997640 1 -- 192.168.123.108:0/2964537216 >> 192.168.123.108:0/2964537216 conn(0x7f4dc80fde70 msgr2=0x7f4dc8100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:26.774 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.775+0000 7f4dd0997640 1 -- 192.168.123.108:0/2964537216 shutdown_connections 2026-03-09T19:22:26.774 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.776+0000 7f4dd0997640 1 -- 192.168.123.108:0/2964537216 wait complete. 
2026-03-09T19:22:26.775 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.776+0000 7f4dd0997640 1 Processor -- start 2026-03-09T19:22:26.775 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.777+0000 7f4dd0997640 1 -- start start 2026-03-09T19:22:26.775 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.777+0000 7f4dd0997640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4dc8102620 0x7f4dc8078ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:26.775 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.777+0000 7f4dd0997640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4dc8079400 con 0x7f4dc8102620 2026-03-09T19:22:26.776 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.777+0000 7f4dce70c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4dc8102620 0x7f4dc8078ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:26.776 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.778+0000 7f4dce70c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4dc8102620 0x7f4dc8078ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:59134/0 (socket says 192.168.123.108:59134) 2026-03-09T19:22:26.776 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.778+0000 7f4dce70c640 1 -- 192.168.123.108:0/450980674 learned_addr learned my addr 192.168.123.108:0/450980674 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:26.777 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.778+0000 7f4dce70c640 1 -- 192.168.123.108:0/450980674 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4dc0009660 con 0x7f4dc8102620 2026-03-09T19:22:26.777 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.779+0000 7f4dce70c640 1 --2- 192.168.123.108:0/450980674 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4dc8102620 0x7f4dc8078ec0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f4dc00042c0 tx=0x7f4dc00042f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:26.777 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.779+0000 7f4db37fe640 1 -- 192.168.123.108:0/450980674 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4dc0038680 con 0x7f4dc8102620 2026-03-09T19:22:26.777 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.779+0000 7f4dd0997640 1 -- 192.168.123.108:0/450980674 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4dc8079600 con 0x7f4dc8102620 2026-03-09T19:22:26.778 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.779+0000 7f4dd0997640 1 -- 192.168.123.108:0/450980674 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4dc8075a00 con 0x7f4dc8102620 2026-03-09T19:22:26.778 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.780+0000 7f4db37fe640 1 -- 192.168.123.108:0/450980674 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4dc0038ca0 con 0x7f4dc8102620 2026-03-09T19:22:26.778 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.780+0000 7f4db37fe640 1 -- 192.168.123.108:0/450980674 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4dc00418f0 con 0x7f4dc8102620 2026-03-09T19:22:26.779 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.780+0000 7f4db37fe640 1 -- 192.168.123.108:0/450980674 <== mon.0 v2:192.168.123.107:3300/0 4 
==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f4dc0041b80 con 0x7f4dc8102620 2026-03-09T19:22:26.779 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.780+0000 7f4db37fe640 1 --2- 192.168.123.108:0/450980674 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f4d9c03d280 0x7f4d9c03f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:26.779 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.781+0000 7f4db37fe640 1 -- 192.168.123.108:0/450980674 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f4dc0076340 con 0x7f4dc8102620 2026-03-09T19:22:26.779 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.781+0000 7f4dd0997640 1 -- 192.168.123.108:0/450980674 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4dc8102aa0 con 0x7f4dc8102620 2026-03-09T19:22:26.779 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.781+0000 7f4dcdf0b640 1 --2- 192.168.123.108:0/450980674 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f4d9c03d280 0x7f4d9c03f740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:26.780 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.782+0000 7f4dcdf0b640 1 --2- 192.168.123.108:0/450980674 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f4d9c03d280 0x7f4d9c03f740 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f4db40099c0 tx=0x7f4db4006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:26.783 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.784+0000 7f4db37fe640 1 -- 192.168.123.108:0/450980674 <== mon.0 v2:192.168.123.107:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f4dc00375b0 con 0x7f4dc8102620 2026-03-09T19:22:26.925 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.926+0000 7f4dd0997640 1 -- 192.168.123.108:0/450980674 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f4dc81086d0 con 0x7f4dc8102620 2026-03-09T19:22:26.926 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.927+0000 7f4db37fe640 1 -- 192.168.123.108:0/450980674 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f4dc005a090 con 0x7f4dc8102620 2026-03-09T19:22:26.926 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:26.926 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:26.926 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:26.928 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.930+0000 7f4dd0997640 1 -- 192.168.123.108:0/450980674 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f4d9c03d280 msgr2=0x7f4d9c03f740 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:26.928 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.930+0000 7f4dd0997640 1 --2- 192.168.123.108:0/450980674 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f4d9c03d280 0x7f4d9c03f740 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f4db40099c0 tx=0x7f4db4006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:26.928 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.930+0000 7f4dd0997640 1 -- 192.168.123.108:0/450980674 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4dc8102620 msgr2=0x7f4dc8078ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:26.928 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.930+0000 7f4dd0997640 1 --2- 192.168.123.108:0/450980674 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4dc8102620 0x7f4dc8078ec0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f4dc00042c0 tx=0x7f4dc00042f0 comp rx=0 tx=0).stop 2026-03-09T19:22:26.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.930+0000 7f4dd0997640 1 -- 192.168.123.108:0/450980674 shutdown_connections 2026-03-09T19:22:26.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.930+0000 7f4dd0997640 1 --2- 192.168.123.108:0/450980674 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f4d9c03d280 0x7f4d9c03f740 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:26.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.931+0000 7f4dd0997640 1 --2- 192.168.123.108:0/450980674 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4dc8102620 0x7f4dc8078ec0 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:26.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.931+0000 7f4dd0997640 1 -- 
192.168.123.108:0/450980674 >> 192.168.123.108:0/450980674 conn(0x7f4dc80fde70 msgr2=0x7f4dc80fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:26.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.931+0000 7f4dd0997640 1 -- 192.168.123.108:0/450980674 shutdown_connections 2026-03-09T19:22:26.929 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:26.931+0000 7f4dd0997640 1 -- 192.168.123.108:0/450980674 wait complete. 2026-03-09T19:22:27.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:27 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/450980674' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:28.137 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T19:22:28.137 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:28.284 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:28.324 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:28.595 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.595+0000 7f216eeb9640 1 -- 192.168.123.108:0/473703533 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2168102620 msgr2=0x7f2168102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:28.595 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.595+0000 7f216eeb9640 1 --2- 192.168.123.108:0/473703533 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2168102620 0x7f2168102a20 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f21500099b0 tx=0x7f215002f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:28.595 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.596+0000 7f216eeb9640 1 -- 
192.168.123.108:0/473703533 shutdown_connections 2026-03-09T19:22:28.595 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.596+0000 7f216eeb9640 1 --2- 192.168.123.108:0/473703533 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2168102620 0x7f2168102a20 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:28.595 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.596+0000 7f216eeb9640 1 -- 192.168.123.108:0/473703533 >> 192.168.123.108:0/473703533 conn(0x7f21680fde70 msgr2=0x7f2168100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:28.595 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.597+0000 7f216eeb9640 1 -- 192.168.123.108:0/473703533 shutdown_connections 2026-03-09T19:22:28.595 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.597+0000 7f216eeb9640 1 -- 192.168.123.108:0/473703533 wait complete. 2026-03-09T19:22:28.596 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.597+0000 7f216eeb9640 1 Processor -- start 2026-03-09T19:22:28.596 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.597+0000 7f216eeb9640 1 -- start start 2026-03-09T19:22:28.596 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.598+0000 7f216eeb9640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2168102620 0x7f2168199990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:28.596 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.598+0000 7f216eeb9640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2168199ed0 con 0x7f2168102620 2026-03-09T19:22:28.596 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.598+0000 7f216cc2e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2168102620 0x7f2168199990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:28.596 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.598+0000 7f216cc2e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2168102620 0x7f2168199990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:49982/0 (socket says 192.168.123.108:49982) 2026-03-09T19:22:28.596 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.598+0000 7f216cc2e640 1 -- 192.168.123.108:0/3494306942 learned_addr learned my addr 192.168.123.108:0/3494306942 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:28.597 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.598+0000 7f216cc2e640 1 -- 192.168.123.108:0/3494306942 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2150009660 con 0x7f2168102620 2026-03-09T19:22:28.597 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.599+0000 7f216cc2e640 1 --2- 192.168.123.108:0/3494306942 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2168102620 0x7f2168199990 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f21500042c0 tx=0x7f21500042f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:28.597 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.599+0000 7f215dffb640 1 -- 192.168.123.108:0/3494306942 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2150038680 con 0x7f2168102620 2026-03-09T19:22:28.597 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.599+0000 7f216eeb9640 1 -- 192.168.123.108:0/3494306942 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f216819a0d0 con 0x7f2168102620 2026-03-09T19:22:28.597 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.599+0000 7f216eeb9640 1 -- 192.168.123.108:0/3494306942 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f216819a570 con 0x7f2168102620 2026-03-09T19:22:28.598 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.599+0000 7f215dffb640 1 -- 192.168.123.108:0/3494306942 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2150038ca0 con 0x7f2168102620 2026-03-09T19:22:28.598 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.599+0000 7f215dffb640 1 -- 192.168.123.108:0/3494306942 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f21500418f0 con 0x7f2168102620 2026-03-09T19:22:28.599 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.600+0000 7f21577fe640 1 -- 192.168.123.108:0/3494306942 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2168102aa0 con 0x7f2168102620 2026-03-09T19:22:28.599 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.601+0000 7f215dffb640 1 -- 192.168.123.108:0/3494306942 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f2150041b60 con 0x7f2168102620 2026-03-09T19:22:28.600 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.601+0000 7f215dffb640 1 --2- 192.168.123.108:0/3494306942 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f213003d2d0 0x7f213003f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:28.600 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.601+0000 7f215dffb640 1 -- 192.168.123.108:0/3494306942 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f2150076a30 con 0x7f2168102620 2026-03-09T19:22:28.600 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.602+0000 7f215ffff640 1 --2- 192.168.123.108:0/3494306942 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f213003d2d0 0x7f213003f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:28.600 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.602+0000 7f215ffff640 1 --2- 192.168.123.108:0/3494306942 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f213003d2d0 0x7f213003f790 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f21580099c0 tx=0x7f2158006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:28.602 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.604+0000 7f215dffb640 1 -- 192.168.123.108:0/3494306942 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f2150038e10 con 0x7f2168102620 2026-03-09T19:22:28.732 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.733+0000 7f21577fe640 1 -- 192.168.123.108:0/3494306942 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f2168108530 con 0x7f2168102620 2026-03-09T19:22:28.733 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.735+0000 7f215dffb640 1 -- 192.168.123.108:0/3494306942 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f2150038e10 con 0x7f2168102620 2026-03-09T19:22:28.734 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:28.734 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:28.734 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:28.736 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.737+0000 7f21577fe640 1 -- 192.168.123.108:0/3494306942 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f213003d2d0 msgr2=0x7f213003f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:28.736 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.737+0000 7f21577fe640 1 --2- 192.168.123.108:0/3494306942 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f213003d2d0 0x7f213003f790 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f21580099c0 tx=0x7f2158006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:28.736 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.737+0000 7f21577fe640 1 -- 192.168.123.108:0/3494306942 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2168102620 msgr2=0x7f2168199990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:28.736 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.737+0000 7f21577fe640 1 --2- 192.168.123.108:0/3494306942 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2168102620 0x7f2168199990 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f21500042c0 tx=0x7f21500042f0 comp rx=0 tx=0).stop 2026-03-09T19:22:28.736 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.738+0000 7f21577fe640 1 -- 192.168.123.108:0/3494306942 shutdown_connections 2026-03-09T19:22:28.736 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.738+0000 7f21577fe640 1 --2- 192.168.123.108:0/3494306942 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f213003d2d0 0x7f213003f790 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:28.736 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.738+0000 7f21577fe640 1 --2- 192.168.123.108:0/3494306942 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2168102620 0x7f2168199990 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:28.736 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.738+0000 7f21577fe640 1 -- 192.168.123.108:0/3494306942 >> 192.168.123.108:0/3494306942 conn(0x7f21680fde70 msgr2=0x7f21680fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:28.736 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.738+0000 7f21577fe640 1 -- 192.168.123.108:0/3494306942 shutdown_connections 2026-03-09T19:22:28.736 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:28.738+0000 7f21577fe640 1 -- 192.168.123.108:0/3494306942 wait complete. 2026-03-09T19:22:29.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:28 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/3494306942' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:29.783 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:29.783 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:29.944 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:29.984 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:30.631 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.631+0000 7f71dd6a8640 1 -- 192.168.123.108:0/639152725 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f71d8102620 msgr2=0x7f71d8102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:30.631 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.631+0000 7f71dd6a8640 1 --2- 192.168.123.108:0/639152725 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f71d8102620 0x7f71d8102a20 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f71c00099b0 tx=0x7f71c002f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:30.631 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.632+0000 7f71dd6a8640 1 -- 192.168.123.108:0/639152725 shutdown_connections 2026-03-09T19:22:30.631 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.632+0000 7f71dd6a8640 1 --2- 192.168.123.108:0/639152725 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f71d8102620 0x7f71d8102a20 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:30.631 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.632+0000 7f71dd6a8640 1 -- 192.168.123.108:0/639152725 >> 192.168.123.108:0/639152725 conn(0x7f71d80fde70 msgr2=0x7f71d8100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:30.631 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.632+0000 7f71dd6a8640 1 -- 192.168.123.108:0/639152725 
shutdown_connections 2026-03-09T19:22:30.631 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.632+0000 7f71dd6a8640 1 -- 192.168.123.108:0/639152725 wait complete. 2026-03-09T19:22:30.631 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.633+0000 7f71dd6a8640 1 Processor -- start 2026-03-09T19:22:30.631 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.633+0000 7f71dd6a8640 1 -- start start 2026-03-09T19:22:30.632 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.633+0000 7f71dd6a8640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f71d8199bb0 0x7f71d8199fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:30.632 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.633+0000 7f71dd6a8640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f71d819a510 con 0x7f71d8199bb0 2026-03-09T19:22:30.632 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.634+0000 7f71d6ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f71d8199bb0 0x7f71d8199fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:30.632 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.634+0000 7f71d6ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f71d8199bb0 0x7f71d8199fd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:50004/0 (socket says 192.168.123.108:50004) 2026-03-09T19:22:30.632 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.634+0000 7f71d6ffd640 1 -- 192.168.123.108:0/2420179627 learned_addr learned my addr 192.168.123.108:0/2420179627 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:30.635 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.636+0000 7f71d6ffd640 1 -- 192.168.123.108:0/2420179627 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f71c0009660 con 0x7f71d8199bb0 2026-03-09T19:22:30.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.637+0000 7f71d6ffd640 1 --2- 192.168.123.108:0/2420179627 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f71d8199bb0 0x7f71d8199fd0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f71d8103680 tx=0x7f71c0004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:30.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.637+0000 7f71b7fff640 1 -- 192.168.123.108:0/2420179627 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f71c0038470 con 0x7f71d8199bb0 2026-03-09T19:22:30.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.637+0000 7f71dd6a8640 1 -- 192.168.123.108:0/2420179627 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f71d819a710 con 0x7f71d8199bb0 2026-03-09T19:22:30.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.638+0000 7f71dd6a8640 1 -- 192.168.123.108:0/2420179627 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f71d819d280 con 0x7f71d8199bb0 2026-03-09T19:22:30.638 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.639+0000 7f71b7fff640 1 -- 192.168.123.108:0/2420179627 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f71c0038a90 con 0x7f71d8199bb0 2026-03-09T19:22:30.638 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.639+0000 7f71b7fff640 1 -- 192.168.123.108:0/2420179627 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f71c0041840 con 
0x7f71d8199bb0 2026-03-09T19:22:30.638 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.639+0000 7f71b7fff640 1 -- 192.168.123.108:0/2420179627 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f71c0041a60 con 0x7f71d8199bb0 2026-03-09T19:22:30.638 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.639+0000 7f71b7fff640 1 --2- 192.168.123.108:0/2420179627 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f71ac03cfb0 0x7f71ac03f470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:30.638 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.639+0000 7f71dd6a8640 1 -- 192.168.123.108:0/2420179627 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f719c005350 con 0x7f71d8199bb0 2026-03-09T19:22:30.638 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.640+0000 7f71b7fff640 1 -- 192.168.123.108:0/2420179627 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f71c0077170 con 0x7f71d8199bb0 2026-03-09T19:22:30.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.641+0000 7f71d67fc640 1 --2- 192.168.123.108:0/2420179627 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f71ac03cfb0 0x7f71ac03f470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:30.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.642+0000 7f71d67fc640 1 --2- 192.168.123.108:0/2420179627 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f71ac03cfb0 0x7f71ac03f470 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f71cc0099c0 tx=0x7f71cc006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-09T19:22:30.645 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.646+0000 7f71b7fff640 1 -- 192.168.123.108:0/2420179627 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f71c0037bb0 con 0x7f71d8199bb0 2026-03-09T19:22:30.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:30 vm07 ceph-mon[48545]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:22:30.780 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.781+0000 7f71dd6a8640 1 -- 192.168.123.108:0/2420179627 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f719c005600 con 0x7f71d8199bb0 2026-03-09T19:22:30.781 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.782+0000 7f71b7fff640 1 -- 192.168.123.108:0/2420179627 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f71c00373d0 con 0x7f71d8199bb0 2026-03-09T19:22:30.782 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:30.782 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:30.782 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:30.784 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.785+0000 7f71dd6a8640 1 -- 192.168.123.108:0/2420179627 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f71ac03cfb0 msgr2=0x7f71ac03f470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:30.784 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.786+0000 7f71dd6a8640 1 --2- 192.168.123.108:0/2420179627 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f71ac03cfb0 0x7f71ac03f470 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f71cc0099c0 tx=0x7f71cc006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:30.784 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.786+0000 7f71dd6a8640 1 -- 192.168.123.108:0/2420179627 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f71d8199bb0 msgr2=0x7f71d8199fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:30.784 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.786+0000 7f71dd6a8640 1 --2- 192.168.123.108:0/2420179627 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f71d8199bb0 0x7f71d8199fd0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f71d8103680 tx=0x7f71c0004290 comp rx=0 tx=0).stop 2026-03-09T19:22:30.785 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.786+0000 7f71dd6a8640 1 -- 192.168.123.108:0/2420179627 shutdown_connections 2026-03-09T19:22:30.785 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.786+0000 7f71dd6a8640 1 --2- 192.168.123.108:0/2420179627 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f71ac03cfb0 0x7f71ac03f470 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:30.785 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.786+0000 7f71dd6a8640 1 --2- 192.168.123.108:0/2420179627 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f71d8199bb0 0x7f71d8199fd0 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:30.785 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.786+0000 7f71dd6a8640 1 -- 192.168.123.108:0/2420179627 >> 192.168.123.108:0/2420179627 conn(0x7f71d80fde70 msgr2=0x7f71d80fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:30.785 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.787+0000 7f71dd6a8640 1 -- 192.168.123.108:0/2420179627 shutdown_connections 2026-03-09T19:22:30.785 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:30.787+0000 7f71dd6a8640 1 -- 192.168.123.108:0/2420179627 wait complete. 2026-03-09T19:22:31.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:31 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/2420179627' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:31.846 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:31.846 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:31.999 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:32.037 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:32.680 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.681+0000 7f6c05db5640 1 -- 192.168.123.108:0/1896932341 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6c00102620 msgr2=0x7f6c00102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:32.680 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.681+0000 7f6c05db5640 1 --2- 192.168.123.108:0/1896932341 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6c00102620 0x7f6c00102a20 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f6bf00099b0 tx=0x7f6bf002f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:32.680 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.681+0000 7f6c05db5640 1 -- 192.168.123.108:0/1896932341 shutdown_connections 2026-03-09T19:22:32.680 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.681+0000 7f6c05db5640 1 --2- 192.168.123.108:0/1896932341 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6c00102620 0x7f6c00102a20 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:32.680 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.682+0000 7f6c05db5640 1 -- 192.168.123.108:0/1896932341 >> 192.168.123.108:0/1896932341 conn(0x7f6c000fde70 msgr2=0x7f6c00100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:32.681 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.682+0000 7f6c05db5640 1 -- 192.168.123.108:0/1896932341 
shutdown_connections 2026-03-09T19:22:32.681 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.682+0000 7f6c05db5640 1 -- 192.168.123.108:0/1896932341 wait complete. 2026-03-09T19:22:32.681 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.682+0000 7f6c05db5640 1 Processor -- start 2026-03-09T19:22:32.681 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.682+0000 7f6c05db5640 1 -- start start 2026-03-09T19:22:32.681 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.683+0000 7f6c05db5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6c00102620 0x7f6c001999a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:32.681 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.683+0000 7f6c05db5640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c00199ee0 con 0x7f6c00102620 2026-03-09T19:22:32.681 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.683+0000 7f6bff7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6c00102620 0x7f6c001999a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:32.682 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.683+0000 7f6bff7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6c00102620 0x7f6c001999a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:50032/0 (socket says 192.168.123.108:50032) 2026-03-09T19:22:32.682 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.683+0000 7f6bff7fe640 1 -- 192.168.123.108:0/2560386764 learned_addr learned my addr 192.168.123.108:0/2560386764 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:32.682 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.684+0000 7f6bff7fe640 1 -- 192.168.123.108:0/2560386764 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6bf0009660 con 0x7f6c00102620 2026-03-09T19:22:32.682 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.684+0000 7f6bff7fe640 1 --2- 192.168.123.108:0/2560386764 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6c00102620 0x7f6c001999a0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f6bf00042c0 tx=0x7f6bf00042f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:32.683 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.684+0000 7f6bfcff9640 1 -- 192.168.123.108:0/2560386764 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6bf003d070 con 0x7f6c00102620 2026-03-09T19:22:32.683 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.684+0000 7f6bfcff9640 1 -- 192.168.123.108:0/2560386764 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6bf0038b40 con 0x7f6c00102620 2026-03-09T19:22:32.683 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.684+0000 7f6bfcff9640 1 -- 192.168.123.108:0/2560386764 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6bf0041a40 con 0x7f6c00102620 2026-03-09T19:22:32.683 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.684+0000 7f6c05db5640 1 -- 192.168.123.108:0/2560386764 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6c0019a0e0 con 0x7f6c00102620 2026-03-09T19:22:32.683 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.684+0000 7f6c05db5640 1 -- 192.168.123.108:0/2560386764 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6c0019a4c0 con 
0x7f6c00102620 2026-03-09T19:22:32.684 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.685+0000 7f6c05db5640 1 -- 192.168.123.108:0/2560386764 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6bc4005350 con 0x7f6c00102620 2026-03-09T19:22:32.684 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.686+0000 7f6bfcff9640 1 -- 192.168.123.108:0/2560386764 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f6bf0038cb0 con 0x7f6c00102620 2026-03-09T19:22:32.684 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.686+0000 7f6bfcff9640 1 --2- 192.168.123.108:0/2560386764 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6bd403d280 0x7f6bd403f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:32.684 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.686+0000 7f6bfcff9640 1 -- 192.168.123.108:0/2560386764 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f6bf0076160 con 0x7f6c00102620 2026-03-09T19:22:32.685 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.686+0000 7f6bfeffd640 1 --2- 192.168.123.108:0/2560386764 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6bd403d280 0x7f6bd403f740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:32.685 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.686+0000 7f6bfeffd640 1 --2- 192.168.123.108:0/2560386764 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6bd403d280 0x7f6bd403f740 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f6bec0099c0 tx=0x7f6bec006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-09T19:22:32.687 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.688+0000 7f6bfcff9640 1 -- 192.168.123.108:0/2560386764 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6bf0049540 con 0x7f6c00102620 2026-03-09T19:22:32.835 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.836+0000 7f6c05db5640 1 -- 192.168.123.108:0/2560386764 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f6bc40051c0 con 0x7f6c00102620 2026-03-09T19:22:32.836 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.837+0000 7f6bfcff9640 1 -- 192.168.123.108:0/2560386764 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f6bf0046030 con 0x7f6c00102620 2026-03-09T19:22:32.837 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:32.837 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:32.837 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:32.839 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.840+0000 7f6c05db5640 1 -- 192.168.123.108:0/2560386764 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6bd403d280 msgr2=0x7f6bd403f740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:32.839 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.841+0000 7f6c05db5640 1 --2- 192.168.123.108:0/2560386764 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6bd403d280 0x7f6bd403f740 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f6bec0099c0 tx=0x7f6bec006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:32.839 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.841+0000 7f6c05db5640 1 -- 192.168.123.108:0/2560386764 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6c00102620 msgr2=0x7f6c001999a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:32.839 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.841+0000 7f6c05db5640 1 --2- 192.168.123.108:0/2560386764 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6c00102620 0x7f6c001999a0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f6bf00042c0 tx=0x7f6bf00042f0 comp rx=0 tx=0).stop 2026-03-09T19:22:32.840 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.841+0000 7f6c05db5640 1 -- 192.168.123.108:0/2560386764 shutdown_connections 2026-03-09T19:22:32.840 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.841+0000 7f6c05db5640 1 --2- 192.168.123.108:0/2560386764 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6bd403d280 0x7f6bd403f740 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:32.840 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.841+0000 7f6c05db5640 1 --2- 192.168.123.108:0/2560386764 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f6c00102620 0x7f6c001999a0 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:32.840 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.842+0000 7f6c05db5640 1 -- 192.168.123.108:0/2560386764 >> 192.168.123.108:0/2560386764 conn(0x7f6c000fde70 msgr2=0x7f6c000fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:32.840 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.842+0000 7f6c05db5640 1 -- 192.168.123.108:0/2560386764 shutdown_connections 2026-03-09T19:22:32.840 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:32.842+0000 7f6c05db5640 1 -- 192.168.123.108:0/2560386764 wait complete. 2026-03-09T19:22:32.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:32 vm07 ceph-mon[48545]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:22:33.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:33 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/2560386764' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:34.010 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:34.011 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:34.151 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:34.193 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:34.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.423+0000 7f43dcb02640 1 -- 192.168.123.108:0/4047150695 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43d8102640 msgr2=0x7f43d8102a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:34.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.423+0000 7f43dcb02640 1 --2- 192.168.123.108:0/4047150695 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43d8102640 0x7f43d8102a40 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f43c40099b0 tx=0x7f43c402f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:34.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.424+0000 7f43dcb02640 1 -- 192.168.123.108:0/4047150695 shutdown_connections 2026-03-09T19:22:34.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.424+0000 7f43dcb02640 1 --2- 192.168.123.108:0/4047150695 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43d8102640 0x7f43d8102a40 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:34.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.424+0000 7f43dcb02640 1 -- 192.168.123.108:0/4047150695 >> 192.168.123.108:0/4047150695 conn(0x7f43d80fde70 msgr2=0x7f43d8100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:34.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.424+0000 7f43dcb02640 1 -- 192.168.123.108:0/4047150695 
shutdown_connections 2026-03-09T19:22:34.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.424+0000 7f43dcb02640 1 -- 192.168.123.108:0/4047150695 wait complete. 2026-03-09T19:22:34.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.425+0000 7f43dcb02640 1 Processor -- start 2026-03-09T19:22:34.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.425+0000 7f43dcb02640 1 -- start start 2026-03-09T19:22:34.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.425+0000 7f43dcb02640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43d8102640 0x7f43d8078ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:34.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.425+0000 7f43dcb02640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f43d8079400 con 0x7f43d8102640 2026-03-09T19:22:34.424 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.425+0000 7f43d6575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43d8102640 0x7f43d8078ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:34.424 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.425+0000 7f43d6575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43d8102640 0x7f43d8078ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:50048/0 (socket says 192.168.123.108:50048) 2026-03-09T19:22:34.424 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.425+0000 7f43d6575640 1 -- 192.168.123.108:0/3340773970 learned_addr learned my addr 192.168.123.108:0/3340773970 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:34.424 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.426+0000 7f43d6575640 1 -- 192.168.123.108:0/3340773970 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f43c4009660 con 0x7f43d8102640 2026-03-09T19:22:34.424 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.426+0000 7f43d6575640 1 --2- 192.168.123.108:0/3340773970 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43d8102640 0x7f43d8078ec0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f43c402f860 tx=0x7f43c4004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:34.424 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.426+0000 7f43c37fe640 1 -- 192.168.123.108:0/3340773970 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f43c40043b0 con 0x7f43d8102640 2026-03-09T19:22:34.424 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.426+0000 7f43dcb02640 1 -- 192.168.123.108:0/3340773970 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f43d8079600 con 0x7f43d8102640 2026-03-09T19:22:34.425 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.426+0000 7f43dcb02640 1 -- 192.168.123.108:0/3340773970 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f43d8075a00 con 0x7f43d8102640 2026-03-09T19:22:34.426 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.426+0000 7f43c37fe640 1 -- 192.168.123.108:0/3340773970 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f43c4038b40 con 0x7f43d8102640 2026-03-09T19:22:34.426 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.426+0000 7f43c37fe640 1 -- 192.168.123.108:0/3340773970 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f43c40418f0 con 
0x7f43d8102640 2026-03-09T19:22:34.426 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.427+0000 7f43c37fe640 1 -- 192.168.123.108:0/3340773970 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f43c4041a50 con 0x7f43d8102640 2026-03-09T19:22:34.426 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.428+0000 7f43c37fe640 1 --2- 192.168.123.108:0/3340773970 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f43a003d2d0 0x7f43a003f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:34.426 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.428+0000 7f43c37fe640 1 -- 192.168.123.108:0/3340773970 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f43c4038680 con 0x7f43d8102640 2026-03-09T19:22:34.426 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.428+0000 7f43d5d74640 1 --2- 192.168.123.108:0/3340773970 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f43a003d2d0 0x7f43a003f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:34.426 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.428+0000 7f43dcb02640 1 -- 192.168.123.108:0/3340773970 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f43a4005350 con 0x7f43d8102640 2026-03-09T19:22:34.427 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.429+0000 7f43d5d74640 1 --2- 192.168.123.108:0/3340773970 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f43a003d2d0 0x7f43a003f790 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f43cc0099c0 tx=0x7f43cc006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-09T19:22:34.429 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.431+0000 7f43c37fe640 1 -- 192.168.123.108:0/3340773970 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f43c4037790 con 0x7f43d8102640 2026-03-09T19:22:34.555 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.556+0000 7f43dcb02640 1 -- 192.168.123.108:0/3340773970 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f43a40051c0 con 0x7f43d8102640 2026-03-09T19:22:34.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.567+0000 7f43c37fe640 1 -- 192.168.123.108:0/3340773970 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f43c40373d0 con 0x7f43d8102640 2026-03-09T19:22:34.567 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:34.567 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:34.567 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:34.569 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.570+0000 7f43dcb02640 1 -- 192.168.123.108:0/3340773970 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f43a003d2d0 msgr2=0x7f43a003f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:34.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.570+0000 7f43dcb02640 1 --2- 192.168.123.108:0/3340773970 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f43a003d2d0 0x7f43a003f790 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f43cc0099c0 tx=0x7f43cc006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:34.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.570+0000 7f43dcb02640 1 -- 192.168.123.108:0/3340773970 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43d8102640 msgr2=0x7f43d8078ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:34.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.570+0000 7f43dcb02640 1 --2- 192.168.123.108:0/3340773970 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43d8102640 0x7f43d8078ec0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f43c402f860 tx=0x7f43c4004270 comp rx=0 tx=0).stop 2026-03-09T19:22:34.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.570+0000 7f43dcb02640 1 -- 192.168.123.108:0/3340773970 shutdown_connections 2026-03-09T19:22:34.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.570+0000 7f43dcb02640 1 --2- 192.168.123.108:0/3340773970 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f43a003d2d0 0x7f43a003f790 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:34.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.570+0000 7f43dcb02640 1 --2- 192.168.123.108:0/3340773970 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f43d8102640 0x7f43d8078ec0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:34.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.571+0000 7f43dcb02640 1 -- 192.168.123.108:0/3340773970 >> 192.168.123.108:0/3340773970 conn(0x7f43d80fde70 msgr2=0x7f43d80fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:34.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.571+0000 7f43dcb02640 1 -- 192.168.123.108:0/3340773970 shutdown_connections 2026-03-09T19:22:34.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:34.571+0000 7f43dcb02640 1 -- 192.168.123.108:0/3340773970 wait complete. 2026-03-09T19:22:34.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:34 vm07 ceph-mon[48545]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:22:34.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:34 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/3340773970' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:35.693 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:35.693 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:35.846 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:35.885 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:36.116 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.117+0000 7ff3da783640 1 -- 192.168.123.108:0/1115381734 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3d4102470 msgr2=0x7ff3d4102870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:36.116 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.117+0000 7ff3da783640 1 --2- 192.168.123.108:0/1115381734 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3d4102470 0x7ff3d4102870 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7ff3bc0099b0 tx=0x7ff3bc02f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:36.116 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.117+0000 7ff3da783640 1 -- 192.168.123.108:0/1115381734 shutdown_connections 2026-03-09T19:22:36.116 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.117+0000 7ff3da783640 1 --2- 192.168.123.108:0/1115381734 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3d4102470 0x7ff3d4102870 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:36.116 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.117+0000 7ff3da783640 1 -- 192.168.123.108:0/1115381734 >> 192.168.123.108:0/1115381734 conn(0x7ff3d40fdca0 msgr2=0x7ff3d4100090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:36.116 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.117+0000 7ff3da783640 1 -- 192.168.123.108:0/1115381734 
shutdown_connections 2026-03-09T19:22:36.116 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.118+0000 7ff3da783640 1 -- 192.168.123.108:0/1115381734 wait complete. 2026-03-09T19:22:36.117 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.118+0000 7ff3da783640 1 Processor -- start 2026-03-09T19:22:36.117 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.118+0000 7ff3da783640 1 -- start start 2026-03-09T19:22:36.117 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.119+0000 7ff3da783640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3d4102470 0x7ff3d41997f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:36.117 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.119+0000 7ff3da783640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3d4199d30 con 0x7ff3d4102470 2026-03-09T19:22:36.118 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.119+0000 7ff3d9781640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3d4102470 0x7ff3d41997f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:36.118 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.119+0000 7ff3d9781640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3d4102470 0x7ff3d41997f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:50062/0 (socket says 192.168.123.108:50062) 2026-03-09T19:22:36.118 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.119+0000 7ff3d9781640 1 -- 192.168.123.108:0/430114337 learned_addr learned my addr 192.168.123.108:0/430114337 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:36.118 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.120+0000 7ff3d9781640 1 -- 192.168.123.108:0/430114337 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff3bc009660 con 0x7ff3d4102470 2026-03-09T19:22:36.118 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.120+0000 7ff3d9781640 1 --2- 192.168.123.108:0/430114337 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3d4102470 0x7ff3d41997f0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7ff3bc0042c0 tx=0x7ff3bc0042f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:36.119 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.120+0000 7ff3ca7fc640 1 -- 192.168.123.108:0/430114337 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff3bc03d070 con 0x7ff3d4102470 2026-03-09T19:22:36.119 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.120+0000 7ff3da783640 1 -- 192.168.123.108:0/430114337 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff3d4199f30 con 0x7ff3d4102470 2026-03-09T19:22:36.119 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.121+0000 7ff3da783640 1 -- 192.168.123.108:0/430114337 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff3d419a3d0 con 0x7ff3d4102470 2026-03-09T19:22:36.121 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.121+0000 7ff3ca7fc640 1 -- 192.168.123.108:0/430114337 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff3bc038b40 con 0x7ff3d4102470 2026-03-09T19:22:36.121 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.121+0000 7ff3ca7fc640 1 -- 192.168.123.108:0/430114337 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff3bc041960 con 0x7ff3d4102470 
2026-03-09T19:22:36.121 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.122+0000 7ff3ca7fc640 1 -- 192.168.123.108:0/430114337 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7ff3bc04b430 con 0x7ff3d4102470 2026-03-09T19:22:36.121 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.122+0000 7ff3da783640 1 -- 192.168.123.108:0/430114337 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff3d41028f0 con 0x7ff3d4102470 2026-03-09T19:22:36.121 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.123+0000 7ff3ca7fc640 1 --2- 192.168.123.108:0/430114337 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff3b003d280 0x7ff3b003f740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:36.121 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.123+0000 7ff3ca7fc640 1 -- 192.168.123.108:0/430114337 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7ff3bc0388c0 con 0x7ff3d4102470 2026-03-09T19:22:36.122 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.123+0000 7ff3d8f80640 1 --2- 192.168.123.108:0/430114337 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff3b003d280 0x7ff3b003f740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:36.123 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.124+0000 7ff3d8f80640 1 --2- 192.168.123.108:0/430114337 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff3b003d280 0x7ff3b003f740 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7ff3c40099c0 tx=0x7ff3c4006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:22:36.125 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.126+0000 7ff3ca7fc640 1 -- 192.168.123.108:0/430114337 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff3bc03faf0 con 0x7ff3d4102470 2026-03-09T19:22:36.251 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.252+0000 7ff3da783640 1 -- 192.168.123.108:0/430114337 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7ff3d4108380 con 0x7ff3d4102470 2026-03-09T19:22:36.252 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.254+0000 7ff3ca7fc640 1 -- 192.168.123.108:0/430114337 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7ff3d4108380 con 0x7ff3d4102470 2026-03-09T19:22:36.253 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:36.253 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:36.253 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:36.255 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.256+0000 7ff3da783640 1 -- 192.168.123.108:0/430114337 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff3b003d280 msgr2=0x7ff3b003f740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:36.255 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.257+0000 7ff3da783640 1 --2- 192.168.123.108:0/430114337 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff3b003d280 0x7ff3b003f740 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7ff3c40099c0 tx=0x7ff3c4006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:36.255 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.257+0000 7ff3da783640 1 -- 192.168.123.108:0/430114337 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3d4102470 msgr2=0x7ff3d41997f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:36.255 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.257+0000 7ff3da783640 1 --2- 192.168.123.108:0/430114337 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3d4102470 0x7ff3d41997f0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7ff3bc0042c0 tx=0x7ff3bc0042f0 comp rx=0 tx=0).stop 2026-03-09T19:22:36.255 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.257+0000 7ff3da783640 1 -- 192.168.123.108:0/430114337 shutdown_connections 2026-03-09T19:22:36.256 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.257+0000 7ff3da783640 1 --2- 192.168.123.108:0/430114337 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7ff3b003d280 0x7ff3b003f740 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:36.256 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.257+0000 7ff3da783640 1 --2- 192.168.123.108:0/430114337 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7ff3d4102470 0x7ff3d41997f0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:36.256 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.257+0000 7ff3da783640 1 -- 192.168.123.108:0/430114337 >> 192.168.123.108:0/430114337 conn(0x7ff3d40fdca0 msgr2=0x7ff3d40fe8a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:36.256 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.258+0000 7ff3da783640 1 -- 192.168.123.108:0/430114337 shutdown_connections 2026-03-09T19:22:36.256 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:36.258+0000 7ff3da783640 1 -- 192.168.123.108:0/430114337 wait complete. 2026-03-09T19:22:37.319 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T19:22:37.320 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:37.472 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:37.513 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:37.555 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:37 vm07 ceph-mon[48545]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:22:37.555 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:37 vm07 ceph-mon[48545]: from='client.? 
192.168.123.108:0/430114337' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:37.758 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.759+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3235876048 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb5b8102620 msgr2=0x7fb5b8102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:37.758 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.759+0000 7fb5bdedc640 1 --2- 192.168.123.108:0/3235876048 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb5b8102620 0x7fb5b8102a20 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fb5a80099b0 tx=0x7fb5a802f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:37.758 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.760+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3235876048 shutdown_connections 2026-03-09T19:22:37.758 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.760+0000 7fb5bdedc640 1 --2- 192.168.123.108:0/3235876048 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb5b8102620 0x7fb5b8102a20 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:37.758 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.760+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3235876048 >> 192.168.123.108:0/3235876048 conn(0x7fb5b80fde70 msgr2=0x7fb5b8100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:37.758 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.760+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3235876048 shutdown_connections 2026-03-09T19:22:37.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.760+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3235876048 wait complete. 
2026-03-09T19:22:37.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.761+0000 7fb5bdedc640 1 Processor -- start 2026-03-09T19:22:37.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.761+0000 7fb5bdedc640 1 -- start start 2026-03-09T19:22:37.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.761+0000 7fb5bdedc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb5b8102620 0x7fb5b8199990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:37.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.761+0000 7fb5bdedc640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb5b8199ed0 con 0x7fb5b8102620 2026-03-09T19:22:37.760 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.761+0000 7fb5b77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb5b8102620 0x7fb5b8199990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:37.760 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.761+0000 7fb5b77fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb5b8102620 0x7fb5b8199990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:50320/0 (socket says 192.168.123.108:50320) 2026-03-09T19:22:37.760 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.761+0000 7fb5b77fe640 1 -- 192.168.123.108:0/3231027580 learned_addr learned my addr 192.168.123.108:0/3231027580 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:37.760 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.762+0000 7fb5b77fe640 1 -- 192.168.123.108:0/3231027580 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb5a8009660 con 0x7fb5b8102620 2026-03-09T19:22:37.760 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.762+0000 7fb5b77fe640 1 --2- 192.168.123.108:0/3231027580 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb5b8102620 0x7fb5b8199990 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fb5a802f860 tx=0x7fb5a8004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:37.761 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.762+0000 7fb5b4ff9640 1 -- 192.168.123.108:0/3231027580 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb5a80043b0 con 0x7fb5b8102620 2026-03-09T19:22:37.761 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.762+0000 7fb5b4ff9640 1 -- 192.168.123.108:0/3231027580 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb5a8038b40 con 0x7fb5b8102620 2026-03-09T19:22:37.761 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.763+0000 7fb5b4ff9640 1 -- 192.168.123.108:0/3231027580 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb5a80418f0 con 0x7fb5b8102620 2026-03-09T19:22:37.761 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.763+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3231027580 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb5b819a0d0 con 0x7fb5b8102620 2026-03-09T19:22:37.762 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.763+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3231027580 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb5b819a570 con 0x7fb5b8102620 2026-03-09T19:22:37.762 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.764+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3231027580 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb5b8102aa0 con 0x7fb5b8102620 2026-03-09T19:22:37.762 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.764+0000 7fb5b4ff9640 1 -- 192.168.123.108:0/3231027580 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fb5a8038cb0 con 0x7fb5b8102620 2026-03-09T19:22:37.763 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.764+0000 7fb5b4ff9640 1 --2- 192.168.123.108:0/3231027580 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fb58c03cf60 0x7fb58c03f420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:37.763 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.764+0000 7fb5b4ff9640 1 -- 192.168.123.108:0/3231027580 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fb5a8075b60 con 0x7fb5b8102620 2026-03-09T19:22:37.763 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.764+0000 7fb5b6ffd640 1 --2- 192.168.123.108:0/3231027580 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fb58c03cf60 0x7fb58c03f420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:37.763 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.765+0000 7fb5b6ffd640 1 --2- 192.168.123.108:0/3231027580 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fb58c03cf60 0x7fb58c03f420 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fb5a40099c0 tx=0x7fb5a4006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:37.765 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.767+0000 7fb5b4ff9640 1 -- 192.168.123.108:0/3231027580 <== 
mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fb5a8048b40 con 0x7fb5b8102620 2026-03-09T19:22:37.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.900+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3231027580 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb5b8108530 con 0x7fb5b8102620 2026-03-09T19:22:37.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.902+0000 7fb5b4ff9640 1 -- 192.168.123.108:0/3231027580 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fb5a8049df0 con 0x7fb5b8102620 2026-03-09T19:22:37.901 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:37.901 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:37.901 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:37.903 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.905+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3231027580 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] 
conn(0x7fb58c03cf60 msgr2=0x7fb58c03f420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:37.903 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.905+0000 7fb5bdedc640 1 --2- 192.168.123.108:0/3231027580 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fb58c03cf60 0x7fb58c03f420 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fb5a40099c0 tx=0x7fb5a4006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:37.903 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.905+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3231027580 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb5b8102620 msgr2=0x7fb5b8199990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:37.903 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.905+0000 7fb5bdedc640 1 --2- 192.168.123.108:0/3231027580 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb5b8102620 0x7fb5b8199990 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fb5a802f860 tx=0x7fb5a8004270 comp rx=0 tx=0).stop 2026-03-09T19:22:37.903 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.905+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3231027580 shutdown_connections 2026-03-09T19:22:37.904 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.905+0000 7fb5bdedc640 1 --2- 192.168.123.108:0/3231027580 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fb58c03cf60 0x7fb58c03f420 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:37.904 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.905+0000 7fb5bdedc640 1 --2- 192.168.123.108:0/3231027580 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb5b8102620 0x7fb5b8199990 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:37.904 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.905+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3231027580 >> 192.168.123.108:0/3231027580 conn(0x7fb5b80fde70 msgr2=0x7fb5b80fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:37.904 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.905+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3231027580 shutdown_connections 2026-03-09T19:22:37.904 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:37.906+0000 7fb5bdedc640 1 -- 192.168.123.108:0/3231027580 wait complete. 2026-03-09T19:22:38.436 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:38 vm07 ceph-mon[48545]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:22:38.436 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:38 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/3231027580' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:38.436 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:38 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:38.436 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:38 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:38.436 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:38 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:38.436 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:38 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:38.436 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:38 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:38.436 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:38 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 
2026-03-09T19:22:38.436 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:38 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:38.436 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:38 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:38.967 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T19:22:38.968 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:39.106 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:39.141 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:39.361 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.362+0000 7f7b10f1b640 1 -- 192.168.123.108:0/4135109245 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b0c102640 msgr2=0x7f7b0c102a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:39.361 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.362+0000 7f7b10f1b640 1 --2- 192.168.123.108:0/4135109245 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b0c102640 0x7f7b0c102a40 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f7af80099b0 tx=0x7f7af802f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:39.361 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.362+0000 7f7b10f1b640 1 -- 192.168.123.108:0/4135109245 shutdown_connections 2026-03-09T19:22:39.361 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.362+0000 7f7b10f1b640 1 --2- 192.168.123.108:0/4135109245 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b0c102640 0x7f7b0c102a40 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:39.361 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.363+0000 7f7b10f1b640 1 -- 192.168.123.108:0/4135109245 >> 192.168.123.108:0/4135109245 conn(0x7f7b0c0fde70 msgr2=0x7f7b0c100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:39.362 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.363+0000 7f7b10f1b640 1 -- 192.168.123.108:0/4135109245 shutdown_connections 2026-03-09T19:22:39.362 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.363+0000 7f7b10f1b640 1 -- 192.168.123.108:0/4135109245 wait complete. 2026-03-09T19:22:39.362 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.363+0000 7f7b10f1b640 1 Processor -- start 2026-03-09T19:22:39.362 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.364+0000 7f7b10f1b640 1 -- start start 2026-03-09T19:22:39.362 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.364+0000 7f7b10f1b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b0c102640 0x7f7b0c199950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:39.362 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.364+0000 7f7b10f1b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7b0c199e90 con 0x7f7b0c102640 2026-03-09T19:22:39.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.364+0000 7f7b0a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b0c102640 0x7f7b0c199950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:39.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.364+0000 7f7b0a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b0c102640 0x7f7b0c199950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:50340/0 (socket says 192.168.123.108:50340) 2026-03-09T19:22:39.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.364+0000 7f7b0a575640 1 -- 192.168.123.108:0/98367126 learned_addr learned my addr 192.168.123.108:0/98367126 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:39.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.365+0000 7f7b0a575640 1 -- 192.168.123.108:0/98367126 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7af8009660 con 0x7f7b0c102640 2026-03-09T19:22:39.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.365+0000 7f7b0a575640 1 --2- 192.168.123.108:0/98367126 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b0c102640 0x7f7b0c199950 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f7af802f860 tx=0x7f7af8004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:39.365 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.365+0000 7f7af77fe640 1 -- 192.168.123.108:0/98367126 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7af80043b0 con 0x7f7b0c102640 2026-03-09T19:22:39.365 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.365+0000 7f7af77fe640 1 -- 192.168.123.108:0/98367126 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f7af8038b40 con 0x7f7b0c102640 2026-03-09T19:22:39.365 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.365+0000 7f7af77fe640 1 -- 192.168.123.108:0/98367126 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7af80418f0 con 0x7f7b0c102640 2026-03-09T19:22:39.365 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.365+0000 7f7b10f1b640 1 -- 192.168.123.108:0/98367126 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7b0c19a090 con 0x7f7b0c102640 2026-03-09T19:22:39.365 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.365+0000 7f7b10f1b640 1 -- 192.168.123.108:0/98367126 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7b0c19a530 con 0x7f7b0c102640 2026-03-09T19:22:39.365 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.366+0000 7f7b10f1b640 1 -- 192.168.123.108:0/98367126 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7ad8005350 con 0x7f7b0c102640 2026-03-09T19:22:39.366 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.366+0000 7f7af77fe640 1 -- 192.168.123.108:0/98367126 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f7af8038cb0 con 0x7f7b0c102640 2026-03-09T19:22:39.366 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.366+0000 7f7af77fe640 1 --2- 192.168.123.108:0/98367126 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f7ae803d2d0 0x7f7ae803f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:39.366 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.366+0000 7f7af77fe640 1 -- 192.168.123.108:0/98367126 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f7af8076250 con 0x7f7b0c102640 2026-03-09T19:22:39.366 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.367+0000 7f7b09d74640 1 --2- 192.168.123.108:0/98367126 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f7ae803d2d0 0x7f7ae803f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:39.366 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.367+0000 7f7b09d74640 1 --2- 192.168.123.108:0/98367126 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f7ae803d2d0 0x7f7ae803f790 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f7b000099c0 tx=0x7f7b00006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:39.368 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.369+0000 7f7af77fe640 1 -- 192.168.123.108:0/98367126 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f7af8035320 con 0x7f7b0c102640 2026-03-09T19:22:39.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.498+0000 7f7b10f1b640 1 -- 192.168.123.108:0/98367126 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f7ad80051c0 con 0x7f7b0c102640 2026-03-09T19:22:39.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.500+0000 7f7af77fe640 1 -- 192.168.123.108:0/98367126 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f7af80373d0 con 0x7f7b0c102640 2026-03-09T19:22:39.499 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:39.499 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:39.499 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:39.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.502+0000 7f7b10f1b640 1 -- 192.168.123.108:0/98367126 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f7ae803d2d0 msgr2=0x7f7ae803f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:39.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.502+0000 7f7b10f1b640 1 --2- 192.168.123.108:0/98367126 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f7ae803d2d0 0x7f7ae803f790 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f7b000099c0 tx=0x7f7b00006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:39.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.503+0000 7f7b10f1b640 1 -- 192.168.123.108:0/98367126 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b0c102640 msgr2=0x7f7b0c199950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:39.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.503+0000 7f7b10f1b640 1 --2- 192.168.123.108:0/98367126 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b0c102640 0x7f7b0c199950 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f7af802f860 tx=0x7f7af8004270 comp rx=0 tx=0).stop 2026-03-09T19:22:39.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.503+0000 7f7b10f1b640 1 -- 192.168.123.108:0/98367126 shutdown_connections 2026-03-09T19:22:39.502 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.503+0000 7f7b10f1b640 1 --2- 192.168.123.108:0/98367126 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f7ae803d2d0 0x7f7ae803f790 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:39.502 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.503+0000 7f7b10f1b640 1 --2- 192.168.123.108:0/98367126 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b0c102640 0x7f7b0c199950 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:39.502 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.503+0000 7f7b10f1b640 1 -- 192.168.123.108:0/98367126 >> 192.168.123.108:0/98367126 conn(0x7f7b0c0fde70 msgr2=0x7f7b0c0fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:39.502 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.504+0000 7f7b10f1b640 1 -- 192.168.123.108:0/98367126 shutdown_connections 2026-03-09T19:22:39.503 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:39.504+0000 7f7b10f1b640 1 -- 192.168.123.108:0/98367126 wait complete. 2026-03-09T19:22:39.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:39 vm07 ceph-mon[48545]: Deploying daemon prometheus.vm07 on vm07 2026-03-09T19:22:39.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:39 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' 2026-03-09T19:22:40.573 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:40.574 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:40.705 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:40.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:40 vm07 ceph-mon[48545]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:22:40.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:40 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/98367126' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:40.742 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:40.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.994+0000 7f639ec7c640 1 -- 192.168.123.108:0/234513724 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63980ff550 msgr2=0x7f63980ff950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:40.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.994+0000 7f639ec7c640 1 --2- 192.168.123.108:0/234513724 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63980ff550 0x7f63980ff950 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f63800099b0 tx=0x7f638002f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:40.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.994+0000 7f639ec7c640 1 -- 192.168.123.108:0/234513724 shutdown_connections 2026-03-09T19:22:40.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.994+0000 7f639ec7c640 1 --2- 192.168.123.108:0/234513724 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63980ff550 0x7f63980ff950 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T19:22:40.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.994+0000 7f639ec7c640 1 -- 192.168.123.108:0/234513724 >> 192.168.123.108:0/234513724 conn(0x7f63980f9d00 msgr2=0x7f63980fc120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:40.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.995+0000 7f639ec7c640 1 -- 192.168.123.108:0/234513724 shutdown_connections 2026-03-09T19:22:40.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.995+0000 7f639ec7c640 1 -- 192.168.123.108:0/234513724 wait complete. 2026-03-09T19:22:40.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.995+0000 7f639ec7c640 1 Processor -- start 2026-03-09T19:22:40.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.995+0000 7f639ec7c640 1 -- start start 2026-03-09T19:22:40.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.995+0000 7f639ec7c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63980ff550 0x7f6398199970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:40.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.995+0000 7f639ec7c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6398199eb0 con 0x7f63980ff550 2026-03-09T19:22:40.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.996+0000 7f639c9f1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63980ff550 0x7f6398199970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:40.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.996+0000 7f639c9f1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63980ff550 0x7f6398199970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello 
peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:50356/0 (socket says 192.168.123.108:50356) 2026-03-09T19:22:40.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.996+0000 7f639c9f1640 1 -- 192.168.123.108:0/1391778285 learned_addr learned my addr 192.168.123.108:0/1391778285 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:40.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.996+0000 7f639c9f1640 1 -- 192.168.123.108:0/1391778285 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6380009660 con 0x7f63980ff550 2026-03-09T19:22:40.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.996+0000 7f639c9f1640 1 --2- 192.168.123.108:0/1391778285 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63980ff550 0x7f6398199970 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f638002f860 tx=0x7f6380004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:40.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.997+0000 7f638dffb640 1 -- 192.168.123.108:0/1391778285 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f63800043b0 con 0x7f63980ff550 2026-03-09T19:22:40.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.997+0000 7f639ec7c640 1 -- 192.168.123.108:0/1391778285 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f639819a0b0 con 0x7f63980ff550 2026-03-09T19:22:40.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.997+0000 7f639ec7c640 1 -- 192.168.123.108:0/1391778285 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f639819a550 con 0x7f63980ff550 2026-03-09T19:22:40.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.997+0000 7f638dffb640 1 -- 192.168.123.108:0/1391778285 <== mon.0 
v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6380038b40 con 0x7f63980ff550 2026-03-09T19:22:40.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.997+0000 7f638dffb640 1 -- 192.168.123.108:0/1391778285 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6380041810 con 0x7f63980ff550 2026-03-09T19:22:40.997 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.998+0000 7f638dffb640 1 -- 192.168.123.108:0/1391778285 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f6380041ac0 con 0x7f63980ff550 2026-03-09T19:22:40.997 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.998+0000 7f638dffb640 1 --2- 192.168.123.108:0/1391778285 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f636803d2d0 0x7f636803f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:40.997 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.998+0000 7f638dffb640 1 -- 192.168.123.108:0/1391778285 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f63800771c0 con 0x7f63980ff550 2026-03-09T19:22:40.997 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.998+0000 7f639ec7c640 1 -- 192.168.123.108:0/1391778285 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f63980ff9d0 con 0x7f63980ff550 2026-03-09T19:22:40.997 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.998+0000 7f638ffff640 1 --2- 192.168.123.108:0/1391778285 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f636803d2d0 0x7f636803f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:40.997 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:40.999+0000 7f638ffff640 1 --2- 192.168.123.108:0/1391778285 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f636803d2d0 0x7f636803f790 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f63880099c0 tx=0x7f6388006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:41.000 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.001+0000 7f638dffb640 1 -- 192.168.123.108:0/1391778285 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6380041df0 con 0x7f63980ff550 2026-03-09T19:22:41.124 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.125+0000 7f639ec7c640 1 -- 192.168.123.108:0/1391778285 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f6398108470 con 0x7f63980ff550 2026-03-09T19:22:41.124 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.126+0000 7f638dffb640 1 -- 192.168.123.108:0/1391778285 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f63800373d0 con 0x7f63980ff550 2026-03-09T19:22:41.124 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:41.124 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:41.124 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:41.127 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.128+0000 7f639ec7c640 1 -- 192.168.123.108:0/1391778285 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f636803d2d0 msgr2=0x7f636803f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:41.127 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.128+0000 7f639ec7c640 1 --2- 192.168.123.108:0/1391778285 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f636803d2d0 0x7f636803f790 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f63880099c0 tx=0x7f6388006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:41.127 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.128+0000 7f639ec7c640 1 -- 192.168.123.108:0/1391778285 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63980ff550 msgr2=0x7f6398199970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:41.127 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.128+0000 7f639ec7c640 1 --2- 192.168.123.108:0/1391778285 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63980ff550 0x7f6398199970 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f638002f860 tx=0x7f6380004270 comp rx=0 tx=0).stop 2026-03-09T19:22:41.127 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.129+0000 7f639ec7c640 1 -- 192.168.123.108:0/1391778285 shutdown_connections 2026-03-09T19:22:41.127 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.129+0000 7f639ec7c640 1 --2- 192.168.123.108:0/1391778285 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f636803d2d0 0x7f636803f790 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:41.127 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.129+0000 7f639ec7c640 1 --2- 192.168.123.108:0/1391778285 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63980ff550 0x7f6398199970 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:41.127 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.129+0000 7f639ec7c640 1 -- 192.168.123.108:0/1391778285 >> 192.168.123.108:0/1391778285 conn(0x7f63980f9d00 msgr2=0x7f63980fa900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:41.127 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.129+0000 7f639ec7c640 1 -- 192.168.123.108:0/1391778285 shutdown_connections 2026-03-09T19:22:41.127 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:41.129+0000 7f639ec7c640 1 -- 192.168.123.108:0/1391778285 wait complete. 2026-03-09T19:22:41.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:41 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/1391778285' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:42.195 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:42.195 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:42.342 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:42.381 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:42.606 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.607+0000 7fa4214c3640 1 -- 192.168.123.108:0/177059180 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa41c102640 msgr2=0x7fa41c102a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:42.606 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.607+0000 7fa4214c3640 1 --2- 192.168.123.108:0/177059180 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa41c102640 0x7fa41c102a40 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7fa4040099b0 tx=0x7fa40402f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:42.607 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.608+0000 7fa4214c3640 1 -- 192.168.123.108:0/177059180 shutdown_connections 2026-03-09T19:22:42.607 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.608+0000 7fa4214c3640 1 --2- 192.168.123.108:0/177059180 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa41c102640 0x7fa41c102a40 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:42.607 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.608+0000 7fa4214c3640 1 -- 192.168.123.108:0/177059180 >> 192.168.123.108:0/177059180 conn(0x7fa41c0fde70 msgr2=0x7fa41c100260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:42.607 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.608+0000 7fa4214c3640 1 -- 192.168.123.108:0/177059180 
shutdown_connections 2026-03-09T19:22:42.607 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.608+0000 7fa4214c3640 1 -- 192.168.123.108:0/177059180 wait complete. 2026-03-09T19:22:42.607 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.609+0000 7fa4214c3640 1 Processor -- start 2026-03-09T19:22:42.607 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.609+0000 7fa4214c3640 1 -- start start 2026-03-09T19:22:42.607 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.609+0000 7fa4214c3640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa41c102640 0x7fa41c1999e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:42.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.609+0000 7fa4214c3640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa41c199f20 con 0x7fa41c102640 2026-03-09T19:22:42.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.609+0000 7fa41affd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa41c102640 0x7fa41c1999e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:42.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.609+0000 7fa41affd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa41c102640 0x7fa41c1999e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:50378/0 (socket says 192.168.123.108:50378) 2026-03-09T19:22:42.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.609+0000 7fa41affd640 1 -- 192.168.123.108:0/2408466330 learned_addr learned my addr 192.168.123.108:0/2408466330 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:42.608 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.610+0000 7fa41affd640 1 -- 192.168.123.108:0/2408466330 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa404009660 con 0x7fa41c102640 2026-03-09T19:22:42.609 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.610+0000 7fa41affd640 1 --2- 192.168.123.108:0/2408466330 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa41c102640 0x7fa41c1999e0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7fa40402f860 tx=0x7fa404004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:42.610 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.610+0000 7fa3fbfff640 1 -- 192.168.123.108:0/2408466330 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa404038470 con 0x7fa41c102640 2026-03-09T19:22:42.610 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.610+0000 7fa3fbfff640 1 -- 192.168.123.108:0/2408466330 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa404038a90 con 0x7fa41c102640 2026-03-09T19:22:42.610 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.610+0000 7fa3fbfff640 1 -- 192.168.123.108:0/2408466330 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa4040418e0 con 0x7fa41c102640 2026-03-09T19:22:42.610 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.611+0000 7fa4214c3640 1 -- 192.168.123.108:0/2408466330 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa41c19a120 con 0x7fa41c102640 2026-03-09T19:22:42.610 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.611+0000 7fa4214c3640 1 -- 192.168.123.108:0/2408466330 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa41c19a500 con 
0x7fa41c102640 2026-03-09T19:22:42.610 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.612+0000 7fa3fbfff640 1 -- 192.168.123.108:0/2408466330 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7fa404038c00 con 0x7fa41c102640 2026-03-09T19:22:42.610 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.612+0000 7fa3fbfff640 1 --2- 192.168.123.108:0/2408466330 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa3f403d2d0 0x7fa3f403f790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:42.611 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.612+0000 7fa3fbfff640 1 -- 192.168.123.108:0/2408466330 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7fa404076110 con 0x7fa41c102640 2026-03-09T19:22:42.611 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.612+0000 7fa4214c3640 1 -- 192.168.123.108:0/2408466330 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa3e8005350 con 0x7fa41c102640 2026-03-09T19:22:42.611 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.612+0000 7fa41a7fc640 1 --2- 192.168.123.108:0/2408466330 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa3f403d2d0 0x7fa3f403f790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:42.611 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.613+0000 7fa41a7fc640 1 --2- 192.168.123.108:0/2408466330 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa3f403d2d0 0x7fa3f403f790 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fa4100099c0 tx=0x7fa410006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-09T19:22:42.614 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.615+0000 7fa3fbfff640 1 -- 192.168.123.108:0/2408466330 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa404041400 con 0x7fa41c102640 2026-03-09T19:22:42.733 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:42 vm07 ceph-mon[48545]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:22:42.740 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.742+0000 7fa4214c3640 1 -- 192.168.123.108:0/2408466330 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fa3e80051c0 con 0x7fa41c102640 2026-03-09T19:22:42.741 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.742+0000 7fa3fbfff640 1 -- 192.168.123.108:0/2408466330 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fa40405a090 con 0x7fa41c102640 2026-03-09T19:22:42.741 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:42.741 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:42.741 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:42.743 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.745+0000 7fa4214c3640 1 -- 192.168.123.108:0/2408466330 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa3f403d2d0 msgr2=0x7fa3f403f790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:42.743 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.745+0000 7fa4214c3640 1 --2- 192.168.123.108:0/2408466330 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa3f403d2d0 0x7fa3f403f790 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fa4100099c0 tx=0x7fa410006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:42.743 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.745+0000 7fa4214c3640 1 -- 192.168.123.108:0/2408466330 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa41c102640 msgr2=0x7fa41c1999e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:42.743 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.745+0000 7fa4214c3640 1 --2- 192.168.123.108:0/2408466330 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa41c102640 0x7fa41c1999e0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7fa40402f860 tx=0x7fa404004290 comp rx=0 tx=0).stop 2026-03-09T19:22:42.743 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.745+0000 7fa4214c3640 1 -- 192.168.123.108:0/2408466330 shutdown_connections 2026-03-09T19:22:42.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.745+0000 7fa4214c3640 1 --2- 192.168.123.108:0/2408466330 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7fa3f403d2d0 0x7fa3f403f790 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:42.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.745+0000 7fa4214c3640 1 --2- 192.168.123.108:0/2408466330 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa41c102640 0x7fa41c1999e0 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:42.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.745+0000 7fa4214c3640 1 -- 192.168.123.108:0/2408466330 >> 192.168.123.108:0/2408466330 conn(0x7fa41c0fde70 msgr2=0x7fa41c0fea70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:42.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.746+0000 7fa4214c3640 1 -- 192.168.123.108:0/2408466330 shutdown_connections 2026-03-09T19:22:42.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:42.746+0000 7fa4214c3640 1 -- 192.168.123.108:0/2408466330 wait complete. 2026-03-09T19:22:43.694 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:43 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/2408466330' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:43.814 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:43.814 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json
2026-03-09T19:22:43.951 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-09T19:22:43.989 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-09T19:22:44.228 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.228+0000 7f6f58c17640 1 -- 192.168.123.108:0/345542089 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f54102620 msgr2=0x7f6f54102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:22:44.228 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.228+0000 7f6f58c17640 1 --2- 192.168.123.108:0/345542089 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f54102620 0x7f6f54102a20 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f6f3c0099b0 tx=0x7f6f3c02f2b0 comp rx=0 tx=0).stop
2026-03-09T19:22:44.228 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.229+0000 7f6f58c17640 1 -- 192.168.123.108:0/345542089 shutdown_connections
2026-03-09T19:22:44.228 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.229+0000 7f6f58c17640 1 --2- 192.168.123.108:0/345542089 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f54102620 0x7f6f54102a20 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:22:44.228 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.229+0000 7f6f58c17640 1 -- 192.168.123.108:0/345542089 >> 192.168.123.108:0/345542089 conn(0x7f6f540fde70 msgr2=0x7f6f54100260 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:22:44.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.230+0000 7f6f58c17640 1 -- 192.168.123.108:0/345542089 shutdown_connections
2026-03-09T19:22:44.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.230+0000 7f6f58c17640 1 -- 192.168.123.108:0/345542089 wait complete.
2026-03-09T19:22:44.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.230+0000 7f6f58c17640 1 Processor -- start
2026-03-09T19:22:44.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.230+0000 7f6f58c17640 1 -- start start
2026-03-09T19:22:44.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.231+0000 7f6f58c17640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f54102620 0x7f6f54199990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:22:44.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.231+0000 7f6f58c17640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f54199ed0 con 0x7f6f54102620
2026-03-09T19:22:44.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.231+0000 7f6f52575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f54102620 0x7f6f54199990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:22:44.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.231+0000 7f6f52575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f54102620 0x7f6f54199990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:50384/0 (socket says 192.168.123.108:50384)
2026-03-09T19:22:44.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.231+0000 7f6f52575640 1 -- 192.168.123.108:0/1530647403 learned_addr learned my addr 192.168.123.108:0/1530647403 (peer_addr_for_me v2:192.168.123.108:0/0)
2026-03-09T19:22:44.230 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.231+0000 7f6f52575640 1 -- 192.168.123.108:0/1530647403 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6f3c009660 con 0x7f6f54102620
2026-03-09T19:22:44.230 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.232+0000 7f6f52575640 1 --2- 192.168.123.108:0/1530647403 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f54102620 0x7f6f54199990 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f6f3c02f860 tx=0x7f6f3c004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:22:44.230 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.232+0000 7f6f3b7fe640 1 -- 192.168.123.108:0/1530647403 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f3c0043b0 con 0x7f6f54102620
2026-03-09T19:22:44.230 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.232+0000 7f6f3b7fe640 1 -- 192.168.123.108:0/1530647403 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6f3c038b40 con 0x7f6f54102620
2026-03-09T19:22:44.230 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.232+0000 7f6f58c17640 1 -- 192.168.123.108:0/1530647403 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6f5419a0d0 con 0x7f6f54102620
2026-03-09T19:22:44.231 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.233+0000 7f6f3b7fe640 1 -- 192.168.123.108:0/1530647403 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f3c0418f0 con 0x7f6f54102620
2026-03-09T19:22:44.231 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.233+0000 7f6f58c17640 1 -- 192.168.123.108:0/1530647403 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6f5419a570 con 0x7f6f54102620
2026-03-09T19:22:44.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.233+0000 7f6f3b7fe640 1 -- 192.168.123.108:0/1530647403 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 13) v1 ==== 49429+0+0 (secure 0 0 0) 0x7f6f3c038cb0 con 0x7f6f54102620
2026-03-09T19:22:44.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.233+0000 7f6f58c17640 1 -- 192.168.123.108:0/1530647403 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6f54102aa0 con 0x7f6f54102620
2026-03-09T19:22:44.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.234+0000 7f6f3b7fe640 1 --2- 192.168.123.108:0/1530647403 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6f2403cf60 0x7f6f2403f420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:22:44.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.234+0000 7f6f3b7fe640 1 -- 192.168.123.108:0/1530647403 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f6f3c0761f0 con 0x7f6f54102620
2026-03-09T19:22:44.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.234+0000 7f6f51d74640 1 --2- 192.168.123.108:0/1530647403 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6f2403cf60 0x7f6f2403f420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:22:44.233 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.235+0000 7f6f51d74640 1 --2- 192.168.123.108:0/1530647403 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6f2403cf60 0x7f6f2403f420 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f6f480099c0 tx=0x7f6f48006eb0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:22:44.236 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.237+0000 7f6f3b7fe640 1 -- 192.168.123.108:0/1530647403 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6f3c040360 con 0x7f6f54102620
2026-03-09T19:22:44.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.372+0000 7f6f58c17640 1 -- 192.168.123.108:0/1530647403 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f6f54108530 con 0x7f6f54102620
2026-03-09T19:22:44.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.375+0000 7f6f3b7fe640 1 -- 192.168.123.108:0/1530647403 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f6f3c035320 con 0x7f6f54102620
2026-03-09T19:22:44.374 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:22:44.374 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-09T19:22:44.374 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1
2026-03-09T19:22:44.376 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.378+0000 7f6f58c17640 1 -- 192.168.123.108:0/1530647403 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6f2403cf60 msgr2=0x7f6f2403f420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:22:44.377 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.378+0000 7f6f58c17640 1 --2- 192.168.123.108:0/1530647403 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6f2403cf60 0x7f6f2403f420 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f6f480099c0 tx=0x7f6f48006eb0 comp rx=0 tx=0).stop
2026-03-09T19:22:44.377 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.378+0000 7f6f58c17640 1 -- 192.168.123.108:0/1530647403 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f54102620 msgr2=0x7f6f54199990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:22:44.377 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.378+0000 7f6f58c17640 1 --2- 192.168.123.108:0/1530647403 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f54102620 0x7f6f54199990 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f6f3c02f860 tx=0x7f6f3c004270 comp rx=0 tx=0).stop
2026-03-09T19:22:44.377 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.378+0000 7f6f58c17640 1 -- 192.168.123.108:0/1530647403 shutdown_connections
2026-03-09T19:22:44.377 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.378+0000 7f6f58c17640 1 --2- 192.168.123.108:0/1530647403 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f6f2403cf60 0x7f6f2403f420 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:22:44.377 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.378+0000 7f6f58c17640 1 --2- 192.168.123.108:0/1530647403 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f54102620 0x7f6f54199990 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:22:44.377 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.378+0000 7f6f58c17640 1 -- 192.168.123.108:0/1530647403 >> 192.168.123.108:0/1530647403 conn(0x7f6f540fde70 msgr2=0x7f6f540fea70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:22:44.377 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.379+0000 7f6f58c17640 1 -- 192.168.123.108:0/1530647403 shutdown_connections
2026-03-09T19:22:44.377 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:44.379+0000 7f6f58c17640 1 -- 192.168.123.108:0/1530647403 wait complete.
2026-03-09T19:22:44.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:44 vm07 ceph-mon[48545]: pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T19:22:44.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:44 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym'
2026-03-09T19:22:44.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:44 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym'
2026-03-09T19:22:44.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:44 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym'
2026-03-09T19:22:44.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:44 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
2026-03-09T19:22:44.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:44 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym'
2026-03-09T19:22:44.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:44 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/1530647403' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-09T19:22:45.431 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-09T19:22:45.431 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json
2026-03-09T19:22:45.592 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-09T19:22:45.637 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-09T19:22:45.875 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.875+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/1119682495 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d180fe470 msgr2=0x7f5d180fe870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:22:45.875 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.875+0000 7f5d1d4c0640 1 --2- 192.168.123.108:0/1119682495 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d180fe470 0x7f5d180fe870 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f5d080099b0 tx=0x7f5d0802f2b0 comp rx=0 tx=0).stop
2026-03-09T19:22:45.875 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.876+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/1119682495 shutdown_connections
2026-03-09T19:22:45.875 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.876+0000 7f5d1d4c0640 1 --2- 192.168.123.108:0/1119682495 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d180fe470 0x7f5d180fe870 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:22:45.875 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.876+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/1119682495 >> 192.168.123.108:0/1119682495 conn(0x7f5d180f9fc0 msgr2=0x7f5d180fc3e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:22:45.875 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.876+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/1119682495 shutdown_connections
2026-03-09T19:22:45.875 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.877+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/1119682495 wait complete.
2026-03-09T19:22:45.875 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.877+0000 7f5d1d4c0640 1 Processor -- start
2026-03-09T19:22:45.875 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.877+0000 7f5d1d4c0640 1 -- start start
2026-03-09T19:22:45.876 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.877+0000 7f5d1d4c0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d18100c90 0x7f5d180ff400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:22:45.876 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.877+0000 7f5d1d4c0640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d181010b0 con 0x7f5d18100c90
2026-03-09T19:22:45.876 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.877+0000 7f5d17fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d18100c90 0x7f5d180ff400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:22:45.876 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.877+0000 7f5d17fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d18100c90 0x7f5d180ff400 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:50398/0 (socket says 192.168.123.108:50398)
2026-03-09T19:22:45.876 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.877+0000 7f5d17fff640 1 -- 192.168.123.108:0/3511153716 learned_addr learned my addr 192.168.123.108:0/3511153716 (peer_addr_for_me v2:192.168.123.108:0/0)
2026-03-09T19:22:45.876 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.878+0000 7f5d17fff640 1 -- 192.168.123.108:0/3511153716 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d08009660 con 0x7f5d18100c90
2026-03-09T19:22:45.876 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.878+0000 7f5d17fff640 1 --2- 192.168.123.108:0/3511153716 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d18100c90 0x7f5d180ff400 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f5d08002bf0 tx=0x7f5d08004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:22:45.877 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.878+0000 7f5d157fa640 1 -- 192.168.123.108:0/3511153716 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5d0802fea0 con 0x7f5d18100c90
2026-03-09T19:22:45.877 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.878+0000 7f5d157fa640 1 -- 192.168.123.108:0/3511153716 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5d08004430 con 0x7f5d18100c90
2026-03-09T19:22:45.878 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.879+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/3511153716 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d181012b0 con 0x7f5d18100c90
2026-03-09T19:22:45.878 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.879+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/3511153716 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d180ffb90 con 0x7f5d18100c90
2026-03-09T19:22:45.878 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.879+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/3511153716 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5cd8005350 con 0x7f5d18100c90
2026-03-09T19:22:45.878 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.879+0000 7f5d157fa640 1 -- 192.168.123.108:0/3511153716 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5d08041980 con 0x7f5d18100c90
2026-03-09T19:22:45.880 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.882+0000 7f5d157fa640 1 -- 192.168.123.108:0/3511153716 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 14) v1 ==== 49443+0+0 (secure 0 0 0) 0x7f5d08041ae0 con 0x7f5d18100c90
2026-03-09T19:22:45.880 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.882+0000 7f5d157fa640 1 --2- 192.168.123.108:0/3511153716 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5cf003d000 0x7f5cf003f4c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:22:45.880 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.882+0000 7f5d177fe640 1 -- 192.168.123.108:0/3511153716 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5cf003d000 msgr2=0x7f5cf003f4c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1021580706
2026-03-09T19:22:45.880 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.882+0000 7f5d177fe640 1 --2- 192.168.123.108:0/3511153716 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5cf003d000 0x7f5cf003f4c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000
2026-03-09T19:22:45.880 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.882+0000 7f5d157fa640 1 -- 192.168.123.108:0/3511153716 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f5d08076910 con 0x7f5d18100c90
2026-03-09T19:22:45.881 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:45.883+0000 7f5d157fa640 1 -- 192.168.123.108:0/3511153716 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f5d080375b0 con 0x7f5d18100c90
2026-03-09T19:22:46.001 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.003+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/3511153716 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f5cd8005600 con 0x7f5d18100c90
2026-03-09T19:22:46.002 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.004+0000 7f5d157fa640 1 -- 192.168.123.108:0/3511153716 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f5d080373d0 con 0x7f5d18100c90
2026-03-09T19:22:46.003 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:22:46.003 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-09T19:22:46.003 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1
2026-03-09T19:22:46.005 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.006+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/3511153716 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5cf003d000 msgr2=0x7f5cf003f4c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T19:22:46.005 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.007+0000 7f5d1d4c0640 1 --2- 192.168.123.108:0/3511153716 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5cf003d000 0x7f5cf003f4c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:22:46.005 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.007+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/3511153716 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d18100c90 msgr2=0x7f5d180ff400 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:22:46.005 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.007+0000 7f5d1d4c0640 1 --2- 192.168.123.108:0/3511153716 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d18100c90 0x7f5d180ff400 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f5d08002bf0 tx=0x7f5d08004290 comp rx=0 tx=0).stop
2026-03-09T19:22:46.005 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.007+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/3511153716 shutdown_connections
2026-03-09T19:22:46.005 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.007+0000 7f5d1d4c0640 1 --2- 192.168.123.108:0/3511153716 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f5cf003d000 0x7f5cf003f4c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:22:46.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.007+0000 7f5d1d4c0640 1 --2- 192.168.123.108:0/3511153716 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d18100c90 0x7f5d180ff400 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:22:46.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.007+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/3511153716 >> 192.168.123.108:0/3511153716 conn(0x7f5d180f9fc0 msgr2=0x7f5d180fabc0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:22:46.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.008+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/3511153716 shutdown_connections
2026-03-09T19:22:46.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:46.008+0000 7f5d1d4c0640 1 -- 192.168.123.108:0/3511153716 wait complete.
2026-03-09T19:22:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:46 vm07 ceph-mon[48545]: from='mgr.14162 192.168.123.107:0/649253668' entity='mgr.vm07.xacuym' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
2026-03-09T19:22:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:46 vm07 ceph-mon[48545]: mgrmap e14: vm07.xacuym(active, since 36s)
2026-03-09T19:22:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:46 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/3511153716' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-09T19:22:47.065 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-09T19:22:47.065 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json
2026-03-09T19:22:47.205 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-09T19:22:47.242 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-09T19:22:47.475 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.475+0000 7f31b8f92640 1 -- 192.168.123.108:0/1700355555 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31b4102620 msgr2=0x7f31b4102a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:22:47.475 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.475+0000 7f31b8f92640 1 --2- 192.168.123.108:0/1700355555 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31b4102620 0x7f31b4102a20 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f319c0099b0 tx=0x7f319c02f2b0 comp rx=0 tx=0).stop
2026-03-09T19:22:47.475 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.476+0000 7f31b8f92640 1 -- 192.168.123.108:0/1700355555 shutdown_connections
2026-03-09T19:22:47.475 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.476+0000 7f31b8f92640 1 --2- 192.168.123.108:0/1700355555 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31b4102620 0x7f31b4102a20 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:22:47.475 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.476+0000 7f31b8f92640 1 -- 192.168.123.108:0/1700355555 >> 192.168.123.108:0/1700355555 conn(0x7f31b40fde70 msgr2=0x7f31b4100260 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:22:47.475 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.476+0000 7f31b8f92640 1 -- 192.168.123.108:0/1700355555 shutdown_connections
2026-03-09T19:22:47.475 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.476+0000 7f31b8f92640 1 -- 192.168.123.108:0/1700355555 wait complete.
2026-03-09T19:22:47.475 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.477+0000 7f31b8f92640 1 Processor -- start
2026-03-09T19:22:47.475 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.477+0000 7f31b8f92640 1 -- start start
2026-03-09T19:22:47.475 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.477+0000 7f31b8f92640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31b4102620 0x7f31b4078ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:22:47.475 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.477+0000 7f31b8f92640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f31b4079400 con 0x7f31b4102620
2026-03-09T19:22:47.476 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.477+0000 7f31b2575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31b4102620 0x7f31b4078ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:22:47.476 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.477+0000 7f31b2575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31b4102620 0x7f31b4078ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:50424/0 (socket says 192.168.123.108:50424)
2026-03-09T19:22:47.476 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.477+0000 7f31b2575640 1 -- 192.168.123.108:0/1487459470 learned_addr learned my addr 192.168.123.108:0/1487459470 (peer_addr_for_me v2:192.168.123.108:0/0)
2026-03-09T19:22:47.476 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.478+0000 7f31b2575640 1 -- 192.168.123.108:0/1487459470 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f319c009660 con 0x7f31b4102620
2026-03-09T19:22:47.476 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.478+0000 7f31b2575640 1 --2- 192.168.123.108:0/1487459470 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31b4102620 0x7f31b4078ec0 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f319c02f860 tx=0x7f319c004270 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:22:47.477 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.478+0000 7f319b7fe640 1 -- 192.168.123.108:0/1487459470 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f319c0043b0 con 0x7f31b4102620
2026-03-09T19:22:47.477 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.478+0000 7f319b7fe640 1 -- 192.168.123.108:0/1487459470 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f319c038b40 con 0x7f31b4102620
2026-03-09T19:22:47.477 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.478+0000 7f319b7fe640 1 -- 192.168.123.108:0/1487459470 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f319c0418f0 con 0x7f31b4102620
2026-03-09T19:22:47.477 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.478+0000 7f31b8f92640 1 -- 192.168.123.108:0/1487459470 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f31b4079600 con 0x7f31b4102620
2026-03-09T19:22:47.477 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.478+0000 7f31b8f92640 1 -- 192.168.123.108:0/1487459470 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f31b4075a00 con 0x7f31b4102620
2026-03-09T19:22:47.477 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.479+0000 7f319b7fe640 1 -- 192.168.123.108:0/1487459470 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 14) v1 ==== 49443+0+0 (secure 0 0 0) 0x7f319c038cb0 con 0x7f31b4102620
2026-03-09T19:22:47.477 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.479+0000 7f31b8f92640 1 -- 192.168.123.108:0/1487459470 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f31b4102aa0 con 0x7f31b4102620
2026-03-09T19:22:47.478 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.479+0000 7f319b7fe640 1 --2- 192.168.123.108:0/1487459470 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f318c03d320 0x7f318c03f7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:22:47.478 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.479+0000 7f319b7fe640 1 -- 192.168.123.108:0/1487459470 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1155+0+0 (secure 0 0 0) 0x7f319c0761d0 con 0x7f31b4102620
2026-03-09T19:22:47.478 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.479+0000 7f31b1d74640 1 -- 192.168.123.108:0/1487459470 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f318c03d320 msgr2=0x7f318c03f7e0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/1021580706
2026-03-09T19:22:47.478 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.479+0000 7f31b1d74640 1 --2- 192.168.123.108:0/1487459470 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f318c03d320 0x7f318c03f7e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000
2026-03-09T19:22:47.480 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.481+0000 7f319b7fe640 1 -- 192.168.123.108:0/1487459470 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f319c041400 con 0x7f31b4102620
2026-03-09T19:22:47.601 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.602+0000 7f31b8f92640 1 -- 192.168.123.108:0/1487459470 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f31b41087b0 con 0x7f31b4102620
2026-03-09T19:22:47.601 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.603+0000 7f319b7fe640 1 -- 192.168.123.108:0/1487459470 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f319c0379e0 con 0x7f31b4102620
2026-03-09T19:22:47.601 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:22:47.601 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-09T19:22:47.601 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1
2026-03-09T19:22:47.603 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.605+0000 7f31b8f92640 1 -- 192.168.123.108:0/1487459470 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f318c03d320 msgr2=0x7f318c03f7e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T19:22:47.604 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.605+0000 7f31b8f92640 1 --2- 192.168.123.108:0/1487459470 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f318c03d320 0x7f318c03f7e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:22:47.604 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.605+0000 7f31b8f92640 1 -- 192.168.123.108:0/1487459470 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31b4102620 msgr2=0x7f31b4078ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:22:47.604 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.605+0000 7f31b8f92640 1 --2- 192.168.123.108:0/1487459470 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31b4102620 0x7f31b4078ec0 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f319c02f860 tx=0x7f319c004270 comp rx=0 tx=0).stop
2026-03-09T19:22:47.604 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.605+0000 7f31b8f92640 1 -- 192.168.123.108:0/1487459470 shutdown_connections
2026-03-09T19:22:47.604 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.605+0000 7f31b8f92640 1 --2- 192.168.123.108:0/1487459470 >> [v2:192.168.123.107:6800/1021580706,v1:192.168.123.107:6801/1021580706] conn(0x7f318c03d320 0x7f318c03f7e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:22:47.604 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.605+0000 7f31b8f92640 1 --2- 192.168.123.108:0/1487459470 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31b4102620 0x7f31b4078ec0 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:22:47.604 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.605+0000 7f31b8f92640 1 -- 192.168.123.108:0/1487459470 >> 192.168.123.108:0/1487459470 conn(0x7f31b40fde70 msgr2=0x7f31b40fea70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:22:47.604 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.606+0000 7f31b8f92640 1 -- 192.168.123.108:0/1487459470 shutdown_connections
2026-03-09T19:22:47.604 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:47.606+0000 7f31b8f92640 1 -- 192.168.123.108:0/1487459470 wait complete.
2026-03-09T19:22:47.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:47 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/1487459470' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-09T19:22:48.656 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-09T19:22:48.656 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json
2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: Active manager daemon vm07.xacuym restarted
2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: Activating manager daemon vm07.xacuym
2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: osdmap e5: 0 total, 0 up, 0 in
2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: mgrmap e15: vm07.xacuym(active, starting, since 0.00449682s)
2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch
2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm07.xacuym", "id": "vm07.xacuym"}]: dispatch 2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: Manager daemon vm07.xacuym is now available 2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/mirror_snapshot_schedule"}]: 
dispatch 2026-03-09T19:22:48.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/trash_purge_schedule"}]: dispatch 2026-03-09T19:22:48.823 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:48.872 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:49.150 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.150+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/1231461832 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa1f8071510 msgr2=0x7fa1f8071910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:49.150 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.150+0000 7fa1fe5f0640 1 --2- 192.168.123.108:0/1231461832 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa1f8071510 0x7fa1f8071910 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fa1e80099b0 tx=0x7fa1e802f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:49.150 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.150+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/1231461832 shutdown_connections 2026-03-09T19:22:49.150 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.150+0000 7fa1fe5f0640 1 --2- 192.168.123.108:0/1231461832 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa1f8071510 0x7fa1f8071910 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:49.150 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.150+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/1231461832 >> 192.168.123.108:0/1231461832 conn(0x7fa1f806cfb0 msgr2=0x7fa1f806f3f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:49.150 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.151+0000 7fa1fe5f0640 1 -- 
192.168.123.108:0/1231461832 shutdown_connections 2026-03-09T19:22:49.150 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.151+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/1231461832 wait complete. 2026-03-09T19:22:49.150 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.151+0000 7fa1fe5f0640 1 Processor -- start 2026-03-09T19:22:49.150 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.151+0000 7fa1fe5f0640 1 -- start start 2026-03-09T19:22:49.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.152+0000 7fa1fe5f0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa1f8071510 0x7fa1f80753b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:49.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.152+0000 7fa1fe5f0640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa1f80758f0 con 0x7fa1f8071510 2026-03-09T19:22:49.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.152+0000 7fa1fd5ee640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa1f8071510 0x7fa1f80753b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:49.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.152+0000 7fa1fd5ee640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa1f8071510 0x7fa1f80753b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:52136/0 (socket says 192.168.123.108:52136) 2026-03-09T19:22:49.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.152+0000 7fa1fd5ee640 1 -- 192.168.123.108:0/3877526429 learned_addr learned my addr 192.168.123.108:0/3877526429 (peer_addr_for_me v2:192.168.123.108:0/0) 
2026-03-09T19:22:49.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.155+0000 7fa1fd5ee640 1 -- 192.168.123.108:0/3877526429 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa1e8009660 con 0x7fa1f8071510 2026-03-09T19:22:49.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.155+0000 7fa1fd5ee640 1 --2- 192.168.123.108:0/3877526429 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa1f8071510 0x7fa1f80753b0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7fa1e8009ae0 tx=0x7fa1e8004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:49.155 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.156+0000 7fa1e67fc640 1 -- 192.168.123.108:0/3877526429 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa1e80043b0 con 0x7fa1f8071510 2026-03-09T19:22:49.155 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.156+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/3877526429 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa1f8075af0 con 0x7fa1f8071510 2026-03-09T19:22:49.155 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.156+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/3877526429 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa1f8075f90 con 0x7fa1f8071510 2026-03-09T19:22:49.155 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.156+0000 7fa1e67fc640 1 -- 192.168.123.108:0/3877526429 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa1e8038b40 con 0x7fa1f8071510 2026-03-09T19:22:49.155 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.156+0000 7fa1e67fc640 1 -- 192.168.123.108:0/3877526429 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 
0x7fa1e80419c0 con 0x7fa1f8071510 2026-03-09T19:22:49.156 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.157+0000 7fa1e67fc640 1 -- 192.168.123.108:0/3877526429 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 15) v1 ==== 49164+0+0 (secure 0 0 0) 0x7fa1e8041b20 con 0x7fa1f8071510 2026-03-09T19:22:49.156 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.157+0000 7fa1e67fc640 1 -- 192.168.123.108:0/3877526429 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7fa1e8076d90 con 0x7fa1f8071510 2026-03-09T19:22:49.157 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.158+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/3877526429 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa1c8005350 con 0x7fa1f8071510 2026-03-09T19:22:49.174 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.165+0000 7fa1e67fc640 1 -- 192.168.123.108:0/3877526429 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa1e8037bb0 con 0x7fa1f8071510 2026-03-09T19:22:49.286 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.285+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/3877526429 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fa1c80051c0 con 0x7fa1f8071510 2026-03-09T19:22:49.286 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.287+0000 7fa1e67fc640 1 -- 192.168.123.108:0/3877526429 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fa1e80373d0 con 0x7fa1f8071510 2026-03-09T19:22:49.286 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:49.286 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:49.286 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:49.290 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.291+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/3877526429 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa1f8071510 msgr2=0x7fa1f80753b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:49.290 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.291+0000 7fa1fe5f0640 1 --2- 192.168.123.108:0/3877526429 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa1f8071510 0x7fa1f80753b0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7fa1e8009ae0 tx=0x7fa1e8004290 comp rx=0 tx=0).stop 2026-03-09T19:22:49.290 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.292+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/3877526429 shutdown_connections 2026-03-09T19:22:49.290 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.292+0000 7fa1fe5f0640 1 --2- 192.168.123.108:0/3877526429 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa1f8071510 0x7fa1f80753b0 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T19:22:49.291 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.292+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/3877526429 >> 192.168.123.108:0/3877526429 conn(0x7fa1f806cfb0 msgr2=0x7fa1f806dc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:49.291 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.293+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/3877526429 shutdown_connections 2026-03-09T19:22:49.291 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:49.293+0000 7fa1fe5f0640 1 -- 192.168.123.108:0/3877526429 wait complete. 2026-03-09T19:22:49.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:49 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:49.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:49 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:49.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:49 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/3877526429' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:49.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:49 vm07 ceph-mon[48545]: mgrmap e16: vm07.xacuym(active, since 1.0078s) 2026-03-09T19:22:49.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:49 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:49.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:49 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:49.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:49 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:50.364 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:50.364 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:50.531 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:50.580 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.832+0000 7fde43577640 1 -- 192.168.123.108:0/984774615 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fde44071510 msgr2=0x7fde44071910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.832+0000 7fde43577640 1 --2- 192.168.123.108:0/984774615 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fde44071510 0x7fde44071910 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7fde2c0099b0 tx=0x7fde2c02f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.832+0000 7fde43577640 1 -- 192.168.123.108:0/984774615 shutdown_connections 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.832+0000 7fde43577640 1 --2- 192.168.123.108:0/984774615 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fde44071510 0x7fde44071910 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.832+0000 7fde43577640 1 -- 192.168.123.108:0/984774615 >> 192.168.123.108:0/984774615 conn(0x7fde4406cfb0 msgr2=0x7fde4406f3f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.832+0000 7fde43577640 1 -- 192.168.123.108:0/984774615 
shutdown_connections 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.832+0000 7fde43577640 1 -- 192.168.123.108:0/984774615 wait complete. 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.832+0000 7fde43577640 1 Processor -- start 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.833+0000 7fde43577640 1 -- start start 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.833+0000 7fde43577640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fde44071510 0x7fde441a6620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.833+0000 7fde43577640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fde2c002dc0 con 0x7fde44071510 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.833+0000 7fde42575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fde44071510 0x7fde441a6620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.833+0000 7fde42575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fde44071510 0x7fde441a6620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:52156/0 (socket says 192.168.123.108:52156) 2026-03-09T19:22:50.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.833+0000 7fde42575640 1 -- 192.168.123.108:0/3175136600 learned_addr learned my addr 192.168.123.108:0/3175136600 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:50.832 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.833+0000 7fde42575640 1 -- 192.168.123.108:0/3175136600 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fde2c009660 con 0x7fde44071510 2026-03-09T19:22:50.832 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.833+0000 7fde42575640 1 --2- 192.168.123.108:0/3175136600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fde44071510 0x7fde441a6620 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fde2c002f30 tx=0x7fde2c0043d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:50.832 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.833+0000 7fde3b7fe640 1 -- 192.168.123.108:0/3175136600 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fde2c03d070 con 0x7fde44071510 2026-03-09T19:22:50.833 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.834+0000 7fde43577640 1 -- 192.168.123.108:0/3175136600 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fde441a6b60 con 0x7fde44071510 2026-03-09T19:22:50.833 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.834+0000 7fde43577640 1 -- 192.168.123.108:0/3175136600 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fde441a7000 con 0x7fde44071510 2026-03-09T19:22:50.833 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.834+0000 7fde3b7fe640 1 -- 192.168.123.108:0/3175136600 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fde2c038d60 con 0x7fde44071510 2026-03-09T19:22:50.833 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.834+0000 7fde3b7fe640 1 -- 192.168.123.108:0/3175136600 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fde2c041d30 con 
0x7fde44071510 2026-03-09T19:22:50.833 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.835+0000 7fde397fa640 1 -- 192.168.123.108:0/3175136600 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fde4410d120 con 0x7fde44071510 2026-03-09T19:22:50.833 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.835+0000 7fde3b7fe640 1 -- 192.168.123.108:0/3175136600 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 16) v1 ==== 49291+0+0 (secure 0 0 0) 0x7fde2c049050 con 0x7fde44071510 2026-03-09T19:22:50.834 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.835+0000 7fde3b7fe640 1 --2- 192.168.123.108:0/3175136600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fde1003d1b0 0x7fde1003f670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:50.834 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.835+0000 7fde3b7fe640 1 -- 192.168.123.108:0/3175136600 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7fde2c075cf0 con 0x7fde44071510 2026-03-09T19:22:50.834 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.836+0000 7fde41d74640 1 --2- 192.168.123.108:0/3175136600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fde1003d1b0 0x7fde1003f670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:50.834 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.836+0000 7fde41d74640 1 --2- 192.168.123.108:0/3175136600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fde1003d1b0 0x7fde1003f670 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fde340099c0 tx=0x7fde34006eb0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-09T19:22:50.836 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.838+0000 7fde3b7fe640 1 -- 192.168.123.108:0/3175136600 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fde2c041550 con 0x7fde44071510 2026-03-09T19:22:50.976 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.975+0000 7fde397fa640 1 -- 192.168.123.108:0/3175136600 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fde44071910 con 0x7fde44071510 2026-03-09T19:22:50.980 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:50.980 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:50.981 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.975+0000 7fde3b7fe640 1 -- 192.168.123.108:0/3175136600 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fde2c040b60 con 0x7fde44071510 2026-03-09T19:22:50.981 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:50.982 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.982+0000 7fde43577640 1 -- 192.168.123.108:0/3175136600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fde1003d1b0 msgr2=0x7fde1003f670 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:50.982 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.982+0000 7fde43577640 1 --2- 192.168.123.108:0/3175136600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fde1003d1b0 0x7fde1003f670 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fde340099c0 tx=0x7fde34006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:50.983 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.982+0000 7fde43577640 1 -- 192.168.123.108:0/3175136600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fde44071510 msgr2=0x7fde441a6620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:50.983 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.982+0000 7fde43577640 1 --2- 192.168.123.108:0/3175136600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fde44071510 0x7fde441a6620 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fde2c002f30 tx=0x7fde2c0043d0 comp rx=0 tx=0).stop 2026-03-09T19:22:50.983 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.982+0000 7fde43577640 1 -- 192.168.123.108:0/3175136600 shutdown_connections 2026-03-09T19:22:50.983 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.982+0000 7fde43577640 1 --2- 192.168.123.108:0/3175136600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fde1003d1b0 0x7fde1003f670 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:50.983 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.982+0000 7fde43577640 1 --2- 192.168.123.108:0/3175136600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fde44071510 0x7fde441a6620 secure :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fde2c002f30 tx=0x7fde2c0043d0 comp rx=0 tx=0).stop 2026-03-09T19:22:50.983 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.982+0000 7fde43577640 1 -- 192.168.123.108:0/3175136600 >> 192.168.123.108:0/3175136600 conn(0x7fde4406cfb0 msgr2=0x7fde4406f3f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:50.983 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.982+0000 7fde43577640 1 -- 192.168.123.108:0/3175136600 shutdown_connections 2026-03-09T19:22:50.983 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:50.983+0000 7fde43577640 1 -- 192.168.123.108:0/3175136600 wait complete. 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: [09/Mar/2026:19:22:49] ENGINE Bus STARTING 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: [09/Mar/2026:19:22:49] ENGINE Serving on http://192.168.123.107:8765 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: [09/Mar/2026:19:22:49] ENGINE Serving on https://192.168.123.107:7150 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: [09/Mar/2026:19:22:49] ENGINE Bus STARTED 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: [09/Mar/2026:19:22:49] ENGINE Client ('192.168.123.107', 42122) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' 
entity='mgr.vm07.xacuym' 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:22:51.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:51 vm07 ceph-mon[48545]: from='client.? 
192.168.123.108:0/3175136600' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:52.031 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T19:22:52.031 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:52.216 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:22:52.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: Updating vm07:/etc/ceph/ceph.conf 2026-03-09T19:22:52.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: Updating vm08:/etc/ceph/ceph.conf 2026-03-09T19:22:52.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: mgrmap e17: vm07.xacuym(active, since 2s) 2026-03-09T19:22:52.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:22:52.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:22:52.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:52.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:52.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:52.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:22:52 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:52.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:52.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:22:52.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T19:22:52.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:52 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:22:52.861 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.861+0000 7f7079cbc640 1 -- 192.168.123.108:0/1067902495 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7074108480 msgr2=0x7f7074108860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:52.861 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.862+0000 7f7079cbc640 1 --2- 192.168.123.108:0/1067902495 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7074108480 0x7f7074108860 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f70640099b0 tx=0x7f706402f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:52.861 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.862+0000 7f7079cbc640 1 -- 
192.168.123.108:0/1067902495 shutdown_connections 2026-03-09T19:22:52.861 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.862+0000 7f7079cbc640 1 --2- 192.168.123.108:0/1067902495 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7074108480 0x7f7074108860 secure :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f70640099b0 tx=0x7f706402f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:52.861 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.862+0000 7f7079cbc640 1 -- 192.168.123.108:0/1067902495 >> 192.168.123.108:0/1067902495 conn(0x7f70740fe0d0 msgr2=0x7f70741004f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:52.861 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.863+0000 7f7079cbc640 1 -- 192.168.123.108:0/1067902495 shutdown_connections 2026-03-09T19:22:52.861 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.863+0000 7f7079cbc640 1 -- 192.168.123.108:0/1067902495 wait complete. 2026-03-09T19:22:52.862 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.863+0000 7f7079cbc640 1 Processor -- start 2026-03-09T19:22:52.862 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.863+0000 7f7079cbc640 1 -- start start 2026-03-09T19:22:52.862 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.864+0000 7f7079cbc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7074199e20 0x7f707419a200 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:52.862 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.864+0000 7f7079cbc640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f707419a740 con 0x7f7074199e20 2026-03-09T19:22:52.862 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.864+0000 7f70737fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7074199e20 0x7f707419a200 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:52.863 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.864+0000 7f70737fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7074199e20 0x7f707419a200 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:52168/0 (socket says 192.168.123.108:52168) 2026-03-09T19:22:52.863 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.864+0000 7f70737fe640 1 -- 192.168.123.108:0/2633401742 learned_addr learned my addr 192.168.123.108:0/2633401742 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:52.863 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.864+0000 7f70737fe640 1 -- 192.168.123.108:0/2633401742 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7064009660 con 0x7f7074199e20 2026-03-09T19:22:52.863 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.865+0000 7f70737fe640 1 --2- 192.168.123.108:0/2633401742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7074199e20 0x7f707419a200 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f7064002ba0 tx=0x7f70640042f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:52.863 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.865+0000 7f7070ff9640 1 -- 192.168.123.108:0/2633401742 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f706403c070 con 0x7f7074199e20 2026-03-09T19:22:52.863 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.865+0000 7f7070ff9640 1 -- 192.168.123.108:0/2633401742 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f706403d040 con 0x7f7074199e20 2026-03-09T19:22:52.864 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.865+0000 7f7070ff9640 1 -- 192.168.123.108:0/2633401742 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f706402fb30 con 0x7f7074199e20 2026-03-09T19:22:52.864 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.865+0000 7f7079cbc640 1 -- 192.168.123.108:0/2633401742 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f707419a940 con 0x7f7074199e20 2026-03-09T19:22:52.864 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.865+0000 7f7079cbc640 1 -- 192.168.123.108:0/2633401742 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f707419e930 con 0x7f7074199e20 2026-03-09T19:22:52.864 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.866+0000 7f7079cbc640 1 -- 192.168.123.108:0/2633401742 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7040005350 con 0x7f7074199e20 2026-03-09T19:22:52.869 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.871+0000 7f7070ff9640 1 -- 192.168.123.108:0/2633401742 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 17) v1 ==== 49489+0+0 (secure 0 0 0) 0x7f7064038470 con 0x7f7074199e20 2026-03-09T19:22:52.869 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.871+0000 7f7070ff9640 1 --2- 192.168.123.108:0/2633401742 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f705003d0a0 0x7f705003f560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:52.869 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.871+0000 7f7070ff9640 1 -- 192.168.123.108:0/2633401742 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f7064076820 con 0x7f7074199e20 2026-03-09T19:22:52.870 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.871+0000 7f7070ff9640 1 -- 192.168.123.108:0/2633401742 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f7064076c50 con 0x7f7074199e20 2026-03-09T19:22:52.870 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.871+0000 7f7072ffd640 1 --2- 192.168.123.108:0/2633401742 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f705003d0a0 0x7f705003f560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:52.874 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:52.876+0000 7f7072ffd640 1 --2- 192.168.123.108:0/2633401742 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f705003d0a0 0x7f705003f560 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f70600099c0 tx=0x7f7060006eb0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:52.999 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.000+0000 7f7079cbc640 1 -- 192.168.123.108:0/2633401742 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f7040005600 con 0x7f7074199e20 2026-03-09T19:22:53.000 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.001+0000 7f7070ff9640 1 -- 192.168.123.108:0/2633401742 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7f7064047030 con 0x7f7074199e20 2026-03-09T19:22:53.002 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:53.002 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:53.002 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:53.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.007+0000 7f7079cbc640 1 -- 192.168.123.108:0/2633401742 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f705003d0a0 msgr2=0x7f705003f560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:53.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.007+0000 7f7079cbc640 1 --2- 192.168.123.108:0/2633401742 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f705003d0a0 0x7f705003f560 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f70600099c0 tx=0x7f7060006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:53.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.007+0000 7f7079cbc640 1 -- 192.168.123.108:0/2633401742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7074199e20 msgr2=0x7f707419a200 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:53.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.007+0000 7f7079cbc640 1 --2- 192.168.123.108:0/2633401742 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7074199e20 0x7f707419a200 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f7064002ba0 tx=0x7f70640042f0 comp rx=0 tx=0).stop 2026-03-09T19:22:53.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.007+0000 7f7079cbc640 1 -- 192.168.123.108:0/2633401742 shutdown_connections 2026-03-09T19:22:53.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.007+0000 7f7079cbc640 1 --2- 192.168.123.108:0/2633401742 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f705003d0a0 0x7f705003f560 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:53.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.007+0000 7f7079cbc640 1 --2- 192.168.123.108:0/2633401742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7074199e20 0x7f707419a200 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:53.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.007+0000 7f7079cbc640 1 -- 192.168.123.108:0/2633401742 >> 192.168.123.108:0/2633401742 conn(0x7f70740fe0d0 msgr2=0x7f70740ffd80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:53.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.008+0000 7f7079cbc640 1 -- 192.168.123.108:0/2633401742 shutdown_connections 2026-03-09T19:22:53.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:53.008+0000 7f7079cbc640 1 -- 192.168.123.108:0/2633401742 wait complete. 
2026-03-09T19:22:53.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:53 vm07 ceph-mon[48545]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:22:53.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:53 vm07 ceph-mon[48545]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:22:53.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:53 vm07 ceph-mon[48545]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring 2026-03-09T19:22:53.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:53 vm07 ceph-mon[48545]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring 2026-03-09T19:22:53.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:53 vm07 ceph-mon[48545]: Deploying daemon ceph-exporter.vm08 on vm08 2026-03-09T19:22:53.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:53 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/2633401742' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:54.063 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:54.063 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:54.306 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:22:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: Deploying daemon crash.vm08 on vm08 2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:54.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.567+0000 7ff43d895640 1 -- 192.168.123.108:0/2381278853 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4380ff460 msgr2=0x7ff4380ff840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:54.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.567+0000 7ff43d895640 1 --2- 192.168.123.108:0/2381278853 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4380ff460 0x7ff4380ff840 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7ff42c0099b0 tx=0x7ff42c02f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:54.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.568+0000 7ff43d895640 1 -- 192.168.123.108:0/2381278853 shutdown_connections 2026-03-09T19:22:54.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.568+0000 7ff43d895640 1 --2- 192.168.123.108:0/2381278853 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4380ff460 0x7ff4380ff840 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:54.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.568+0000 7ff43d895640 1 -- 192.168.123.108:0/2381278853 >> 192.168.123.108:0/2381278853 conn(0x7ff4380fb310 msgr2=0x7ff4380fd700 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:54.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.568+0000 7ff43d895640 1 -- 192.168.123.108:0/2381278853 shutdown_connections 2026-03-09T19:22:54.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.568+0000 7ff43d895640 1 -- 192.168.123.108:0/2381278853 wait complete. 2026-03-09T19:22:54.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.569+0000 7ff43d895640 1 Processor -- start 2026-03-09T19:22:54.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.569+0000 7ff43d895640 1 -- start start 2026-03-09T19:22:54.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.569+0000 7ff43d895640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4380ff460 0x7ff43819b600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:54.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.569+0000 7ff43d895640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff43819bb40 con 0x7ff4380ff460 2026-03-09T19:22:54.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.570+0000 7ff436ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4380ff460 0x7ff43819b600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:54.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.570+0000 7ff436ffd640 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4380ff460 0x7ff43819b600 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:52194/0 (socket says 192.168.123.108:52194) 2026-03-09T19:22:54.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.570+0000 7ff436ffd640 1 -- 192.168.123.108:0/344433611 learned_addr learned my addr 192.168.123.108:0/344433611 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:54.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.570+0000 7ff436ffd640 1 -- 192.168.123.108:0/344433611 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff42c009660 con 0x7ff4380ff460 2026-03-09T19:22:54.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.570+0000 7ff436ffd640 1 --2- 192.168.123.108:0/344433611 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4380ff460 0x7ff43819b600 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7ff42c002410 tx=0x7ff42c004060 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:54.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.571+0000 7ff43c893640 1 -- 192.168.123.108:0/344433611 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff42c0043e0 con 0x7ff4380ff460 2026-03-09T19:22:54.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.571+0000 7ff43d895640 1 -- 192.168.123.108:0/344433611 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff43819bd40 con 0x7ff4380ff460 2026-03-09T19:22:54.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.571+0000 7ff43d895640 1 -- 192.168.123.108:0/344433611 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 
0x7ff4381956b0 con 0x7ff4380ff460 2026-03-09T19:22:54.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.571+0000 7ff43c893640 1 -- 192.168.123.108:0/344433611 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff42c038930 con 0x7ff4380ff460 2026-03-09T19:22:54.569 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.571+0000 7ff43c893640 1 -- 192.168.123.108:0/344433611 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff42c041760 con 0x7ff4380ff460 2026-03-09T19:22:54.571 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.572+0000 7ff43c893640 1 -- 192.168.123.108:0/344433611 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 17) v1 ==== 49489+0+0 (secure 0 0 0) 0x7ff42c038aa0 con 0x7ff4380ff460 2026-03-09T19:22:54.571 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.572+0000 7ff43d895640 1 -- 192.168.123.108:0/344433611 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff43819c160 con 0x7ff4380ff460 2026-03-09T19:22:54.571 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.572+0000 7ff43c893640 1 --2- 192.168.123.108:0/344433611 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff40c03d370 0x7ff40c03f830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:54.571 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.572+0000 7ff43c893640 1 -- 192.168.123.108:0/344433611 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7ff42c0762f0 con 0x7ff4380ff460 2026-03-09T19:22:54.573 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.575+0000 7ff4367fc640 1 --2- 192.168.123.108:0/344433611 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff40c03d370 0x7ff40c03f830 
unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:54.573 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.575+0000 7ff43c893640 1 -- 192.168.123.108:0/344433611 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff42c035320 con 0x7ff4380ff460 2026-03-09T19:22:54.573 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.575+0000 7ff4367fc640 1 --2- 192.168.123.108:0/344433611 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff40c03d370 0x7ff40c03f830 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7ff4200099c0 tx=0x7ff420006eb0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:54.697 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.698+0000 7ff43d895640 1 -- 192.168.123.108:0/344433611 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7ff438196bf0 con 0x7ff4380ff460 2026-03-09T19:22:54.697 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.698+0000 7ff43c893640 1 -- 192.168.123.108:0/344433611 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7ff42c037b80 con 0x7ff4380ff460 2026-03-09T19:22:54.697 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:54.697 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:54.697 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:54.699 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.701+0000 7ff43d895640 1 -- 192.168.123.108:0/344433611 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff40c03d370 msgr2=0x7ff40c03f830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:54.699 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.701+0000 7ff43d895640 1 --2- 192.168.123.108:0/344433611 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff40c03d370 0x7ff40c03f830 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7ff4200099c0 tx=0x7ff420006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:54.699 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.701+0000 7ff43d895640 1 -- 192.168.123.108:0/344433611 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4380ff460 msgr2=0x7ff43819b600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:54.699 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.701+0000 7ff43d895640 1 --2- 192.168.123.108:0/344433611 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4380ff460 0x7ff43819b600 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7ff42c002410 tx=0x7ff42c004060 comp rx=0 tx=0).stop 2026-03-09T19:22:54.699 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.701+0000 7ff43d895640 1 -- 192.168.123.108:0/344433611 shutdown_connections 2026-03-09T19:22:54.699 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.701+0000 7ff43d895640 1 --2- 192.168.123.108:0/344433611 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff40c03d370 0x7ff40c03f830 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:54.699 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.701+0000 7ff43d895640 1 --2- 192.168.123.108:0/344433611 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4380ff460 0x7ff43819b600 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:54.699 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.701+0000 7ff43d895640 1 -- 192.168.123.108:0/344433611 >> 192.168.123.108:0/344433611 conn(0x7ff4380fb310 msgr2=0x7ff4380fb6f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:54.699 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.701+0000 7ff43d895640 1 -- 192.168.123.108:0/344433611 shutdown_connections 2026-03-09T19:22:54.700 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:54.701+0000 7ff43d895640 1 -- 192.168.123.108:0/344433611 wait complete. 2026-03-09T19:22:55.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:55 vm07 ceph-mon[48545]: Deploying daemon node-exporter.vm08 on vm08 2026-03-09T19:22:55.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:55 vm07 ceph-mon[48545]: from='client.? 
192.168.123.108:0/344433611' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:55.757 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T19:22:55.757 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:55.899 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:22:56.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.172+0000 7ff0aa083640 1 -- 192.168.123.108:0/1495467303 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff0a410b7f0 msgr2=0x7ff0a410bbd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:56.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.172+0000 7ff0aa083640 1 --2- 192.168.123.108:0/1495467303 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff0a410b7f0 0x7ff0a410bbd0 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7ff0940099b0 tx=0x7ff09402f2b0 comp rx=0 tx=0).stop 2026-03-09T19:22:56.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.172+0000 7ff0aa083640 1 -- 192.168.123.108:0/1495467303 shutdown_connections 2026-03-09T19:22:56.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.172+0000 7ff0aa083640 1 --2- 192.168.123.108:0/1495467303 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff0a410b7f0 0x7ff0a410bbd0 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:56.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.172+0000 7ff0aa083640 1 -- 192.168.123.108:0/1495467303 >> 192.168.123.108:0/1495467303 conn(0x7ff0a406a350 msgr2=0x7ff0a406a760 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T19:22:56.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.172+0000 7ff0aa083640 1 -- 192.168.123.108:0/1495467303 shutdown_connections 2026-03-09T19:22:56.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.172+0000 7ff0aa083640 1 -- 192.168.123.108:0/1495467303 wait complete. 2026-03-09T19:22:56.172 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.174+0000 7ff0aa083640 1 Processor -- start 2026-03-09T19:22:56.172 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.174+0000 7ff0aa083640 1 -- start start 2026-03-09T19:22:56.173 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.174+0000 7ff0aa083640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff0a410b7f0 0x7ff0a41a23c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:56.173 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.174+0000 7ff0aa083640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff0a41a2900 con 0x7ff0a410b7f0 2026-03-09T19:22:56.173 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.174+0000 7ff0a9081640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff0a410b7f0 0x7ff0a41a23c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:56.174 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.175+0000 7ff0a9081640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff0a410b7f0 0x7ff0a41a23c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:52216/0 (socket says 192.168.123.108:52216) 2026-03-09T19:22:56.174 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.175+0000 7ff0a9081640 1 -- 192.168.123.108:0/2107481504 
learned_addr learned my addr 192.168.123.108:0/2107481504 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:56.174 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.176+0000 7ff0a9081640 1 -- 192.168.123.108:0/2107481504 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff094009660 con 0x7ff0a410b7f0 2026-03-09T19:22:56.174 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.176+0000 7ff0a9081640 1 --2- 192.168.123.108:0/2107481504 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff0a410b7f0 0x7ff0a41a23c0 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7ff094004290 tx=0x7ff0940042c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:56.175 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.177+0000 7ff0927fc640 1 -- 192.168.123.108:0/2107481504 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff0940043b0 con 0x7ff0a410b7f0 2026-03-09T19:22:56.175 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.177+0000 7ff0927fc640 1 -- 192.168.123.108:0/2107481504 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff094038b40 con 0x7ff0a410b7f0 2026-03-09T19:22:56.175 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.177+0000 7ff0aa083640 1 -- 192.168.123.108:0/2107481504 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff0a41a2b00 con 0x7ff0a410b7f0 2026-03-09T19:22:56.175 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.177+0000 7ff0aa083640 1 -- 192.168.123.108:0/2107481504 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff0a41a2ec0 con 0x7ff0a410b7f0 2026-03-09T19:22:56.177 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.179+0000 7ff0927fc640 1 -- 
192.168.123.108:0/2107481504 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff094041a10 con 0x7ff0a410b7f0 2026-03-09T19:22:56.179 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.181+0000 7ff073fff640 1 -- 192.168.123.108:0/2107481504 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff0a410c5a0 con 0x7ff0a410b7f0 2026-03-09T19:22:56.179 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.181+0000 7ff0927fc640 1 -- 192.168.123.108:0/2107481504 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 17) v1 ==== 49489+0+0 (secure 0 0 0) 0x7ff094041c30 con 0x7ff0a410b7f0 2026-03-09T19:22:56.179 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.181+0000 7ff0927fc640 1 --2- 192.168.123.108:0/2107481504 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff07403d410 0x7ff07403f8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:56.180 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.181+0000 7ff0927fc640 1 -- 192.168.123.108:0/2107481504 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7ff0940771f0 con 0x7ff0a410b7f0 2026-03-09T19:22:56.182 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.183+0000 7ff0a8880640 1 --2- 192.168.123.108:0/2107481504 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff07403d410 0x7ff07403f8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:56.182 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.184+0000 7ff0927fc640 1 -- 192.168.123.108:0/2107481504 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 
0 0 0) 0x7ff09404b460 con 0x7ff0a410b7f0 2026-03-09T19:22:56.184 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.186+0000 7ff0a8880640 1 --2- 192.168.123.108:0/2107481504 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff07403d410 0x7ff07403f8d0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7ff0980099c0 tx=0x7ff098006eb0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:56.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.337+0000 7ff073fff640 1 -- 192.168.123.108:0/2107481504 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7ff0a41152e0 con 0x7ff0a410b7f0 2026-03-09T19:22:56.338 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.339+0000 7ff0927fc640 1 -- 192.168.123.108:0/2107481504 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7ff094037840 con 0x7ff0a410b7f0 2026-03-09T19:22:56.338 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:56.338 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 
2026-03-09T19:22:56.338 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:56.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.341+0000 7ff073fff640 1 -- 192.168.123.108:0/2107481504 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff07403d410 msgr2=0x7ff07403f8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:56.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.341+0000 7ff073fff640 1 --2- 192.168.123.108:0/2107481504 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff07403d410 0x7ff07403f8d0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7ff0980099c0 tx=0x7ff098006eb0 comp rx=0 tx=0).stop 2026-03-09T19:22:56.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.341+0000 7ff073fff640 1 -- 192.168.123.108:0/2107481504 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff0a410b7f0 msgr2=0x7ff0a41a23c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:56.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.341+0000 7ff073fff640 1 --2- 192.168.123.108:0/2107481504 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff0a410b7f0 0x7ff0a41a23c0 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7ff094004290 tx=0x7ff0940042c0 comp rx=0 tx=0).stop 2026-03-09T19:22:56.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.341+0000 7ff073fff640 1 -- 192.168.123.108:0/2107481504 shutdown_connections 2026-03-09T19:22:56.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.341+0000 7ff073fff640 1 --2- 192.168.123.108:0/2107481504 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff07403d410 0x7ff07403f8d0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:56.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.341+0000 
7ff073fff640 1 --2- 192.168.123.108:0/2107481504 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff0a410b7f0 0x7ff0a41a23c0 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:56.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.341+0000 7ff073fff640 1 -- 192.168.123.108:0/2107481504 >> 192.168.123.108:0/2107481504 conn(0x7ff0a406a350 msgr2=0x7ff0a4112b80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:56.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.342+0000 7ff073fff640 1 -- 192.168.123.108:0/2107481504 shutdown_connections 2026-03-09T19:22:56.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:56.342+0000 7ff073fff640 1 -- 192.168.123.108:0/2107481504 wait complete. 2026-03-09T19:22:56.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:56 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/2107481504' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:57.383 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:57.383 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:57.642 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:22:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:57.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.mxylvw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T19:22:57.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm08.mxylvw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished 2026-03-09T19:22:57.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch 
2026-03-09T19:22:57.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:22:57.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: Deploying daemon mgr.vm08.mxylvw on vm08 2026-03-09T19:22:57.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:57.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:57.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:57.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:57.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T19:22:57.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:22:58.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.190+0000 7fb54c480640 1 -- 192.168.123.108:0/2857275721 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb54410b940 msgr2=0x7fb54410bd20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:58.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.190+0000 7fb54c480640 1 --2- 192.168.123.108:0/2857275721 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb54410b940 0x7fb54410bd20 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7fb540007920 tx=0x7fb5400300e0 comp rx=0 tx=0).stop 2026-03-09T19:22:58.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.190+0000 7fb54c480640 1 -- 192.168.123.108:0/2857275721 shutdown_connections 2026-03-09T19:22:58.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.190+0000 7fb54c480640 1 --2- 192.168.123.108:0/2857275721 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb54410b940 0x7fb54410bd20 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:58.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.190+0000 7fb54c480640 1 -- 192.168.123.108:0/2857275721 >> 192.168.123.108:0/2857275721 conn(0x7fb54406b1e0 msgr2=0x7fb54406b5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:58.191 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.190+0000 7fb54c480640 1 -- 192.168.123.108:0/2857275721 shutdown_connections 2026-03-09T19:22:58.191 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.190+0000 7fb54c480640 1 -- 192.168.123.108:0/2857275721 wait complete. 
2026-03-09T19:22:58.191 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.192+0000 7fb54c480640 1 Processor -- start 2026-03-09T19:22:58.191 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.192+0000 7fb54c480640 1 -- start start 2026-03-09T19:22:58.191 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.192+0000 7fb54c480640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb544083b50 0x7fb544083f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:58.191 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.192+0000 7fb54c480640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb540002e00 con 0x7fb544083b50 2026-03-09T19:22:58.191 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.192+0000 7fb54a1f5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb544083b50 0x7fb544083f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:58.191 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.192+0000 7fb54a1f5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb544083b50 0x7fb544083f30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:33662/0 (socket says 192.168.123.108:33662) 2026-03-09T19:22:58.191 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.192+0000 7fb54a1f5640 1 -- 192.168.123.108:0/2025705568 learned_addr learned my addr 192.168.123.108:0/2025705568 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:22:58.191 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.193+0000 7fb54a1f5640 1 -- 192.168.123.108:0/2025705568 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb5400075d0 con 0x7fb544083b50 2026-03-09T19:22:58.194 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.196+0000 7fb54a1f5640 1 --2- 192.168.123.108:0/2025705568 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb544083b50 0x7fb544083f30 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7fb540030690 tx=0x7fb540002910 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:58.194 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.196+0000 7fb53b7fe640 1 -- 192.168.123.108:0/2025705568 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb540039690 con 0x7fb544083b50 2026-03-09T19:22:58.195 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.196+0000 7fb54c480640 1 -- 192.168.123.108:0/2025705568 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb5440844d0 con 0x7fb544083b50 2026-03-09T19:22:58.195 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.196+0000 7fb54c480640 1 -- 192.168.123.108:0/2025705568 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb54407cdb0 con 0x7fb544083b50 2026-03-09T19:22:58.195 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.197+0000 7fb53b7fe640 1 -- 192.168.123.108:0/2025705568 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb540039cb0 con 0x7fb544083b50 2026-03-09T19:22:58.195 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.197+0000 7fb53b7fe640 1 -- 192.168.123.108:0/2025705568 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb540041ba0 con 0x7fb544083b50 2026-03-09T19:22:58.196 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.198+0000 7fb53b7fe640 1 -- 192.168.123.108:0/2025705568 <== mon.0 
v2:192.168.123.107:3300/0 4 ==== mgrmap(e 17) v1 ==== 49489+0+0 (secure 0 0 0) 0x7fb54004b430 con 0x7fb544083b50 2026-03-09T19:22:58.196 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.198+0000 7fb53b7fe640 1 --2- 192.168.123.108:0/2025705568 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb52403d410 0x7fb52403f8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:22:58.197 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.198+0000 7fb53b7fe640 1 -- 192.168.123.108:0/2025705568 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7fb540077d00 con 0x7fb544083b50 2026-03-09T19:22:58.198 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.199+0000 7fb5499f4640 1 --2- 192.168.123.108:0/2025705568 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb52403d410 0x7fb52403f8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:22:58.198 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.200+0000 7fb5499f4640 1 --2- 192.168.123.108:0/2025705568 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb52403d410 0x7fb52403f8d0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fb53c00ad30 tx=0x7fb53c0093f0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:22:58.199 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.200+0000 7fb54c480640 1 -- 192.168.123.108:0/2025705568 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb514005350 con 0x7fb544083b50 2026-03-09T19:22:58.202 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.203+0000 7fb53b7fe640 1 -- 192.168.123.108:0/2025705568 <== mon.0 
v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fb540046020 con 0x7fb544083b50 2026-03-09T19:22:58.428 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.429+0000 7fb54c480640 1 -- 192.168.123.108:0/2025705568 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb5140051c0 con 0x7fb544083b50 2026-03-09T19:22:58.428 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.430+0000 7fb53b7fe640 1 -- 192.168.123.108:0/2025705568 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+747 (secure 0 0 0) 0x7fb540036b00 con 0x7fb544083b50 2026-03-09T19:22:58.430 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:22:58.430 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:21:44.113262Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T19:22:58.430 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-09T19:22:58.432 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.434+0000 7fb54c480640 1 -- 192.168.123.108:0/2025705568 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] 
conn(0x7fb52403d410 msgr2=0x7fb52403f8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:58.433 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.434+0000 7fb54c480640 1 --2- 192.168.123.108:0/2025705568 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb52403d410 0x7fb52403f8d0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fb53c00ad30 tx=0x7fb53c0093f0 comp rx=0 tx=0).stop 2026-03-09T19:22:58.433 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.434+0000 7fb54c480640 1 -- 192.168.123.108:0/2025705568 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb544083b50 msgr2=0x7fb544083f30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:22:58.433 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.434+0000 7fb54c480640 1 --2- 192.168.123.108:0/2025705568 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb544083b50 0x7fb544083f30 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7fb540030690 tx=0x7fb540002910 comp rx=0 tx=0).stop 2026-03-09T19:22:58.433 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.434+0000 7fb54c480640 1 -- 192.168.123.108:0/2025705568 shutdown_connections 2026-03-09T19:22:58.433 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.434+0000 7fb54c480640 1 --2- 192.168.123.108:0/2025705568 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb52403d410 0x7fb52403f8d0 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:58.433 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.434+0000 7fb54c480640 1 --2- 192.168.123.108:0/2025705568 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb544083b50 0x7fb544083f30 unknown :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:22:58.433 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.434+0000 7fb54c480640 1 -- 192.168.123.108:0/2025705568 >> 192.168.123.108:0/2025705568 conn(0x7fb54406b1e0 msgr2=0x7fb544073e80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:22:58.433 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.434+0000 7fb54c480640 1 -- 192.168.123.108:0/2025705568 shutdown_connections 2026-03-09T19:22:58.433 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:22:58.434+0000 7fb54c480640 1 -- 192.168.123.108:0/2025705568 wait complete. 2026-03-09T19:22:58.849 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:22:58 vm08 ceph-mon[57794]: mon.vm08@-1(synchronizing).paxosservice(auth 1..8) refresh upgraded, format 0 -> 3 2026-03-09T19:22:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:58 vm07 ceph-mon[48545]: Deploying daemon mon.vm08 on vm08 2026-03-09T19:22:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:58 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:22:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:22:58 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/2025705568' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:22:59.490 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T19:22:59.490 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mon dump -f json 2026-03-09T19:22:59.664 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm08/config 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: mon.vm07 calling monitor election 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.? 
192.168.123.108:0/1497782998' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/crt"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: mon.vm08 calling monitor election 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: mon.vm07 is new leader, mons vm07,vm08 in quorum (ranks 0,1) 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: monmap e2: 2 mons at {vm07=[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0],vm08=[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 
ceph-mon[57794]: fsmap 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: osdmap e5: 0 total, 0 up, 0 in 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: mgrmap e17: vm07.xacuym(active, since 15s) 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: overall HEALTH_OK 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: Standby manager daemon vm08.mxylvw started 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.? 192.168.123.108:0/1497782998' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.? 192.168.123.108:0/1497782998' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/key"}]: dispatch 2026-03-09T19:23:04.041 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:03 vm08 ceph-mon[57794]: from='mgr.? 
192.168.123.108:0/1497782998' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T19:23:04.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:23:04.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: mon.vm07 calling monitor election 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.? 
192.168.123.108:0/1497782998' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/crt"}]: dispatch 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: mon.vm08 calling monitor election 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: mon.vm07 is new leader, mons vm07,vm08 in quorum (ranks 0,1) 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: monmap e2: 2 mons at {vm07=[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0],vm08=[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 
ceph-mon[48545]: fsmap 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: osdmap e5: 0 total, 0 up, 0 in 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: mgrmap e17: vm07.xacuym(active, since 15s) 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: overall HEALTH_OK 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: Standby manager daemon vm08.mxylvw started 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.? 192.168.123.108:0/1497782998' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.? 192.168.123.108:0/1497782998' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/key"}]: dispatch 2026-03-09T19:23:04.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:03 vm07 ceph-mon[48545]: from='mgr.? 
192.168.123.108:0/1497782998' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.335+0000 7f50c6227640 1 -- 192.168.123.108:0/2201006276 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f50a8003630 msgr2=0x7f50a8005ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.335+0000 7f50c6227640 1 --2- 192.168.123.108:0/2201006276 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f50a8003630 0x7f50a8005ac0 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f50c006d6d0 tx=0x7f50bc030b10 comp rx=0 tx=0).stop 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.335+0000 7f50c6227640 1 -- 192.168.123.108:0/2201006276 shutdown_connections 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.335+0000 7f50c6227640 1 --2- 192.168.123.108:0/2201006276 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f50a8003630 0x7f50a8005ac0 secure :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f50c006d6d0 tx=0x7f50bc030b10 comp rx=0 tx=0).stop 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.335+0000 7f50c6227640 1 -- 192.168.123.108:0/2201006276 >> 192.168.123.108:0/2201006276 conn(0x7f50c006c640 msgr2=0x7f50c006ca50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.335+0000 7f50c6227640 1 -- 192.168.123.108:0/2201006276 shutdown_connections 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.335+0000 7f50c6227640 1 -- 192.168.123.108:0/2201006276 wait complete. 
2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.335+0000 7f50c6227640 1 Processor -- start 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.335+0000 7f50c6227640 1 -- start start 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.335+0000 7f50c6227640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f50c01b56e0 0x7f50c01b5ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.336+0000 7f50c6227640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f50c01b6000 0x7f50c01177a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.336+0000 7f50c6227640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f50c0117e60 con 0x7f50c01b56e0 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.336+0000 7f50c6227640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f50c0117fd0 con 0x7f50c01b6000 2026-03-09T19:23:04.335 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.336+0000 7f50c4a24640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f50c01b6000 0x7f50c01177a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:04.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.336+0000 7f50c4a24640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f50c01b6000 0x7f50c01177a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.108:56662/0 (socket says 192.168.123.108:56662) 2026-03-09T19:23:04.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.336+0000 7f50c4a24640 1 -- 192.168.123.108:0/1614469021 learned_addr learned my addr 192.168.123.108:0/1614469021 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:23:04.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.336+0000 7f50c5225640 1 --2- 192.168.123.108:0/1614469021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f50c01b56e0 0x7f50c01b5ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:04.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.336+0000 7f50c4a24640 1 -- 192.168.123.108:0/1614469021 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f50c01b6000 msgr2=0x7f50c01177a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12 2026-03-09T19:23:04.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.336+0000 7f50c4a24640 1 -- 192.168.123.108:0/1614469021 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f50c01b6000 msgr2=0x7f50c01177a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-09T19:23:04.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.336+0000 7f50c4a24640 1 --2- 192.168.123.108:0/1614469021 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f50c01b6000 0x7f50c01177a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-09T19:23:04.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.336+0000 7f50c4a24640 1 --2- 192.168.123.108:0/1614469021 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f50c01b6000 0x7f50c01177a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:23:04.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.337+0000 7f50c5225640 1 -- 192.168.123.108:0/1614469021 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f50c01b6000 msgr2=0x7f50c01177a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:23:04.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.337+0000 7f50c5225640 1 --2- 192.168.123.108:0/1614469021 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f50c01b6000 0x7f50c01177a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:04.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.337+0000 7f50c5225640 1 -- 192.168.123.108:0/1614469021 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f50bc004560 con 0x7f50c01b56e0 2026-03-09T19:23:04.336 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.337+0000 7f50c5225640 1 --2- 192.168.123.108:0/1614469021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f50c01b56e0 0x7f50c01b5ac0 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f50bc005fb0 tx=0x7f50bc0349b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:04.337 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.337+0000 7f50ae7fc640 1 -- 192.168.123.108:0/1614469021 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f50bc004d00 con 0x7f50c01b56e0 2026-03-09T19:23:04.337 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.337+0000 7f50c6227640 1 -- 192.168.123.108:0/1614469021 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f50c0118250 con 0x7f50c01b56e0 2026-03-09T19:23:04.337 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.337+0000 7f50c6227640 1 -- 
192.168.123.108:0/1614469021 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f50c01ba510 con 0x7f50c01b56e0 2026-03-09T19:23:04.337 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.338+0000 7f50ae7fc640 1 -- 192.168.123.108:0/1614469021 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f50bc004e60 con 0x7f50c01b56e0 2026-03-09T19:23:04.337 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.338+0000 7f50c6227640 1 -- 192.168.123.108:0/1614469021 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f50c0072f00 con 0x7f50c01b56e0 2026-03-09T19:23:04.337 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.338+0000 7f50ae7fc640 1 -- 192.168.123.108:0/1614469021 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f50bc035bc0 con 0x7f50c01b56e0 2026-03-09T19:23:04.339 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.339+0000 7f50ae7fc640 1 -- 192.168.123.108:0/1614469021 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 18) v1 ==== 98424+0+0 (secure 0 0 0) 0x7f50bc035460 con 0x7f50c01b56e0 2026-03-09T19:23:04.339 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.340+0000 7f50ae7fc640 1 --2- 192.168.123.108:0/1614469021 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5098076130 0x7f50980785f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:04.339 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.340+0000 7f50ae7fc640 1 -- 192.168.123.108:0/1614469021 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f50bc0bd030 con 0x7f50c01b56e0 2026-03-09T19:23:04.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.341+0000 7f50c4a24640 1 --2- 
192.168.123.108:0/1614469021 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5098076130 0x7f50980785f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:04.340 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.342+0000 7f50c4a24640 1 --2- 192.168.123.108:0/1614469021 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5098076130 0x7f50980785f0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f50b8009400 tx=0x7f50b8007040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:04.341 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.342+0000 7f50ae7fc640 1 -- 192.168.123.108:0/1614469021 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f50bc087710 con 0x7f50c01b56e0 2026-03-09T19:23:04.471 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.472+0000 7f50c6227640 1 -- 192.168.123.108:0/1614469021 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f50c010c3a0 con 0x7f50c01b56e0 2026-03-09T19:23:04.471 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:23:04.471 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":2,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","modified":"2026-03-09T19:22:58.690412Z","created":"2026-03-09T19:21:44.113262Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"},{"rank":1,"name":"vm08","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:3300","nonce":0},{"type":"v1","addr":"192.168.123.108:6789","nonce":0}]},"addr":"192.168.123.108:6789/0","public_addr":"192.168.123.108:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0,1]} 2026-03-09T19:23:04.471 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.472+0000 7f50ae7fc640 1 -- 192.168.123.108:0/1614469021 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 2 v2) v1 ==== 95+0+1028 (secure 0 0 0) 0x7f50bc083070 con 0x7f50c01b56e0 2026-03-09T19:23:04.471 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 2 2026-03-09T19:23:04.473 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.475+0000 7f50c6227640 1 -- 192.168.123.108:0/1614469021 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5098076130 msgr2=0x7f50980785f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:04.473 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.475+0000 7f50c6227640 1 --2- 192.168.123.108:0/1614469021 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5098076130 0x7f50980785f0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f50b8009400 tx=0x7f50b8007040 comp rx=0 tx=0).stop 2026-03-09T19:23:04.473 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.475+0000 7f50c6227640 1 -- 192.168.123.108:0/1614469021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f50c01b56e0 msgr2=0x7f50c01b5ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:04.473 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.475+0000 7f50c6227640 1 --2- 192.168.123.108:0/1614469021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f50c01b56e0 0x7f50c01b5ac0 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f50bc005fb0 tx=0x7f50bc0349b0 comp rx=0 tx=0).stop 2026-03-09T19:23:04.473 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.475+0000 7f50c6227640 1 -- 192.168.123.108:0/1614469021 shutdown_connections 2026-03-09T19:23:04.473 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.475+0000 7f50c6227640 1 --2- 192.168.123.108:0/1614469021 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5098076130 0x7f50980785f0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:04.473 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.475+0000 7f50c6227640 1 --2- 192.168.123.108:0/1614469021 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f50c01b6000 0x7f50c01177a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:04.473 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.475+0000 7f50c6227640 1 --2- 192.168.123.108:0/1614469021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f50c01b56e0 0x7f50c01b5ac0 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T19:23:04.474 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.475+0000 7f50c6227640 1 -- 192.168.123.108:0/1614469021 >> 192.168.123.108:0/1614469021 conn(0x7f50c006c640 msgr2=0x7f50c010dd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:04.474 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.475+0000 7f50c6227640 1 -- 192.168.123.108:0/1614469021 shutdown_connections 2026-03-09T19:23:04.474 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:04.475+0000 7f50c6227640 1 -- 192.168.123.108:0/1614469021 wait complete. 2026-03-09T19:23:04.518 INFO:tasks.cephadm:Generating final ceph.conf file... 2026-03-09T19:23:04.518 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph config generate-minimal-conf 2026-03-09T19:23:04.689 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:23:04.795 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:04 vm07 ceph-mon[48545]: mgrmap e18: vm07.xacuym(active, since 15s), standbys: vm08.mxylvw 2026-03-09T19:23:04.795 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm08.mxylvw", "id": "vm08.mxylvw"}]: dispatch 2026-03-09T19:23:04.795 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:04.795 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:04.795 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:04 vm07 ceph-mon[48545]: from='mgr.14227 
192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:04.795 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:04.795 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:04 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/1614469021' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:23:04.795 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:04.795 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:04.795 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:04.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.939+0000 7f1747fff640 1 -- 192.168.123.107:0/42262706 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1748106380 msgr2=0x7f1748106760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:04.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.939+0000 7f1747fff640 1 --2- 192.168.123.107:0/42262706 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1748106380 0x7f1748106760 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f1730009a00 tx=0x7f173002f310 comp rx=0 tx=0).stop 2026-03-09T19:23:04.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.940+0000 7f1747fff640 1 -- 192.168.123.107:0/42262706 shutdown_connections 2026-03-09T19:23:04.938 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.940+0000 7f1747fff640 1 --2- 192.168.123.107:0/42262706 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1748106380 0x7f1748106760 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:04.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.940+0000 7f1747fff640 1 -- 192.168.123.107:0/42262706 >> 192.168.123.107:0/42262706 conn(0x7f17480fa000 msgr2=0x7f17480fc420 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:04.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.940+0000 7f1747fff640 1 -- 192.168.123.107:0/42262706 shutdown_connections 2026-03-09T19:23:04.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.940+0000 7f1747fff640 1 -- 192.168.123.107:0/42262706 wait complete. 2026-03-09T19:23:04.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.941+0000 7f1747fff640 1 Processor -- start 2026-03-09T19:23:04.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.941+0000 7f1747fff640 1 -- start start 2026-03-09T19:23:04.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.941+0000 7f1747fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f174819b880 0x7f174819bc60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:04.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.941+0000 7f1747fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f174819c1a0 0x7f1748195970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:04.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.941+0000 7f1747fff640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1748195f40 con 0x7f174819c1a0 2026-03-09T19:23:04.939 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.941+0000 7f17467fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f174819c1a0 0x7f1748195970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:04.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.941+0000 7f17467fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f174819c1a0 0x7f1748195970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:37858/0 (socket says 192.168.123.107:37858) 2026-03-09T19:23:04.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.941+0000 7f17467fc640 1 -- 192.168.123.107:0/59937872 learned_addr learned my addr 192.168.123.107:0/59937872 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:23:04.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.942+0000 7f1747fff640 1 -- 192.168.123.107:0/59937872 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17481960b0 con 0x7f174819b880 2026-03-09T19:23:04.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.942+0000 7f17467fc640 1 -- 192.168.123.107:0/59937872 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f174819b880 msgr2=0x7f174819bc60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:04.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.942+0000 7f17467fc640 1 --2- 192.168.123.107:0/59937872 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f174819b880 0x7f174819bc60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:04.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.942+0000 7f17467fc640 1 -- 192.168.123.107:0/59937872 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1730009660 con 0x7f174819c1a0 2026-03-09T19:23:04.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.942+0000 7f17467fc640 1 --2- 192.168.123.107:0/59937872 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f174819c1a0 0x7f1748195970 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f173c00b700 tx=0x7f173c00bbd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:04.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.942+0000 7f1727fff640 1 -- 192.168.123.107:0/59937872 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f173c00c8e0 con 0x7f174819c1a0 2026-03-09T19:23:04.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.942+0000 7f1727fff640 1 -- 192.168.123.107:0/59937872 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f173c014440 con 0x7f174819c1a0 2026-03-09T19:23:04.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.942+0000 7f1727fff640 1 -- 192.168.123.107:0/59937872 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f173c012550 con 0x7f174819c1a0 2026-03-09T19:23:04.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.942+0000 7f1747fff640 1 -- 192.168.123.107:0/59937872 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1748196390 con 0x7f174819c1a0 2026-03-09T19:23:04.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.943+0000 7f1747fff640 1 -- 192.168.123.107:0/59937872 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f17481a0690 con 0x7f174819c1a0 2026-03-09T19:23:04.945 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.944+0000 7f1747fff640 1 -- 
192.168.123.107:0/59937872 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f170c005350 con 0x7f174819c1a0 2026-03-09T19:23:04.945 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.944+0000 7f1727fff640 1 -- 192.168.123.107:0/59937872 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 18) v1 ==== 98424+0+0 (secure 0 0 0) 0x7f173c01b020 con 0x7f174819c1a0 2026-03-09T19:23:04.945 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.945+0000 7f1727fff640 1 --2- 192.168.123.107:0/59937872 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f171c075f20 0x7f171c0783e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:04.945 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.945+0000 7f1727fff640 1 -- 192.168.123.107:0/59937872 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f173c05f210 con 0x7f174819c1a0 2026-03-09T19:23:04.945 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.947+0000 7f1727fff640 1 -- 192.168.123.107:0/59937872 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f173c09b050 con 0x7f174819c1a0 2026-03-09T19:23:04.945 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.947+0000 7f1746ffd640 1 --2- 192.168.123.107:0/59937872 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f171c075f20 0x7f171c0783e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:04.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:04.948+0000 7f1746ffd640 1 --2- 192.168.123.107:0/59937872 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f171c075f20 
0x7f171c0783e0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f173002f840 tx=0x7f1730005b00 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.038+0000 7f1747fff640 1 -- 192.168.123.107:0/59937872 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7f170c0058d0 con 0x7f174819c1a0 2026-03-09T19:23:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.038+0000 7f1727fff640 1 -- 192.168.123.107:0/59937872 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v10) v1 ==== 76+0+235 (secure 0 0 0) 0x7f173c05f4d0 con 0x7f174819c1a0 2026-03-09T19:23:05.037 INFO:teuthology.orchestra.run.vm07.stdout:# minimal ceph.conf for 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:23:05.037 INFO:teuthology.orchestra.run.vm07.stdout:[global] 2026-03-09T19:23:05.037 INFO:teuthology.orchestra.run.vm07.stdout: fsid = 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:23:05.037 INFO:teuthology.orchestra.run.vm07.stdout: mon_host = [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 2026-03-09T19:23:05.039 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.040+0000 7f1747fff640 1 -- 192.168.123.107:0/59937872 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f171c075f20 msgr2=0x7f171c0783e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:05.039 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.040+0000 7f1747fff640 1 --2- 192.168.123.107:0/59937872 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f171c075f20 0x7f171c0783e0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f173002f840 tx=0x7f1730005b00 comp rx=0 tx=0).stop 
2026-03-09T19:23:05.039 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.040+0000 7f1747fff640 1 -- 192.168.123.107:0/59937872 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f174819c1a0 msgr2=0x7f1748195970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:05.039 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.040+0000 7f1747fff640 1 --2- 192.168.123.107:0/59937872 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f174819c1a0 0x7f1748195970 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f173c00b700 tx=0x7f173c00bbd0 comp rx=0 tx=0).stop 2026-03-09T19:23:05.039 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.041+0000 7f1747fff640 1 -- 192.168.123.107:0/59937872 shutdown_connections 2026-03-09T19:23:05.039 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.041+0000 7f1747fff640 1 --2- 192.168.123.107:0/59937872 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f171c075f20 0x7f171c0783e0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:05.039 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.041+0000 7f1747fff640 1 --2- 192.168.123.107:0/59937872 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f174819c1a0 0x7f1748195970 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:05.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.041+0000 7f1747fff640 1 --2- 192.168.123.107:0/59937872 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f174819b880 0x7f174819bc60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:05.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.041+0000 7f1747fff640 1 -- 192.168.123.107:0/59937872 >> 192.168.123.107:0/59937872 conn(0x7f17480fa000 msgr2=0x7f17480fbb70 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T19:23:05.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.041+0000 7f1747fff640 1 -- 192.168.123.107:0/59937872 shutdown_connections 2026-03-09T19:23:05.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:05.041+0000 7f1747fff640 1 -- 192.168.123.107:0/59937872 wait complete. 2026-03-09T19:23:05.080 INFO:tasks.cephadm:Distributing (final) config and client.admin keyring... 2026-03-09T19:23:05.080 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:23:05.080 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/ceph/ceph.conf 2026-03-09T19:23:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:04 vm08 ceph-mon[57794]: mgrmap e18: vm07.xacuym(active, since 15s), standbys: vm08.mxylvw 2026-03-09T19:23:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm08.mxylvw", "id": "vm08.mxylvw"}]: dispatch 2026-03-09T19:23:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:04 vm08 ceph-mon[57794]: from='client.? 
192.168.123.108:0/1614469021' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T19:23:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.107 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:23:05.107 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:23:05.175 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-09T19:23:05.175 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/ceph/ceph.conf 2026-03-09T19:23:05.199 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-09T19:23:05.199 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:23:05.263 INFO:tasks.cephadm:Deploying OSDs... 2026-03-09T19:23:05.263 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:23:05.263 DEBUG:teuthology.orchestra.run.vm07:> dd if=/scratch_devs of=/dev/stdout 2026-03-09T19:23:05.282 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T19:23:05.283 DEBUG:teuthology.orchestra.run.vm07:> ls /dev/[sv]d? 
2026-03-09T19:23:05.342 INFO:teuthology.orchestra.run.vm07.stdout:/dev/vda 2026-03-09T19:23:05.342 INFO:teuthology.orchestra.run.vm07.stdout:/dev/vdb 2026-03-09T19:23:05.342 INFO:teuthology.orchestra.run.vm07.stdout:/dev/vdc 2026-03-09T19:23:05.342 INFO:teuthology.orchestra.run.vm07.stdout:/dev/vdd 2026-03-09T19:23:05.342 INFO:teuthology.orchestra.run.vm07.stdout:/dev/vde 2026-03-09T19:23:05.342 WARNING:teuthology.misc:Removing root device: /dev/vda from device list 2026-03-09T19:23:05.342 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde'] 2026-03-09T19:23:05.342 DEBUG:teuthology.orchestra.run.vm07:> stat /dev/vdb 2026-03-09T19:23:05.409 INFO:teuthology.orchestra.run.vm07.stdout: File: /dev/vdb 2026-03-09T19:23:05.409 INFO:teuthology.orchestra.run.vm07.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T19:23:05.409 INFO:teuthology.orchestra.run.vm07.stdout:Device: 6h/6d Inode: 221 Links: 1 Device type: fc,10 2026-03-09T19:23:05.409 INFO:teuthology.orchestra.run.vm07.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T19:23:05.409 INFO:teuthology.orchestra.run.vm07.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T19:23:05.409 INFO:teuthology.orchestra.run.vm07.stdout:Access: 2026-03-09 19:22:13.373155882 +0000 2026-03-09T19:23:05.410 INFO:teuthology.orchestra.run.vm07.stdout:Modify: 2026-03-09 19:17:09.033000000 +0000 2026-03-09T19:23:05.410 INFO:teuthology.orchestra.run.vm07.stdout:Change: 2026-03-09 19:17:09.033000000 +0000 2026-03-09T19:23:05.410 INFO:teuthology.orchestra.run.vm07.stdout: Birth: 2026-03-09 19:17:07.192000000 +0000 2026-03-09T19:23:05.410 DEBUG:teuthology.orchestra.run.vm07:> sudo dd if=/dev/vdb of=/dev/null count=1 2026-03-09T19:23:05.495 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records in 2026-03-09T19:23:05.495 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records out 2026-03-09T19:23:05.495 INFO:teuthology.orchestra.run.vm07.stderr:512 bytes copied, 
0.000135012 s, 3.8 MB/s 2026-03-09T19:23:05.496 DEBUG:teuthology.orchestra.run.vm07:> ! mount | grep -v devtmpfs | grep -q /dev/vdb 2026-03-09T19:23:05.519 DEBUG:teuthology.orchestra.run.vm07:> stat /dev/vdc 2026-03-09T19:23:05.576 INFO:teuthology.orchestra.run.vm07.stdout: File: /dev/vdc 2026-03-09T19:23:05.576 INFO:teuthology.orchestra.run.vm07.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T19:23:05.576 INFO:teuthology.orchestra.run.vm07.stdout:Device: 6h/6d Inode: 222 Links: 1 Device type: fc,20 2026-03-09T19:23:05.576 INFO:teuthology.orchestra.run.vm07.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T19:23:05.576 INFO:teuthology.orchestra.run.vm07.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T19:23:05.576 INFO:teuthology.orchestra.run.vm07.stdout:Access: 2026-03-09 19:22:13.433155930 +0000 2026-03-09T19:23:05.576 INFO:teuthology.orchestra.run.vm07.stdout:Modify: 2026-03-09 19:17:09.027000000 +0000 2026-03-09T19:23:05.576 INFO:teuthology.orchestra.run.vm07.stdout:Change: 2026-03-09 19:17:09.027000000 +0000 2026-03-09T19:23:05.576 INFO:teuthology.orchestra.run.vm07.stdout: Birth: 2026-03-09 19:17:07.196000000 +0000 2026-03-09T19:23:05.577 DEBUG:teuthology.orchestra.run.vm07:> sudo dd if=/dev/vdc of=/dev/null count=1 2026-03-09T19:23:05.641 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records in 2026-03-09T19:23:05.641 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records out 2026-03-09T19:23:05.641 INFO:teuthology.orchestra.run.vm07.stderr:512 bytes copied, 0.000125304 s, 4.1 MB/s 2026-03-09T19:23:05.642 DEBUG:teuthology.orchestra.run.vm07:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdc 2026-03-09T19:23:05.708 DEBUG:teuthology.orchestra.run.vm07:> stat /dev/vdd 2026-03-09T19:23:05.772 INFO:teuthology.orchestra.run.vm07.stdout: File: /dev/vdd 2026-03-09T19:23:05.772 INFO:teuthology.orchestra.run.vm07.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T19:23:05.772 INFO:teuthology.orchestra.run.vm07.stdout:Device: 6h/6d Inode: 223 Links: 1 Device type: fc,30 2026-03-09T19:23:05.772 INFO:teuthology.orchestra.run.vm07.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T19:23:05.772 INFO:teuthology.orchestra.run.vm07.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T19:23:05.772 INFO:teuthology.orchestra.run.vm07.stdout:Access: 2026-03-09 19:22:13.502155986 +0000 2026-03-09T19:23:05.772 INFO:teuthology.orchestra.run.vm07.stdout:Modify: 2026-03-09 19:17:09.025000000 +0000 2026-03-09T19:23:05.772 INFO:teuthology.orchestra.run.vm07.stdout:Change: 2026-03-09 19:17:09.025000000 +0000 2026-03-09T19:23:05.772 INFO:teuthology.orchestra.run.vm07.stdout: Birth: 2026-03-09 19:17:07.201000000 +0000 2026-03-09T19:23:05.773 DEBUG:teuthology.orchestra.run.vm07:> sudo dd if=/dev/vdd of=/dev/null count=1 2026-03-09T19:23:05.842 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records in 2026-03-09T19:23:05.842 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records out 2026-03-09T19:23:05.842 INFO:teuthology.orchestra.run.vm07.stderr:512 bytes copied, 0.000128531 s, 4.0 MB/s 2026-03-09T19:23:05.843 DEBUG:teuthology.orchestra.run.vm07:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdd 2026-03-09T19:23:05.905 DEBUG:teuthology.orchestra.run.vm07:> stat /dev/vde 2026-03-09T19:23:05.949 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: Updating vm07:/etc/ceph/ceph.conf 2026-03-09T19:23:05.949 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: Updating vm08:/etc/ceph/ceph.conf 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: Reconfiguring mon.vm07 (unknown last config time)... 
2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: Reconfiguring daemon mon.vm07 on vm07 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/59937872' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.xacuym", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch 
2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T19:23:05.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:05.984 INFO:teuthology.orchestra.run.vm07.stdout: File: /dev/vde 2026-03-09T19:23:05.984 INFO:teuthology.orchestra.run.vm07.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T19:23:05.984 INFO:teuthology.orchestra.run.vm07.stdout:Device: 6h/6d Inode: 226 Links: 1 Device type: fc,40 2026-03-09T19:23:05.984 INFO:teuthology.orchestra.run.vm07.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T19:23:05.984 INFO:teuthology.orchestra.run.vm07.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T19:23:05.984 INFO:teuthology.orchestra.run.vm07.stdout:Access: 2026-03-09 19:22:13.560156032 +0000 2026-03-09T19:23:05.984 INFO:teuthology.orchestra.run.vm07.stdout:Modify: 2026-03-09 19:17:09.033000000 +0000 2026-03-09T19:23:05.984 INFO:teuthology.orchestra.run.vm07.stdout:Change: 2026-03-09 19:17:09.033000000 +0000 2026-03-09T19:23:05.984 INFO:teuthology.orchestra.run.vm07.stdout: Birth: 2026-03-09 19:17:07.205000000 +0000 2026-03-09T19:23:05.984 DEBUG:teuthology.orchestra.run.vm07:> sudo dd if=/dev/vde of=/dev/null count=1 2026-03-09T19:23:06.052 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records in 2026-03-09T19:23:06.052 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records out 2026-03-09T19:23:06.052 INFO:teuthology.orchestra.run.vm07.stderr:512 bytes copied, 0.000179807 s, 2.8 MB/s 2026-03-09T19:23:06.053 DEBUG:teuthology.orchestra.run.vm07:> ! mount | grep -v devtmpfs | grep -q /dev/vde 2026-03-09T19:23:06.115 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-09T19:23:06.115 DEBUG:teuthology.orchestra.run.vm08:> dd if=/scratch_devs of=/dev/stdout 2026-03-09T19:23:06.130 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T19:23:06.130 DEBUG:teuthology.orchestra.run.vm08:> ls /dev/[sv]d? 
2026-03-09T19:23:06.185 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vda 2026-03-09T19:23:06.185 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vdb 2026-03-09T19:23:06.185 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vdc 2026-03-09T19:23:06.185 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vdd 2026-03-09T19:23:06.185 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vde 2026-03-09T19:23:06.186 WARNING:teuthology.misc:Removing root device: /dev/vda from device list 2026-03-09T19:23:06.186 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde'] 2026-03-09T19:23:06.186 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vdb 2026-03-09T19:23:06.241 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vdb 2026-03-09T19:23:06.241 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T19:23:06.241 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 254 Links: 1 Device type: fc,10 2026-03-09T19:23:06.241 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T19:23:06.241 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T19:23:06.241 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-09 19:22:50.436698730 +0000 2026-03-09T19:23:06.241 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-09 19:16:38.071000000 +0000 2026-03-09T19:23:06.241 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-09 19:16:38.071000000 +0000 2026-03-09T19:23:06.241 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-09 19:16:36.252000000 +0000 2026-03-09T19:23:06.242 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vdb of=/dev/null count=1 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: Updating vm07:/etc/ceph/ceph.conf 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: Updating 
vm08:/etc/ceph/ceph.conf 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: Reconfiguring mon.vm07 (unknown last config time)... 
2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: Reconfiguring daemon mon.vm07 on vm07 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/59937872' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.xacuym", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch 
2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:23:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T19:23:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:06.304 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records in 2026-03-09T19:23:06.304 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out 2026-03-09T19:23:06.304 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 0.000148999 s, 3.4 MB/s 2026-03-09T19:23:06.305 DEBUG:teuthology.orchestra.run.vm08:> ! mount | grep -v devtmpfs | grep -q /dev/vdb 2026-03-09T19:23:06.361 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vdc 2026-03-09T19:23:06.418 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vdc 2026-03-09T19:23:06.418 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T19:23:06.418 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 255 Links: 1 Device type: fc,20 2026-03-09T19:23:06.418 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T19:23:06.418 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T19:23:06.418 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-09 19:22:50.489698753 +0000 2026-03-09T19:23:06.418 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-09 19:16:38.059000000 +0000 2026-03-09T19:23:06.418 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-09 19:16:38.059000000 +0000 2026-03-09T19:23:06.418 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-09 19:16:36.259000000 +0000 2026-03-09T19:23:06.418 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vdc of=/dev/null count=1 2026-03-09T19:23:06.483 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records in 2026-03-09T19:23:06.483 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out 2026-03-09T19:23:06.483 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 0.000135323 s, 3.8 MB/s 2026-03-09T19:23:06.484 DEBUG:teuthology.orchestra.run.vm08:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdc 2026-03-09T19:23:06.543 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vdd 2026-03-09T19:23:06.602 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vdd 2026-03-09T19:23:06.602 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T19:23:06.602 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 256 Links: 1 Device type: fc,30 2026-03-09T19:23:06.602 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T19:23:06.602 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T19:23:06.602 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-09 19:22:50.540698775 +0000 2026-03-09T19:23:06.602 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-09 19:16:38.060000000 +0000 2026-03-09T19:23:06.602 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-09 19:16:38.060000000 +0000 2026-03-09T19:23:06.602 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-09 19:16:36.263000000 +0000 2026-03-09T19:23:06.602 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vdd of=/dev/null count=1 2026-03-09T19:23:06.665 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records in 2026-03-09T19:23:06.665 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out 2026-03-09T19:23:06.665 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 0.000120275 s, 4.3 MB/s 2026-03-09T19:23:06.666 DEBUG:teuthology.orchestra.run.vm08:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdd 2026-03-09T19:23:06.722 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vde 2026-03-09T19:23:06.778 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vde 2026-03-09T19:23:06.778 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T19:23:06.778 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40 2026-03-09T19:23:06.778 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T19:23:06.778 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T19:23:06.778 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-09 19:22:50.586698794 +0000 2026-03-09T19:23:06.778 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-09 19:16:38.061000000 +0000 2026-03-09T19:23:06.778 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-09 19:16:38.061000000 +0000 2026-03-09T19:23:06.778 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-09 19:16:36.314000000 +0000 2026-03-09T19:23:06.778 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vde of=/dev/null count=1 2026-03-09T19:23:06.842 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records in 2026-03-09T19:23:06.843 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out 2026-03-09T19:23:06.843 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 0.000187461 s, 2.7 MB/s 2026-03-09T19:23:06.843 DEBUG:teuthology.orchestra.run.vm08:> ! mount | grep -v devtmpfs | grep -q /dev/vde 2026-03-09T19:23:06.900 INFO:tasks.cephadm:Deploying osd.0 on vm07 with /dev/vde... 
2026-03-09T19:23:06.900 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- lvm zap /dev/vde 2026-03-09T19:23:07.497 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: Reconfiguring mgr.vm07.xacuym (unknown last config time)... 2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: Reconfiguring daemon mgr.vm07.xacuym on vm07 2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: Reconfiguring ceph-exporter.vm07 (monmap changed)... 2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: Reconfiguring daemon ceph-exporter.vm07 on vm07 2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: Reconfiguring crash.vm07 (monmap changed)... 2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: Reconfiguring daemon crash.vm07 on vm07 2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: Reconfiguring alertmanager.vm07 (dependencies changed)... 
2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: Reconfiguring daemon alertmanager.vm07 on vm07
2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:07.530 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:07 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: Reconfiguring mgr.vm07.xacuym (unknown last config time)...
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: Reconfiguring daemon mgr.vm07.xacuym on vm07
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: Reconfiguring ceph-exporter.vm07 (monmap changed)...
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: Reconfiguring daemon ceph-exporter.vm07 on vm07
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: Reconfiguring crash.vm07 (monmap changed)...
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: Reconfiguring daemon crash.vm07 on vm07
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: Reconfiguring alertmanager.vm07 (dependencies changed)...
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: Reconfiguring daemon alertmanager.vm07 on vm07
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:07 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:08.247 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:23:08.258 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph orch daemon add osd vm07:/dev/vde
2026-03-09T19:23:08.398 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:08 vm07 ceph-mon[48545]: Reconfiguring grafana.vm07 (dependencies changed)...
2026-03-09T19:23:08.398 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:08 vm07 ceph-mon[48545]: Reconfiguring daemon grafana.vm07 on vm07
2026-03-09T19:23:08.398 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:08 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:08.398 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:08 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:08.398 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:08 vm07 ceph-mon[48545]: Reconfiguring prometheus.vm07 (dependencies changed)...
2026-03-09T19:23:08.398 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:08 vm07 ceph-mon[48545]: Reconfiguring daemon prometheus.vm07 on vm07
2026-03-09T19:23:08.566 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config
2026-03-09T19:23:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:08 vm08 ceph-mon[57794]: Reconfiguring grafana.vm07 (dependencies changed)...
2026-03-09T19:23:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:08 vm08 ceph-mon[57794]: Reconfiguring daemon grafana.vm07 on vm07
2026-03-09T19:23:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:08 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:08 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:08 vm08 ceph-mon[57794]: Reconfiguring prometheus.vm07 (dependencies changed)...
2026-03-09T19:23:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:08 vm08 ceph-mon[57794]: Reconfiguring daemon prometheus.vm07 on vm07
2026-03-09T19:23:08.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.965+0000 7f650f577640 1 -- 192.168.123.107:0/38214495 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6510072420 msgr2=0x7f6510077190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:23:08.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.965+0000 7f650f577640 1 --2- 192.168.123.107:0/38214495 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6510072420 0x7f6510077190 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7f6508008090 tx=0x7f6508031ea0 comp rx=0 tx=0).stop
2026-03-09T19:23:08.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.965+0000 7f650f577640 1 -- 192.168.123.107:0/38214495 shutdown_connections
2026-03-09T19:23:08.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.965+0000 7f650f577640 1 --2- 192.168.123.107:0/38214495 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6510072420 0x7f6510077190 unknown :-1 s=CLOSED pgs=167 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:23:08.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.965+0000 7f650f577640 1 --2- 192.168.123.107:0/38214495 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6510071a50 0x7f6510071e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:23:08.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.965+0000 7f650f577640 1 -- 192.168.123.107:0/38214495 >> 192.168.123.107:0/38214495 conn(0x7f651006d4f0 msgr2=0x7f651006f930 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:23:08.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.965+0000 7f650f577640 1 -- 192.168.123.107:0/38214495 shutdown_connections
2026-03-09T19:23:08.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.965+0000 7f650f577640 1 -- 192.168.123.107:0/38214495 wait complete.
2026-03-09T19:23:08.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.967+0000 7f650f577640 1 Processor -- start
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.967+0000 7f650f577640 1 -- start start
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.967+0000 7f650f577640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6510071a50 0x7f65100840d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.967+0000 7f650f577640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6510082720 0x7f6510082ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.967+0000 7f650f577640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6510084610 con 0x7f6510071a50
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.967+0000 7f650f577640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f65100830e0 con 0x7f6510082720
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.968+0000 7f650e575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6510071a50 0x7f65100840d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.968+0000 7f650e575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6510071a50 0x7f65100840d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:53186/0 (socket says 192.168.123.107:53186)
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.968+0000 7f650e575640 1 -- 192.168.123.107:0/62210200 learned_addr learned my addr 192.168.123.107:0/62210200 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.968+0000 7f650dd74640 1 --2- 192.168.123.107:0/62210200 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6510082720 0x7f6510082ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.968+0000 7f650e575640 1 -- 192.168.123.107:0/62210200 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6510082720 msgr2=0x7f6510082ba0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.968+0000 7f650e575640 1 --2- 192.168.123.107:0/62210200 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6510082720 0x7f6510082ba0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.968+0000 7f650e575640 1 -- 192.168.123.107:0/62210200 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6508007ce0 con 0x7f6510071a50
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.968+0000 7f650e575640 1 --2- 192.168.123.107:0/62210200 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6510071a50 0x7f65100840d0 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7f650000d8d0 tx=0x7f650000dda0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:23:08.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.968+0000 7f64ff7fe640 1 -- 192.168.123.107:0/62210200 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6500004490 con 0x7f6510071a50
2026-03-09T19:23:08.967 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.969+0000 7f650f577640 1 -- 192.168.123.107:0/62210200 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6510083390 con 0x7f6510071a50
2026-03-09T19:23:08.967 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.969+0000 7f650f577640 1 -- 192.168.123.107:0/62210200 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f651012ef70 con 0x7f6510071a50
2026-03-09T19:23:08.967 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.969+0000 7f64ff7fe640 1 -- 192.168.123.107:0/62210200 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f650000bd00 con 0x7f6510071a50
2026-03-09T19:23:08.967 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.969+0000 7f64ff7fe640 1 -- 192.168.123.107:0/62210200 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6500010460 con 0x7f6510071a50
2026-03-09T19:23:08.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.970+0000 7f64ff7fe640 1 -- 192.168.123.107:0/62210200 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 18) v1 ==== 98424+0+0 (secure 0 0 0) 0x7f6500010610 con 0x7f6510071a50
2026-03-09T19:23:08.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.971+0000 7f64ff7fe640 1 --2- 192.168.123.107:0/62210200 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64ec076130 0x7f64ec0785f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:23:08.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.971+0000 7f650dd74640 1 --2- 192.168.123.107:0/62210200 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64ec076130 0x7f64ec0785f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:23:08.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.971+0000 7f64ff7fe640 1 -- 192.168.123.107:0/62210200 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1370+0+0 (secure 0 0 0) 0x7f6500097960 con 0x7f6510071a50
2026-03-09T19:23:08.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.972+0000 7f650dd74640 1 --2- 192.168.123.107:0/62210200 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64ec076130 0x7f64ec0785f0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f65080323b0 tx=0x7f650803d040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:23:08.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.972+0000 7f650f577640 1 -- 192.168.123.107:0/62210200 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f64dc005350 con 0x7f6510071a50
2026-03-09T19:23:08.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:08.976+0000 7f64ff7fe640 1 -- 192.168.123.107:0/62210200 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6500061fe0 con 0x7f6510071a50
2026-03-09T19:23:09.102 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:09.103+0000 7f650f577640 1 -- 192.168.123.107:0/62210200 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7f64dc002bf0 con 0x7f64ec076130
2026-03-09T19:23:09.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T19:23:09.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: from='client.14264 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:23:09.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T19:23:09.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T19:23:09.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:09.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:09.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T19:23:09.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:09.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:09.668 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:09.668 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T19:23:09.668 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:09 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='client.14264 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T19:23:09.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: Reconfiguring ceph-exporter.vm08 (monmap changed)...
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: Reconfiguring daemon ceph-exporter.vm08 on vm08
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: Reconfiguring crash.vm08 (monmap changed)...
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: Reconfiguring daemon crash.vm08 on vm08
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: Reconfiguring mgr.vm08.mxylvw (monmap changed)...
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.mxylvw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: Reconfiguring daemon mgr.vm08.mxylvw on vm08
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/3187292623' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "133448fe-3146-488b-ab63-557fcf7f955d"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/3187292623' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "133448fe-3146-488b-ab63-557fcf7f955d"}]': finished
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: osdmap e6: 1 total, 0 up, 1 in
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: Reconfiguring mon.vm08 (monmap changed)...
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: Reconfiguring daemon mon.vm08 on vm08
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm07.local:9093"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm07.local:3000"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm07.local:9095"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:23:11.086 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:10 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/1053067975' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: Reconfiguring ceph-exporter.vm08 (monmap changed)...
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: Reconfiguring daemon ceph-exporter.vm08 on vm08
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: Reconfiguring crash.vm08 (monmap changed)...
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: Reconfiguring daemon crash.vm08 on vm08
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: Reconfiguring mgr.vm08.mxylvw (monmap changed)...
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.mxylvw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: Reconfiguring daemon mgr.vm08.mxylvw on vm08
2026-03-09T19:23:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/3187292623' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "133448fe-3146-488b-ab63-557fcf7f955d"}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/3187292623' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "133448fe-3146-488b-ab63-557fcf7f955d"}]': finished
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: osdmap e6: 1 total, 0 up, 1 in
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: Reconfiguring mon.vm08 (monmap changed)...
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: Reconfiguring daemon mon.vm08 on vm08
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm07.local:9093"}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm07.local:3000"}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm07.local:9095"}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:23:11.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:10 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/1053067975' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm07.local:9093"}]: dispatch
2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm07.local:3000"}]: dispatch
2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mon.0 -' entity='mon.'
cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm07.local:9095"}]: dispatch 2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:12.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:12 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:12.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm07.local:9093"}]: dispatch 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm07.local:3000"}]: dispatch 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm07.local:9095"}]: dispatch 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: 
from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:12.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:12 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:14.077 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:14 vm07 ceph-mon[48545]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:23:14.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:14 vm08 ceph-mon[57794]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:23:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:15 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T19:23:15.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:15 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:15.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:15 vm07 ceph-mon[48545]: Deploying daemon osd.0 on vm07 2026-03-09T19:23:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:15 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T19:23:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:15 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:15 vm08 ceph-mon[57794]: Deploying daemon osd.0 on vm07 2026-03-09T19:23:16.230 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:16 vm07 ceph-mon[48545]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 
2026-03-09T19:23:16.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:16 vm08 ceph-mon[57794]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:23:17.161 INFO:teuthology.orchestra.run.vm07.stdout:Created osd(s) 0 on host 'vm07' 2026-03-09T19:23:17.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.162+0000 7f64ff7fe640 1 -- 192.168.123.107:0/62210200 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f64dc002bf0 con 0x7f64ec076130 2026-03-09T19:23:17.163 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.165+0000 7f64fd7fa640 1 -- 192.168.123.107:0/62210200 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64ec076130 msgr2=0x7f64ec0785f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:17.163 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.165+0000 7f64fd7fa640 1 --2- 192.168.123.107:0/62210200 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64ec076130 0x7f64ec0785f0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f65080323b0 tx=0x7f650803d040 comp rx=0 tx=0).stop 2026-03-09T19:23:17.163 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.165+0000 7f64fd7fa640 1 -- 192.168.123.107:0/62210200 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6510071a50 msgr2=0x7f65100840d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:17.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.165+0000 7f64fd7fa640 1 --2- 192.168.123.107:0/62210200 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6510071a50 0x7f65100840d0 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7f650000d8d0 tx=0x7f650000dda0 comp rx=0 tx=0).stop 2026-03-09T19:23:17.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.165+0000 7f64fd7fa640 1 -- 192.168.123.107:0/62210200 
shutdown_connections 2026-03-09T19:23:17.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.165+0000 7f64fd7fa640 1 --2- 192.168.123.107:0/62210200 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64ec076130 0x7f64ec0785f0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:17.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.165+0000 7f64fd7fa640 1 --2- 192.168.123.107:0/62210200 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6510082720 0x7f6510082ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:17.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.165+0000 7f64fd7fa640 1 --2- 192.168.123.107:0/62210200 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6510071a50 0x7f65100840d0 unknown :-1 s=CLOSED pgs=168 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:17.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.165+0000 7f64fd7fa640 1 -- 192.168.123.107:0/62210200 >> 192.168.123.107:0/62210200 conn(0x7f651006d4f0 msgr2=0x7f6510073130 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:17.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.165+0000 7f64fd7fa640 1 -- 192.168.123.107:0/62210200 shutdown_connections 2026-03-09T19:23:17.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:17.165+0000 7f64fd7fa640 1 -- 192.168.123.107:0/62210200 wait complete. 2026-03-09T19:23:17.222 DEBUG:teuthology.orchestra.run.vm07:osd.0> sudo journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.0.service 2026-03-09T19:23:17.227 INFO:tasks.cephadm:Deploying osd.1 on vm07 with /dev/vdd... 
2026-03-09T19:23:17.227 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- lvm zap /dev/vdd 2026-03-09T19:23:17.307 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:17 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:17.307 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:17 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:17.307 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:17 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:17.307 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:17 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:17.307 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:17 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:17.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:17 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:17.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:17 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:17.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:17 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:17.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:17 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:17.345 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:17 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:17.784 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:23:17.993 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:23:17 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[68186]: 2026-03-09T19:23:17.746+0000 7f943d24e740 -1 osd.0 0 log_to_monitors true 2026-03-09T19:23:18.413 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:23:18.440 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph orch daemon add osd vm07:/dev/vdd 2026-03-09T19:23:18.512 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:18 vm07 ceph-mon[48545]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:23:18.512 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:18 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:18.512 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:18 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:18.512 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:18 vm07 ceph-mon[48545]: from='osd.0 [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T19:23:18.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:18 vm08 ceph-mon[57794]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:23:18.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:18 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' 
entity='mgr.vm07.xacuym' 2026-03-09T19:23:18.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:18 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:18.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:18 vm08 ceph-mon[57794]: from='osd.0 [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T19:23:18.676 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:23:19.054 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.055+0000 7f18bbb90640 1 -- 192.168.123.107:0/1350012955 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f18b4075ba0 msgr2=0x7f18b4075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:19.054 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.056+0000 7f18bbb90640 1 --2- 192.168.123.107:0/1350012955 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f18b4075ba0 0x7f18b4075fa0 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7f18a40099b0 tx=0x7f18a402f240 comp rx=0 tx=0).stop 2026-03-09T19:23:19.054 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.056+0000 7f18bbb90640 1 -- 192.168.123.107:0/1350012955 shutdown_connections 2026-03-09T19:23:19.054 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.056+0000 7f18bbb90640 1 --2- 192.168.123.107:0/1350012955 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18b4076df0 0x7f18b4077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:19.054 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.056+0000 7f18bbb90640 1 --2- 192.168.123.107:0/1350012955 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f18b4075ba0 0x7f18b4075fa0 
unknown :-1 s=CLOSED pgs=175 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:19.054 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.056+0000 7f18bbb90640 1 -- 192.168.123.107:0/1350012955 >> 192.168.123.107:0/1350012955 conn(0x7f18b40fe0c0 msgr2=0x7f18b41004e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:19.054 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.056+0000 7f18bbb90640 1 -- 192.168.123.107:0/1350012955 shutdown_connections 2026-03-09T19:23:19.054 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.056+0000 7f18bbb90640 1 -- 192.168.123.107:0/1350012955 wait complete. 2026-03-09T19:23:19.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.057+0000 7f18bbb90640 1 Processor -- start 2026-03-09T19:23:19.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.057+0000 7f18bbb90640 1 -- start start 2026-03-09T19:23:19.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.057+0000 7f18bbb90640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18b4075ba0 0x7f18b419e670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:19.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.058+0000 7f18bbb90640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f18b4076df0 0x7f18b419ebb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:19.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.058+0000 7f18b9905640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18b4075ba0 0x7f18b419e670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:19.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.058+0000 7f18b9905640 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18b4075ba0 0x7f18b419e670 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:33952/0 (socket says 192.168.123.107:33952) 2026-03-09T19:23:19.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.058+0000 7f18b9905640 1 -- 192.168.123.107:0/1949243885 learned_addr learned my addr 192.168.123.107:0/1949243885 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:23:19.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.058+0000 7f18b9104640 1 --2- 192.168.123.107:0/1949243885 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f18b4076df0 0x7f18b419ebb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:19.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.058+0000 7f18bbb90640 1 -- 192.168.123.107:0/1949243885 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f18b419f180 con 0x7f18b4076df0 2026-03-09T19:23:19.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.058+0000 7f18bbb90640 1 -- 192.168.123.107:0/1949243885 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f18b419f2f0 con 0x7f18b4075ba0 2026-03-09T19:23:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.061+0000 7f18b9104640 1 -- 192.168.123.107:0/1949243885 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18b4075ba0 msgr2=0x7f18b419e670 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.061+0000 7f18b9104640 1 --2- 192.168.123.107:0/1949243885 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18b4075ba0 0x7f18b419e670 unknown :-1 s=AUTH_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.061+0000 7f18b9104640 1 -- 192.168.123.107:0/1949243885 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f18a4009660 con 0x7f18b4076df0 2026-03-09T19:23:19.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.062+0000 7f18b9104640 1 --2- 192.168.123.107:0/1949243885 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f18b4076df0 0x7f18b419ebb0 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7f18a800b500 tx=0x7f18a800b9d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:19.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.063+0000 7f18a2ffd640 1 -- 192.168.123.107:0/1949243885 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f18a8004280 con 0x7f18b4076df0 2026-03-09T19:23:19.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.063+0000 7f18bbb90640 1 -- 192.168.123.107:0/1949243885 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f18b41a3d90 con 0x7f18b4076df0 2026-03-09T19:23:19.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.063+0000 7f18bbb90640 1 -- 192.168.123.107:0/1949243885 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f18b41a4330 con 0x7f18b4076df0 2026-03-09T19:23:19.062 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.064+0000 7f18a2ffd640 1 -- 192.168.123.107:0/1949243885 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f18a80043e0 con 0x7f18b4076df0 2026-03-09T19:23:19.062 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.064+0000 7f18a2ffd640 1 -- 192.168.123.107:0/1949243885 <== mon.0 v2:192.168.123.107:3300/0 
3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f18a800fb30 con 0x7f18b4076df0 2026-03-09T19:23:19.063 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.065+0000 7f18a2ffd640 1 -- 192.168.123.107:0/1949243885 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 18) v1 ==== 98424+0+0 (secure 0 0 0) 0x7f18a800fc90 con 0x7f18b4076df0 2026-03-09T19:23:19.064 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.066+0000 7f18a2ffd640 1 --2- 192.168.123.107:0/1949243885 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1894076200 0x7f18940786c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:19.064 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.066+0000 7f18b9905640 1 --2- 192.168.123.107:0/1949243885 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1894076200 0x7f18940786c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:19.064 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.066+0000 7f18a2ffd640 1 -- 192.168.123.107:0/1949243885 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(7..7 src has 1..7) v4 ==== 1576+0+0 (secure 0 0 0) 0x7f18a8095930 con 0x7f18b4076df0 2026-03-09T19:23:19.065 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.067+0000 7f18b9905640 1 --2- 192.168.123.107:0/1949243885 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1894076200 0x7f18940786c0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f18a4002cc0 tx=0x7f18a403a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:19.065 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.067+0000 7f18bbb90640 1 -- 192.168.123.107:0/1949243885 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f187c005350 con 0x7f18b4076df0 2026-03-09T19:23:19.068 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.070+0000 7f18a2ffd640 1 -- 192.168.123.107:0/1949243885 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f18a8060ed0 con 0x7f18b4076df0 2026-03-09T19:23:19.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:19.173+0000 7f18bbb90640 1 -- 192.168.123.107:0/1949243885 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f187c002bf0 con 0x7f1894076200 2026-03-09T19:23:19.614 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:23:19 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[68186]: 2026-03-09T19:23:19.355+0000 7f94389aa640 -1 osd.0 0 waiting for initial osdmap 2026-03-09T19:23:19.614 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:23:19 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[68186]: 2026-03-09T19:23:19.392+0000 7f9434fe6640 -1 osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: from='osd.0 [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: osdmap e7: 1 total, 0 up, 1 in 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: from='osd.0 
[v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T19:23:19.614 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:19 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config 
generate-minimal-conf"}]: dispatch 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: from='osd.0 [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: osdmap e7: 1 total, 0 up, 1 in 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: from='osd.0 [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdd", "target": ["mon-mgr", 
""]}]: dispatch 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T19:23:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:19 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: from='osd.0 [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: osdmap e8: 1 total, 0 up, 1 in 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: Detected new or changed devices on vm07 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 
2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:20.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:20 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1839199281' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0f0316a2-1b3a-4bd0-b463-b3d326b0fb51"}]: dispatch 2026-03-09T19:23:20.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: from='osd.0 [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished 2026-03-09T19:23:20.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: osdmap e8: 1 total, 0 up, 1 in 2026-03-09T19:23:20.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:23:20.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:23:20.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: Detected new or changed devices on vm07 2026-03-09T19:23:20.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:20.825 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:20.825 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:23:20.825 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:20.825 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:20.825 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:20.825 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:20.825 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:20 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/1839199281' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0f0316a2-1b3a-4bd0-b463-b3d326b0fb51"}]: dispatch 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: purged_snaps scrub starts 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: purged_snaps scrub ok 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: pgmap v12: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1839199281' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0f0316a2-1b3a-4bd0-b463-b3d326b0fb51"}]': finished 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: osd.0 [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186] boot 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: osdmap e9: 2 total, 1 up, 2 in 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:21.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:21 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1982577442' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T19:23:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: purged_snaps scrub starts 2026-03-09T19:23:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: purged_snaps scrub ok 2026-03-09T19:23:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: pgmap v12: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T19:23:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:23:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/1839199281' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0f0316a2-1b3a-4bd0-b463-b3d326b0fb51"}]': finished 2026-03-09T19:23:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: osd.0 [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186] boot 2026-03-09T19:23:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: osdmap e9: 2 total, 1 up, 2 in 2026-03-09T19:23:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:23:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:21.845 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:21 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/1982577442' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T19:23:22.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:22 vm07 ceph-mon[48545]: osdmap e10: 2 total, 1 up, 2 in 2026-03-09T19:23:22.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:22 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:22.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:22 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:22.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:22 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:22.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:22 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:22.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:22 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:22.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:22 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:23.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:22 vm08 ceph-mon[57794]: osdmap e10: 2 total, 1 up, 2 in 2026-03-09T19:23:23.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:22 vm08 ceph-mon[57794]: from='mgr.14227 
192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:23.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:22 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:23.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:22 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:23.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:22 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:23.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:22 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:23.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:22 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:24.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:24 vm07 ceph-mon[48545]: pgmap v15: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:24.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:24 vm08 ceph-mon[57794]: pgmap v15: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:26.205 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:26 vm07 ceph-mon[48545]: pgmap v16: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:26.205 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:26 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T19:23:26.205 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:26 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:26.205 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:26 vm07 ceph-mon[48545]: Deploying daemon osd.1 on vm07 2026-03-09T19:23:26.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:26 vm08 ceph-mon[57794]: pgmap v16: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:26.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:26 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T19:23:26.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:26 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:26.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:26 vm08 ceph-mon[57794]: Deploying daemon osd.1 on vm07 2026-03-09T19:23:27.269 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:27 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:27.269 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:27 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:27.269 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:27 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:27 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:27 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 
2026-03-09T19:23:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:27 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:27.633 INFO:teuthology.orchestra.run.vm07.stdout:Created osd(s) 1 on host 'vm07' 2026-03-09T19:23:27.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.633+0000 7f18a2ffd640 1 -- 192.168.123.107:0/1949243885 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f187c002bf0 con 0x7f1894076200 2026-03-09T19:23:27.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.635+0000 7f18bbb90640 1 -- 192.168.123.107:0/1949243885 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1894076200 msgr2=0x7f18940786c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:27.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.635+0000 7f18bbb90640 1 --2- 192.168.123.107:0/1949243885 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1894076200 0x7f18940786c0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f18a4002cc0 tx=0x7f18a403a040 comp rx=0 tx=0).stop 2026-03-09T19:23:27.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.635+0000 7f18bbb90640 1 -- 192.168.123.107:0/1949243885 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f18b4076df0 msgr2=0x7f18b419ebb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:27.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.635+0000 7f18bbb90640 1 --2- 192.168.123.107:0/1949243885 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f18b4076df0 0x7f18b419ebb0 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7f18a800b500 tx=0x7f18a800b9d0 comp rx=0 tx=0).stop 2026-03-09T19:23:27.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.635+0000 7f18bbb90640 1 -- 
192.168.123.107:0/1949243885 shutdown_connections 2026-03-09T19:23:27.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.635+0000 7f18bbb90640 1 --2- 192.168.123.107:0/1949243885 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1894076200 0x7f18940786c0 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:27.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.635+0000 7f18bbb90640 1 --2- 192.168.123.107:0/1949243885 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f18b4076df0 0x7f18b419ebb0 unknown :-1 s=CLOSED pgs=176 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:27.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.635+0000 7f18bbb90640 1 --2- 192.168.123.107:0/1949243885 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18b4075ba0 0x7f18b419e670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:27.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.635+0000 7f18bbb90640 1 -- 192.168.123.107:0/1949243885 >> 192.168.123.107:0/1949243885 conn(0x7f18b40fe0c0 msgr2=0x7f18b40ffb50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:27.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.635+0000 7f18bbb90640 1 -- 192.168.123.107:0/1949243885 shutdown_connections 2026-03-09T19:23:27.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:27.635+0000 7f18bbb90640 1 -- 192.168.123.107:0/1949243885 wait complete. 2026-03-09T19:23:27.678 DEBUG:teuthology.orchestra.run.vm07:osd.1> sudo journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.1.service 2026-03-09T19:23:27.720 INFO:tasks.cephadm:Deploying osd.2 on vm07 with /dev/vdc... 
2026-03-09T19:23:27.720 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- lvm zap /dev/vdc 2026-03-09T19:23:28.007 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:23:28.326 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:28 vm07 ceph-mon[48545]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:28.326 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:28.326 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:28.326 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:28.326 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:28.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:28 vm08 ceph-mon[57794]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:28.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:28.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:28.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 
2026-03-09T19:23:28.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:28.637 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:23:28.651 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph orch daemon add osd vm07:/dev/vdc 2026-03-09T19:23:28.834 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:23:28 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[75486]: 2026-03-09T19:23:28.594+0000 7f5758a6f740 -1 osd.1 0 log_to_monitors true 2026-03-09T19:23:28.884 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:23:29.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.165+0000 7f4d21481640 1 -- 192.168.123.107:0/187860021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d1c071d40 msgr2=0x7f4d1c072140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:29.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.166+0000 7f4d19ffb640 1 -- 192.168.123.107:0/187860021 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d1002fa80 con 0x7f4d1c071d40 2026-03-09T19:23:29.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.166+0000 7f4d21481640 1 --2- 192.168.123.107:0/187860021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d1c071d40 0x7f4d1c072140 secure :-1 s=READY pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7f4d100099b0 tx=0x7f4d1002f220 comp rx=0 tx=0).stop 2026-03-09T19:23:29.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.166+0000 7f4d21481640 1 -- 192.168.123.107:0/187860021 shutdown_connections 2026-03-09T19:23:29.164 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.166+0000 7f4d21481640 1 --2- 192.168.123.107:0/187860021 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d1c072710 0x7f4d1c10c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:29.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.166+0000 7f4d21481640 1 --2- 192.168.123.107:0/187860021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d1c071d40 0x7f4d1c072140 secure :-1 s=CLOSED pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7f4d100099b0 tx=0x7f4d1002f220 comp rx=0 tx=0).stop 2026-03-09T19:23:29.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.166+0000 7f4d21481640 1 -- 192.168.123.107:0/187860021 >> 192.168.123.107:0/187860021 conn(0x7f4d1c06d660 msgr2=0x7f4d1c06faa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:29.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.166+0000 7f4d21481640 1 -- 192.168.123.107:0/187860021 shutdown_connections 2026-03-09T19:23:29.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.166+0000 7f4d21481640 1 -- 192.168.123.107:0/187860021 wait complete. 
2026-03-09T19:23:29.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.167+0000 7f4d21481640 1 Processor -- start 2026-03-09T19:23:29.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.167+0000 7f4d21481640 1 -- start start 2026-03-09T19:23:29.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.167+0000 7f4d21481640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d1c072710 0x7f4d1c116c80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:29.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.167+0000 7f4d21481640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d1c1171c0 0x7f4d1c117640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:29.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.167+0000 7f4d21481640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d1c118630 con 0x7f4d1c072710 2026-03-09T19:23:29.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.167+0000 7f4d21481640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d1c1b58e0 con 0x7f4d1c1171c0 2026-03-09T19:23:29.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.167+0000 7f4d1affd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d1c072710 0x7f4d1c116c80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:29.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.167+0000 7f4d1a7fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d1c1171c0 0x7f4d1c117640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T19:23:29.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.167+0000 7f4d1a7fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d1c1171c0 0x7f4d1c117640 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:49852/0 (socket says 192.168.123.107:49852) 2026-03-09T19:23:29.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.167+0000 7f4d1affd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d1c072710 0x7f4d1c116c80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35484/0 (socket says 192.168.123.107:35484) 2026-03-09T19:23:29.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.168+0000 7f4d1a7fc640 1 -- 192.168.123.107:0/4020951659 learned_addr learned my addr 192.168.123.107:0/4020951659 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:23:29.166 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.168+0000 7f4d1affd640 1 -- 192.168.123.107:0/4020951659 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d1c1171c0 msgr2=0x7f4d1c117640 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:29.166 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.168+0000 7f4d1affd640 1 --2- 192.168.123.107:0/4020951659 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d1c1171c0 0x7f4d1c117640 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:29.166 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.168+0000 7f4d1affd640 1 -- 192.168.123.107:0/4020951659 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4d10009660 con 0x7f4d1c072710 2026-03-09T19:23:29.166 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.168+0000 7f4d1affd640 1 --2- 192.168.123.107:0/4020951659 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d1c072710 0x7f4d1c116c80 secure :-1 s=READY pgs=184 cs=0 l=1 rev1=1 crypto rx=0x7f4d10008000 tx=0x7f4d10030cc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:29.166 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.169+0000 7f4cfbfff640 1 -- 192.168.123.107:0/4020951659 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d1003c040 con 0x7f4d1c072710 2026-03-09T19:23:29.166 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.169+0000 7f4cfbfff640 1 -- 192.168.123.107:0/4020951659 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4d10002c20 con 0x7f4d1c072710 2026-03-09T19:23:29.166 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.169+0000 7f4cfbfff640 1 -- 192.168.123.107:0/4020951659 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d10041870 con 0x7f4d1c072710 2026-03-09T19:23:29.167 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.169+0000 7f4d21481640 1 -- 192.168.123.107:0/4020951659 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4d1c1b5a20 con 0x7f4d1c072710 2026-03-09T19:23:29.167 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.169+0000 7f4d21481640 1 -- 192.168.123.107:0/4020951659 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4d1c1b5e40 con 0x7f4d1c072710 2026-03-09T19:23:29.168 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.169+0000 7f4d21481640 1 -- 192.168.123.107:0/4020951659 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f4ce8005350 con 0x7f4d1c072710 2026-03-09T19:23:29.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.173+0000 7f4cfbfff640 1 -- 192.168.123.107:0/4020951659 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 18) v1 ==== 98424+0+0 (secure 0 0 0) 0x7f4d10048030 con 0x7f4d1c072710 2026-03-09T19:23:29.172 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.173+0000 7f4cfbfff640 1 --2- 192.168.123.107:0/4020951659 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4cec076130 0x7f4cec0785f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:29.172 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.173+0000 7f4cfbfff640 1 -- 192.168.123.107:0/4020951659 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(10..10 src has 1..10) v4 ==== 2087+0+0 (secure 0 0 0) 0x7f4d10085f90 con 0x7f4d1c072710 2026-03-09T19:23:29.172 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.174+0000 7f4cfbfff640 1 -- 192.168.123.107:0/4020951659 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f4d10082d10 con 0x7f4d1c072710 2026-03-09T19:23:29.179 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.181+0000 7f4d1a7fc640 1 --2- 192.168.123.107:0/4020951659 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4cec076130 0x7f4cec0785f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:29.181 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.181+0000 7f4d1a7fc640 1 --2- 192.168.123.107:0/4020951659 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4cec076130 0x7f4cec0785f0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f4d1c1183b0 tx=0x7f4d04007450 comp rx=0 tx=0).ready 
entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:29.268 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:29.270+0000 7f4d21481640 1 -- 192.168.123.107:0/4020951659 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7f4ce8002bf0 con 0x7f4cec076130 2026-03-09T19:23:29.480 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:29 vm07 ceph-mon[48545]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:29.481 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:29.481 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:29.481 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:29 vm07 ceph-mon[48545]: from='osd.1 [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T19:23:29.481 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:29 vm07 ceph-mon[48545]: from='osd.1 ' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T19:23:29.481 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T19:23:29.481 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T19:23:29.481 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:29.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:29 vm08 ceph-mon[57794]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:29.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:29.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:29.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:29 vm08 ceph-mon[57794]: from='osd.1 [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T19:23:29.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:29 vm08 ceph-mon[57794]: from='osd.1 ' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T19:23:29.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T19:23:29.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T19:23:29.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-09T19:23:30.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='client.14296 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:23:30.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='osd.1 ' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: osdmap e11: 2 total, 1 up, 2 in 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='osd.1 [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='osd.1 ' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: Detected new or changed devices on vm07 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 
ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/1838746949' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "8ea4ceda-8f60-4699-976a-464d32f7e944"}]: dispatch 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='osd.1 ' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1838746949' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "8ea4ceda-8f60-4699-976a-464d32f7e944"}]': finished 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: osdmap e12: 3 total, 1 up, 3 in 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:23:30.790 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='client.14296 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='osd.1 ' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: osdmap e11: 2 total, 1 up, 2 in 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='osd.1 [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515]' 
entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='osd.1 ' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: Detected new or changed devices on vm07 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config 
dump", "format": "json"}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/1838746949' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "8ea4ceda-8f60-4699-976a-464d32f7e944"}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='osd.1 ' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/1838746949' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "8ea4ceda-8f60-4699-976a-464d32f7e944"}]': finished 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: osdmap e12: 3 total, 1 up, 3 in 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:23:30.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:31.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:31 vm07 ceph-mon[48545]: pgmap v21: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:31.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:31 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/3940355072' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T19:23:31.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:31 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:31.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:31 vm08 ceph-mon[57794]: pgmap v21: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:31.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:31 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/3940355072' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T19:23:31.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:31 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:32.228 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:23:31 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[75486]: 2026-03-09T19:23:31.797+0000 7f57541cb640 -1 osd.1 0 waiting for initial osdmap 2026-03-09T19:23:32.228 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:23:31 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[75486]: 2026-03-09T19:23:31.804+0000 7f5750006640 -1 osd.1 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T19:23:32.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:32 vm08 ceph-mon[57794]: purged_snaps scrub starts 2026-03-09T19:23:32.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:32 vm08 ceph-mon[57794]: purged_snaps scrub ok 2026-03-09T19:23:32.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:32 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:32.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:32 vm08 ceph-mon[57794]: 
from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:32.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:32 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:32.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:32 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:32.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:32 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:32.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:32 vm08 ceph-mon[57794]: from='osd.1 ' entity='osd.1' 2026-03-09T19:23:32.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:32 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:32.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:32 vm07 ceph-mon[48545]: purged_snaps scrub starts 2026-03-09T19:23:32.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:32 vm07 ceph-mon[48545]: purged_snaps scrub ok 2026-03-09T19:23:32.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:32 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:32.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:32 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:32.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:32 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:32.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:23:32 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:32.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:32 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:32.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:32 vm07 ceph-mon[48545]: from='osd.1 ' entity='osd.1' 2026-03-09T19:23:32.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:32 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:33.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:33 vm07 ceph-mon[48545]: pgmap v22: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:33.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:33 vm07 ceph-mon[48545]: osd.1 [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515] boot 2026-03-09T19:23:33.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:33 vm07 ceph-mon[48545]: osdmap e13: 3 total, 2 up, 3 in 2026-03-09T19:23:33.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:33 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:33.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:33 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:23:33.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:33 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:23:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:33 vm08 ceph-mon[57794]: pgmap 
v22: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T19:23:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:33 vm08 ceph-mon[57794]: osd.1 [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515] boot 2026-03-09T19:23:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:33 vm08 ceph-mon[57794]: osdmap e13: 3 total, 2 up, 3 in 2026-03-09T19:23:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:33 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:23:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:33 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:23:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:33 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:23:34.944 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:34 vm07 ceph-mon[48545]: osdmap e14: 3 total, 2 up, 3 in 2026-03-09T19:23:34.944 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:34 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:23:34.944 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:34 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T19:23:34.944 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:34 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:34 vm08 
ceph-mon[57794]: osdmap e14: 3 total, 2 up, 3 in 2026-03-09T19:23:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:34 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:23:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:34 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T19:23:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:34 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:36.062 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:35 vm07 ceph-mon[48545]: pgmap v25: 0 pgs: ; 0 B data, 57 MiB used, 40 GiB / 40 GiB avail 2026-03-09T19:23:36.062 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:35 vm07 ceph-mon[48545]: Deploying daemon osd.2 on vm07 2026-03-09T19:23:36.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:35 vm08 ceph-mon[57794]: pgmap v25: 0 pgs: ; 0 B data, 57 MiB used, 40 GiB / 40 GiB avail 2026-03-09T19:23:36.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:35 vm08 ceph-mon[57794]: Deploying daemon osd.2 on vm07 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stdout:Created osd(s) 2 on host 'vm07' 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.916+0000 7f4cfbfff640 1 -- 192.168.123.107:0/4020951659 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f4ce8002bf0 con 0x7f4cec076130 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.918+0000 7f4cf9ffb640 1 -- 192.168.123.107:0/4020951659 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4cec076130 msgr2=0x7f4cec0785f0 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.918+0000 7f4cf9ffb640 1 --2- 192.168.123.107:0/4020951659 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4cec076130 0x7f4cec0785f0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f4d1c1183b0 tx=0x7f4d04007450 comp rx=0 tx=0).stop 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.918+0000 7f4cf9ffb640 1 -- 192.168.123.107:0/4020951659 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d1c072710 msgr2=0x7f4d1c116c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.918+0000 7f4cf9ffb640 1 --2- 192.168.123.107:0/4020951659 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d1c072710 0x7f4d1c116c80 secure :-1 s=READY pgs=184 cs=0 l=1 rev1=1 crypto rx=0x7f4d10008000 tx=0x7f4d10030cc0 comp rx=0 tx=0).stop 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.918+0000 7f4cf9ffb640 1 -- 192.168.123.107:0/4020951659 shutdown_connections 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.919+0000 7f4cf9ffb640 1 --2- 192.168.123.107:0/4020951659 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4cec076130 0x7f4cec0785f0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.919+0000 7f4cf9ffb640 1 --2- 192.168.123.107:0/4020951659 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d1c1171c0 0x7f4d1c117640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.919+0000 7f4cf9ffb640 1 --2- 
192.168.123.107:0/4020951659 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d1c072710 0x7f4d1c116c80 unknown :-1 s=CLOSED pgs=184 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.919+0000 7f4cf9ffb640 1 -- 192.168.123.107:0/4020951659 >> 192.168.123.107:0/4020951659 conn(0x7f4d1c06d660 msgr2=0x7f4d1c10a860 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.919+0000 7f4cf9ffb640 1 -- 192.168.123.107:0/4020951659 shutdown_connections 2026-03-09T19:23:36.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:23:36.919+0000 7f4cf9ffb640 1 -- 192.168.123.107:0/4020951659 wait complete. 2026-03-09T19:23:36.974 DEBUG:teuthology.orchestra.run.vm07:osd.2> sudo journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.2.service 2026-03-09T19:23:36.976 INFO:tasks.cephadm:Deploying osd.3 on vm08 with /dev/vde... 
2026-03-09T19:23:36.976 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- lvm zap /dev/vde 2026-03-09T19:23:37.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:36 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:37.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:36 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:37.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:36 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:36 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:36 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:36 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:37.129 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm08/config 2026-03-09T19:23:37.641 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:23:37.655 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph orch daemon add osd vm08:/dev/vde 2026-03-09T19:23:37.800 
INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm08/config 2026-03-09T19:23:38.068 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.067+0000 7feb35f58640 1 -- 192.168.123.108:0/3645412261 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb30073b40 msgr2=0x7feb30073fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:38.068 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.067+0000 7feb35f58640 1 --2- 192.168.123.108:0/3645412261 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb30073b40 0x7feb30073fa0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7feb1c009a00 tx=0x7feb1c02f290 comp rx=0 tx=0).stop 2026-03-09T19:23:38.068 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.069+0000 7feb35f58640 1 -- 192.168.123.108:0/3645412261 shutdown_connections 2026-03-09T19:23:38.068 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.069+0000 7feb35f58640 1 --2- 192.168.123.108:0/3645412261 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb30073b40 0x7feb30073fa0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:38.068 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.069+0000 7feb35f58640 1 --2- 192.168.123.108:0/3645412261 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb300751a0 0x7feb30073600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:38.068 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.069+0000 7feb35f58640 1 -- 192.168.123.108:0/3645412261 >> 192.168.123.108:0/3645412261 conn(0x7feb300fbf80 msgr2=0x7feb300fe3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:38.068 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.069+0000 7feb35f58640 1 -- 192.168.123.108:0/3645412261 shutdown_connections 
2026-03-09T19:23:38.069 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.069+0000 7feb35f58640  1 -- 192.168.123.108:0/3645412261 wait complete.
2026-03-09T19:23:38.069 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.069+0000 7feb35f58640  1  Processor -- start
2026-03-09T19:23:38.069 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb35f58640  1 -- start start
2026-03-09T19:23:38.069 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb35f58640  1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb30073b40 0x7feb30071660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb35f58640  1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb300751a0 0x7feb30071ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb35f58640  1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb300730a0 con 0x7feb300751a0
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb35f58640  1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb30073210 con 0x7feb30073b40
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb2f7fe640  1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb30073b40 0x7feb30071660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb2f7fe640  1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb30073b40 0x7feb30071660 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.108:42902/0 (socket says 192.168.123.108:42902)
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb2f7fe640  1 -- 192.168.123.108:0/3211387674 learned_addr learned my addr 192.168.123.108:0/3211387674 (peer_addr_for_me v2:192.168.123.108:0/0)
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb2effd640  1 --2- 192.168.123.108:0/3211387674 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb300751a0 0x7feb30071ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb2f7fe640  1 -- 192.168.123.108:0/3211387674 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb300751a0 msgr2=0x7feb30071ba0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb2f7fe640  1 --2- 192.168.123.108:0/3211387674 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb300751a0 0x7feb30071ba0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.070+0000 7feb2f7fe640  1 -- 192.168.123.108:0/3211387674 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feb1c009660 con 0x7feb30073b40
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.071+0000 7feb2f7fe640  1 --2- 192.168.123.108:0/3211387674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb30073b40 0x7feb30071660 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7feb2000b750 tx=0x7feb2000bc20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:23:38.070 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.071+0000 7feb2cff9640  1 -- 192.168.123.108:0/3211387674 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb20004070 con 0x7feb30073b40
2026-03-09T19:23:38.071 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.071+0000 7feb35f58640  1 -- 192.168.123.108:0/3211387674 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feb30072230 con 0x7feb30073b40
2026-03-09T19:23:38.071 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.071+0000 7feb35f58640  1 -- 192.168.123.108:0/3211387674 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feb30104910 con 0x7feb30073b40
2026-03-09T19:23:38.072 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.072+0000 7feb2cff9640  1 -- 192.168.123.108:0/3211387674 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7feb200027a0 con 0x7feb30073b40
2026-03-09T19:23:38.072 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.072+0000 7feb2cff9640  1 -- 192.168.123.108:0/3211387674 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb2000cad0 con 0x7feb30073b40
2026-03-09T19:23:38.073 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.073+0000 7feb2cff9640  1 -- 192.168.123.108:0/3211387674 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 98424+0+0 (secure 0 0 0) 0x7feb2000ccf0 con 0x7feb30073b40
2026-03-09T19:23:38.073 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.073+0000 7feb35f58640  1 -- 192.168.123.108:0/3211387674 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feaf4005350 con 0x7feb30073b40
2026-03-09T19:23:38.076 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.076+0000 7feb2cff9640  1 --2- 192.168.123.108:0/3211387674 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7feb000760e0 0x7feb000785a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:23:38.076 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.076+0000 7feb2cff9640  1 -- 192.168.123.108:0/3211387674 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(14..14 src has 1..14) v4 ==== 2519+0+0 (secure 0 0 0) 0x7feb2009ed80 con 0x7feb30073b40
2026-03-09T19:23:38.076 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.076+0000 7feb2effd640  1 --2- 192.168.123.108:0/3211387674 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7feb000760e0 0x7feb000785a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:23:38.076 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.076+0000 7feb2cff9640  1 -- 192.168.123.108:0/3211387674 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7feb2009b050 con 0x7feb30073b40
2026-03-09T19:23:38.077 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:37 vm08 ceph-mon[57794]: pgmap v26: 0 pgs: ; 0 B data, 57 MiB used, 40 GiB / 40 GiB avail
2026-03-09T19:23:38.077 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:37 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:38.077 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:37 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:38.077 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:37 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:38.077 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:37 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:38.079 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.077+0000 7feb2effd640  1 --2- 192.168.123.108:0/3211387674 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7feb000760e0 0x7feb000785a0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7feb1c038660 tx=0x7feb1c038470 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:23:38.178 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:38.178+0000 7feb35f58640  1 -- 192.168.123.108:0/3211387674 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7feaf4002bf0 con 0x7feb000760e0
2026-03-09T19:23:38.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:37 vm07 ceph-mon[48545]: pgmap v26: 0 pgs: ; 0 B data, 57 MiB used, 40 GiB / 40 GiB avail
2026-03-09T19:23:38.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:37 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:38.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:37 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:38.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:37 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:38.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:37 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:38.229 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:23:38 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[83028]: 2026-03-09T19:23:38.128+0000 7f9a1dae8740 -1 osd.2 0 log_to_monitors true
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='osd.2 [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='client.24133 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='client.? 192.168.123.108:0/1672501119' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "175a772b-2920-452f-9d34-5c2a70bb1cb1"}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "175a772b-2920-452f-9d34-5c2a70bb1cb1"}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "175a772b-2920-452f-9d34-5c2a70bb1cb1"}]': finished
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: from='osd.2 [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-09T19:23:39.023 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:38 vm08 ceph-mon[57794]: osdmap e15: 4 total, 2 up, 4 in
2026-03-09T19:23:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='osd.2 [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='client.24133 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/1672501119' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "175a772b-2920-452f-9d34-5c2a70bb1cb1"}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "175a772b-2920-452f-9d34-5c2a70bb1cb1"}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "175a772b-2920-452f-9d34-5c2a70bb1cb1"}]': finished
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: from='osd.2 [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-09T19:23:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:38 vm07 ceph-mon[48545]: osdmap e15: 4 total, 2 up, 4 in
2026-03-09T19:23:40.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:39 vm07 ceph-mon[48545]: pgmap v27: 0 pgs: ; 0 B data, 57 MiB used, 40 GiB / 40 GiB avail
2026-03-09T19:23:40.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:39 vm07 ceph-mon[48545]: Detected new or changed devices on vm07
2026-03-09T19:23:40.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:39 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T19:23:40.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:39 vm07 ceph-mon[48545]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-09T19:23:40.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:39 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T19:23:40.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:39 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/650266906' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T19:23:40.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:39 vm07 ceph-mon[48545]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished
2026-03-09T19:23:40.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:39 vm07 ceph-mon[48545]: osdmap e16: 4 total, 2 up, 4 in
2026-03-09T19:23:40.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:39 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T19:23:40.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:39 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T19:23:40.229 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:23:39 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[83028]: 2026-03-09T19:23:39.915+0000 7f9a19244640 -1 osd.2 0 waiting for initial osdmap
2026-03-09T19:23:40.229 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:23:39 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[83028]: 2026-03-09T19:23:39.978+0000 7f9a15880640 -1 osd.2 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-09T19:23:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:39 vm08 ceph-mon[57794]: pgmap v27: 0 pgs: ; 0 B data, 57 MiB used, 40 GiB / 40 GiB avail
2026-03-09T19:23:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:39 vm08 ceph-mon[57794]: Detected new or changed devices on vm07
2026-03-09T19:23:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:39 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T19:23:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:39 vm08 ceph-mon[57794]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-09T19:23:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:39 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T19:23:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:39 vm08 ceph-mon[57794]: from='client.? 192.168.123.108:0/650266906' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T19:23:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:39 vm08 ceph-mon[57794]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished
2026-03-09T19:23:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:39 vm08 ceph-mon[57794]: osdmap e16: 4 total, 2 up, 4 in
2026-03-09T19:23:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:39 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T19:23:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:39 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T19:23:41.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:40 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T19:23:41.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:40 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T19:23:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:41 vm08 ceph-mon[57794]: purged_snaps scrub starts
2026-03-09T19:23:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:41 vm08 ceph-mon[57794]: purged_snaps scrub ok
2026-03-09T19:23:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:41 vm08 ceph-mon[57794]: pgmap v30: 0 pgs: ; 0 B data, 57 MiB used, 40 GiB / 40 GiB avail
2026-03-09T19:23:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:41 vm08 ceph-mon[57794]: osd.2 [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426] boot
2026-03-09T19:23:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:41 vm08 ceph-mon[57794]: osdmap e17: 4 total, 3 up, 4 in
2026-03-09T19:23:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:41 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T19:23:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:41 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T19:23:42.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:41 vm07 ceph-mon[48545]: purged_snaps scrub starts
2026-03-09T19:23:42.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:41 vm07 ceph-mon[48545]: purged_snaps scrub ok
2026-03-09T19:23:42.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:41 vm07 ceph-mon[48545]: pgmap v30: 0 pgs: ; 0 B data, 57 MiB used, 40 GiB / 40 GiB avail
2026-03-09T19:23:42.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:41 vm07 ceph-mon[48545]: osd.2 [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426] boot
2026-03-09T19:23:42.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:41 vm07 ceph-mon[48545]: osdmap e17: 4 total, 3 up, 4 in
2026-03-09T19:23:42.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:41 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T19:23:42.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:41 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T19:23:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:43 vm08 ceph-mon[57794]: osdmap e18: 4 total, 3 up, 4 in
2026-03-09T19:23:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:43 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T19:23:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:43 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
2026-03-09T19:23:43.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:42 vm07 ceph-mon[48545]: osdmap e18: 4 total, 3 up, 4 in
2026-03-09T19:23:43.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:43 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T19:23:43.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:43 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
2026-03-09T19:23:44.289 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 ceph-mon[57794]: pgmap v33: 0 pgs: ; 0 B data, 83 MiB used, 60 GiB / 60 GiB avail
2026-03-09T19:23:44.289 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
2026-03-09T19:23:44.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 ceph-mon[57794]: osdmap e19: 4 total, 3 up, 4 in
2026-03-09T19:23:44.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T19:23:44.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
2026-03-09T19:23:44.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-09T19:23:44.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:44.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 ceph-mon[57794]: Deploying daemon osd.3 on vm08
2026-03-09T19:23:44.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 sudo[63405]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda
2026-03-09T19:23:44.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 sudo[63405]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
2026-03-09T19:23:44.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 sudo[63405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167)
2026-03-09T19:23:44.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:44 vm08 sudo[63405]: pam_unix(sudo:session): session closed for user root
2026-03-09T19:23:44.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 ceph-mon[48545]: pgmap v33: 0 pgs: ; 0 B data, 83 MiB used, 60 GiB / 60 GiB avail
2026-03-09T19:23:44.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
2026-03-09T19:23:44.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 ceph-mon[48545]: osdmap e19: 4 total, 3 up, 4 in
2026-03-09T19:23:44.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T19:23:44.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
2026-03-09T19:23:44.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-09T19:23:44.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:23:44.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 ceph-mon[48545]: Deploying daemon osd.3 on vm08
2026-03-09T19:23:44.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88294]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda
2026-03-09T19:23:44.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88294]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
2026-03-09T19:23:44.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167)
2026-03-09T19:23:44.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88294]: pam_unix(sudo:session): session closed for user root
2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88286]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vdd
2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88286]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167)
2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88286]: pam_unix(sudo:session): session closed for user root
2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88282]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vde
2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88282]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167)
2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88282]: pam_unix(sudo:session): session closed for user root 2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88290]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vdc 2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88290]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167) 2026-03-09T19:23:44.479 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:23:44 vm07 sudo[88290]: pam_unix(sudo:session): session closed for user root 2026-03-09T19:23:45.025 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-09T19:23:45.030 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.029+0000 7feb2cff9640 1 -- 192.168.123.108:0/3211387674 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7feb200029a0 con 0x7feb30073b40 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: osdmap e20: 4 total, 3 up, 4 in 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T19:23:45.276 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:45.276 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:45 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' 
entity='mgr.vm07.xacuym' 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: osdmap e20: 4 total, 3 up, 4 in 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:23:45.478 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:45.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:45 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:45.724 INFO:teuthology.orchestra.run.vm08.stdout:Created osd(s) 3 on host 'vm08' 2026-03-09T19:23:45.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.722+0000 7feb2cff9640 1 -- 192.168.123.108:0/3211387674 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7feaf4002bf0 con 0x7feb000760e0 2026-03-09T19:23:45.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.724+0000 7feb35f58640 1 -- 192.168.123.108:0/3211387674 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7feb000760e0 msgr2=0x7feb000785a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:45.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.724+0000 7feb35f58640 1 --2- 192.168.123.108:0/3211387674 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7feb000760e0 0x7feb000785a0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7feb1c038660 
tx=0x7feb1c038470 comp rx=0 tx=0).stop 2026-03-09T19:23:45.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.724+0000 7feb35f58640 1 -- 192.168.123.108:0/3211387674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb30073b40 msgr2=0x7feb30071660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:45.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.724+0000 7feb35f58640 1 --2- 192.168.123.108:0/3211387674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb30073b40 0x7feb30071660 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7feb2000b750 tx=0x7feb2000bc20 comp rx=0 tx=0).stop 2026-03-09T19:23:45.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.725+0000 7feb35f58640 1 -- 192.168.123.108:0/3211387674 shutdown_connections 2026-03-09T19:23:45.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.725+0000 7feb35f58640 1 --2- 192.168.123.108:0/3211387674 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7feb000760e0 0x7feb000785a0 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:45.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.725+0000 7feb35f58640 1 --2- 192.168.123.108:0/3211387674 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb300751a0 0x7feb30071ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:45.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.725+0000 7feb35f58640 1 --2- 192.168.123.108:0/3211387674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb30073b40 0x7feb30071660 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:45.725 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.725+0000 7feb35f58640 1 -- 192.168.123.108:0/3211387674 >> 192.168.123.108:0/3211387674 
conn(0x7feb300fbf80 msgr2=0x7feb300fdb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:45.725 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.725+0000 7feb35f58640 1 -- 192.168.123.108:0/3211387674 shutdown_connections 2026-03-09T19:23:45.725 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:45.725+0000 7feb35f58640 1 -- 192.168.123.108:0/3211387674 wait complete. 2026-03-09T19:23:45.785 DEBUG:teuthology.orchestra.run.vm08:osd.3> sudo journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.3.service 2026-03-09T19:23:45.787 INFO:tasks.cephadm:Deploying osd.4 on vm08 with /dev/vdd... 2026-03-09T19:23:45.787 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- lvm zap /dev/vdd 2026-03-09T19:23:46.028 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm08/config 2026-03-09T19:23:46.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:46 vm08 ceph-mon[57794]: pgmap v36: 1 pgs: 1 creating+peering; 0 B data, 83 MiB used, 60 GiB / 60 GiB avail 2026-03-09T19:23:46.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:46 vm08 ceph-mon[57794]: mgrmap e19: vm07.xacuym(active, since 56s), standbys: vm08.mxylvw 2026-03-09T19:23:46.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:46 vm08 ceph-mon[57794]: osdmap e21: 4 total, 3 up, 4 in 2026-03-09T19:23:46.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:46 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:46.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:46 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:46.083 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:46 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:46.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:46 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:46.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:46 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:46.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:46 vm07 ceph-mon[48545]: pgmap v36: 1 pgs: 1 creating+peering; 0 B data, 83 MiB used, 60 GiB / 60 GiB avail 2026-03-09T19:23:46.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:46 vm07 ceph-mon[48545]: mgrmap e19: vm07.xacuym(active, since 56s), standbys: vm08.mxylvw 2026-03-09T19:23:46.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:46 vm07 ceph-mon[48545]: osdmap e21: 4 total, 3 up, 4 in 2026-03-09T19:23:46.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:46 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:46.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:46 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:46.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:46 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:46.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:46 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:46.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:46 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:46.560 INFO:teuthology.orchestra.run.vm08.stdout: 
2026-03-09T19:23:46.578 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph orch daemon add osd vm08:/dev/vdd 2026-03-09T19:23:46.791 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm08/config 2026-03-09T19:23:47.141 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.141+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1075252019 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbac4072710 msgr2=0x7fbac410c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:47.142 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.141+0000 7fbac9eb2640 1 --2- 192.168.123.108:0/1075252019 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbac4072710 0x7fbac410c590 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fbaa40099b0 tx=0x7fbaa402f220 comp rx=0 tx=0).stop 2026-03-09T19:23:47.142 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.142+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1075252019 shutdown_connections 2026-03-09T19:23:47.142 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.142+0000 7fbac9eb2640 1 --2- 192.168.123.108:0/1075252019 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbac4072710 0x7fbac410c590 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:47.142 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.142+0000 7fbac9eb2640 1 --2- 192.168.123.108:0/1075252019 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbac4071d40 0x7fbac4072140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:47.142 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.142+0000 7fbac9eb2640 1 -- 
192.168.123.108:0/1075252019 >> 192.168.123.108:0/1075252019 conn(0x7fbac406d660 msgr2=0x7fbac406faa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:47.142 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.142+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1075252019 shutdown_connections 2026-03-09T19:23:47.142 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.142+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1075252019 wait complete. 2026-03-09T19:23:47.144 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.145+0000 7fbac9eb2640 1 Processor -- start 2026-03-09T19:23:47.144 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.145+0000 7fbac9eb2640 1 -- start start 2026-03-09T19:23:47.144 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.145+0000 7fbac9eb2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbac4071d40 0x7fbac41a7320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:47.145 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.145+0000 7fbac9eb2640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbac4072710 0x7fbac41a7860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:47.145 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.145+0000 7fbac9eb2640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbac41a7da0 con 0x7fbac4071d40 2026-03-09T19:23:47.145 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.145+0000 7fbac9eb2640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbac41a7f10 con 0x7fbac4072710 2026-03-09T19:23:47.145 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.145+0000 7fbac37fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbac4071d40 0x7fbac41a7320 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:47.145 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.146+0000 7fbabbfff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbac4072710 0x7fbac41a7860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:47.145 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.146+0000 7fbabbfff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbac4072710 0x7fbac41a7860 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.108:42982/0 (socket says 192.168.123.108:42982) 2026-03-09T19:23:47.145 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.146+0000 7fbabbfff640 1 -- 192.168.123.108:0/1506034079 learned_addr learned my addr 192.168.123.108:0/1506034079 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:23:47.145 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.146+0000 7fbabbfff640 1 -- 192.168.123.108:0/1506034079 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbac4071d40 msgr2=0x7fbac41a7320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:47.145 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.146+0000 7fbabbfff640 1 --2- 192.168.123.108:0/1506034079 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbac4071d40 0x7fbac41a7320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:47.146 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.146+0000 7fbabbfff640 1 -- 192.168.123.108:0/1506034079 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fbaa4009660 con 0x7fbac4072710 2026-03-09T19:23:47.146 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.146+0000 7fbabbfff640 1 --2- 192.168.123.108:0/1506034079 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbac4072710 0x7fbac41a7860 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fbaa402f730 tx=0x7fbaa4031cd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:47.146 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.147+0000 7fbac17fa640 1 -- 192.168.123.108:0/1506034079 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbaa403d070 con 0x7fbac4072710 2026-03-09T19:23:47.147 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.147+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1506034079 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbac41ac990 con 0x7fbac4072710 2026-03-09T19:23:47.147 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.147+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1506034079 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbac41ace30 con 0x7fbac4072710 2026-03-09T19:23:47.147 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.147+0000 7fbac17fa640 1 -- 192.168.123.108:0/1506034079 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbaa4031d40 con 0x7fbac4072710 2026-03-09T19:23:47.147 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.147+0000 7fbac17fa640 1 -- 192.168.123.108:0/1506034079 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbaa4031070 con 0x7fbac4072710 2026-03-09T19:23:47.149 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.149+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1506034079 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fba84005350 con 0x7fbac4072710 2026-03-09T19:23:47.152 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.153+0000 7fbac17fa640 1 -- 192.168.123.108:0/1506034079 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fbaa40387c0 con 0x7fbac4072710 2026-03-09T19:23:47.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.153+0000 7fbac17fa640 1 --2- 192.168.123.108:0/1506034079 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fba900761c0 0x7fba90078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:47.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.153+0000 7fbac17fa640 1 -- 192.168.123.108:0/1506034079 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(21..21 src has 1..21) v4 ==== 3337+0+0 (secure 0 0 0) 0x7fbaa40bc390 con 0x7fbac4072710 2026-03-09T19:23:47.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.154+0000 7fbac17fa640 1 -- 192.168.123.108:0/1506034079 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fbaa40bc6f0 con 0x7fbac4072710 2026-03-09T19:23:47.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.154+0000 7fbac37fe640 1 --2- 192.168.123.108:0/1506034079 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fba900761c0 0x7fba90078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:47.154 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.155+0000 7fbac37fe640 1 --2- 192.168.123.108:0/1506034079 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fba900761c0 0x7fba90078680 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto 
rx=0x7fbab4004620 tx=0x7fbab400a400 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:47.265 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:47.266+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1506034079 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7fba84002bf0 con 0x7fba900761c0 2026-03-09T19:23:47.270 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:23:47 vm08 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[63647]: 2026-03-09T19:23:47.017+0000 7f42f1e7f740 -1 osd.3 0 log_to_monitors true 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: pgmap v38: 1 pgs: 1 creating+peering; 0 B data, 83 MiB used, 60 GiB / 60 GiB avail 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='osd.3 [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:48.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:48 vm08 
ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: pgmap v38: 1 pgs: 1 creating+peering; 0 B data, 83 MiB used, 60 GiB / 60 GiB avail 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='osd.3 [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:48.357 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:48.358 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:48 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: from='client.24157 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: Detected new or changed devices on vm08 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: from='client.? 
192.168.123.108:0/1423503810' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "a265b553-1e86-4bab-beff-db9f81381120"}]: dispatch 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "a265b553-1e86-4bab-beff-db9f81381120"}]: dispatch 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: from='osd.3 [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338]' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: osdmap e22: 4 total, 3 up, 4 in 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: from='osd.3 [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "a265b553-1e86-4bab-beff-db9f81381120"}]': finished 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: osdmap e23: 5 total, 3 up, 5 in 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:23:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:49 vm08 ceph-mon[57794]: from='client.? 
192.168.123.108:0/1513177298' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T19:23:49.346 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:23:49 vm08 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[63647]: 2026-03-09T19:23:49.049+0000 7f42ed5db640 -1 osd.3 0 waiting for initial osdmap 2026-03-09T19:23:49.346 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:23:49 vm08 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[63647]: 2026-03-09T19:23:49.064+0000 7f42e9c17640 -1 osd.3 24 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: from='client.24157 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: Detected new or changed devices on vm08 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/1423503810' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "a265b553-1e86-4bab-beff-db9f81381120"}]: dispatch 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: from='client.? 
' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "a265b553-1e86-4bab-beff-db9f81381120"}]: dispatch 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: from='osd.3 [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338]' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: osdmap e22: 4 total, 3 up, 4 in 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: from='osd.3 [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "a265b553-1e86-4bab-beff-db9f81381120"}]': finished 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: osdmap e23: 5 total, 3 up, 5 in 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:23:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:49 vm07 ceph-mon[48545]: from='client.? 
192.168.123.108:0/1513177298' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T19:23:50.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:50 vm08 ceph-mon[57794]: purged_snaps scrub starts 2026-03-09T19:23:50.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:50 vm08 ceph-mon[57794]: purged_snaps scrub ok 2026-03-09T19:23:50.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:50 vm08 ceph-mon[57794]: pgmap v41: 1 pgs: 1 active+clean; 449 KiB data, 85 MiB used, 60 GiB / 60 GiB avail 2026-03-09T19:23:50.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:50 vm08 ceph-mon[57794]: from='osd.3 [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338]' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-09T19:23:50.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:50 vm08 ceph-mon[57794]: osdmap e24: 5 total, 3 up, 5 in 2026-03-09T19:23:50.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:50 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:50.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:50 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:50.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:50 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:50 vm07 ceph-mon[48545]: purged_snaps scrub starts 2026-03-09T19:23:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:50 vm07 ceph-mon[48545]: purged_snaps scrub ok 2026-03-09T19:23:50.478 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:50 vm07 ceph-mon[48545]: pgmap v41: 1 pgs: 1 active+clean; 449 KiB data, 85 MiB used, 60 GiB / 60 GiB avail 2026-03-09T19:23:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:50 vm07 ceph-mon[48545]: from='osd.3 [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338]' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-09T19:23:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:50 vm07 ceph-mon[48545]: osdmap e24: 5 total, 3 up, 5 in 2026-03-09T19:23:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:50 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:50 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:50 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:51.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:51 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:51.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:51 vm08 ceph-mon[57794]: osd.3 [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338] boot 2026-03-09T19:23:51.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:51 vm08 ceph-mon[57794]: osdmap e25: 5 total, 4 up, 5 in 2026-03-09T19:23:51.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:51 vm08 ceph-mon[57794]: from='mgr.14227 
192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:51.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:51 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:51.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:51.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:51 vm07 ceph-mon[48545]: osd.3 [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338] boot 2026-03-09T19:23:51.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:51 vm07 ceph-mon[48545]: osdmap e25: 5 total, 4 up, 5 in 2026-03-09T19:23:51.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:23:51.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:51 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:52.309 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:52 vm08 ceph-mon[57794]: pgmap v44: 1 pgs: 1 active+clean; 449 KiB data, 85 MiB used, 60 GiB / 60 GiB avail 2026-03-09T19:23:52.309 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:52 vm08 ceph-mon[57794]: osdmap e26: 5 total, 4 up, 5 in 2026-03-09T19:23:52.309 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:52 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:52.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:52 vm07 ceph-mon[48545]: pgmap 
v44: 1 pgs: 1 active+clean; 449 KiB data, 85 MiB used, 60 GiB / 60 GiB avail 2026-03-09T19:23:52.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:52 vm07 ceph-mon[48545]: osdmap e26: 5 total, 4 up, 5 in 2026-03-09T19:23:52.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:52 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:53.131 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:53 vm08 ceph-mon[57794]: osdmap e27: 5 total, 4 up, 5 in 2026-03-09T19:23:53.131 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:53 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:53.131 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:53 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T19:23:53.131 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:53 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:53.131 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:53 vm08 ceph-mon[57794]: Deploying daemon osd.4 on vm08 2026-03-09T19:23:53.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:53 vm07 ceph-mon[48545]: osdmap e27: 5 total, 4 up, 5 in 2026-03-09T19:23:53.501 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:53 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:53.501 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:53 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: 
dispatch 2026-03-09T19:23:53.501 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:53 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:53.501 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:53 vm07 ceph-mon[48545]: Deploying daemon osd.4 on vm08 2026-03-09T19:23:54.264 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:54 vm08 ceph-mon[57794]: pgmap v47: 1 pgs: 1 active+clean; 449 KiB data, 111 MiB used, 80 GiB / 80 GiB avail 2026-03-09T19:23:54.264 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:54 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:54.264 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:54 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:54.264 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:54 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:54 vm07 ceph-mon[48545]: pgmap v47: 1 pgs: 1 active+clean; 449 KiB data, 111 MiB used, 80 GiB / 80 GiB avail 2026-03-09T19:23:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:54 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stdout:Created 
osd(s) 4 on host 'vm08' 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.498+0000 7fbac17fa640 1 -- 192.168.123.108:0/1506034079 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fba84002bf0 con 0x7fba900761c0 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.500+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1506034079 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fba900761c0 msgr2=0x7fba90078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.500+0000 7fbac9eb2640 1 --2- 192.168.123.108:0/1506034079 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fba900761c0 0x7fba90078680 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fbab4004620 tx=0x7fbab400a400 comp rx=0 tx=0).stop 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.500+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1506034079 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbac4072710 msgr2=0x7fbac41a7860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.500+0000 7fbac9eb2640 1 --2- 192.168.123.108:0/1506034079 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbac4072710 0x7fbac41a7860 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fbaa402f730 tx=0x7fbaa4031cd0 comp rx=0 tx=0).stop 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.500+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1506034079 shutdown_connections 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.500+0000 7fbac9eb2640 1 --2- 192.168.123.108:0/1506034079 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fba900761c0 0x7fba90078680 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.500+0000 7fbac9eb2640 1 --2- 192.168.123.108:0/1506034079 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbac4072710 0x7fbac41a7860 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.500+0000 7fbac9eb2640 1 --2- 192.168.123.108:0/1506034079 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbac4071d40 0x7fbac41a7320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.500+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1506034079 >> 192.168.123.108:0/1506034079 conn(0x7fbac406d660 msgr2=0x7fbac410a830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.501+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1506034079 shutdown_connections 2026-03-09T19:23:54.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:54.501+0000 7fbac9eb2640 1 -- 192.168.123.108:0/1506034079 wait complete. 2026-03-09T19:23:54.546 DEBUG:teuthology.orchestra.run.vm08:osd.4> sudo journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.4.service 2026-03-09T19:23:54.548 INFO:tasks.cephadm:Deploying osd.5 on vm08 with /dev/vdc... 
2026-03-09T19:23:54.548 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- lvm zap /dev/vdc 2026-03-09T19:23:54.739 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm08/config 2026-03-09T19:23:55.262 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:23:55.281 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph orch daemon add osd vm08:/dev/vdc 2026-03-09T19:23:55.447 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm08/config 2026-03-09T19:23:55.649 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:55 vm08 ceph-mon[57794]: pgmap v48: 1 pgs: 1 active+clean; 449 KiB data, 111 MiB used, 80 GiB / 80 GiB avail; 85 KiB/s, 0 objects/s recovering 2026-03-09T19:23:55.649 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:55 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:55.649 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:55 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:55.649 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:55 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:55.649 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:55 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:55.649 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:23:55 vm08 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[69823]: 
2026-03-09T19:23:55.518+0000 7f8c3b574740 -1 osd.4 0 log_to_monitors true 2026-03-09T19:23:55.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.724+0000 7fe77f273640 1 -- 192.168.123.108:0/2057396290 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7781028d0 msgr2=0x7fe778102cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:55.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.724+0000 7fe77f273640 1 --2- 192.168.123.108:0/2057396290 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7781028d0 0x7fe778102cd0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fe7680099b0 tx=0x7fe76802f240 comp rx=0 tx=0).stop 2026-03-09T19:23:55.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.725+0000 7fe77f273640 1 -- 192.168.123.108:0/2057396290 shutdown_connections 2026-03-09T19:23:55.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.725+0000 7fe77f273640 1 --2- 192.168.123.108:0/2057396290 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe778103ad0 0x7fe778103f50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:55.724 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.725+0000 7fe77f273640 1 --2- 192.168.123.108:0/2057396290 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7781028d0 0x7fe778102cd0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:55.725 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.725+0000 7fe77f273640 1 -- 192.168.123.108:0/2057396290 >> 192.168.123.108:0/2057396290 conn(0x7fe7780fe0c0 msgr2=0x7fe7781004e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:23:55.725 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.725+0000 7fe77f273640 1 -- 192.168.123.108:0/2057396290 shutdown_connections 2026-03-09T19:23:55.725 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.725+0000 7fe77f273640 1 -- 192.168.123.108:0/2057396290 wait complete. 2026-03-09T19:23:55.726 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.726+0000 7fe77f273640 1 Processor -- start 2026-03-09T19:23:55.727 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.727+0000 7fe77f273640 1 -- start start 2026-03-09T19:23:55.727 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.727+0000 7fe77f273640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7781028d0 0x7fe7780716a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:55.727 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.727+0000 7fe77f273640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe778103ad0 0x7fe778071be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:55.727 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.727+0000 7fe77f273640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe7780730e0 con 0x7fe778103ad0 2026-03-09T19:23:55.728 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.727+0000 7fe77f273640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe778073250 con 0x7fe7781028d0 2026-03-09T19:23:55.728 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.728+0000 7fe76ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe778103ad0 0x7fe778071be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:55.728 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.728+0000 7fe76ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe778103ad0 0x7fe778071be0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:42024/0 (socket says 192.168.123.108:42024) 2026-03-09T19:23:55.728 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.728+0000 7fe76ffff640 1 -- 192.168.123.108:0/2168618129 learned_addr learned my addr 192.168.123.108:0/2168618129 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:23:55.728 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.728+0000 7fe77cfe8640 1 --2- 192.168.123.108:0/2168618129 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7781028d0 0x7fe7780716a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:55.728 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.728+0000 7fe76ffff640 1 -- 192.168.123.108:0/2168618129 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7781028d0 msgr2=0x7fe7780716a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:23:55.728 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.728+0000 7fe76ffff640 1 --2- 192.168.123.108:0/2168618129 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7781028d0 0x7fe7780716a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:23:55.728 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.728+0000 7fe76ffff640 1 -- 192.168.123.108:0/2168618129 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe768009660 con 0x7fe778103ad0 2026-03-09T19:23:55.728 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.728+0000 7fe77cfe8640 1 --2- 192.168.123.108:0/2168618129 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7781028d0 0x7fe7780716a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:23:55.728 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.729+0000 7fe76ffff640 1 --2- 192.168.123.108:0/2168618129 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe778103ad0 0x7fe778071be0 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7fe76000b4f0 tx=0x7fe76000b9c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:55.728 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.729+0000 7fe76dffb640 1 -- 192.168.123.108:0/2168618129 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe760004280 con 0x7fe778103ad0 2026-03-09T19:23:55.729 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.729+0000 7fe77f273640 1 -- 192.168.123.108:0/2168618129 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe7780797f0 con 0x7fe778103ad0 2026-03-09T19:23:55.729 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.729+0000 7fe77f273640 1 -- 192.168.123.108:0/2168618129 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe7780723c0 con 0x7fe778103ad0 2026-03-09T19:23:55.729 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.730+0000 7fe76dffb640 1 -- 192.168.123.108:0/2168618129 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe7600043e0 con 0x7fe778103ad0 2026-03-09T19:23:55.730 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.730+0000 7fe76dffb640 1 -- 192.168.123.108:0/2168618129 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe760010b50 con 0x7fe778103ad0 2026-03-09T19:23:55.730 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.730+0000 7fe77f273640 1 -- 192.168.123.108:0/2168618129 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe778102cd0 con 0x7fe778103ad0 2026-03-09T19:23:55.730 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.731+0000 7fe76dffb640 1 -- 192.168.123.108:0/2168618129 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fe76001a460 con 0x7fe778103ad0 2026-03-09T19:23:55.730 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.731+0000 7fe76dffb640 1 --2- 192.168.123.108:0/2168618129 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe74c076080 0x7fe74c078540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:23:55.731 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.731+0000 7fe76dffb640 1 -- 192.168.123.108:0/2168618129 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(27..27 src has 1..27) v4 ==== 3869+0+0 (secure 0 0 0) 0x7fe7600967c0 con 0x7fe778103ad0 2026-03-09T19:23:55.731 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.731+0000 7fe77cfe8640 1 --2- 192.168.123.108:0/2168618129 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe74c076080 0x7fe74c078540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:23:55.731 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.732+0000 7fe77cfe8640 1 --2- 192.168.123.108:0/2168618129 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe74c076080 0x7fe74c078540 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fe768002bf0 tx=0x7fe76803a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:23:55.733 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.734+0000 7fe76dffb640 1 -- 192.168.123.108:0/2168618129 
<== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fe760060440 con 0x7fe778103ad0 2026-03-09T19:23:55.832 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:23:55.832+0000 7fe77f273640 1 -- 192.168.123.108:0/2168618129 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7fe778107d10 con 0x7fe74c076080 2026-03-09T19:23:55.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:55 vm07 ceph-mon[48545]: pgmap v48: 1 pgs: 1 active+clean; 449 KiB data, 111 MiB used, 80 GiB / 80 GiB avail; 85 KiB/s, 0 objects/s recovering 2026-03-09T19:23:55.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:55 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:55.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:55 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:55.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:55 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:55.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:55 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='client.14340 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdc", "target": ["mon-mgr", ""]}]: 
dispatch 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: Detected new or changed devices on vm08 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 
2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:56.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:56 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:56.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T19:23:56.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='client.14340 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:23:56.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T19:23:56.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 
192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T19:23:56.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:56.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: Detected new or changed devices on vm08 2026-03-09T19:23:56.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:56.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:56.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:23:56.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:56.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:56.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:56.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: 
dispatch 2026-03-09T19:23:56.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:23:56.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:23:56.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:56 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 112 MiB used, 80 GiB / 80 GiB avail; 71 KiB/s, 0 objects/s recovering 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: from='osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370]' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: osdmap e28: 5 total, 4 up, 5 in 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: from='osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: from='client.? 
192.168.123.108:0/3861203896' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "44d4390e-f9a3-490b-9d44-f60b53e3d568"}]: dispatch 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "44d4390e-f9a3-490b-9d44-f60b53e3d568"}]: dispatch 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: from='osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370]' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "44d4390e-f9a3-490b-9d44-f60b53e3d568"}]': finished 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: osdmap e29: 6 total, 4 up, 6 in 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:57.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:57 vm08 ceph-mon[57794]: from='client.? 
192.168.123.108:0/100181467' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T19:23:57.846 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:23:57 vm08 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[69823]: 2026-03-09T19:23:57.550+0000 7f8c36cd0640 -1 osd.4 0 waiting for initial osdmap 2026-03-09T19:23:57.846 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:23:57 vm08 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[69823]: 2026-03-09T19:23:57.560+0000 7f8c32b0b640 -1 osd.4 29 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 112 MiB used, 80 GiB / 80 GiB avail; 71 KiB/s, 0 objects/s recovering 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: from='osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370]' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: osdmap e28: 5 total, 4 up, 5 in 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: from='osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: from='client.? 
192.168.123.108:0/3861203896' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "44d4390e-f9a3-490b-9d44-f60b53e3d568"}]: dispatch 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "44d4390e-f9a3-490b-9d44-f60b53e3d568"}]: dispatch 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: from='osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370]' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "44d4390e-f9a3-490b-9d44-f60b53e3d568"}]': finished 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: osdmap e29: 6 total, 4 up, 6 in 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:57 vm07 ceph-mon[48545]: from='client.? 
192.168.123.108:0/100181467' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T19:23:58.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:58 vm08 ceph-mon[57794]: purged_snaps scrub starts 2026-03-09T19:23:58.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:58 vm08 ceph-mon[57794]: purged_snaps scrub ok 2026-03-09T19:23:58.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:58 vm08 ceph-mon[57794]: from='osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370]' entity='osd.4' 2026-03-09T19:23:58.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:58 vm08 ceph-mon[57794]: osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370] boot 2026-03-09T19:23:58.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:58 vm08 ceph-mon[57794]: osdmap e30: 6 total, 5 up, 6 in 2026-03-09T19:23:58.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:58 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:58.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:58 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:23:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:58 vm07 ceph-mon[48545]: purged_snaps scrub starts 2026-03-09T19:23:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:58 vm07 ceph-mon[48545]: purged_snaps scrub ok 2026-03-09T19:23:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:58 vm07 ceph-mon[48545]: from='osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370]' entity='osd.4' 2026-03-09T19:23:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:58 vm07 ceph-mon[48545]: osd.4 [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370] boot 2026-03-09T19:23:58.978 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:58 vm07 ceph-mon[48545]: osdmap e30: 6 total, 5 up, 6 in 2026-03-09T19:23:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:58 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:23:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:58 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:23:59.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:59 vm07 ceph-mon[48545]: pgmap v53: 1 pgs: 1 active+clean; 449 KiB data, 538 MiB used, 99 GiB / 100 GiB avail; 75 KiB/s, 0 objects/s recovering 2026-03-09T19:23:59.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:59 vm07 ceph-mon[48545]: osdmap e31: 6 total, 5 up, 6 in 2026-03-09T19:23:59.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:23:59 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:00.039 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:59 vm08 ceph-mon[57794]: pgmap v53: 1 pgs: 1 active+clean; 449 KiB data, 538 MiB used, 99 GiB / 100 GiB avail; 75 KiB/s, 0 objects/s recovering 2026-03-09T19:24:00.039 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:59 vm08 ceph-mon[57794]: osdmap e31: 6 total, 5 up, 6 in 2026-03-09T19:24:00.039 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:23:59 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:01.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:01 vm08 ceph-mon[57794]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 538 MiB used, 99 GiB / 100 GiB avail 2026-03-09T19:24:01.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:24:01 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T19:24:01.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:01 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:01.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:01 vm08 ceph-mon[57794]: Deploying daemon osd.5 on vm08 2026-03-09T19:24:01.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:01 vm07 ceph-mon[48545]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 538 MiB used, 99 GiB / 100 GiB avail 2026-03-09T19:24:01.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:01 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T19:24:01.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:01 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:01.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:01 vm07 ceph-mon[48545]: Deploying daemon osd.5 on vm08 2026-03-09T19:24:02.708 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:02 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:02.708 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:02 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:02.708 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:02 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:02.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:02 vm07 
ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:02.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:02 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:02.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:02 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stdout:Created osd(s) 5 on host 'vm08' 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.047+0000 7fe76dffb640 1 -- 192.168.123.108:0/2168618129 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fe778107d10 con 0x7fe74c076080 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.049+0000 7fe77f273640 1 -- 192.168.123.108:0/2168618129 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe74c076080 msgr2=0x7fe74c078540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.049+0000 7fe77f273640 1 --2- 192.168.123.108:0/2168618129 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe74c076080 0x7fe74c078540 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fe768002bf0 tx=0x7fe76803a040 comp rx=0 tx=0).stop 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.049+0000 7fe77f273640 1 -- 192.168.123.108:0/2168618129 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe778103ad0 msgr2=0x7fe778071be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.049+0000 7fe77f273640 1 --2- 
192.168.123.108:0/2168618129 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe778103ad0 0x7fe778071be0 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7fe76000b4f0 tx=0x7fe76000b9c0 comp rx=0 tx=0).stop 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.050+0000 7fe77f273640 1 -- 192.168.123.108:0/2168618129 shutdown_connections 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.050+0000 7fe77f273640 1 --2- 192.168.123.108:0/2168618129 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe74c076080 0x7fe74c078540 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.050+0000 7fe77f273640 1 --2- 192.168.123.108:0/2168618129 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe778103ad0 0x7fe778071be0 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.050+0000 7fe77f273640 1 --2- 192.168.123.108:0/2168618129 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7781028d0 0x7fe7780716a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.050+0000 7fe77f273640 1 -- 192.168.123.108:0/2168618129 >> 192.168.123.108:0/2168618129 conn(0x7fe7780fe0c0 msgr2=0x7fe7780ffc10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.050+0000 7fe77f273640 1 -- 192.168.123.108:0/2168618129 shutdown_connections 2026-03-09T19:24:03.049 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:03.050+0000 7fe77f273640 1 -- 192.168.123.108:0/2168618129 wait complete. 
2026-03-09T19:24:03.105 DEBUG:teuthology.orchestra.run.vm08:osd.5> sudo journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.5.service 2026-03-09T19:24:03.106 INFO:tasks.cephadm:Waiting for 6 OSDs to come up... 2026-03-09T19:24:03.106 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd stat -f json 2026-03-09T19:24:03.278 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:03.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.532+0000 7f17d4b1f640 1 -- 192.168.123.107:0/2205327222 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17d00719a0 msgr2=0x7f17d0071da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:03.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.532+0000 7f17d4b1f640 1 --2- 192.168.123.107:0/2205327222 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17d00719a0 0x7f17d0071da0 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7f17b80099b0 tx=0x7f17b802f240 comp rx=0 tx=0).stop 2026-03-09T19:24:03.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.533+0000 7f17d4b1f640 1 -- 192.168.123.107:0/2205327222 shutdown_connections 2026-03-09T19:24:03.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.533+0000 7f17d4b1f640 1 --2- 192.168.123.107:0/2205327222 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f17d0072370 0x7f17d010c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:03.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.533+0000 7f17d4b1f640 1 --2- 192.168.123.107:0/2205327222 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17d00719a0 
0x7f17d0071da0 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:03.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.533+0000 7f17d4b1f640 1 -- 192.168.123.107:0/2205327222 >> 192.168.123.107:0/2205327222 conn(0x7f17d006d4f0 msgr2=0x7f17d006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:03.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.533+0000 7f17d4b1f640 1 -- 192.168.123.107:0/2205327222 shutdown_connections 2026-03-09T19:24:03.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.533+0000 7f17d4b1f640 1 -- 192.168.123.107:0/2205327222 wait complete. 2026-03-09T19:24:03.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.533+0000 7f17d4b1f640 1 Processor -- start 2026-03-09T19:24:03.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.533+0000 7f17d4b1f640 1 -- start start 2026-03-09T19:24:03.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.534+0000 7f17d4b1f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f17d00719a0 0x7f17d01a7180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:03.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.534+0000 7f17d4b1f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17d0072370 0x7f17d01a76c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:03.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.534+0000 7f17cf7fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f17d00719a0 0x7f17d01a7180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:03.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.534+0000 7f17c6dff640 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17d0072370 0x7f17d01a76c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:03.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.534+0000 7f17c6dff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17d0072370 0x7f17d01a76c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:46278/0 (socket says 192.168.123.107:46278) 2026-03-09T19:24:03.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.534+0000 7f17c6dff640 1 -- 192.168.123.107:0/3412908021 learned_addr learned my addr 192.168.123.107:0/3412908021 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:03.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.534+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17d01a7c90 con 0x7f17d0072370 2026-03-09T19:24:03.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.534+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17d01a7e00 con 0x7f17d00719a0 2026-03-09T19:24:03.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.535+0000 7f17c6dff640 1 -- 192.168.123.107:0/3412908021 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f17d00719a0 msgr2=0x7f17d01a7180 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:03.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.535+0000 7f17c6dff640 1 --2- 192.168.123.107:0/3412908021 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f17d00719a0 0x7f17d01a7180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:03.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.535+0000 7f17c6dff640 1 -- 192.168.123.107:0/3412908021 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f17b8009660 con 0x7f17d0072370 2026-03-09T19:24:03.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.535+0000 7f17c6dff640 1 --2- 192.168.123.107:0/3412908021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17d0072370 0x7f17d01a76c0 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7f17bc00d900 tx=0x7f17bc00ddd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:03.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.535+0000 7f17cd7fa640 1 -- 192.168.123.107:0/3412908021 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f17bc004490 con 0x7f17d0072370 2026-03-09T19:24:03.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.535+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f17d01ac8a0 con 0x7f17d0072370 2026-03-09T19:24:03.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.536+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f17d01acdf0 con 0x7f17d0072370 2026-03-09T19:24:03.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.536+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f17d01183e0 con 0x7f17d0072370 2026-03-09T19:24:03.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.537+0000 7f17cd7fa640 1 -- 192.168.123.107:0/3412908021 <== mon.0 
v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f17bc0076c0 con 0x7f17d0072370 2026-03-09T19:24:03.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.537+0000 7f17cd7fa640 1 -- 192.168.123.107:0/3412908021 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f17bc010460 con 0x7f17d0072370 2026-03-09T19:24:03.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.540+0000 7f17cd7fa640 1 -- 192.168.123.107:0/3412908021 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f17bc00b7c0 con 0x7f17d0072370 2026-03-09T19:24:03.539 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.541+0000 7f17cd7fa640 1 --2- 192.168.123.107:0/3412908021 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f17a0075fb0 0x7f17a0078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:03.539 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.541+0000 7f17cd7fa640 1 -- 192.168.123.107:0/3412908021 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(31..31 src has 1..31) v4 ==== 4301+0+0 (secure 0 0 0) 0x7f17bc097880 con 0x7f17d0072370 2026-03-09T19:24:03.539 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.541+0000 7f17cf7fe640 1 --2- 192.168.123.107:0/3412908021 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f17a0075fb0 0x7f17a0078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:03.539 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.541+0000 7f17cd7fa640 1 -- 192.168.123.107:0/3412908021 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f17bc0c58c0 con 0x7f17d0072370 2026-03-09T19:24:03.539 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.541+0000 7f17cf7fe640 1 --2- 192.168.123.107:0/3412908021 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f17a0075fb0 0x7f17a0078470 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f17b8002c30 tx=0x7f17b803a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:03.627 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.629+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f17d0118620 con 0x7f17d0072370 2026-03-09T19:24:03.627 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.630+0000 7f17cd7fa640 1 -- 192.168.123.107:0/3412908021 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v31) v1 ==== 74+0+130 (secure 0 0 0) 0x7f17bc0613c0 con 0x7f17d0072370 2026-03-09T19:24:03.628 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:03.630 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.632+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f17a0075fb0 msgr2=0x7f17a0078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:03.630 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.632+0000 7f17d4b1f640 1 --2- 192.168.123.107:0/3412908021 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f17a0075fb0 0x7f17a0078470 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f17b8002c30 tx=0x7f17b803a040 comp rx=0 tx=0).stop 2026-03-09T19:24:03.630 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.632+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17d0072370 
msgr2=0x7f17d01a76c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:03.630 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.632+0000 7f17d4b1f640 1 --2- 192.168.123.107:0/3412908021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17d0072370 0x7f17d01a76c0 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7f17bc00d900 tx=0x7f17bc00ddd0 comp rx=0 tx=0).stop 2026-03-09T19:24:03.630 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.632+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 shutdown_connections 2026-03-09T19:24:03.630 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.632+0000 7f17d4b1f640 1 --2- 192.168.123.107:0/3412908021 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f17a0075fb0 0x7f17a0078470 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:03.630 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.632+0000 7f17d4b1f640 1 --2- 192.168.123.107:0/3412908021 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17d0072370 0x7f17d01a76c0 unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:03.630 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.632+0000 7f17d4b1f640 1 --2- 192.168.123.107:0/3412908021 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f17d00719a0 0x7f17d01a7180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:03.631 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.633+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 >> 192.168.123.107:0/3412908021 conn(0x7f17d006d4f0 msgr2=0x7f17d010a860 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:03.631 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.633+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 shutdown_connections 2026-03-09T19:24:03.631 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:03.633+0000 7f17d4b1f640 1 -- 192.168.123.107:0/3412908021 wait complete. 2026-03-09T19:24:03.674 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":31,"num_osds":6,"num_up_osds":5,"osd_up_since":1773084237,"num_in_osds":6,"osd_in_since":1773084236,"num_remapped_pgs":0} 2026-03-09T19:24:04.270 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:04 vm08 ceph-mon[57794]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 538 MiB used, 99 GiB / 100 GiB avail 2026-03-09T19:24:04.270 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:04.270 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:04.270 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:04.270 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:04.270 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:04 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:24:04.270 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:04 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/3412908021' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T19:24:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:04 vm07 ceph-mon[48545]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 538 MiB used, 99 GiB / 100 GiB avail 2026-03-09T19:24:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:04 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:24:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:04 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/3412908021' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T19:24:04.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:24:04 vm08 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[76487]: 2026-03-09T19:24:04.269+0000 7f96ecbdd740 -1 osd.5 0 log_to_monitors true 2026-03-09T19:24:04.675 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd stat -f json 2026-03-09T19:24:04.815 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:05.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.037+0000 7fdc3ffff640 1 -- 192.168.123.107:0/2940495331 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdc401001c0 msgr2=0x7fdc400fe2e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:05.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.037+0000 7fdc3ffff640 1 --2- 192.168.123.107:0/2940495331 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdc401001c0 0x7fdc400fe2e0 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7fdc28009a00 tx=0x7fdc2802f290 comp rx=0 tx=0).stop 2026-03-09T19:24:05.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 -- 192.168.123.107:0/2940495331 shutdown_connections 2026-03-09T19:24:05.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 --2- 192.168.123.107:0/2940495331 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdc401001c0 0x7fdc400fe2e0 unknown :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:05.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 --2- 
192.168.123.107:0/2940495331 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdc400ff7f0 0x7fdc400ffbf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:05.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 -- 192.168.123.107:0/2940495331 >> 192.168.123.107:0/2940495331 conn(0x7fdc400f9f80 msgr2=0x7fdc400fc3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:05.036 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 -- 192.168.123.107:0/2940495331 shutdown_connections 2026-03-09T19:24:05.036 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 -- 192.168.123.107:0/2940495331 wait complete. 2026-03-09T19:24:05.036 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 Processor -- start 2026-03-09T19:24:05.036 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 -- start start 2026-03-09T19:24:05.036 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdc400ff7f0 0x7fdc4019a360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdc401001c0 0x7fdc4019a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdc4019ae70 con 0x7fdc401001c0 2026-03-09T19:24:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.038+0000 7fdc3ffff640 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdc4019afe0 con 0x7fdc400ff7f0 2026-03-09T19:24:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.039+0000 7fdc3e7fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdc401001c0 0x7fdc4019a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.039+0000 7fdc3effd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdc400ff7f0 0x7fdc4019a360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.039+0000 7fdc3e7fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdc401001c0 0x7fdc4019a8a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:46286/0 (socket says 192.168.123.107:46286) 2026-03-09T19:24:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.039+0000 7fdc3e7fc640 1 -- 192.168.123.107:0/1120879958 learned_addr learned my addr 192.168.123.107:0/1120879958 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.039+0000 7fdc3e7fc640 1 -- 192.168.123.107:0/1120879958 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdc400ff7f0 msgr2=0x7fdc4019a360 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.039+0000 7fdc3e7fc640 1 --2- 192.168.123.107:0/1120879958 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdc400ff7f0 0x7fdc4019a360 
unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.039+0000 7fdc3e7fc640 1 -- 192.168.123.107:0/1120879958 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdc28009660 con 0x7fdc401001c0 2026-03-09T19:24:05.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.039+0000 7fdc3e7fc640 1 --2- 192.168.123.107:0/1120879958 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdc401001c0 0x7fdc4019a8a0 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7fdc28002ab0 tx=0x7fdc28031b10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:05.038 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.040+0000 7fdc44984640 1 -- 192.168.123.107:0/1120879958 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdc28031c80 con 0x7fdc401001c0 2026-03-09T19:24:05.038 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.040+0000 7fdc44984640 1 -- 192.168.123.107:0/1120879958 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fdc28031de0 con 0x7fdc401001c0 2026-03-09T19:24:05.038 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.040+0000 7fdc44984640 1 -- 192.168.123.107:0/1120879958 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdc28031280 con 0x7fdc401001c0 2026-03-09T19:24:05.038 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.040+0000 7fdc3ffff640 1 -- 192.168.123.107:0/1120879958 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdc4019fa20 con 0x7fdc401001c0 2026-03-09T19:24:05.038 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.040+0000 7fdc3ffff640 1 -- 192.168.123.107:0/1120879958 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdc4019ff90 con 0x7fdc401001c0 2026-03-09T19:24:05.039 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.040+0000 7fdc3ffff640 1 -- 192.168.123.107:0/1120879958 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdc04005350 con 0x7fdc401001c0 2026-03-09T19:24:05.042 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.044+0000 7fdc44984640 1 -- 192.168.123.107:0/1120879958 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fdc2803f070 con 0x7fdc401001c0 2026-03-09T19:24:05.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.045+0000 7fdc44984640 1 --2- 192.168.123.107:0/1120879958 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fdc14076290 0x7fdc14078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:05.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.045+0000 7fdc3effd640 1 --2- 192.168.123.107:0/1120879958 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fdc14076290 0x7fdc14078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:05.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.045+0000 7fdc44984640 1 -- 192.168.123.107:0/1120879958 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(31..31 src has 1..31) v4 ==== 4301+0+0 (secure 0 0 0) 0x7fdc280bc600 con 0x7fdc401001c0 2026-03-09T19:24:05.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.045+0000 7fdc3effd640 1 --2- 192.168.123.107:0/1120879958 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fdc14076290 0x7fdc14078750 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto 
rx=0x7fdc3400a8d0 tx=0x7fdc34008040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:05.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.045+0000 7fdc44984640 1 -- 192.168.123.107:0/1120879958 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fdc280bca80 con 0x7fdc401001c0 2026-03-09T19:24:05.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.138+0000 7fdc3ffff640 1 -- 192.168.123.107:0/1120879958 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7fdc040051c0 con 0x7fdc401001c0 2026-03-09T19:24:05.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.139+0000 7fdc44984640 1 -- 192.168.123.107:0/1120879958 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v31) v1 ==== 74+0+130 (secure 0 0 0) 0x7fdc2808e020 con 0x7fdc401001c0 2026-03-09T19:24:05.137 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:05.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.141+0000 7fdc3ffff640 1 -- 192.168.123.107:0/1120879958 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fdc14076290 msgr2=0x7fdc14078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:05.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.141+0000 7fdc3ffff640 1 --2- 192.168.123.107:0/1120879958 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fdc14076290 0x7fdc14078750 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fdc3400a8d0 tx=0x7fdc34008040 comp rx=0 tx=0).stop 2026-03-09T19:24:05.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.141+0000 7fdc3ffff640 1 -- 192.168.123.107:0/1120879958 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdc401001c0 msgr2=0x7fdc4019a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:05.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.141+0000 7fdc3ffff640 1 --2- 192.168.123.107:0/1120879958 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdc401001c0 0x7fdc4019a8a0 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7fdc28002ab0 tx=0x7fdc28031b10 comp rx=0 tx=0).stop 2026-03-09T19:24:05.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.141+0000 7fdc3ffff640 1 -- 192.168.123.107:0/1120879958 shutdown_connections 2026-03-09T19:24:05.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.141+0000 7fdc3ffff640 1 --2- 192.168.123.107:0/1120879958 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fdc14076290 0x7fdc14078750 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:05.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.141+0000 7fdc3ffff640 1 --2- 192.168.123.107:0/1120879958 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdc401001c0 0x7fdc4019a8a0 secure :-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7fdc28002ab0 tx=0x7fdc28031b10 comp rx=0 tx=0).stop 2026-03-09T19:24:05.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.141+0000 7fdc3ffff640 1 --2- 192.168.123.107:0/1120879958 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdc400ff7f0 0x7fdc4019a360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:05.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.141+0000 7fdc3ffff640 1 -- 192.168.123.107:0/1120879958 >> 192.168.123.107:0/1120879958 conn(0x7fdc400f9f80 msgr2=0x7fdc400fbad0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:05.139 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.142+0000 7fdc3ffff640 1 -- 192.168.123.107:0/1120879958 shutdown_connections 2026-03-09T19:24:05.140 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:05.142+0000 7fdc3ffff640 1 -- 192.168.123.107:0/1120879958 wait complete. 2026-03-09T19:24:05.206 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":31,"num_osds":6,"num_up_osds":5,"osd_up_since":1773084237,"num_in_osds":6,"osd_in_since":1773084236,"num_remapped_pgs":0} 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 ceph-mon[48545]: Detected new or changed devices on vm08 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 
ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:05.208 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:05 vm07 ceph-mon[48545]: from='osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: Detected new or changed devices on vm08 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:05 vm08 ceph-mon[57794]: from='osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T19:24:06.207 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd stat -f json 2026-03-09T19:24:06.345 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:06 vm08 ceph-mon[57794]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 538 MiB used, 99 GiB / 100 GiB avail 2026-03-09T19:24:06.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:06 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/1120879958' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T19:24:06.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:06 vm08 ceph-mon[57794]: from='osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042]' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T19:24:06.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:06 vm08 ceph-mon[57794]: osdmap e32: 6 total, 5 up, 6 in 2026-03-09T19:24:06.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:06 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:06.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:06 vm08 ceph-mon[57794]: from='osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:24:06.365 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:06.390 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:06 vm07 ceph-mon[48545]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 538 MiB used, 99 GiB / 100 GiB avail 2026-03-09T19:24:06.390 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:06 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1120879958' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T19:24:06.390 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:06 vm07 ceph-mon[48545]: from='osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042]' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T19:24:06.390 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:06 vm07 ceph-mon[48545]: osdmap e32: 6 total, 5 up, 6 in 2026-03-09T19:24:06.390 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:06 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:06.390 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:06 vm07 ceph-mon[48545]: from='osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:24:06.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.611+0000 7f7b6df38640 1 -- 192.168.123.107:0/1900011547 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b68103c60 msgr2=0x7f7b681040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:06.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.611+0000 7f7b6df38640 1 --2- 192.168.123.107:0/1900011547 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b68103c60 0x7f7b681040e0 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7f7b54009a00 tx=0x7f7b5402f270 comp rx=0 tx=0).stop 2026-03-09T19:24:06.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.612+0000 7f7b6df38640 1 -- 192.168.123.107:0/1900011547 shutdown_connections 2026-03-09T19:24:06.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.612+0000 7f7b6df38640 1 --2- 
192.168.123.107:0/1900011547 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b68103c60 0x7f7b681040e0 unknown :-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:06.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.612+0000 7f7b6df38640 1 --2- 192.168.123.107:0/1900011547 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b68102a60 0x7f7b68102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:06.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.612+0000 7f7b6df38640 1 -- 192.168.123.107:0/1900011547 >> 192.168.123.107:0/1900011547 conn(0x7f7b680fe250 msgr2=0x7f7b68100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:06.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.612+0000 7f7b6df38640 1 -- 192.168.123.107:0/1900011547 shutdown_connections 2026-03-09T19:24:06.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.612+0000 7f7b6df38640 1 -- 192.168.123.107:0/1900011547 wait complete. 
2026-03-09T19:24:06.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.612+0000 7f7b6df38640 1 Processor -- start 2026-03-09T19:24:06.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.613+0000 7f7b6df38640 1 -- start start 2026-03-09T19:24:06.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.613+0000 7f7b6df38640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b68102a60 0x7f7b6819a440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:06.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.613+0000 7f7b6df38640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b68103c60 0x7f7b6819a980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:06.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.613+0000 7f7b6df38640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7b6819af50 con 0x7f7b68102a60 2026-03-09T19:24:06.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.613+0000 7f7b6df38640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7b6819b0c0 con 0x7f7b68103c60 2026-03-09T19:24:06.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.613+0000 7f7b677fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b68102a60 0x7f7b6819a440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:06.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.613+0000 7f7b677fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b68102a60 0x7f7b6819a440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:46308/0 (socket says 192.168.123.107:46308) 2026-03-09T19:24:06.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.613+0000 7f7b677fe640 1 -- 192.168.123.107:0/3509005194 learned_addr learned my addr 192.168.123.107:0/3509005194 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:06.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.613+0000 7f7b66ffd640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b68103c60 0x7f7b6819a980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:06.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.613+0000 7f7b677fe640 1 -- 192.168.123.107:0/3509005194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b68103c60 msgr2=0x7f7b6819a980 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:06.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.614+0000 7f7b677fe640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b68103c60 0x7f7b6819a980 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:06.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.614+0000 7f7b677fe640 1 -- 192.168.123.107:0/3509005194 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7b54009660 con 0x7f7b68102a60 2026-03-09T19:24:06.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.614+0000 7f7b66ffd640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b68103c60 0x7f7b6819a980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:24:06.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.614+0000 7f7b677fe640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b68102a60 0x7f7b6819a440 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7f7b5800e990 tx=0x7f7b5800ee60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:06.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.614+0000 7f7b64ff9640 1 -- 192.168.123.107:0/3509005194 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b58009800 con 0x7f7b68102a60 2026-03-09T19:24:06.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.614+0000 7f7b64ff9640 1 -- 192.168.123.107:0/3509005194 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f7b58004590 con 0x7f7b68102a60 2026-03-09T19:24:06.613 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.615+0000 7f7b64ff9640 1 -- 192.168.123.107:0/3509005194 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b58010640 con 0x7f7b68102a60 2026-03-09T19:24:06.613 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.615+0000 7f7b6df38640 1 -- 192.168.123.107:0/3509005194 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7b6819fb60 con 0x7f7b68102a60 2026-03-09T19:24:06.613 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.615+0000 7f7b6df38640 1 -- 192.168.123.107:0/3509005194 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7b681a0080 con 0x7f7b68102a60 2026-03-09T19:24:06.614 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.616+0000 7f7b64ff9640 1 -- 192.168.123.107:0/3509005194 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f7b5800cd30 con 
0x7f7b68102a60 2026-03-09T19:24:06.617 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.617+0000 7f7b4a7fc640 1 -- 192.168.123.107:0/3509005194 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7b68102e60 con 0x7f7b68102a60 2026-03-09T19:24:06.618 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.620+0000 7f7b64ff9640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7b38075fb0 0x7f7b38078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:06.618 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.620+0000 7f7b64ff9640 1 -- 192.168.123.107:0/3509005194 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4338+0+0 (secure 0 0 0) 0x7f7b58014070 con 0x7f7b68102a60 2026-03-09T19:24:06.618 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.620+0000 7f7b64ff9640 1 -- 192.168.123.107:0/3509005194 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f7b5809c050 con 0x7f7b68102a60 2026-03-09T19:24:06.618 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.620+0000 7f7b66ffd640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7b38075fb0 0x7f7b38078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:06.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.621+0000 7f7b66ffd640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7b38075fb0 0x7f7b38078470 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f7b54002c80 tx=0x7f7b540023d0 comp rx=0 
tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:06.712 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.712+0000 7f7b4a7fc640 1 -- 192.168.123.107:0/3509005194 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f7b6810b690 con 0x7f7b68102a60 2026-03-09T19:24:06.712 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.714+0000 7f7b64ff9640 1 -- 192.168.123.107:0/3509005194 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v33) v1 ==== 74+0+130 (secure 0 0 0) 0x7f7b580605a0 con 0x7f7b68102a60 2026-03-09T19:24:06.712 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:06.714 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.716+0000 7f7b4a7fc640 1 -- 192.168.123.107:0/3509005194 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7b38075fb0 msgr2=0x7f7b38078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:06.714 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.716+0000 7f7b4a7fc640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7b38075fb0 0x7f7b38078470 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f7b54002c80 tx=0x7f7b540023d0 comp rx=0 tx=0).stop 2026-03-09T19:24:06.714 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.716+0000 7f7b4a7fc640 1 -- 192.168.123.107:0/3509005194 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b68102a60 msgr2=0x7f7b6819a440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:06.714 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.716+0000 7f7b4a7fc640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b68102a60 0x7f7b6819a440 secure :-1 s=READY pgs=205 
cs=0 l=1 rev1=1 crypto rx=0x7f7b5800e990 tx=0x7f7b5800ee60 comp rx=0 tx=0).stop 2026-03-09T19:24:06.715 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.717+0000 7f7b4a7fc640 1 -- 192.168.123.107:0/3509005194 shutdown_connections 2026-03-09T19:24:06.715 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.717+0000 7f7b4a7fc640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7b38075fb0 0x7f7b38078470 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:06.715 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.717+0000 7f7b4a7fc640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b68103c60 0x7f7b6819a980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:06.715 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.717+0000 7f7b4a7fc640 1 --2- 192.168.123.107:0/3509005194 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b68102a60 0x7f7b6819a440 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:06.715 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.717+0000 7f7b4a7fc640 1 -- 192.168.123.107:0/3509005194 >> 192.168.123.107:0/3509005194 conn(0x7f7b680fe250 msgr2=0x7f7b680ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:06.715 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.717+0000 7f7b4a7fc640 1 -- 192.168.123.107:0/3509005194 shutdown_connections 2026-03-09T19:24:06.715 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:06.717+0000 7f7b4a7fc640 1 -- 192.168.123.107:0/3509005194 wait complete. 
2026-03-09T19:24:06.785 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":33,"num_osds":6,"num_up_osds":5,"osd_up_since":1773084237,"num_in_osds":6,"osd_in_since":1773084236,"num_remapped_pgs":0} 2026-03-09T19:24:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:07 vm07 ceph-mon[48545]: from='osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042]' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-09T19:24:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:07 vm07 ceph-mon[48545]: osdmap e33: 6 total, 5 up, 6 in 2026-03-09T19:24:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:07 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:07 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:07 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/3509005194' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T19:24:07.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:24:07 vm08 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[76487]: 2026-03-09T19:24:07.227+0000 7f96e8339640 -1 osd.5 0 waiting for initial osdmap 2026-03-09T19:24:07.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:24:07 vm08 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[76487]: 2026-03-09T19:24:07.237+0000 7f96e4174640 -1 osd.5 33 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T19:24:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:07 vm08 ceph-mon[57794]: from='osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042]' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-09T19:24:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:07 vm08 ceph-mon[57794]: osdmap e33: 6 total, 5 up, 6 in 2026-03-09T19:24:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:07 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:07 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:07.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:07 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/3509005194' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T19:24:07.786 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd stat -f json 2026-03-09T19:24:07.939 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:08.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.171+0000 7f65a8dec640 1 -- 192.168.123.107:0/4214938849 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a40ff7f0 msgr2=0x7f65a40ffbf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:08.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.171+0000 7f65a8dec640 1 --2- 192.168.123.107:0/4214938849 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a40ff7f0 0x7f65a40ffbf0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f65940099b0 tx=0x7f659402f240 comp rx=0 tx=0).stop 2026-03-09T19:24:08.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.172+0000 7f65a8dec640 1 -- 192.168.123.107:0/4214938849 shutdown_connections 2026-03-09T19:24:08.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.172+0000 7f65a8dec640 1 --2- 192.168.123.107:0/4214938849 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a41001c0 0x7f65a40fe2a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.172+0000 7f65a8dec640 1 --2- 192.168.123.107:0/4214938849 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a40ff7f0 0x7f65a40ffbf0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.170 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.172+0000 7f65a8dec640 1 -- 192.168.123.107:0/4214938849 >> 192.168.123.107:0/4214938849 conn(0x7f65a40f9f80 msgr2=0x7f65a40fc3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:08.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.172+0000 7f65a8dec640 1 -- 192.168.123.107:0/4214938849 shutdown_connections 2026-03-09T19:24:08.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.172+0000 7f65a8dec640 1 -- 192.168.123.107:0/4214938849 wait complete. 2026-03-09T19:24:08.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.172+0000 7f65a8dec640 1 Processor -- start 2026-03-09T19:24:08.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.172+0000 7f65a8dec640 1 -- start start 2026-03-09T19:24:08.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a8dec640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a40ff7f0 0x7f65a419a260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:08.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a8dec640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a41001c0 0x7f65a419a7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:08.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a8dec640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f65a419ad70 con 0x7f65a41001c0 2026-03-09T19:24:08.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a8dec640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f65a419aee0 con 0x7f65a40ff7f0 2026-03-09T19:24:08.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a2ffd640 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a41001c0 0x7f65a419a7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:08.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a2ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a41001c0 0x7f65a419a7a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:46316/0 (socket says 192.168.123.107:46316) 2026-03-09T19:24:08.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a2ffd640 1 -- 192.168.123.107:0/2069623713 learned_addr learned my addr 192.168.123.107:0/2069623713 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:08.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a2ffd640 1 -- 192.168.123.107:0/2069623713 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a40ff7f0 msgr2=0x7f65a419a260 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:08.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a2ffd640 1 --2- 192.168.123.107:0/2069623713 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a40ff7f0 0x7f65a419a260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a2ffd640 1 -- 192.168.123.107:0/2069623713 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6590009590 con 0x7f65a41001c0 2026-03-09T19:24:08.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a2ffd640 1 --2- 192.168.123.107:0/2069623713 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f65a41001c0 0x7f65a419a7a0 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7f6590002a00 tx=0x7f6590002ed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:08.172 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a0ff9640 1 -- 192.168.123.107:0/2069623713 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f659000ec70 con 0x7f65a41001c0 2026-03-09T19:24:08.172 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.173+0000 7f65a0ff9640 1 -- 192.168.123.107:0/2069623713 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f659000edd0 con 0x7f65a41001c0 2026-03-09T19:24:08.172 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.174+0000 7f65a0ff9640 1 -- 192.168.123.107:0/2069623713 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f659000f760 con 0x7f65a41001c0 2026-03-09T19:24:08.172 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.174+0000 7f65a8dec640 1 -- 192.168.123.107:0/2069623713 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6594009660 con 0x7f65a41001c0 2026-03-09T19:24:08.172 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.174+0000 7f65a8dec640 1 -- 192.168.123.107:0/2069623713 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f65a419fcc0 con 0x7f65a41001c0 2026-03-09T19:24:08.173 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.175+0000 7f65a0ff9640 1 -- 192.168.123.107:0/2069623713 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6590016070 con 0x7f65a41001c0 2026-03-09T19:24:08.173 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.176+0000 7f65a8dec640 1 -- 192.168.123.107:0/2069623713 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6568005350 con 0x7f65a41001c0 2026-03-09T19:24:08.174 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.176+0000 7f65a0ff9640 1 --2- 192.168.123.107:0/2069623713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6578075f60 0x7f6578078420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:08.174 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.176+0000 7f65a0ff9640 1 -- 192.168.123.107:0/2069623713 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4338+0+0 (secure 0 0 0) 0x7f659009a1a0 con 0x7f65a41001c0 2026-03-09T19:24:08.174 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.176+0000 7f65a37fe640 1 --2- 192.168.123.107:0/2069623713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6578075f60 0x7f6578078420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:08.177 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.179+0000 7f65a37fe640 1 --2- 192.168.123.107:0/2069623713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6578075f60 0x7f6578078420 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f659402f750 tx=0x7f65940047c0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:08.177 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.179+0000 7f65a0ff9640 1 -- 192.168.123.107:0/2069623713 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f659009c050 con 0x7f65a41001c0 2026-03-09T19:24:08.268 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.270+0000 7f65a8dec640 1 -- 192.168.123.107:0/2069623713 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f65680058d0 con 0x7f65a41001c0 2026-03-09T19:24:08.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.271+0000 7f65a0ff9640 1 -- 192.168.123.107:0/2069623713 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v34) v1 ==== 74+0+130 (secure 0 0 0) 0x7f6590060720 con 0x7f65a41001c0 2026-03-09T19:24:08.269 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:08.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.273+0000 7f65a8dec640 1 -- 192.168.123.107:0/2069623713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6578075f60 msgr2=0x7f6578078420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:08.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.273+0000 7f65a8dec640 1 --2- 192.168.123.107:0/2069623713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6578075f60 0x7f6578078420 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f659402f750 tx=0x7f65940047c0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.273+0000 7f65a8dec640 1 -- 192.168.123.107:0/2069623713 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a41001c0 msgr2=0x7f65a419a7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:08.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.273+0000 7f65a8dec640 1 --2- 192.168.123.107:0/2069623713 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a41001c0 0x7f65a419a7a0 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7f6590002a00 tx=0x7f6590002ed0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.271 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.273+0000 7f65a8dec640 1 -- 192.168.123.107:0/2069623713 shutdown_connections 2026-03-09T19:24:08.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.273+0000 7f65a8dec640 1 --2- 192.168.123.107:0/2069623713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6578075f60 0x7f6578078420 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.273+0000 7f65a8dec640 1 --2- 192.168.123.107:0/2069623713 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a41001c0 0x7f65a419a7a0 unknown :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.273+0000 7f65a8dec640 1 --2- 192.168.123.107:0/2069623713 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a40ff7f0 0x7f65a419a260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.273+0000 7f65a8dec640 1 -- 192.168.123.107:0/2069623713 >> 192.168.123.107:0/2069623713 conn(0x7f65a40f9f80 msgr2=0x7f65a40fba90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:08.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.273+0000 7f65a8dec640 1 -- 192.168.123.107:0/2069623713 shutdown_connections 2026-03-09T19:24:08.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.273+0000 7f65a8dec640 1 -- 192.168.123.107:0/2069623713 wait complete. 
2026-03-09T19:24:08.329 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:08 vm07 ceph-mon[48545]: purged_snaps scrub starts 2026-03-09T19:24:08.329 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:08 vm07 ceph-mon[48545]: purged_snaps scrub ok 2026-03-09T19:24:08.329 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:08 vm07 ceph-mon[48545]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 538 MiB used, 99 GiB / 100 GiB avail 2026-03-09T19:24:08.329 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:08 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:08.329 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:08 vm07 ceph-mon[48545]: from='osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042]' entity='osd.5' 2026-03-09T19:24:08.329 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:08 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:08.329 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":34,"num_osds":6,"num_up_osds":6,"osd_up_since":1773084248,"num_in_osds":6,"osd_in_since":1773084236,"num_remapped_pgs":0} 2026-03-09T19:24:08.329 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd dump --format=json 2026-03-09T19:24:08.463 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:08 vm08 ceph-mon[57794]: purged_snaps scrub starts 2026-03-09T19:24:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:08 vm08 ceph-mon[57794]: purged_snaps scrub ok 2026-03-09T19:24:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:24:08 vm08 ceph-mon[57794]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 538 MiB used, 99 GiB / 100 GiB avail 2026-03-09T19:24:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:08 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:08 vm08 ceph-mon[57794]: from='osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042]' entity='osd.5' 2026-03-09T19:24:08.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:08 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:08.785 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.786+0000 7f3dd8f21640 1 -- 192.168.123.107:0/4163874460 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dd41089d0 msgr2=0x7f3dd4108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:08.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.786+0000 7f3dd8f21640 1 --2- 192.168.123.107:0/4163874460 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dd41089d0 0x7f3dd4108db0 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7f3dbc0099b0 tx=0x7f3dbc02f220 comp rx=0 tx=0).stop 2026-03-09T19:24:08.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.787+0000 7f3dd8f21640 1 -- 192.168.123.107:0/4163874460 shutdown_connections 2026-03-09T19:24:08.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.787+0000 7f3dd8f21640 1 --2- 192.168.123.107:0/4163874460 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd41029d0 0x7f3dd4102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.787+0000 7f3dd8f21640 1 
--2- 192.168.123.107:0/4163874460 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dd41089d0 0x7f3dd4108db0 unknown :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.787+0000 7f3dd8f21640 1 -- 192.168.123.107:0/4163874460 >> 192.168.123.107:0/4163874460 conn(0x7f3dd40fe710 msgr2=0x7f3dd4100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:08.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.788+0000 7f3dd8f21640 1 -- 192.168.123.107:0/4163874460 shutdown_connections 2026-03-09T19:24:08.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.788+0000 7f3dd8f21640 1 -- 192.168.123.107:0/4163874460 wait complete. 2026-03-09T19:24:08.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.788+0000 7f3dd8f21640 1 Processor -- start 2026-03-09T19:24:08.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.788+0000 7f3dd8f21640 1 -- start start 2026-03-09T19:24:08.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.788+0000 7f3dd8f21640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd41029d0 0x7f3dd4075700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.788+0000 7f3dd8f21640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dd41089d0 0x7f3dd4075c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.789+0000 7f3dd8f21640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dd40796f0 con 0x7f3dd41089d0 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.789+0000 7f3dd8f21640 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dd4079860 con 0x7f3dd41029d0 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.789+0000 7f3dd2575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd41029d0 0x7f3dd4075700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.789+0000 7f3dd2575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd41029d0 0x7f3dd4075700 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:40330/0 (socket says 192.168.123.107:40330) 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.789+0000 7f3dd2575640 1 -- 192.168.123.107:0/1848352344 learned_addr learned my addr 192.168.123.107:0/1848352344 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.789+0000 7f3dd1d74640 1 --2- 192.168.123.107:0/1848352344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dd41089d0 0x7f3dd4075c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.789+0000 7f3dd2575640 1 -- 192.168.123.107:0/1848352344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dd41089d0 msgr2=0x7f3dd4075c40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.789+0000 7f3dd2575640 1 --2- 192.168.123.107:0/1848352344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f3dd41089d0 0x7f3dd4075c40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.789+0000 7f3dd2575640 1 -- 192.168.123.107:0/1848352344 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3dbc009660 con 0x7f3dd41029d0 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.789+0000 7f3dd1d74640 1 --2- 192.168.123.107:0/1848352344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dd41089d0 0x7f3dd4075c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:24:08.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.790+0000 7f3dd2575640 1 --2- 192.168.123.107:0/1848352344 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd41029d0 0x7f3dd4075700 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f3dbc002410 tx=0x7f3dbc031cd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:08.788 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.790+0000 7f3dbb7fe640 1 -- 192.168.123.107:0/1848352344 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dbc03d070 con 0x7f3dd41029d0 2026-03-09T19:24:08.788 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.790+0000 7f3dd8f21640 1 -- 192.168.123.107:0/1848352344 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3dd4076180 con 0x7f3dd41029d0 2026-03-09T19:24:08.788 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.790+0000 7f3dd8f21640 1 -- 192.168.123.107:0/1848352344 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3dd41a90d0 con 0x7f3dd41029d0 2026-03-09T19:24:08.788 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.790+0000 7f3dbb7fe640 1 -- 192.168.123.107:0/1848352344 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3dbc0043d0 con 0x7f3dd41029d0 2026-03-09T19:24:08.789 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.791+0000 7f3dbb7fe640 1 -- 192.168.123.107:0/1848352344 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dbc038470 con 0x7f3dd41029d0 2026-03-09T19:24:08.789 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.791+0000 7f3dd8f21640 1 -- 192.168.123.107:0/1848352344 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3dd4104110 con 0x7f3dd41029d0 2026-03-09T19:24:08.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.791+0000 7f3dbb7fe640 1 -- 192.168.123.107:0/1848352344 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f3dbc038650 con 0x7f3dd41029d0 2026-03-09T19:24:08.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.792+0000 7f3dbb7fe640 1 --2- 192.168.123.107:0/1848352344 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3db0075f60 0x7f3db0078420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:08.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.792+0000 7f3dbb7fe640 1 -- 192.168.123.107:0/1848352344 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f3dbc0bbf30 con 0x7f3dd41029d0 2026-03-09T19:24:08.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.792+0000 7f3dd1d74640 1 --2- 192.168.123.107:0/1848352344 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3db0075f60 0x7f3db0078420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:08.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.792+0000 7f3dd1d74640 1 --2- 192.168.123.107:0/1848352344 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3db0075f60 0x7f3db0078420 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f3dd4076e80 tx=0x7f3dc800a400 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:08.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.794+0000 7f3dbb7fe640 1 -- 192.168.123.107:0/1848352344 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f3dbc0858b0 con 0x7f3dd41029d0 2026-03-09T19:24:08.881 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.883+0000 7f3dd8f21640 1 -- 192.168.123.107:0/1848352344 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f3dd4076c30 con 0x7f3dd41029d0 2026-03-09T19:24:08.882 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.883+0000 7f3dbb7fe640 1 -- 192.168.123.107:0/1848352344 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v34) v1 ==== 74+0+11503 (secure 0 0 0) 0x7f3dbc085250 con 0x7f3dd41029d0 2026-03-09T19:24:08.882 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:08.882 
INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":34,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","created":"2026-03-09T19:21:45.299152+0000","modified":"2026-03-09T19:24:08.231617+0000","last_up_change":"2026-03-09T19:24:08.231617+0000","last_in_change":"2026-03-09T19:23:56.638316+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T19:23:42.363375+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"non
e"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"133448fe-3146-488b-ab63-557fcf7f955d","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6802","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6803","nonce":2192525186}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6804","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6805","nonce":2192525186}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6808","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6809","nonce":2192525186}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6806","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6807","nonce":2192525186}]},"public_addr":"192.168.123.107:6803/2192525186","cluster_addr":"192.168.123.107:6805/2192525186","heartbeat_back_addr":"192.168.123.107:6809/2192525186","heartbeat_front_addr":"192.168.123.107:6807/2192525186","state":["exists","up"]},{"osd":1,"uuid":"0f0316a2-1b3a-4bd0-b463-b3d326b0fb51","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":26,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6810","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6811","nonce":407429515}]},"cluster_addrs":{"addrvec":[{"type"
:"v2","addr":"192.168.123.107:6812","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6813","nonce":407429515}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6816","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6817","nonce":407429515}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6814","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6815","nonce":407429515}]},"public_addr":"192.168.123.107:6811/407429515","cluster_addr":"192.168.123.107:6813/407429515","heartbeat_back_addr":"192.168.123.107:6817/407429515","heartbeat_front_addr":"192.168.123.107:6815/407429515","state":["exists","up"]},{"osd":2,"uuid":"8ea4ceda-8f60-4699-976a-464d32f7e944","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6818","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6819","nonce":261501426}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6820","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6821","nonce":261501426}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6825","nonce":261501426}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6822","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6823","nonce":261501426}]},"public_addr":"192.168.123.107:6819/261501426","cluster_addr":"192.168.123.107:6821/261501426","heartbeat_back_addr":"192.168.123.107:6825/261501426","heartbeat_front_addr":"192.168.123.107:6823/261501426","state":["exists","up"]},{"osd":3,"uuid":"175a772b-2920-452f-9d34-5c2a70bb1cb1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":25,"up_thru":29,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6800","nonce":1988644
338},{"type":"v1","addr":"192.168.123.108:6801","nonce":1988644338}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6802","nonce":1988644338},{"type":"v1","addr":"192.168.123.108:6803","nonce":1988644338}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6806","nonce":1988644338},{"type":"v1","addr":"192.168.123.108:6807","nonce":1988644338}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6804","nonce":1988644338},{"type":"v1","addr":"192.168.123.108:6805","nonce":1988644338}]},"public_addr":"192.168.123.108:6801/1988644338","cluster_addr":"192.168.123.108:6803/1988644338","heartbeat_back_addr":"192.168.123.108:6807/1988644338","heartbeat_front_addr":"192.168.123.108:6805/1988644338","state":["exists","up"]},{"osd":4,"uuid":"a265b553-1e86-4bab-beff-db9f81381120","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":30,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6808","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6809","nonce":68783370}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6810","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6811","nonce":68783370}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6814","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6815","nonce":68783370}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6812","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6813","nonce":68783370}]},"public_addr":"192.168.123.108:6809/68783370","cluster_addr":"192.168.123.108:6811/68783370","heartbeat_back_addr":"192.168.123.108:6815/68783370","heartbeat_front_addr":"192.168.123.108:6813/68783370","state":["exists","up"]},{"osd":5,"uuid":"44d4390e-f9a3-490b-9d44-f60b53e3d568","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":34,"up_thru":0,"down_a
t":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6816","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6817","nonce":99511042}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6818","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6819","nonce":99511042}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6822","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6823","nonce":99511042}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6820","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6821","nonce":99511042}]},"public_addr":"192.168.123.108:6817/99511042","cluster_addr":"192.168.123.108:6819/99511042","heartbeat_back_addr":"192.168.123.108:6823/99511042","heartbeat_front_addr":"192.168.123.108:6821/99511042","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:18.794973+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:29.548141+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:39.080417+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:47.991412+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:56.496482+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_s
crub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.107:0/3701716623":"2026-03-10T19:21:57.086091+0000","192.168.123.107:0/2365276262":"2026-03-10T19:22:09.108986+0000","192.168.123.107:6801/1021580706":"2026-03-10T19:22:48.312502+0000","192.168.123.107:6800/1456056000":"2026-03-10T19:21:57.086091+0000","192.168.123.107:0/961195961":"2026-03-10T19:21:57.086091+0000","192.168.123.107:6801/1456056000":"2026-03-10T19:21:57.086091+0000","192.168.123.107:0/4096740693":"2026-03-10T19:22:48.312502+0000","192.168.123.107:0/1578432790":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/702157851":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/1848167886":"2026-03-10T19:22:48.312502+0000","192.168.123.107:6801/1318262611":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/2125348601":"2026-03-10T19:21:57.086091+0000","192.168.123.107:6800/1318262611":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/2275130412":"2026-03-10T19:22:48.312502+0000","192.168.123.107:6800/1021580706":"2026-03-10T19:22:48.312502+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T19:24:08.884 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.886+0000 7f3dd8f21640 1 -- 192.168.123.107:0/1848352344 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3db0075f60 msgr2=0x7f3db0078420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:08.884 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.886+0000 7f3dd8f21640 1 --2- 
192.168.123.107:0/1848352344 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3db0075f60 0x7f3db0078420 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f3dd4076e80 tx=0x7f3dc800a400 comp rx=0 tx=0).stop 2026-03-09T19:24:08.884 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.886+0000 7f3dd8f21640 1 -- 192.168.123.107:0/1848352344 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd41029d0 msgr2=0x7f3dd4075700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:08.884 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.886+0000 7f3dd8f21640 1 --2- 192.168.123.107:0/1848352344 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd41029d0 0x7f3dd4075700 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f3dbc002410 tx=0x7f3dbc031cd0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.884 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.887+0000 7f3dd8f21640 1 -- 192.168.123.107:0/1848352344 shutdown_connections 2026-03-09T19:24:08.884 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.887+0000 7f3dd8f21640 1 --2- 192.168.123.107:0/1848352344 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3db0075f60 0x7f3db0078420 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.884 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.887+0000 7f3dd8f21640 1 --2- 192.168.123.107:0/1848352344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dd41089d0 0x7f3dd4075c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:08.885 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.887+0000 7f3dd8f21640 1 --2- 192.168.123.107:0/1848352344 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd41029d0 0x7f3dd4075700 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-09T19:24:08.885 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.887+0000 7f3dd8f21640 1 -- 192.168.123.107:0/1848352344 >> 192.168.123.107:0/1848352344 conn(0x7f3dd40fe710 msgr2=0x7f3dd4106550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:08.885 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.887+0000 7f3dd8f21640 1 -- 192.168.123.107:0/1848352344 shutdown_connections 2026-03-09T19:24:08.885 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:08.887+0000 7f3dd8f21640 1 -- 192.168.123.107:0/1848352344 wait complete. 2026-03-09T19:24:08.943 INFO:tasks.cephadm.ceph_manager.ceph:[{'pool': 1, 'pool_name': '.mgr', 'create_time': '2026-03-09T19:23:42.363375+0000', 'flags': 1, 'flags_names': 'hashpspool', 'type': 1, 'size': 3, 'min_size': 2, 'crush_rule': 0, 'peering_crush_bucket_count': 0, 'peering_crush_bucket_target': 0, 'peering_crush_bucket_barrier': 0, 'peering_crush_bucket_mandatory_member': 2147483647, 'is_stretch_pool': False, 'object_hash': 2, 'pg_autoscale_mode': 'off', 'pg_num': 1, 'pg_placement_num': 1, 'pg_placement_num_target': 1, 'pg_num_target': 1, 'pg_num_pending': 1, 'last_pg_merge_meta': {'source_pgid': '0.0', 'ready_epoch': 0, 'last_epoch_started': 0, 'last_epoch_clean': 0, 'source_version': "0'0", 'target_version': "0'0"}, 'last_change': '21', 'last_force_op_resend': '0', 'last_force_op_resend_prenautilus': '0', 'last_force_op_resend_preluminous': '0', 'auid': 0, 'snap_mode': 'selfmanaged', 'snap_seq': 0, 'snap_epoch': 0, 'pool_snaps': [], 'removed_snaps': '[]', 'quota_max_bytes': 0, 'quota_max_objects': 0, 'tiers': [], 'tier_of': -1, 'read_tier': -1, 'write_tier': -1, 'cache_mode': 'none', 'target_max_bytes': 0, 'target_max_objects': 0, 'cache_target_dirty_ratio_micro': 400000, 'cache_target_dirty_high_ratio_micro': 600000, 'cache_target_full_ratio_micro': 800000, 'cache_min_flush_age': 0, 'cache_min_evict_age': 0, 'erasure_code_profile': '', 'hit_set_params': {'type': 
'none'}, 'hit_set_period': 0, 'hit_set_count': 0, 'use_gmt_hitset': True, 'min_read_recency_for_promote': 0, 'min_write_recency_for_promote': 0, 'hit_set_grade_decay_rate': 0, 'hit_set_search_last_n': 0, 'grade_table': [], 'stripe_width': 0, 'expected_num_objects': 0, 'fast_read': False, 'options': {'pg_num_max': 32, 'pg_num_min': 1}, 'application_metadata': {'mgr': {}}, 'read_balance': {'score_acting': 6, 'score_stable': 6, 'optimal_score': 0.5, 'raw_score_acting': 3, 'raw_score_stable': 3, 'primary_affinity_weighted': 1, 'average_primary_affinity': 1, 'average_primary_affinity_weighted': 1}}] 2026-03-09T19:24:08.943 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd pool get .mgr pg_num 2026-03-09T19:24:09.098 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.341+0000 7f53fd399640 1 -- 192.168.123.107:0/3459720979 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f53f8100010 msgr2=0x7f53f81044e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.341+0000 7f53fd399640 1 --2- 192.168.123.107:0/3459720979 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f53f8100010 0x7f53f81044e0 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7f53ec0099b0 tx=0x7f53ec02f2b0 comp rx=0 tx=0).stop 2026-03-09T19:24:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.341+0000 7f53fd399640 1 -- 192.168.123.107:0/3459720979 shutdown_connections 2026-03-09T19:24:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.341+0000 7f53fd399640 1 --2- 192.168.123.107:0/3459720979 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f53f8100010 
0x7f53f81044e0 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.341+0000 7f53fd399640 1 --2- 192.168.123.107:0/3459720979 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53f80ff6f0 0x7f53f80ffad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.341+0000 7f53fd399640 1 -- 192.168.123.107:0/3459720979 >> 192.168.123.107:0/3459720979 conn(0x7f53f80fb340 msgr2=0x7f53f80fd760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.342+0000 7f53fd399640 1 -- 192.168.123.107:0/3459720979 shutdown_connections 2026-03-09T19:24:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.342+0000 7f53fd399640 1 -- 192.168.123.107:0/3459720979 wait complete. 2026-03-09T19:24:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.342+0000 7f53fd399640 1 Processor -- start 2026-03-09T19:24:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.342+0000 7f53fd399640 1 -- start start 2026-03-09T19:24:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53fd399640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53f80ff6f0 0x7f53f819edc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:09.341 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53fd399640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f53f8100010 0x7f53f819f300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:09.341 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53fd399640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7f53f819f990 con 0x7f53f8100010 2026-03-09T19:24:09.341 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53fd399640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f53f81a3700 con 0x7f53f80ff6f0 2026-03-09T19:24:09.341 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53f67fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f53f8100010 0x7f53f819f300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:09.341 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53f67fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f53f8100010 0x7f53f819f300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43068/0 (socket says 192.168.123.107:43068) 2026-03-09T19:24:09.341 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53f67fc640 1 -- 192.168.123.107:0/1616998345 learned_addr learned my addr 192.168.123.107:0/1616998345 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:09.341 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53f67fc640 1 -- 192.168.123.107:0/1616998345 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53f80ff6f0 msgr2=0x7f53f819edc0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T19:24:09.341 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53f67fc640 1 --2- 192.168.123.107:0/1616998345 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53f80ff6f0 0x7f53f819edc0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:09.341 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 
7f53f67fc640 1 -- 192.168.123.107:0/1616998345 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f53ec009660 con 0x7f53f8100010 2026-03-09T19:24:09.341 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53f67fc640 1 --2- 192.168.123.107:0/1616998345 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f53f8100010 0x7f53f819f300 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f53ec02f7c0 tx=0x7f53ec031ea0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:09.342 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53d7fff640 1 -- 192.168.123.107:0/1616998345 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f53ec03d070 con 0x7f53f8100010 2026-03-09T19:24:09.342 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53d7fff640 1 -- 192.168.123.107:0/1616998345 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f53ec02fd20 con 0x7f53f8100010 2026-03-09T19:24:09.342 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.343+0000 7f53d7fff640 1 -- 192.168.123.107:0/1616998345 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f53ec0319c0 con 0x7f53f8100010 2026-03-09T19:24:09.342 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.344+0000 7f53fd399640 1 -- 192.168.123.107:0/1616998345 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f53f81a3980 con 0x7f53f8100010 2026-03-09T19:24:09.342 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.344+0000 7f53fd399640 1 -- 192.168.123.107:0/1616998345 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f53f81a3e20 con 0x7f53f8100010 2026-03-09T19:24:09.344 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.345+0000 7f53d7fff640 1 -- 192.168.123.107:0/1616998345 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f53ec004050 con 0x7f53f8100010 2026-03-09T19:24:09.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.346+0000 7f53d7fff640 1 --2- 192.168.123.107:0/1616998345 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f53cc076290 0x7f53cc078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:09.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.346+0000 7f53f6ffd640 1 --2- 192.168.123.107:0/1616998345 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f53cc076290 0x7f53cc078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:09.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.346+0000 7f53fd399640 1 -- 192.168.123.107:0/1616998345 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f53bc005350 con 0x7f53f8100010 2026-03-09T19:24:09.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.346+0000 7f53d7fff640 1 -- 192.168.123.107:0/1616998345 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f53ec047030 con 0x7f53f8100010 2026-03-09T19:24:09.347 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.347+0000 7f53f6ffd640 1 --2- 192.168.123.107:0/1616998345 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f53cc076290 0x7f53cc078750 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f53e0005fd0 tx=0x7f53e0009450 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:24:09.347 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.349+0000 7f53d7fff640 1 -- 192.168.123.107:0/1616998345 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f53ec085500 con 0x7f53f8100010 2026-03-09T19:24:09.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:09 vm07 ceph-mon[48545]: osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042] boot 2026-03-09T19:24:09.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:09 vm07 ceph-mon[48545]: osdmap e34: 6 total, 6 up, 6 in 2026-03-09T19:24:09.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:09 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:09.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:09 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/2069623713' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T19:24:09.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:09 vm07 ceph-mon[48545]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 138 MiB used, 100 GiB / 100 GiB avail 2026-03-09T19:24:09.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:09 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1848352344' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T19:24:09.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.442+0000 7f53fd399640 1 -- 192.168.123.107:0/1616998345 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"} v 0) v1 -- 0x7f53bc005b80 con 0x7f53f8100010 2026-03-09T19:24:09.443 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.444+0000 7f53d7fff640 1 -- 192.168.123.107:0/1616998345 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]=0 v34) v1 ==== 93+0+10 (secure 0 0 0) 0x7f53ec084ea0 con 0x7f53f8100010 2026-03-09T19:24:09.443 INFO:teuthology.orchestra.run.vm07.stdout:pg_num: 1 2026-03-09T19:24:09.445 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.447+0000 7f53fd399640 1 -- 192.168.123.107:0/1616998345 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f53cc076290 msgr2=0x7f53cc078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:09.445 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.447+0000 7f53fd399640 1 --2- 192.168.123.107:0/1616998345 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f53cc076290 0x7f53cc078750 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f53e0005fd0 tx=0x7f53e0009450 comp rx=0 tx=0).stop 2026-03-09T19:24:09.445 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.447+0000 7f53fd399640 1 -- 192.168.123.107:0/1616998345 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f53f8100010 msgr2=0x7f53f819f300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:09.445 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.447+0000 7f53fd399640 1 --2- 192.168.123.107:0/1616998345 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f53f8100010 0x7f53f819f300 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f53ec02f7c0 tx=0x7f53ec031ea0 comp rx=0 tx=0).stop 2026-03-09T19:24:09.445 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.447+0000 7f53fd399640 1 -- 192.168.123.107:0/1616998345 shutdown_connections 2026-03-09T19:24:09.445 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.447+0000 7f53fd399640 1 --2- 192.168.123.107:0/1616998345 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f53cc076290 0x7f53cc078750 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:09.445 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.448+0000 7f53fd399640 1 --2- 192.168.123.107:0/1616998345 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f53f8100010 0x7f53f819f300 unknown :-1 s=CLOSED pgs=209 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:09.445 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.448+0000 7f53fd399640 1 --2- 192.168.123.107:0/1616998345 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53f80ff6f0 0x7f53f819edc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:09.446 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.448+0000 7f53fd399640 1 -- 192.168.123.107:0/1616998345 >> 192.168.123.107:0/1616998345 conn(0x7f53f80fb340 msgr2=0x7f53f80fcd90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:09.446 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.448+0000 7f53fd399640 1 -- 192.168.123.107:0/1616998345 shutdown_connections 2026-03-09T19:24:09.446 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.448+0000 7f53fd399640 1 -- 192.168.123.107:0/1616998345 wait complete. 2026-03-09T19:24:09.506 INFO:tasks.cephadm:Setting up client nodes... 
2026-03-09T19:24:09.506 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph auth get-or-create client.0 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-09T19:24:09.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:09 vm08 ceph-mon[57794]: osd.5 [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042] boot 2026-03-09T19:24:09.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:09 vm08 ceph-mon[57794]: osdmap e34: 6 total, 6 up, 6 in 2026-03-09T19:24:09.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:09 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:24:09.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:09 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/2069623713' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T19:24:09.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:09 vm08 ceph-mon[57794]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 138 MiB used, 100 GiB / 100 GiB avail 2026-03-09T19:24:09.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:09 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/1848352344' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T19:24:09.653 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:09.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.889+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/2774688585 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5c4104150 msgr2=0x7fd5c4104550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:09.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.889+0000 7fd5c8dfd640 1 --2- 192.168.123.107:0/2774688585 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5c4104150 0x7fd5c4104550 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7fd5b00099b0 tx=0x7fd5b002f220 comp rx=0 tx=0).stop 2026-03-09T19:24:09.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.890+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/2774688585 shutdown_connections 2026-03-09T19:24:09.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.890+0000 7fd5c8dfd640 1 --2- 192.168.123.107:0/2774688585 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5c4102690 0x7fd5c4102b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:09.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.890+0000 7fd5c8dfd640 1 --2- 192.168.123.107:0/2774688585 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5c4104150 0x7fd5c4104550 unknown :-1 s=CLOSED pgs=210 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:09.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.890+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/2774688585 >> 192.168.123.107:0/2774688585 conn(0x7fd5c40fe220 msgr2=0x7fd5c4100640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:09.888 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.890+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/2774688585 shutdown_connections 2026-03-09T19:24:09.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.890+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/2774688585 wait complete. 2026-03-09T19:24:09.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.891+0000 7fd5c8dfd640 1 Processor -- start 2026-03-09T19:24:09.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.891+0000 7fd5c8dfd640 1 -- start start 2026-03-09T19:24:09.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.891+0000 7fd5c8dfd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5c4102690 0x7fd5c419a5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:09.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.891+0000 7fd5c8dfd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5c4104150 0x7fd5c419ab10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:09.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.891+0000 7fd5c8dfd640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5c419b0e0 con 0x7fd5c4104150 2026-03-09T19:24:09.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.891+0000 7fd5c8dfd640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5c419b250 con 0x7fd5c4102690 2026-03-09T19:24:09.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.891+0000 7fd5c2575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5c4102690 0x7fd5c419a5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:09.890 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.891+0000 7fd5c2575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5c4102690 0x7fd5c419a5d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:40386/0 (socket says 192.168.123.107:40386) 2026-03-09T19:24:09.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.891+0000 7fd5c2575640 1 -- 192.168.123.107:0/4135465110 learned_addr learned my addr 192.168.123.107:0/4135465110 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:09.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.892+0000 7fd5c1d74640 1 --2- 192.168.123.107:0/4135465110 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5c4104150 0x7fd5c419ab10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:09.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.892+0000 7fd5c1d74640 1 -- 192.168.123.107:0/4135465110 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5c4102690 msgr2=0x7fd5c419a5d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:09.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.892+0000 7fd5c1d74640 1 --2- 192.168.123.107:0/4135465110 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5c4102690 0x7fd5c419a5d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:09.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.892+0000 7fd5c1d74640 1 -- 192.168.123.107:0/4135465110 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd5b0009660 con 0x7fd5c4104150 2026-03-09T19:24:09.890 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.892+0000 7fd5c1d74640 1 --2- 192.168.123.107:0/4135465110 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5c4104150 0x7fd5c419ab10 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7fd5b800d700 tx=0x7fd5b800dbd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:09.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.892+0000 7fd5af7fe640 1 -- 192.168.123.107:0/4135465110 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5b8004280 con 0x7fd5c4104150 2026-03-09T19:24:09.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.892+0000 7fd5af7fe640 1 -- 192.168.123.107:0/4135465110 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd5b8004d60 con 0x7fd5c4104150 2026-03-09T19:24:09.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.892+0000 7fd5af7fe640 1 -- 192.168.123.107:0/4135465110 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5b8005230 con 0x7fd5c4104150 2026-03-09T19:24:09.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.892+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/4135465110 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd5c419fcf0 con 0x7fd5c4104150 2026-03-09T19:24:09.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.893+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/4135465110 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd5c4075990 con 0x7fd5c4104150 2026-03-09T19:24:09.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.894+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/4135465110 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7fd590005350 con 0x7fd5c4104150 2026-03-09T19:24:09.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.894+0000 7fd5af7fe640 1 -- 192.168.123.107:0/4135465110 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fd5b80043e0 con 0x7fd5c4104150 2026-03-09T19:24:09.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.895+0000 7fd5af7fe640 1 --2- 192.168.123.107:0/4135465110 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5a0076290 0x7fd5a0078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:09.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.895+0000 7fd5af7fe640 1 -- 192.168.123.107:0/4135465110 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fd5b8097090 con 0x7fd5c4104150 2026-03-09T19:24:09.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.895+0000 7fd5c2575640 1 --2- 192.168.123.107:0/4135465110 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5a0076290 0x7fd5a0078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:09.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.895+0000 7fd5c2575640 1 --2- 192.168.123.107:0/4135465110 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5a0076290 0x7fd5a0078750 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fd5b0002c20 tx=0x7fd5b003a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:09.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:09.897+0000 7fd5af7fe640 1 -- 192.168.123.107:0/4135465110 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+186382 (secure 0 0 0) 0x7fd5b80609e0 con 0x7fd5c4104150 2026-03-09T19:24:10.016 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.017+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/4135465110 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7fd5900051c0 con 0x7fd5c4104150 2026-03-09T19:24:10.019 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.020+0000 7fd5af7fe640 1 -- 192.168.123.107:0/4135465110 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v16) v1 ==== 170+0+59 (secure 0 0 0) 0x7fd5b8060380 con 0x7fd5c4104150 2026-03-09T19:24:10.019 INFO:teuthology.orchestra.run.vm07.stdout:[client.0] 2026-03-09T19:24:10.020 INFO:teuthology.orchestra.run.vm07.stdout: key = AQBaHq9pNnIaARAAg6wfWya6vKUCrP+NPl8gYQ== 2026-03-09T19:24:10.021 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.024+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/4135465110 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5a0076290 msgr2=0x7fd5a0078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:10.021 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.024+0000 7fd5c8dfd640 1 --2- 192.168.123.107:0/4135465110 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5a0076290 0x7fd5a0078750 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fd5b0002c20 tx=0x7fd5b003a040 comp rx=0 tx=0).stop 2026-03-09T19:24:10.022 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.024+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/4135465110 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5c4104150 msgr2=0x7fd5c419ab10 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:10.022 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.024+0000 7fd5c8dfd640 1 --2- 192.168.123.107:0/4135465110 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5c4104150 0x7fd5c419ab10 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7fd5b800d700 tx=0x7fd5b800dbd0 comp rx=0 tx=0).stop 2026-03-09T19:24:10.022 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.024+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/4135465110 shutdown_connections 2026-03-09T19:24:10.022 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.024+0000 7fd5c8dfd640 1 --2- 192.168.123.107:0/4135465110 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5a0076290 0x7fd5a0078750 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:10.022 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.024+0000 7fd5c8dfd640 1 --2- 192.168.123.107:0/4135465110 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5c4104150 0x7fd5c419ab10 unknown :-1 s=CLOSED pgs=211 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:10.022 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.024+0000 7fd5c8dfd640 1 --2- 192.168.123.107:0/4135465110 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5c4102690 0x7fd5c419a5d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:10.022 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.024+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/4135465110 >> 192.168.123.107:0/4135465110 conn(0x7fd5c40fe220 msgr2=0x7fd5c40ffda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:10.022 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.024+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/4135465110 shutdown_connections 2026-03-09T19:24:10.022 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:10.024+0000 7fd5c8dfd640 1 -- 192.168.123.107:0/4135465110 wait complete. 2026-03-09T19:24:10.086 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:24:10.086 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/ceph/ceph.client.0.keyring 2026-03-09T19:24:10.086 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-09T19:24:10.120 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph auth get-or-create client.1 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-09T19:24:10.262 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm08/config 2026-03-09T19:24:10.398 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:10 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/1616998345' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-09T19:24:10.398 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:10 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/4135465110' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T19:24:10.398 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:10 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/4135465110' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T19:24:10.398 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:10 vm08 ceph-mon[57794]: osdmap e35: 6 total, 6 up, 6 in 2026-03-09T19:24:10.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:10 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/1616998345' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-09T19:24:10.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:10 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/4135465110' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T19:24:10.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:10 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/4135465110' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T19:24:10.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:10 vm07 ceph-mon[48545]: osdmap e35: 6 total, 6 up, 6 in 2026-03-09T19:24:10.504 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.504+0000 7f154080a640 1 -- 192.168.123.108:0/2279974838 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1538073600 msgr2=0x7f1538073a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:10.504 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.504+0000 7f154080a640 1 --2- 192.168.123.108:0/2279974838 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1538073600 0x7f1538073a00 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f15240099b0 tx=0x7f152402f220 comp rx=0 tx=0).stop 2026-03-09T19:24:10.504 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.505+0000 7f154080a640 1 -- 192.168.123.108:0/2279974838 shutdown_connections 2026-03-09T19:24:10.504 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.505+0000 7f154080a640 1 --2- 192.168.123.108:0/2279974838 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1538073f40 0x7f15381086f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:10.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.505+0000 7f154080a640 1 --2- 192.168.123.108:0/2279974838 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1538073600 0x7f1538073a00 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:10.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.505+0000 7f154080a640 1 -- 192.168.123.108:0/2279974838 >> 192.168.123.108:0/2279974838 conn(0x7f15380fbf90 msgr2=0x7f15380fe3d0 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:10.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.505+0000 7f154080a640 1 -- 192.168.123.108:0/2279974838 shutdown_connections 2026-03-09T19:24:10.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.505+0000 7f154080a640 1 -- 192.168.123.108:0/2279974838 wait complete. 2026-03-09T19:24:10.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.506+0000 7f154080a640 1 Processor -- start 2026-03-09T19:24:10.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.506+0000 7f154080a640 1 -- start start 2026-03-09T19:24:10.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.506+0000 7f154080a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1538073600 0x7f153819a590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:10.506 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.506+0000 7f154080a640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1538073f40 0x7f153819aad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:10.506 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.506+0000 7f154080a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f153819b0a0 con 0x7f1538073600 2026-03-09T19:24:10.506 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.506+0000 7f154080a640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f153819b210 con 0x7f1538073f40 2026-03-09T19:24:10.506 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.506+0000 7f153e57f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1538073600 0x7f153819a590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T19:24:10.506 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.506+0000 7f153e57f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1538073600 0x7f153819a590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.108:40144/0 (socket says 192.168.123.108:40144) 2026-03-09T19:24:10.506 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.506+0000 7f153e57f640 1 -- 192.168.123.108:0/755434797 learned_addr learned my addr 192.168.123.108:0/755434797 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-09T19:24:10.506 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.506+0000 7f153dd7e640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1538073f40 0x7f153819aad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:10.506 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.507+0000 7f153e57f640 1 -- 192.168.123.108:0/755434797 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1538073f40 msgr2=0x7f153819aad0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:10.506 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.507+0000 7f153e57f640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1538073f40 0x7f153819aad0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:10.506 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.507+0000 7f153e57f640 1 -- 192.168.123.108:0/755434797 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1524009660 con 0x7f1538073600 2026-03-09T19:24:10.506 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.507+0000 7f153dd7e640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1538073f40 0x7f153819aad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:24:10.507 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.507+0000 7f153e57f640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1538073600 0x7f153819a590 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7f152402f730 tx=0x7f1524002910 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:10.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.507+0000 7f15237fe640 1 -- 192.168.123.108:0/755434797 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f152403d070 con 0x7f1538073600 2026-03-09T19:24:10.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.507+0000 7f15237fe640 1 -- 192.168.123.108:0/755434797 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1524002e20 con 0x7f1538073600 2026-03-09T19:24:10.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.507+0000 7f15237fe640 1 -- 192.168.123.108:0/755434797 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f15240416d0 con 0x7f1538073600 2026-03-09T19:24:10.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.507+0000 7f154080a640 1 -- 192.168.123.108:0/755434797 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f153819fc50 con 0x7f1538073600 2026-03-09T19:24:10.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.507+0000 7f154080a640 1 -- 192.168.123.108:0/755434797 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7f15381a0140 con 0x7f1538073600 2026-03-09T19:24:10.509 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.509+0000 7f15237fe640 1 -- 192.168.123.108:0/755434797 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f1524049050 con 0x7f1538073600 2026-03-09T19:24:10.509 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.509+0000 7f154080a640 1 -- 192.168.123.108:0/755434797 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1500005350 con 0x7f1538073600 2026-03-09T19:24:10.509 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.509+0000 7f15237fe640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1514076170 0x7f1514078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:10.509 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.509+0000 7f15237fe640 1 -- 192.168.123.108:0/755434797 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f15240bbe90 con 0x7f1538073600 2026-03-09T19:24:10.509 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.510+0000 7f153dd7e640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1514076170 0x7f1514078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:10.510 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.510+0000 7f153dd7e640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1514076170 0x7f1514078630 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f153819bab0 tx=0x7f1530009290 comp rx=0 tx=0).ready 
entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:10.511 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.512+0000 7f15237fe640 1 -- 192.168.123.108:0/755434797 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f15240852e0 con 0x7f1538073600 2026-03-09T19:24:10.629 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.629+0000 7f154080a640 1 -- 192.168.123.108:0/755434797 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7f15000051c0 con 0x7f1538073600 2026-03-09T19:24:10.632 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.632+0000 7f15237fe640 1 -- 192.168.123.108:0/755434797 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v17) v1 ==== 170+0+59 (secure 0 0 0) 0x7f1524085100 con 0x7f1538073600 2026-03-09T19:24:10.633 INFO:teuthology.orchestra.run.vm08.stdout:[client.1] 2026-03-09T19:24:10.633 INFO:teuthology.orchestra.run.vm08.stdout: key = AQBaHq9pxeewJRAAXDKDwHc6F6FItY9BCIuVyQ== 2026-03-09T19:24:10.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.636+0000 7f154080a640 1 -- 192.168.123.108:0/755434797 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1514076170 msgr2=0x7f1514078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:10.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.636+0000 7f154080a640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1514076170 0x7f1514078630 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto 
rx=0x7f153819bab0 tx=0x7f1530009290 comp rx=0 tx=0).stop 2026-03-09T19:24:10.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.636+0000 7f154080a640 1 -- 192.168.123.108:0/755434797 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1538073600 msgr2=0x7f153819a590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:10.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.636+0000 7f154080a640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1538073600 0x7f153819a590 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7f152402f730 tx=0x7f1524002910 comp rx=0 tx=0).stop 2026-03-09T19:24:10.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.636+0000 7f154080a640 1 -- 192.168.123.108:0/755434797 shutdown_connections 2026-03-09T19:24:10.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.636+0000 7f154080a640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1514076170 0x7f1514078630 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:10.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.636+0000 7f154080a640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1538073f40 0x7f153819aad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:10.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.636+0000 7f154080a640 1 --2- 192.168.123.108:0/755434797 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1538073600 0x7f153819a590 unknown :-1 s=CLOSED pgs=212 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:10.637 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.636+0000 7f154080a640 1 -- 192.168.123.108:0/755434797 >> 192.168.123.108:0/755434797 
conn(0x7f15380fbf90 msgr2=0x7f1538106990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:10.637 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.637+0000 7f154080a640 1 -- 192.168.123.108:0/755434797 shutdown_connections 2026-03-09T19:24:10.637 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:10.637+0000 7f154080a640 1 -- 192.168.123.108:0/755434797 wait complete. 2026-03-09T19:24:10.684 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-09T19:24:10.684 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/ceph/ceph.client.1.keyring 2026-03-09T19:24:10.684 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod 0644 /etc/ceph/ceph.client.1.keyring 2026-03-09T19:24:10.724 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean... 2026-03-09T19:24:10.724 INFO:tasks.cephadm.ceph_manager.ceph:waiting for mgr available 2026-03-09T19:24:10.724 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mgr dump --format=json 2026-03-09T19:24:10.872 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:11.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.108+0000 7fa942676640 1 -- 192.168.123.107:0/1331611696 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa93c0fe630 msgr2=0x7fa93c0fea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:11.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.108+0000 7fa942676640 1 --2- 192.168.123.107:0/1331611696 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa93c0fe630 0x7fa93c0fea70 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7fa92c0099b0 tx=0x7fa92c02f2b0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.109+0000 7fa942676640 1 -- 
192.168.123.107:0/1331611696 shutdown_connections 2026-03-09T19:24:11.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.109+0000 7fa942676640 1 --2- 192.168.123.107:0/1331611696 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa93c0fe630 0x7fa93c0fea70 unknown :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.109+0000 7fa942676640 1 --2- 192.168.123.107:0/1331611696 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa93c1056e0 0x7fa93c105ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.109+0000 7fa942676640 1 -- 192.168.123.107:0/1331611696 >> 192.168.123.107:0/1331611696 conn(0x7fa93c0fa4a0 msgr2=0x7fa93c0fc8c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:11.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.109+0000 7fa942676640 1 -- 192.168.123.107:0/1331611696 shutdown_connections 2026-03-09T19:24:11.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.109+0000 7fa942676640 1 -- 192.168.123.107:0/1331611696 wait complete. 
2026-03-09T19:24:11.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa942676640 1 Processor -- start 2026-03-09T19:24:11.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa942676640 1 -- start start 2026-03-09T19:24:11.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa942676640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa93c0fe630 0x7fa93c196260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:11.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa942676640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa93c1056e0 0x7fa93c1967a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:11.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa942676640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa93c196e30 con 0x7fa93c1056e0 2026-03-09T19:24:11.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa942676640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa93c19aba0 con 0x7fa93c0fe630 2026-03-09T19:24:11.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa940e73640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa93c1056e0 0x7fa93c1967a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:11.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa940e73640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa93c1056e0 0x7fa93c1967a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:43094/0 (socket says 192.168.123.107:43094) 2026-03-09T19:24:11.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa940e73640 1 -- 192.168.123.107:0/2978666947 learned_addr learned my addr 192.168.123.107:0/2978666947 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:11.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa940e73640 1 -- 192.168.123.107:0/2978666947 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa93c0fe630 msgr2=0x7fa93c196260 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T19:24:11.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa940e73640 1 --2- 192.168.123.107:0/2978666947 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa93c0fe630 0x7fa93c196260 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.110+0000 7fa940e73640 1 -- 192.168.123.107:0/2978666947 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa92c009660 con 0x7fa93c1056e0 2026-03-09T19:24:11.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.111+0000 7fa940e73640 1 --2- 192.168.123.107:0/2978666947 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa93c1056e0 0x7fa93c1967a0 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7fa92c02f7c0 tx=0x7fa92c004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:11.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.111+0000 7fa92a7fc640 1 -- 192.168.123.107:0/2978666947 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa92c03d070 con 0x7fa93c1056e0 2026-03-09T19:24:11.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.111+0000 7fa92a7fc640 1 -- 
192.168.123.107:0/2978666947 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa92c031ea0 con 0x7fa93c1056e0 2026-03-09T19:24:11.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.111+0000 7fa92a7fc640 1 -- 192.168.123.107:0/2978666947 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa92c038470 con 0x7fa93c1056e0 2026-03-09T19:24:11.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.111+0000 7fa942676640 1 -- 192.168.123.107:0/2978666947 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa93c19ae20 con 0x7fa93c1056e0 2026-03-09T19:24:11.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.111+0000 7fa942676640 1 -- 192.168.123.107:0/2978666947 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa93c19b360 con 0x7fa93c1056e0 2026-03-09T19:24:11.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.112+0000 7fa942676640 1 -- 192.168.123.107:0/2978666947 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa904005350 con 0x7fa93c1056e0 2026-03-09T19:24:11.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.113+0000 7fa92a7fc640 1 -- 192.168.123.107:0/2978666947 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fa92c031320 con 0x7fa93c1056e0 2026-03-09T19:24:11.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.114+0000 7fa92a7fc640 1 --2- 192.168.123.107:0/2978666947 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa9180761c0 0x7fa918078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:11.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.114+0000 7fa941674640 1 --2- 
192.168.123.107:0/2978666947 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa9180761c0 0x7fa918078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:11.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.115+0000 7fa941674640 1 --2- 192.168.123.107:0/2978666947 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa9180761c0 0x7fa918078680 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fa930009800 tx=0x7fa930006d20 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:11.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.115+0000 7fa92a7fc640 1 -- 192.168.123.107:0/2978666947 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fa92c084c60 con 0x7fa93c1056e0 2026-03-09T19:24:11.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.115+0000 7fa92a7fc640 1 -- 192.168.123.107:0/2978666947 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa92c084f20 con 0x7fa93c1056e0 2026-03-09T19:24:11.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.227+0000 7fa942676640 1 -- 192.168.123.107:0/2978666947 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mgr dump", "format": "json"} v 0) v1 -- 0x7fa9040051c0 con 0x7fa93c1056e0 2026-03-09T19:24:11.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.230+0000 7fa92a7fc640 1 -- 192.168.123.107:0/2978666947 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mgr dump", "format": "json"}]=0 v19) v1 ==== 74+0+189855 (secure 0 0 0) 0x7fa92c0848c0 con 0x7fa93c1056e0 2026-03-09T19:24:11.228 INFO:teuthology.orchestra.run.vm07.stdout: 
2026-03-09T19:24:11.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.234+0000 7fa942676640 1 -- 192.168.123.107:0/2978666947 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa9180761c0 msgr2=0x7fa918078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:11.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.234+0000 7fa942676640 1 --2- 192.168.123.107:0/2978666947 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa9180761c0 0x7fa918078680 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fa930009800 tx=0x7fa930006d20 comp rx=0 tx=0).stop 2026-03-09T19:24:11.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.235+0000 7fa942676640 1 -- 192.168.123.107:0/2978666947 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa93c1056e0 msgr2=0x7fa93c1967a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:11.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.235+0000 7fa942676640 1 --2- 192.168.123.107:0/2978666947 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa93c1056e0 0x7fa93c1967a0 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7fa92c02f7c0 tx=0x7fa92c004290 comp rx=0 tx=0).stop 2026-03-09T19:24:11.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.235+0000 7fa942676640 1 -- 192.168.123.107:0/2978666947 shutdown_connections 2026-03-09T19:24:11.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.235+0000 7fa942676640 1 --2- 192.168.123.107:0/2978666947 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa9180761c0 0x7fa918078680 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.235+0000 7fa942676640 1 --2- 192.168.123.107:0/2978666947 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa93c1056e0 0x7fa93c1967a0 unknown :-1 s=CLOSED pgs=214 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.235+0000 7fa942676640 1 --2- 192.168.123.107:0/2978666947 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa93c0fe630 0x7fa93c196260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.235+0000 7fa942676640 1 -- 192.168.123.107:0/2978666947 >> 192.168.123.107:0/2978666947 conn(0x7fa93c0fa4a0 msgr2=0x7fa93c108400 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:11.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.236+0000 7fa942676640 1 -- 192.168.123.107:0/2978666947 shutdown_connections 2026-03-09T19:24:11.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.236+0000 7fa942676640 1 -- 192.168.123.107:0/2978666947 wait complete. 2026-03-09T19:24:11.263 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:11 vm07 ceph-mon[48545]: pgmap v64: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:11.263 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:11 vm07 ceph-mon[48545]: from='client.? 192.168.123.108:0/755434797' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T19:24:11.263 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:11 vm07 ceph-mon[48545]: from='client.? 
192.168.123.108:0/755434797' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T19:24:11.263 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:11 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/2978666947' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-09T19:24:11.285 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":19,"flags":0,"active_gid":14227,"active_name":"vm07.xacuym","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6800","nonce":2885771920},{"type":"v1","addr":"192.168.123.107:6801","nonce":2885771920}]},"active_addr":"192.168.123.107:6801/2885771920","active_change":"2026-03-09T19:22:48.312610+0000","active_mgr_features":4540138322906710015,"available":true,"standbys":[{"gid":14250,"name":"vm08.mxylvw","mgr_features":4540138322906710015,"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health 
status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP 
port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE 
capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the 
tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/loki:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Node exporter container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:1.0.0","min":"","max":"","enum_allowed":[],"desc":"Nvme-of container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/promtail:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with 
`--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"int","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"quay.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"hw_monitoring":{"name":"hw_monitoring","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Deploy hw monitoring daemon on every host.","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. 
Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are 
removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"oob_default_addr":{"name":"oob_default_addr","type":"str","level":"advanced","flags":0,"default_value":"169.254.1.1","min":"","max":"","enum_allowed":[],"desc":"Default address for RedFish API (oob management).","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. 
This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. 
Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],
"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"",
"max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFA
NA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"
default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_PO
LICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowe
d":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message 
of the day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value
":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":
0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","flags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level
","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local 
pool","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bo
ol","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","typ
e":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":""
,"enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The 
factor by which the `NEW PG_NUM` must vary from the current `PG_NUM` before being accepted. Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","
max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_a
lso":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_requests":{"name":"max_requests","type":"int","level":"advanced","flags":0,"default_value":"500","min":"","max":"","enum_allowed":[],"desc":"Maximum number of requests to keep in memory. 
When a new request comes in, the oldest request will be removed if the number of requests exceeds the max request number. If an unfinished request is removed, an error message will be logged in the ceph-mgr log.","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"secondary_zone_period_retry_limit":{"name":"secondary_zone_period_retry_limit","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"RGW module period update retry limit for secondary 
site","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":
"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tag
s":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the 
cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","lon
g_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advan
ced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_cloning":{"name":"pause_cloning","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous cloner 
threads","long_desc":"","tags":[],"see_also":[]},"pause_purging":{"name":"pause_purging","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous subvolume purge threads","long_desc":"","tags":[],"see_also":[]},"periodic_async_work":{"name":"periodic_async_work","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Periodically check for async work","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay seconds","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_no_wait":{"name":"snapshot_clone_no_wait","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Reject subvolume clone request when cloner threads are 
busy","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"d
efault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"default_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}]}],"modules":["cephadm","dashboard","iostat","nfs","prometheus","restful"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts 
to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across 
cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to 
days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt 
optimization","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE 
capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the 
tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/loki:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:1.0.0","min":"","max":"","enum_allowed":[],"desc":"Nvme-of container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/promtail:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with 
`--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"int","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"quay.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"hw_monitoring":{"name":"hw_monitoring","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Deploy hw monitoring daemon on every host.","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. 
Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are 
removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"oob_default_addr":{"name":"oob_default_addr","type":"str","level":"advanced","flags":0,"default_value":"169.254.1.1","min":"","max":"","enum_allowed":[],"desc":"Default address for RedFish API (oob management).","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. 
This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. 
Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],
"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"",
"max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFA
NA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"
default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_PO
LICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowe
d":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message 
of the day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value
":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":
0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","flags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level
","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local 
pool","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bo
ol","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","typ
e":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":""
,"enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The 
factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","
max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_a
lso":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_requests":{"name":"max_requests","type":"int","level":"advanced","flags":0,"default_value":"500","min":"","max":"","enum_allowed":[],"desc":"Maximum number of requests to keep in memory. 
When new request comes in, the oldest request will be removed if the number of requests exceeds the max request number.if un-finished request is removed, error message will be logged in the ceph-mgr log.","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"secondary_zone_period_retry_limit":{"name":"secondary_zone_period_retry_limit","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"RGW module period update retry limit for secondary 
site","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":
"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tag
s":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the 
cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","lon
g_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advan
ced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_cloning":{"name":"pause_cloning","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous cloner 
threads","long_desc":"","tags":[],"see_also":[]},"pause_purging":{"name":"pause_purging","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous subvolume purge threads","long_desc":"","tags":[],"see_also":[]},"periodic_async_work":{"name":"periodic_async_work","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Periodically check for async work","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay seconds","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_no_wait":{"name":"snapshot_clone_no_wait","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Reject subvolume clone request when cloner threads are 
busy","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"d
efault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"default_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{"dashboard":"https://192.168.123.107:8443/","prometheus":"http://192.168.123.107:9283/"},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"force_disabled_modules":{},"last_failure_osd_epoch":5,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.107:0","nonce":910287422}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.107:0","nonce":2208266146}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.107:0","nonce":1009274776}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.107:0","nonce":3787716570}]}]} 2026-03-09T19:24:11.287 INFO:tasks.cephadm.ceph_manager.ceph:mgr available! 
2026-03-09T19:24:11.287 INFO:tasks.cephadm.ceph_manager.ceph:waiting for all up 2026-03-09T19:24:11.287 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd dump --format=json 2026-03-09T19:24:11.436 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:11.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:11 vm08 ceph-mon[57794]: pgmap v64: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:11.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:11 vm08 ceph-mon[57794]: from='client.? 192.168.123.108:0/755434797' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T19:24:11.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:11 vm08 ceph-mon[57794]: from='client.? 192.168.123.108:0/755434797' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T19:24:11.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:11 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/2978666947' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-09T19:24:11.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.683+0000 7ff85f32e640 1 -- 192.168.123.107:0/1450342598 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8580ff460 msgr2=0x7ff8580ff840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:11.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.683+0000 7ff85f32e640 1 --2- 192.168.123.107:0/1450342598 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8580ff460 0x7ff8580ff840 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7ff8480099b0 tx=0x7ff84802f220 comp rx=0 tx=0).stop 2026-03-09T19:24:11.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.684+0000 7ff85f32e640 1 -- 192.168.123.107:0/1450342598 shutdown_connections 2026-03-09T19:24:11.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.684+0000 7ff85f32e640 1 --2- 192.168.123.107:0/1450342598 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8580ffd80 0x7ff85810cc50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.684+0000 7ff85f32e640 1 --2- 192.168.123.107:0/1450342598 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8580ff460 0x7ff8580ff840 unknown :-1 s=CLOSED pgs=215 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.685+0000 7ff85f32e640 1 -- 192.168.123.107:0/1450342598 >> 192.168.123.107:0/1450342598 conn(0x7ff8580fb2f0 msgr2=0x7ff8580fd710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:11.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.685+0000 7ff85f32e640 1 -- 192.168.123.107:0/1450342598 shutdown_connections 2026-03-09T19:24:11.683 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.685+0000 7ff85f32e640 1 -- 192.168.123.107:0/1450342598 wait complete. 2026-03-09T19:24:11.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.686+0000 7ff85f32e640 1 Processor -- start 2026-03-09T19:24:11.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.686+0000 7ff85f32e640 1 -- start start 2026-03-09T19:24:11.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.686+0000 7ff85f32e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8580ff460 0x7ff8581a06a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:11.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.686+0000 7ff85f32e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8580ffd80 0x7ff8581a0be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:11.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.686+0000 7ff85f32e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff85819a7e0 con 0x7ff8580ffd80 2026-03-09T19:24:11.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.686+0000 7ff85f32e640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff85819a950 con 0x7ff8580ff460 2026-03-09T19:24:11.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.686+0000 7ff85c8a2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8580ffd80 0x7ff8581a0be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:11.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.686+0000 7ff85d0a3640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8580ff460 0x7ff8581a06a0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:11.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.686+0000 7ff85d0a3640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8580ff460 0x7ff8581a06a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:40412/0 (socket says 192.168.123.107:40412) 2026-03-09T19:24:11.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.686+0000 7ff85d0a3640 1 -- 192.168.123.107:0/3315201317 learned_addr learned my addr 192.168.123.107:0/3315201317 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:11.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.687+0000 7ff85c8a2640 1 -- 192.168.123.107:0/3315201317 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8580ff460 msgr2=0x7ff8581a06a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:11.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.687+0000 7ff85c8a2640 1 --2- 192.168.123.107:0/3315201317 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8580ff460 0x7ff8581a06a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.687+0000 7ff85c8a2640 1 -- 192.168.123.107:0/3315201317 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff848009660 con 0x7ff8580ffd80 2026-03-09T19:24:11.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.687+0000 7ff85c8a2640 1 --2- 192.168.123.107:0/3315201317 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8580ffd80 0x7ff8581a0be0 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7ff84000b4d0 
tx=0x7ff84000b9a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:11.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.687+0000 7ff84e7fc640 1 -- 192.168.123.107:0/3315201317 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff840004280 con 0x7ff8580ffd80 2026-03-09T19:24:11.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.687+0000 7ff84e7fc640 1 -- 192.168.123.107:0/3315201317 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff8400043e0 con 0x7ff8580ffd80 2026-03-09T19:24:11.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.687+0000 7ff84e7fc640 1 -- 192.168.123.107:0/3315201317 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff840010bc0 con 0x7ff8580ffd80 2026-03-09T19:24:11.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.687+0000 7ff85f32e640 1 -- 192.168.123.107:0/3315201317 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff85819ac30 con 0x7ff8580ffd80 2026-03-09T19:24:11.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.687+0000 7ff85f32e640 1 -- 192.168.123.107:0/3315201317 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff85819b180 con 0x7ff8580ffd80 2026-03-09T19:24:11.687 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.688+0000 7ff85f32e640 1 -- 192.168.123.107:0/3315201317 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff820005350 con 0x7ff8580ffd80 2026-03-09T19:24:11.690 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.692+0000 7ff84e7fc640 1 -- 192.168.123.107:0/3315201317 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7ff840010d20 con 
0x7ff8580ffd80 2026-03-09T19:24:11.690 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.692+0000 7ff84e7fc640 1 --2- 192.168.123.107:0/3315201317 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff830076290 0x7ff830078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:11.690 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.692+0000 7ff85d0a3640 1 --2- 192.168.123.107:0/3315201317 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff830076290 0x7ff830078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:11.690 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.692+0000 7ff84e7fc640 1 -- 192.168.123.107:0/3315201317 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7ff8400971c0 con 0x7ff8580ffd80 2026-03-09T19:24:11.690 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.693+0000 7ff85d0a3640 1 --2- 192.168.123.107:0/3315201317 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff830076290 0x7ff830078750 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7ff84802f730 tx=0x7ff84803a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:11.691 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.693+0000 7ff84e7fc640 1 -- 192.168.123.107:0/3315201317 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff840060b10 con 0x7ff8580ffd80 2026-03-09T19:24:11.781 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.782+0000 7ff85f32e640 1 -- 192.168.123.107:0/3315201317 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": 
"osd dump", "format": "json"} v 0) v1 -- 0x7ff8200051c0 con 0x7ff8580ffd80 2026-03-09T19:24:11.781 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.783+0000 7ff84e7fc640 1 -- 192.168.123.107:0/3315201317 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v35) v1 ==== 74+0+11526 (secure 0 0 0) 0x7ff8400604b0 con 0x7ff8580ffd80 2026-03-09T19:24:11.782 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:11.782 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":35,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","created":"2026-03-09T19:21:45.299152+0000","modified":"2026-03-09T19:24:10.236215+0000","last_up_change":"2026-03-09T19:24:08.231617+0000","last_in_change":"2026-03-09T19:23:56.638316+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T19:23:42.363375+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","au
id":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"133448fe-3146-488b-ab63-557fcf7f955d","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6802","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6803","nonce":2192525186}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6804","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6805","nonce":2192525186}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6808","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6809","nonce":2192525186}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6806","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6807","nonce":2192525186}]},"public_addr":"192.168.123.107:6803/2192525186","cluster_addr":"192.168.123.107:6805/2192525186","heartbeat_back_addr":"192.168.1
23.107:6809/2192525186","heartbeat_front_addr":"192.168.123.107:6807/2192525186","state":["exists","up"]},{"osd":1,"uuid":"0f0316a2-1b3a-4bd0-b463-b3d326b0fb51","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":26,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6810","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6811","nonce":407429515}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6812","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6813","nonce":407429515}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6816","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6817","nonce":407429515}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6814","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6815","nonce":407429515}]},"public_addr":"192.168.123.107:6811/407429515","cluster_addr":"192.168.123.107:6813/407429515","heartbeat_back_addr":"192.168.123.107:6817/407429515","heartbeat_front_addr":"192.168.123.107:6815/407429515","state":["exists","up"]},{"osd":2,"uuid":"8ea4ceda-8f60-4699-976a-464d32f7e944","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6818","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6819","nonce":261501426}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6820","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6821","nonce":261501426}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6825","nonce":261501426}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6822","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6823","nonce":261501426}]},"public_addr":"192.168
.123.107:6819/261501426","cluster_addr":"192.168.123.107:6821/261501426","heartbeat_back_addr":"192.168.123.107:6825/261501426","heartbeat_front_addr":"192.168.123.107:6823/261501426","state":["exists","up"]},{"osd":3,"uuid":"175a772b-2920-452f-9d34-5c2a70bb1cb1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":25,"up_thru":29,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6800","nonce":1988644338},{"type":"v1","addr":"192.168.123.108:6801","nonce":1988644338}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6802","nonce":1988644338},{"type":"v1","addr":"192.168.123.108:6803","nonce":1988644338}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6806","nonce":1988644338},{"type":"v1","addr":"192.168.123.108:6807","nonce":1988644338}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6804","nonce":1988644338},{"type":"v1","addr":"192.168.123.108:6805","nonce":1988644338}]},"public_addr":"192.168.123.108:6801/1988644338","cluster_addr":"192.168.123.108:6803/1988644338","heartbeat_back_addr":"192.168.123.108:6807/1988644338","heartbeat_front_addr":"192.168.123.108:6805/1988644338","state":["exists","up"]},{"osd":4,"uuid":"a265b553-1e86-4bab-beff-db9f81381120","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":30,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6808","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6809","nonce":68783370}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6810","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6811","nonce":68783370}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6814","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6815","nonce":68783370}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:68
12","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6813","nonce":68783370}]},"public_addr":"192.168.123.108:6809/68783370","cluster_addr":"192.168.123.108:6811/68783370","heartbeat_back_addr":"192.168.123.108:6815/68783370","heartbeat_front_addr":"192.168.123.108:6813/68783370","state":["exists","up"]},{"osd":5,"uuid":"44d4390e-f9a3-490b-9d44-f60b53e3d568","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":34,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6816","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6817","nonce":99511042}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6818","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6819","nonce":99511042}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6822","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6823","nonce":99511042}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6820","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6821","nonce":99511042}]},"public_addr":"192.168.123.108:6817/99511042","cluster_addr":"192.168.123.108:6819/99511042","heartbeat_back_addr":"192.168.123.108:6823/99511042","heartbeat_front_addr":"192.168.123.108:6821/99511042","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:18.794973+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:29.548141+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:39.080417+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probabi
lity":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:47.991412+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:56.496482+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:24:05.257703+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.107:0/3701716623":"2026-03-10T19:21:57.086091+0000","192.168.123.107:0/2365276262":"2026-03-10T19:22:09.108986+0000","192.168.123.107:6801/1021580706":"2026-03-10T19:22:48.312502+0000","192.168.123.107:6800/1456056000":"2026-03-10T19:21:57.086091+0000","192.168.123.107:0/961195961":"2026-03-10T19:21:57.086091+0000","192.168.123.107:6801/1456056000":"2026-03-10T19:21:57.086091+0000","192.168.123.107:0/4096740693":"2026-03-10T19:22:48.312502+0000","192.168.123.107:0/1578432790":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/702157851":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/1848167886":"2026-03-10T19:22:48.312502+0000","192.168.123.107:6801/1318262611":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/2125348601":"2026-03-10T19:21:57.086091+0000","192.168.123.107:6800/1318262611":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/2275130412":"2026-03-10T19:22:48.312502+0000","192.168.123.107:6800/1021580706":"2026-03-10T19:22:48.312502+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"
recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T19:24:11.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.786+0000 7ff85f32e640 1 -- 192.168.123.107:0/3315201317 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff830076290 msgr2=0x7ff830078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:11.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.786+0000 7ff85f32e640 1 --2- 192.168.123.107:0/3315201317 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff830076290 0x7ff830078750 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7ff84802f730 tx=0x7ff84803a040 comp rx=0 tx=0).stop 2026-03-09T19:24:11.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.786+0000 7ff85f32e640 1 -- 192.168.123.107:0/3315201317 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8580ffd80 msgr2=0x7ff8581a0be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:11.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.786+0000 7ff85f32e640 1 --2- 192.168.123.107:0/3315201317 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8580ffd80 0x7ff8581a0be0 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7ff84000b4d0 tx=0x7ff84000b9a0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.787+0000 7ff85f32e640 1 -- 192.168.123.107:0/3315201317 shutdown_connections 2026-03-09T19:24:11.785 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.787+0000 7ff85f32e640 1 --2- 192.168.123.107:0/3315201317 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff830076290 0x7ff830078750 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.785 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.787+0000 7ff85f32e640 1 --2- 
192.168.123.107:0/3315201317 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8580ffd80 0x7ff8581a0be0 unknown :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.785 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.787+0000 7ff85f32e640 1 --2- 192.168.123.107:0/3315201317 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8580ff460 0x7ff8581a06a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:11.785 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.787+0000 7ff85f32e640 1 -- 192.168.123.107:0/3315201317 >> 192.168.123.107:0/3315201317 conn(0x7ff8580fb2f0 msgr2=0x7ff8580fbbd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:11.785 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.787+0000 7ff85f32e640 1 -- 192.168.123.107:0/3315201317 shutdown_connections 2026-03-09T19:24:11.785 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:11.787+0000 7ff85f32e640 1 -- 192.168.123.107:0/3315201317 wait complete. 2026-03-09T19:24:11.843 INFO:tasks.cephadm.ceph_manager.ceph:all up! 
2026-03-09T19:24:11.843 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd dump --format=json 2026-03-09T19:24:11.992 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:12.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.223+0000 7f26c74f4640 1 -- 192.168.123.107:0/1614127801 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26c0105260 msgr2=0x7f26c00fee60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:12.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.223+0000 7f26c74f4640 1 --2- 192.168.123.107:0/1614127801 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26c0105260 0x7f26c00fee60 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7f26b40099b0 tx=0x7f26b402f220 comp rx=0 tx=0).stop 2026-03-09T19:24:12.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.224+0000 7f26c74f4640 1 -- 192.168.123.107:0/1614127801 shutdown_connections 2026-03-09T19:24:12.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.224+0000 7f26c74f4640 1 --2- 192.168.123.107:0/1614127801 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26c0105260 0x7f26c00fee60 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:12.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.224+0000 7f26c74f4640 1 --2- 192.168.123.107:0/1614127801 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26c00678f0 0x7f26c00fe920 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:12.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.224+0000 7f26c74f4640 1 -- 192.168.123.107:0/1614127801 >> 192.168.123.107:0/1614127801 conn(0x7f26c00fa5e0 
msgr2=0x7f26c00fca00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:12.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.224+0000 7f26c74f4640 1 -- 192.168.123.107:0/1614127801 shutdown_connections 2026-03-09T19:24:12.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.224+0000 7f26c74f4640 1 -- 192.168.123.107:0/1614127801 wait complete. 2026-03-09T19:24:12.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.224+0000 7f26c74f4640 1 Processor -- start 2026-03-09T19:24:12.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.224+0000 7f26c74f4640 1 -- start start 2026-03-09T19:24:12.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.225+0000 7f26c74f4640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26c00678f0 0x7f26c01a0610 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:12.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.225+0000 7f26c74f4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26c0105260 0x7f26c01a0b50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:12.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.225+0000 7f26c74f4640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f26c01a1170 con 0x7f26c0105260 2026-03-09T19:24:12.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.225+0000 7f26c74f4640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f26c019a700 con 0x7f26c00678f0 2026-03-09T19:24:12.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.225+0000 7f26c4a68640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26c0105260 0x7f26c01a0b50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:12.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.225+0000 7f26c4a68640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26c0105260 0x7f26c01a0b50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43114/0 (socket says 192.168.123.107:43114) 2026-03-09T19:24:12.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.225+0000 7f26c4a68640 1 -- 192.168.123.107:0/1373485388 learned_addr learned my addr 192.168.123.107:0/1373485388 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:12.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.225+0000 7f26c5269640 1 --2- 192.168.123.107:0/1373485388 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26c00678f0 0x7f26c01a0610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:12.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.226+0000 7f26c5269640 1 -- 192.168.123.107:0/1373485388 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26c0105260 msgr2=0x7f26c01a0b50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:12.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.226+0000 7f26c5269640 1 --2- 192.168.123.107:0/1373485388 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26c0105260 0x7f26c01a0b50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:12.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.226+0000 7f26c5269640 1 -- 192.168.123.107:0/1373485388 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f26b4009660 con 0x7f26c00678f0 
2026-03-09T19:24:12.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.226+0000 7f26c4a68640 1 --2- 192.168.123.107:0/1373485388 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26c0105260 0x7f26c01a0b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:24:12.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.226+0000 7f26c5269640 1 --2- 192.168.123.107:0/1373485388 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26c00678f0 0x7f26c01a0610 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f26b000a5d0 tx=0x7f26b00077b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:12.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.226+0000 7f26ae7fc640 1 -- 192.168.123.107:0/1373485388 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f26b0003e50 con 0x7f26c00678f0 2026-03-09T19:24:12.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.226+0000 7f26c74f4640 1 -- 192.168.123.107:0/1373485388 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f26c019a9e0 con 0x7f26c00678f0 2026-03-09T19:24:12.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.227+0000 7f26c74f4640 1 -- 192.168.123.107:0/1373485388 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f26c019af30 con 0x7f26c00678f0 2026-03-09T19:24:12.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.227+0000 7f26ae7fc640 1 -- 192.168.123.107:0/1373485388 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f26b0010040 con 0x7f26c00678f0 2026-03-09T19:24:12.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.227+0000 7f26ae7fc640 1 -- 192.168.123.107:0/1373485388 <== mon.1 
v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f26b0014990 con 0x7f26c00678f0 2026-03-09T19:24:12.226 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.228+0000 7f26ae7fc640 1 -- 192.168.123.107:0/1373485388 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f26b001e430 con 0x7f26c00678f0 2026-03-09T19:24:12.226 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.228+0000 7f26c74f4640 1 -- 192.168.123.107:0/1373485388 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f26c0100060 con 0x7f26c00678f0 2026-03-09T19:24:12.227 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.229+0000 7f26ae7fc640 1 --2- 192.168.123.107:0/1373485388 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f269c076170 0x7f269c078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:12.227 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.229+0000 7f26c4a68640 1 --2- 192.168.123.107:0/1373485388 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f269c076170 0x7f269c078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:12.227 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.229+0000 7f26ae7fc640 1 -- 192.168.123.107:0/1373485388 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f26b0097290 con 0x7f26c00678f0 2026-03-09T19:24:12.227 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.229+0000 7f26c4a68640 1 --2- 192.168.123.107:0/1373485388 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f269c076170 0x7f269c078630 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto 
rx=0x7f26c019bb70 tx=0x7f26b403a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:12.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.231+0000 7f26ae7fc640 1 -- 192.168.123.107:0/1373485388 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f26b0061c70 con 0x7f26c00678f0 2026-03-09T19:24:12.322 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:12 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/3315201317' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T19:24:12.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.323+0000 7f26c74f4640 1 -- 192.168.123.107:0/1373485388 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f26c00fe920 con 0x7f26c00678f0 2026-03-09T19:24:12.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.327+0000 7f26ae7fc640 1 -- 192.168.123.107:0/1373485388 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v35) v1 ==== 74+0+11526 (secure 0 0 0) 0x7f26b0061610 con 0x7f26c00678f0 2026-03-09T19:24:12.326 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:12.326 
INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":35,"fsid":"17715774-1bed-11f1-9ad8-1bc9d74ff594","created":"2026-03-09T19:21:45.299152+0000","modified":"2026-03-09T19:24:10.236215+0000","last_up_change":"2026-03-09T19:24:08.231617+0000","last_in_change":"2026-03-09T19:23:56.638316+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T19:23:42.363375+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"non
e"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"133448fe-3146-488b-ab63-557fcf7f955d","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6802","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6803","nonce":2192525186}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6804","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6805","nonce":2192525186}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6808","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6809","nonce":2192525186}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6806","nonce":2192525186},{"type":"v1","addr":"192.168.123.107:6807","nonce":2192525186}]},"public_addr":"192.168.123.107:6803/2192525186","cluster_addr":"192.168.123.107:6805/2192525186","heartbeat_back_addr":"192.168.123.107:6809/2192525186","heartbeat_front_addr":"192.168.123.107:6807/2192525186","state":["exists","up"]},{"osd":1,"uuid":"0f0316a2-1b3a-4bd0-b463-b3d326b0fb51","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":26,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6810","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6811","nonce":407429515}]},"cluster_addrs":{"addrvec":[{"type"
:"v2","addr":"192.168.123.107:6812","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6813","nonce":407429515}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6816","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6817","nonce":407429515}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6814","nonce":407429515},{"type":"v1","addr":"192.168.123.107:6815","nonce":407429515}]},"public_addr":"192.168.123.107:6811/407429515","cluster_addr":"192.168.123.107:6813/407429515","heartbeat_back_addr":"192.168.123.107:6817/407429515","heartbeat_front_addr":"192.168.123.107:6815/407429515","state":["exists","up"]},{"osd":2,"uuid":"8ea4ceda-8f60-4699-976a-464d32f7e944","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6818","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6819","nonce":261501426}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6820","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6821","nonce":261501426}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6825","nonce":261501426}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6822","nonce":261501426},{"type":"v1","addr":"192.168.123.107:6823","nonce":261501426}]},"public_addr":"192.168.123.107:6819/261501426","cluster_addr":"192.168.123.107:6821/261501426","heartbeat_back_addr":"192.168.123.107:6825/261501426","heartbeat_front_addr":"192.168.123.107:6823/261501426","state":["exists","up"]},{"osd":3,"uuid":"175a772b-2920-452f-9d34-5c2a70bb1cb1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":25,"up_thru":29,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6800","nonce":1988644
338},{"type":"v1","addr":"192.168.123.108:6801","nonce":1988644338}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6802","nonce":1988644338},{"type":"v1","addr":"192.168.123.108:6803","nonce":1988644338}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6806","nonce":1988644338},{"type":"v1","addr":"192.168.123.108:6807","nonce":1988644338}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6804","nonce":1988644338},{"type":"v1","addr":"192.168.123.108:6805","nonce":1988644338}]},"public_addr":"192.168.123.108:6801/1988644338","cluster_addr":"192.168.123.108:6803/1988644338","heartbeat_back_addr":"192.168.123.108:6807/1988644338","heartbeat_front_addr":"192.168.123.108:6805/1988644338","state":["exists","up"]},{"osd":4,"uuid":"a265b553-1e86-4bab-beff-db9f81381120","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":30,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6808","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6809","nonce":68783370}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6810","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6811","nonce":68783370}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6814","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6815","nonce":68783370}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6812","nonce":68783370},{"type":"v1","addr":"192.168.123.108:6813","nonce":68783370}]},"public_addr":"192.168.123.108:6809/68783370","cluster_addr":"192.168.123.108:6811/68783370","heartbeat_back_addr":"192.168.123.108:6815/68783370","heartbeat_front_addr":"192.168.123.108:6813/68783370","state":["exists","up"]},{"osd":5,"uuid":"44d4390e-f9a3-490b-9d44-f60b53e3d568","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":34,"up_thru":0,"down_a
t":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6816","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6817","nonce":99511042}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6818","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6819","nonce":99511042}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6822","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6823","nonce":99511042}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6820","nonce":99511042},{"type":"v1","addr":"192.168.123.108:6821","nonce":99511042}]},"public_addr":"192.168.123.108:6817/99511042","cluster_addr":"192.168.123.108:6819/99511042","heartbeat_back_addr":"192.168.123.108:6823/99511042","heartbeat_front_addr":"192.168.123.108:6821/99511042","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:18.794973+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:29.548141+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:39.080417+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:47.991412+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T19:23:56.496482+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_s
crub":"2026-03-09T19:24:05.257703+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.107:0/3701716623":"2026-03-10T19:21:57.086091+0000","192.168.123.107:0/2365276262":"2026-03-10T19:22:09.108986+0000","192.168.123.107:6801/1021580706":"2026-03-10T19:22:48.312502+0000","192.168.123.107:6800/1456056000":"2026-03-10T19:21:57.086091+0000","192.168.123.107:0/961195961":"2026-03-10T19:21:57.086091+0000","192.168.123.107:6801/1456056000":"2026-03-10T19:21:57.086091+0000","192.168.123.107:0/4096740693":"2026-03-10T19:22:48.312502+0000","192.168.123.107:0/1578432790":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/702157851":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/1848167886":"2026-03-10T19:22:48.312502+0000","192.168.123.107:6801/1318262611":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/2125348601":"2026-03-10T19:21:57.086091+0000","192.168.123.107:6800/1318262611":"2026-03-10T19:22:09.108986+0000","192.168.123.107:0/2275130412":"2026-03-10T19:22:48.312502+0000","192.168.123.107:6800/1021580706":"2026-03-10T19:22:48.312502+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T19:24:12.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.330+0000 7f26c74f4640 1 -- 192.168.123.107:0/1373485388 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f269c076170 msgr2=0x7f269c078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:12.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.330+0000 7f26c74f4640 1 --2- 
192.168.123.107:0/1373485388 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f269c076170 0x7f269c078630 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f26c019bb70 tx=0x7f26b403a040 comp rx=0 tx=0).stop 2026-03-09T19:24:12.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.331+0000 7f26c74f4640 1 -- 192.168.123.107:0/1373485388 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26c00678f0 msgr2=0x7f26c01a0610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:12.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.331+0000 7f26c74f4640 1 --2- 192.168.123.107:0/1373485388 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26c00678f0 0x7f26c01a0610 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f26b000a5d0 tx=0x7f26b00077b0 comp rx=0 tx=0).stop 2026-03-09T19:24:12.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.331+0000 7f26c74f4640 1 -- 192.168.123.107:0/1373485388 shutdown_connections 2026-03-09T19:24:12.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.331+0000 7f26c74f4640 1 --2- 192.168.123.107:0/1373485388 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f269c076170 0x7f269c078630 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:12.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.331+0000 7f26c74f4640 1 --2- 192.168.123.107:0/1373485388 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26c0105260 0x7f26c01a0b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:12.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.331+0000 7f26c74f4640 1 --2- 192.168.123.107:0/1373485388 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26c00678f0 0x7f26c01a0610 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-09T19:24:12.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.331+0000 7f26c74f4640 1 -- 192.168.123.107:0/1373485388 >> 192.168.123.107:0/1373485388 conn(0x7f26c00fa5e0 msgr2=0x7f26c01088e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:12.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.331+0000 7f26c74f4640 1 -- 192.168.123.107:0/1373485388 shutdown_connections 2026-03-09T19:24:12.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:12.332+0000 7f26c74f4640 1 -- 192.168.123.107:0/1373485388 wait complete. 2026-03-09T19:24:12.396 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph tell osd.0 flush_pg_stats 2026-03-09T19:24:12.396 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph tell osd.1 flush_pg_stats 2026-03-09T19:24:12.396 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph tell osd.2 flush_pg_stats 2026-03-09T19:24:12.396 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph tell osd.3 flush_pg_stats 2026-03-09T19:24:12.396 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph tell osd.4 flush_pg_stats 2026-03-09T19:24:12.397 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph tell osd.5 flush_pg_stats 2026-03-09T19:24:12.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:12 
vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/3315201317' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T19:24:12.837 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:12.842 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:12.896 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:12.902 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:12.920 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:12.957 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:13.356 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:13 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1373485388' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T19:24:13.356 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:13 vm07 ceph-mon[48545]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:13.452 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.453+0000 7fb3454b2640 1 -- 192.168.123.107:0/1693805482 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb340072cf0 msgr2=0x7fb34010cd90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.452 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.453+0000 7fb3454b2640 1 --2- 192.168.123.107:0/1693805482 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb340072cf0 0x7fb34010cd90 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7fb3200099b0 tx=0x7fb32002f240 comp rx=0 tx=0).stop 2026-03-09T19:24:13.452 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.453+0000 7fb3454b2640 1 -- 192.168.123.107:0/1693805482 shutdown_connections 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.453+0000 7fb3454b2640 1 --2- 192.168.123.107:0/1693805482 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb340072cf0 0x7fb34010cd90 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.453+0000 7fb3454b2640 1 --2- 192.168.123.107:0/1693805482 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb340072340 0x7fb340072720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.453+0000 7fb3454b2640 1 -- 192.168.123.107:0/1693805482 >> 192.168.123.107:0/1693805482 conn(0x7fb34006b7f0 msgr2=0x7fb34006bc00 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.454+0000 7fb3454b2640 1 -- 192.168.123.107:0/1693805482 shutdown_connections 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.454+0000 7fb3454b2640 1 -- 192.168.123.107:0/1693805482 wait complete. 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.454+0000 7fb3454b2640 1 Processor -- start 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.454+0000 7fb3454b2640 1 -- start start 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.455+0000 7fb3454b2640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb340072340 0x7fb340112d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.455+0000 7fb3454b2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb340072cf0 0x7fb340113280 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.455+0000 7fb3454b2640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb340113960 con 0x7fb340072cf0 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.455+0000 7fb3454b2640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3401b5d80 con 0x7fb340072340 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.455+0000 7fb33effd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb340072340 0x7fb340112d40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.453 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.455+0000 7fb337fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb340072cf0 0x7fb340113280 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.455+0000 7fb337fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb340072cf0 0x7fb340113280 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43144/0 (socket says 192.168.123.107:43144) 2026-03-09T19:24:13.453 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.455+0000 7fb337fff640 1 -- 192.168.123.107:0/2971793193 learned_addr learned my addr 192.168.123.107:0/2971793193 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:13.454 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.455+0000 7fb337fff640 1 -- 192.168.123.107:0/2971793193 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb340072340 msgr2=0x7fb340112d40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.454 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.455+0000 7fb337fff640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb340072340 0x7fb340112d40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.454 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.455+0000 7fb337fff640 1 -- 192.168.123.107:0/2971793193 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb320009660 con 0x7fb340072cf0 2026-03-09T19:24:13.454 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.456+0000 7fb337fff640 1 --2- 
192.168.123.107:0/2971793193 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb340072cf0 0x7fb340113280 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7fb32002f750 tx=0x7fb3200043d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.456 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.456+0000 7fb33cff9640 1 -- 192.168.123.107:0/2971793193 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb32003d070 con 0x7fb340072cf0 2026-03-09T19:24:13.456 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.456+0000 7fb33cff9640 1 -- 192.168.123.107:0/2971793193 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb32002fcb0 con 0x7fb340072cf0 2026-03-09T19:24:13.456 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.456+0000 7fb3454b2640 1 -- 192.168.123.107:0/2971793193 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb3401b6000 con 0x7fb340072cf0 2026-03-09T19:24:13.456 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.456+0000 7fb3454b2640 1 -- 192.168.123.107:0/2971793193 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb3401b6570 con 0x7fb340072cf0 2026-03-09T19:24:13.456 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.457+0000 7fb3454b2640 1 -- 192.168.123.107:0/2971793193 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7fb300000fc0 con 0x7fb340072cf0 2026-03-09T19:24:13.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.457+0000 7fb33cff9640 1 -- 192.168.123.107:0/2971793193 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3200388e0 con 0x7fb340072cf0 2026-03-09T19:24:13.458 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.458+0000 7fb33cff9640 1 -- 192.168.123.107:0/2971793193 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fb320038b50 con 0x7fb340072cf0 2026-03-09T19:24:13.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.459+0000 7fb33cff9640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb3140761c0 0x7fb314078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.459+0000 7fb33effd640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb3140761c0 0x7fb314078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.459+0000 7fb33cff9640 1 -- 192.168.123.107:0/2971793193 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fb3200bcbc0 con 0x7fb340072cf0 2026-03-09T19:24:13.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.459+0000 7fb33effd640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb3140761c0 0x7fb314078680 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fb328004640 tx=0x7fb328009210 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.459+0000 7fb33cff9640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186] conn(0x7fb31407bc70 0x7fb31407e090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0).connect 2026-03-09T19:24:13.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.460+0000 7fb33f7fe640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186] conn(0x7fb31407bc70 0x7fb31407e090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.460+0000 7fb33cff9640 1 -- 192.168.123.107:0/2971793193 --> [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7fb31407e740 con 0x7fb31407bc70 2026-03-09T19:24:13.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.460+0000 7fb33cff9640 1 -- 192.168.123.107:0/2971793193 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_get_version_reply(handle=1 version=35) v2 ==== 24+0+0 (secure 0 0 0) 0x7fb3200bcf70 con 0x7fb340072cf0 2026-03-09T19:24:13.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.460+0000 7fb33f7fe640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186] conn(0x7fb31407bc70 0x7fb31407e090 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.462 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.461+0000 7fb33cff9640 1 -- 192.168.123.107:0/2971793193 <== osd.0 v2:192.168.123.107:6802/2192525186 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7fb31407e740 con 0x7fb31407bc70 2026-03-09T19:24:13.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.470+0000 7fb3454b2640 1 -- 192.168.123.107:0/2971793193 --> [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7fb300002d20 con 
0x7fb31407bc70 2026-03-09T19:24:13.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.472+0000 7fb33cff9640 1 -- 192.168.123.107:0/2971793193 <== osd.0 v2:192.168.123.107:6802/2192525186 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7fb300002d20 con 0x7fb31407bc70 2026-03-09T19:24:13.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.472+0000 7fb335ffb640 1 -- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186] conn(0x7fb31407bc70 msgr2=0x7fb31407e090 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.472+0000 7fb335ffb640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186] conn(0x7fb31407bc70 0x7fb31407e090 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 -- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb3140761c0 msgr2=0x7fb314078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb3140761c0 0x7fb314078680 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fb328004640 tx=0x7fb328009210 comp rx=0 tx=0).stop 2026-03-09T19:24:13.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 -- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb340072cf0 msgr2=0x7fb340113280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.478 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb340072cf0 0x7fb340113280 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7fb32002f750 tx=0x7fb3200043d0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 -- 192.168.123.107:0/2971793193 shutdown_connections 2026-03-09T19:24:13.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6802/2192525186,v1:192.168.123.107:6803/2192525186] conn(0x7fb31407bc70 0x7fb31407e090 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb3140761c0 0x7fb314078680 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb340072cf0 0x7fb340113280 unknown :-1 s=CLOSED pgs=219 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 --2- 192.168.123.107:0/2971793193 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb340072340 0x7fb340112d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 -- 192.168.123.107:0/2971793193 >> 192.168.123.107:0/2971793193 conn(0x7fb34006b7f0 
msgr2=0x7fb34010e080 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:13.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 -- 192.168.123.107:0/2971793193 shutdown_connections 2026-03-09T19:24:13.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.475+0000 7fb335ffb640 1 -- 192.168.123.107:0/2971793193 wait complete. 2026-03-09T19:24:13.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:13 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/1373485388' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T19:24:13.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:13 vm08 ceph-mon[57794]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:13.690 INFO:teuthology.orchestra.run.vm07.stdout:38654705676 2026-03-09T19:24:13.690 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd last-stat-seq osd.0 2026-03-09T19:24:13.795 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.797+0000 7f4dec8bb640 1 -- 192.168.123.107:0/577976388 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4de81005b0 msgr2=0x7f4de8100a10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.796 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.797+0000 7f4dec8bb640 1 --2- 192.168.123.107:0/577976388 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4de81005b0 0x7f4de8100a10 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f4ddc0099b0 tx=0x7f4ddc02f240 comp rx=0 tx=0).stop 2026-03-09T19:24:13.811 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.811+0000 7f4dec8bb640 1 -- 192.168.123.107:0/577976388 shutdown_connections 2026-03-09T19:24:13.811 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.811+0000 
7f4dec8bb640 1 --2- 192.168.123.107:0/577976388 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4de81005b0 0x7f4de8100a10 unknown :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.811 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.811+0000 7f4dec8bb640 1 --2- 192.168.123.107:0/577976388 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4de8106660 0x7f4de8106a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.811 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.811+0000 7f4dec8bb640 1 -- 192.168.123.107:0/577976388 >> 192.168.123.107:0/577976388 conn(0x7f4de80fc290 msgr2=0x7f4de80fe6b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:13.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.819+0000 7f4dec8bb640 1 -- 192.168.123.107:0/577976388 shutdown_connections 2026-03-09T19:24:13.826 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.824+0000 7f4dec8bb640 1 -- 192.168.123.107:0/577976388 wait complete. 
2026-03-09T19:24:13.826 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.827+0000 7f4dec8bb640 1 Processor -- start 2026-03-09T19:24:13.829 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.831+0000 7f4dec8bb640 1 -- start start 2026-03-09T19:24:13.831 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.833+0000 7f4dec8bb640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4de81005b0 0x7f4de806d950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.831 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.833+0000 7f4dec8bb640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4de8106660 0x7f4de806de90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.831 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.833+0000 7f4dec8bb640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4de806e4d0 con 0x7f4de81005b0 2026-03-09T19:24:13.831 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.833+0000 7f4dec8bb640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4de806e640 con 0x7f4de8106660 2026-03-09T19:24:13.831 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.833+0000 7f4de6575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4de8106660 0x7f4de806de90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.831 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.833+0000 7f4de6575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4de8106660 0x7f4de806de90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.107:40470/0 (socket says 192.168.123.107:40470) 2026-03-09T19:24:13.831 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.833+0000 7f4de6575640 1 -- 192.168.123.107:0/3407249713 learned_addr learned my addr 192.168.123.107:0/3407249713 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:13.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.835+0000 7f4de6575640 1 -- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4de81005b0 msgr2=0x7f4de806d950 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.835+0000 7f4de6d76640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4de81005b0 0x7f4de806d950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.835+0000 7f4de6575640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4de81005b0 0x7f4de806d950 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.835+0000 7f4de6575640 1 -- 192.168.123.107:0/3407249713 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4ddc009660 con 0x7f4de8106660 2026-03-09T19:24:13.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.836+0000 7f4de6d76640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4de81005b0 0x7f4de806d950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:24:13.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.836+0000 7f4de6575640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4de8106660 0x7f4de806de90 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f4ddc005ec0 tx=0x7f4ddc004300 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.836+0000 7f6a5cf4d640 1 -- 192.168.123.107:0/3766812900 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a58072340 msgr2=0x7f6a58072720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.836+0000 7f6a5cf4d640 1 --2- 192.168.123.107:0/3766812900 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a58072340 0x7f6a58072720 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f6a400099b0 tx=0x7f6a4002f240 comp rx=0 tx=0).stop 2026-03-09T19:24:13.838 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.839+0000 7f4dc3fff640 1 -- 192.168.123.107:0/3407249713 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ddc03d070 con 0x7f4de8106660 2026-03-09T19:24:13.838 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.839+0000 7f4dec8bb640 1 -- 192.168.123.107:0/3407249713 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4de81b1530 con 0x7f4de8106660 2026-03-09T19:24:13.839 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.839+0000 7f4dec8bb640 1 -- 192.168.123.107:0/3407249713 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4de81b1a20 con 0x7f4de8106660 2026-03-09T19:24:13.839 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.839+0000 7f4dc3fff640 1 -- 192.168.123.107:0/3407249713 <== 
mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4ddc0043f0 con 0x7f4de8106660 2026-03-09T19:24:13.839 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.839+0000 7f4dc3fff640 1 -- 192.168.123.107:0/3407249713 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ddc0418b0 con 0x7f4de8106660 2026-03-09T19:24:13.839 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.839+0000 7f4dec8bb640 1 -- 192.168.123.107:0/3407249713 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f4dac000fc0 con 0x7f4de8106660 2026-03-09T19:24:13.841 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.843+0000 7f4dc3fff640 1 -- 192.168.123.107:0/3407249713 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f4ddc049050 con 0x7f4de8106660 2026-03-09T19:24:13.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.840+0000 7f6a5cf4d640 1 -- 192.168.123.107:0/3766812900 shutdown_connections 2026-03-09T19:24:13.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.840+0000 7f6a5cf4d640 1 --2- 192.168.123.107:0/3766812900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a58072cf0 0x7f6a5810cd90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.843+0000 7f4dc3fff640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4dbc076290 0x7f4dbc078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.840+0000 7f6a5cf4d640 1 --2- 192.168.123.107:0/3766812900 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a58072340 0x7f6a58072720 unknown :-1 
s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.840+0000 7f6a5cf4d640 1 -- 192.168.123.107:0/3766812900 >> 192.168.123.107:0/3766812900 conn(0x7f6a5806b7f0 msgr2=0x7f6a5806bc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:13.846 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.846+0000 7f4de6d76640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4dbc076290 0x7f4dbc078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.846 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.847+0000 7f4de6d76640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4dbc076290 0x7f4dbc078750 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f4dd4006fd0 tx=0x7f4dd4008040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.846 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.847+0000 7f4dc3fff640 1 -- 192.168.123.107:0/3407249713 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f4ddc051050 con 0x7f4de8106660 2026-03-09T19:24:13.846 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.847+0000 7f4dc3fff640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515] conn(0x7f4dbc07bd40 0x7f4dbc07e160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.846 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.847+0000 7f4dc3fff640 1 -- 192.168.123.107:0/3407249713 --> [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515] -- command(tid 1: 
{"prefix": "get_command_descriptions"}) v1 -- 0x7f4dbc07e810 con 0x7f4dbc07bd40 2026-03-09T19:24:13.846 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.848+0000 7f4de7577640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515] conn(0x7f4dbc07bd40 0x7f4dbc07e160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.846 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.847+0000 7f6a5cf4d640 1 -- 192.168.123.107:0/3766812900 shutdown_connections 2026-03-09T19:24:13.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.849+0000 7f4dc3fff640 1 -- 192.168.123.107:0/3407249713 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_get_version_reply(handle=1 version=35) v2 ==== 24+0+0 (secure 0 0 0) 0x7f4ddc0bca10 con 0x7f4de8106660 2026-03-09T19:24:13.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.851+0000 7f6a5cf4d640 1 -- 192.168.123.107:0/3766812900 wait complete. 
2026-03-09T19:24:13.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.851+0000 7f6a5cf4d640 1 Processor -- start 2026-03-09T19:24:13.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.852+0000 7f4de7577640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515] conn(0x7f4dbc07bd40 0x7f4dbc07e160 crc :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.852+0000 7f6a5cf4d640 1 -- start start 2026-03-09T19:24:13.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.852+0000 7f6a5cf4d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a58072340 0x7f6a581ad580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.852+0000 7f6a5cf4d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a58072cf0 0x7f6a581adac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.852+0000 7f6a5cf4d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a581a7670 con 0x7f6a58072cf0 2026-03-09T19:24:13.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.852+0000 7f6a5cf4d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a581a77e0 con 0x7f6a58072340 2026-03-09T19:24:13.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.852+0000 7f93a24ee640 1 -- 192.168.123.107:0/3368185786 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f939c072a40 msgr2=0x7f939c10ca90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T19:24:13.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.852+0000 7f93a24ee640 1 --2- 192.168.123.107:0/3368185786 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f939c072a40 0x7f939c10ca90 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f93900099b0 tx=0x7f939002f240 comp rx=0 tx=0).stop 2026-03-09T19:24:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.854+0000 7f93a24ee640 1 -- 192.168.123.107:0/3368185786 shutdown_connections 2026-03-09T19:24:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.854+0000 7f93a24ee640 1 --2- 192.168.123.107:0/3368185786 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f939c072a40 0x7f939c10ca90 unknown :-1 s=CLOSED pgs=222 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.854+0000 7f93a24ee640 1 --2- 192.168.123.107:0/3368185786 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f939c072120 0x7f939c072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.854+0000 7f93a24ee640 1 -- 192.168.123.107:0/3368185786 >> 192.168.123.107:0/3368185786 conn(0x7f939c06c7d0 msgr2=0x7f939c06cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.855+0000 7f6a55d74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a58072cf0 0x7f6a581adac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.855+0000 7f6a55d74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a58072cf0 0x7f6a581adac0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43184/0 (socket says 192.168.123.107:43184) 2026-03-09T19:24:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.855+0000 7f6a55d74640 1 -- 192.168.123.107:0/4097538182 learned_addr learned my addr 192.168.123.107:0/4097538182 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.855+0000 7f6a56575640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a58072340 0x7f6a581ad580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.855+0000 7f6a55d74640 1 -- 192.168.123.107:0/4097538182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a58072340 msgr2=0x7f6a581ad580 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.855+0000 7f6a55d74640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a58072340 0x7f6a581ad580 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.855+0000 7f6a55d74640 1 -- 192.168.123.107:0/4097538182 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6a40009660 con 0x7f6a58072cf0 2026-03-09T19:24:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.857+0000 7f6a55d74640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a58072cf0 0x7f6a581adac0 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7f6a4c00e970 tx=0x7f6a4c00ee40 
comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.861+0000 7f6a477fe640 1 -- 192.168.123.107:0/4097538182 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a4c00cd10 con 0x7f6a58072cf0 2026-03-09T19:24:13.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.861+0000 7f6a477fe640 1 -- 192.168.123.107:0/4097538182 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6a4c00ce70 con 0x7f6a58072cf0 2026-03-09T19:24:13.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.861+0000 7f6a477fe640 1 -- 192.168.123.107:0/4097538182 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a4c010640 con 0x7f6a58072cf0 2026-03-09T19:24:13.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.861+0000 7f6a5cf4d640 1 -- 192.168.123.107:0/4097538182 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6a581a7920 con 0x7f6a58072cf0 2026-03-09T19:24:13.862 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.862+0000 7f6a5cf4d640 1 -- 192.168.123.107:0/4097538182 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6a581a7e70 con 0x7f6a58072cf0 2026-03-09T19:24:13.864 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.857+0000 7f4dc3fff640 1 -- 192.168.123.107:0/3407249713 <== osd.1 v2:192.168.123.107:6810/407429515 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7f4dbc07e810 con 0x7f4dbc07bd40 2026-03-09T19:24:13.864 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.864+0000 7f47592ea640 1 -- 192.168.123.107:0/319010169 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4754072ad0 msgr2=0x7f475410b9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-09T19:24:13.864 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.864+0000 7f47592ea640 1 --2- 192.168.123.107:0/319010169 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4754072ad0 0x7f475410b9a0 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7f474c00b0a0 tx=0x7f474c02f4c0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.866+0000 7f93a24ee640 1 -- 192.168.123.107:0/3368185786 shutdown_connections 2026-03-09T19:24:13.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.866+0000 7f6a477fe640 1 -- 192.168.123.107:0/4097538182 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6a4c0107a0 con 0x7f6a58072cf0 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.869+0000 7f6a477fe640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6a24076170 0x7f6a24078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.870+0000 7f6a56575640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6a24076170 0x7f6a24078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.866+0000 7f93a24ee640 1 -- 192.168.123.107:0/3368185786 wait complete. 
2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.866+0000 7f93a24ee640 1 Processor -- start 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.869+0000 7f93a24ee640 1 -- start start 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.869+0000 7f93a24ee640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f939c072120 0x7f939c117d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.869+0000 7f93a24ee640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f939c072a40 0x7f939c1182d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.869+0000 7f93a24ee640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f939c112f50 con 0x7f939c072120 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.869+0000 7f93a24ee640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f939c1130c0 con 0x7f939c072a40 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.869+0000 7f93a14ec640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f939c072120 0x7f939c117d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.869+0000 7f93a14ec640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f939c072120 0x7f939c117d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:43208/0 (socket says 192.168.123.107:43208) 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.869+0000 7f93a14ec640 1 -- 192.168.123.107:0/1142569950 learned_addr learned my addr 192.168.123.107:0/1142569950 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.870+0000 7f93a14ec640 1 -- 192.168.123.107:0/1142569950 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f939c072a40 msgr2=0x7f939c1182d0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.870+0000 7f93a14ec640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f939c072a40 0x7f939c1182d0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.870+0000 7f93a14ec640 1 -- 192.168.123.107:0/1142569950 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9390009660 con 0x7f939c072120 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.870+0000 7f93a14ec640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f939c072120 0x7f939c117d90 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7f938c00ece0 tx=0x7f938c00c6a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.870+0000 7f938a7fc640 1 -- 192.168.123.107:0/1142569950 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f938c00eea0 con 0x7f939c072120 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.870+0000 7f93a24ee640 1 -- 
192.168.123.107:0/1142569950 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f939c113340 con 0x7f939c072120 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.870+0000 7f93a24ee640 1 -- 192.168.123.107:0/1142569950 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f939c113890 con 0x7f939c072120 2026-03-09T19:24:13.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.871+0000 7f938a7fc640 1 -- 192.168.123.107:0/1142569950 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f938c004590 con 0x7f939c072120 2026-03-09T19:24:13.871 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.871+0000 7f938a7fc640 1 -- 192.168.123.107:0/1142569950 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f938c010640 con 0x7f939c072120 2026-03-09T19:24:13.871 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.871+0000 7f93a24ee640 1 -- 192.168.123.107:0/1142569950 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f939c108570 con 0x7f939c072120 2026-03-09T19:24:13.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.876+0000 7f4dec8bb640 1 -- 192.168.123.107:0/3407249713 --> [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f4dac002db0 con 0x7f4dbc07bd40 2026-03-09T19:24:13.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.875+0000 7f6a56575640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6a24076170 0x7f6a24078630 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f6a40002410 tx=0x7f6a4003a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.874 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.875+0000 7f6a477fe640 1 -- 192.168.123.107:0/4097538182 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f6a4c014070 con 0x7f6a58072cf0 2026-03-09T19:24:13.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.876+0000 7f6a5cf4d640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426] conn(0x7f6a1c0015e0 0x7f6a1c003aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.876+0000 7f6a5cf4d640 1 -- 192.168.123.107:0/4097538182 --> [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f6a1c006c40 con 0x7f6a1c0015e0 2026-03-09T19:24:13.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.872+0000 7f938a7fc640 1 -- 192.168.123.107:0/1142569950 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f938c0107a0 con 0x7f939c072120 2026-03-09T19:24:13.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.872+0000 7f938a7fc640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9378076170 0x7f9378078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.873+0000 7f938a7fc640 1 -- 192.168.123.107:0/1142569950 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f938c014070 con 0x7f939c072120 2026-03-09T19:24:13.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.873+0000 7f938a7fc640 1 --2- 192.168.123.107:0/1142569950 >> 
[v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042] conn(0x7f937807bc20 0x7f937807e040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.873+0000 7f938a7fc640 1 -- 192.168.123.107:0/1142569950 --> [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f937807e6f0 con 0x7f937807bc20 2026-03-09T19:24:13.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.875+0000 7f93a0ceb640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9378076170 0x7f9378078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.876+0000 7f93a1ced640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042] conn(0x7f937807bc20 0x7f937807e040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.876+0000 7f4dc3fff640 1 -- 192.168.123.107:0/3407249713 <== osd.1 v2:192.168.123.107:6810/407429515 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f4dac002db0 con 0x7f4dbc07bd40 2026-03-09T19:24:13.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.876+0000 7f6a56d76640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426] conn(0x7f6a1c0015e0 0x7f6a1c003aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.883 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f6a56d76640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426] conn(0x7f6a1c0015e0 0x7f6a1c003aa0 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.2 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.883+0000 7f6a477fe640 1 -- 192.168.123.107:0/4097538182 <== osd.2 v2:192.168.123.107:6818/261501426 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7f6a1c006c40 con 0x7f6a1c0015e0 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 -- 192.168.123.107:0/319010169 shutdown_connections 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 --2- 192.168.123.107:0/319010169 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4754072ad0 0x7f475410b9a0 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 --2- 192.168.123.107:0/319010169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4754072120 0x7f4754072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 -- 192.168.123.107:0/319010169 >> 192.168.123.107:0/319010169 conn(0x7f475406c7d0 msgr2=0x7f475406cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 -- 192.168.123.107:0/319010169 shutdown_connections 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 -- 
192.168.123.107:0/319010169 wait complete. 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 Processor -- start 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 -- start start 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4754072120 0x7f475407d540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4754072ad0 0x7f475407da80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f475407dfc0 con 0x7f4754072120 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f47592ea640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f475407e130 con 0x7f4754072ad0 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f4753fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4754072120 0x7f475407d540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f4753fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4754072120 0x7f475407d540 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43216/0 (socket says 192.168.123.107:43216) 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.879+0000 7f4753fff640 1 -- 192.168.123.107:0/3715767553 learned_addr learned my addr 192.168.123.107:0/3715767553 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f47537fe640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4754072ad0 0x7f475407da80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f4753fff640 1 -- 192.168.123.107:0/3715767553 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4754072ad0 msgr2=0x7f475407da80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f4753fff640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4754072ad0 0x7f475407da80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f4753fff640 1 -- 192.168.123.107:0/3715767553 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f474c009d00 con 0x7f4754072120 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f4753fff640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4754072120 0x7f475407d540 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7f474400d8d0 tx=0x7f474400dda0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f47517fa640 1 -- 192.168.123.107:0/3715767553 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4744004490 con 0x7f4754072120 2026-03-09T19:24:13.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f938a7fc640 1 -- 192.168.123.107:0/1142569950 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_get_version_reply(handle=1 version=35) v2 ==== 24+0+0 (secure 0 0 0) 0x7f938c061f50 con 0x7f939c072120 2026-03-09T19:24:13.885 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.881+0000 7f47592ea640 1 -- 192.168.123.107:0/3715767553 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4754082070 con 0x7f4754072120 2026-03-09T19:24:13.885 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.881+0000 7f47592ea640 1 -- 192.168.123.107:0/3715767553 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f47540825c0 con 0x7f4754072120 2026-03-09T19:24:13.885 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.885+0000 7f47517fa640 1 -- 192.168.123.107:0/3715767553 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f474400bd00 con 0x7f4754072120 2026-03-09T19:24:13.885 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.885+0000 7f47517fa640 1 -- 192.168.123.107:0/3715767553 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4744010460 con 0x7f4754072120 2026-03-09T19:24:13.885 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.885+0000 7f47517fa640 1 -- 192.168.123.107:0/3715767553 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f47440105c0 con 0x7f4754072120 2026-03-09T19:24:13.885 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.886+0000 7f47517fa640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4734076360 0x7f4734078820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f4dc1ffb640 1 -- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515] conn(0x7f4dbc07bd40 msgr2=0x7f4dbc07e160 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f4dc1ffb640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515] conn(0x7f4dbc07bd40 0x7f4dbc07e160 crc :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f4dc1ffb640 1 -- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4dbc076290 msgr2=0x7f4dbc078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f4dc1ffb640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4dbc076290 0x7f4dbc078750 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f4dd4006fd0 tx=0x7f4dd4008040 comp rx=0 tx=0).stop 2026-03-09T19:24:13.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f4dc1ffb640 1 -- 192.168.123.107:0/3407249713 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4de8106660 msgr2=0x7f4de806de90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.888 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.880+0000 7f4dc1ffb640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4de8106660 0x7f4de806de90 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f4ddc005ec0 tx=0x7f4ddc004300 comp rx=0 tx=0).stop 2026-03-09T19:24:13.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.888+0000 7f4dc1ffb640 1 -- 192.168.123.107:0/3407249713 shutdown_connections 2026-03-09T19:24:13.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.888+0000 7f4dc1ffb640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4dbc076290 0x7f4dbc078750 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.888+0000 7f4dc1ffb640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4de8106660 0x7f4de806de90 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.888+0000 7f4dc1ffb640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:6810/407429515,v1:192.168.123.107:6811/407429515] conn(0x7f4dbc07bd40 0x7f4dbc07e160 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.888+0000 7f4dc1ffb640 1 --2- 192.168.123.107:0/3407249713 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4de81005b0 0x7f4de806d950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.888+0000 7f4dc1ffb640 1 -- 192.168.123.107:0/3407249713 >> 192.168.123.107:0/3407249713 conn(0x7f4de80fc290 
msgr2=0x7f4de8104160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:13.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.888+0000 7f4dc1ffb640 1 -- 192.168.123.107:0/3407249713 shutdown_connections 2026-03-09T19:24:13.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.888+0000 7f4dc1ffb640 1 -- 192.168.123.107:0/3407249713 wait complete. 2026-03-09T19:24:13.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.886+0000 7f47537fe640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4734076360 0x7f4734078820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.887+0000 7f47537fe640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4734076360 0x7f4734078820 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f474c00b070 tx=0x7f474c002750 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.887+0000 7f47517fa640 1 -- 192.168.123.107:0/3715767553 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f4744098750 con 0x7f4754072120 2026-03-09T19:24:13.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.887+0000 7f47592ea640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338] conn(0x7f471c0015e0 0x7f471c003aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.887+0000 7f47592ea640 1 -- 192.168.123.107:0/3715767553 --> 
[v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f471c006c40 con 0x7f471c0015e0 2026-03-09T19:24:13.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.888+0000 7f4758ae9640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338] conn(0x7f471c0015e0 0x7f471c003aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.888+0000 7f4758ae9640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338] conn(0x7f471c0015e0 0x7f471c003aa0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.3 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.889+0000 7fe1cd517640 1 -- 192.168.123.107:0/2456685433 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe1c00a49f0 msgr2=0x7fe1c00a4dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.889+0000 7fe1cd517640 1 --2- 192.168.123.107:0/2456685433 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe1c00a49f0 0x7fe1c00a4dd0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fe1bc0099b0 tx=0x7fe1bc02f2b0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.889+0000 7f47517fa640 1 -- 192.168.123.107:0/3715767553 <== osd.3 v2:192.168.123.108:6800/1988644338 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7f471c006c40 con 0x7f471c0015e0 2026-03-09T19:24:13.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.892+0000 7fe1cd517640 1 -- 
192.168.123.107:0/2456685433 shutdown_connections 2026-03-09T19:24:13.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.892+0000 7fe1cd517640 1 --2- 192.168.123.107:0/2456685433 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe1c00a5310 0x7fe1c00b75b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.892+0000 7fe1cd517640 1 --2- 192.168.123.107:0/2456685433 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe1c00a49f0 0x7fe1c00a4dd0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.892+0000 7fe1cd517640 1 -- 192.168.123.107:0/2456685433 >> 192.168.123.107:0/2456685433 conn(0x7fe1c001a3c0 msgr2=0x7fe1c001a7d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:13.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.893+0000 7f93a1ced640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042] conn(0x7f937807bc20 0x7f937807e040 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.5 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.894+0000 7fe1cd517640 1 -- 192.168.123.107:0/2456685433 shutdown_connections 2026-03-09T19:24:13.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.893+0000 7f93a0ceb640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9378076170 0x7f9378078630 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f939c1145e0 tx=0x7f93900023d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.899 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.900+0000 7fe1cd517640 1 -- 192.168.123.107:0/2456685433 wait complete. 2026-03-09T19:24:13.899 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.900+0000 7fe1cd517640 1 Processor -- start 2026-03-09T19:24:13.899 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.896+0000 7f938a7fc640 1 -- 192.168.123.107:0/1142569950 <== osd.5 v2:192.168.123.108:6816/99511042 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7f937807e6f0 con 0x7f937807bc20 2026-03-09T19:24:13.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.901+0000 7fe1cd517640 1 -- start start 2026-03-09T19:24:13.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.901+0000 7fe1cd517640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe1c00a5310 0x7fe1c0144d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.901+0000 7fe1cd517640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe1c01452c0 0x7fe1c01496c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.901+0000 7fe1cd517640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe1c01458c0 con 0x7fe1c01452c0 2026-03-09T19:24:13.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.901+0000 7fe1cd517640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe1c0145a30 con 0x7fe1c00a5310 2026-03-09T19:24:13.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.903+0000 7fe1c67fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe1c01452c0 0x7fe1c01496c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.903+0000 7fe1c67fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe1c01452c0 0x7fe1c01496c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43218/0 (socket says 192.168.123.107:43218) 2026-03-09T19:24:13.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.903+0000 7fe1c67fc640 1 -- 192.168.123.107:0/2874664949 learned_addr learned my addr 192.168.123.107:0/2874664949 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:13.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.903+0000 7fe1c6ffd640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe1c00a5310 0x7fe1c0144d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.904+0000 7fe1c67fc640 1 -- 192.168.123.107:0/2874664949 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe1c00a5310 msgr2=0x7fe1c0144d80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.904+0000 7fe1c67fc640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe1c00a5310 0x7fe1c0144d80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.904+0000 7fe1c67fc640 1 -- 192.168.123.107:0/2874664949 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe1b0009590 con 0x7fe1c01452c0 
2026-03-09T19:24:13.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.904+0000 7fe1c67fc640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe1c01452c0 0x7fe1c01496c0 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7fe1b0002760 tx=0x7fe1b0002c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.904+0000 7fe1a7fff640 1 -- 192.168.123.107:0/2874664949 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe1b000ecf0 con 0x7fe1c01452c0 2026-03-09T19:24:13.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.906+0000 7fe1cd517640 1 -- 192.168.123.107:0/2874664949 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe1bc009660 con 0x7fe1c01452c0 2026-03-09T19:24:13.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.906+0000 7fe1cd517640 1 -- 192.168.123.107:0/2874664949 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe1c014a080 con 0x7fe1c01452c0 2026-03-09T19:24:13.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.907+0000 7fe1a7fff640 1 -- 192.168.123.107:0/2874664949 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe1b0002e90 con 0x7fe1c01452c0 2026-03-09T19:24:13.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.907+0000 7fe1a7fff640 1 -- 192.168.123.107:0/2874664949 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe1b00186a0 con 0x7fe1c01452c0 2026-03-09T19:24:13.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.907+0000 7fe1cd517640 1 -- 192.168.123.107:0/2874664949 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 
0x7fe18c000fc0 con 0x7fe1c01452c0 2026-03-09T19:24:13.910 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.911+0000 7f47592ea640 1 -- 192.168.123.107:0/3715767553 --> [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f471c005d20 con 0x7f471c0015e0 2026-03-09T19:24:13.911 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.908+0000 7fe1a7fff640 1 -- 192.168.123.107:0/2874664949 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fe1b0016020 con 0x7fe1c01452c0 2026-03-09T19:24:13.911 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.909+0000 7fe1a7fff640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe1ac076260 0x7fe1ac078720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.911 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.909+0000 7fe1c6ffd640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe1ac076260 0x7fe1ac078720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.911 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.909+0000 7fe1a7fff640 1 -- 192.168.123.107:0/2874664949 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fe1b0098bf0 con 0x7fe1c01452c0 2026-03-09T19:24:13.911 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.909+0000 7fe1a7fff640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370] conn(0x7fe1ac07bd10 0x7fe1ac07e130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:13.911 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.909+0000 7fe1a7fff640 1 -- 192.168.123.107:0/2874664949 --> [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7fe1ac07e7e0 con 0x7fe1ac07bd10 2026-03-09T19:24:13.911 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.909+0000 7fe1a7fff640 1 -- 192.168.123.107:0/2874664949 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_get_version_reply(handle=1 version=35) v2 ==== 24+0+0 (secure 0 0 0) 0x7fe1b009d2a0 con 0x7fe1c01452c0 2026-03-09T19:24:13.916 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.916+0000 7f47517fa640 1 -- 192.168.123.107:0/3715767553 <== osd.3 v2:192.168.123.108:6800/1988644338 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f471c005d20 con 0x7f471c0015e0 2026-03-09T19:24:13.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.912+0000 7fe1c77fe640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370] conn(0x7fe1ac07bd10 0x7fe1ac07e130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:13.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.925+0000 7f473affd640 1 -- 192.168.123.107:0/3715767553 >> [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338] conn(0x7f471c0015e0 msgr2=0x7f471c003aa0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.925+0000 7f473affd640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338] conn(0x7f471c0015e0 0x7f471c003aa0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.925+0000 7f473affd640 1 
-- 192.168.123.107:0/3715767553 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4734076360 msgr2=0x7f4734078820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.925+0000 7f473affd640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4734076360 0x7f4734078820 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f474c00b070 tx=0x7f474c002750 comp rx=0 tx=0).stop 2026-03-09T19:24:13.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.925+0000 7f473affd640 1 -- 192.168.123.107:0/3715767553 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4754072120 msgr2=0x7f475407d540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.925+0000 7f473affd640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4754072120 0x7f475407d540 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7f474400d8d0 tx=0x7f474400dda0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.921+0000 7fe1c6ffd640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe1ac076260 0x7fe1ac078720 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fe1bc009630 tx=0x7fe1bc0095c0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.925+0000 7fe1c77fe640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370] conn(0x7fe1ac07bd10 0x7fe1ac07e130 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.4 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:13.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.926+0000 7f93a24ee640 1 -- 192.168.123.107:0/1142569950 --> [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f939c072500 con 0x7f937807bc20 2026-03-09T19:24:13.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.927+0000 7f473affd640 1 -- 192.168.123.107:0/3715767553 shutdown_connections 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.927+0000 7f473affd640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.108:6800/1988644338,v1:192.168.123.108:6801/1988644338] conn(0x7f471c0015e0 0x7f471c003aa0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.927+0000 7f473affd640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4734076360 0x7f4734078820 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.927+0000 7f473affd640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4754072ad0 0x7f475407da80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.927+0000 7f473affd640 1 --2- 192.168.123.107:0/3715767553 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4754072120 0x7f475407d540 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.927+0000 7f473affd640 1 -- 192.168.123.107:0/3715767553 >> 192.168.123.107:0/3715767553 
conn(0x7f475406c7d0 msgr2=0x7f475406fd30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.927+0000 7f473affd640 1 -- 192.168.123.107:0/3715767553 shutdown_connections 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.928+0000 7f473affd640 1 -- 192.168.123.107:0/3715767553 wait complete. 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.927+0000 7fe1a7fff640 1 -- 192.168.123.107:0/2874664949 <== osd.4 v2:192.168.123.108:6808/68783370 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+26930 (crc 0 0 0) 0x7fe1ac07e7e0 con 0x7fe1ac07bd10 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.927+0000 7f938a7fc640 1 -- 192.168.123.107:0/1142569950 <== osd.5 v2:192.168.123.108:6816/99511042 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f939c072500 con 0x7f937807bc20 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.927+0000 7f936bfff640 1 -- 192.168.123.107:0/1142569950 >> [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042] conn(0x7f937807bc20 msgr2=0x7f937807e040 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.927+0000 7f936bfff640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042] conn(0x7f937807bc20 0x7f937807e040 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.928+0000 7f936bfff640 1 -- 192.168.123.107:0/1142569950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9378076170 msgr2=0x7f9378078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.927 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.928+0000 7f936bfff640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9378076170 0x7f9378078630 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f939c1145e0 tx=0x7f93900023d0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.928+0000 7f936bfff640 1 -- 192.168.123.107:0/1142569950 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f939c072120 msgr2=0x7f939c117d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.928+0000 7f936bfff640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f939c072120 0x7f939c117d90 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7f938c00ece0 tx=0x7f938c00c6a0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.935+0000 7f6a5cf4d640 1 -- 192.168.123.107:0/4097538182 --> [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f6a1c005d20 con 0x7f6a1c0015e0 2026-03-09T19:24:13.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.933+0000 7f936bfff640 1 -- 192.168.123.107:0/1142569950 shutdown_connections 2026-03-09T19:24:13.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.933+0000 7f936bfff640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.108:6816/99511042,v1:192.168.123.108:6817/99511042] conn(0x7f937807bc20 0x7f937807e040 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.933+0000 7f936bfff640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] 
conn(0x7f9378076170 0x7f9378078630 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.933+0000 7f936bfff640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f939c072a40 0x7f939c1182d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.933+0000 7f936bfff640 1 --2- 192.168.123.107:0/1142569950 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f939c072120 0x7f939c117d90 unknown :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.933+0000 7f936bfff640 1 -- 192.168.123.107:0/1142569950 >> 192.168.123.107:0/1142569950 conn(0x7f939c06c7d0 msgr2=0x7f939c10dde0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:13.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.933+0000 7f936bfff640 1 -- 192.168.123.107:0/1142569950 shutdown_connections 2026-03-09T19:24:13.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.933+0000 7f936bfff640 1 -- 192.168.123.107:0/1142569950 wait complete. 
2026-03-09T19:24:13.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.936+0000 7f6a477fe640 1 -- 192.168.123.107:0/4097538182 <== osd.2 v2:192.168.123.107:6818/261501426 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f6a1c005d20 con 0x7f6a1c0015e0 2026-03-09T19:24:13.935 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.937+0000 7f6a457fa640 1 -- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426] conn(0x7f6a1c0015e0 msgr2=0x7f6a1c003aa0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.935 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.937+0000 7f6a457fa640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426] conn(0x7f6a1c0015e0 0x7f6a1c003aa0 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.937 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.940+0000 7f6a457fa640 1 -- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6a24076170 msgr2=0x7f6a24078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.940+0000 7f6a457fa640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6a24076170 0x7f6a24078630 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f6a40002410 tx=0x7f6a4003a040 comp rx=0 tx=0).stop 2026-03-09T19:24:13.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.940+0000 7f6a457fa640 1 -- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a58072cf0 msgr2=0x7f6a581adac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.940+0000 7f6a457fa640 1 
--2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a58072cf0 0x7f6a581adac0 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7f6a4c00e970 tx=0x7f6a4c00ee40 comp rx=0 tx=0).stop 2026-03-09T19:24:13.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.940+0000 7f6a457fa640 1 -- 192.168.123.107:0/4097538182 shutdown_connections 2026-03-09T19:24:13.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.940+0000 7f6a457fa640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6818/261501426,v1:192.168.123.107:6819/261501426] conn(0x7f6a1c0015e0 0x7f6a1c003aa0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.940+0000 7f6a457fa640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6a24076170 0x7f6a24078630 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.941+0000 7f6a457fa640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a58072cf0 0x7f6a581adac0 unknown :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.941+0000 7f6a457fa640 1 --2- 192.168.123.107:0/4097538182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a58072340 0x7f6a581ad580 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.941+0000 7f6a457fa640 1 -- 192.168.123.107:0/4097538182 >> 192.168.123.107:0/4097538182 conn(0x7f6a5806b7f0 msgr2=0x7f6a5810df80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:13.940 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.943+0000 7f6a457fa640 1 -- 192.168.123.107:0/4097538182 shutdown_connections 2026-03-09T19:24:13.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.943+0000 7f6a457fa640 1 -- 192.168.123.107:0/4097538182 wait complete. 2026-03-09T19:24:13.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.972+0000 7fe1cd517640 1 -- 192.168.123.107:0/2874664949 --> [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7fe18c002db0 con 0x7fe1ac07bd10 2026-03-09T19:24:13.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.975+0000 7fe1a7fff640 1 -- 192.168.123.107:0/2874664949 <== osd.4 v2:192.168.123.108:6808/68783370 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7fe18c002db0 con 0x7fe1ac07bd10 2026-03-09T19:24:13.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.975+0000 7fe1cd517640 1 -- 192.168.123.107:0/2874664949 >> [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370] conn(0x7fe1ac07bd10 msgr2=0x7fe1ac07e130 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.975+0000 7fe1cd517640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370] conn(0x7fe1ac07bd10 0x7fe1ac07e130 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.975+0000 7fe1cd517640 1 -- 192.168.123.107:0/2874664949 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe1ac076260 msgr2=0x7fe1ac078720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.975+0000 7fe1cd517640 1 --2- 192.168.123.107:0/2874664949 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe1ac076260 0x7fe1ac078720 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fe1bc009630 tx=0x7fe1bc0095c0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.975+0000 7fe1cd517640 1 -- 192.168.123.107:0/2874664949 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe1c01452c0 msgr2=0x7fe1c01496c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:13.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.975+0000 7fe1cd517640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe1c01452c0 0x7fe1c01496c0 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7fe1b0002760 tx=0x7fe1b0002c30 comp rx=0 tx=0).stop 2026-03-09T19:24:13.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.976+0000 7fe1cd517640 1 -- 192.168.123.107:0/2874664949 shutdown_connections 2026-03-09T19:24:13.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.976+0000 7fe1cd517640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.108:6808/68783370,v1:192.168.123.108:6809/68783370] conn(0x7fe1ac07bd10 0x7fe1ac07e130 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.976+0000 7fe1cd517640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe1ac076260 0x7fe1ac078720 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.976+0000 7fe1cd517640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe1c01452c0 0x7fe1c01496c0 unknown :-1 s=CLOSED pgs=227 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T19:24:13.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.976+0000 7fe1cd517640 1 --2- 192.168.123.107:0/2874664949 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe1c00a5310 0x7fe1c0144d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:13.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.976+0000 7fe1cd517640 1 -- 192.168.123.107:0/2874664949 >> 192.168.123.107:0/2874664949 conn(0x7fe1c001a3c0 msgr2=0x7fe1c00b5dc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:13.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.976+0000 7fe1cd517640 1 -- 192.168.123.107:0/2874664949 shutdown_connections 2026-03-09T19:24:13.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:13.976+0000 7fe1cd517640 1 -- 192.168.123.107:0/2874664949 wait complete. 2026-03-09T19:24:14.041 INFO:teuthology.orchestra.run.vm07.stdout:107374182406 2026-03-09T19:24:14.041 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd last-stat-seq osd.3 2026-03-09T19:24:14.058 INFO:teuthology.orchestra.run.vm07.stdout:146028888067 2026-03-09T19:24:14.058 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd last-stat-seq osd.5 2026-03-09T19:24:14.080 INFO:teuthology.orchestra.run.vm07.stdout:55834574858 2026-03-09T19:24:14.080 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd last-stat-seq osd.1 2026-03-09T19:24:14.131 INFO:teuthology.orchestra.run.vm07.stdout:128849018885 2026-03-09T19:24:14.131 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image 
quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd last-stat-seq osd.4 2026-03-09T19:24:14.133 INFO:teuthology.orchestra.run.vm07.stdout:73014444040 2026-03-09T19:24:14.133 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd last-stat-seq osd.2 2026-03-09T19:24:14.139 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:14.556 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.636+0000 7fa0c7fff640 1 -- 192.168.123.107:0/2822012117 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c8072ad0 msgr2=0x7fa0c810b9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.636+0000 7fa0c7fff640 1 --2- 192.168.123.107:0/2822012117 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c8072ad0 0x7fa0c810b9a0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7fa0c000a090 tx=0x7fa0c002f440 comp rx=0 tx=0).stop 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.636+0000 7fa0c7fff640 1 -- 192.168.123.107:0/2822012117 shutdown_connections 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.636+0000 7fa0c7fff640 1 --2- 192.168.123.107:0/2822012117 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c8072ad0 0x7fa0c810b9a0 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.636+0000 7fa0c7fff640 1 --2- 192.168.123.107:0/2822012117 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa0c8072120 0x7fa0c8072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.636+0000 7fa0c7fff640 1 -- 192.168.123.107:0/2822012117 >> 192.168.123.107:0/2822012117 conn(0x7fa0c806c7d0 msgr2=0x7fa0c806cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.636+0000 7fa0c7fff640 1 -- 192.168.123.107:0/2822012117 shutdown_connections 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.636+0000 7fa0c7fff640 1 -- 192.168.123.107:0/2822012117 wait complete. 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.637+0000 7fa0c7fff640 1 Processor -- start 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.637+0000 7fa0c7fff640 1 -- start start 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.637+0000 7fa0c7fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c8072120 0x7fa0c807d480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.637+0000 7fa0c7fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa0c807d9c0 0x7fa0c807de20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.637+0000 7fa0c7fff640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa0c8084480 con 0x7fa0c807d9c0 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.637+0000 7fa0c7fff640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7fa0c80845f0 con 0x7fa0c8072120 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.637+0000 7fa0c67fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa0c807d9c0 0x7fa0c807de20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.638+0000 7fa0c6ffd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c8072120 0x7fa0c807d480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.638+0000 7fa0c6ffd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c8072120 0x7fa0c807d480 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:40536/0 (socket says 192.168.123.107:40536) 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.638+0000 7fa0c6ffd640 1 -- 192.168.123.107:0/747366375 learned_addr learned my addr 192.168.123.107:0/747366375 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.638+0000 7fa0c6ffd640 1 -- 192.168.123.107:0/747366375 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa0c807d9c0 msgr2=0x7fa0c807de20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:14.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.638+0000 7fa0c6ffd640 1 --2- 192.168.123.107:0/747366375 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa0c807d9c0 0x7fa0c807de20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T19:24:14.637 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.638+0000 7fa0c6ffd640 1 -- 192.168.123.107:0/747366375 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa0c0009d00 con 0x7fa0c8072120 2026-03-09T19:24:14.637 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.638+0000 7fa0c6ffd640 1 --2- 192.168.123.107:0/747366375 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c8072120 0x7fa0c807d480 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fa0b8009a90 tx=0x7fa0b8009f60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:14.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.641+0000 7fa0a7fff640 1 -- 192.168.123.107:0/747366375 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa0b8012070 con 0x7fa0c8072120 2026-03-09T19:24:14.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.641+0000 7fa0c7fff640 1 -- 192.168.123.107:0/747366375 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa0c8082050 con 0x7fa0c8072120 2026-03-09T19:24:14.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.641+0000 7fa0c7fff640 1 -- 192.168.123.107:0/747366375 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa0c80825a0 con 0x7fa0c8072120 2026-03-09T19:24:14.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.642+0000 7fa0a7fff640 1 -- 192.168.123.107:0/747366375 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa0b8004590 con 0x7fa0c8072120 2026-03-09T19:24:14.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.642+0000 7fa0a7fff640 1 -- 192.168.123.107:0/747366375 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7fa0b8016400 con 0x7fa0c8072120 2026-03-09T19:24:14.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.643+0000 7fa0a7fff640 1 -- 192.168.123.107:0/747366375 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fa0b80165e0 con 0x7fa0c8072120 2026-03-09T19:24:14.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.643+0000 7fa0a7fff640 1 --2- 192.168.123.107:0/747366375 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa098076290 0x7fa098078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:14.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.644+0000 7fa0c67fc640 1 --2- 192.168.123.107:0/747366375 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa098076290 0x7fa098078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:14.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.644+0000 7fa0c67fc640 1 --2- 192.168.123.107:0/747366375 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa098076290 0x7fa098078750 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fa0c0009cd0 tx=0x7fa0c003a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:14.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.644+0000 7fa0a7fff640 1 -- 192.168.123.107:0/747366375 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fa0b80983b0 con 0x7fa0c8072120 2026-03-09T19:24:14.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.644+0000 7fa0c7fff640 1 -- 192.168.123.107:0/747366375 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7fa0c8108570 con 0x7fa0c8072120 2026-03-09T19:24:14.651 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.647+0000 7fa0a7fff640 1 -- 192.168.123.107:0/747366375 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa0b8061d00 con 0x7fa0c8072120 2026-03-09T19:24:14.757 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:14.809 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:14.809 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.811+0000 7fa0c7fff640 1 -- 192.168.123.107:0/747366375 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7fa0c807e9b0 con 0x7fa0c8072120 2026-03-09T19:24:14.812 INFO:teuthology.orchestra.run.vm07.stdout:38654705676 2026-03-09T19:24:14.812 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.812+0000 7fa0a7fff640 1 -- 192.168.123.107:0/747366375 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fa0b80616a0 con 0x7fa0c8072120 2026-03-09T19:24:14.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.816+0000 7fa0a5ffb640 1 -- 192.168.123.107:0/747366375 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa098076290 msgr2=0x7fa098078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:14.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.816+0000 7fa0a5ffb640 1 --2- 192.168.123.107:0/747366375 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa098076290 0x7fa098078750 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fa0c0009cd0 tx=0x7fa0c003a040 comp 
rx=0 tx=0).stop 2026-03-09T19:24:14.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.816+0000 7fa0a5ffb640 1 -- 192.168.123.107:0/747366375 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c8072120 msgr2=0x7fa0c807d480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:14.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.816+0000 7fa0a5ffb640 1 --2- 192.168.123.107:0/747366375 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c8072120 0x7fa0c807d480 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fa0b8009a90 tx=0x7fa0b8009f60 comp rx=0 tx=0).stop 2026-03-09T19:24:14.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.817+0000 7fa0a5ffb640 1 -- 192.168.123.107:0/747366375 shutdown_connections 2026-03-09T19:24:14.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.817+0000 7fa0a5ffb640 1 --2- 192.168.123.107:0/747366375 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa098076290 0x7fa098078750 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:14.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.817+0000 7fa0a5ffb640 1 --2- 192.168.123.107:0/747366375 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa0c807d9c0 0x7fa0c807de20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:14.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.817+0000 7fa0a5ffb640 1 --2- 192.168.123.107:0/747366375 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c8072120 0x7fa0c807d480 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:14.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.817+0000 7fa0a5ffb640 1 -- 192.168.123.107:0/747366375 >> 192.168.123.107:0/747366375 conn(0x7fa0c806c7d0 msgr2=0x7fa0c807b5b0 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:14.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.817+0000 7fa0a5ffb640 1 -- 192.168.123.107:0/747366375 shutdown_connections 2026-03-09T19:24:14.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:14.817+0000 7fa0a5ffb640 1 -- 192.168.123.107:0/747366375 wait complete. 2026-03-09T19:24:14.827 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:14.850 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:14.902 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705676 got 38654705676 for osd.0 2026-03-09T19:24:14.902 DEBUG:teuthology.parallel:result is None 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.222+0000 7faf9a087640 1 -- 192.168.123.107:0/3847439672 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf94072c20 msgr2=0x7faf9410bad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.222+0000 7faf9a087640 1 --2- 192.168.123.107:0/3847439672 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf94072c20 0x7faf9410bad0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7faf84007920 tx=0x7faf8402ffe0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.222+0000 7faf9a087640 1 -- 192.168.123.107:0/3847439672 shutdown_connections 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.222+0000 7faf9a087640 1 --2- 192.168.123.107:0/3847439672 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf94072c20 0x7faf9410bad0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.225 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.222+0000 7faf9a087640 1 --2- 192.168.123.107:0/3847439672 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faf94072270 0x7faf94072650 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.222+0000 7faf9a087640 1 -- 192.168.123.107:0/3847439672 >> 192.168.123.107:0/3847439672 conn(0x7faf9406b780 msgr2=0x7faf9406bb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.222+0000 7faf9a087640 1 -- 192.168.123.107:0/3847439672 shutdown_connections 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.222+0000 7faf9a087640 1 -- 192.168.123.107:0/3847439672 wait complete. 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.222+0000 7faf9a087640 1 Processor -- start 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.222+0000 7faf9a087640 1 -- start start 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf9a087640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faf94072270 0x7faf94133290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf9a087640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf941337d0 0x7faf9407e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf9a087640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf94133d80 con 0x7faf94072270 2026-03-09T19:24:15.225 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf9a087640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf94133ef0 con 0x7faf941337d0 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf92ffd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf941337d0 0x7faf9407e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf92ffd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf941337d0 0x7faf9407e950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:40556/0 (socket says 192.168.123.107:40556) 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf92ffd640 1 -- 192.168.123.107:0/3598869967 learned_addr learned my addr 192.168.123.107:0/3598869967 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf92ffd640 1 -- 192.168.123.107:0/3598869967 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faf94072270 msgr2=0x7faf94133290 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf92ffd640 1 --2- 192.168.123.107:0/3598869967 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faf94072270 0x7faf94133290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf92ffd640 1 -- 192.168.123.107:0/3598869967 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faf840075d0 con 0x7faf941337d0 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf92ffd640 1 --2- 192.168.123.107:0/3598869967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf941337d0 0x7faf9407e950 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7faf840304f0 tx=0x7faf84030a70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf90ff9640 1 -- 192.168.123.107:0/3598869967 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faf84030e70 con 0x7faf941337d0 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf9a087640 1 -- 192.168.123.107:0/3598869967 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faf9407ee90 con 0x7faf941337d0 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.223+0000 7faf9a087640 1 -- 192.168.123.107:0/3598869967 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faf9407f3e0 con 0x7faf941337d0 2026-03-09T19:24:15.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.226+0000 7faf90ff9640 1 -- 192.168.123.107:0/3598869967 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7faf84039b50 con 0x7faf941337d0 2026-03-09T19:24:15.226 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.226+0000 7faf90ff9640 1 -- 192.168.123.107:0/3598869967 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faf84038ad0 con 0x7faf941337d0 2026-03-09T19:24:15.226 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.226+0000 7faf727fc640 1 -- 
192.168.123.107:0/3598869967 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faf54005350 con 0x7faf941337d0 2026-03-09T19:24:15.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.229+0000 7faf90ff9640 1 -- 192.168.123.107:0/3598869967 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7faf84038c30 con 0x7faf941337d0 2026-03-09T19:24:15.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.230+0000 7faf90ff9640 1 --2- 192.168.123.107:0/3598869967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7faf6c076290 0x7faf6c078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.230+0000 7faf90ff9640 1 -- 192.168.123.107:0/3598869967 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7faf84048030 con 0x7faf941337d0 2026-03-09T19:24:15.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.230+0000 7faf90ff9640 1 -- 192.168.123.107:0/3598869967 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7faf840bc3d0 con 0x7faf941337d0 2026-03-09T19:24:15.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.235+0000 7faf937fe640 1 --2- 192.168.123.107:0/3598869967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7faf6c076290 0x7faf6c078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.239+0000 7faf937fe640 1 --2- 192.168.123.107:0/3598869967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] 
conn(0x7faf6c076290 0x7faf6c078750 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7faf8c00aab0 tx=0x7faf8c009250 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:15.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.458+0000 7f991dd37640 1 -- 192.168.123.107:0/2655093429 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9918072cf0 msgr2=0x7f991810cd90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.458+0000 7f991dd37640 1 --2- 192.168.123.107:0/2655093429 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9918072cf0 0x7f991810cd90 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f990800b0a0 tx=0x7f990802f4a0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.459+0000 7faf727fc640 1 -- 192.168.123.107:0/3598869967 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7faf540051c0 con 0x7faf941337d0 2026-03-09T19:24:15.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.460+0000 7f991dd37640 1 -- 192.168.123.107:0/2655093429 shutdown_connections 2026-03-09T19:24:15.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.460+0000 7f991dd37640 1 --2- 192.168.123.107:0/2655093429 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9918072cf0 0x7f991810cd90 unknown :-1 s=CLOSED pgs=228 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.460+0000 7f991dd37640 1 --2- 192.168.123.107:0/2655093429 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9918072340 0x7f9918072720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.458 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.460+0000 7f991dd37640 1 -- 192.168.123.107:0/2655093429 >> 192.168.123.107:0/2655093429 conn(0x7f991806b7f0 msgr2=0x7f991806bc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:15.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.460+0000 7faf90ff9640 1 -- 192.168.123.107:0/3598869967 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7faf84085b00 con 0x7faf941337d0 2026-03-09T19:24:15.462 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.464+0000 7f991dd37640 1 -- 192.168.123.107:0/2655093429 shutdown_connections 2026-03-09T19:24:15.470 INFO:teuthology.orchestra.run.vm07.stdout:107374182406 2026-03-09T19:24:15.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.472+0000 7faf727fc640 1 -- 192.168.123.107:0/3598869967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7faf6c076290 msgr2=0x7faf6c078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.472+0000 7faf727fc640 1 --2- 192.168.123.107:0/3598869967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7faf6c076290 0x7faf6c078750 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7faf8c00aab0 tx=0x7faf8c009250 comp rx=0 tx=0).stop 2026-03-09T19:24:15.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.472+0000 7faf727fc640 1 -- 192.168.123.107:0/3598869967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf941337d0 msgr2=0x7faf9407e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.472+0000 7faf727fc640 1 --2- 192.168.123.107:0/3598869967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf941337d0 0x7faf9407e950 
secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7faf840304f0 tx=0x7faf84030a70 comp rx=0 tx=0).stop 2026-03-09T19:24:15.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.471+0000 7f991dd37640 1 -- 192.168.123.107:0/2655093429 wait complete. 2026-03-09T19:24:15.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f991dd37640 1 Processor -- start 2026-03-09T19:24:15.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f991dd37640 1 -- start start 2026-03-09T19:24:15.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f991dd37640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9918072340 0x7f9918114dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f991dd37640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9918072cf0 0x7f9918115300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f991dd37640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99181159e0 con 0x7f9918072cf0 2026-03-09T19:24:15.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f991dd37640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99181b5d80 con 0x7f9918072340 2026-03-09T19:24:15.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f9916ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9918072cf0 0x7f9918115300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.473 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f9916ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9918072cf0 0x7f9918115300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43278/0 (socket says 192.168.123.107:43278) 2026-03-09T19:24:15.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f9916ffd640 1 -- 192.168.123.107:0/3458832968 learned_addr learned my addr 192.168.123.107:0/3458832968 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:15.474 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f99177fe640 1 --2- 192.168.123.107:0/3458832968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9918072340 0x7f9918114dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.474 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f9916ffd640 1 -- 192.168.123.107:0/3458832968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9918072340 msgr2=0x7f9918114dc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.474 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.473+0000 7f9916ffd640 1 --2- 192.168.123.107:0/3458832968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9918072340 0x7f9918114dc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.474 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.474+0000 7f9916ffd640 1 -- 192.168.123.107:0/3458832968 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9908009d00 con 0x7f9918072cf0 2026-03-09T19:24:15.474 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:15 vm07 
ceph-mon[48545]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:15.474 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:15 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/747366375' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-09T19:24:15.474 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.476+0000 7f9916ffd640 1 --2- 192.168.123.107:0/3458832968 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9918072cf0 0x7f9918115300 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f99080042c0 tx=0x7f99080042f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.477+0000 7faf727fc640 1 -- 192.168.123.107:0/3598869967 shutdown_connections 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.477+0000 7faf727fc640 1 --2- 192.168.123.107:0/3598869967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7faf6c076290 0x7faf6c078750 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.477+0000 7faf727fc640 1 --2- 192.168.123.107:0/3598869967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf941337d0 0x7faf9407e950 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.477+0000 7faf727fc640 1 --2- 192.168.123.107:0/3598869967 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faf94072270 0x7faf94133290 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.477+0000 7faf727fc640 1 -- 
192.168.123.107:0/3598869967 >> 192.168.123.107:0/3598869967 conn(0x7faf9406b780 msgr2=0x7faf9406fe80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.477+0000 7faf727fc640 1 -- 192.168.123.107:0/3598869967 shutdown_connections 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.477+0000 7faf727fc640 1 -- 192.168.123.107:0/3598869967 wait complete. 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.478+0000 7f9914ff9640 1 -- 192.168.123.107:0/3458832968 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9908002e70 con 0x7f9918072cf0 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.478+0000 7f991dd37640 1 -- 192.168.123.107:0/3458832968 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f99181b6000 con 0x7f9918072cf0 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.478+0000 7f991dd37640 1 -- 192.168.123.107:0/3458832968 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f99181b6550 con 0x7f9918072cf0 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.478+0000 7f9914ff9640 1 -- 192.168.123.107:0/3458832968 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f990802fe90 con 0x7f9918072cf0 2026-03-09T19:24:15.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.478+0000 7f9914ff9640 1 -- 192.168.123.107:0/3458832968 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9908040880 con 0x7f9918072cf0 2026-03-09T19:24:15.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.480+0000 7f991dd37640 1 -- 192.168.123.107:0/3458832968 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f98e4005350 con 0x7f9918072cf0 2026-03-09T19:24:15.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.483+0000 7f9914ff9640 1 -- 192.168.123.107:0/3458832968 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f99080092e0 con 0x7f9918072cf0 2026-03-09T19:24:15.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.484+0000 7f9914ff9640 1 --2- 192.168.123.107:0/3458832968 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f98ec0761c0 0x7f98ec078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.484+0000 7f99177fe640 1 --2- 192.168.123.107:0/3458832968 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f98ec0761c0 0x7f98ec078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.484+0000 7f9914ff9640 1 -- 192.168.123.107:0/3458832968 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f99080bc7e0 con 0x7f9918072cf0 2026-03-09T19:24:15.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.484+0000 7f99177fe640 1 --2- 192.168.123.107:0/3458832968 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f98ec0761c0 0x7f98ec078680 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f990c009990 tx=0x7f990c008040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:15.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.485+0000 7f9914ff9640 1 -- 192.168.123.107:0/3458832968 
<== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f99080bcbf0 con 0x7f9918072cf0 2026-03-09T19:24:15.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.566+0000 7f35952ad640 1 -- 192.168.123.107:0/2636250434 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3590072080 msgr2=0x7f35900724c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.566+0000 7f35952ad640 1 --2- 192.168.123.107:0/2636250434 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3590072080 0x7f35900724c0 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f3578009a00 tx=0x7f357802f280 comp rx=0 tx=0).stop 2026-03-09T19:24:15.568 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.570+0000 7f35952ad640 1 -- 192.168.123.107:0/2636250434 shutdown_connections 2026-03-09T19:24:15.568 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.570+0000 7f35952ad640 1 --2- 192.168.123.107:0/2636250434 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3590072080 0x7f35900724c0 unknown :-1 s=CLOSED pgs=230 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.568 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.570+0000 7f35952ad640 1 --2- 192.168.123.107:0/2636250434 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f359010d2f0 0x7f359010d6d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.568 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.570+0000 7f35952ad640 1 -- 192.168.123.107:0/2636250434 >> 192.168.123.107:0/2636250434 conn(0x7f359006cb50 msgr2=0x7f359006cf60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:15.571 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.573+0000 7f35952ad640 1 -- 
192.168.123.107:0/2636250434 shutdown_connections 2026-03-09T19:24:15.571 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.573+0000 7f35952ad640 1 -- 192.168.123.107:0/2636250434 wait complete. 2026-03-09T19:24:15.572 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.573+0000 7f35952ad640 1 Processor -- start 2026-03-09T19:24:15.572 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.574+0000 7f35952ad640 1 -- start start 2026-03-09T19:24:15.572 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.574+0000 7f35952ad640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3590072080 0x7f359011cd10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.572 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.574+0000 7f35952ad640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f359010d2f0 0x7f359011d250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.572 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.574+0000 7f35952ad640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3590122aa0 con 0x7f3590072080 2026-03-09T19:24:15.572 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.574+0000 7f35952ad640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3590122c10 con 0x7f359010d2f0 2026-03-09T19:24:15.572 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.575+0000 7f358e7fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f359010d2f0 0x7f359011d250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.575+0000 7f358e7fc640 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f359010d2f0 0x7f359011d250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:40590/0 (socket says 192.168.123.107:40590) 2026-03-09T19:24:15.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.575+0000 7f358e7fc640 1 -- 192.168.123.107:0/2272371715 learned_addr learned my addr 192.168.123.107:0/2272371715 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:15.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.576+0000 7f358effd640 1 --2- 192.168.123.107:0/2272371715 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3590072080 0x7f359011cd10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.574 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.576+0000 7f358e7fc640 1 -- 192.168.123.107:0/2272371715 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3590072080 msgr2=0x7f359011cd10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.574 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.576+0000 7f358e7fc640 1 --2- 192.168.123.107:0/2272371715 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3590072080 0x7f359011cd10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.574 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.576+0000 7f358e7fc640 1 -- 192.168.123.107:0/2272371715 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3578009660 con 0x7f359010d2f0 2026-03-09T19:24:15.574 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.576+0000 7f358effd640 1 --2- 192.168.123.107:0/2272371715 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3590072080 0x7f359011cd10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:24:15.575 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.577+0000 7f358e7fc640 1 --2- 192.168.123.107:0/2272371715 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f359010d2f0 0x7f359011d250 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f357802f790 tx=0x7f3578004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:15.575 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.577+0000 7f356ffff640 1 -- 192.168.123.107:0/2272371715 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3578004430 con 0x7f359010d2f0 2026-03-09T19:24:15.576 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.578+0000 7f356ffff640 1 -- 192.168.123.107:0/2272371715 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3578002e10 con 0x7f359010d2f0 2026-03-09T19:24:15.576 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.578+0000 7f35952ad640 1 -- 192.168.123.107:0/2272371715 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3590122e90 con 0x7f359010d2f0 2026-03-09T19:24:15.576 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.578+0000 7f35952ad640 1 -- 192.168.123.107:0/2272371715 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3590118050 con 0x7f359010d2f0 2026-03-09T19:24:15.577 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.579+0000 7f35952ad640 1 -- 192.168.123.107:0/2272371715 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3590108910 con 0x7f359010d2f0 2026-03-09T19:24:15.577 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.579+0000 7f356ffff640 1 -- 192.168.123.107:0/2272371715 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f35780417f0 con 0x7f359010d2f0 2026-03-09T19:24:15.579 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.581+0000 7f356ffff640 1 -- 192.168.123.107:0/2272371715 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f357803f070 con 0x7f359010d2f0 2026-03-09T19:24:15.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.584+0000 7f356ffff640 1 --2- 192.168.123.107:0/2272371715 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f35600761c0 0x7f3560078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.586+0000 7f358effd640 1 --2- 192.168.123.107:0/2272371715 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f35600761c0 0x7f3560078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.587+0000 7f358effd640 1 --2- 192.168.123.107:0/2272371715 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f35600761c0 0x7f3560078680 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f3584010470 tx=0x7f35840073d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:15.585 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.587+0000 7f356ffff640 1 -- 192.168.123.107:0/2272371715 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f35780bd430 con 0x7f359010d2f0 2026-03-09T19:24:15.586 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.588+0000 7f356ffff640 1 -- 192.168.123.107:0/2272371715 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f357802fcf0 con 0x7f359010d2f0 2026-03-09T19:24:15.597 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.599+0000 7fb8fe381640 1 -- 192.168.123.107:0/1207534550 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8f810d1e0 msgr2=0x7fb8f810d5c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.597 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.599+0000 7fb8fe381640 1 --2- 192.168.123.107:0/1207534550 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8f810d1e0 0x7fb8f810d5c0 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7fb8e8009930 tx=0x7fb8e802f1a0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.598 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.600+0000 7fb8fe381640 1 -- 192.168.123.107:0/1207534550 shutdown_connections 2026-03-09T19:24:15.600 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.602+0000 7fb8fe381640 1 --2- 192.168.123.107:0/1207534550 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb8f8071fe0 0x7fb8f8072420 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.600 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.602+0000 7fb8fe381640 1 --2- 192.168.123.107:0/1207534550 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8f810d1e0 0x7fb8f810d5c0 unknown :-1 s=CLOSED pgs=231 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.600 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.602+0000 7fb8fe381640 1 -- 192.168.123.107:0/1207534550 >> 192.168.123.107:0/1207534550 conn(0x7fb8f806cb10 msgr2=0x7fb8f806cf20 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T19:24:15.600 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.602+0000 7fb8fe381640 1 -- 192.168.123.107:0/1207534550 shutdown_connections 2026-03-09T19:24:15.600 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.602+0000 7fb8fe381640 1 -- 192.168.123.107:0/1207534550 wait complete. 2026-03-09T19:24:15.601 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.603+0000 7fb8fe381640 1 Processor -- start 2026-03-09T19:24:15.603 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.605+0000 7fb8fe381640 1 -- start start 2026-03-09T19:24:15.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.606+0000 7fb8fe381640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb8f8071fe0 0x7fb8f819ef70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.606+0000 7fb8fe381640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8f810d1e0 0x7fb8f819f4b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.606+0000 7fb8fd37f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb8f8071fe0 0x7fb8f819ef70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.606+0000 7fb8fd37f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb8f8071fe0 0x7fb8f819ef70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:40598/0 (socket says 192.168.123.107:40598) 2026-03-09T19:24:15.605 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.608+0000 7fb8fcb7e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8f810d1e0 0x7fb8f819f4b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.606 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.608+0000 7fb8fcb7e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8f810d1e0 0x7fb8f819f4b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43336/0 (socket says 192.168.123.107:43336) 2026-03-09T19:24:15.606 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.608+0000 7fb8fd37f640 1 -- 192.168.123.107:0/2487497406 learned_addr learned my addr 192.168.123.107:0/2487497406 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:15.606 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.608+0000 7fb8fe381640 1 -- 192.168.123.107:0/2487497406 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb8f819fb40 con 0x7fb8f810d1e0 2026-03-09T19:24:15.606 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.608+0000 7fb8fe381640 1 -- 192.168.123.107:0/2487497406 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb8f81a38b0 con 0x7fb8f8071fe0 2026-03-09T19:24:15.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.612+0000 7fb8fcb7e640 1 -- 192.168.123.107:0/2487497406 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb8f8071fe0 msgr2=0x7fb8f819ef70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.612+0000 7fb8fcb7e640 1 --2- 192.168.123.107:0/2487497406 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7fb8f8071fe0 0x7fb8f819ef70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.612+0000 7fb8fcb7e640 1 -- 192.168.123.107:0/2487497406 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb8e8009590 con 0x7fb8f810d1e0 2026-03-09T19:24:15.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.612+0000 7fb8fcb7e640 1 --2- 192.168.123.107:0/2487497406 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8f810d1e0 0x7fb8f819f4b0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7fb8ec009e80 tx=0x7fb8ec00e6c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:15.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.613+0000 7fb8e67fc640 1 -- 192.168.123.107:0/2487497406 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb8ec010e80 con 0x7fb8f810d1e0 2026-03-09T19:24:15.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.613+0000 7fb8fe381640 1 -- 192.168.123.107:0/2487497406 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb8f8068110 con 0x7fb8f810d1e0 2026-03-09T19:24:15.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.613+0000 7fb8e67fc640 1 -- 192.168.123.107:0/2487497406 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb8ec00ee80 con 0x7fb8f810d1e0 2026-03-09T19:24:15.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.613+0000 7fb8fe381640 1 -- 192.168.123.107:0/2487497406 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb8f81a3d00 con 0x7fb8f810d1e0 2026-03-09T19:24:15.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.614+0000 7fb8e67fc640 1 -- 
192.168.123.107:0/2487497406 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb8ec013880 con 0x7fb8f810d1e0 2026-03-09T19:24:15.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.620+0000 7fb8c7fff640 1 -- 192.168.123.107:0/2487497406 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb8c0005350 con 0x7fb8f810d1e0 2026-03-09T19:24:15.632 INFO:teuthology.orchestra.run.vm07.stdout:146028888068 2026-03-09T19:24:15.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.623+0000 7f991dd37640 1 -- 192.168.123.107:0/3458832968 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 5} v 0) v1 -- 0x7f98e40058d0 con 0x7f9918072cf0 2026-03-09T19:24:15.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.630+0000 7f9914ff9640 1 -- 192.168.123.107:0/3458832968 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 5}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f9908086130 con 0x7f9918072cf0 2026-03-09T19:24:15.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.621+0000 7fb8e67fc640 1 -- 192.168.123.107:0/2487497406 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fb8ec0139e0 con 0x7fb8f810d1e0 2026-03-09T19:24:15.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.622+0000 7fb8e67fc640 1 --2- 192.168.123.107:0/2487497406 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb8d80761c0 0x7fb8d8078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.622+0000 7fb8e67fc640 1 -- 192.168.123.107:0/2487497406 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 
0x7fb8ec017070 con 0x7fb8f810d1e0 2026-03-09T19:24:15.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.630+0000 7fb8fd37f640 1 --2- 192.168.123.107:0/2487497406 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb8d80761c0 0x7fb8d8078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.638+0000 7f98fa7fc640 1 -- 192.168.123.107:0/3458832968 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f98ec0761c0 msgr2=0x7f98ec078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.638+0000 7f98fa7fc640 1 --2- 192.168.123.107:0/3458832968 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f98ec0761c0 0x7f98ec078680 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f990c009990 tx=0x7f990c008040 comp rx=0 tx=0).stop 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.638+0000 7f98fa7fc640 1 -- 192.168.123.107:0/3458832968 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9918072cf0 msgr2=0x7f9918115300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.638+0000 7f98fa7fc640 1 --2- 192.168.123.107:0/3458832968 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9918072cf0 0x7f9918115300 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f99080042c0 tx=0x7f99080042f0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.638+0000 7f98fa7fc640 1 -- 192.168.123.107:0/3458832968 shutdown_connections 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.638+0000 
7f98fa7fc640 1 --2- 192.168.123.107:0/3458832968 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f98ec0761c0 0x7f98ec078680 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.638+0000 7f98fa7fc640 1 --2- 192.168.123.107:0/3458832968 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9918072cf0 0x7f9918115300 unknown :-1 s=CLOSED pgs=229 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.638+0000 7f98fa7fc640 1 --2- 192.168.123.107:0/3458832968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9918072340 0x7f9918114dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.638+0000 7f98fa7fc640 1 -- 192.168.123.107:0/3458832968 >> 192.168.123.107:0/3458832968 conn(0x7f991806b7f0 msgr2=0x7f991810de30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.638+0000 7f98fa7fc640 1 -- 192.168.123.107:0/3458832968 shutdown_connections 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.638+0000 7f98fa7fc640 1 -- 192.168.123.107:0/3458832968 wait complete. 
2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.639+0000 7fb8e67fc640 1 -- 192.168.123.107:0/2487497406 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fb8ec063d30 con 0x7fb8f810d1e0 2026-03-09T19:24:15.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.639+0000 7fb8fd37f640 1 --2- 192.168.123.107:0/2487497406 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb8d80761c0 0x7fb8d8078680 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fb8e8009a60 tx=0x7fb8e8009210 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:15.659 INFO:tasks.cephadm.ceph_manager.ceph:need seq 107374182406 got 107374182406 for osd.3 2026-03-09T19:24:15.659 DEBUG:teuthology.parallel:result is None 2026-03-09T19:24:15.690 INFO:tasks.cephadm.ceph_manager.ceph:need seq 146028888067 got 146028888068 for osd.5 2026-03-09T19:24:15.690 DEBUG:teuthology.parallel:result is None 2026-03-09T19:24:15.693 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.695+0000 7fb4578c0640 1 -- 192.168.123.107:0/3520682417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb450072140 msgr2=0x7fb450072520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.693 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.695+0000 7fb4578c0640 1 --2- 192.168.123.107:0/3520682417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb450072140 0x7fb450072520 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7fb44c0099b0 tx=0x7fb44c0314b0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.694 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.696+0000 7fb4578c0640 1 -- 192.168.123.107:0/3520682417 shutdown_connections 2026-03-09T19:24:15.694 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.696+0000 
7fb4578c0640 1 --2- 192.168.123.107:0/3520682417 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb450072af0 0x7fb45010ba70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.694 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.696+0000 7fb4578c0640 1 --2- 192.168.123.107:0/3520682417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb450072140 0x7fb450072520 unknown :-1 s=CLOSED pgs=233 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.694 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.696+0000 7fb4578c0640 1 -- 192.168.123.107:0/3520682417 >> 192.168.123.107:0/3520682417 conn(0x7fb45006c7e0 msgr2=0x7fb45006cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:15.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.696+0000 7fb4578c0640 1 -- 192.168.123.107:0/3520682417 shutdown_connections 2026-03-09T19:24:15.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.696+0000 7fb4578c0640 1 -- 192.168.123.107:0/3520682417 wait complete. 
2026-03-09T19:24:15.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.697+0000 7fb4578c0640 1 Processor -- start 2026-03-09T19:24:15.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.697+0000 7fb4578c0640 1 -- start start 2026-03-09T19:24:15.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.697+0000 7fb4578c0640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb450072af0 0x7fb45007d3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.697+0000 7fb4578c0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb450084360 0x7fb45007d930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.697+0000 7fb4578c0640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb45007dfc0 con 0x7fb450084360 2026-03-09T19:24:15.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.697+0000 7fb4578c0640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb45007e130 con 0x7fb450072af0 2026-03-09T19:24:15.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.697+0000 7fb454e34640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb450084360 0x7fb45007d930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.697+0000 7fb454e34640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb450084360 0x7fb45007d930 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:43364/0 (socket says 192.168.123.107:43364) 2026-03-09T19:24:15.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.697+0000 7fb454e34640 1 -- 192.168.123.107:0/1668288489 learned_addr learned my addr 192.168.123.107:0/1668288489 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:15.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.700+0000 7fb455635640 1 --2- 192.168.123.107:0/1668288489 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb450072af0 0x7fb45007d3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.700+0000 7fb454e34640 1 -- 192.168.123.107:0/1668288489 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb450072af0 msgr2=0x7fb45007d3f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.700+0000 7fb454e34640 1 --2- 192.168.123.107:0/1668288489 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb450072af0 0x7fb45007d3f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.700+0000 7fb454e34640 1 -- 192.168.123.107:0/1668288489 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb44c009660 con 0x7fb450084360 2026-03-09T19:24:15.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.700+0000 7fb454e34640 1 --2- 192.168.123.107:0/1668288489 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb450084360 0x7fb45007d930 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7fb448007e90 tx=0x7fb448007f90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:24:15.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.701+0000 7fb4467fc640 1 -- 192.168.123.107:0/1668288489 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb448011070 con 0x7fb450084360 2026-03-09T19:24:15.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.702+0000 7fb4578c0640 1 -- 192.168.123.107:0/1668288489 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb450081f60 con 0x7fb450084360 2026-03-09T19:24:15.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.702+0000 7fb4578c0640 1 -- 192.168.123.107:0/1668288489 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb450082460 con 0x7fb450084360 2026-03-09T19:24:15.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.703+0000 7fb4467fc640 1 -- 192.168.123.107:0/1668288489 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb4480040d0 con 0x7fb450084360 2026-03-09T19:24:15.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.704+0000 7fb4467fc640 1 -- 192.168.123.107:0/1668288489 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb4480155d0 con 0x7fb450084360 2026-03-09T19:24:15.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.704+0000 7fb4467fc640 1 -- 192.168.123.107:0/1668288489 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fb44801d050 con 0x7fb450084360 2026-03-09T19:24:15.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.706+0000 7fb4467fc640 1 --2- 192.168.123.107:0/1668288489 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb430076360 0x7fb430078820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:15.708 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.710+0000 7fb455635640 1 --2- 192.168.123.107:0/1668288489 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb430076360 0x7fb430078820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:15.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.710+0000 7fb455635640 1 --2- 192.168.123.107:0/1668288489 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb430076360 0x7fb430078820 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fb44c0384f0 tx=0x7fb44c03b040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:15.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.711+0000 7fb4467fc640 1 -- 192.168.123.107:0/1668288489 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fb448097db0 con 0x7fb450084360 2026-03-09T19:24:15.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.711+0000 7fb423fff640 1 -- 192.168.123.107:0/1668288489 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb418005350 con 0x7fb450084360 2026-03-09T19:24:15.713 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.715+0000 7fb4467fc640 1 -- 192.168.123.107:0/1668288489 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fb448061700 con 0x7fb450084360 2026-03-09T19:24:15.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.786+0000 7fb8c7fff640 1 -- 192.168.123.107:0/2487497406 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 2} v 0) v1 -- 0x7fb8c00051c0 con 0x7fb8f810d1e0 
2026-03-09T19:24:15.788 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.790+0000 7f35952ad640 1 -- 192.168.123.107:0/2272371715 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 4} v 0) v1 -- 0x7f3590114060 con 0x7f359010d2f0 2026-03-09T19:24:15.790 INFO:teuthology.orchestra.run.vm07.stdout:128849018885 2026-03-09T19:24:15.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.792+0000 7f356ffff640 1 -- 192.168.123.107:0/2272371715 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 4}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f357808de60 con 0x7f359010d2f0 2026-03-09T19:24:15.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.794+0000 7f356dffb640 1 -- 192.168.123.107:0/2272371715 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f35600761c0 msgr2=0x7f3560078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.794+0000 7f356dffb640 1 --2- 192.168.123.107:0/2272371715 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f35600761c0 0x7f3560078680 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f3584010470 tx=0x7f35840073d0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.794+0000 7f356dffb640 1 -- 192.168.123.107:0/2272371715 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f359010d2f0 msgr2=0x7f359011d250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.794+0000 7f356dffb640 1 --2- 192.168.123.107:0/2272371715 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f359010d2f0 0x7f359011d250 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f357802f790 tx=0x7f3578004290 comp rx=0 
tx=0).stop 2026-03-09T19:24:15.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.794+0000 7f356dffb640 1 -- 192.168.123.107:0/2272371715 shutdown_connections 2026-03-09T19:24:15.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.794+0000 7f356dffb640 1 --2- 192.168.123.107:0/2272371715 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f35600761c0 0x7f3560078680 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.794+0000 7f356dffb640 1 --2- 192.168.123.107:0/2272371715 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f359010d2f0 0x7f359011d250 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.795+0000 7f356dffb640 1 --2- 192.168.123.107:0/2272371715 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3590072080 0x7f359011cd10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.795+0000 7f356dffb640 1 -- 192.168.123.107:0/2272371715 >> 192.168.123.107:0/2272371715 conn(0x7f359006cb50 msgr2=0x7f3590111190 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:15.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.789+0000 7fb8e67fc640 1 -- 192.168.123.107:0/2487497406 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 2}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fb8ec0636d0 con 0x7fb8f810d1e0 2026-03-09T19:24:15.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.795+0000 7f356dffb640 1 -- 192.168.123.107:0/2272371715 shutdown_connections 2026-03-09T19:24:15.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.795+0000 7f356dffb640 1 -- 
192.168.123.107:0/2272371715 wait complete. 2026-03-09T19:24:15.793 INFO:teuthology.orchestra.run.vm07.stdout:73014444040 2026-03-09T19:24:15.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.800+0000 7fb8c7fff640 1 -- 192.168.123.107:0/2487497406 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb8d80761c0 msgr2=0x7fb8d8078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.800+0000 7fb8c7fff640 1 --2- 192.168.123.107:0/2487497406 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb8d80761c0 0x7fb8d8078680 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fb8e8009a60 tx=0x7fb8e8009210 comp rx=0 tx=0).stop 2026-03-09T19:24:15.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.800+0000 7fb8c7fff640 1 -- 192.168.123.107:0/2487497406 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8f810d1e0 msgr2=0x7fb8f819f4b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.800+0000 7fb8c7fff640 1 --2- 192.168.123.107:0/2487497406 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8f810d1e0 0x7fb8f819f4b0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7fb8ec009e80 tx=0x7fb8ec00e6c0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.800+0000 7fb8c7fff640 1 -- 192.168.123.107:0/2487497406 shutdown_connections 2026-03-09T19:24:15.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.800+0000 7fb8c7fff640 1 --2- 192.168.123.107:0/2487497406 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb8d80761c0 0x7fb8d8078680 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.798 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.800+0000 7fb8c7fff640 1 --2- 192.168.123.107:0/2487497406 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb8f810d1e0 0x7fb8f819f4b0 secure :-1 s=CLOSED pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7fb8ec009e80 tx=0x7fb8ec00e6c0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.799 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.800+0000 7fb8c7fff640 1 --2- 192.168.123.107:0/2487497406 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb8f8071fe0 0x7fb8f819ef70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.799 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.800+0000 7fb8c7fff640 1 -- 192.168.123.107:0/2487497406 >> 192.168.123.107:0/2487497406 conn(0x7fb8f806cb10 msgr2=0x7fb8f810ad70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:15.799 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.801+0000 7fb8c7fff640 1 -- 192.168.123.107:0/2487497406 shutdown_connections 2026-03-09T19:24:15.799 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.801+0000 7fb8c7fff640 1 -- 192.168.123.107:0/2487497406 wait complete. 2026-03-09T19:24:15.842 INFO:tasks.cephadm.ceph_manager.ceph:need seq 128849018885 got 128849018885 for osd.4 2026-03-09T19:24:15.842 DEBUG:teuthology.parallel:result is None 2026-03-09T19:24:15.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:15 vm08 ceph-mon[57794]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:15.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:15 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/747366375' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-09T19:24:15.880 INFO:tasks.cephadm.ceph_manager.ceph:need seq 73014444040 got 73014444040 for osd.2 2026-03-09T19:24:15.881 DEBUG:teuthology.parallel:result is None 2026-03-09T19:24:15.900 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.902+0000 7fb423fff640 1 -- 192.168.123.107:0/1668288489 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7fb4180051c0 con 0x7fb450084360 2026-03-09T19:24:15.900 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.902+0000 7fb4467fc640 1 -- 192.168.123.107:0/1668288489 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fb4480610a0 con 0x7fb450084360 2026-03-09T19:24:15.900 INFO:teuthology.orchestra.run.vm07.stdout:55834574858 2026-03-09T19:24:15.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.904+0000 7fb423fff640 1 -- 192.168.123.107:0/1668288489 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb430076360 msgr2=0x7fb430078820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.904+0000 7fb423fff640 1 --2- 192.168.123.107:0/1668288489 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb430076360 0x7fb430078820 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fb44c0384f0 tx=0x7fb44c03b040 comp rx=0 tx=0).stop 2026-03-09T19:24:15.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.904+0000 7fb423fff640 1 -- 192.168.123.107:0/1668288489 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb450084360 msgr2=0x7fb45007d930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:15.903 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.905+0000 7fb423fff640 1 --2- 192.168.123.107:0/1668288489 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb450084360 0x7fb45007d930 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7fb448007e90 tx=0x7fb448007f90 comp rx=0 tx=0).stop 2026-03-09T19:24:15.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.905+0000 7fb423fff640 1 -- 192.168.123.107:0/1668288489 shutdown_connections 2026-03-09T19:24:15.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.905+0000 7fb423fff640 1 --2- 192.168.123.107:0/1668288489 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fb430076360 0x7fb430078820 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.905+0000 7fb423fff640 1 --2- 192.168.123.107:0/1668288489 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb450084360 0x7fb45007d930 unknown :-1 s=CLOSED pgs=234 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.905+0000 7fb423fff640 1 --2- 192.168.123.107:0/1668288489 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb450072af0 0x7fb45007d3f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:15.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.905+0000 7fb423fff640 1 -- 192.168.123.107:0/1668288489 >> 192.168.123.107:0/1668288489 conn(0x7fb45006c7e0 msgr2=0x7fb450071440 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:15.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.905+0000 7fb423fff640 1 -- 192.168.123.107:0/1668288489 shutdown_connections 2026-03-09T19:24:15.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:15.905+0000 7fb423fff640 1 -- 
192.168.123.107:0/1668288489 wait complete. 2026-03-09T19:24:15.969 INFO:tasks.cephadm.ceph_manager.ceph:need seq 55834574858 got 55834574858 for osd.1 2026-03-09T19:24:15.969 DEBUG:teuthology.parallel:result is None 2026-03-09T19:24:15.969 INFO:tasks.cephadm.ceph_manager.ceph:waiting for clean 2026-03-09T19:24:15.969 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph pg dump --format=json 2026-03-09T19:24:16.152 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:16.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.403+0000 7f140d2cc640 1 -- 192.168.123.107:0/2733155039 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1408072cf0 msgr2=0x7f140810cd90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:16.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.403+0000 7f140d2cc640 1 --2- 192.168.123.107:0/2733155039 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1408072cf0 0x7f140810cd90 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f13f00099b0 tx=0x7f13f002f240 comp rx=0 tx=0).stop 2026-03-09T19:24:16.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.403+0000 7f140d2cc640 1 -- 192.168.123.107:0/2733155039 shutdown_connections 2026-03-09T19:24:16.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.403+0000 7f140d2cc640 1 --2- 192.168.123.107:0/2733155039 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1408072cf0 0x7f140810cd90 unknown :-1 s=CLOSED pgs=235 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:16.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.403+0000 7f140d2cc640 1 --2- 192.168.123.107:0/2733155039 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f1408072340 0x7f1408072720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:16.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.403+0000 7f140d2cc640 1 -- 192.168.123.107:0/2733155039 >> 192.168.123.107:0/2733155039 conn(0x7f140806b7f0 msgr2=0x7f140806bc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:16.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.404+0000 7f140d2cc640 1 -- 192.168.123.107:0/2733155039 shutdown_connections 2026-03-09T19:24:16.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.404+0000 7f140d2cc640 1 -- 192.168.123.107:0/2733155039 wait complete. 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f140d2cc640 1 Processor -- start 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f140d2cc640 1 -- start start 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f140d2cc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1408072340 0x7f1408112c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f140d2cc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f14081131b0 0x7f14081b9cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f140d2cc640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1408113760 con 0x7f14081131b0 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f140d2cc640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14081138d0 con 0x7f1408072340 
2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f14067fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f14081131b0 0x7f14081b9cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f14067fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f14081131b0 0x7f14081b9cb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43376/0 (socket says 192.168.123.107:43376) 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f14067fc640 1 -- 192.168.123.107:0/3260940559 learned_addr learned my addr 192.168.123.107:0/3260940559 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f14067fc640 1 -- 192.168.123.107:0/3260940559 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1408072340 msgr2=0x7f1408112c70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f14067fc640 1 --2- 192.168.123.107:0/3260940559 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1408072340 0x7f1408112c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.405+0000 7f14067fc640 1 -- 192.168.123.107:0/3260940559 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f13f0009660 con 0x7f14081131b0 2026-03-09T19:24:16.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.406+0000 
7f14067fc640 1 --2- 192.168.123.107:0/3260940559 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f14081131b0 0x7f14081b9cb0 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f13f002f750 tx=0x7f13f0002c40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:16.404 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.406+0000 7f13e7fff640 1 -- 192.168.123.107:0/3260940559 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13f003d070 con 0x7f14081131b0 2026-03-09T19:24:16.405 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.406+0000 7f140d2cc640 1 -- 192.168.123.107:0/3260940559 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f14081ba250 con 0x7f14081131b0 2026-03-09T19:24:16.405 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.406+0000 7f140d2cc640 1 -- 192.168.123.107:0/3260940559 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f14081ba7a0 con 0x7f14081131b0 2026-03-09T19:24:16.405 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.407+0000 7f13e7fff640 1 -- 192.168.123.107:0/3260940559 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f13f0004590 con 0x7f14081131b0 2026-03-09T19:24:16.405 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.407+0000 7f13e7fff640 1 -- 192.168.123.107:0/3260940559 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13f0031070 con 0x7f14081131b0 2026-03-09T19:24:16.410 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.408+0000 7f13e5ffb640 1 -- 192.168.123.107:0/3260940559 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f13d0005350 con 0x7f14081131b0 2026-03-09T19:24:16.410 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.412+0000 7f13e7fff640 1 -- 192.168.123.107:0/3260940559 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f13f0049050 con 0x7f14081131b0 2026-03-09T19:24:16.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.413+0000 7f13e7fff640 1 --2- 192.168.123.107:0/3260940559 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f13d8076290 0x7f13d8078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:16.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.413+0000 7f13e7fff640 1 -- 192.168.123.107:0/3260940559 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f13f00bc7e0 con 0x7f14081131b0 2026-03-09T19:24:16.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.413+0000 7f1406ffd640 1 --2- 192.168.123.107:0/3260940559 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f13d8076290 0x7f13d8078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:16.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.413+0000 7f13e7fff640 1 -- 192.168.123.107:0/3260940559 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f13f0002d70 con 0x7f14081131b0 2026-03-09T19:24:16.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.414+0000 7f1406ffd640 1 --2- 192.168.123.107:0/3260940559 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f13d8076290 0x7f13d8078750 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f13f8005fd0 tx=0x7f13f8005950 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:24:16.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:16 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/3598869967' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-09T19:24:16.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:16 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/3458832968' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-09T19:24:16.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:16 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/2487497406' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-09T19:24:16.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:16 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/2272371715' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-09T19:24:16.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:16 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1668288489' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-09T19:24:16.498 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.500+0000 7f13e5ffb640 1 -- 192.168.123.107:0/3260940559 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f13d0002bf0 con 0x7f13d8076290 2026-03-09T19:24:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.501+0000 7f13e7fff640 1 -- 192.168.123.107:0/3260940559 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+18936 (secure 0 0 0) 0x7f13d0002bf0 con 0x7f13d8076290 2026-03-09T19:24:16.499 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:16.499 INFO:teuthology.orchestra.run.vm07.stderr:dumped all 2026-03-09T19:24:16.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.504+0000 7f13e5ffb640 1 -- 192.168.123.107:0/3260940559 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f13d8076290 msgr2=0x7f13d8078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:16.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.504+0000 7f13e5ffb640 1 --2- 192.168.123.107:0/3260940559 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f13d8076290 0x7f13d8078750 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f13f8005fd0 tx=0x7f13f8005950 comp rx=0 tx=0).stop 2026-03-09T19:24:16.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.504+0000 7f13e5ffb640 1 -- 192.168.123.107:0/3260940559 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f14081131b0 msgr2=0x7f14081b9cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:16.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.504+0000 7f13e5ffb640 1 --2- 
192.168.123.107:0/3260940559 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f14081131b0 0x7f14081b9cb0 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f13f002f750 tx=0x7f13f0002c40 comp rx=0 tx=0).stop 2026-03-09T19:24:16.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.504+0000 7f13e5ffb640 1 -- 192.168.123.107:0/3260940559 shutdown_connections 2026-03-09T19:24:16.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.504+0000 7f13e5ffb640 1 --2- 192.168.123.107:0/3260940559 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f13d8076290 0x7f13d8078750 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:16.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.504+0000 7f13e5ffb640 1 --2- 192.168.123.107:0/3260940559 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f14081131b0 0x7f14081b9cb0 unknown :-1 s=CLOSED pgs=236 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:16.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.504+0000 7f13e5ffb640 1 --2- 192.168.123.107:0/3260940559 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1408072340 0x7f1408112c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:16.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.504+0000 7f13e5ffb640 1 -- 192.168.123.107:0/3260940559 >> 192.168.123.107:0/3260940559 conn(0x7f140806b7f0 msgr2=0x7f140810dee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:16.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.504+0000 7f13e5ffb640 1 -- 192.168.123.107:0/3260940559 shutdown_connections 2026-03-09T19:24:16.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.504+0000 7f13e5ffb640 1 -- 192.168.123.107:0/3260940559 wait complete. 
2026-03-09T19:24:16.559 INFO:teuthology.orchestra.run.vm07.stdout:{"pg_ready":true,"pg_map":{"version":67,"stamp":"2026-03-09T19:24:16.332638+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":577980,"kb_used_data":3228,"kb_used_omap":9,"kb_used_meta":165046,"kb_avail":125226564,"statfs":{"total":128823853056,"available":128232001536,"internally_reserved":0,"allocated":3305472,"data_stored":2114622,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":9532,"internal_metadata":169007812},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1
},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"8.098360"},"pg_stats":[{"pgid":"1.0","version":"20'32","reported_seq":79,"reported_epoch":34,"state":"active+clean","last_fresh":"2026-03-09T19:24:08.236874+0000","last_change":"2026-03-09T19:23:57.648564+0000","last_active":"2026-03-09T19:24:08.236874+0000","last_peered":"2026-03-09T19:24:08.236874+0000","last_clean":"2026-03-09T19:24:08.236874+0000","last_became_active":"2026-03-09T19:23:57.648393+0000","last_became_peered":"2026-03-09T19:23:57.648393+0000","last_unstale":"2026-03-09T19:24:08.236874+0000","last_undegraded":"2026-03-09T19:24:08.236874+0000","last_fullsized":"2026-03-09T19:24:08.236874+0000","mapping_epoch":29,"log_start":"0'0","on
disk_log_start":"0'0","created":19,"last_epoch_clean":30,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-09T19:23:42.998366+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-09T19:23:42.998366+0000","last_clean_scrub_stamp":"2026-03-09T19:23:42.998366+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-10T23:33:22.653876+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_o
bjects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":34,"seq":146028888068,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":436744,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20530680,"statfs":{"total":21470642176,"available":21023416320,"internally_reserved":0,"allocated":319488,"data_stored":122797,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55600000000000005}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65600000000000003}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60799999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59499999999999997}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.67700000000000005}]}]},{"osd":4,"up_from":30,"seq":128849018885,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27144,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20940280,"statfs":{"total":21470642176,"available":21442846720,"internally_reserved":0,"allocated":319488,"data_stored":122797,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,2,3],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.34599999999999997}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.29599999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.49299999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.25600000000000001}]}]},{"osd":3,"up_from":25,"seq":107374182407,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27596,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20939828,"statfs":{"total":21470642176,"available":21442383872,"internally_reserved":0,"allocated":782336,"data_stored":582077,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1587,"internal_metadata":27457997},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65100000000000002}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55300000000000005}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":1.1040000000000001}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":1.2110000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60999999999999999}]}]},{"osd":2,"up_from":17,"seq":73014444040,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27144,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20940280,"statfs":{"total":21470642176,"available":21442846720,"internally_reserved":0,"allocated":319488,"data_stored":122797,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54200000000000004}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.53300000000000003}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56299999999999994}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51000000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.58799999999999997}]}]},{"osd":1,"up_from":13,"seq":55834574858,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":31756,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":30974,"kb_avail":20935668,"statfs":{"total":21470642176,"available":21438124032,"internally_reserved":0,"allocated":782336,"data_stored":582077,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":31717835},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51800000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65100000000000002}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.502}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48599999999999999}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.629}]}]},{"osd":0,"up_from":9,"seq":38654705676,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27596,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20939828,"statfs":{"total":21470642176,"available":21442383872,"internally_reserved":0,"allocated":782336,"data_stored":582077,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.52200000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.313}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66200000000000003}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59599999999999997}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61299999999999999}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-09T19:24:16.559 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph pg dump --format=json 
2026-03-09T19:24:16.713 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:16.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:16 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/3598869967' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-09T19:24:16.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:16 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/3458832968' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-09T19:24:16.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:16 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/2487497406' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-09T19:24:16.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:16 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/2272371715' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-09T19:24:16.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:16 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/1668288489' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-09T19:24:16.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.947+0000 7f868217e640 1 -- 192.168.123.107:0/2905214236 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f867c1029d0 msgr2=0x7f867c102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:16.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.947+0000 7f868217e640 1 --2- 192.168.123.107:0/2905214236 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f867c1029d0 0x7f867c102e30 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7f8668009a00 tx=0x7f866802f280 comp rx=0 tx=0).stop 2026-03-09T19:24:16.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.948+0000 7f868217e640 1 -- 192.168.123.107:0/2905214236 shutdown_connections 2026-03-09T19:24:16.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.948+0000 7f868217e640 1 --2- 192.168.123.107:0/2905214236 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f867c1029d0 0x7f867c102e30 unknown :-1 s=CLOSED pgs=237 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:16.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.948+0000 7f868217e640 1 --2- 192.168.123.107:0/2905214236 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f867c1089d0 0x7f867c108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:16.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.948+0000 7f868217e640 1 -- 192.168.123.107:0/2905214236 >> 192.168.123.107:0/2905214236 conn(0x7f867c0fe710 msgr2=0x7f867c100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:16.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.948+0000 7f868217e640 1 -- 192.168.123.107:0/2905214236 shutdown_connections 2026-03-09T19:24:16.946 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.948+0000 7f868217e640 1 -- 192.168.123.107:0/2905214236 wait complete. 2026-03-09T19:24:16.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.948+0000 7f868217e640 1 Processor -- start 2026-03-09T19:24:16.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f868217e640 1 -- start start 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f868217e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f867c1029d0 0x7f867c1a0670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f868217e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f867c1089d0 0x7f867c1a0bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f868217e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f867c19a7b0 con 0x7f867c1089d0 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f868217e640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f867c19a920 con 0x7f867c1029d0 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f867affd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f867c1089d0 0x7f867c1a0bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f867affd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f867c1089d0 0x7f867c1a0bb0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43388/0 (socket says 192.168.123.107:43388) 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f867affd640 1 -- 192.168.123.107:0/1016096654 learned_addr learned my addr 192.168.123.107:0/1016096654 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f867b7fe640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f867c1029d0 0x7f867c1a0670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f867affd640 1 -- 192.168.123.107:0/1016096654 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f867c1029d0 msgr2=0x7f867c1a0670 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f867affd640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f867c1029d0 0x7f867c1a0670 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f867affd640 1 -- 192.168.123.107:0/1016096654 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8668009660 con 0x7f867c1089d0 2026-03-09T19:24:16.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.949+0000 7f867b7fe640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f867c1029d0 0x7f867c1a0670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T19:24:16.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.950+0000 7f867affd640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f867c1089d0 0x7f867c1a0bb0 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7f866802f790 tx=0x7f8668004300 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:16.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.950+0000 7f8678ff9640 1 -- 192.168.123.107:0/1016096654 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f866802fae0 con 0x7f867c1089d0 2026-03-09T19:24:16.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.950+0000 7f8678ff9640 1 -- 192.168.123.107:0/1016096654 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f866802fc40 con 0x7f867c1089d0 2026-03-09T19:24:16.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.950+0000 7f8678ff9640 1 -- 192.168.123.107:0/1016096654 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f86680417e0 con 0x7f867c1089d0 2026-03-09T19:24:16.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.950+0000 7f868217e640 1 -- 192.168.123.107:0/1016096654 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f867c19aba0 con 0x7f867c1089d0 2026-03-09T19:24:16.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.950+0000 7f868217e640 1 -- 192.168.123.107:0/1016096654 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f867c19b090 con 0x7f867c1089d0 2026-03-09T19:24:16.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.952+0000 7f868217e640 1 -- 192.168.123.107:0/1016096654 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8644005350 con 0x7f867c1089d0 2026-03-09T19:24:16.953 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.955+0000 7f8678ff9640 1 -- 192.168.123.107:0/1016096654 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f866803f070 con 0x7f867c1089d0 2026-03-09T19:24:16.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.956+0000 7f8678ff9640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8650076290 0x7f8650078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:16.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.956+0000 7f8678ff9640 1 -- 192.168.123.107:0/1016096654 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f86680bbc80 con 0x7f867c1089d0 2026-03-09T19:24:16.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.956+0000 7f867b7fe640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8650076290 0x7f8650078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:16.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.956+0000 7f8678ff9640 1 -- 192.168.123.107:0/1016096654 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f86680c2080 con 0x7f867c1089d0 2026-03-09T19:24:16.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:16.957+0000 7f867b7fe640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8650076290 0x7f8650078750 
secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f8660002730 tx=0x7f8660009290 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:17.046 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.048+0000 7f868217e640 1 -- 192.168.123.107:0/1016096654 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f8644002bf0 con 0x7f8650076290 2026-03-09T19:24:17.048 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.051+0000 7f8678ff9640 1 -- 192.168.123.107:0/1016096654 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+18936 (secure 0 0 0) 0x7f8644002bf0 con 0x7f8650076290 2026-03-09T19:24:17.049 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:17.049 INFO:teuthology.orchestra.run.vm07.stderr:dumped all 2026-03-09T19:24:17.051 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.053+0000 7f868217e640 1 -- 192.168.123.107:0/1016096654 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8650076290 msgr2=0x7f8650078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:17.051 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.053+0000 7f868217e640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8650076290 0x7f8650078750 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f8660002730 tx=0x7f8660009290 comp rx=0 tx=0).stop 2026-03-09T19:24:17.051 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.053+0000 7f868217e640 1 -- 192.168.123.107:0/1016096654 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f867c1089d0 msgr2=0x7f867c1a0bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:17.051 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.053+0000 7f868217e640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f867c1089d0 0x7f867c1a0bb0 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7f866802f790 tx=0x7f8668004300 comp rx=0 tx=0).stop 2026-03-09T19:24:17.051 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.053+0000 7f868217e640 1 -- 192.168.123.107:0/1016096654 shutdown_connections 2026-03-09T19:24:17.051 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.053+0000 7f868217e640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8650076290 0x7f8650078750 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:17.051 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.053+0000 7f868217e640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f867c1089d0 0x7f867c1a0bb0 unknown :-1 s=CLOSED pgs=238 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:17.051 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.053+0000 7f868217e640 1 --2- 192.168.123.107:0/1016096654 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f867c1029d0 0x7f867c1a0670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:17.051 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.053+0000 7f868217e640 1 -- 192.168.123.107:0/1016096654 >> 192.168.123.107:0/1016096654 conn(0x7f867c0fe710 msgr2=0x7f867c0feaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:17.051 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.054+0000 7f868217e640 1 -- 192.168.123.107:0/1016096654 shutdown_connections 2026-03-09T19:24:17.052 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.054+0000 7f868217e640 1 -- 
192.168.123.107:0/1016096654 wait complete. 2026-03-09T19:24:17.095 INFO:teuthology.orchestra.run.vm07.stdout:{"pg_ready":true,"pg_map":{"version":67,"stamp":"2026-03-09T19:24:16.332638+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":577980,"kb_used_data":3228,"kb_used_omap":9,"kb_used_meta":165046,"kb_avail":125226564,"statfs":{"total":128823853056,"available":128232001536,"internally_reserved":0,"allocated":3305472,"data_stored":2114622,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":9532,"internal_metadata":169007812},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_que
ue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"8.098360"},"pg_stats":[{"pgid":"1.0","version":"20'32","reported_seq":79,"reported_epoch":34,"state":"active+clean","last_fresh":"2026-03-09T19:24:08.236874+0000","last_change":"2026-03-09T19:23:57.648564+0000","last_active":"2026-03-09T19:24:08.236874+0000","last_peered":"2026-03-09T19:24:08.236874+0000","last_clean":"2026-03-09T19:24:08.236874+0000","last_became_active":"2026-03-09T19:23:57.648393+0000","last_became_peered":"2026-03-09T19:23:57.648393+0000","last_unstale":"2026-03-09T19:24:08.236874+0000","last_undegraded":"2026-03-09T19:24:08.236874+0000","last_fullsized":"2026-03-09T19:24:08.236874+00
00","mapping_epoch":29,"log_start":"0'0","ondisk_log_start":"0'0","created":19,"last_epoch_clean":30,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-09T19:23:42.998366+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-09T19:23:42.998366+0000","last_clean_scrub_stamp":"2026-03-09T19:23:42.998366+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-10T23:33:22.653876+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_
on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":34,"seq":146028888068,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":436744,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20530680,"statfs":{"total":21470642176,"available":21023416320,"internally_reserved":0,"allocated":319488,"data_stored":122797,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55600000000000005}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65600000000000003}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60799999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59499999999999997}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.67700000000000005}]}]},{"osd":4,"up_from":30,"seq":128849018885,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27144,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20940280,"statfs":{"total":21470642176,"available":21442846720,"internally_reserved":0,"allocated":319488,"data_stored":122797,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,2,3],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.34599999999999997}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.29599999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.49299999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.25600000000000001}]}]},{"osd":3,"up_from":25,"seq":107374182407,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27596,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20939828,"statfs":{"total":21470642176,"available":21442383872,"internally_reserved":0,"allocated":782336,"data_stored":582077,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1587,"internal_metadata":27457997},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65100000000000002}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55300000000000005}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":1.1040000000000001}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":1.2110000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60999999999999999}]}]},{"osd":2,"up_from":17,"seq":73014444040,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27144,"kb_used_data":312,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20940280,"statfs":{"total":21470642176,"available":21442846720,"internally_reserved":0,"allocated":319488,"data_stored":122797,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54200000000000004}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.53300000000000003}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56299999999999994}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51000000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.58799999999999997}]}]},{"osd":1,"up_from":13,"seq":55834574858,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":31756,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":30974,"kb_avail":20935668,"statfs":{"total":21470642176,"available":21438124032,"internally_reserved":0,"allocated":782336,"data_stored":582077,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":31717835},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51800000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65100000000000002}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.502}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48599999999999999}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.629}]}]},{"osd":0,"up_from":9,"seq":38654705676,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27596,"kb_used_data":764,"kb_used_omap":1,"kb_used_meta":26814,"kb_avail":20939828,"statfs":{"total":21470642176,"available":21442383872,"internally_reserved":0,"allocated":782336,"data_stored":582077,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1589,"internal_metadata":27457995},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.52200000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.313}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66200000000000003}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59599999999999997}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61299999999999999}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-09T19:24:17.095 INFO:tasks.cephadm.ceph_manager.ceph:clean! 2026-03-09T19:24:17.095 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 
2026-03-09T19:24:17.095 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy 2026-03-09T19:24:17.096 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph health --format=json 2026-03-09T19:24:17.258 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:17.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.505+0000 7f63c2577640 1 -- 192.168.123.107:0/2754571657 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63bc106780 msgr2=0x7f63bc106b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:17.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.505+0000 7f63c2577640 1 --2- 192.168.123.107:0/2754571657 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63bc106780 0x7f63bc106b60 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto rx=0x7f63ac009a00 tx=0x7f63ac02f280 comp rx=0 tx=0).stop 2026-03-09T19:24:17.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.506+0000 7f63c2577640 1 -- 192.168.123.107:0/2754571657 shutdown_connections 2026-03-09T19:24:17.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.506+0000 7f63c2577640 1 --2- 192.168.123.107:0/2754571657 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f63bc100780 0x7f63bc100be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:17.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.506+0000 7f63c2577640 1 --2- 192.168.123.107:0/2754571657 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63bc106780 0x7f63bc106b60 unknown :-1 s=CLOSED pgs=239 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:17.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.506+0000 7f63c2577640 1 -- 
192.168.123.107:0/2754571657 >> 192.168.123.107:0/2754571657 conn(0x7f63bc0fc460 msgr2=0x7f63bc0fe880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:17.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.506+0000 7f63c2577640 1 -- 192.168.123.107:0/2754571657 shutdown_connections 2026-03-09T19:24:17.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.506+0000 7f63c2577640 1 -- 192.168.123.107:0/2754571657 wait complete. 2026-03-09T19:24:17.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.507+0000 7f63c2577640 1 Processor -- start 2026-03-09T19:24:17.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.507+0000 7f63c2577640 1 -- start start 2026-03-09T19:24:17.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.507+0000 7f63c2577640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f63bc100780 0x7f63bc196470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:17.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.507+0000 7f63c2577640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63bc106780 0x7f63bc1969b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:17.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.507+0000 7f63c2577640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f63bc197040 con 0x7f63bc106780 2026-03-09T19:24:17.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.507+0000 7f63c2577640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f63bc19ad60 con 0x7f63bc100780 2026-03-09T19:24:17.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.507+0000 7f63bb7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63bc106780 0x7f63bc1969b0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:17.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.507+0000 7f63bb7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63bc106780 0x7f63bc1969b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43410/0 (socket says 192.168.123.107:43410) 2026-03-09T19:24:17.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.507+0000 7f63bb7fe640 1 -- 192.168.123.107:0/2330305396 learned_addr learned my addr 192.168.123.107:0/2330305396 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:17.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.507+0000 7f63bb7fe640 1 -- 192.168.123.107:0/2330305396 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f63bc100780 msgr2=0x7f63bc196470 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:24:17.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.508+0000 7f63bb7fe640 1 --2- 192.168.123.107:0/2330305396 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f63bc100780 0x7f63bc196470 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:17.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.508+0000 7f63bb7fe640 1 -- 192.168.123.107:0/2330305396 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f63ac009660 con 0x7f63bc106780 2026-03-09T19:24:17.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.508+0000 7f63bb7fe640 1 --2- 192.168.123.107:0/2330305396 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63bc106780 0x7f63bc1969b0 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7f63a800e9b0 tx=0x7f63a800ee80 comp rx=0 
tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:17.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.508+0000 7f63b97fa640 1 -- 192.168.123.107:0/2330305396 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f63a800cd90 con 0x7f63bc106780 2026-03-09T19:24:17.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.508+0000 7f63b97fa640 1 -- 192.168.123.107:0/2330305396 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f63a8004590 con 0x7f63bc106780 2026-03-09T19:24:17.513 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.508+0000 7f63c2577640 1 -- 192.168.123.107:0/2330305396 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f63bc19b040 con 0x7f63bc106780 2026-03-09T19:24:17.513 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.509+0000 7f63b97fa640 1 -- 192.168.123.107:0/2330305396 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f63a8010640 con 0x7f63bc106780 2026-03-09T19:24:17.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.509+0000 7f63c2577640 1 -- 192.168.123.107:0/2330305396 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f63bc19b590 con 0x7f63bc106780 2026-03-09T19:24:17.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.510+0000 7f63c2577640 1 -- 192.168.123.107:0/2330305396 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f63bc101ec0 con 0x7f63bc106780 2026-03-09T19:24:17.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.513+0000 7f63b97fa640 1 -- 192.168.123.107:0/2330305396 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f63a8010860 con 0x7f63bc106780 
2026-03-09T19:24:17.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.513+0000 7f63b97fa640 1 --2- 192.168.123.107:0/2330305396 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6384076080 0x7f6384078540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:17.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.513+0000 7f63bbfff640 1 --2- 192.168.123.107:0/2330305396 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6384076080 0x7f6384078540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:17.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.514+0000 7f63b97fa640 1 -- 192.168.123.107:0/2330305396 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f63a8098e40 con 0x7f63bc106780 2026-03-09T19:24:17.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.514+0000 7f63bbfff640 1 --2- 192.168.123.107:0/2330305396 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6384076080 0x7f6384078540 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f63ac009a00 tx=0x7f63ac0023d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:17.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.514+0000 7f63b97fa640 1 -- 192.168.123.107:0/2330305396 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f63a809d050 con 0x7f63bc106780 2026-03-09T19:24:17.617 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:17 vm07 ceph-mon[48545]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:17.617 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:17 vm07 ceph-mon[48545]: from='client.14442 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T19:24:17.617 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:17 vm07 ceph-mon[48545]: from='client.14446 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T19:24:17.617 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.618+0000 7f63c2577640 1 -- 192.168.123.107:0/2330305396 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "format": "json"} v 0) v1 -- 0x7f63bc100be0 con 0x7f63bc106780 2026-03-09T19:24:17.618 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.620+0000 7f63b97fa640 1 -- 192.168.123.107:0/2330305396 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "format": "json"}]=0 v0) v1 ==== 72+0+46 (secure 0 0 0) 0x7f63a8061a40 con 0x7f63bc106780 2026-03-09T19:24:17.618 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:17.618 INFO:teuthology.orchestra.run.vm07.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-09T19:24:17.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.622+0000 7f63c2577640 1 -- 192.168.123.107:0/2330305396 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6384076080 msgr2=0x7f6384078540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:17.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.622+0000 7f63c2577640 1 --2- 192.168.123.107:0/2330305396 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6384076080 0x7f6384078540 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f63ac009a00 tx=0x7f63ac0023d0 comp rx=0 tx=0).stop 2026-03-09T19:24:17.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.622+0000 
7f63c2577640 1 -- 192.168.123.107:0/2330305396 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63bc106780 msgr2=0x7f63bc1969b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:17.621 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.622+0000 7f63c2577640 1 --2- 192.168.123.107:0/2330305396 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63bc106780 0x7f63bc1969b0 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7f63a800e9b0 tx=0x7f63a800ee80 comp rx=0 tx=0).stop 2026-03-09T19:24:17.621 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.622+0000 7f63c2577640 1 -- 192.168.123.107:0/2330305396 shutdown_connections 2026-03-09T19:24:17.621 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.622+0000 7f63c2577640 1 --2- 192.168.123.107:0/2330305396 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6384076080 0x7f6384078540 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:17.621 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.622+0000 7f63c2577640 1 --2- 192.168.123.107:0/2330305396 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f63bc106780 0x7f63bc1969b0 unknown :-1 s=CLOSED pgs=240 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:17.621 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.622+0000 7f63c2577640 1 --2- 192.168.123.107:0/2330305396 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f63bc100780 0x7f63bc196470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:17.621 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.622+0000 7f63c2577640 1 -- 192.168.123.107:0/2330305396 >> 192.168.123.107:0/2330305396 conn(0x7f63bc0fc460 msgr2=0x7f63bc10a720 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:17.621 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.623+0000 7f63c2577640 1 -- 192.168.123.107:0/2330305396 shutdown_connections 2026-03-09T19:24:17.621 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:17.623+0000 7f63c2577640 1 -- 192.168.123.107:0/2330305396 wait complete. 2026-03-09T19:24:17.813 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy done 2026-03-09T19:24:17.813 INFO:tasks.cephadm:Setup complete, yielding 2026-03-09T19:24:17.813 INFO:teuthology.run_tasks:Running task print... 2026-03-09T19:24:17.815 INFO:teuthology.task.print:**** done end installing reef cephadm ... 2026-03-09T19:24:17.815 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T19:24:17.818 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm07.local 2026-03-09T19:24:17.818 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph config set mgr mgr/cephadm/use_repo_digest true --force' 2026-03-09T19:24:17.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:17 vm08 ceph-mon[57794]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:17.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:17 vm08 ceph-mon[57794]: from='client.14442 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T19:24:17.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:17 vm08 ceph-mon[57794]: from='client.14446 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T19:24:17.971 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:18.219 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.220+0000 7fe553f17640 1 -- 192.168.123.107:0/46233118 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe54c102a80 msgr2=0x7fe54c102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:18.219 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.220+0000 7fe553f17640 1 --2- 192.168.123.107:0/46233118 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe54c102a80 0x7fe54c102e80 secure :-1 s=READY pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7fe53c0099b0 tx=0x7fe53c02f220 comp rx=0 tx=0).stop 2026-03-09T19:24:18.219 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.221+0000 7fe553f17640 1 -- 192.168.123.107:0/46233118 shutdown_connections 2026-03-09T19:24:18.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.221+0000 7fe553f17640 1 --2- 192.168.123.107:0/46233118 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe54c103c80 0x7fe54c104100 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:18.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.221+0000 7fe553f17640 1 --2- 192.168.123.107:0/46233118 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe54c102a80 0x7fe54c102e80 unknown :-1 s=CLOSED pgs=241 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:18.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.221+0000 7fe553f17640 1 -- 192.168.123.107:0/46233118 >> 192.168.123.107:0/46233118 conn(0x7fe54c0fe250 msgr2=0x7fe54c100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:18.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.222+0000 7fe553f17640 1 -- 192.168.123.107:0/46233118 shutdown_connections 2026-03-09T19:24:18.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.222+0000 7fe553f17640 1 -- 192.168.123.107:0/46233118 wait complete. 
2026-03-09T19:24:18.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.222+0000 7fe553f17640 1 Processor -- start 2026-03-09T19:24:18.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.223+0000 7fe553f17640 1 -- start start 2026-03-09T19:24:18.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.223+0000 7fe553f17640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe54c102a80 0x7fe54c19a500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:18.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.223+0000 7fe553f17640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe54c103c80 0x7fe54c19aa40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:18.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.223+0000 7fe551c8c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe54c102a80 0x7fe54c19a500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:18.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.223+0000 7fe553f17640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe54c19b010 con 0x7fe54c103c80 2026-03-09T19:24:18.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.224+0000 7fe553f17640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe54c19b180 con 0x7fe54c102a80 2026-03-09T19:24:18.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.224+0000 7fe551c8c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe54c102a80 0x7fe54c19a500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.107:40668/0 (socket says 192.168.123.107:40668) 2026-03-09T19:24:18.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.224+0000 7fe551c8c640 1 -- 192.168.123.107:0/953475188 learned_addr learned my addr 192.168.123.107:0/953475188 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:18.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.223+0000 7fe55148b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe54c103c80 0x7fe54c19aa40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:18.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.224+0000 7fe55148b640 1 -- 192.168.123.107:0/953475188 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe54c102a80 msgr2=0x7fe54c19a500 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:18.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.224+0000 7fe55148b640 1 --2- 192.168.123.107:0/953475188 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe54c102a80 0x7fe54c19a500 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:18.222 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.224+0000 7fe55148b640 1 -- 192.168.123.107:0/953475188 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe53c009660 con 0x7fe54c103c80 2026-03-09T19:24:18.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.225+0000 7fe55148b640 1 --2- 192.168.123.107:0/953475188 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe54c103c80 0x7fe54c19aa40 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7fe53400b500 tx=0x7fe53400b9d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:18.223 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.225+0000 7fe542ffd640 1 -- 192.168.123.107:0/953475188 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe534004280 con 0x7fe54c103c80 2026-03-09T19:24:18.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.225+0000 7fe542ffd640 1 -- 192.168.123.107:0/953475188 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe5340043e0 con 0x7fe54c103c80 2026-03-09T19:24:18.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.225+0000 7fe553f17640 1 -- 192.168.123.107:0/953475188 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe54c19fc20 con 0x7fe54c103c80 2026-03-09T19:24:18.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.226+0000 7fe542ffd640 1 -- 192.168.123.107:0/953475188 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe53400fa60 con 0x7fe54c103c80 2026-03-09T19:24:18.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.226+0000 7fe553f17640 1 -- 192.168.123.107:0/953475188 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe54c1a01f0 con 0x7fe54c103c80 2026-03-09T19:24:18.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.227+0000 7fe542ffd640 1 -- 192.168.123.107:0/953475188 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fe53400fc80 con 0x7fe54c103c80 2026-03-09T19:24:18.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.227+0000 7fe542ffd640 1 --2- 192.168.123.107:0/953475188 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe528075fb0 0x7fe528078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:18.226 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.228+0000 7fe553f17640 1 -- 192.168.123.107:0/953475188 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe54c102e80 con 0x7fe54c103c80 2026-03-09T19:24:18.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.228+0000 7fe542ffd640 1 -- 192.168.123.107:0/953475188 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fe53405f5d0 con 0x7fe54c103c80 2026-03-09T19:24:18.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.228+0000 7fe551c8c640 1 --2- 192.168.123.107:0/953475188 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe528075fb0 0x7fe528078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:18.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.228+0000 7fe551c8c640 1 --2- 192.168.123.107:0/953475188 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe528075fb0 0x7fe528078470 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fe53c002b60 tx=0x7fe53c03a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:18.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.232+0000 7fe542ffd640 1 -- 192.168.123.107:0/953475188 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fe53409b050 con 0x7fe54c103c80 2026-03-09T19:24:18.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.321+0000 7fe553f17640 1 -- 192.168.123.107:0/953475188 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0) v1 -- 0x7fe54c10b730 con 
0x7fe54c103c80 2026-03-09T19:24:18.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.328+0000 7fe542ffd640 1 -- 192.168.123.107:0/953475188 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/use_repo_digest}]=0 v17) v1 ==== 143+0+0 (secure 0 0 0) 0x7fe53405f890 con 0x7fe54c103c80 2026-03-09T19:24:18.333 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.335+0000 7fe553f17640 1 -- 192.168.123.107:0/953475188 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe528075fb0 msgr2=0x7fe528078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:18.333 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.335+0000 7fe553f17640 1 --2- 192.168.123.107:0/953475188 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe528075fb0 0x7fe528078470 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fe53c002b60 tx=0x7fe53c03a040 comp rx=0 tx=0).stop 2026-03-09T19:24:18.333 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.335+0000 7fe553f17640 1 -- 192.168.123.107:0/953475188 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe54c103c80 msgr2=0x7fe54c19aa40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:18.333 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.335+0000 7fe553f17640 1 --2- 192.168.123.107:0/953475188 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe54c103c80 0x7fe54c19aa40 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7fe53400b500 tx=0x7fe53400b9d0 comp rx=0 tx=0).stop 2026-03-09T19:24:18.333 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.336+0000 7fe553f17640 1 -- 192.168.123.107:0/953475188 shutdown_connections 2026-03-09T19:24:18.333 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.336+0000 7fe553f17640 1 --2- 192.168.123.107:0/953475188 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fe528075fb0 0x7fe528078470 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:18.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.336+0000 7fe553f17640 1 --2- 192.168.123.107:0/953475188 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe54c103c80 0x7fe54c19aa40 unknown :-1 s=CLOSED pgs=242 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:18.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.336+0000 7fe553f17640 1 --2- 192.168.123.107:0/953475188 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe54c102a80 0x7fe54c19a500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:18.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.336+0000 7fe553f17640 1 -- 192.168.123.107:0/953475188 >> 192.168.123.107:0/953475188 conn(0x7fe54c0fe250 msgr2=0x7fe54c0ffd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:18.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.336+0000 7fe553f17640 1 -- 192.168.123.107:0/953475188 shutdown_connections 2026-03-09T19:24:18.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.336+0000 7fe553f17640 1 -- 192.168.123.107:0/953475188 wait complete. 2026-03-09T19:24:18.401 INFO:teuthology.run_tasks:Running task print... 2026-03-09T19:24:18.403 INFO:teuthology.task.print:**** done cephadm.shell ceph config set mgr... 2026-03-09T19:24:18.403 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-09T19:24:18.405 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm07.local 2026-03-09T19:24:18.405 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph orch status' 2026-03-09T19:24:18.569 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:18.599 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:18 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/2330305396' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-09T19:24:18.599 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:18 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/953475188' entity='client.admin' 2026-03-09T19:24:18.599 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:18 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:18.599 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:18 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:18.599 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:18 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:18.599 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:18 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:18.600 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:18 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": 
"osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:24:18.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:18 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/2330305396' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-09T19:24:18.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:18 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/953475188' entity='client.admin' 2026-03-09T19:24:18.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:18 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:18.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:18 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:18.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:18 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:18.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:18 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:18.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:18 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:24:18.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.947+0000 7ff3ae4e2640 1 -- 192.168.123.107:0/966301259 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3a80feba0 msgr2=0x7ff3a8105e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:18.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.947+0000 7ff3ae4e2640 1 --2- 
192.168.123.107:0/966301259 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3a80feba0 0x7ff3a8105e80 secure :-1 s=READY pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7ff3980099b0 tx=0x7ff39802f240 comp rx=0 tx=0).stop 2026-03-09T19:24:18.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.949+0000 7ff3ae4e2640 1 -- 192.168.123.107:0/966301259 shutdown_connections 2026-03-09T19:24:18.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.949+0000 7ff3ae4e2640 1 --2- 192.168.123.107:0/966301259 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3a80feba0 0x7ff3a8105e80 unknown :-1 s=CLOSED pgs=243 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:18.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.949+0000 7ff3ae4e2640 1 --2- 192.168.123.107:0/966301259 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3a80fe1d0 0x7ff3a80fe5d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:18.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.949+0000 7ff3ae4e2640 1 -- 192.168.123.107:0/966301259 >> 192.168.123.107:0/966301259 conn(0x7ff3a80f9f80 msgr2=0x7ff3a80fc3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:18.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.949+0000 7ff3ae4e2640 1 -- 192.168.123.107:0/966301259 shutdown_connections 2026-03-09T19:24:18.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.949+0000 7ff3ae4e2640 1 -- 192.168.123.107:0/966301259 wait complete. 
2026-03-09T19:24:18.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.949+0000 7ff3ae4e2640 1 Processor -- start 2026-03-09T19:24:18.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3ae4e2640 1 -- start start 2026-03-09T19:24:18.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3ae4e2640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3a80fe1d0 0x7ff3a8199350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:18.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3ae4e2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3a80feba0 0x7ff3a8195e00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3ae4e2640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3a8199920 con 0x7ff3a80feba0 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3ae4e2640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3a8196340 con 0x7ff3a80fe1d0 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3accdf640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3a80feba0 0x7ff3a8195e00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3accdf640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3a80feba0 0x7ff3a8195e00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:52750/0 (socket says 192.168.123.107:52750) 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3accdf640 1 -- 192.168.123.107:0/1739636600 learned_addr learned my addr 192.168.123.107:0/1739636600 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3accdf640 1 -- 192.168.123.107:0/1739636600 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3a80fe1d0 msgr2=0x7ff3a8199350 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3accdf640 1 --2- 192.168.123.107:0/1739636600 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3a80fe1d0 0x7ff3a8199350 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3accdf640 1 -- 192.168.123.107:0/1739636600 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff398009660 con 0x7ff3a80feba0 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3accdf640 1 --2- 192.168.123.107:0/1739636600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3a80feba0 0x7ff3a8195e00 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7ff3980029e0 tx=0x7ff398002a10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3967fc640 1 -- 192.168.123.107:0/1739636600 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff39803d070 con 0x7ff3a80feba0 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.950+0000 7ff3967fc640 1 -- 
192.168.123.107:0/1739636600 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff398031dd0 con 0x7ff3a80feba0 2026-03-09T19:24:18.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.951+0000 7ff3967fc640 1 -- 192.168.123.107:0/1739636600 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff39802fc50 con 0x7ff3a80feba0 2026-03-09T19:24:18.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.951+0000 7ff3ae4e2640 1 -- 192.168.123.107:0/1739636600 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff3a8196540 con 0x7ff3a80feba0 2026-03-09T19:24:18.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.952+0000 7ff3ae4e2640 1 -- 192.168.123.107:0/1739636600 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff3a8196a30 con 0x7ff3a80feba0 2026-03-09T19:24:18.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.953+0000 7ff377fff640 1 -- 192.168.123.107:0/1739636600 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff36c005350 con 0x7ff3a80feba0 2026-03-09T19:24:18.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.953+0000 7ff3967fc640 1 -- 192.168.123.107:0/1739636600 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7ff398031080 con 0x7ff3a80feba0 2026-03-09T19:24:18.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.953+0000 7ff3967fc640 1 --2- 192.168.123.107:0/1739636600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff384076170 0x7ff384078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:18.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.953+0000 7ff3967fc640 1 -- 
192.168.123.107:0/1739636600 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7ff398085080 con 0x7ff3a80feba0 2026-03-09T19:24:18.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.956+0000 7ff3ad4e0640 1 --2- 192.168.123.107:0/1739636600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff384076170 0x7ff384078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:18.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.956+0000 7ff3ad4e0640 1 --2- 192.168.123.107:0/1739636600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff384076170 0x7ff384078630 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7ff39c009770 tx=0x7ff39c006cd0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:18.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:18.956+0000 7ff3967fc640 1 -- 192.168.123.107:0/1739636600 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff3980c1050 con 0x7ff3a80feba0 2026-03-09T19:24:19.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.052+0000 7ff377fff640 1 -- 192.168.123.107:0/1739636600 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff36c002bf0 con 0x7ff384076170 2026-03-09T19:24:19.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.052+0000 7ff3967fc640 1 -- 192.168.123.107:0/1739636600 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+43 (secure 0 0 0) 0x7ff36c002bf0 con 0x7ff384076170 2026-03-09T19:24:19.051 
INFO:teuthology.orchestra.run.vm07.stdout:Backend: cephadm 2026-03-09T19:24:19.051 INFO:teuthology.orchestra.run.vm07.stdout:Available: Yes 2026-03-09T19:24:19.051 INFO:teuthology.orchestra.run.vm07.stdout:Paused: No 2026-03-09T19:24:19.053 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.055+0000 7ff377fff640 1 -- 192.168.123.107:0/1739636600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff384076170 msgr2=0x7ff384078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:19.053 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.055+0000 7ff377fff640 1 --2- 192.168.123.107:0/1739636600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff384076170 0x7ff384078630 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7ff39c009770 tx=0x7ff39c006cd0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.053 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.055+0000 7ff377fff640 1 -- 192.168.123.107:0/1739636600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3a80feba0 msgr2=0x7ff3a8195e00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:19.053 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.055+0000 7ff377fff640 1 --2- 192.168.123.107:0/1739636600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3a80feba0 0x7ff3a8195e00 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7ff3980029e0 tx=0x7ff398002a10 comp rx=0 tx=0).stop 2026-03-09T19:24:19.053 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.055+0000 7ff377fff640 1 -- 192.168.123.107:0/1739636600 shutdown_connections 2026-03-09T19:24:19.053 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.055+0000 7ff377fff640 1 --2- 192.168.123.107:0/1739636600 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff384076170 0x7ff384078630 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.053 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.055+0000 7ff377fff640 1 --2- 192.168.123.107:0/1739636600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3a80feba0 0x7ff3a8195e00 unknown :-1 s=CLOSED pgs=244 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.053 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.055+0000 7ff377fff640 1 --2- 192.168.123.107:0/1739636600 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3a80fe1d0 0x7ff3a8199350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.053 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.055+0000 7ff377fff640 1 -- 192.168.123.107:0/1739636600 >> 192.168.123.107:0/1739636600 conn(0x7ff3a80f9f80 msgr2=0x7ff3a8101da0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:19.053 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.055+0000 7ff377fff640 1 -- 192.168.123.107:0/1739636600 shutdown_connections 2026-03-09T19:24:19.053 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.055+0000 7ff377fff640 1 -- 192.168.123.107:0/1739636600 wait complete. 
2026-03-09T19:24:19.114 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph orch ps' 2026-03-09T19:24:19.260 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:19.552 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:19 vm07 ceph-mon[48545]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:19.552 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:19 vm07 ceph-mon[48545]: from='client.14458 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:19.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.677+0000 7ff974f1f640 1 -- 192.168.123.107:0/3398054508 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff970103c60 msgr2=0x7ff9701040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:19.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.677+0000 7ff974f1f640 1 --2- 192.168.123.107:0/3398054508 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff970103c60 0x7ff9701040e0 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7ff9640099b0 tx=0x7ff96402f220 comp rx=0 tx=0).stop 2026-03-09T19:24:19.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.678+0000 7ff974f1f640 1 -- 192.168.123.107:0/3398054508 shutdown_connections 2026-03-09T19:24:19.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.678+0000 7ff974f1f640 1 --2- 192.168.123.107:0/3398054508 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff970103c60 0x7ff9701040e0 unknown :-1 s=CLOSED pgs=245 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.676 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.678+0000 7ff974f1f640 1 --2- 192.168.123.107:0/3398054508 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff970102a60 0x7ff970102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.678+0000 7ff974f1f640 1 -- 192.168.123.107:0/3398054508 >> 192.168.123.107:0/3398054508 conn(0x7ff9700fe250 msgr2=0x7ff970100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:19.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.678+0000 7ff974f1f640 1 -- 192.168.123.107:0/3398054508 shutdown_connections 2026-03-09T19:24:19.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.678+0000 7ff974f1f640 1 -- 192.168.123.107:0/3398054508 wait complete. 2026-03-09T19:24:19.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.678+0000 7ff974f1f640 1 Processor -- start 2026-03-09T19:24:19.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.678+0000 7ff974f1f640 1 -- start start 2026-03-09T19:24:19.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.679+0000 7ff974f1f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff970102a60 0x7ff97019a4d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:19.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.679+0000 7ff96e575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff970102a60 0x7ff97019a4d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:19.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.679+0000 7ff96e575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff970102a60 0x7ff97019a4d0 unknown :-1 s=HELLO_CONNECTING pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:52768/0 (socket says 192.168.123.107:52768) 2026-03-09T19:24:19.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.679+0000 7ff974f1f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff970103c60 0x7ff97019aa10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:19.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.679+0000 7ff974f1f640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff97019afe0 con 0x7ff970102a60 2026-03-09T19:24:19.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.679+0000 7ff974f1f640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff97019b150 con 0x7ff970103c60 2026-03-09T19:24:19.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.679+0000 7ff96e575640 1 -- 192.168.123.107:0/2333904883 learned_addr learned my addr 192.168.123.107:0/2333904883 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:19.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.679+0000 7ff96e575640 1 -- 192.168.123.107:0/2333904883 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff970103c60 msgr2=0x7ff97019aa10 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T19:24:19.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.679+0000 7ff96e575640 1 --2- 192.168.123.107:0/2333904883 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff970103c60 0x7ff97019aa10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.679+0000 7ff96e575640 1 -- 192.168.123.107:0/2333904883 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff964009660 con 0x7ff970102a60 2026-03-09T19:24:19.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.680+0000 7ff96e575640 1 --2- 192.168.123.107:0/2333904883 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff970102a60 0x7ff97019a4d0 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7ff95800b520 tx=0x7ff95800b9f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:19.678 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.680+0000 7ff9577fe640 1 -- 192.168.123.107:0/2333904883 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff958004430 con 0x7ff970102a60 2026-03-09T19:24:19.678 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.680+0000 7ff9577fe640 1 -- 192.168.123.107:0/2333904883 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff958004590 con 0x7ff970102a60 2026-03-09T19:24:19.678 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.680+0000 7ff974f1f640 1 -- 192.168.123.107:0/2333904883 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff97019fbf0 con 0x7ff970102a60 2026-03-09T19:24:19.678 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.680+0000 7ff9577fe640 1 -- 192.168.123.107:0/2333904883 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff95800fcd0 con 0x7ff970102a60 2026-03-09T19:24:19.679 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.681+0000 7ff974f1f640 1 -- 192.168.123.107:0/2333904883 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff9701a01c0 con 0x7ff970102a60 2026-03-09T19:24:19.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.682+0000 7ff9577fe640 1 -- 192.168.123.107:0/2333904883 <== mon.0 
v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7ff95800d040 con 0x7ff970102a60 2026-03-09T19:24:19.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.682+0000 7ff974f1f640 1 -- 192.168.123.107:0/2333904883 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff97010b730 con 0x7ff970102a60 2026-03-09T19:24:19.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.682+0000 7ff9577fe640 1 --2- 192.168.123.107:0/2333904883 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff94c076170 0x7ff94c078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:19.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.682+0000 7ff96dd74640 1 --2- 192.168.123.107:0/2333904883 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff94c076170 0x7ff94c078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:19.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.683+0000 7ff9577fe640 1 -- 192.168.123.107:0/2333904883 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7ff958096950 con 0x7ff970102a60 2026-03-09T19:24:19.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.683+0000 7ff96dd74640 1 --2- 192.168.123.107:0/2333904883 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff94c076170 0x7ff94c078630 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7ff964002af0 tx=0x7ff9640023d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:19.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.685+0000 7ff9577fe640 1 -- 192.168.123.107:0/2333904883 <== 
mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff9580602c0 con 0x7ff970102a60 2026-03-09T19:24:19.776 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.777+0000 7ff974f1f640 1 -- 192.168.123.107:0/2333904883 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7ff970107ea0 con 0x7ff94c076170 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.783+0000 7ff9577fe640 1 -- 192.168.123.107:0/2333904883 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+2940 (secure 0 0 0) 0x7ff970107ea0 con 0x7ff94c076170 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (73s) 42s ago 116s 22.5M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (2m) 42s ago 2m 8144k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (86s) 16s ago 86s 8342k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (2m) 42s ago 2m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2799ea3e4bf3 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (86s) 16s ago 85s 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 81fc95c210b6 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (72s) 42s ago 101s 78.6M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:24:19.781 
INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:9283,8765,8443 running (2m) 42s ago 2m 529M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 706e626ecd10 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (82s) 16s ago 82s 485M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 880604c16b45 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (2m) 42s ago 2m 47.1M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ccb644205fb3 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (81s) 16s ago 81s 47.4M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8d7b1da9e1e2 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (119s) 42s ago 119s 13.5M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (83s) 16s ago 83s 13.0M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (63s) 42s ago 63s 39.9M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d7417e3377af 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (53s) 42s ago 53s 61.7M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (43s) 42s ago 43s 13.8M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (34s) 16s ago 34s 41.3M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (26s) 16s ago 26s 65.3M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (17s) 16s ago 17s 13.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 
2026-03-09T19:24:19.781 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (70s) 42s ago 95s 30.1M - 2.43.0 a07b618ecd1d 238baaac36ff 2026-03-09T19:24:19.783 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.785+0000 7ff974f1f640 1 -- 192.168.123.107:0/2333904883 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff94c076170 msgr2=0x7ff94c078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:19.783 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.786+0000 7ff974f1f640 1 --2- 192.168.123.107:0/2333904883 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff94c076170 0x7ff94c078630 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7ff964002af0 tx=0x7ff9640023d0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.786+0000 7ff974f1f640 1 -- 192.168.123.107:0/2333904883 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff970102a60 msgr2=0x7ff97019a4d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:19.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.786+0000 7ff974f1f640 1 --2- 192.168.123.107:0/2333904883 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff970102a60 0x7ff97019a4d0 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7ff95800b520 tx=0x7ff95800b9f0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.786+0000 7ff974f1f640 1 -- 192.168.123.107:0/2333904883 shutdown_connections 2026-03-09T19:24:19.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.786+0000 7ff974f1f640 1 --2- 192.168.123.107:0/2333904883 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff94c076170 0x7ff94c078630 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.784 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.786+0000 7ff974f1f640 1 --2- 192.168.123.107:0/2333904883 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff970103c60 0x7ff97019aa10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.786+0000 7ff974f1f640 1 --2- 192.168.123.107:0/2333904883 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff970102a60 0x7ff97019a4d0 unknown :-1 s=CLOSED pgs=246 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:19.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.786+0000 7ff974f1f640 1 -- 192.168.123.107:0/2333904883 >> 192.168.123.107:0/2333904883 conn(0x7ff9700fe250 msgr2=0x7ff9700ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:19.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.786+0000 7ff974f1f640 1 -- 192.168.123.107:0/2333904883 shutdown_connections 2026-03-09T19:24:19.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:19.786+0000 7ff974f1f640 1 -- 192.168.123.107:0/2333904883 wait complete. 
2026-03-09T19:24:19.844 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph orch ls' 2026-03-09T19:24:19.997 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:19 vm08 ceph-mon[57794]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:19 vm08 ceph-mon[57794]: from='client.14458 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:20.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.233+0000 7f267d035640 1 -- 192.168.123.107:0/57294244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2678073b40 msgr2=0x7f2678073fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:20.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.233+0000 7f267d035640 1 --2- 192.168.123.107:0/57294244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2678073b40 0x7f2678073fa0 secure :-1 s=READY pgs=247 cs=0 l=1 rev1=1 crypto rx=0x7f26640099b0 tx=0x7f266402f220 comp rx=0 tx=0).stop 2026-03-09T19:24:20.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.233+0000 7f267d035640 1 -- 192.168.123.107:0/57294244 shutdown_connections 2026-03-09T19:24:20.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.233+0000 7f267d035640 1 --2- 192.168.123.107:0/57294244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2678073b40 0x7f2678073fa0 unknown :-1 s=CLOSED pgs=247 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:20.232 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.233+0000 7f267d035640 1 --2- 192.168.123.107:0/57294244 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26780751a0 0x7f2678073600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:20.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.233+0000 7f267d035640 1 -- 192.168.123.107:0/57294244 >> 192.168.123.107:0/57294244 conn(0x7f26780fbfb0 msgr2=0x7f26780fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:20.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.234+0000 7f267d035640 1 -- 192.168.123.107:0/57294244 shutdown_connections 2026-03-09T19:24:20.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.234+0000 7f267d035640 1 -- 192.168.123.107:0/57294244 wait complete. 2026-03-09T19:24:20.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.234+0000 7f267d035640 1 Processor -- start 2026-03-09T19:24:20.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.234+0000 7f267d035640 1 -- start start 2026-03-09T19:24:20.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.235+0000 7f267d035640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2678073b40 0x7f267819e900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:20.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.235+0000 7f267d035640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26780751a0 0x7f267819ee40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:20.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.235+0000 7f267d035640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f267819f410 con 0x7f26780751a0 2026-03-09T19:24:20.233 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.235+0000 7f267d035640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f267819f580 con 0x7f2678073b40 2026-03-09T19:24:20.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.235+0000 7f2676575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26780751a0 0x7f267819ee40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:20.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.235+0000 7f2676575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26780751a0 0x7f267819ee40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:52792/0 (socket says 192.168.123.107:52792) 2026-03-09T19:24:20.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.235+0000 7f2676575640 1 -- 192.168.123.107:0/4077846671 learned_addr learned my addr 192.168.123.107:0/4077846671 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:20.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.235+0000 7f2676575640 1 -- 192.168.123.107:0/4077846671 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2678073b40 msgr2=0x7f267819e900 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:20.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.235+0000 7f2676575640 1 --2- 192.168.123.107:0/4077846671 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2678073b40 0x7f267819e900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:20.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.235+0000 7f2676575640 1 -- 192.168.123.107:0/4077846671 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2664009660 con 0x7f26780751a0 2026-03-09T19:24:20.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.235+0000 7f2676575640 1 --2- 192.168.123.107:0/4077846671 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26780751a0 0x7f267819ee40 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7f266402f730 tx=0x7f2664002940 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:20.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.236+0000 7f2653fff640 1 -- 192.168.123.107:0/4077846671 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f266403d070 con 0x7f26780751a0 2026-03-09T19:24:20.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.236+0000 7f2653fff640 1 -- 192.168.123.107:0/4077846671 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2664038730 con 0x7f26780751a0 2026-03-09T19:24:20.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.236+0000 7f2653fff640 1 -- 192.168.123.107:0/4077846671 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2664041760 con 0x7f26780751a0 2026-03-09T19:24:20.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.236+0000 7f267d035640 1 -- 192.168.123.107:0/4077846671 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f26781a3fc0 con 0x7f26780751a0 2026-03-09T19:24:20.235 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.237+0000 7f267d035640 1 -- 192.168.123.107:0/4077846671 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f26781a4320 con 0x7f26780751a0 2026-03-09T19:24:20.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.238+0000 7f2653fff640 1 -- 
192.168.123.107:0/4077846671 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f26640388a0 con 0x7f26780751a0 2026-03-09T19:24:20.237 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.239+0000 7f267d035640 1 -- 192.168.123.107:0/4077846671 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f263c005350 con 0x7f26780751a0 2026-03-09T19:24:20.237 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.239+0000 7f2653fff640 1 --2- 192.168.123.107:0/4077846671 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f264c075f60 0x7f264c078420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:20.237 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.239+0000 7f2653fff640 1 -- 192.168.123.107:0/4077846671 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f26640be4e0 con 0x7f26780751a0 2026-03-09T19:24:20.237 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.239+0000 7f2676d76640 1 --2- 192.168.123.107:0/4077846671 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f264c075f60 0x7f264c078420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:20.240 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.242+0000 7f2676d76640 1 --2- 192.168.123.107:0/4077846671 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f264c075f60 0x7f264c078420 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f266c0059c0 tx=0x7f266c005950 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:20.240 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.242+0000 7f2653fff640 1 
-- 192.168.123.107:0/4077846671 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f2664084e20 con 0x7f26780751a0 2026-03-09T19:24:20.337 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.339+0000 7f267d035640 1 -- 192.168.123.107:0/4077846671 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f263c002bf0 con 0x7f264c075f60 2026-03-09T19:24:20.342 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.344+0000 7f2653fff640 1 -- 192.168.123.107:0/4077846671 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1150 (secure 0 0 0) 0x7f263c002bf0 con 0x7f264c075f60 2026-03-09T19:24:20.342 INFO:teuthology.orchestra.run.vm07.stdout:NAME PORTS RUNNING REFRESHED AGE PLACEMENT 2026-03-09T19:24:20.342 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager ?:9093,9094 1/1 42s ago 2m count:1 2026-03-09T19:24:20.342 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter 2/2 42s ago 2m * 2026-03-09T19:24:20.342 INFO:teuthology.orchestra.run.vm07.stdout:crash 2/2 42s ago 2m * 2026-03-09T19:24:20.342 INFO:teuthology.orchestra.run.vm07.stdout:grafana ?:3000 1/1 42s ago 2m count:1 2026-03-09T19:24:20.342 INFO:teuthology.orchestra.run.vm07.stdout:mgr 2/2 42s ago 2m count:2 2026-03-09T19:24:20.342 INFO:teuthology.orchestra.run.vm07.stdout:mon 2/2 42s ago 2m vm07:192.168.123.107=vm07;vm08:192.168.123.108=vm08;count:2 2026-03-09T19:24:20.342 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter ?:9100 2/2 42s ago 2m * 2026-03-09T19:24:20.342 INFO:teuthology.orchestra.run.vm07.stdout:osd 6 42s ago - 2026-03-09T19:24:20.342 INFO:teuthology.orchestra.run.vm07.stdout:prometheus ?:9095 1/1 42s ago 2m count:1 2026-03-09T19:24:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.345+0000 7f267d035640 1 -- 
192.168.123.107:0/4077846671 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f264c075f60 msgr2=0x7f264c078420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.345+0000 7f267d035640 1 --2- 192.168.123.107:0/4077846671 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f264c075f60 0x7f264c078420 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f266c0059c0 tx=0x7f266c005950 comp rx=0 tx=0).stop 2026-03-09T19:24:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.345+0000 7f267d035640 1 -- 192.168.123.107:0/4077846671 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26780751a0 msgr2=0x7f267819ee40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.345+0000 7f267d035640 1 --2- 192.168.123.107:0/4077846671 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26780751a0 0x7f267819ee40 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7f266402f730 tx=0x7f2664002940 comp rx=0 tx=0).stop 2026-03-09T19:24:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.345+0000 7f267d035640 1 -- 192.168.123.107:0/4077846671 shutdown_connections 2026-03-09T19:24:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.345+0000 7f267d035640 1 --2- 192.168.123.107:0/4077846671 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f264c075f60 0x7f264c078420 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.345+0000 7f267d035640 1 --2- 192.168.123.107:0/4077846671 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26780751a0 0x7f267819ee40 unknown :-1 s=CLOSED pgs=248 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-09T19:24:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.345+0000 7f267d035640 1 --2- 192.168.123.107:0/4077846671 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2678073b40 0x7f267819e900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.345+0000 7f267d035640 1 -- 192.168.123.107:0/4077846671 >> 192.168.123.107:0/4077846671 conn(0x7f26780fbfb0 msgr2=0x7f26780fdc90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:20.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.346+0000 7f267d035640 1 -- 192.168.123.107:0/4077846671 shutdown_connections 2026-03-09T19:24:20.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.346+0000 7f267d035640 1 -- 192.168.123.107:0/4077846671 wait complete. 2026-03-09T19:24:20.398 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph orch host ls' 2026-03-09T19:24:20.550 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:20.592 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:20 vm07 ceph-mon[48545]: from='client.14462 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:20.780 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.782+0000 7f7cc5640640 1 -- 192.168.123.107:0/4112457955 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7cc0101a00 msgr2=0x7f7cc0101e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:20.780 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.782+0000 7f7cc5640640 1 --2- 192.168.123.107:0/4112457955 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7cc0101a00 0x7f7cc0101e80 secure :-1 s=READY pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7f7ca80099b0 tx=0x7f7ca802f220 comp rx=0 tx=0).stop 2026-03-09T19:24:20.780 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.782+0000 7f7cc5640640 1 -- 192.168.123.107:0/4112457955 shutdown_connections 2026-03-09T19:24:20.780 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.782+0000 7f7cc5640640 1 --2- 192.168.123.107:0/4112457955 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7cc0101a00 0x7f7cc0101e80 unknown :-1 s=CLOSED pgs=249 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:20.780 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.782+0000 7f7cc5640640 1 --2- 192.168.123.107:0/4112457955 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cc0100800 0x7f7cc0100c00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:20.781 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.782+0000 7f7cc5640640 1 -- 192.168.123.107:0/4112457955 >> 192.168.123.107:0/4112457955 conn(0x7f7cc00fbfb0 msgr2=0x7f7cc00fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:20.781 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.783+0000 7f7cc5640640 1 -- 192.168.123.107:0/4112457955 shutdown_connections 2026-03-09T19:24:20.781 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.783+0000 7f7cc5640640 1 -- 192.168.123.107:0/4112457955 wait complete. 
2026-03-09T19:24:20.781 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.783+0000 7f7cc5640640 1 Processor -- start 2026-03-09T19:24:20.781 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.783+0000 7f7cc5640640 1 -- start start 2026-03-09T19:24:20.781 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.784+0000 7f7cc5640640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cc0100800 0x7f7cc01981b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:20.782 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.784+0000 7f7cc5640640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7cc0101a00 0x7f7cc01986f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:20.782 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.784+0000 7f7cc5640640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7cc0198cc0 con 0x7f7cc0101a00 2026-03-09T19:24:20.782 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.784+0000 7f7cc5640640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7cc0198e30 con 0x7f7cc0100800 2026-03-09T19:24:20.782 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.784+0000 7f7cbe7fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7cc0101a00 0x7f7cc01986f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:20.782 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.784+0000 7f7cbe7fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7cc0101a00 0x7f7cc01986f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:52818/0 (socket says 192.168.123.107:52818) 2026-03-09T19:24:20.782 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.784+0000 7f7cbe7fc640 1 -- 192.168.123.107:0/852359943 learned_addr learned my addr 192.168.123.107:0/852359943 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:20.782 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.784+0000 7f7cbeffd640 1 --2- 192.168.123.107:0/852359943 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cc0100800 0x7f7cc01981b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:20.782 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.784+0000 7f7cbe7fc640 1 -- 192.168.123.107:0/852359943 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cc0100800 msgr2=0x7f7cc01981b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:20.782 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.784+0000 7f7cbe7fc640 1 --2- 192.168.123.107:0/852359943 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cc0100800 0x7f7cc01981b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:20.782 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.784+0000 7f7cbe7fc640 1 -- 192.168.123.107:0/852359943 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7ca8009660 con 0x7f7cc0101a00 2026-03-09T19:24:20.783 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.785+0000 7f7cbe7fc640 1 --2- 192.168.123.107:0/852359943 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7cc0101a00 0x7f7cc01986f0 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7f7ca80029e0 tx=0x7f7ca8002a10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:24:20.783 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.785+0000 7f7c9ffff640 1 -- 192.168.123.107:0/852359943 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7ca803d070 con 0x7f7cc0101a00 2026-03-09T19:24:20.783 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.785+0000 7f7c9ffff640 1 -- 192.168.123.107:0/852359943 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f7ca8031e20 con 0x7f7cc0101a00 2026-03-09T19:24:20.783 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.785+0000 7f7c9ffff640 1 -- 192.168.123.107:0/852359943 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7ca8038680 con 0x7f7cc0101a00 2026-03-09T19:24:20.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.785+0000 7f7cbeffd640 1 --2- 192.168.123.107:0/852359943 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cc0100800 0x7f7cc01981b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:24:20.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.785+0000 7f7cc5640640 1 -- 192.168.123.107:0/852359943 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7cc00733a0 con 0x7f7cc0101a00 2026-03-09T19:24:20.784 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.785+0000 7f7cc5640640 1 -- 192.168.123.107:0/852359943 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7cc0073890 con 0x7f7cc0101a00 2026-03-09T19:24:20.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.786+0000 7f7cc5640640 1 -- 192.168.123.107:0/852359943 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7c8c005350 con 0x7f7cc0101a00 2026-03-09T19:24:20.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.788+0000 7f7c9ffff640 1 -- 192.168.123.107:0/852359943 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f7ca8031320 con 0x7f7cc0101a00 2026-03-09T19:24:20.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.788+0000 7f7c9ffff640 1 --2- 192.168.123.107:0/852359943 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7c980761c0 0x7f7c98078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:20.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.788+0000 7f7c9ffff640 1 -- 192.168.123.107:0/852359943 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7f7ca80bc870 con 0x7f7cc0101a00 2026-03-09T19:24:20.788 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.790+0000 7f7cbeffd640 1 --2- 192.168.123.107:0/852359943 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7c980761c0 0x7f7c98078680 unknown :-1 s=BANNER_CONNECTING 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:20.788 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.790+0000 7f7c9ffff640 1 -- 192.168.123.107:0/852359943 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f7ca80861c0 con 0x7f7cc0101a00 2026-03-09T19:24:20.788 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.790+0000 7f7cbeffd640 1 --2- 192.168.123.107:0/852359943 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7c980761c0 0x7f7c98078680 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f7cc0101860 tx=0x7f7cb4005f70 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:20.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:20 vm08 ceph-mon[57794]: from='client.14462 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:20.877 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.879+0000 7f7cc5640640 1 -- 192.168.123.107:0/852359943 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f7c8c002bf0 con 0x7f7c980761c0 2026-03-09T19:24:20.880 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.882+0000 7f7c9ffff640 1 -- 192.168.123.107:0/852359943 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+139 (secure 0 0 0) 0x7f7c8c002bf0 con 0x7f7c980761c0 2026-03-09T19:24:20.880 INFO:teuthology.orchestra.run.vm07.stdout:HOST ADDR LABELS STATUS 2026-03-09T19:24:20.880 INFO:teuthology.orchestra.run.vm07.stdout:vm07 192.168.123.107 2026-03-09T19:24:20.880 INFO:teuthology.orchestra.run.vm07.stdout:vm08 192.168.123.108 2026-03-09T19:24:20.880 
INFO:teuthology.orchestra.run.vm07.stdout:2 hosts in cluster 2026-03-09T19:24:20.882 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.884+0000 7f7cc5640640 1 -- 192.168.123.107:0/852359943 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7c980761c0 msgr2=0x7f7c98078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:20.882 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.884+0000 7f7cc5640640 1 --2- 192.168.123.107:0/852359943 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7c980761c0 0x7f7c98078680 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f7cc0101860 tx=0x7f7cb4005f70 comp rx=0 tx=0).stop 2026-03-09T19:24:20.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.885+0000 7f7cc5640640 1 -- 192.168.123.107:0/852359943 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7cc0101a00 msgr2=0x7f7cc01986f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:20.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.885+0000 7f7cc5640640 1 --2- 192.168.123.107:0/852359943 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7cc0101a00 0x7f7cc01986f0 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7f7ca80029e0 tx=0x7f7ca8002a10 comp rx=0 tx=0).stop 2026-03-09T19:24:20.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.885+0000 7f7cc5640640 1 -- 192.168.123.107:0/852359943 shutdown_connections 2026-03-09T19:24:20.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.885+0000 7f7cc5640640 1 --2- 192.168.123.107:0/852359943 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f7c980761c0 0x7f7c98078680 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:20.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.885+0000 7f7cc5640640 1 --2- 
192.168.123.107:0/852359943 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7cc0101a00 0x7f7cc01986f0 unknown :-1 s=CLOSED pgs=250 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:20.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.885+0000 7f7cc5640640 1 --2- 192.168.123.107:0/852359943 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cc0100800 0x7f7cc01981b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:20.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.885+0000 7f7cc5640640 1 -- 192.168.123.107:0/852359943 >> 192.168.123.107:0/852359943 conn(0x7f7cc00fbfb0 msgr2=0x7f7cc00fda90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:20.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.885+0000 7f7cc5640640 1 -- 192.168.123.107:0/852359943 shutdown_connections 2026-03-09T19:24:20.883 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:20.885+0000 7f7cc5640640 1 -- 192.168.123.107:0/852359943 wait complete. 
2026-03-09T19:24:20.931 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph orch device ls' 2026-03-09T19:24:21.077 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:21.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.303+0000 7fb0029f7640 1 -- 192.168.123.107:0/2145117013 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faffc102a60 msgr2=0x7faffc102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:21.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.303+0000 7fb0029f7640 1 --2- 192.168.123.107:0/2145117013 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faffc102a60 0x7faffc102e60 secure :-1 s=READY pgs=251 cs=0 l=1 rev1=1 crypto rx=0x7fafec0099b0 tx=0x7fafec02f220 comp rx=0 tx=0).stop 2026-03-09T19:24:21.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.304+0000 7fb0029f7640 1 -- 192.168.123.107:0/2145117013 shutdown_connections 2026-03-09T19:24:21.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.304+0000 7fb0029f7640 1 --2- 192.168.123.107:0/2145117013 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faffc103c60 0x7faffc1040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:21.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.304+0000 7fb0029f7640 1 --2- 192.168.123.107:0/2145117013 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faffc102a60 0x7faffc102e60 unknown :-1 s=CLOSED pgs=251 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:21.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.304+0000 7fb0029f7640 1 -- 
192.168.123.107:0/2145117013 >> 192.168.123.107:0/2145117013 conn(0x7faffc0fe250 msgr2=0x7faffc100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:21.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.304+0000 7fb0029f7640 1 -- 192.168.123.107:0/2145117013 shutdown_connections 2026-03-09T19:24:21.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.305+0000 7fb0029f7640 1 -- 192.168.123.107:0/2145117013 wait complete. 2026-03-09T19:24:21.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.305+0000 7fb0029f7640 1 Processor -- start 2026-03-09T19:24:21.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.305+0000 7fb0029f7640 1 -- start start 2026-03-09T19:24:21.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.305+0000 7fb0029f7640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faffc102a60 0x7faffc19a430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:21.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.305+0000 7fb0029f7640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faffc103c60 0x7faffc19a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:21.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.305+0000 7fb0029f7640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faffc19af40 con 0x7faffc102a60 2026-03-09T19:24:21.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.305+0000 7fb0029f7640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faffc19b0b0 con 0x7faffc103c60 2026-03-09T19:24:21.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.305+0000 7faffbfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faffc102a60 0x7faffc19a430 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:21.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.305+0000 7faffbfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faffc102a60 0x7faffc19a430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:52836/0 (socket says 192.168.123.107:52836) 2026-03-09T19:24:21.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.305+0000 7faffbfff640 1 -- 192.168.123.107:0/737773549 learned_addr learned my addr 192.168.123.107:0/737773549 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:21.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.306+0000 7faffb7fe640 1 --2- 192.168.123.107:0/737773549 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faffc103c60 0x7faffc19a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:21.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.306+0000 7faffbfff640 1 -- 192.168.123.107:0/737773549 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faffc103c60 msgr2=0x7faffc19a970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:21.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.306+0000 7faffbfff640 1 --2- 192.168.123.107:0/737773549 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faffc103c60 0x7faffc19a970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:21.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.306+0000 7faffbfff640 1 -- 192.168.123.107:0/737773549 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fafec009660 con 0x7faffc102a60 2026-03-09T19:24:21.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.306+0000 7faffbfff640 1 --2- 192.168.123.107:0/737773549 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faffc102a60 0x7faffc19a430 secure :-1 s=READY pgs=252 cs=0 l=1 rev1=1 crypto rx=0x7fafec002940 tx=0x7fafec002970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:21.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.306+0000 7faff97fa640 1 -- 192.168.123.107:0/737773549 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fafec03d070 con 0x7faffc102a60 2026-03-09T19:24:21.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.307+0000 7fb0029f7640 1 -- 192.168.123.107:0/737773549 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faffc19faf0 con 0x7faffc102a60 2026-03-09T19:24:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.307+0000 7faff97fa640 1 -- 192.168.123.107:0/737773549 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fafec02fd50 con 0x7faffc102a60 2026-03-09T19:24:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.307+0000 7faff97fa640 1 -- 192.168.123.107:0/737773549 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fafec041a50 con 0x7faffc102a60 2026-03-09T19:24:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.308+0000 7fb0029f7640 1 -- 192.168.123.107:0/737773549 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faffc19ffe0 con 0x7faffc102a60 2026-03-09T19:24:21.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.308+0000 7faff97fa640 1 -- 192.168.123.107:0/737773549 <== mon.0 v2:192.168.123.107:3300/0 4 
==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fafec049050 con 0x7faffc102a60 2026-03-09T19:24:21.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.308+0000 7fb0029f7640 1 -- 192.168.123.107:0/737773549 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faffc10b690 con 0x7faffc102a60 2026-03-09T19:24:21.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.310+0000 7faff97fa640 1 --2- 192.168.123.107:0/737773549 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fafd4075f60 0x7fafd4078420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:21.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.310+0000 7faff97fa640 1 -- 192.168.123.107:0/737773549 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fafec0bdc20 con 0x7faffc102a60 2026-03-09T19:24:21.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.311+0000 7faff97fa640 1 -- 192.168.123.107:0/737773549 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fafec0c1050 con 0x7faffc102a60 2026-03-09T19:24:21.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.311+0000 7faffb7fe640 1 --2- 192.168.123.107:0/737773549 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fafd4075f60 0x7fafd4078420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:21.310 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.312+0000 7faffb7fe640 1 --2- 192.168.123.107:0/737773549 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fafd4075f60 0x7fafd4078420 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 
crypto rx=0x7faffc19b950 tx=0x7fafe800a5c0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:21.405 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.405+0000 7fb0029f7640 1 -- 192.168.123.107:0/737773549 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch device ls", "target": ["mon-mgr", ""]}) v1 -- 0x7faffc1080e0 con 0x7fafd4075f60 2026-03-09T19:24:21.406 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.408+0000 7faff97fa640 1 -- 192.168.123.107:0/737773549 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1617 (secure 0 0 0) 0x7faffc1080e0 con 0x7fafd4075f60 2026-03-09T19:24:21.406 INFO:teuthology.orchestra.run.vm07.stdout:HOST PATH TYPE DEVICE ID SIZE AVAILABLE REFRESHED REJECT REASONS 2026-03-09T19:24:21.406 INFO:teuthology.orchestra.run.vm07.stdout:vm07 /dev/sr0 hdd QEMU_DVD-ROM_QM00003 366k No 42s ago Has a FileSystem, Insufficient space (<5GB) 2026-03-09T19:24:21.406 INFO:teuthology.orchestra.run.vm07.stdout:vm07 /dev/vdb hdd DWNBRSTVMM07001 20.0G Yes 42s ago 2026-03-09T19:24:21.407 INFO:teuthology.orchestra.run.vm07.stdout:vm07 /dev/vdc hdd DWNBRSTVMM07002 20.0G No 42s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-09T19:24:21.407 INFO:teuthology.orchestra.run.vm07.stdout:vm07 /dev/vdd hdd DWNBRSTVMM07003 20.0G No 42s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-09T19:24:21.407 INFO:teuthology.orchestra.run.vm07.stdout:vm07 /dev/vde hdd DWNBRSTVMM07004 20.0G No 42s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-09T19:24:21.407 INFO:teuthology.orchestra.run.vm07.stdout:vm08 /dev/sr0 hdd QEMU_DVD-ROM_QM00003 366k No 17s ago Has a FileSystem, Insufficient space (<5GB) 2026-03-09T19:24:21.407 INFO:teuthology.orchestra.run.vm07.stdout:vm08 /dev/vdb hdd 
DWNBRSTVMM08001 20.0G Yes 17s ago 2026-03-09T19:24:21.407 INFO:teuthology.orchestra.run.vm07.stdout:vm08 /dev/vdc hdd DWNBRSTVMM08002 20.0G No 17s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-09T19:24:21.407 INFO:teuthology.orchestra.run.vm07.stdout:vm08 /dev/vdd hdd DWNBRSTVMM08003 20.0G No 17s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-09T19:24:21.407 INFO:teuthology.orchestra.run.vm07.stdout:vm08 /dev/vde hdd DWNBRSTVMM08004 20.0G No 17s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-09T19:24:21.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.411+0000 7fb0029f7640 1 -- 192.168.123.107:0/737773549 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fafd4075f60 msgr2=0x7fafd4078420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:21.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.411+0000 7fb0029f7640 1 --2- 192.168.123.107:0/737773549 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fafd4075f60 0x7fafd4078420 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7faffc19b950 tx=0x7fafe800a5c0 comp rx=0 tx=0).stop 2026-03-09T19:24:21.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.411+0000 7fb0029f7640 1 -- 192.168.123.107:0/737773549 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faffc102a60 msgr2=0x7faffc19a430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:21.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.411+0000 7fb0029f7640 1 --2- 192.168.123.107:0/737773549 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faffc102a60 0x7faffc19a430 secure :-1 s=READY pgs=252 cs=0 l=1 rev1=1 crypto rx=0x7fafec002940 tx=0x7fafec002970 comp rx=0 tx=0).stop 2026-03-09T19:24:21.409 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.411+0000 7fb0029f7640 1 -- 192.168.123.107:0/737773549 shutdown_connections 2026-03-09T19:24:21.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.411+0000 7fb0029f7640 1 --2- 192.168.123.107:0/737773549 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fafd4075f60 0x7fafd4078420 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:21.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.411+0000 7fb0029f7640 1 --2- 192.168.123.107:0/737773549 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faffc103c60 0x7faffc19a970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:21.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.411+0000 7fb0029f7640 1 --2- 192.168.123.107:0/737773549 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faffc102a60 0x7faffc19a430 unknown :-1 s=CLOSED pgs=252 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:21.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.411+0000 7fb0029f7640 1 -- 192.168.123.107:0/737773549 >> 192.168.123.107:0/737773549 conn(0x7faffc0fe250 msgr2=0x7faffc0ffa10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:21.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.412+0000 7fb0029f7640 1 -- 192.168.123.107:0/737773549 shutdown_connections 2026-03-09T19:24:21.410 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.412+0000 7fb0029f7640 1 -- 192.168.123.107:0/737773549 wait complete. 2026-03-09T19:24:21.484 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-09T19:24:21.486 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm07.local 2026-03-09T19:24:21.486 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph fs volume create cephfs --placement=4' 2026-03-09T19:24:21.627 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:21.664 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:21 vm07 ceph-mon[48545]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:21.664 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:21 vm07 ceph-mon[48545]: from='client.14466 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:21.664 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:21 vm07 ceph-mon[48545]: from='client.14470 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:21 vm08 ceph-mon[57794]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:24:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:21 vm08 ceph-mon[57794]: from='client.14466 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:21 vm08 ceph-mon[57794]: from='client.14470 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:21.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.851+0000 7fc518ab0640 1 -- 192.168.123.107:0/3517985221 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fc5140751a0 msgr2=0x7fc514073600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:21.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.851+0000 7fc518ab0640 1 --2- 192.168.123.107:0/3517985221 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5140751a0 0x7fc514073600 secure :-1 s=READY pgs=253 cs=0 l=1 rev1=1 crypto rx=0x7fc5000099b0 tx=0x7fc50002f220 comp rx=0 tx=0).stop 2026-03-09T19:24:21.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.852+0000 7fc518ab0640 1 -- 192.168.123.107:0/3517985221 shutdown_connections 2026-03-09T19:24:21.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.852+0000 7fc518ab0640 1 --2- 192.168.123.107:0/3517985221 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc514073b40 0x7fc514073fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:21.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.852+0000 7fc518ab0640 1 --2- 192.168.123.107:0/3517985221 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5140751a0 0x7fc514073600 unknown :-1 s=CLOSED pgs=253 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:21.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.852+0000 7fc518ab0640 1 -- 192.168.123.107:0/3517985221 >> 192.168.123.107:0/3517985221 conn(0x7fc5140fbf80 msgr2=0x7fc5140fe3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:21.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.852+0000 7fc518ab0640 1 -- 192.168.123.107:0/3517985221 shutdown_connections 2026-03-09T19:24:21.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.852+0000 7fc518ab0640 1 -- 192.168.123.107:0/3517985221 wait complete. 
2026-03-09T19:24:21.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.853+0000 7fc518ab0640 1 Processor -- start
2026-03-09T19:24:21.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.853+0000 7fc518ab0640 1 -- start start
2026-03-09T19:24:21.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.853+0000 7fc518ab0640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc514073b40 0x7fc51406d350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:24:21.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.853+0000 7fc518ab0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5140751a0 0x7fc51406d890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:24:21.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.853+0000 7fc518ab0640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc51406ddd0 con 0x7fc5140751a0
2026-03-09T19:24:21.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.853+0000 7fc518ab0640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc51406df40 con 0x7fc514073b40
2026-03-09T19:24:21.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.853+0000 7fc511d74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5140751a0 0x7fc51406d890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:24:21.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.853+0000 7fc511d74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5140751a0 0x7fc51406d890 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:52856/0 (socket says 192.168.123.107:52856)
2026-03-09T19:24:21.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.853+0000 7fc511d74640 1 -- 192.168.123.107:0/3692896513 learned_addr learned my addr 192.168.123.107:0/3692896513 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:24:21.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.854+0000 7fc511d74640 1 -- 192.168.123.107:0/3692896513 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc514073b40 msgr2=0x7fc51406d350 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T19:24:21.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.854+0000 7fc511d74640 1 --2- 192.168.123.107:0/3692896513 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc514073b40 0x7fc51406d350 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:21.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.854+0000 7fc511d74640 1 -- 192.168.123.107:0/3692896513 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc500009660 con 0x7fc5140751a0
2026-03-09T19:24:21.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.854+0000 7fc511d74640 1 --2- 192.168.123.107:0/3692896513 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5140751a0 0x7fc51406d890 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7fc50800ca30 tx=0x7fc50800cf00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:24:21.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.854+0000 7fc4f77fe640 1 -- 192.168.123.107:0/3692896513 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc508004430 con 0x7fc5140751a0
2026-03-09T19:24:21.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.854+0000 7fc518ab0640 1 -- 192.168.123.107:0/3692896513 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc5140729d0 con 0x7fc5140751a0
2026-03-09T19:24:21.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.854+0000 7fc4f77fe640 1 -- 192.168.123.107:0/3692896513 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc508004590 con 0x7fc5140751a0
2026-03-09T19:24:21.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.854+0000 7fc4f77fe640 1 -- 192.168.123.107:0/3692896513 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc50800f660 con 0x7fc5140751a0
2026-03-09T19:24:21.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.855+0000 7fc518ab0640 1 -- 192.168.123.107:0/3692896513 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc514072ed0 con 0x7fc5140751a0
2026-03-09T19:24:21.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.855+0000 7fc518ab0640 1 -- 192.168.123.107:0/3692896513 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc514073f70 con 0x7fc5140751a0
2026-03-09T19:24:21.858 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.860+0000 7fc4f77fe640 1 -- 192.168.123.107:0/3692896513 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fc508002850 con 0x7fc5140751a0
2026-03-09T19:24:21.858 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.860+0000 7fc4f77fe640 1 --2- 192.168.123.107:0/3692896513 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc4ec076290 0x7fc4ec078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:24:21.858 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.860+0000 7fc4f77fe640 1 -- 192.168.123.107:0/3692896513 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(35..35 src has 1..35) v4 ==== 4618+0+0 (secure 0 0 0) 0x7fc508097980 con 0x7fc5140751a0
2026-03-09T19:24:21.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.861+0000 7fc4f77fe640 1 -- 192.168.123.107:0/3692896513 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fc5080612d0 con 0x7fc5140751a0
2026-03-09T19:24:21.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.861+0000 7fc512575640 1 --2- 192.168.123.107:0/3692896513 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc4ec076290 0x7fc4ec078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:24:21.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.861+0000 7fc512575640 1 --2- 192.168.123.107:0/3692896513 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc4ec076290 0x7fc4ec078750 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fc500002af0 tx=0x7fc5000023d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:24:21.956 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:21.957+0000 7fc518ab0640 1 -- 192.168.123.107:0/3692896513 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}) v1 -- 0x7fc5141047c0 con 0x7fc4ec076290
2026-03-09T19:24:22.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:22 vm08 ceph-mon[57794]: from='client.14474 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:24:22.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:22 vm08 ceph-mon[57794]: from='client.14478 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:24:22.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:22 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
2026-03-09T19:24:22.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:22 vm07 ceph-mon[48545]: from='client.14474 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:24:22.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:22 vm07 ceph-mon[48545]: from='client.14478 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:24:22.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:22 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
2026-03-09T19:24:23.618 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.616+0000 7fc4f77fe640 1 -- 192.168.123.107:0/3692896513 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7fc5141047c0 con 0x7fc4ec076290
2026-03-09T19:24:23.618 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.618+0000 7fc518ab0640 1 -- 192.168.123.107:0/3692896513 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc4ec076290 msgr2=0x7fc4ec078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:24:23.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.618+0000 7fc518ab0640 1 --2- 192.168.123.107:0/3692896513 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc4ec076290 0x7fc4ec078750 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fc500002af0 tx=0x7fc5000023d0 comp rx=0 tx=0).stop
2026-03-09T19:24:23.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.618+0000 7fc518ab0640 1 -- 192.168.123.107:0/3692896513 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5140751a0 msgr2=0x7fc51406d890 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:24:23.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.618+0000 7fc518ab0640 1 --2- 192.168.123.107:0/3692896513 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5140751a0 0x7fc51406d890 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7fc50800ca30 tx=0x7fc50800cf00 comp rx=0 tx=0).stop
2026-03-09T19:24:23.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.618+0000 7fc518ab0640 1 -- 192.168.123.107:0/3692896513 shutdown_connections
2026-03-09T19:24:23.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.618+0000 7fc518ab0640 1 --2- 192.168.123.107:0/3692896513 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc4ec076290 0x7fc4ec078750 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:23.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.618+0000 7fc518ab0640 1 --2- 192.168.123.107:0/3692896513 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5140751a0 0x7fc51406d890 unknown :-1 s=CLOSED pgs=254 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:23.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.618+0000 7fc518ab0640 1 --2- 192.168.123.107:0/3692896513 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc514073b40 0x7fc51406d350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:23.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.618+0000 7fc518ab0640 1 -- 192.168.123.107:0/3692896513 >> 192.168.123.107:0/3692896513 conn(0x7fc5140fbf80 msgr2=0x7fc5140fd910 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:24:23.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.619+0000 7fc518ab0640 1 -- 192.168.123.107:0/3692896513 shutdown_connections
2026-03-09T19:24:23.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:23.619+0000 7fc518ab0640 1 -- 192.168.123.107:0/3692896513 wait complete.
2026-03-09T19:24:23.661 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph fs dump'
2026-03-09T19:24:23.841 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config
2026-03-09T19:24:23.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:23 vm08 ceph-mon[57794]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:24:23.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:23 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished
2026-03-09T19:24:23.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:23 vm08 ceph-mon[57794]: osdmap e36: 6 total, 6 up, 6 in
2026-03-09T19:24:23.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:23 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
2026-03-09T19:24:23.872 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:23 vm07 ceph-mon[48545]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 564 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:24:23.872 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:23 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished
2026-03-09T19:24:23.872 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:23 vm07 ceph-mon[48545]: osdmap e36: 6 total, 6 up, 6 in
2026-03-09T19:24:23.872 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:23 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
2026-03-09T19:24:23.872 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:23 vm07 ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07[48541]: 2026-03-09T19:24:23.600+0000 7f9178593640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
2026-03-09T19:24:24.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.124+0000 7f6f56243640 1 -- 192.168.123.107:0/3448026009 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f50103c60 msgr2=0x7f6f501040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:24:24.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.124+0000 7f6f56243640 1 --2- 192.168.123.107:0/3448026009 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f50103c60 0x7f6f501040e0 secure :-1 s=READY pgs=255 cs=0 l=1 rev1=1 crypto rx=0x7f6f400099b0 tx=0x7f6f4002f220 comp rx=0 tx=0).stop
2026-03-09T19:24:24.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.125+0000 7f6f56243640 1 -- 192.168.123.107:0/3448026009 shutdown_connections
2026-03-09T19:24:24.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.125+0000 7f6f56243640 1 --2- 192.168.123.107:0/3448026009 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f50103c60 0x7f6f501040e0 unknown :-1 s=CLOSED pgs=255 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:24.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.125+0000 7f6f56243640 1 --2- 192.168.123.107:0/3448026009 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f50102a60 0x7f6f50102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:24.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.125+0000 7f6f56243640 1 -- 192.168.123.107:0/3448026009 >> 192.168.123.107:0/3448026009 conn(0x7f6f500fe250 msgr2=0x7f6f50100670 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:24:24.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.126+0000 7f6f56243640 1 -- 192.168.123.107:0/3448026009 shutdown_connections
2026-03-09T19:24:24.125 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.126+0000 7f6f56243640 1 -- 192.168.123.107:0/3448026009 wait complete.
2026-03-09T19:24:24.125 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.127+0000 7f6f56243640 1 Processor -- start
2026-03-09T19:24:24.125 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.127+0000 7f6f56243640 1 -- start start
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.127+0000 7f6f56243640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f50102a60 0x7f6f5019e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.127+0000 7f6f56243640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f50103c60 0x7f6f5019ee70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.127+0000 7f6f56243640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f5019f440 con 0x7f6f50103c60
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.127+0000 7f6f56243640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f5019f5b0 con 0x7f6f50102a60
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.128+0000 7f6f4f7fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f50102a60 0x7f6f5019e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.128+0000 7f6f4f7fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f50102a60 0x7f6f5019e930 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:46504/0 (socket says 192.168.123.107:46504)
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.128+0000 7f6f4f7fe640 1 -- 192.168.123.107:0/2433570789 learned_addr learned my addr 192.168.123.107:0/2433570789 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.128+0000 7f6f4effd640 1 --2- 192.168.123.107:0/2433570789 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f50103c60 0x7f6f5019ee70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.128+0000 7f6f4f7fe640 1 -- 192.168.123.107:0/2433570789 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f50103c60 msgr2=0x7f6f5019ee70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.128+0000 7f6f4f7fe640 1 --2- 192.168.123.107:0/2433570789 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f50103c60 0x7f6f5019ee70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.128+0000 7f6f4f7fe640 1 -- 192.168.123.107:0/2433570789 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6f40009660 con 0x7f6f50102a60
2026-03-09T19:24:24.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.128+0000 7f6f4f7fe640 1 --2- 192.168.123.107:0/2433570789 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f50102a60 0x7f6f5019e930 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f6f3c00e960 tx=0x7f6f3c00ee30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:24:24.127 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.128+0000 7f6f4cff9640 1 -- 192.168.123.107:0/2433570789 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6f3c009800 con 0x7f6f50102a60
2026-03-09T19:24:24.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.129+0000 7f6f4cff9640 1 -- 192.168.123.107:0/2433570789 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6f3c004590 con 0x7f6f50102a60
2026-03-09T19:24:24.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.129+0000 7f6f4cff9640 1 -- 192.168.123.107:0/2433570789 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6f3c010430 con 0x7f6f50102a60
2026-03-09T19:24:24.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.130+0000 7f6f56243640 1 -- 192.168.123.107:0/2433570789 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6f501a3ff0 con 0x7f6f50102a60
2026-03-09T19:24:24.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.130+0000 7f6f56243640 1 -- 192.168.123.107:0/2433570789 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6f501a4400 con 0x7f6f50102a60
2026-03-09T19:24:24.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.130+0000 7f6f56243640 1 -- 192.168.123.107:0/2433570789 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6f50102e90 con 0x7f6f50102a60
2026-03-09T19:24:24.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.132+0000 7f6f4cff9640 1 -- 192.168.123.107:0/2433570789 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6f3c0026e0 con 0x7f6f50102a60
2026-03-09T19:24:24.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.132+0000 7f6f4cff9640 1 --2- 192.168.123.107:0/2433570789 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6f28075fb0 0x7f6f28078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:24:24.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.132+0000 7f6f4effd640 1 --2- 192.168.123.107:0/2433570789 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6f28075fb0 0x7f6f28078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:24:24.131 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.133+0000 7f6f4cff9640 1 -- 192.168.123.107:0/2433570789 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f6f3c01d030 con 0x7f6f50102a60
2026-03-09T19:24:24.131 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.133+0000 7f6f4effd640 1 --2- 192.168.123.107:0/2433570789 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6f28075fb0 0x7f6f28078470 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f6f5019fe50 tx=0x7f6f4003a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:24:24.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.134+0000 7f6f4cff9640 1 -- 192.168.123.107:0/2433570789 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6f3c0610f0 con 0x7f6f50102a60
2026-03-09T19:24:24.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.289+0000 7f6f56243640 1 -- 192.168.123.107:0/2433570789 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f6f50107720 con 0x7f6f50102a60
2026-03-09T19:24:24.288 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.290+0000 7f6f4cff9640 1 -- 192.168.123.107:0/2433570789 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 2 v2) v1 ==== 75+0+1114 (secure 0 0 0) 0x7f6f50107720 con 0x7f6f50102a60
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:e2
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1)
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:epoch 2
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:23.601353+0000
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:root 0
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {}
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 0
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 1
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:in
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:up {}
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:failed
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:damaged
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:stopped
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3]
2026-03-09T19:24:24.289 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2
2026-03-09T19:24:24.290 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled
2026-03-09T19:24:24.290 INFO:teuthology.orchestra.run.vm07.stdout:balancer
2026-03-09T19:24:24.290 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1
2026-03-09T19:24:24.290 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 0
2026-03-09T19:24:24.290 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:24:24.290 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:24:24.290 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 2
2026-03-09T19:24:24.296 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.296+0000 7f6f56243640 1 -- 192.168.123.107:0/2433570789 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6f28075fb0 msgr2=0x7f6f28078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:24:24.296 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.296+0000 7f6f56243640 1 --2- 192.168.123.107:0/2433570789 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6f28075fb0 0x7f6f28078470 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f6f5019fe50 tx=0x7f6f4003a040 comp rx=0 tx=0).stop
2026-03-09T19:24:24.296 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.296+0000 7f6f56243640 1 -- 192.168.123.107:0/2433570789 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f50102a60 msgr2=0x7f6f5019e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:24:24.296 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.296+0000 7f6f56243640 1 --2- 192.168.123.107:0/2433570789 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f50102a60 0x7f6f5019e930 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f6f3c00e960 tx=0x7f6f3c00ee30 comp rx=0 tx=0).stop
2026-03-09T19:24:24.296 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.297+0000 7f6f56243640 1 -- 192.168.123.107:0/2433570789 shutdown_connections
2026-03-09T19:24:24.296 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.297+0000 7f6f56243640 1 --2- 192.168.123.107:0/2433570789 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6f28075fb0 0x7f6f28078470 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:24.296 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.297+0000 7f6f56243640 1 --2- 192.168.123.107:0/2433570789 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6f50103c60 0x7f6f5019ee70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:24.296 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.297+0000 7f6f56243640 1 --2- 192.168.123.107:0/2433570789 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f50102a60 0x7f6f5019e930 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:24.296 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.297+0000 7f6f56243640 1 -- 192.168.123.107:0/2433570789 >> 192.168.123.107:0/2433570789 conn(0x7f6f500fe250 msgr2=0x7f6f50104e80 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:24:24.296 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.298+0000 7f6f56243640 1 -- 192.168.123.107:0/2433570789 shutdown_connections
2026-03-09T19:24:24.296 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:24.298+0000 7f6f56243640 1 -- 192.168.123.107:0/2433570789 wait complete.
2026-03-09T19:24:24.497 INFO:teuthology.run_tasks:Running task cephadm.shell...
2026-03-09T19:24:24.499 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm07.local
2026-03-09T19:24:24.499 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph fs set cephfs max_mds 2'
2026-03-09T19:24:24.718 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: osdmap e37: 6 total, 6 up, 6 in
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: osdmap e38: 6 total, 6 up, 6 in
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: fsmap cephfs:0
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: Saving service mds.cephfs spec with placement count:4
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.uizncw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.uizncw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07
ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: Deploying daemon mds.cephfs.vm07.uizncw on vm07 2026-03-09T19:24:24.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:24 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/2433570789' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:24:24.761 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: osdmap e37: 6 total, 6 up, 6 in 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": 
"cephfs.cephfs.data"}]': finished 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: osdmap e38: 6 total, 6 up, 6 in 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: fsmap cephfs:0 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: Saving service mds.cephfs spec with placement count:4 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.uizncw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' 
entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.uizncw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: Deploying daemon mds.cephfs.vm07.uizncw on vm07 2026-03-09T19:24:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:24 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/2433570789' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:24:25.032 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.032+0000 7f09e8e82640 1 -- 192.168.123.107:0/341057927 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09e4072070 msgr2=0x7f09e4072470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:25.032 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.032+0000 7f09e8e82640 1 --2- 192.168.123.107:0/341057927 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09e4072070 0x7f09e4072470 secure :-1 s=READY pgs=257 cs=0 l=1 rev1=1 crypto rx=0x7f09d8009a00 tx=0x7f09d802f290 comp rx=0 tx=0).stop 2026-03-09T19:24:25.032 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.034+0000 7f09e8e82640 1 -- 192.168.123.107:0/341057927 shutdown_connections 2026-03-09T19:24:25.032 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.034+0000 7f09e8e82640 1 --2- 192.168.123.107:0/341057927 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09e40729b0 0x7f09e410d6e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:25.032 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.034+0000 
7f09e8e82640 1 --2- 192.168.123.107:0/341057927 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09e4072070 0x7f09e4072470 unknown :-1 s=CLOSED pgs=257 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:25.032 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.034+0000 7f09e8e82640 1 -- 192.168.123.107:0/341057927 >> 192.168.123.107:0/341057927 conn(0x7f09e406d980 msgr2=0x7f09e406fdc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:25.032 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.034+0000 7f09e8e82640 1 -- 192.168.123.107:0/341057927 shutdown_connections 2026-03-09T19:24:25.032 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.034+0000 7f09e8e82640 1 -- 192.168.123.107:0/341057927 wait complete. 2026-03-09T19:24:25.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.035+0000 7f09e8e82640 1 Processor -- start 2026-03-09T19:24:25.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.035+0000 7f09e8e82640 1 -- start start 2026-03-09T19:24:25.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.035+0000 7f09e8e82640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09e4072070 0x7f09e419e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:25.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.035+0000 7f09e8e82640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09e40729b0 0x7f09e419ee90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:25.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.035+0000 7f09e8e82640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f09e419f460 con 0x7f09e4072070 2026-03-09T19:24:25.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.035+0000 7f09e8e82640 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f09e419f5d0 con 0x7f09e40729b0 2026-03-09T19:24:25.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.036+0000 7f09e1d74640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09e40729b0 0x7f09e419ee90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:25.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.036+0000 7f09e1d74640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09e40729b0 0x7f09e419ee90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:46528/0 (socket says 192.168.123.107:46528) 2026-03-09T19:24:25.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.036+0000 7f09e1d74640 1 -- 192.168.123.107:0/514388076 learned_addr learned my addr 192.168.123.107:0/514388076 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:25.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.036+0000 7f09e2575640 1 --2- 192.168.123.107:0/514388076 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09e4072070 0x7f09e419e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:25.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.036+0000 7f09e2575640 1 -- 192.168.123.107:0/514388076 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09e40729b0 msgr2=0x7f09e419ee90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:25.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.036+0000 7f09e2575640 1 --2- 192.168.123.107:0/514388076 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f09e40729b0 0x7f09e419ee90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:25.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.036+0000 7f09e2575640 1 -- 192.168.123.107:0/514388076 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f09d8009660 con 0x7f09e4072070 2026-03-09T19:24:25.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.036+0000 7f09e2575640 1 --2- 192.168.123.107:0/514388076 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09e4072070 0x7f09e419e950 secure :-1 s=READY pgs=258 cs=0 l=1 rev1=1 crypto rx=0x7f09d8009a00 tx=0x7f09d8031b10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:25.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.037+0000 7f09cb7fe640 1 -- 192.168.123.107:0/514388076 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f09d8031c80 con 0x7f09e4072070 2026-03-09T19:24:25.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.037+0000 7f09cb7fe640 1 -- 192.168.123.107:0/514388076 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f09d8031de0 con 0x7f09e4072070 2026-03-09T19:24:25.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.037+0000 7f09cb7fe640 1 -- 192.168.123.107:0/514388076 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f09d8031280 con 0x7f09e4072070 2026-03-09T19:24:25.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.037+0000 7f09e8e82640 1 -- 192.168.123.107:0/514388076 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f09e41a4010 con 0x7f09e4072070 2026-03-09T19:24:25.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.037+0000 7f09e8e82640 1 -- 
192.168.123.107:0/514388076 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f09e41a4500 con 0x7f09e4072070 2026-03-09T19:24:25.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.039+0000 7f09cb7fe640 1 -- 192.168.123.107:0/514388076 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f09d803f070 con 0x7f09e4072070 2026-03-09T19:24:25.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.039+0000 7f09e8e82640 1 -- 192.168.123.107:0/514388076 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f09e4072470 con 0x7f09e4072070 2026-03-09T19:24:25.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.040+0000 7f09cb7fe640 1 --2- 192.168.123.107:0/514388076 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f09b4075f60 0x7f09b4078420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:25.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.040+0000 7f09cb7fe640 1 -- 192.168.123.107:0/514388076 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f09d80bc270 con 0x7f09e4072070 2026-03-09T19:24:25.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.042+0000 7f09e1d74640 1 --2- 192.168.123.107:0/514388076 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f09b4075f60 0x7f09b4078420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:25.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.042+0000 7f09e1d74640 1 --2- 192.168.123.107:0/514388076 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f09b4075f60 0x7f09b4078420 secure :-1 s=READY 
pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f09e419fe70 tx=0x7f09cc009290 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:25.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.042+0000 7f09cb7fe640 1 -- 192.168.123.107:0/514388076 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f09d80858b0 con 0x7f09e4072070 2026-03-09T19:24:25.153 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.154+0000 7f09e8e82640 1 -- 192.168.123.107:0/514388076 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"} v 0) v1 -- 0x7f09e41a4840 con 0x7f09e4072070 2026-03-09T19:24:25.768 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.769+0000 7f09cb7fe640 1 -- 192.168.123.107:0/514388076 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]=0 v3) v1 ==== 105+0+0 (secure 0 0 0) 0x7f09d8085250 con 0x7f09e4072070 2026-03-09T19:24:25.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.773+0000 7f09e8e82640 1 -- 192.168.123.107:0/514388076 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f09b4075f60 msgr2=0x7f09b4078420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:25.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.773+0000 7f09e8e82640 1 --2- 192.168.123.107:0/514388076 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f09b4075f60 0x7f09b4078420 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f09e419fe70 tx=0x7f09cc009290 comp rx=0 tx=0).stop 2026-03-09T19:24:25.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.773+0000 7f09e8e82640 1 -- 192.168.123.107:0/514388076 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09e4072070 msgr2=0x7f09e419e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:25.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.773+0000 7f09e8e82640 1 --2- 192.168.123.107:0/514388076 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09e4072070 0x7f09e419e950 secure :-1 s=READY pgs=258 cs=0 l=1 rev1=1 crypto rx=0x7f09d8009a00 tx=0x7f09d8031b10 comp rx=0 tx=0).stop 2026-03-09T19:24:25.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.773+0000 7f09e8e82640 1 -- 192.168.123.107:0/514388076 shutdown_connections 2026-03-09T19:24:25.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.773+0000 7f09e8e82640 1 --2- 192.168.123.107:0/514388076 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f09b4075f60 0x7f09b4078420 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:25.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.773+0000 7f09e8e82640 1 --2- 192.168.123.107:0/514388076 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09e40729b0 0x7f09e419ee90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:25.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.773+0000 7f09e8e82640 1 --2- 192.168.123.107:0/514388076 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09e4072070 0x7f09e419e950 unknown :-1 s=CLOSED pgs=258 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:25.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.773+0000 7f09e8e82640 1 -- 192.168.123.107:0/514388076 >> 192.168.123.107:0/514388076 conn(0x7f09e406d980 msgr2=0x7f09e410b950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:25.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.775+0000 7f09e8e82640 1 -- 
192.168.123.107:0/514388076 shutdown_connections 2026-03-09T19:24:25.774 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:25.776+0000 7f09e8e82640 1 -- 192.168.123.107:0/514388076 wait complete. 2026-03-09T19:24:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:25 vm08 ceph-mon[57794]: pgmap v74: 65 pgs: 18 creating+peering, 41 unknown, 6 active+clean; 449 KiB data, 164 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:24:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:25 vm08 ceph-mon[57794]: osdmap e39: 6 total, 6 up, 6 in 2026-03-09T19:24:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:25 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:25 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:25 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:25 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.zcaqju", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:24:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:25 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.zcaqju", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T19:24:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:25 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config 
generate-minimal-conf"}]: dispatch 2026-03-09T19:24:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:25 vm08 ceph-mon[57794]: Deploying daemon mds.cephfs.vm08.zcaqju on vm08 2026-03-09T19:24:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:25 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/514388076' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-09T19:24:25.846 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T19:24:25.848 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm07.local 2026-03-09T19:24:25.848 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph fs set cephfs allow_standby_replay false' 2026-03-09T19:24:25.887 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:25 vm07 ceph-mon[48545]: pgmap v74: 65 pgs: 18 creating+peering, 41 unknown, 6 active+clean; 449 KiB data, 164 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:24:25.888 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:25 vm07 ceph-mon[48545]: osdmap e39: 6 total, 6 up, 6 in 2026-03-09T19:24:25.888 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:25 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:25.888 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:25 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:25.888 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:25 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:25.888 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:25 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 
cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.zcaqju", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:24:25.888 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:25 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.zcaqju", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T19:24:25.888 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:25 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:25.888 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:25 vm07 ceph-mon[48545]: Deploying daemon mds.cephfs.vm08.zcaqju on vm08 2026-03-09T19:24:25.888 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:25 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/514388076' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-09T19:24:26.093 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:26.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.926+0000 7ff6c7577640 1 -- 192.168.123.107:0/3062691387 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6c8073b40 msgr2=0x7ff6c8073fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:26.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.926+0000 7ff6c7577640 1 --2- 192.168.123.107:0/3062691387 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6c8073b40 0x7ff6c8073fa0 secure :-1 s=READY pgs=259 cs=0 l=1 rev1=1 crypto rx=0x7ff6bc0099b0 tx=0x7ff6bc02f240 comp rx=0 tx=0).stop 2026-03-09T19:24:26.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.927+0000 7ff6c7577640 1 -- 192.168.123.107:0/3062691387 shutdown_connections 2026-03-09T19:24:26.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.927+0000 7ff6c7577640 1 --2- 192.168.123.107:0/3062691387 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6c8073b40 0x7ff6c8073fa0 unknown :-1 s=CLOSED pgs=259 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:26.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.927+0000 7ff6c7577640 1 --2- 192.168.123.107:0/3062691387 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6c80751a0 0x7ff6c8073600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:26.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.927+0000 7ff6c7577640 1 -- 192.168.123.107:0/3062691387 >> 192.168.123.107:0/3062691387 conn(0x7ff6c80fbdb0 msgr2=0x7ff6c80fe1f0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T19:24:26.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c7577640 1 -- 192.168.123.107:0/3062691387 shutdown_connections 2026-03-09T19:24:26.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c7577640 1 -- 192.168.123.107:0/3062691387 wait complete. 2026-03-09T19:24:26.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c7577640 1 Processor -- start 2026-03-09T19:24:26.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c7577640 1 -- start start 2026-03-09T19:24:26.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c7577640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6c8073b40 0x7ff6c819a290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:26.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c7577640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6c80751a0 0x7ff6c819a7d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:26.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c7577640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6c819ada0 con 0x7ff6c80751a0 2026-03-09T19:24:26.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c7577640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6c819af10 con 0x7ff6c8073b40 2026-03-09T19:24:26.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c5d74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6c80751a0 0x7ff6c819a7d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:26.926 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c6575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6c8073b40 0x7ff6c819a290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:26.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c5d74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6c80751a0 0x7ff6c819a7d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:52926/0 (socket says 192.168.123.107:52926) 2026-03-09T19:24:26.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.928+0000 7ff6c5d74640 1 -- 192.168.123.107:0/2032442214 learned_addr learned my addr 192.168.123.107:0/2032442214 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:26.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.929+0000 7ff6c5d74640 1 -- 192.168.123.107:0/2032442214 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6c8073b40 msgr2=0x7ff6c819a290 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:26.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.929+0000 7ff6c5d74640 1 --2- 192.168.123.107:0/2032442214 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6c8073b40 0x7ff6c819a290 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:26.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.929+0000 7ff6c5d74640 1 -- 192.168.123.107:0/2032442214 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6bc009660 con 0x7ff6c80751a0 2026-03-09T19:24:26.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.929+0000 7ff6c5d74640 1 --2- 
192.168.123.107:0/2032442214 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6c80751a0 0x7ff6c819a7d0 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7ff6bc0386d0 tx=0x7ff6bc038700 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:26.928 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.930+0000 7ff6af7fe640 1 -- 192.168.123.107:0/2032442214 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff6bc02faa0 con 0x7ff6c80751a0 2026-03-09T19:24:26.928 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.930+0000 7ff6c7577640 1 -- 192.168.123.107:0/2032442214 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff6c819f950 con 0x7ff6c80751a0 2026-03-09T19:24:26.928 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.930+0000 7ff6c7577640 1 -- 192.168.123.107:0/2032442214 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff6c819fe40 con 0x7ff6c80751a0 2026-03-09T19:24:26.928 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.930+0000 7ff6af7fe640 1 -- 192.168.123.107:0/2032442214 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff6bc041400 con 0x7ff6c80751a0 2026-03-09T19:24:26.928 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.930+0000 7ff6af7fe640 1 -- 192.168.123.107:0/2032442214 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff6bc036370 con 0x7ff6c80751a0 2026-03-09T19:24:26.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.932+0000 7ff6af7fe640 1 -- 192.168.123.107:0/2032442214 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7ff6bc0419e0 con 0x7ff6c80751a0 2026-03-09T19:24:26.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.932+0000 
7ff6af7fe640 1 --2- 192.168.123.107:0/2032442214 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff6940761c0 0x7ff694078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:26.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.932+0000 7ff6af7fe640 1 -- 192.168.123.107:0/2032442214 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7ff6bc0bc580 con 0x7ff6c80751a0 2026-03-09T19:24:26.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.933+0000 7ff6c7577640 1 -- 192.168.123.107:0/2032442214 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff68c005350 con 0x7ff6c80751a0 2026-03-09T19:24:26.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.936+0000 7ff6c6575640 1 --2- 192.168.123.107:0/2032442214 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff6940761c0 0x7ff694078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:26.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.936+0000 7ff6af7fe640 1 -- 192.168.123.107:0/2032442214 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff6bc085d00 con 0x7ff6c80751a0 2026-03-09T19:24:26.935 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:26.936+0000 7ff6c6575640 1 --2- 192.168.123.107:0/2032442214 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff6940761c0 0x7ff694078680 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7ff6b00059c0 tx=0x7ff6b000a380 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:26.978 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: osdmap e40: 6 total, 6 up, 6 in 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.zkmcyw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.zkmcyw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: Deploying daemon mds.cephfs.vm07.zkmcyw on vm07 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/514388076' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: mds.? [v2:192.168.123.107:6826/1147625344,v1:192.168.123.107:6827/1147625344] up:boot 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: mds.? [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] up:boot 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: daemon mds.cephfs.vm08.zcaqju assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: fsmap cephfs:0 2 up:standby 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:24:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.zcaqju"}]: dispatch 2026-03-09T19:24:26.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:creating} 1 up:standby 2026-03-09T19:24:26.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:26 vm07 ceph-mon[48545]: daemon mds.cephfs.vm08.zcaqju is now active in filesystem cephfs as rank 0 2026-03-09T19:24:27.052 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.054+0000 7ff6c7577640 1 -- 192.168.123.107:0/2032442214 
--> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"} v 0) v1 -- 0x7ff68c005b80 con 0x7ff6c80751a0 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: osdmap e40: 6 total, 6 up, 6 in 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.zkmcyw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.zkmcyw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: Deploying daemon mds.cephfs.vm07.zkmcyw on vm07 2026-03-09T19:24:27.095 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/514388076' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: mds.? [v2:192.168.123.107:6826/1147625344,v1:192.168.123.107:6827/1147625344] up:boot 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: mds.? [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] up:boot 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: daemon mds.cephfs.vm08.zcaqju assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: fsmap cephfs:0 2 up:standby 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.zcaqju"}]: dispatch 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:creating} 1 up:standby 2026-03-09T19:24:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:26 vm08 ceph-mon[57794]: daemon mds.cephfs.vm08.zcaqju is now active in filesystem cephfs as rank 0 2026-03-09T19:24:27.719 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.721+0000 7ff6af7fe640 1 -- 192.168.123.107:0/2032442214 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]=0 v7) v1 ==== 122+0+0 (secure 0 0 0) 0x7ff6bc0856a0 con 0x7ff6c80751a0 2026-03-09T19:24:27.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.726+0000 7ff6c7577640 1 -- 192.168.123.107:0/2032442214 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff6940761c0 msgr2=0x7ff694078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:27.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.726+0000 7ff6c7577640 1 --2- 192.168.123.107:0/2032442214 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff6940761c0 0x7ff694078680 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7ff6b00059c0 tx=0x7ff6b000a380 comp rx=0 tx=0).stop 2026-03-09T19:24:27.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.726+0000 7ff6c7577640 1 -- 192.168.123.107:0/2032442214 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6c80751a0 msgr2=0x7ff6c819a7d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:27.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.726+0000 7ff6c7577640 1 --2- 192.168.123.107:0/2032442214 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6c80751a0 0x7ff6c819a7d0 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7ff6bc0386d0 tx=0x7ff6bc038700 comp rx=0 tx=0).stop 2026-03-09T19:24:27.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.726+0000 7ff6c7577640 1 -- 192.168.123.107:0/2032442214 shutdown_connections 2026-03-09T19:24:27.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.726+0000 7ff6c7577640 1 --2- 192.168.123.107:0/2032442214 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff6940761c0 0x7ff694078680 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:27.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.726+0000 7ff6c7577640 1 --2- 192.168.123.107:0/2032442214 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff6c80751a0 0x7ff6c819a7d0 unknown :-1 s=CLOSED pgs=260 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:27.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.726+0000 7ff6c7577640 1 --2- 192.168.123.107:0/2032442214 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6c8073b40 0x7ff6c819a290 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:27.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.726+0000 7ff6c7577640 1 -- 192.168.123.107:0/2032442214 >> 192.168.123.107:0/2032442214 conn(0x7ff6c80fbdb0 msgr2=0x7ff6c80fd920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:27.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.726+0000 7ff6c7577640 1 -- 192.168.123.107:0/2032442214 shutdown_connections 2026-03-09T19:24:27.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:27.726+0000 7ff6c7577640 1 -- 192.168.123.107:0/2032442214 wait complete. 2026-03-09T19:24:27.786 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-09T19:24:27.788 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm07.local 2026-03-09T19:24:27.789 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph fs set cephfs inline_data false' 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: pgmap v77: 65 pgs: 18 creating+peering, 26 unknown, 21 active+clean; 450 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 816 B/s wr, 3 op/s 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: mds.? [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] up:active 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: daemon mds.cephfs.vm07.uizncw assigned to filesystem cephfs as rank 1 (now has 2 ranks) 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: Health check failed: insufficient standby MDS daemons available (MDS_INSUFFICIENT_STANDBY) 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 1 up:standby 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm07.uizncw=up:creating} 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: daemon mds.cephfs.vm07.uizncw is now active in filesystem cephfs as rank 1 2026-03-09T19:24:27.979 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.jwsqrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.jwsqrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: Deploying daemon mds.cephfs.vm08.jwsqrf on vm08 2026-03-09T19:24:27.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:27 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/2032442214' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-09T19:24:28.009 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: pgmap v77: 65 pgs: 18 creating+peering, 26 unknown, 21 active+clean; 450 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 816 B/s wr, 3 op/s 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: mds.? [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] up:active 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: daemon mds.cephfs.vm07.uizncw assigned to filesystem cephfs as rank 1 (now has 2 ranks) 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: Health check failed: insufficient standby MDS daemons available (MDS_INSUFFICIENT_STANDBY) 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 1 up:standby 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm07.uizncw=up:creating} 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: daemon mds.cephfs.vm07.uizncw is now active in filesystem cephfs as rank 1 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: from='mgr.14227 
192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.jwsqrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.jwsqrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: Deploying daemon mds.cephfs.vm08.jwsqrf on vm08 2026-03-09T19:24:28.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:27 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/2032442214' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.323+0000 7f942bfff640 1 -- 192.168.123.107:0/1963660491 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f942c0719a0 msgr2=0x7f942c071da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.323+0000 7f942bfff640 1 --2- 192.168.123.107:0/1963660491 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f942c0719a0 0x7f942c071da0 secure :-1 s=READY pgs=263 cs=0 l=1 rev1=1 crypto rx=0x7f941401cb30 tx=0x7f9414040420 comp rx=0 tx=0).stop 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.324+0000 7f942bfff640 1 -- 192.168.123.107:0/1963660491 shutdown_connections 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.324+0000 7f942bfff640 1 --2- 192.168.123.107:0/1963660491 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f942c072370 0x7f942c10c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.324+0000 7f942bfff640 1 --2- 192.168.123.107:0/1963660491 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f942c0719a0 0x7f942c071da0 unknown :-1 s=CLOSED pgs=263 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.324+0000 7f942bfff640 1 -- 192.168.123.107:0/1963660491 >> 192.168.123.107:0/1963660491 conn(0x7f942c06d4f0 msgr2=0x7f942c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.324+0000 7f942bfff640 1 -- 192.168.123.107:0/1963660491 
shutdown_connections 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.324+0000 7f942bfff640 1 -- 192.168.123.107:0/1963660491 wait complete. 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.324+0000 7f942bfff640 1 Processor -- start 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.324+0000 7f942bfff640 1 -- start start 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.325+0000 7f942bfff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f942c0719a0 0x7f942c19e7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.325+0000 7f942bfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f942c072370 0x7f942c19ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.325+0000 7f9423fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f942c072370 0x7f942c19ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.325+0000 7f9423fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f942c072370 0x7f942c19ed20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:52960/0 (socket says 192.168.123.107:52960) 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.325+0000 7f9423fff640 1 -- 192.168.123.107:0/1014689247 learned_addr learned my addr 192.168.123.107:0/1014689247 (peer_addr_for_me 
v2:192.168.123.107:0/0) 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.325+0000 7f942bfff640 1 -- 192.168.123.107:0/1014689247 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f942c19f2f0 con 0x7f942c072370 2026-03-09T19:24:28.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.325+0000 7f942bfff640 1 -- 192.168.123.107:0/1014689247 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f942c19f460 con 0x7f942c0719a0 2026-03-09T19:24:28.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.326+0000 7f9423fff640 1 -- 192.168.123.107:0/1014689247 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f942c0719a0 msgr2=0x7f942c19e7e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:28.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.326+0000 7f9423fff640 1 --2- 192.168.123.107:0/1014689247 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f942c0719a0 0x7f942c19e7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:28.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.326+0000 7f9423fff640 1 -- 192.168.123.107:0/1014689247 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f941401c790 con 0x7f942c072370 2026-03-09T19:24:28.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.326+0000 7f9423fff640 1 --2- 192.168.123.107:0/1014689247 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f942c072370 0x7f942c19ed20 secure :-1 s=READY pgs=264 cs=0 l=1 rev1=1 crypto rx=0x7f940c00b780 tx=0x7f940c00bc50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:28.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.326+0000 7f9428ff9640 1 -- 
192.168.123.107:0/1014689247 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f940c004070 con 0x7f942c072370 2026-03-09T19:24:28.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.327+0000 7f9428ff9640 1 -- 192.168.123.107:0/1014689247 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f940c002780 con 0x7f942c072370 2026-03-09T19:24:28.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.327+0000 7f9428ff9640 1 -- 192.168.123.107:0/1014689247 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f940c00ca90 con 0x7f942c072370 2026-03-09T19:24:28.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.327+0000 7f942bfff640 1 -- 192.168.123.107:0/1014689247 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f942c1a3f00 con 0x7f942c072370 2026-03-09T19:24:28.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.328+0000 7f942bfff640 1 -- 192.168.123.107:0/1014689247 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f942c1a43f0 con 0x7f942c072370 2026-03-09T19:24:28.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.328+0000 7f9428ff9640 1 -- 192.168.123.107:0/1014689247 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f940c00cc10 con 0x7f942c072370 2026-03-09T19:24:28.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.329+0000 7f9428ff9640 1 --2- 192.168.123.107:0/1014689247 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f94000761c0 0x7f9400078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:28.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.329+0000 7f942affd640 1 --2- 192.168.123.107:0/1014689247 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f94000761c0 0x7f9400078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:28.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.330+0000 7f9428ff9640 1 -- 192.168.123.107:0/1014689247 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f940c095f60 con 0x7f942c072370 2026-03-09T19:24:28.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.330+0000 7f9421ffb640 1 -- 192.168.123.107:0/1014689247 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f942c071da0 con 0x7f942c072370 2026-03-09T19:24:28.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.330+0000 7f942affd640 1 --2- 192.168.123.107:0/1014689247 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f94000761c0 0x7f9400078680 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f9414004870 tx=0x7f94140047c0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:28.332 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.334+0000 7f9428ff9640 1 -- 192.168.123.107:0/1014689247 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f940c09b050 con 0x7f942c072370 2026-03-09T19:24:28.446 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.447+0000 7f9421ffb640 1 -- 192.168.123.107:0/1014689247 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"} v 0) v1 -- 0x7f942c10fa70 con 0x7f942c072370 2026-03-09T19:24:28.821 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.823+0000 7f9428ff9640 1 -- 192.168.123.107:0/1014689247 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]=0 inline data disabled v8) v1 ==== 133+0+0 (secure 0 0 0) 0x7f940c05f690 con 0x7f942c072370 2026-03-09T19:24:28.821 INFO:teuthology.orchestra.run.vm07.stderr:inline data disabled 2026-03-09T19:24:28.823 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.825+0000 7f9421ffb640 1 -- 192.168.123.107:0/1014689247 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f94000761c0 msgr2=0x7f9400078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:28.823 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.825+0000 7f9421ffb640 1 --2- 192.168.123.107:0/1014689247 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f94000761c0 0x7f9400078680 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f9414004870 tx=0x7f94140047c0 comp rx=0 tx=0).stop 2026-03-09T19:24:28.823 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.825+0000 7f9421ffb640 1 -- 192.168.123.107:0/1014689247 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f942c072370 msgr2=0x7f942c19ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:28.823 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.825+0000 7f9421ffb640 1 --2- 192.168.123.107:0/1014689247 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f942c072370 0x7f942c19ed20 secure :-1 s=READY pgs=264 cs=0 l=1 rev1=1 crypto rx=0x7f940c00b780 tx=0x7f940c00bc50 comp rx=0 tx=0).stop 2026-03-09T19:24:28.823 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.826+0000 7f9421ffb640 1 -- 192.168.123.107:0/1014689247 shutdown_connections 2026-03-09T19:24:28.824 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.826+0000 7f9421ffb640 1 --2- 192.168.123.107:0/1014689247 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f94000761c0 0x7f9400078680 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:28.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.826+0000 7f9421ffb640 1 --2- 192.168.123.107:0/1014689247 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f942c072370 0x7f942c19ed20 unknown :-1 s=CLOSED pgs=264 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:28.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.826+0000 7f9421ffb640 1 --2- 192.168.123.107:0/1014689247 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f942c0719a0 0x7f942c19e7e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:28.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.826+0000 7f9421ffb640 1 -- 192.168.123.107:0/1014689247 >> 192.168.123.107:0/1014689247 conn(0x7f942c06d4f0 msgr2=0x7f942c10a800 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:28.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.826+0000 7f9421ffb640 1 -- 192.168.123.107:0/1014689247 shutdown_connections 2026-03-09T19:24:28.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:28.826+0000 7f9421ffb640 1 -- 192.168.123.107:0/1014689247 wait complete. 2026-03-09T19:24:28.916 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-09T19:24:28.919 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm07.local 2026-03-09T19:24:28.919 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph fs dump' 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: Health check cleared: MDS_INSUFFICIENT_STANDBY (was: insufficient standby MDS daemons available) 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: Cluster is now healthy 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/2032442214' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: mds.? [v2:192.168.123.107:6826/1147625344,v1:192.168.123.107:6827/1147625344] up:active 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: mds.? 
[v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] up:boot 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm07.uizncw=up:active} 1 up:standby 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.zkmcyw"}]: dispatch 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1014689247' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:28.953 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:28 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: Health check cleared: MDS_INSUFFICIENT_STANDBY (was: insufficient standby MDS daemons available) 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: Cluster is now healthy 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/2032442214' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: mds.? [v2:192.168.123.107:6826/1147625344,v1:192.168.123.107:6827/1147625344] up:active 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: mds.? 
[v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] up:boot 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm07.uizncw=up:active} 1 up:standby 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.zkmcyw"}]: dispatch 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/1014689247' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:28 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:29.120 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:29.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.413+0000 7f4345f37640 1 -- 192.168.123.107:0/930278012 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43400fee20 msgr2=0x7f4340105f30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:29.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.413+0000 7f4345f37640 1 --2- 192.168.123.107:0/930278012 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43400fee20 0x7f4340105f30 secure :-1 s=READY pgs=267 cs=0 l=1 rev1=1 crypto rx=0x7f43300099b0 tx=0x7f433002f2d0 comp rx=0 tx=0).stop 2026-03-09T19:24:29.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.414+0000 7f4345f37640 1 -- 192.168.123.107:0/930278012 shutdown_connections 2026-03-09T19:24:29.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.414+0000 7f4345f37640 1 --2- 192.168.123.107:0/930278012 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43400fee20 0x7f4340105f30 unknown :-1 s=CLOSED pgs=267 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:29.412 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.414+0000 7f4345f37640 1 --2- 192.168.123.107:0/930278012 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43400fe4e0 0x7f43400fe8e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:29.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.414+0000 7f4345f37640 1 -- 192.168.123.107:0/930278012 >> 192.168.123.107:0/930278012 conn(0x7f43400fa150 msgr2=0x7f43400fc570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:29.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.414+0000 7f4345f37640 1 -- 192.168.123.107:0/930278012 shutdown_connections 2026-03-09T19:24:29.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.414+0000 7f4345f37640 1 -- 192.168.123.107:0/930278012 wait complete. 2026-03-09T19:24:29.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.415+0000 7f4345f37640 1 Processor -- start 2026-03-09T19:24:29.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.415+0000 7f4345f37640 1 -- start start 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.415+0000 7f4345f37640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43400fe4e0 0x7f43401017f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.415+0000 7f4345f37640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43400fee20 0x7f43400ffe40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.415+0000 7f4345f37640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4340101dc0 con 0x7f43400fe4e0 2026-03-09T19:24:29.414 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.415+0000 7f4345f37640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f43401003b0 con 0x7f43400fee20 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.415+0000 7f433f7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43400fe4e0 0x7f43401017f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.415+0000 7f433f7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43400fe4e0 0x7f43401017f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47550/0 (socket says 192.168.123.107:47550) 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.415+0000 7f433f7fe640 1 -- 192.168.123.107:0/499525465 learned_addr learned my addr 192.168.123.107:0/499525465 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.415+0000 7f433effd640 1 --2- 192.168.123.107:0/499525465 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43400fee20 0x7f43400ffe40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.416+0000 7f433f7fe640 1 -- 192.168.123.107:0/499525465 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43400fee20 msgr2=0x7f43400ffe40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.416+0000 7f433f7fe640 1 --2- 
192.168.123.107:0/499525465 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43400fee20 0x7f43400ffe40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.416+0000 7f433f7fe640 1 -- 192.168.123.107:0/499525465 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4330009660 con 0x7f43400fe4e0 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.416+0000 7f433f7fe640 1 --2- 192.168.123.107:0/499525465 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43400fe4e0 0x7f43401017f0 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f432c00d8d0 tx=0x7f432c00dda0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:29.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.416+0000 7f433cff9640 1 -- 192.168.123.107:0/499525465 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f432c004490 con 0x7f43400fe4e0 2026-03-09T19:24:29.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.416+0000 7f4345f37640 1 -- 192.168.123.107:0/499525465 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4340100690 con 0x7f43400fe4e0 2026-03-09T19:24:29.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.416+0000 7f4345f37640 1 -- 192.168.123.107:0/499525465 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4340100be0 con 0x7f43400fe4e0 2026-03-09T19:24:29.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.416+0000 7f433cff9640 1 -- 192.168.123.107:0/499525465 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f432c00bd00 con 0x7f43400fe4e0 2026-03-09T19:24:29.420 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.416+0000 7f433cff9640 1 -- 192.168.123.107:0/499525465 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f432c010460 con 0x7f43400fe4e0 2026-03-09T19:24:29.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.418+0000 7f4345f37640 1 -- 192.168.123.107:0/499525465 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4340109400 con 0x7f43400fe4e0 2026-03-09T19:24:29.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.419+0000 7f433cff9640 1 -- 192.168.123.107:0/499525465 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f432c0027e0 con 0x7f43400fe4e0 2026-03-09T19:24:29.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.419+0000 7f433cff9640 1 --2- 192.168.123.107:0/499525465 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4310076170 0x7f4310078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:29.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.422+0000 7f433cff9640 1 -- 192.168.123.107:0/499525465 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f432c060190 con 0x7f43400fe4e0 2026-03-09T19:24:29.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.422+0000 7f433cff9640 1 -- 192.168.123.107:0/499525465 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f432c09c050 con 0x7f43400fe4e0 2026-03-09T19:24:29.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.422+0000 7f433effd640 1 --2- 192.168.123.107:0/499525465 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4310076170 0x7f4310078630 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:29.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.422+0000 7f433effd640 1 --2- 192.168.123.107:0/499525465 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4310076170 0x7f4310078630 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f4330002cc0 tx=0x7f433003a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:29.544 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.545+0000 7f4345f37640 1 -- 192.168.123.107:0/499525465 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f43401a0370 con 0x7f43400fe4e0 2026-03-09T19:24:29.544 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.546+0000 7f433cff9640 1 -- 192.168.123.107:0/499525465 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 9 v9) v1 ==== 75+0+1651 (secure 0 0 0) 0x7f432c060450 con 0x7f43400fe4e0 2026-03-09T19:24:29.545 INFO:teuthology.orchestra.run.vm07.stdout:e9 2026-03-09T19:24:29.545 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:24:29.545 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 
2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:epoch 9 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:28.835604+0000 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:24:29.546 
INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 2 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:replay seq 1 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:24:29.546 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 9 2026-03-09T19:24:29.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.551+0000 7f431a7fc640 1 -- 192.168.123.107:0/499525465 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4310076170 msgr2=0x7f4310078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:29.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.551+0000 7f431a7fc640 1 --2- 192.168.123.107:0/499525465 
>> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4310076170 0x7f4310078630 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f4330002cc0 tx=0x7f433003a040 comp rx=0 tx=0).stop 2026-03-09T19:24:29.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.551+0000 7f431a7fc640 1 -- 192.168.123.107:0/499525465 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43400fe4e0 msgr2=0x7f43401017f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:29.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.551+0000 7f431a7fc640 1 --2- 192.168.123.107:0/499525465 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43400fe4e0 0x7f43401017f0 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f432c00d8d0 tx=0x7f432c00dda0 comp rx=0 tx=0).stop 2026-03-09T19:24:29.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.551+0000 7f431a7fc640 1 -- 192.168.123.107:0/499525465 shutdown_connections 2026-03-09T19:24:29.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.551+0000 7f431a7fc640 1 --2- 192.168.123.107:0/499525465 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4310076170 0x7f4310078630 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:29.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.551+0000 7f431a7fc640 1 --2- 192.168.123.107:0/499525465 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43400fee20 0x7f43400ffe40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:29.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.551+0000 7f431a7fc640 1 --2- 192.168.123.107:0/499525465 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f43400fe4e0 0x7f43401017f0 unknown :-1 s=CLOSED pgs=268 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T19:24:29.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.551+0000 7f431a7fc640 1 -- 192.168.123.107:0/499525465 >> 192.168.123.107:0/499525465 conn(0x7f43400fa150 msgr2=0x7f4340104200 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:29.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.551+0000 7f431a7fc640 1 -- 192.168.123.107:0/499525465 shutdown_connections 2026-03-09T19:24:29.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:29.551+0000 7f431a7fc640 1 -- 192.168.123.107:0/499525465 wait complete. 2026-03-09T19:24:29.608 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph --format=json fs dump | jq -e ".filesystems | length == 1"' 2026-03-09T19:24:29.862 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: pgmap v78: 65 pgs: 8 creating+peering, 57 active+clean; 452 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 3.8 KiB/s wr, 11 op/s 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: from='client.? 192.168.123.107:0/1014689247' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]': finished 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: mds.? [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] up:boot 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: Dropping low affinity active daemon mds.cephfs.vm07.uizncw in favor of higher affinity standby. 
2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: Replacing daemon mds.cephfs.vm07.uizncw as rank 1 with standby daemon mds.cephfs.vm08.jwsqrf 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm07.uizncw=up:active} 2 up:standby 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: osdmap e41: 6 total, 6 up, 6 in 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: fsmap cephfs:2/2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:replay} 1 up:standby 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/499525465' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:30.093 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:29 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: pgmap v78: 65 pgs: 8 creating+peering, 57 active+clean; 452 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 3.8 KiB/s wr, 11 op/s 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/1014689247' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]': finished 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: mds.? 
[v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] up:boot 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: Dropping low affinity active daemon mds.cephfs.vm07.uizncw in favor of higher affinity standby. 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: Replacing daemon mds.cephfs.vm07.uizncw as rank 1 with standby daemon mds.cephfs.vm08.jwsqrf 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm07.uizncw=up:active} 2 up:standby 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: osdmap e41: 6 total, 6 up, 6 in 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: fsmap cephfs:2/2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:replay} 1 up:standby 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/499525465' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:30.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:30.098 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:30.098 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:29 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:30.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.131+0000 7f9e54cba640 1 -- 192.168.123.107:0/1384967933 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e50073b40 msgr2=0x7f9e50073fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:30.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.131+0000 7f9e54cba640 1 --2- 192.168.123.107:0/1384967933 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e50073b40 0x7f9e50073fa0 secure :-1 s=READY pgs=269 cs=0 l=1 rev1=1 crypto rx=0x7f9e34009a00 tx=0x7f9e3402f290 comp rx=0 tx=0).stop 2026-03-09T19:24:30.130 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.132+0000 7f9e54cba640 1 -- 192.168.123.107:0/1384967933 shutdown_connections 2026-03-09T19:24:30.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.132+0000 7f9e54cba640 1 --2- 192.168.123.107:0/1384967933 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e50073b40 0x7f9e50073fa0 unknown :-1 s=CLOSED pgs=269 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.132+0000 7f9e54cba640 1 --2- 192.168.123.107:0/1384967933 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e500751a0 0x7f9e50073600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.132+0000 7f9e54cba640 1 -- 192.168.123.107:0/1384967933 >> 192.168.123.107:0/1384967933 conn(0x7f9e500fbdb0 msgr2=0x7f9e500fe1f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:30.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.133+0000 7f9e54cba640 1 -- 192.168.123.107:0/1384967933 shutdown_connections 2026-03-09T19:24:30.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.133+0000 7f9e54cba640 1 -- 192.168.123.107:0/1384967933 wait complete. 
2026-03-09T19:24:30.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.133+0000 7f9e54cba640 1 Processor -- start 2026-03-09T19:24:30.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.133+0000 7f9e54cba640 1 -- start start 2026-03-09T19:24:30.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.134+0000 7f9e54cba640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e50073b40 0x7f9e5019e870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:30.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.134+0000 7f9e54cba640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e500751a0 0x7f9e5019edb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:30.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.134+0000 7f9e4f7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e50073b40 0x7f9e5019e870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:30.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.134+0000 7f9e4f7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e50073b40 0x7f9e5019e870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47556/0 (socket says 192.168.123.107:47556) 2026-03-09T19:24:30.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.134+0000 7f9e4effd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e500751a0 0x7f9e5019edb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:30.132 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.134+0000 7f9e4effd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e500751a0 0x7f9e5019edb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:33440/0 (socket says 192.168.123.107:33440) 2026-03-09T19:24:30.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.134+0000 7f9e4effd640 1 -- 192.168.123.107:0/1526049693 learned_addr learned my addr 192.168.123.107:0/1526049693 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:30.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.134+0000 7f9e54cba640 1 -- 192.168.123.107:0/1526049693 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e5019f380 con 0x7f9e50073b40 2026-03-09T19:24:30.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.134+0000 7f9e54cba640 1 -- 192.168.123.107:0/1526049693 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e5019f4f0 con 0x7f9e500751a0 2026-03-09T19:24:30.133 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.135+0000 7f9e4effd640 1 -- 192.168.123.107:0/1526049693 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e50073b40 msgr2=0x7f9e5019e870 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:30.133 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.135+0000 7f9e4effd640 1 --2- 192.168.123.107:0/1526049693 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e50073b40 0x7f9e5019e870 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.133 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.135+0000 7f9e4effd640 1 -- 192.168.123.107:0/1526049693 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9e34009660 con 0x7f9e500751a0 2026-03-09T19:24:30.133 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.135+0000 7f9e4f7fe640 1 --2- 192.168.123.107:0/1526049693 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e50073b40 0x7f9e5019e870 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:24:30.133 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.135+0000 7f9e4effd640 1 --2- 192.168.123.107:0/1526049693 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e500751a0 0x7f9e5019edb0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f9e3402f7a0 tx=0x7f9e34031d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:30.134 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.135+0000 7f9e4cff9640 1 -- 192.168.123.107:0/1526049693 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9e34031ee0 con 0x7f9e500751a0 2026-03-09T19:24:30.134 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.135+0000 7f9e54cba640 1 -- 192.168.123.107:0/1526049693 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9e501a3f30 con 0x7f9e500751a0 2026-03-09T19:24:30.134 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.135+0000 7f9e4cff9640 1 -- 192.168.123.107:0/1526049693 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9e340029a0 con 0x7f9e500751a0 2026-03-09T19:24:30.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.136+0000 7f9e4cff9640 1 -- 192.168.123.107:0/1526049693 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9e34040600 con 0x7f9e500751a0 2026-03-09T19:24:30.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.136+0000 
7f9e54cba640 1 -- 192.168.123.107:0/1526049693 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9e501a4410 con 0x7f9e500751a0 2026-03-09T19:24:30.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.137+0000 7f9e427fc640 1 -- 192.168.123.107:0/1526049693 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9e18005350 con 0x7f9e500751a0 2026-03-09T19:24:30.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.138+0000 7f9e4cff9640 1 -- 192.168.123.107:0/1526049693 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f9e3403f070 con 0x7f9e500751a0 2026-03-09T19:24:30.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.138+0000 7f9e4cff9640 1 --2- 192.168.123.107:0/1526049693 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9e28075fb0 0x7f9e28078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:30.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.138+0000 7f9e4cff9640 1 -- 192.168.123.107:0/1526049693 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f9e340bbec0 con 0x7f9e500751a0 2026-03-09T19:24:30.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.139+0000 7f9e4f7fe640 1 --2- 192.168.123.107:0/1526049693 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9e28075fb0 0x7f9e28078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:30.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.139+0000 7f9e4f7fe640 1 --2- 192.168.123.107:0/1526049693 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9e28075fb0 
0x7f9e28078470 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f9e3c007920 tx=0x7f9e3c008040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:30.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.141+0000 7f9e4cff9640 1 -- 192.168.123.107:0/1526049693 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f9e34085400 con 0x7f9e500751a0 2026-03-09T19:24:30.256 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.258+0000 7f9e427fc640 1 -- 192.168.123.107:0/1526049693 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f9e18005b80 con 0x7f9e500751a0 2026-03-09T19:24:30.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.259+0000 7f9e4cff9640 1 -- 192.168.123.107:0/1526049693 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 10 v10) v1 ==== 94+0+4853 (secure 0 0 0) 0x7f9e3407fe60 con 0x7f9e500751a0 2026-03-09T19:24:30.257 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 10 2026-03-09T19:24:30.259 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.261+0000 7f9e54cba640 1 -- 192.168.123.107:0/1526049693 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9e28075fb0 msgr2=0x7f9e28078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:30.259 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.261+0000 7f9e54cba640 1 --2- 192.168.123.107:0/1526049693 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9e28075fb0 0x7f9e28078470 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f9e3c007920 tx=0x7f9e3c008040 comp rx=0 tx=0).stop 2026-03-09T19:24:30.259 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.262+0000 7f9e54cba640 1 -- 192.168.123.107:0/1526049693 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e500751a0 msgr2=0x7f9e5019edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:30.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.262+0000 7f9e54cba640 1 --2- 192.168.123.107:0/1526049693 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e500751a0 0x7f9e5019edb0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f9e3402f7a0 tx=0x7f9e34031d40 comp rx=0 tx=0).stop 2026-03-09T19:24:30.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.262+0000 7f9e54cba640 1 -- 192.168.123.107:0/1526049693 shutdown_connections 2026-03-09T19:24:30.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.262+0000 7f9e54cba640 1 --2- 192.168.123.107:0/1526049693 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9e28075fb0 0x7f9e28078470 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.262+0000 7f9e54cba640 1 --2- 192.168.123.107:0/1526049693 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e500751a0 0x7f9e5019edb0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.262+0000 7f9e54cba640 1 --2- 192.168.123.107:0/1526049693 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e50073b40 0x7f9e5019e870 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.262+0000 7f9e54cba640 1 -- 192.168.123.107:0/1526049693 >> 192.168.123.107:0/1526049693 conn(0x7f9e500fbdb0 msgr2=0x7f9e500fd6e0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T19:24:30.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.262+0000 7f9e54cba640 1 -- 192.168.123.107:0/1526049693 shutdown_connections 2026-03-09T19:24:30.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.262+0000 7f9e54cba640 1 -- 192.168.123.107:0/1526049693 wait complete. 2026-03-09T19:24:30.268 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:24:30.313 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done' 2026-03-09T19:24:30.509 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:30.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.803+0000 7f5590e2b640 1 -- 192.168.123.107:0/3913700298 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f558c0ff6e0 msgr2=0x7f558c0ffae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:30.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.803+0000 7f5590e2b640 1 --2- 192.168.123.107:0/3913700298 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f558c0ff6e0 0x7f558c0ffae0 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7f55800099b0 tx=0x7f558002f240 comp rx=0 tx=0).stop 2026-03-09T19:24:30.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.803+0000 7f5590e2b640 1 -- 192.168.123.107:0/3913700298 shutdown_connections 2026-03-09T19:24:30.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.803+0000 7f5590e2b640 1 --2- 192.168.123.107:0/3913700298 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f558c1008e0 0x7f558c100d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-09T19:24:30.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.803+0000 7f5590e2b640 1 --2- 192.168.123.107:0/3913700298 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f558c0ff6e0 0x7f558c0ffae0 unknown :-1 s=CLOSED pgs=270 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.803+0000 7f5590e2b640 1 -- 192.168.123.107:0/3913700298 >> 192.168.123.107:0/3913700298 conn(0x7f558c0fae50 msgr2=0x7f558c0fd2b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:30.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.803+0000 7f5590e2b640 1 -- 192.168.123.107:0/3913700298 shutdown_connections 2026-03-09T19:24:30.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.803+0000 7f5590e2b640 1 -- 192.168.123.107:0/3913700298 wait complete. 2026-03-09T19:24:30.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.804+0000 7f5590e2b640 1 Processor -- start 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.806+0000 7f5590e2b640 1 -- start start 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.806+0000 7f5590e2b640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f558c0ff6e0 0x7f558c19ea00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.806+0000 7f5590e2b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f558c1008e0 0x7f558c19ef40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.806+0000 7f5590e2b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f558c19f510 con 0x7f558c1008e0 
2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.806+0000 7f5590e2b640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f558c19f680 con 0x7f558c0ff6e0 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.807+0000 7f558a575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f558c0ff6e0 0x7f558c19ea00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.807+0000 7f558a575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f558c0ff6e0 0x7f558c19ea00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:33458/0 (socket says 192.168.123.107:33458) 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.807+0000 7f558a575640 1 -- 192.168.123.107:0/764182705 learned_addr learned my addr 192.168.123.107:0/764182705 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.807+0000 7f5589d74640 1 --2- 192.168.123.107:0/764182705 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f558c1008e0 0x7f558c19ef40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.808+0000 7f5589d74640 1 -- 192.168.123.107:0/764182705 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f558c0ff6e0 msgr2=0x7f558c19ea00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.808+0000 
7f5589d74640 1 --2- 192.168.123.107:0/764182705 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f558c0ff6e0 0x7f558c19ea00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.808+0000 7f5589d74640 1 -- 192.168.123.107:0/764182705 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5580009660 con 0x7f558c1008e0 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.808+0000 7f5589d74640 1 --2- 192.168.123.107:0/764182705 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f558c1008e0 0x7f558c19ef40 secure :-1 s=READY pgs=271 cs=0 l=1 rev1=1 crypto rx=0x7f557400ece0 tx=0x7f557400c6a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.808+0000 7f55737fe640 1 -- 192.168.123.107:0/764182705 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f557400ede0 con 0x7f558c1008e0 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.808+0000 7f5590e2b640 1 -- 192.168.123.107:0/764182705 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f558c06ab30 con 0x7f558c1008e0 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.808+0000 7f5590e2b640 1 -- 192.168.123.107:0/764182705 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f558c06b080 con 0x7f558c1008e0 2026-03-09T19:24:30.808 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.810+0000 7f55737fe640 1 -- 192.168.123.107:0/764182705 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5574004590 con 0x7f558c1008e0 
2026-03-09T19:24:30.813 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.810+0000 7f55737fe640 1 -- 192.168.123.107:0/764182705 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5574002ea0 con 0x7f558c1008e0 2026-03-09T19:24:30.813 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.810+0000 7f55737fe640 1 -- 192.168.123.107:0/764182705 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f5574010700 con 0x7f558c1008e0 2026-03-09T19:24:30.813 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.810+0000 7f55737fe640 1 --2- 192.168.123.107:0/764182705 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5564076080 0x7f5564078540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:30.813 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.810+0000 7f55737fe640 1 -- 192.168.123.107:0/764182705 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f5574014070 con 0x7f558c1008e0 2026-03-09T19:24:30.813 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.812+0000 7f558a575640 1 --2- 192.168.123.107:0/764182705 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5564076080 0x7f5564078540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:30.813 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.812+0000 7f558a575640 1 --2- 192.168.123.107:0/764182705 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5564076080 0x7f5564078540 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f5580002c30 tx=0x7f558003a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:30.813 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.812+0000 7f5590e2b640 1 -- 192.168.123.107:0/764182705 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5550005350 con 0x7f558c1008e0 2026-03-09T19:24:30.813 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.815+0000 7f55737fe640 1 -- 192.168.123.107:0/764182705 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f5574062220 con 0x7f558c1008e0 2026-03-09T19:24:30.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.967+0000 7f5590e2b640 1 -- 192.168.123.107:0/764182705 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mds versions", "format": "json"} v 0) v1 -- 0x7f5550005600 con 0x7f558c1008e0 2026-03-09T19:24:30.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.968+0000 7f55737fe640 1 -- 192.168.123.107:0/764182705 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mds versions", "format": "json"}]=0 v11) v1 ==== 78+0+98 (secure 0 0 0) 0x7f5574061bc0 con 0x7f558c1008e0 2026-03-09T19:24:30.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.971+0000 7f5590e2b640 1 -- 192.168.123.107:0/764182705 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5564076080 msgr2=0x7f5564078540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:30.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.971+0000 7f5590e2b640 1 --2- 192.168.123.107:0/764182705 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5564076080 0x7f5564078540 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f5580002c30 tx=0x7f558003a040 comp rx=0 tx=0).stop 2026-03-09T19:24:30.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.971+0000 7f5590e2b640 1 -- 
192.168.123.107:0/764182705 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f558c1008e0 msgr2=0x7f558c19ef40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:30.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.971+0000 7f5590e2b640 1 --2- 192.168.123.107:0/764182705 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f558c1008e0 0x7f558c19ef40 secure :-1 s=READY pgs=271 cs=0 l=1 rev1=1 crypto rx=0x7f557400ece0 tx=0x7f557400c6a0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.971+0000 7f5590e2b640 1 -- 192.168.123.107:0/764182705 shutdown_connections 2026-03-09T19:24:30.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.972+0000 7f5590e2b640 1 --2- 192.168.123.107:0/764182705 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5564076080 0x7f5564078540 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.972+0000 7f5590e2b640 1 --2- 192.168.123.107:0/764182705 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f558c1008e0 0x7f558c19ef40 unknown :-1 s=CLOSED pgs=271 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.972+0000 7f5590e2b640 1 --2- 192.168.123.107:0/764182705 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f558c0ff6e0 0x7f558c19ea00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:30.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.972+0000 7f5590e2b640 1 -- 192.168.123.107:0/764182705 >> 192.168.123.107:0/764182705 conn(0x7f558c0fae50 msgr2=0x7f558c0fc710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:30.970 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.972+0000 7f5590e2b640 1 -- 192.168.123.107:0/764182705 shutdown_connections 2026-03-09T19:24:30.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:30.972+0000 7f5590e2b640 1 -- 192.168.123.107:0/764182705 wait complete. 2026-03-09T19:24:30.980 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:24:31.034 INFO:teuthology.run_tasks:Running task fs.pre_upgrade_save... 2026-03-09T19:24:31.038 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 2026-03-09T19:24:31.159 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:30 vm07 ceph-mon[48545]: mds.? [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] up:resolve 2026-03-09T19:24:31.159 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:30 vm07 ceph-mon[48545]: mds.? [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] up:boot 2026-03-09T19:24:31.159 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:30 vm07 ceph-mon[48545]: mds.? [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] up:active 2026-03-09T19:24:31.159 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:30 vm07 ceph-mon[48545]: fsmap cephfs:2/2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:resolve} 2 up:standby 2026-03-09T19:24:31.159 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:24:31.159 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:30 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/1526049693' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T19:24:31.159 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:31.159 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:30 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:31.231 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:30 vm08 ceph-mon[57794]: mds.? [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] up:resolve 2026-03-09T19:24:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:30 vm08 ceph-mon[57794]: mds.? [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] up:boot 2026-03-09T19:24:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:30 vm08 ceph-mon[57794]: mds.? [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] up:active 2026-03-09T19:24:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:30 vm08 ceph-mon[57794]: fsmap cephfs:2/2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:resolve} 2 up:standby 2026-03-09T19:24:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:24:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:30 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/1526049693' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T19:24:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:30 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:31.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.524+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/3352700134 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a64072af0 msgr2=0x7f2a6410ba70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:31.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.524+0000 7f2a6c7c0640 1 --2- 192.168.123.107:0/3352700134 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a64072af0 0x7f2a6410ba70 secure :-1 s=READY pgs=272 cs=0 l=1 rev1=1 crypto rx=0x7f2a5c00b3e0 tx=0x7f2a5c02f730 comp rx=0 tx=0).stop 2026-03-09T19:24:31.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.525+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/3352700134 shutdown_connections 2026-03-09T19:24:31.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.525+0000 7f2a6c7c0640 1 --2- 192.168.123.107:0/3352700134 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a64072af0 0x7f2a6410ba70 unknown :-1 s=CLOSED pgs=272 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:31.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.525+0000 7f2a6c7c0640 1 --2- 192.168.123.107:0/3352700134 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a64072140 0x7f2a64072520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:31.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.525+0000 
7f2a6c7c0640 1 -- 192.168.123.107:0/3352700134 >> 192.168.123.107:0/3352700134 conn(0x7f2a6406c7e0 msgr2=0x7f2a6406cbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:31.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.525+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/3352700134 shutdown_connections 2026-03-09T19:24:31.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.525+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/3352700134 wait complete. 2026-03-09T19:24:31.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.525+0000 7f2a6c7c0640 1 Processor -- start 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6c7c0640 1 -- start start 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6c7c0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a64072140 0x7f2a6407d650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6c7c0640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a64072af0 0x7f2a6407db90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6c7c0640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a6407e0d0 con 0x7f2a64072140 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6c7c0640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a6407e240 con 0x7f2a64072af0 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6a535640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a64072140 0x7f2a6407d650 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6a535640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a64072140 0x7f2a6407d650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47588/0 (socket says 192.168.123.107:47588) 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6a535640 1 -- 192.168.123.107:0/2997729471 learned_addr learned my addr 192.168.123.107:0/2997729471 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6a535640 1 -- 192.168.123.107:0/2997729471 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a64072af0 msgr2=0x7f2a6407db90 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6a535640 1 --2- 192.168.123.107:0/2997729471 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a64072af0 0x7f2a6407db90 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6a535640 1 -- 192.168.123.107:0/2997729471 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2a5c009d00 con 0x7f2a64072140 2026-03-09T19:24:31.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.526+0000 7f2a6a535640 1 --2- 192.168.123.107:0/2997729471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a64072140 0x7f2a6407d650 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7f2a6000b4f0 tx=0x7f2a6000b9c0 comp rx=0 
tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:31.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.527+0000 7f2a5b7fe640 1 -- 192.168.123.107:0/2997729471 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a60004280 con 0x7f2a64072140 2026-03-09T19:24:31.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.527+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/2997729471 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2a64082110 con 0x7f2a64072140 2026-03-09T19:24:31.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.527+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/2997729471 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2a64082660 con 0x7f2a64072140 2026-03-09T19:24:31.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.531+0000 7f2a5b7fe640 1 -- 192.168.123.107:0/2997729471 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2a60002c60 con 0x7f2a64072140 2026-03-09T19:24:31.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.531+0000 7f2a5b7fe640 1 -- 192.168.123.107:0/2997729471 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a60010bb0 con 0x7f2a64072140 2026-03-09T19:24:31.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.534+0000 7f2a5b7fe640 1 -- 192.168.123.107:0/2997729471 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f2a600027a0 con 0x7f2a64072140 2026-03-09T19:24:31.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.535+0000 7f2a5b7fe640 1 --2- 192.168.123.107:0/2997729471 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a48076290 0x7f2a48078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0).connect 2026-03-09T19:24:31.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.535+0000 7f2a5b7fe640 1 -- 192.168.123.107:0/2997729471 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f2a60096e50 con 0x7f2a64072140 2026-03-09T19:24:31.536 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.537+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/2997729471 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2a640822a0 con 0x7f2a64072140 2026-03-09T19:24:31.536 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.537+0000 7f2a69d34640 1 --2- 192.168.123.107:0/2997729471 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a48076290 0x7f2a48078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:31.536 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.537+0000 7f2a69d34640 1 --2- 192.168.123.107:0/2997729471 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a48076290 0x7f2a48078750 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f2a6407ee30 tx=0x7f2a5c002750 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:31.544 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.544+0000 7f2a5b7fe640 1 -- 192.168.123.107:0/2997729471 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f2a640822a0 con 0x7f2a64072140 2026-03-09T19:24:31.679 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.680+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/2997729471 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) 
v1 -- 0x7f2a640822a0 con 0x7f2a64072140 2026-03-09T19:24:31.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.682+0000 7f2a5b7fe640 1 -- 192.168.123.107:0/2997729471 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 11 v11) v1 ==== 94+0+4855 (secure 0 0 0) 0x7f2a640822a0 con 0x7f2a64072140 2026-03-09T19:24:31.681 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:31.682 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":11,"default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/969322030","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":969322030},{"type":"v1","addr":"192.168.123.107:6829","nonce":969322030}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":7},{"gid":14510,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/1298912984","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":1298912984},{"type":"v1","addr":"192.168.123.107:6827","nonce":1298912984}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":11,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:24:30.854097+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24279,"mds_1":24285},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24279":{"gid":24279,"name":"cephfs.vm08.zcaqju","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.108:6825/1034426472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1034426472},{"type":"v1","addr":"192.168.123.108:6825","nonce":1034426472}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24285":{"gid":24285,"name":"cephfs.vm08.jwsqrf","rank":1,"incarnation":9,"state":"up:reconnect","state_seq":3,"addr":"192.168.123.108:6827/3082155067","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":3082155067},{"type":"v1","addr":"192.168.123.108:6827","nonce":3082155067}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1},"id":1}]} 2026-03-09T19:24:31.682 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 11 2026-03-09T19:24:31.684 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.686+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/2997729471 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a48076290 msgr2=0x7f2a48078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:31.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.686+0000 7f2a6c7c0640 1 --2- 192.168.123.107:0/2997729471 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a48076290 0x7f2a48078750 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f2a6407ee30 tx=0x7f2a5c002750 comp rx=0 tx=0).stop 2026-03-09T19:24:31.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.686+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/2997729471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a64072140 msgr2=0x7f2a6407d650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:31.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.686+0000 7f2a6c7c0640 1 --2- 192.168.123.107:0/2997729471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a64072140 0x7f2a6407d650 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7f2a6000b4f0 tx=0x7f2a6000b9c0 comp rx=0 tx=0).stop 2026-03-09T19:24:31.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.687+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/2997729471 shutdown_connections 2026-03-09T19:24:31.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.687+0000 7f2a6c7c0640 1 --2- 192.168.123.107:0/2997729471 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a48076290 0x7f2a48078750 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:31.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.687+0000 7f2a6c7c0640 1 --2- 192.168.123.107:0/2997729471 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f2a64072af0 0x7f2a6407db90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:31.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.687+0000 7f2a6c7c0640 1 --2- 192.168.123.107:0/2997729471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a64072140 0x7f2a6407d650 unknown :-1 s=CLOSED pgs=273 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:31.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.687+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/2997729471 >> 192.168.123.107:0/2997729471 conn(0x7f2a6406c7e0 msgr2=0x7f2a6406faf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:31.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.687+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/2997729471 shutdown_connections 2026-03-09T19:24:31.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:31.687+0000 7f2a6c7c0640 1 -- 192.168.123.107:0/2997729471 wait complete. 2026-03-09T19:24:31.741 DEBUG:tasks.fs:fs fscid=1,name=cephfs state = {'epoch': 11, 'max_mds': 2, 'flags': 18} 2026-03-09T19:24:31.741 INFO:teuthology.run_tasks:Running task ceph-fuse... 2026-03-09T19:24:31.751 INFO:tasks.ceph_fuse:Running ceph_fuse task... 
2026-03-09T19:24:31.751 INFO:tasks.ceph_fuse:config is {'client.0': {}, 'client.1': {}} 2026-03-09T19:24:31.751 INFO:tasks.ceph_fuse:client.0 config is {} 2026-03-09T19:24:31.751 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-09T19:24:31.751 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-09T19:24:31.751 INFO:tasks.ceph_fuse:client.1 config is {} 2026-03-09T19:24:31.751 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-09T19:24:31.751 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-09T19:24:31.751 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:31.751 DEBUG:teuthology.orchestra.run.vm07:> ip netns list 2026-03-09T19:24:31.771 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:31.771 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link delete ceph-brx 2026-03-09T19:24:31.839 INFO:teuthology.orchestra.run.vm07.stderr:Cannot find device "ceph-brx" 2026-03-09T19:24:31.840 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T19:24:31.840 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:31.840 DEBUG:teuthology.orchestra.run.vm08:> ip netns list 2026-03-09T19:24:31.856 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:31.856 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link delete ceph-brx 2026-03-09T19:24:31.924 INFO:teuthology.orchestra.run.vm08.stderr:Cannot find device "ceph-brx" 2026-03-09T19:24:31.925 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T19:24:31.925 INFO:tasks.ceph_fuse:Mounting ceph-fuse clients... 
2026-03-09T19:24:31.925 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-09T19:24:31.925 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs ls 2026-03-09T19:24:31.949 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:31 vm07 ceph-mon[48545]: pgmap v80: 65 pgs: 65 active+clean; 452 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 511 B/s rd, 4.2 KiB/s wr, 11 op/s 2026-03-09T19:24:31.949 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:31 vm07 ceph-mon[48545]: mds.? [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] up:reconnect 2026-03-09T19:24:31.949 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:31 vm07 ceph-mon[48545]: fsmap cephfs:2/2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:reconnect} 2 up:standby 2026-03-09T19:24:31.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:31 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/764182705' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-09T19:24:31.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:31 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:31.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:31 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:31.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:31 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:31.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:31 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:31.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:31 vm07 ceph-mon[48545]: from='client.? 
192.168.123.107:0/2997729471' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T19:24:31.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:31 vm07 ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:32.085 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:32.320 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.322+0000 7f813b47c640 1 -- 192.168.123.107:0/2449770254 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f81341029d0 msgr2=0x7f8134102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:32.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.322+0000 7f813b47c640 1 --2- 192.168.123.107:0/2449770254 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f81341029d0 0x7f8134102e30 secure :-1 s=READY pgs=274 cs=0 l=1 rev1=1 crypto rx=0x7f81240099b0 tx=0x7f812402f220 comp rx=0 tx=0).stop 2026-03-09T19:24:32.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.323+0000 7f813b47c640 1 -- 192.168.123.107:0/2449770254 shutdown_connections 2026-03-09T19:24:32.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.323+0000 7f813b47c640 1 --2- 192.168.123.107:0/2449770254 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f81341029d0 0x7f8134102e30 unknown :-1 s=CLOSED pgs=274 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:32.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.323+0000 7f813b47c640 1 --2- 192.168.123.107:0/2449770254 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f81341089d0 0x7f8134108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:32.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.323+0000 7f813b47c640 1 -- 
192.168.123.107:0/2449770254 >> 192.168.123.107:0/2449770254 conn(0x7f81340fe710 msgr2=0x7f8134100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:32.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.323+0000 7f813b47c640 1 -- 192.168.123.107:0/2449770254 shutdown_connections 2026-03-09T19:24:32.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.323+0000 7f813b47c640 1 -- 192.168.123.107:0/2449770254 wait complete. 2026-03-09T19:24:32.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.323+0000 7f813b47c640 1 Processor -- start 2026-03-09T19:24:32.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.323+0000 7f813b47c640 1 -- start start 2026-03-09T19:24:32.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.324+0000 7f813b47c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f81341029d0 0x7f81341a0640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:32.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.324+0000 7f81391f1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f81341029d0 0x7f81341a0640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:32.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.324+0000 7f81391f1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f81341029d0 0x7f81341a0640 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47614/0 (socket says 192.168.123.107:47614) 2026-03-09T19:24:32.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.324+0000 7f813b47c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f81341089d0 0x7f81341a0b80 unknown :-1 s=NONE pgs=0 cs=0 
l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:32.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.324+0000 7f813b47c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f813419a730 con 0x7f81341029d0 2026-03-09T19:24:32.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.324+0000 7f813b47c640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f813419a8a0 con 0x7f81341089d0 2026-03-09T19:24:32.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.324+0000 7f81391f1640 1 -- 192.168.123.107:0/3267235664 learned_addr learned my addr 192.168.123.107:0/3267235664 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:32.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.324+0000 7f81389f0640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f81341089d0 0x7f81341a0b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:32.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.325+0000 7f81389f0640 1 -- 192.168.123.107:0/3267235664 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f81341029d0 msgr2=0x7f81341a0640 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:32.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.325+0000 7f81389f0640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f81341029d0 0x7f81341a0640 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:32.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.325+0000 7f81389f0640 1 -- 192.168.123.107:0/3267235664 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) 
v3 -- 0x7f8124009660 con 0x7f81341089d0 2026-03-09T19:24:32.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.325+0000 7f81391f1640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f81341029d0 0x7f81341a0640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:24:32.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.325+0000 7f81389f0640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f81341089d0 0x7f81341a0b80 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f812402f730 tx=0x7f8124031cd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:32.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.325+0000 7f81227fc640 1 -- 192.168.123.107:0/3267235664 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f812403d070 con 0x7f81341089d0 2026-03-09T19:24:32.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.325+0000 7f813b47c640 1 -- 192.168.123.107:0/3267235664 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f813419ab20 con 0x7f81341089d0 2026-03-09T19:24:32.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.325+0000 7f813b47c640 1 -- 192.168.123.107:0/3267235664 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f813419b010 con 0x7f81341089d0 2026-03-09T19:24:32.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.326+0000 7f81227fc640 1 -- 192.168.123.107:0/3267235664 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8124031e80 con 0x7f81341089d0 2026-03-09T19:24:32.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.326+0000 7f81227fc640 1 -- 
192.168.123.107:0/3267235664 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8124031260 con 0x7f81341089d0 2026-03-09T19:24:32.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.327+0000 7f813b47c640 1 -- 192.168.123.107:0/3267235664 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f80fc005350 con 0x7f81341089d0 2026-03-09T19:24:32.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.327+0000 7f81227fc640 1 -- 192.168.123.107:0/3267235664 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f8124049050 con 0x7f81341089d0 2026-03-09T19:24:32.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.327+0000 7f81227fc640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8110076170 0x7f8110078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:32.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.328+0000 7f81227fc640 1 -- 192.168.123.107:0/3267235664 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f812402fe20 con 0x7f81341089d0 2026-03-09T19:24:32.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.328+0000 7f81391f1640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8110076170 0x7f8110078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:32.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.328+0000 7f81391f1640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8110076170 0x7f8110078630 secure :-1 s=READY 
pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f81280046b0 tx=0x7f81280092c0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:32.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.330+0000 7f81227fc640 1 -- 192.168.123.107:0/3267235664 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f81240c1050 con 0x7f81341089d0 2026-03-09T19:24:32.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:31 vm08 ceph-mon[57794]: pgmap v80: 65 pgs: 65 active+clean; 452 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 511 B/s rd, 4.2 KiB/s wr, 11 op/s 2026-03-09T19:24:32.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:31 vm08 ceph-mon[57794]: mds.? [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] up:reconnect 2026-03-09T19:24:32.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:31 vm08 ceph-mon[57794]: fsmap cephfs:2/2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:reconnect} 2 up:standby 2026-03-09T19:24:32.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:31 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/764182705' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-09T19:24:32.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:31 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:32.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:31 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:32.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:31 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:32.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:31 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:32.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:31 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/2997729471' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T19:24:32.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:31 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:32.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.436+0000 7f813b47c640 1 -- 192.168.123.107:0/3267235664 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7f80fc0058d0 con 0x7f81341089d0 2026-03-09T19:24:32.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.437+0000 7f81227fc640 1 -- 192.168.123.107:0/3267235664 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v12) v1 ==== 53+0+83 (secure 0 0 0) 0x7f8124085210 con 0x7f81341089d0 2026-03-09T19:24:32.435 INFO:teuthology.orchestra.run.vm07.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-09T19:24:32.437 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.439+0000 7f813b47c640 1 -- 192.168.123.107:0/3267235664 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8110076170 msgr2=0x7f8110078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:32.437 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.439+0000 7f813b47c640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8110076170 0x7f8110078630 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f81280046b0 tx=0x7f81280092c0 comp rx=0 tx=0).stop 2026-03-09T19:24:32.437 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.439+0000 7f813b47c640 1 -- 192.168.123.107:0/3267235664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f81341089d0 msgr2=0x7f81341a0b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T19:24:32.437 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.439+0000 7f813b47c640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f81341089d0 0x7f81341a0b80 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f812402f730 tx=0x7f8124031cd0 comp rx=0 tx=0).stop 2026-03-09T19:24:32.437 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.439+0000 7f813b47c640 1 -- 192.168.123.107:0/3267235664 shutdown_connections 2026-03-09T19:24:32.437 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.439+0000 7f813b47c640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8110076170 0x7f8110078630 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:32.437 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.439+0000 7f813b47c640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f81341089d0 0x7f81341a0b80 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:32.437 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.440+0000 7f813b47c640 1 --2- 192.168.123.107:0/3267235664 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f81341029d0 0x7f81341a0640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:32.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.440+0000 7f813b47c640 1 -- 192.168.123.107:0/3267235664 >> 192.168.123.107:0/3267235664 conn(0x7f81340fe710 msgr2=0x7f81340feaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:32.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.440+0000 7f813b47c640 1 -- 192.168.123.107:0/3267235664 shutdown_connections 2026-03-09T19:24:32.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32.440+0000 
7f813b47c640 1 -- 192.168.123.107:0/3267235664 wait complete. 2026-03-09T19:24:32.499 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-09T19:24:32.499 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None - 2026-03-09T19:24:32.499 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm07.local 2026-03-09T19:24:32.499 INFO:tasks.cephfs.mount:self.client.name = client.0 2026-03-09T19:24:32.499 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-09T19:24:32.499 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-09T19:24:32.499 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-09T19:24:32.499 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-09T19:24:32.499 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.0' 2026-03-09T19:24:32.499 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:32.499 DEBUG:teuthology.orchestra.run.vm07:> ip addr 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: inet 127.0.0.1/8 scope host lo 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: valid_lft forever preferred_lft forever 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: inet6 ::1/128 scope host 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: valid_lft forever preferred_lft forever 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: link/ether 52:55:00:00:00:07 brd ff:ff:ff:ff:ff:ff 
2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: altname enp0s3 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: altname ens3 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: inet 192.168.123.107/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: valid_lft 3158sec preferred_lft 3158sec 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: inet6 fe80::5055:ff:fe00:7/64 scope link noprefixroute 2026-03-09T19:24:32.554 INFO:teuthology.orchestra.run.vm07.stdout: valid_lft forever preferred_lft forever 2026-03-09T19:24:32.555 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-09T19:24:32.555 DEBUG:teuthology.orchestra.run.vm07:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T19:24:32.555 DEBUG:teuthology.orchestra.run.vm07:> set -e 2026-03-09T19:24:32.555 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link add name ceph-brx type bridge 2026-03-09T19:24:32.555 DEBUG:teuthology.orchestra.run.vm07:> sudo ip addr flush dev ceph-brx 2026-03-09T19:24:32.555 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link set ceph-brx up 2026-03-09T19:24:32.555 DEBUG:teuthology.orchestra.run.vm07:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-09T19:24:32.555 DEBUG:teuthology.orchestra.run.vm07:> ') 2026-03-09T19:24:32.630 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T19:24:32.700 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T19:24:32.705 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:32.705 DEBUG:teuthology.orchestra.run.vm07:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-09T19:24:32.773 INFO:teuthology.orchestra.run.vm07.stdout:1 2026-03-09T19:24:32.775 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:32.775 DEBUG:teuthology.orchestra.run.vm07:> ip r 2026-03-09T19:24:32.831 INFO:teuthology.orchestra.run.vm07.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.107 metric 100 2026-03-09T19:24:32.831 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.107 metric 100 2026-03-09T19:24:32.831 INFO:teuthology.orchestra.run.vm07.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-09T19:24:32.831 DEBUG:teuthology.orchestra.run.vm07:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T19:24:32.831 DEBUG:teuthology.orchestra.run.vm07:> set -e 2026-03-09T19:24:32.831 DEBUG:teuthology.orchestra.run.vm07:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-09T19:24:32.831 DEBUG:teuthology.orchestra.run.vm07:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-09T19:24:32.831 DEBUG:teuthology.orchestra.run.vm07:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-09T19:24:32.831 DEBUG:teuthology.orchestra.run.vm07:> ') 2026-03-09T19:24:32.908 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T19:24:32.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:32 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T19:24:32.972 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:32.972 DEBUG:teuthology.orchestra.run.vm07:> ip netns list 2026-03-09T19:24:33.028 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:33.028 DEBUG:teuthology.orchestra.run.vm07:> ip netns list-id 2026-03-09T19:24:33.083 DEBUG:teuthology.orchestra.run.vm07:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T19:24:33.084 DEBUG:teuthology.orchestra.run.vm07:> set -e 2026-03-09T19:24:33.084 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T19:24:33.084 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.0 0 2026-03-09T19:24:33.084 DEBUG:teuthology.orchestra.run.vm07:> ') 2026-03-09T19:24:33.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:33 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T19:24:33.169 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:32 vm07.local ceph-mon[48545]: mds.? [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] up:rejoin 2026-03-09T19:24:33.169 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:32 vm07.local ceph-mon[48545]: mds.? [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] up:standby 2026-03-09T19:24:33.169 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:32 vm07.local ceph-mon[48545]: fsmap cephfs:2/2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:rejoin} 2 up:standby 2026-03-09T19:24:33.169 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:32 vm07.local ceph-mon[48545]: daemon mds.cephfs.vm08.jwsqrf is now active in filesystem cephfs as rank 1 2026-03-09T19:24:33.169 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:32 vm07.local ceph-mon[48545]: from='client.? 
192.168.123.107:0/3267235664' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T19:24:33.169 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:32 vm07.local ceph-mon[48545]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T19:24:33.169 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:32 vm07.local ceph-mon[48545]: Cluster is now healthy 2026-03-09T19:24:33.187 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:33 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T19:24:33.190 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.0' with 192.168.144.1/20 2026-03-09T19:24:33.190 DEBUG:teuthology.orchestra.run.vm07:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T19:24:33.190 DEBUG:teuthology.orchestra.run.vm07:> set -e 2026-03-09T19:24:33.190 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.0 type veth peer name brx.0 2026-03-09T19:24:33.190 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-09T19:24:33.190 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set veth0 up 2026-03-09T19:24:33.190 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set lo up 2026-03-09T19:24:33.190 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip route add default via 192.168.159.254 2026-03-09T19:24:33.190 DEBUG:teuthology.orchestra.run.vm07:> ') 2026-03-09T19:24:33.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:33 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T19:24:33.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:33 stdin-killer INFO: command exited with 
status 0: exiting normally with same code! 2026-03-09T19:24:33.329 DEBUG:teuthology.orchestra.run.vm07:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T19:24:33.329 DEBUG:teuthology.orchestra.run.vm07:> set -e 2026-03-09T19:24:33.329 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link set brx.0 up 2026-03-09T19:24:33.329 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link set dev brx.0 master ceph-brx 2026-03-09T19:24:33.329 DEBUG:teuthology.orchestra.run.vm07:> ') 2026-03-09T19:24:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:32 vm08 ceph-mon[57794]: mds.? [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] up:rejoin 2026-03-09T19:24:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:32 vm08 ceph-mon[57794]: mds.? [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] up:standby 2026-03-09T19:24:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:32 vm08 ceph-mon[57794]: fsmap cephfs:2/2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:rejoin} 2 up:standby 2026-03-09T19:24:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:32 vm08 ceph-mon[57794]: daemon mds.cephfs.vm08.jwsqrf is now active in filesystem cephfs as rank 1 2026-03-09T19:24:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:32 vm08 ceph-mon[57794]: from='client.? 
192.168.123.107:0/3267235664' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T19:24:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:32 vm08 ceph-mon[57794]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T19:24:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:32 vm08 ceph-mon[57794]: Cluster is now healthy 2026-03-09T19:24:33.404 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:33 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T19:24:33.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:33 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T19:24:33.438 INFO:tasks.cephfs.fuse_mount:Client client.0 config is {} 2026-03-09T19:24:33.438 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T19:24:33.438 DEBUG:teuthology.orchestra.run.vm07:> mkdir -p -v /home/ubuntu/cephtest/mnt.0 2026-03-09T19:24:33.494 INFO:teuthology.orchestra.run.vm07.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.0' 2026-03-09T19:24:33.494 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T19:24:33.494 DEBUG:teuthology.orchestra.run.vm07:> chmod 0000 /home/ubuntu/cephtest/mnt.0 2026-03-09T19:24:33.550 DEBUG:teuthology.orchestra.run.vm07:> sudo modprobe fuse 2026-03-09T19:24:33.615 DEBUG:teuthology.orchestra.run.vm07:> cat /proc/self/mounts | awk '{print $2}' 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/proc 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/sys 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/dev 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/security 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/dev/shm 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/dev/pts 2026-03-09T19:24:33.673 
INFO:teuthology.orchestra.run.vm07.stdout:/run 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/cgroup 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/pstore 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/bpf 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/config 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/ 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/nfs/rpc_pipefs 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/selinux 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/dev/hugepages 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/dev/mqueue 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/debug 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/tracing 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-sysctl.service 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/fuse/connections 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-sysusers.service 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-09T19:24:33.673 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/run/user/1000 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/059443f4b768ef346efca483cc020b2840bf1624abc10a9a32025d11c911dd96/merged 2026-03-09T19:24:33.674 
INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/9a00c34cfa7b724946bb2aa594d2edb8817c6cc6b62b479c8bf990b76a138efc/merged 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/run/user/0 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/f101c93e74268d0ad06764a0a50753b45a5ff5f6c9e07333274f3bde7615604c/merged 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/bebcffa1bc1cd70dc1a05fead20ab9017342239be82845c4b616fb9793e931ad/merged 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/2c2f8bbcc08d7568cb140661f9c6586abe17b93a7e2162e33f74f8e60e1968ab/merged 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/f55abd30e82fb9884524db9c5ae4abd47f4add42696839e84af04b3fbe0da873/merged 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/4f722e2a2ed61e80b581ac09fa1de69337c0c22b857bf668b296b2d65c2ab2e2/merged 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/697a502c9a085c464935203970110521dff6470301b2c42c39d46a95029b435e/merged 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/c6259ef0d3cf8ea7a21020045434300e205dd77c025266165d94ee2b99e73dd6/merged 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/cb477645d47bb5e51e206bff8fd4fa549e0157898bd1a3d9e0154ae42fe1e3e8/merged 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/dce3c0fe88f31ed1ac3f22b5509bc5d1c9a191c18410903cc99e5a5cfb4710b4/merged 2026-03-09T19:24:33.674 
INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/4f42ec6282f72ee15b201c1f578f10dc6379630ac200ad52e39f3e7cea817b4e/merged 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/95bce347ecd670ba9960a9b475efae02e920fe014abf658f991662aca596d533/merged 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/run/netns 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run.vm07.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T19:24:33.674 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:33.674 DEBUG:teuthology.orchestra.run.vm07:> ls /sys/fs/fuse/connections 2026-03-09T19:24:33.732 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-09T19:24:33.732 DEBUG:teuthology.orchestra.run.vm07:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.0 --id 0) 2026-03-09T19:24:33.775 DEBUG:teuthology.orchestra.run.vm07:> sudo modprobe fuse 2026-03-09T19:24:33.802 DEBUG:teuthology.orchestra.run.vm07:> cat /proc/self/mounts | awk '{print $2}' 2026-03-09T19:24:33.847 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm07.stderr:2026-03-09T19:24:33.848+0000 7f006d410580 -1 init, newargv = 0x555612c85010 newargc=15 2026-03-09T19:24:33.847 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm07.stderr:ceph-fuse[96354]: starting ceph client 2026-03-09T19:24:33.856 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm07.stderr:ceph-fuse[96354]: starting fuse 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/proc 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/sys 2026-03-09T19:24:33.868 
INFO:teuthology.orchestra.run.vm07.stdout:/dev 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/security 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/dev/shm 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/dev/pts 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/run 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/cgroup 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/pstore 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/bpf 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/config 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/ 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/nfs/rpc_pipefs 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/selinux 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/dev/hugepages 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/dev/mqueue 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/debug 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/tracing 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-sysctl.service 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/fuse/connections 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-sysusers.service 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/run/user/1000 2026-03-09T19:24:33.868 
INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/059443f4b768ef346efca483cc020b2840bf1624abc10a9a32025d11c911dd96/merged 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/9a00c34cfa7b724946bb2aa594d2edb8817c6cc6b62b479c8bf990b76a138efc/merged 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/run/user/0 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/f101c93e74268d0ad06764a0a50753b45a5ff5f6c9e07333274f3bde7615604c/merged 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/bebcffa1bc1cd70dc1a05fead20ab9017342239be82845c4b616fb9793e931ad/merged 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/2c2f8bbcc08d7568cb140661f9c6586abe17b93a7e2162e33f74f8e60e1968ab/merged 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/f55abd30e82fb9884524db9c5ae4abd47f4add42696839e84af04b3fbe0da873/merged 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/4f722e2a2ed61e80b581ac09fa1de69337c0c22b857bf668b296b2d65c2ab2e2/merged 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/697a502c9a085c464935203970110521dff6470301b2c42c39d46a95029b435e/merged 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/c6259ef0d3cf8ea7a21020045434300e205dd77c025266165d94ee2b99e73dd6/merged 2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/cb477645d47bb5e51e206bff8fd4fa549e0157898bd1a3d9e0154ae42fe1e3e8/merged 
2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/dce3c0fe88f31ed1ac3f22b5509bc5d1c9a191c18410903cc99e5a5cfb4710b4/merged
2026-03-09T19:24:33.868 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/4f42ec6282f72ee15b201c1f578f10dc6379630ac200ad52e39f3e7cea817b4e/merged
2026-03-09T19:24:33.869 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/95bce347ecd670ba9960a9b475efae02e920fe014abf658f991662aca596d533/merged
2026-03-09T19:24:33.869 INFO:teuthology.orchestra.run.vm07.stdout:/run/netns
2026-03-09T19:24:33.869 INFO:teuthology.orchestra.run.vm07.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0
2026-03-09T19:24:33.869 INFO:teuthology.orchestra.run.vm07.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0
2026-03-09T19:24:33.869 INFO:teuthology.orchestra.run.vm07.stdout:/home/ubuntu/cephtest/mnt.0
2026-03-09T19:24:33.869 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T19:24:33.869 DEBUG:teuthology.orchestra.run.vm07:> ls /sys/fs/fuse/connections
2026-03-09T19:24:33.926 INFO:teuthology.orchestra.run.vm07.stdout:97
2026-03-09T19:24:33.926 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [97]
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:> sudo stdin-killer -- python3 -c '
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:> import glob
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:> import re
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:> import os
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:> import subprocess
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:>
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:> def _find_admin_socket(client_name):
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:>     asok_path = "/var/run/ceph/ceph-client.0.*.asok"
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:>     files = glob.glob(asok_path)
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:>     mountpoint = "/home/ubuntu/cephtest/mnt.0"
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:>
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:>     # Given a non-glob path, it better be there
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:>     if "*" not in asok_path:
2026-03-09T19:24:33.926 DEBUG:teuthology.orchestra.run.vm07:>         assert(len(files) == 1)
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:>         return files[0]
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:>
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:>     for f in files:
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:>         pid = re.match(".*\.(\d+)\.asok$", f).group(1)
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:>         if os.path.exists("/proc/{0}".format(pid)):
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:>             with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f:
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:>                 contents = proc_f.read()
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:>                 if mountpoint in contents:
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:>                     return f
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:>     raise RuntimeError("Client socket {0} not found".format(client_name))
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:>
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:> print(_find_admin_socket("client.0"))
2026-03-09T19:24:33.927 DEBUG:teuthology.orchestra.run.vm07:> '
2026-03-09T19:24:34.030 INFO:teuthology.orchestra.run.vm07.stdout:/var/run/ceph/ceph-client.0.96354.asok
2026-03-09T19:24:34.032 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T19:24:34.038 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.0.96354.asok
2026-03-09T19:24:34.038 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T19:24:34.038 DEBUG:teuthology.orchestra.run.vm07:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.0.96354.asok status
2026-03-09T19:24:34.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:33 vm07.local ceph-mon[48545]: pgmap v81: 65 pgs: 65 active+clean; 452 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 530 B/s rd, 3.4 KiB/s wr, 11 op/s
2026-03-09T19:24:34.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:33 vm07.local ceph-mon[48545]: mds.? [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] up:active
2026-03-09T19:24:34.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:33 vm07.local ceph-mon[48545]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:active} 2 up:standby
2026-03-09T19:24:34.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:33 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:24:34.148 INFO:teuthology.orchestra.run.vm07.stdout:{
2026-03-09T19:24:34.148 INFO:teuthology.orchestra.run.vm07.stdout:    "metadata": {
2026-03-09T19:24:34.148 INFO:teuthology.orchestra.run.vm07.stdout:        "ceph_sha1": "ab47f43c099b2cbae6e21342fe673ce251da54d6",
2026-03-09T19:24:34.148 INFO:teuthology.orchestra.run.vm07.stdout:        "ceph_version": "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)",
2026-03-09T19:24:34.148 INFO:teuthology.orchestra.run.vm07.stdout:        "entity_id": "0",
2026-03-09T19:24:34.148 INFO:teuthology.orchestra.run.vm07.stdout:        "hostname": "vm07.local",
2026-03-09T19:24:34.148 INFO:teuthology.orchestra.run.vm07.stdout:        "mount_point": "/home/ubuntu/cephtest/mnt.0",
2026-03-09T19:24:34.148 INFO:teuthology.orchestra.run.vm07.stdout:        "pid": "96354",
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:        "root": "/"
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    },
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "dentry_count": 0,
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "dentry_pinned_count": 0,
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "id": 14532,
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "inst": {
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:        "name": {
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:            "type": "client",
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:            "num": 14532
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:        },
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:        "addr": {
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:            "type": "v1",
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:            "addr": "192.168.144.1:0",
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:            "nonce": 1264854118
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:        }
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    },
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "addr": {
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:        "type": "v1",
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:        "addr": "192.168.144.1:0",
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:        "nonce": 1264854118
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    },
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "inst_str": "client.14532 192.168.144.1:0/1264854118",
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "addr_str": "192.168.144.1:0/1264854118",
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "inode_count": 1,
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "mds_epoch": 13,
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "osd_epoch": 41,
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "osd_epoch_barrier": 0,
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "blocklisted": false,
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:    "fs_name": "cephfs"
2026-03-09T19:24:34.149 INFO:teuthology.orchestra.run.vm07.stdout:}
2026-03-09T19:24:34.155 DEBUG:tasks.ceph_fuse:passing mntargs=[]
2026-03-09T19:24:34.155 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs ls
2026-03-09T19:24:34.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:33 vm08 ceph-mon[57794]: pgmap v81: 65 pgs: 65 active+clean; 452 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 530 B/s rd, 3.4 KiB/s wr, 11 op/s
2026-03-09T19:24:34.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:33 vm08 ceph-mon[57794]: mds.? [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] up:active
2026-03-09T19:24:34.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:33 vm08 ceph-mon[57794]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:active} 2 up:standby
2026-03-09T19:24:34.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:33 vm08 ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:24:34.355 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config
2026-03-09T19:24:34.607 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.608+0000 7fd830a92640 1 -- 192.168.123.107:0/4259276240 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd82c1029d0 msgr2=0x7fd82c102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:24:34.607 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.608+0000 7fd830a92640 1 --2- 192.168.123.107:0/4259276240 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd82c1029d0 0x7fd82c102e30 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7fd8200099b0 tx=0x7fd82002f220 comp rx=0 tx=0).stop
2026-03-09T19:24:34.607 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.609+0000 7fd830a92640 1 -- 192.168.123.107:0/4259276240 shutdown_connections
2026-03-09T19:24:34.607 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.609+0000 7fd830a92640 1 --2- 192.168.123.107:0/4259276240 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd82c1029d0 0x7fd82c102e30 unknown :-1 s=CLOSED pgs=277 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:34.607 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.609+0000 7fd830a92640 1 --2- 192.168.123.107:0/4259276240 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd82c1089d0 0x7fd82c108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:34.607 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.609+0000 7fd830a92640 1 -- 192.168.123.107:0/4259276240 >> 192.168.123.107:0/4259276240 conn(0x7fd82c0fe710 msgr2=0x7fd82c100b30 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:24:34.607 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.609+0000 7fd830a92640 1 -- 192.168.123.107:0/4259276240 shutdown_connections
2026-03-09T19:24:34.607 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.610+0000 7fd830a92640 1 -- 192.168.123.107:0/4259276240 wait complete.
2026-03-09T19:24:34.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.610+0000 7fd830a92640 1 Processor -- start
2026-03-09T19:24:34.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.610+0000 7fd830a92640 1 -- start start
2026-03-09T19:24:34.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.611+0000 7fd830a92640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd82c1029d0 0x7fd82c1a0620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:24:34.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.611+0000 7fd830a92640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd82c1089d0 0x7fd82c1a0b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:24:34.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.611+0000 7fd829d74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd82c1089d0 0x7fd82c1a0b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:24:34.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.611+0000 7fd829d74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd82c1089d0 0x7fd82c1a0b60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47620/0 (socket says 192.168.123.107:47620)
2026-03-09T19:24:34.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.611+0000 7fd829d74640 1 -- 192.168.123.107:0/2858711185 learned_addr learned my addr 192.168.123.107:0/2858711185 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:24:34.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.611+0000 7fd82a575640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd82c1029d0 0x7fd82c1a0620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:24:34.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.612+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd82c1a1180 con 0x7fd82c1089d0
2026-03-09T19:24:34.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.612+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd82c19a710 con 0x7fd82c1029d0
2026-03-09T19:24:34.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.612+0000 7fd829d74640 1 -- 192.168.123.107:0/2858711185 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd82c1029d0 msgr2=0x7fd82c1a0620 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:24:34.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.612+0000 7fd829d74640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd82c1029d0 0x7fd82c1a0620 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:34.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.612+0000 7fd829d74640 1 -- 192.168.123.107:0/2858711185 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd820009660 con 0x7fd82c1089d0
2026-03-09T19:24:34.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.612+0000 7fd82a575640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd82c1029d0 0x7fd82c1a0620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-09T19:24:34.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.612+0000 7fd829d74640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd82c1089d0 0x7fd82c1a0b60 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7fd82002f730 tx=0x7fd820004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:24:34.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.612+0000 7fd8137fe640 1 -- 192.168.123.107:0/2858711185 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd82003d070 con 0x7fd82c1089d0
2026-03-09T19:24:34.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.612+0000 7fd8137fe640 1 -- 192.168.123.107:0/2858711185 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd820038730 con 0x7fd82c1089d0
2026-03-09T19:24:34.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.612+0000 7fd8137fe640 1 -- 192.168.123.107:0/2858711185 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd820041620 con 0x7fd82c1089d0
2026-03-09T19:24:34.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.612+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd82c19a990 con 0x7fd82c1089d0
2026-03-09T19:24:34.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.613+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd82c19ae00 con 0x7fd82c1089d0
2026-03-09T19:24:34.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.614+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd7f8005350 con 0x7fd82c1089d0
2026-03-09T19:24:34.616 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.615+0000 7fd8137fe640 1 -- 192.168.123.107:0/2858711185 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fd8200388a0 con 0x7fd82c1089d0
2026-03-09T19:24:34.616 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.615+0000 7fd8137fe640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd7fc076290 0x7fd7fc078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:24:34.616 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.615+0000 7fd8137fe640 1 -- 192.168.123.107:0/2858711185 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fd8200bbff0 con 0x7fd82c1089d0
2026-03-09T19:24:34.616 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.617+0000 7fd82a575640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd7fc076290 0x7fd7fc078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:24:34.616 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.617+0000 7fd8137fe640 1 -- 192.168.123.107:0/2858711185 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fd8200f0820 con 0x7fd82c1089d0
2026-03-09T19:24:34.616 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.618+0000 7fd82a575640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd7fc076290 0x7fd7fc078750 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fd814009fd0 tx=0x7fd814009290 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:24:34.718 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.719+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7fd7f8005e10 con 0x7fd82c1089d0
2026-03-09T19:24:34.718 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.720+0000 7fd8137fe640 1 -- 192.168.123.107:0/2858711185 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v13) v1 ==== 53+0+83 (secure 0 0 0) 0x7fd820085660 con 0x7fd82c1089d0
2026-03-09T19:24:34.719 INFO:teuthology.orchestra.run.vm07.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ]
2026-03-09T19:24:34.721 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.723+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd7fc076290 msgr2=0x7fd7fc078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:24:34.721 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.723+0000 7fd830a92640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd7fc076290 0x7fd7fc078750 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fd814009fd0 tx=0x7fd814009290 comp rx=0 tx=0).stop
2026-03-09T19:24:34.721 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.723+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd82c1089d0 msgr2=0x7fd82c1a0b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:24:34.721 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.724+0000 7fd830a92640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd82c1089d0 0x7fd82c1a0b60 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7fd82002f730 tx=0x7fd820004290 comp rx=0 tx=0).stop
2026-03-09T19:24:34.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.724+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 shutdown_connections
2026-03-09T19:24:34.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.724+0000 7fd830a92640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd7fc076290 0x7fd7fc078750 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:34.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.724+0000 7fd830a92640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd82c1089d0 0x7fd82c1a0b60 unknown :-1 s=CLOSED pgs=278 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:34.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.724+0000 7fd830a92640 1 --2- 192.168.123.107:0/2858711185 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd82c1029d0 0x7fd82c1a0620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:24:34.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.724+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 >> 192.168.123.107:0/2858711185 conn(0x7fd82c0fe710 msgr2=0x7fd82c0feaf0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:24:34.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.725+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 shutdown_connections
2026-03-09T19:24:34.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:34.725+0000 7fd830a92640 1 -- 192.168.123.107:0/2858711185 wait complete.
2026-03-09T19:24:34.791 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster
2026-03-09T19:24:34.792 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None -
2026-03-09T19:24:34.792 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm08.local
2026-03-09T19:24:34.792 INFO:tasks.cephfs.mount:self.client.name = client.1
2026-03-09T19:24:34.792 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.1
2026-03-09T19:24:34.792 INFO:tasks.cephfs.mount:self.cephfs_name = None
2026-03-09T19:24:34.792 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None
2026-03-09T19:24:34.792 INFO:tasks.cephfs.mount:self.client_keyring_path = None
2026-03-09T19:24:34.792 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.1'
2026-03-09T19:24:34.792 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T19:24:34.792 DEBUG:teuthology.orchestra.run.vm08:> ip addr
2026-03-09T19:24:34.809 INFO:teuthology.orchestra.run.vm08.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
2026-03-09T19:24:34.809 INFO:teuthology.orchestra.run.vm08.stdout:    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:    inet 127.0.0.1/8 scope host lo
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:       valid_lft forever preferred_lft forever
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:    inet6 ::1/128 scope host
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:       valid_lft forever preferred_lft forever
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:    link/ether 52:55:00:00:00:08 brd ff:ff:ff:ff:ff:ff
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:    altname enp0s3
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:    altname ens3
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:    inet 192.168.123.108/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:       valid_lft 3125sec preferred_lft 3125sec
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:    inet6 fe80::5055:ff:fe00:8/64 scope link noprefixroute
2026-03-09T19:24:34.810 INFO:teuthology.orchestra.run.vm08.stdout:       valid_lft forever preferred_lft forever
2026-03-09T19:24:34.810 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20
2026-03-09T19:24:34.810 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-09T19:24:34.810 DEBUG:teuthology.orchestra.run.vm08:> set -e
2026-03-09T19:24:34.810 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link add name ceph-brx type bridge
2026-03-09T19:24:34.810 DEBUG:teuthology.orchestra.run.vm08:> sudo ip addr flush dev ceph-brx
2026-03-09T19:24:34.810 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link set ceph-brx up
2026-03-09T19:24:34.810 DEBUG:teuthology.orchestra.run.vm08:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx
2026-03-09T19:24:34.810 DEBUG:teuthology.orchestra.run.vm08:> ')
2026-03-09T19:24:34.889 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:34 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-09T19:24:34.962 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:34 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T19:24:34.965 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T19:24:34.965 DEBUG:teuthology.orchestra.run.vm08:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward
2026-03-09T19:24:34.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:34 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/2858711185' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
2026-03-09T19:24:35.035 INFO:teuthology.orchestra.run.vm08.stdout:1
2026-03-09T19:24:35.036 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T19:24:35.036 DEBUG:teuthology.orchestra.run.vm08:> ip r
2026-03-09T19:24:35.092 INFO:teuthology.orchestra.run.vm08.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.108 metric 100
2026-03-09T19:24:35.092 INFO:teuthology.orchestra.run.vm08.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.108 metric 100
2026-03-09T19:24:35.092 INFO:teuthology.orchestra.run.vm08.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254
2026-03-09T19:24:35.092 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-09T19:24:35.092 DEBUG:teuthology.orchestra.run.vm08:> set -e
2026-03-09T19:24:35.092 DEBUG:teuthology.orchestra.run.vm08:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT
2026-03-09T19:24:35.092 DEBUG:teuthology.orchestra.run.vm08:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT
2026-03-09T19:24:35.092 DEBUG:teuthology.orchestra.run.vm08:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE
2026-03-09T19:24:35.092 DEBUG:teuthology.orchestra.run.vm08:> ')
2026-03-09T19:24:35.170 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:35 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-09T19:24:35.178 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:34 vm08 ceph-mon[57794]: from='client.? 192.168.123.107:0/2858711185' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
2026-03-09T19:24:35.230 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:35 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T19:24:35.235 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T19:24:35.235 DEBUG:teuthology.orchestra.run.vm08:> ip netns list
2026-03-09T19:24:35.290 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T19:24:35.290 DEBUG:teuthology.orchestra.run.vm08:> ip netns list-id
2026-03-09T19:24:35.346 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-09T19:24:35.346 DEBUG:teuthology.orchestra.run.vm08:> set -e
2026-03-09T19:24:35.346 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.1
2026-03-09T19:24:35.346 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.1 0
2026-03-09T19:24:35.346 DEBUG:teuthology.orchestra.run.vm08:> ')
2026-03-09T19:24:35.420 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:35 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-09T19:24:35.445 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:35 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T19:24:35.449 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.1' with 192.168.144.1/20
2026-03-09T19:24:35.449 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-09T19:24:35.449 DEBUG:teuthology.orchestra.run.vm08:> set -e
2026-03-09T19:24:35.449 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.1 type veth peer name brx.0
2026-03-09T19:24:35.449 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0
2026-03-09T19:24:35.449 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set veth0 up
2026-03-09T19:24:35.449 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set lo up
2026-03-09T19:24:35.449 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip route add default via 192.168.159.254
2026-03-09T19:24:35.449 DEBUG:teuthology.orchestra.run.vm08:> ')
2026-03-09T19:24:35.525 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:35 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-09T19:24:35.596 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:35 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T19:24:35.598 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-09T19:24:35.598 DEBUG:teuthology.orchestra.run.vm08:> set -e
2026-03-09T19:24:35.598 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link set brx.0 up
2026-03-09T19:24:35.598 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link set dev brx.0 master ceph-brx
2026-03-09T19:24:35.598 DEBUG:teuthology.orchestra.run.vm08:> ')
2026-03-09T19:24:35.671 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:35 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-09T19:24:35.696 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:35 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T19:24:35.700 INFO:tasks.cephfs.fuse_mount:Client client.1 config is {}
2026-03-09T19:24:35.700 INFO:teuthology.orchestra.run:Running command with timeout 60
2026-03-09T19:24:35.700 DEBUG:teuthology.orchestra.run.vm08:> mkdir -p -v /home/ubuntu/cephtest/mnt.1
2026-03-09T19:24:35.758 INFO:teuthology.orchestra.run.vm08.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.1'
2026-03-09T19:24:35.758 INFO:teuthology.orchestra.run:Running command with timeout 60
2026-03-09T19:24:35.759 DEBUG:teuthology.orchestra.run.vm08:> chmod 0000 /home/ubuntu/cephtest/mnt.1
2026-03-09T19:24:35.813 DEBUG:teuthology.orchestra.run.vm08:> sudo modprobe fuse
2026-03-09T19:24:35.881 DEBUG:teuthology.orchestra.run.vm08:> cat /proc/self/mounts | awk '{print $2}'
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/proc
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/sys
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/dev
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/security
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/dev/shm
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/dev/pts
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/run
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/cgroup
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/pstore
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/bpf
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/config
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/nfs/rpc_pipefs
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/selinux
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/dev/hugepages
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/dev/mqueue
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/debug
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/tracing
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysctl.service
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/fuse/connections
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysusers.service
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup.service
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/1000
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/0
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/03a644a6cd4f1c6809db0aef903ff9d453e76a75cb73b404563e13b82e426d80/merged
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/1e1c4b6fdf87242253bae101eb1db0f87d481c55c0813767070024be4a54756b/merged
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/80ae5a51998af1524bcbe0392863498731bfa13ce9220b877587dc8140f9cace/merged
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/5961426996c93627f45825ff0656a0006164d53551fb5f36aeaa975301ab521e/merged
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/fbd26c2e1ca08b71f040ad3931fbb2b6359b74e7705231eaa124ec28f140ac67/merged
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/a62638e0048c2d8ee1518a2b3b7e64cd6236645c74b464c33f330d7e04e62c58/merged
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/e693c23618f810b8944fedd9ba8deaa8f268cc79b3ad2362518b8c5a4dcf7926/merged
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/fa6c29b3708950de574db206a75195fd06cbaebbd746a717f2fe7c121617c222/merged
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/b8554f35ef4d5684397d23fec76f14e9b10a93ce46dac3573c19cc8f880587e1/merged
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/66ff80c91afd479a27d7ec4f937e91637978f14a82b34803a51db659e3d549a4/merged
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1
2026-03-09T19:24:35.940 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T19:24:35.940 DEBUG:teuthology.orchestra.run.vm08:> ls /sys/fs/fuse/connections
2026-03-09T19:24:35.998 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: []
2026-03-09T19:24:35.998 DEBUG:teuthology.orchestra.run.vm08:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.1 --id 1)
2026-03-09T19:24:36.039 DEBUG:teuthology.orchestra.run.vm08:> sudo modprobe fuse
2026-03-09T19:24:36.063 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:35 vm08.local ceph-mon[57794]: pgmap v82: 65 pgs: 65 active+clean; 452 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 3.8 KiB/s rd, 3.1 KiB/s wr, 13 op/s
2026-03-09T19:24:36.069 DEBUG:teuthology.orchestra.run.vm08:> cat /proc/self/mounts | awk '{print $2}'
2026-03-09T19:24:36.112 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm08.stderr:2026-03-09T19:24:36.111+0000 7f4e3d40b580 -1 init, newargv = 0x55ce196f6870 newargc=15
2026-03-09T19:24:36.112 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm08.stderr:ceph-fuse[83057]: starting ceph client
2026-03-09T19:24:36.121 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm08.stderr:ceph-fuse[83057]: starting fuse
2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/proc
2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/sys
2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/dev
2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/security
2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/dev/shm
2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/dev/pts
2026-03-09T19:24:36.135
INFO:teuthology.orchestra.run.vm08.stdout:/run 2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/cgroup 2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/pstore 2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/bpf 2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/config 2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/ 2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/nfs/rpc_pipefs 2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/selinux 2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T19:24:36.135 INFO:teuthology.orchestra.run.vm08.stdout:/dev/hugepages 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/dev/mqueue 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/debug 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/tracing 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysctl.service 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/fuse/connections 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysusers.service 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/1000 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/0 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay 2026-03-09T19:24:36.136 
INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/03a644a6cd4f1c6809db0aef903ff9d453e76a75cb73b404563e13b82e426d80/merged 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/1e1c4b6fdf87242253bae101eb1db0f87d481c55c0813767070024be4a54756b/merged 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/80ae5a51998af1524bcbe0392863498731bfa13ce9220b877587dc8140f9cace/merged 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/5961426996c93627f45825ff0656a0006164d53551fb5f36aeaa975301ab521e/merged 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/fbd26c2e1ca08b71f040ad3931fbb2b6359b74e7705231eaa124ec28f140ac67/merged 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/a62638e0048c2d8ee1518a2b3b7e64cd6236645c74b464c33f330d7e04e62c58/merged 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/e693c23618f810b8944fedd9ba8deaa8f268cc79b3ad2362518b8c5a4dcf7926/merged 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/fa6c29b3708950de574db206a75195fd06cbaebbd746a717f2fe7c121617c222/merged 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/b8554f35ef4d5684397d23fec76f14e9b10a93ce46dac3573c19cc8f880587e1/merged 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/66ff80c91afd479a27d7ec4f937e91637978f14a82b34803a51db659e3d549a4/merged 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T19:24:36.136 
INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run.vm08.stdout:/home/ubuntu/cephtest/mnt.1 2026-03-09T19:24:36.136 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:36.136 DEBUG:teuthology.orchestra.run.vm08:> ls /sys/fs/fuse/connections 2026-03-09T19:24:36.192 INFO:teuthology.orchestra.run.vm08.stdout:90 2026-03-09T19:24:36.192 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [90] 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> sudo stdin-killer -- python3 -c ' 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> import glob 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> import re 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> import os 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> import subprocess 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> def _find_admin_socket(client_name): 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> asok_path = "/var/run/ceph/ceph-client.1.*.asok" 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> files = glob.glob(asok_path) 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> mountpoint = "/home/ubuntu/cephtest/mnt.1" 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> # Given a non-glob path, it better be there 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> if "*" not in asok_path: 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> assert(len(files) == 1) 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> return files[0] 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> for f in files: 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> pid = 
re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> contents = proc_f.read() 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> if mountpoint in contents: 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> return f 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> print(_find_admin_socket("client.1")) 2026-03-09T19:24:36.192 DEBUG:teuthology.orchestra.run.vm08:> ' 2026-03-09T19:24:36.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:35 vm07.local ceph-mon[48545]: pgmap v82: 65 pgs: 65 active+clean; 452 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 3.8 KiB/s rd, 3.1 KiB/s wr, 13 op/s 2026-03-09T19:24:36.294 INFO:teuthology.orchestra.run.vm08.stdout:/var/run/ceph/ceph-client.1.83057.asok 2026-03-09T19:24:36.296 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-09T19:24:36 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T19:24:36.301 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.1.83057.asok 2026-03-09T19:24:36.301 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:36.301 DEBUG:teuthology.orchestra.run.vm08:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.1.83057.asok status 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout:{ 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "metadata": { 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "ceph_sha1": "ab47f43c099b2cbae6e21342fe673ce251da54d6", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "ceph_version": "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "entity_id": "1", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "hostname": "vm08.local", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.1", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "pid": "83057", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "root": "/" 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "dentry_count": 0, 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "dentry_pinned_count": 0, 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "id": 24305, 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "inst": { 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "name": { 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "type": "client", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "num": 24305 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-09T19:24:36.407 
INFO:teuthology.orchestra.run.vm08.stdout: "addr": { 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "type": "v1", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "addr": "192.168.144.1:0", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "nonce": 3035669578 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: } 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "addr": { 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "type": "v1", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "addr": "192.168.144.1:0", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "nonce": 3035669578 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "inst_str": "client.24305 192.168.144.1:0/3035669578", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "addr_str": "192.168.144.1:0/3035669578", 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "inode_count": 1, 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "mds_epoch": 13, 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "osd_epoch": 41, 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "osd_epoch_barrier": 0, 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "blocklisted": false, 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout: "fs_name": "cephfs" 2026-03-09T19:24:36.407 INFO:teuthology.orchestra.run.vm08.stdout:} 2026-03-09T19:24:36.414 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:36.414 DEBUG:teuthology.orchestra.run.vm07:> stat --file-system '--printf=%T 2026-03-09T19:24:36.414 DEBUG:teuthology.orchestra.run.vm07:> ' -- /home/ubuntu/cephtest/mnt.0 2026-03-09T19:24:36.430 
INFO:teuthology.orchestra.run.vm07.stdout:fuseblk 2026-03-09T19:24:36.430 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.0 2026-03-09T19:24:36.430 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:36.430 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.0 2026-03-09T19:24:36.498 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:36.498 DEBUG:teuthology.orchestra.run.vm08:> stat --file-system '--printf=%T 2026-03-09T19:24:36.498 DEBUG:teuthology.orchestra.run.vm08:> ' -- /home/ubuntu/cephtest/mnt.1 2026-03-09T19:24:36.518 INFO:teuthology.orchestra.run.vm08.stdout:fuseblk 2026-03-09T19:24:36.518 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.1 2026-03-09T19:24:36.518 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:24:36.518 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.1 2026-03-09T19:24:38.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:37 vm07.local ceph-mon[48545]: pgmap v83: 65 pgs: 65 active+clean; 453 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 6.3 KiB/s rd, 2.5 KiB/s wr, 13 op/s 2026-03-09T19:24:38.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:37 vm08.local ceph-mon[57794]: pgmap v83: 65 pgs: 65 active+clean; 453 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 6.3 KiB/s rd, 2.5 KiB/s wr, 13 op/s 2026-03-09T19:24:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:39 vm07.local ceph-mon[48545]: pgmap v84: 65 pgs: 65 active+clean; 453 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 7.5 KiB/s rd, 1.4 KiB/s wr, 11 op/s 2026-03-09T19:24:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:39 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:39 vm08.local ceph-mon[57794]: pgmap 
v84: 65 pgs: 65 active+clean; 453 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 7.5 KiB/s rd, 1.4 KiB/s wr, 11 op/s 2026-03-09T19:24:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:39 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:40.605 INFO:teuthology.run_tasks:Running task print... 2026-03-09T19:24:40.608 INFO:teuthology.task.print:**** done client 2026-03-09T19:24:40.608 INFO:teuthology.run_tasks:Running task parallel... 2026-03-09T19:24:40.611 INFO:teuthology.task.parallel:starting parallel... 2026-03-09T19:24:40.611 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-09T19:24:40.611 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-09T19:24:40.612 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm07.local 2026-03-09T19:24:40.612 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mgr mgr/orchestrator/fail_fs false || true' 2026-03-09T19:24:40.612 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-09T19:24:40.612 INFO:teuthology.task.sequential:In sequential, running task workunit... 2026-03-09T19:24:40.614 INFO:tasks.workunit:Pulling workunits from ref 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-09T19:24:40.614 INFO:tasks.workunit:Making a separate scratch dir for every client... 
2026-03-09T19:24:40.614 INFO:tasks.workunit:timeout=3h 2026-03-09T19:24:40.614 INFO:tasks.workunit:cleanup=True 2026-03-09T19:24:40.614 DEBUG:teuthology.orchestra.run.vm07:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-09T19:24:40.638 INFO:teuthology.orchestra.run.vm07.stdout: File: /home/ubuntu/cephtest/mnt.0 2026-03-09T19:24:40.638 INFO:teuthology.orchestra.run.vm07.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-09T19:24:40.638 INFO:teuthology.orchestra.run.vm07.stdout:Device: 61h/97d Inode: 1 Links: 2 2026-03-09T19:24:40.638 INFO:teuthology.orchestra.run.vm07.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-09T19:24:40.638 INFO:teuthology.orchestra.run.vm07.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-09T19:24:40.638 INFO:teuthology.orchestra.run.vm07.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-09T19:24:40.638 INFO:teuthology.orchestra.run.vm07.stdout:Modify: 2026-03-09 19:24:25.776453895 +0000 2026-03-09T19:24:40.638 INFO:teuthology.orchestra.run.vm07.stdout:Change: 2026-03-09 19:24:36.498135834 +0000 2026-03-09T19:24:40.638 INFO:teuthology.orchestra.run.vm07.stdout: Birth: - 2026-03-09T19:24:40.638 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.0 2026-03-09T19:24:40.638 DEBUG:teuthology.orchestra.run.vm07:> cd -- /home/ubuntu/cephtest/mnt.0 && sudo install -d -m 0755 --owner=ubuntu -- client.0 2026-03-09T19:24:40.710 DEBUG:teuthology.orchestra.run.vm08:> stat -- /home/ubuntu/cephtest/mnt.1 2026-03-09T19:24:40.729 INFO:teuthology.orchestra.run.vm08.stdout: File: /home/ubuntu/cephtest/mnt.1 2026-03-09T19:24:40.729 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-09T19:24:40.729 INFO:teuthology.orchestra.run.vm08.stdout:Device: 5ah/90d Inode: 1 Links: 3 2026-03-09T19:24:40.729 INFO:teuthology.orchestra.run.vm08.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-09T19:24:40.729 
INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-09T19:24:40.729 INFO:teuthology.orchestra.run.vm08.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-09T19:24:40.729 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-09 19:24:40.707695115 +0000 2026-03-09T19:24:40.729 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-09 19:24:40.707695115 +0000 2026-03-09T19:24:40.729 INFO:teuthology.orchestra.run.vm08.stdout: Birth: - 2026-03-09T19:24:40.729 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.1 2026-03-09T19:24:40.729 DEBUG:teuthology.orchestra.run.vm08:> cd -- /home/ubuntu/cephtest/mnt.1 && sudo install -d -m 0755 --owner=ubuntu -- client.1 2026-03-09T19:24:40.794 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:40.805 DEBUG:teuthology.orchestra.run.vm07:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-09T19:24:40.805 DEBUG:teuthology.orchestra.run.vm08:> rm -rf /home/ubuntu/cephtest/clone.client.1 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.1 && cd /home/ubuntu/cephtest/clone.client.1 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-09T19:24:40.841 INFO:tasks.workunit.client.0.vm07.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-09T19:24:40.864 INFO:tasks.workunit.client.1.vm08.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.1'... 
2026-03-09T19:24:41.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.057+0000 7f22de473640 1 -- 192.168.123.107:0/1981709147 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22d8076df0 msgr2=0x7f22d8077250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:41.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.057+0000 7f22de473640 1 --2- 192.168.123.107:0/1981709147 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22d8076df0 0x7f22d8077250 secure :-1 s=READY pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7f22c40099b0 tx=0x7f22c402f220 comp rx=0 tx=0).stop 2026-03-09T19:24:41.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.057+0000 7f22de473640 1 -- 192.168.123.107:0/1981709147 shutdown_connections 2026-03-09T19:24:41.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.057+0000 7f22de473640 1 --2- 192.168.123.107:0/1981709147 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22d8076df0 0x7f22d8077250 unknown :-1 s=CLOSED pgs=279 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.057+0000 7f22de473640 1 --2- 192.168.123.107:0/1981709147 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22d8075ba0 0x7f22d8075fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.057+0000 7f22de473640 1 -- 192.168.123.107:0/1981709147 >> 192.168.123.107:0/1981709147 conn(0x7f22d80fe250 msgr2=0x7f22d8100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:41.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.058+0000 7f22de473640 1 -- 192.168.123.107:0/1981709147 shutdown_connections 2026-03-09T19:24:41.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.058+0000 7f22de473640 1 -- 192.168.123.107:0/1981709147 
wait complete. 2026-03-09T19:24:41.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.058+0000 7f22de473640 1 Processor -- start 2026-03-09T19:24:41.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.059+0000 7f22de473640 1 -- start start 2026-03-09T19:24:41.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.059+0000 7f22de473640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22d8075ba0 0x7f22d819e920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:41.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.059+0000 7f22de473640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22d8076df0 0x7f22d819ee60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:41.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.059+0000 7f22de473640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f22d819f430 con 0x7f22d8076df0 2026-03-09T19:24:41.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.059+0000 7f22de473640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f22d819f5a0 con 0x7f22d8075ba0 2026-03-09T19:24:41.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.059+0000 7f22d7fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22d8075ba0 0x7f22d819e920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:41.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.059+0000 7f22d7fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22d8075ba0 0x7f22d819e920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.107:57534/0 (socket says 192.168.123.107:57534) 2026-03-09T19:24:41.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.059+0000 7f22d7fff640 1 -- 192.168.123.107:0/486495690 learned_addr learned my addr 192.168.123.107:0/486495690 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:41.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.059+0000 7f22d7fff640 1 -- 192.168.123.107:0/486495690 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22d8076df0 msgr2=0x7f22d819ee60 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:24:41.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.060+0000 7f22d77fe640 1 --2- 192.168.123.107:0/486495690 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22d8076df0 0x7f22d819ee60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:41.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.060+0000 7f22d7fff640 1 --2- 192.168.123.107:0/486495690 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22d8076df0 0x7f22d819ee60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.060+0000 7f22d7fff640 1 -- 192.168.123.107:0/486495690 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f22c4009660 con 0x7f22d8075ba0 2026-03-09T19:24:41.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.060+0000 7f22d7fff640 1 --2- 192.168.123.107:0/486495690 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22d8075ba0 0x7f22d819e920 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f22c800c910 tx=0x7f22c800cde0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:24:41.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.060+0000 7f22d57fa640 1 -- 192.168.123.107:0/486495690 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f22c8007c20 con 0x7f22d8075ba0 2026-03-09T19:24:41.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.060+0000 7f22d57fa640 1 -- 192.168.123.107:0/486495690 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f22c8007d80 con 0x7f22d8075ba0 2026-03-09T19:24:41.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.060+0000 7f22de473640 1 -- 192.168.123.107:0/486495690 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f22d81a4040 con 0x7f22d8075ba0 2026-03-09T19:24:41.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.060+0000 7f22de473640 1 -- 192.168.123.107:0/486495690 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f22d81a4590 con 0x7f22d8075ba0 2026-03-09T19:24:41.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.060+0000 7f22d57fa640 1 -- 192.168.123.107:0/486495690 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f22c800f450 con 0x7f22d8075ba0 2026-03-09T19:24:41.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.062+0000 7f22d57fa640 1 -- 192.168.123.107:0/486495690 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f22c8016030 con 0x7f22d8075ba0 2026-03-09T19:24:41.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.061+0000 7f22de473640 1 -- 192.168.123.107:0/486495690 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f22d810fb80 con 0x7f22d8075ba0 2026-03-09T19:24:41.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.063+0000 
7f22d57fa640 1 --2- 192.168.123.107:0/486495690 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f22ac076170 0x7f22ac078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:41.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.063+0000 7f22d57fa640 1 -- 192.168.123.107:0/486495690 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f22c8097540 con 0x7f22d8075ba0 2026-03-09T19:24:41.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.064+0000 7f22d77fe640 1 --2- 192.168.123.107:0/486495690 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f22ac076170 0x7f22ac078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:41.062 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.064+0000 7f22d77fe640 1 --2- 192.168.123.107:0/486495690 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f22ac076170 0x7f22ac078630 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f22c4004870 tx=0x7f22c40047c0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:41.063 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.066+0000 7f22d57fa640 1 -- 192.168.123.107:0/486495690 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f22c8060a80 con 0x7f22d8075ba0 2026-03-09T19:24:41.155 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.155+0000 7f22de473640 1 -- 192.168.123.107:0/486495690 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command([{prefix=config set, name=mgr/orchestrator/fail_fs}] v 0) v1 -- 0x7f22d81a49c0 con 0x7f22d8075ba0 2026-03-09T19:24:41.162 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.164+0000 7f22d57fa640 1 -- 192.168.123.107:0/486495690 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/orchestrator/fail_fs}]=0 v19) v1 ==== 126+0+0 (secure 0 0 0) 0x7f22c8060420 con 0x7f22d8075ba0 2026-03-09T19:24:41.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.171+0000 7f22de473640 1 -- 192.168.123.107:0/486495690 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f22ac076170 msgr2=0x7f22ac078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:41.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.171+0000 7f22de473640 1 --2- 192.168.123.107:0/486495690 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f22ac076170 0x7f22ac078630 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f22c4004870 tx=0x7f22c40047c0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.171+0000 7f22de473640 1 -- 192.168.123.107:0/486495690 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22d8075ba0 msgr2=0x7f22d819e920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:41.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.171+0000 7f22de473640 1 --2- 192.168.123.107:0/486495690 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22d8075ba0 0x7f22d819e920 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f22c800c910 tx=0x7f22c800cde0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.172+0000 7f22de473640 1 -- 192.168.123.107:0/486495690 shutdown_connections 2026-03-09T19:24:41.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.172+0000 7f22de473640 1 --2- 192.168.123.107:0/486495690 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] 
conn(0x7f22ac076170 0x7f22ac078630 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.172+0000 7f22de473640 1 --2- 192.168.123.107:0/486495690 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22d8076df0 0x7f22d819ee60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.172+0000 7f22de473640 1 --2- 192.168.123.107:0/486495690 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22d8075ba0 0x7f22d819e920 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.172+0000 7f22de473640 1 -- 192.168.123.107:0/486495690 >> 192.168.123.107:0/486495690 conn(0x7f22d80fe250 msgr2=0x7f22d80ffa20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:41.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.172+0000 7f22de473640 1 -- 192.168.123.107:0/486495690 shutdown_connections 2026-03-09T19:24:41.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.172+0000 7f22de473640 1 -- 192.168.123.107:0/486495690 wait complete. 2026-03-09T19:24:41.256 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
2026-03-09T19:24:41.256 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm07.local 2026-03-09T19:24:41.256 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-09T19:24:41.458 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:41.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.696+0000 7f651a9cb640 1 -- 192.168.123.107:0/1708071416 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6514103c60 msgr2=0x7f65141040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:41.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.696+0000 7f651a9cb640 1 --2- 192.168.123.107:0/1708071416 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6514103c60 0x7f65141040e0 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7f65000099b0 tx=0x7f650002f220 comp rx=0 tx=0).stop 2026-03-09T19:24:41.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.697+0000 7f651a9cb640 1 -- 192.168.123.107:0/1708071416 shutdown_connections 2026-03-09T19:24:41.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.697+0000 7f651a9cb640 1 --2- 192.168.123.107:0/1708071416 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6514103c60 0x7f65141040e0 unknown :-1 s=CLOSED pgs=280 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.697+0000 7f651a9cb640 1 --2- 192.168.123.107:0/1708071416 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6514102a60 0x7f6514102e60 unknown 
:-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.697+0000 7f651a9cb640 1 -- 192.168.123.107:0/1708071416 >> 192.168.123.107:0/1708071416 conn(0x7f65140fe250 msgr2=0x7f6514100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:41.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.697+0000 7f651a9cb640 1 -- 192.168.123.107:0/1708071416 shutdown_connections 2026-03-09T19:24:41.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.697+0000 7f651a9cb640 1 -- 192.168.123.107:0/1708071416 wait complete. 2026-03-09T19:24:41.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.698+0000 7f651a9cb640 1 Processor -- start 2026-03-09T19:24:41.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.698+0000 7f651a9cb640 1 -- start start 2026-03-09T19:24:41.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.698+0000 7f651a9cb640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6514102a60 0x7f651419a460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:41.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.698+0000 7f6513fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6514102a60 0x7f651419a460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:41.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.699+0000 7f6513fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6514102a60 0x7f651419a460 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47494/0 (socket says 192.168.123.107:47494) 2026-03-09T19:24:41.696 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.699+0000 7f651a9cb640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6514103c60 0x7f651419a9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:41.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.699+0000 7f651a9cb640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f651419af70 con 0x7f6514102a60 2026-03-09T19:24:41.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.699+0000 7f651a9cb640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f651419b0e0 con 0x7f6514103c60 2026-03-09T19:24:41.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.699+0000 7f6513fff640 1 -- 192.168.123.107:0/4027909232 learned_addr learned my addr 192.168.123.107:0/4027909232 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:41.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.699+0000 7f65137fe640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6514103c60 0x7f651419a9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:41.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.699+0000 7f65137fe640 1 -- 192.168.123.107:0/4027909232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6514102a60 msgr2=0x7f651419a460 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:41.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.699+0000 7f65137fe640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6514102a60 0x7f651419a460 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.697 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.699+0000 7f65137fe640 1 -- 192.168.123.107:0/4027909232 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6500009660 con 0x7f6514103c60 2026-03-09T19:24:41.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.699+0000 7f6513fff640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6514102a60 0x7f651419a460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:24:41.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.700+0000 7f65137fe640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6514103c60 0x7f651419a9a0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f6500002c20 tx=0x7f65000028f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:41.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.700+0000 7f65117fa640 1 -- 192.168.123.107:0/4027909232 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f650003d070 con 0x7f6514103c60 2026-03-09T19:24:41.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.700+0000 7f651a9cb640 1 -- 192.168.123.107:0/4027909232 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f651419fb20 con 0x7f6514103c60 2026-03-09T19:24:41.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.700+0000 7f65117fa640 1 -- 192.168.123.107:0/4027909232 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f650002fc90 con 0x7f6514103c60 2026-03-09T19:24:41.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.700+0000 7f65117fa640 1 -- 192.168.123.107:0/4027909232 <== mon.1 v2:192.168.123.108:3300/0 3 ==== 
mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f65000418f0 con 0x7f6514103c60 2026-03-09T19:24:41.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.700+0000 7f651a9cb640 1 -- 192.168.123.107:0/4027909232 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f65141a0010 con 0x7f6514103c60 2026-03-09T19:24:41.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.701+0000 7f651a9cb640 1 -- 192.168.123.107:0/4027909232 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f64d8005350 con 0x7f6514103c60 2026-03-09T19:24:41.700 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.702+0000 7f65117fa640 1 -- 192.168.123.107:0/4027909232 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6500038c80 con 0x7f6514103c60 2026-03-09T19:24:41.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.703+0000 7f65117fa640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64e80761c0 0x7f64e8078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:41.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.703+0000 7f65117fa640 1 -- 192.168.123.107:0/4027909232 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f65000bc810 con 0x7f6514103c60 2026-03-09T19:24:41.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.703+0000 7f6513fff640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64e80761c0 0x7f64e8078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:41.701 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.704+0000 7f6513fff640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64e80761c0 0x7f64e8078680 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f65040104e0 tx=0x7f6504007450 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:41.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.704+0000 7f65117fa640 1 -- 192.168.123.107:0/4027909232 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6500085e80 con 0x7f6514103c60 2026-03-09T19:24:41.791 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.792+0000 7f651a9cb640 1 -- 192.168.123.107:0/4027909232 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7f64d80051c0 con 0x7f6514103c60 2026-03-09T19:24:41.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.794+0000 7f65117fa640 1 -- 192.168.123.107:0/4027909232 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v19) v1 ==== 155+0+0 (secure 0 0 0) 0x7f6500085820 con 0x7f6514103c60 2026-03-09T19:24:41.795 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.797+0000 7f651a9cb640 1 -- 192.168.123.107:0/4027909232 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64e80761c0 msgr2=0x7f64e8078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:41.795 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.797+0000 7f651a9cb640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64e80761c0 0x7f64e8078680 secure :-1 s=READY pgs=116 cs=0 
l=1 rev1=1 crypto rx=0x7f65040104e0 tx=0x7f6504007450 comp rx=0 tx=0).stop 2026-03-09T19:24:41.795 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.797+0000 7f651a9cb640 1 -- 192.168.123.107:0/4027909232 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6514103c60 msgr2=0x7f651419a9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:41.795 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.797+0000 7f651a9cb640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6514103c60 0x7f651419a9a0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f6500002c20 tx=0x7f65000028f0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.795 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.798+0000 7f651a9cb640 1 -- 192.168.123.107:0/4027909232 shutdown_connections 2026-03-09T19:24:41.796 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.798+0000 7f651a9cb640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f64e80761c0 0x7f64e8078680 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.796 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.798+0000 7f651a9cb640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6514103c60 0x7f651419a9a0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.796 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.798+0000 7f651a9cb640 1 --2- 192.168.123.107:0/4027909232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6514102a60 0x7f651419a460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:41.796 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.798+0000 7f651a9cb640 1 -- 192.168.123.107:0/4027909232 >> 
192.168.123.107:0/4027909232 conn(0x7f65140fe250 msgr2=0x7f65140ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:41.796 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.798+0000 7f651a9cb640 1 -- 192.168.123.107:0/4027909232 shutdown_connections 2026-03-09T19:24:41.796 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:41.798+0000 7f651a9cb640 1 -- 192.168.123.107:0/4027909232 wait complete. 2026-03-09T19:24:41.856 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-09T19:24:42.014 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:42.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.270+0000 7f283bfff640 1 -- 192.168.123.107:0/3766566664 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f283c0ff810 msgr2=0x7f283c0ffbf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:42.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.270+0000 7f283bfff640 1 --2- 192.168.123.107:0/3766566664 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f283c0ff810 0x7f283c0ffbf0 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7f2824009a00 tx=0x7f282402f280 comp rx=0 tx=0).stop 2026-03-09T19:24:42.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.271+0000 7f283bfff640 1 -- 192.168.123.107:0/3766566664 shutdown_connections 2026-03-09T19:24:42.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.271+0000 7f283bfff640 1 --2- 192.168.123.107:0/3766566664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f283c1001c0 
0x7f283c0fe2a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.271+0000 7f283bfff640 1 --2- 192.168.123.107:0/3766566664 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f283c0ff810 0x7f283c0ffbf0 unknown :-1 s=CLOSED pgs=281 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.271+0000 7f283bfff640 1 -- 192.168.123.107:0/3766566664 >> 192.168.123.107:0/3766566664 conn(0x7f283c0f9f80 msgr2=0x7f283c0fc3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:42.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.271+0000 7f283bfff640 1 -- 192.168.123.107:0/3766566664 shutdown_connections 2026-03-09T19:24:42.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.271+0000 7f283bfff640 1 -- 192.168.123.107:0/3766566664 wait complete. 2026-03-09T19:24:42.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.272+0000 7f283bfff640 1 Processor -- start 2026-03-09T19:24:42.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.272+0000 7f283bfff640 1 -- start start 2026-03-09T19:24:42.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.272+0000 7f283bfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f283c0ff810 0x7f283c19a2b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:42.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.273+0000 7f283bfff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f283c1001c0 0x7f283c19a7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:42.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.273+0000 7f283bfff640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7f283c19adc0 con 0x7f283c0ff810 2026-03-09T19:24:42.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.273+0000 7f283bfff640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f283c19af30 con 0x7f283c1001c0 2026-03-09T19:24:42.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.273+0000 7f283a7fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f283c1001c0 0x7f283c19a7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:42.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.273+0000 7f283a7fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f283c1001c0 0x7f283c19a7f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:57564/0 (socket says 192.168.123.107:57564) 2026-03-09T19:24:42.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.273+0000 7f283a7fc640 1 -- 192.168.123.107:0/2679223382 learned_addr learned my addr 192.168.123.107:0/2679223382 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:42.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.273+0000 7f283a7fc640 1 -- 192.168.123.107:0/2679223382 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f283c0ff810 msgr2=0x7f283c19a2b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:42.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.273+0000 7f283affd640 1 --2- 192.168.123.107:0/2679223382 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f283c0ff810 0x7f283c19a2b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:42.272 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.274+0000 7f283a7fc640 1 --2- 192.168.123.107:0/2679223382 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f283c0ff810 0x7f283c19a2b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.274+0000 7f283a7fc640 1 -- 192.168.123.107:0/2679223382 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2824009660 con 0x7f283c1001c0 2026-03-09T19:24:42.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.274+0000 7f283affd640 1 --2- 192.168.123.107:0/2679223382 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f283c0ff810 0x7f283c19a2b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:24:42.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.274+0000 7f283a7fc640 1 --2- 192.168.123.107:0/2679223382 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f283c1001c0 0x7f283c19a7f0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f283000e960 tx=0x7f283000ee30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:42.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.274+0000 7f281bfff640 1 -- 192.168.123.107:0/2679223382 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f283000cd30 con 0x7f283c1001c0 2026-03-09T19:24:42.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.274+0000 7f281bfff640 1 -- 192.168.123.107:0/2679223382 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f283000ce90 con 0x7f283c1001c0 2026-03-09T19:24:42.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.274+0000 7f283bfff640 1 -- 
192.168.123.107:0/2679223382 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f283c19f9d0 con 0x7f283c1001c0 2026-03-09T19:24:42.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.275+0000 7f281bfff640 1 -- 192.168.123.107:0/2679223382 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2830010640 con 0x7f283c1001c0 2026-03-09T19:24:42.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.275+0000 7f283bfff640 1 -- 192.168.123.107:0/2679223382 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f283c19fec0 con 0x7f283c1001c0 2026-03-09T19:24:42.274 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.276+0000 7f283bfff640 1 -- 192.168.123.107:0/2679223382 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f283c10b450 con 0x7f283c1001c0 2026-03-09T19:24:42.274 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.276+0000 7f281bfff640 1 -- 192.168.123.107:0/2679223382 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f28300026e0 con 0x7f283c1001c0 2026-03-09T19:24:42.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.277+0000 7f281bfff640 1 --2- 192.168.123.107:0/2679223382 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2810075fb0 0x7f2810078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:42.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.277+0000 7f281bfff640 1 -- 192.168.123.107:0/2679223382 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f2830014070 con 0x7f283c1001c0 2026-03-09T19:24:42.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.277+0000 7f283affd640 1 --2- 
192.168.123.107:0/2679223382 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2810075fb0 0x7f2810078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:42.276 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.278+0000 7f283affd640 1 --2- 192.168.123.107:0/2679223382 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2810075fb0 0x7f2810078470 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f2824002c80 tx=0x7f28240023d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:42.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.279+0000 7f281bfff640 1 -- 192.168.123.107:0/2679223382 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f2830060980 con 0x7f283c1001c0 2026-03-09T19:24:42.369 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:42 vm07.local ceph-mon[48545]: pgmap v85: 65 pgs: 65 active+clean; 453 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 6.5 KiB/s rd, 1.2 KiB/s wr, 9 op/s 2026-03-09T19:24:42.370 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:42 vm07.local ceph-mon[48545]: from='client.? 
' entity='client.admin' 2026-03-09T19:24:42.370 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:42 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:42.370 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:42 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:42.370 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:42 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:42.370 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:42 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:42.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.371+0000 7f283bfff640 1 -- 192.168.123.107:0/2679223382 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7f283c10b690 con 0x7f283c1001c0 2026-03-09T19:24:42.371 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.373+0000 7f281bfff640 1 -- 192.168.123.107:0/2679223382 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v19) v1 ==== 163+0+0 (secure 0 0 0) 0x7f2830060320 con 0x7f283c1001c0 2026-03-09T19:24:42.374 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.376+0000 7f283bfff640 1 -- 192.168.123.107:0/2679223382 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2810075fb0 msgr2=0x7f2810078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:42.374 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.376+0000 7f283bfff640 1 --2- 192.168.123.107:0/2679223382 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2810075fb0 0x7f2810078470 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f2824002c80 tx=0x7f28240023d0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.374 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.376+0000 7f283bfff640 1 -- 192.168.123.107:0/2679223382 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f283c1001c0 msgr2=0x7f283c19a7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:42.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.377+0000 7f283bfff640 1 --2- 192.168.123.107:0/2679223382 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f283c1001c0 0x7f283c19a7f0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f283000e960 tx=0x7f283000ee30 comp rx=0 tx=0).stop 2026-03-09T19:24:42.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.377+0000 7f283bfff640 1 -- 192.168.123.107:0/2679223382 shutdown_connections 2026-03-09T19:24:42.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.377+0000 7f283bfff640 1 --2- 192.168.123.107:0/2679223382 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2810075fb0 0x7f2810078470 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.377+0000 7f283bfff640 1 --2- 192.168.123.107:0/2679223382 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f283c1001c0 0x7f283c19a7f0 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.377+0000 7f283bfff640 1 --2- 192.168.123.107:0/2679223382 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f283c0ff810 0x7f283c19a2b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.378+0000 7f283bfff640 1 -- 192.168.123.107:0/2679223382 >> 192.168.123.107:0/2679223382 conn(0x7f283c0f9f80 msgr2=0x7f283c0fbb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:42.376 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.378+0000 7f283bfff640 1 -- 192.168.123.107:0/2679223382 shutdown_connections 2026-03-09T19:24:42.376 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.378+0000 7f283bfff640 1 -- 192.168.123.107:0/2679223382 wait complete. 2026-03-09T19:24:42.417 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-09T19:24:42.571 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:42.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:42 vm08.local ceph-mon[57794]: pgmap v85: 65 pgs: 65 active+clean; 453 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 6.5 KiB/s rd, 1.2 KiB/s wr, 9 op/s 2026-03-09T19:24:42.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:42 vm08.local ceph-mon[57794]: from='client.? 
' entity='client.admin' 2026-03-09T19:24:42.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:42 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:42.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:42 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:42.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:42 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:42.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:42 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:42.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.820+0000 7f80ed1f3640 1 -- 192.168.123.107:0/121771373 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e8103c80 msgr2=0x7f80e8104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:42.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.820+0000 7f80ed1f3640 1 --2- 192.168.123.107:0/121771373 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e8103c80 0x7f80e8104100 secure :-1 s=READY pgs=282 cs=0 l=1 rev1=1 crypto rx=0x7f80d00099b0 tx=0x7f80d002f220 comp rx=0 tx=0).stop 2026-03-09T19:24:42.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.822+0000 7f80ed1f3640 1 -- 192.168.123.107:0/121771373 shutdown_connections 2026-03-09T19:24:42.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.822+0000 7f80ed1f3640 1 --2- 192.168.123.107:0/121771373 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e8103c80 0x7f80e8104100 unknown :-1 s=CLOSED pgs=282 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.822+0000 7f80ed1f3640 1 --2- 192.168.123.107:0/121771373 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80e8102a80 0x7f80e8102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.822+0000 7f80ed1f3640 1 -- 192.168.123.107:0/121771373 >> 192.168.123.107:0/121771373 conn(0x7f80e80fe250 msgr2=0x7f80e8100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:42.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.822+0000 7f80ed1f3640 1 -- 192.168.123.107:0/121771373 shutdown_connections 2026-03-09T19:24:42.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.822+0000 7f80ed1f3640 1 -- 192.168.123.107:0/121771373 wait complete. 2026-03-09T19:24:42.821 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.823+0000 7f80ed1f3640 1 Processor -- start 2026-03-09T19:24:42.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.823+0000 7f80ed1f3640 1 -- start start 2026-03-09T19:24:42.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.824+0000 7f80ed1f3640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e8102a80 0x7f80e8072d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:42.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.824+0000 7f80ed1f3640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80e8103c80 0x7f80e806f7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:42.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.824+0000 7f80ed1f3640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80e806fce0 con 
0x7f80e8102a80 2026-03-09T19:24:42.823 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.824+0000 7f80ed1f3640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80e806fe50 con 0x7f80e8103c80 2026-03-09T19:24:42.823 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.825+0000 7f80e6575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80e8103c80 0x7f80e806f7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:42.823 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.825+0000 7f80e6575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80e8103c80 0x7f80e806f7a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:57584/0 (socket says 192.168.123.107:57584) 2026-03-09T19:24:42.823 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.825+0000 7f80e6575640 1 -- 192.168.123.107:0/1566264348 learned_addr learned my addr 192.168.123.107:0/1566264348 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:42.823 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.825+0000 7f80e6575640 1 -- 192.168.123.107:0/1566264348 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e8102a80 msgr2=0x7f80e8072d40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:24:42.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.826+0000 7f80e6d76640 1 --2- 192.168.123.107:0/1566264348 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e8102a80 0x7f80e8072d40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:42.824 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.826+0000 7f80e6575640 1 --2- 192.168.123.107:0/1566264348 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e8102a80 0x7f80e8072d40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.826+0000 7f80e6575640 1 -- 192.168.123.107:0/1566264348 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80d0009660 con 0x7f80e8103c80 2026-03-09T19:24:42.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.826+0000 7f80e6575640 1 --2- 192.168.123.107:0/1566264348 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80e8103c80 0x7f80e806f7a0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f80d002f730 tx=0x7f80d00028f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:42.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.826+0000 7f80e6d76640 1 --2- 192.168.123.107:0/1566264348 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e8102a80 0x7f80e8072d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:24:42.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.826+0000 7f80c7fff640 1 -- 192.168.123.107:0/1566264348 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80d003d070 con 0x7f80e8103c80 2026-03-09T19:24:42.825 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.827+0000 7f80c7fff640 1 -- 192.168.123.107:0/1566264348 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f80d002fc90 con 0x7f80e8103c80 2026-03-09T19:24:42.825 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.827+0000 7f80ed1f3640 1 -- 192.168.123.107:0/1566264348 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f80e80700d0 con 0x7f80e8103c80 2026-03-09T19:24:42.825 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.827+0000 7f80ed1f3640 1 -- 192.168.123.107:0/1566264348 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f80e810c690 con 0x7f80e8103c80 2026-03-09T19:24:42.829 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.828+0000 7f80c7fff640 1 -- 192.168.123.107:0/1566264348 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80d00388e0 con 0x7f80e8103c80 2026-03-09T19:24:42.829 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.831+0000 7f80c7fff640 1 -- 192.168.123.107:0/1566264348 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f80d0048020 con 0x7f80e8103c80 2026-03-09T19:24:42.829 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.831+0000 7f80c7fff640 1 --2- 192.168.123.107:0/1566264348 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f80bc0761c0 0x7f80bc078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:42.830 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.831+0000 7f80c7fff640 1 -- 192.168.123.107:0/1566264348 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f80d00bc070 con 0x7f80e8103c80 2026-03-09T19:24:42.830 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.832+0000 7f80e6d76640 1 --2- 192.168.123.107:0/1566264348 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f80bc0761c0 0x7f80bc078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:42.830 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.832+0000 7f80e6d76640 1 --2- 192.168.123.107:0/1566264348 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f80bc0761c0 0x7f80bc078680 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f80e8103ae0 tx=0x7f80dc005f70 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:42.831 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.833+0000 7f80ed1f3640 1 -- 192.168.123.107:0/1566264348 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f80ac005350 con 0x7f80e8103c80 2026-03-09T19:24:42.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.836+0000 7f80c7fff640 1 -- 192.168.123.107:0/1566264348 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f80d0085530 con 0x7f80e8103c80 2026-03-09T19:24:42.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.936+0000 7f80ed1f3640 1 -- 192.168.123.107:0/1566264348 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7f80ac005b80 con 0x7f80e8103c80 
2026-03-09T19:24:42.935 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.937+0000 7f80c7fff640 1 -- 192.168.123.107:0/1566264348 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v19) v1 ==== 135+0+0 (secure 0 0 0) 0x7f80d0084ed0 con 0x7f80e8103c80 2026-03-09T19:24:42.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.940+0000 7f80ed1f3640 1 -- 192.168.123.107:0/1566264348 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f80bc0761c0 msgr2=0x7f80bc078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:42.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.940+0000 7f80ed1f3640 1 --2- 192.168.123.107:0/1566264348 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f80bc0761c0 0x7f80bc078680 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f80e8103ae0 tx=0x7f80dc005f70 comp rx=0 tx=0).stop 2026-03-09T19:24:42.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.941+0000 7f80ed1f3640 1 -- 192.168.123.107:0/1566264348 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80e8103c80 msgr2=0x7f80e806f7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:42.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.941+0000 7f80ed1f3640 1 --2- 192.168.123.107:0/1566264348 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80e8103c80 0x7f80e806f7a0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f80d002f730 tx=0x7f80d00028f0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.941+0000 7f80ed1f3640 1 -- 192.168.123.107:0/1566264348 shutdown_connections 2026-03-09T19:24:42.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.941+0000 7f80ed1f3640 1 --2- 192.168.123.107:0/1566264348 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f80bc0761c0 0x7f80bc078680 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.941+0000 7f80ed1f3640 1 --2- 192.168.123.107:0/1566264348 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80e8103c80 0x7f80e806f7a0 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.941+0000 7f80ed1f3640 1 --2- 192.168.123.107:0/1566264348 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80e8102a80 0x7f80e8072d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:42.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.941+0000 7f80ed1f3640 1 -- 192.168.123.107:0/1566264348 >> 192.168.123.107:0/1566264348 conn(0x7f80e80fe250 msgr2=0x7f80e8104ea0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:42.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.942+0000 7f80ed1f3640 1 -- 192.168.123.107:0/1566264348 shutdown_connections 2026-03-09T19:24:42.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:42.942+0000 7f80ed1f3640 1 -- 192.168.123.107:0/1566264348 wait complete. 
2026-03-09T19:24:42.989 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1' 2026-03-09T19:24:43.144 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:43.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.427+0000 7f8f74b39640 1 -- 192.168.123.107:0/3239593844 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f70101a10 msgr2=0x7f8f70101e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:43.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.427+0000 7f8f74b39640 1 --2- 192.168.123.107:0/3239593844 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f70101a10 0x7f8f70101e90 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f8f640098e0 tx=0x7f8f6402f190 comp rx=0 tx=0).stop 2026-03-09T19:24:43.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.428+0000 7f8f74b39640 1 -- 192.168.123.107:0/3239593844 shutdown_connections 2026-03-09T19:24:43.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.428+0000 7f8f74b39640 1 --2- 192.168.123.107:0/3239593844 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f70101a10 0x7f8f70101e90 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:43.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.428+0000 7f8f74b39640 1 --2- 192.168.123.107:0/3239593844 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70100810 0x7f8f70100c10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:43.426 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.428+0000 7f8f74b39640 1 -- 192.168.123.107:0/3239593844 >> 192.168.123.107:0/3239593844 conn(0x7f8f700fbf80 msgr2=0x7f8f700fe3e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:43.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.428+0000 7f8f74b39640 1 -- 192.168.123.107:0/3239593844 shutdown_connections 2026-03-09T19:24:43.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.428+0000 7f8f74b39640 1 -- 192.168.123.107:0/3239593844 wait complete. 2026-03-09T19:24:43.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.429+0000 7f8f74b39640 1 Processor -- start 2026-03-09T19:24:43.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.429+0000 7f8f74b39640 1 -- start start 2026-03-09T19:24:43.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.429+0000 7f8f74b39640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f70100810 0x7f8f70198130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:43.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.429+0000 7f8f74b39640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70101a10 0x7f8f70198670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:43.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.429+0000 7f8f74b39640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f70198c40 con 0x7f8f70101a10 2026-03-09T19:24:43.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.429+0000 7f8f74b39640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f70198db0 con 0x7f8f70100810 2026-03-09T19:24:43.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.429+0000 7f8f6dd74640 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70101a10 0x7f8f70198670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:43.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.429+0000 7f8f6dd74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70101a10 0x7f8f70198670 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47558/0 (socket says 192.168.123.107:47558) 2026-03-09T19:24:43.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.429+0000 7f8f6dd74640 1 -- 192.168.123.107:0/2168636894 learned_addr learned my addr 192.168.123.107:0/2168636894 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:43.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.429+0000 7f8f6e575640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f70100810 0x7f8f70198130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:43.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.430+0000 7f8f6dd74640 1 -- 192.168.123.107:0/2168636894 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f70100810 msgr2=0x7f8f70198130 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:43.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.430+0000 7f8f6dd74640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f70100810 0x7f8f70198130 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:43.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.430+0000 7f8f6dd74640 1 -- 
192.168.123.107:0/2168636894 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8f58009660 con 0x7f8f70101a10 2026-03-09T19:24:43.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.430+0000 7f8f6e575640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f70100810 0x7f8f70198130 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:24:43.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.430+0000 7f8f6dd74640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70101a10 0x7f8f70198670 secure :-1 s=READY pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7f8f64031c60 tx=0x7f8f64031c90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:43.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.430+0000 7f8f577fe640 1 -- 192.168.123.107:0/2168636894 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8f6403d070 con 0x7f8f70101a10 2026-03-09T19:24:43.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.430+0000 7f8f74b39640 1 -- 192.168.123.107:0/2168636894 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8f64009590 con 0x7f8f70101a10 2026-03-09T19:24:43.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.430+0000 7f8f577fe640 1 -- 192.168.123.107:0/2168636894 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8f6402fe70 con 0x7f8f70101a10 2026-03-09T19:24:43.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.430+0000 7f8f74b39640 1 -- 192.168.123.107:0/2168636894 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8f7019db50 con 0x7f8f70101a10 
2026-03-09T19:24:43.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.431+0000 7f8f577fe640 1 -- 192.168.123.107:0/2168636894 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8f64038550 con 0x7f8f70101a10 2026-03-09T19:24:43.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.431+0000 7f8f74b39640 1 -- 192.168.123.107:0/2168636894 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8f34005350 con 0x7f8f70101a10 2026-03-09T19:24:43.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.435+0000 7f8f577fe640 1 -- 192.168.123.107:0/2168636894 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f8f64031170 con 0x7f8f70101a10 2026-03-09T19:24:43.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.436+0000 7f8f577fe640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8f48076170 0x7f8f48078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:43.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.436+0000 7f8f577fe640 1 -- 192.168.123.107:0/2168636894 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f8f640bcb50 con 0x7f8f70101a10 2026-03-09T19:24:43.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.436+0000 7f8f6e575640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8f48076170 0x7f8f48078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:43.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.436+0000 7f8f577fe640 1 -- 192.168.123.107:0/2168636894 <== mon.0 
v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f8f640ea8c0 con 0x7f8f70101a10 2026-03-09T19:24:43.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.436+0000 7f8f6e575640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8f48076170 0x7f8f48078630 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f8f70101870 tx=0x7f8f58009340 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:43.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.539+0000 7f8f74b39640 1 -- 192.168.123.107:0/2168636894 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7f8f34002bf0 con 0x7f8f48076170 2026-03-09T19:24:43.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.548+0000 7f8f577fe640 1 -- 192.168.123.107:0/2168636894 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7f8f34002bf0 con 0x7f8f48076170 2026-03-09T19:24:43.547 INFO:teuthology.orchestra.run.vm07.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T19:24:43.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.552+0000 7f8f74b39640 1 -- 192.168.123.107:0/2168636894 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8f48076170 msgr2=0x7f8f48078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:43.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.552+0000 7f8f74b39640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] 
conn(0x7f8f48076170 0x7f8f48078630 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f8f70101870 tx=0x7f8f58009340 comp rx=0 tx=0).stop 2026-03-09T19:24:43.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.552+0000 7f8f74b39640 1 -- 192.168.123.107:0/2168636894 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70101a10 msgr2=0x7f8f70198670 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:43.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.552+0000 7f8f74b39640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70101a10 0x7f8f70198670 secure :-1 s=READY pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7f8f64031c60 tx=0x7f8f64031c90 comp rx=0 tx=0).stop 2026-03-09T19:24:43.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.552+0000 7f8f74b39640 1 -- 192.168.123.107:0/2168636894 shutdown_connections 2026-03-09T19:24:43.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.552+0000 7f8f74b39640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8f48076170 0x7f8f48078630 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:43.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.552+0000 7f8f74b39640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70101a10 0x7f8f70198670 unknown :-1 s=CLOSED pgs=283 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:43.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.552+0000 7f8f74b39640 1 --2- 192.168.123.107:0/2168636894 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f70100810 0x7f8f70198130 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:43.551 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.552+0000 7f8f74b39640 1 -- 192.168.123.107:0/2168636894 >> 192.168.123.107:0/2168636894 conn(0x7f8f700fbf80 msgr2=0x7f8f700fda80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:43.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.552+0000 7f8f74b39640 1 -- 192.168.123.107:0/2168636894 shutdown_connections 2026-03-09T19:24:43.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:43.553+0000 7f8f74b39640 1 -- 192.168.123.107:0/2168636894 wait complete. 2026-03-09T19:24:43.615 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-09T19:24:43.616 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm07.local 2026-03-09T19:24:43.616 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! 
ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done' 2026-03-09T19:24:43.816 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:24:44.590 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:44 vm07.local ceph-mon[48545]: pgmap v86: 65 pgs: 65 active+clean; 459 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 6.0 KiB/s rd, 1.2 KiB/s wr, 8 op/s 2026-03-09T19:24:44.591 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:44 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:44.591 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:44 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:44.591 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:44 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:44.591 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:44 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:44.591 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:44 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:44 vm08.local ceph-mon[57794]: pgmap v86: 65 pgs: 65 active+clean; 459 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 6.0 KiB/s rd, 1.2 KiB/s wr, 8 op/s 2026-03-09T19:24:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:44 vm08.local ceph-mon[57794]: 
from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:44 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:24:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:44 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:24:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:44 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:24:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:44 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:24:44.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.851+0000 7f93c1fae640 1 -- 192.168.123.107:0/3481217968 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93bc1007f0 msgr2=0x7f93bc100bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:44.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.851+0000 7f93c1fae640 1 --2- 192.168.123.107:0/3481217968 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93bc1007f0 0x7f93bc100bf0 secure :-1 s=READY pgs=284 cs=0 l=1 rev1=1 crypto rx=0x7f93ac0098e0 tx=0x7f93ac02f1d0 comp rx=0 tx=0).stop 2026-03-09T19:24:44.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.853+0000 7f93c1fae640 1 -- 192.168.123.107:0/3481217968 shutdown_connections 2026-03-09T19:24:44.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.853+0000 7f93c1fae640 1 --2- 192.168.123.107:0/3481217968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f93bc1019f0 0x7f93bc101e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:44.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.853+0000 7f93c1fae640 1 --2- 192.168.123.107:0/3481217968 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93bc1007f0 0x7f93bc100bf0 unknown :-1 s=CLOSED pgs=284 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:44.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.853+0000 7f93c1fae640 1 -- 192.168.123.107:0/3481217968 >> 192.168.123.107:0/3481217968 conn(0x7f93bc0fbf80 msgr2=0x7f93bc0fe3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:44.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.853+0000 7f93c1fae640 1 -- 192.168.123.107:0/3481217968 shutdown_connections 2026-03-09T19:24:44.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.853+0000 7f93c1fae640 1 -- 192.168.123.107:0/3481217968 wait complete. 
2026-03-09T19:24:44.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.854+0000 7f93c1fae640 1 Processor -- start 2026-03-09T19:24:44.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.854+0000 7f93c1fae640 1 -- start start 2026-03-09T19:24:44.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.854+0000 7f93c1fae640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f93bc1007f0 0x7f93bc19e970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:44.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.854+0000 7f93c1fae640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93bc1019f0 0x7f93bc19eeb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:44.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.854+0000 7f93c1fae640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93bc19f480 con 0x7f93bc1019f0 2026-03-09T19:24:44.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.855+0000 7f93baffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93bc1019f0 0x7f93bc19eeb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:44.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.855+0000 7f93bb7fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f93bc1007f0 0x7f93bc19e970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:44.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.855+0000 7f93baffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93bc1019f0 0x7f93bc19eeb0 unknown :-1 s=HELLO_CONNECTING pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47576/0 (socket says 192.168.123.107:47576) 2026-03-09T19:24:44.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.855+0000 7f93baffd640 1 -- 192.168.123.107:0/16533560 learned_addr learned my addr 192.168.123.107:0/16533560 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:44.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.855+0000 7f93c1fae640 1 -- 192.168.123.107:0/16533560 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93bc19f5f0 con 0x7f93bc1007f0 2026-03-09T19:24:44.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.855+0000 7f93baffd640 1 -- 192.168.123.107:0/16533560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f93bc1007f0 msgr2=0x7f93bc19e970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:44.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.855+0000 7f93baffd640 1 --2- 192.168.123.107:0/16533560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f93bc1007f0 0x7f93bc19e970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:44.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.855+0000 7f93baffd640 1 -- 192.168.123.107:0/16533560 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f93ac009590 con 0x7f93bc1019f0 2026-03-09T19:24:44.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.855+0000 7f93baffd640 1 --2- 192.168.123.107:0/16533560 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93bc1019f0 0x7f93bc19eeb0 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7f93a800e9c0 tx=0x7f93a800ee90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:44.856 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.856+0000 7f93b8ff9640 1 -- 192.168.123.107:0/16533560 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f93a800cce0 con 0x7f93bc1019f0 2026-03-09T19:24:44.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.856+0000 7f93b8ff9640 1 -- 192.168.123.107:0/16533560 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f93a8004590 con 0x7f93bc1019f0 2026-03-09T19:24:44.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.856+0000 7f93b8ff9640 1 -- 192.168.123.107:0/16533560 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f93a8010640 con 0x7f93bc1019f0 2026-03-09T19:24:44.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.856+0000 7f93c1fae640 1 -- 192.168.123.107:0/16533560 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f93bc1a4010 con 0x7f93bc1019f0 2026-03-09T19:24:44.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.856+0000 7f93c1fae640 1 -- 192.168.123.107:0/16533560 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f93bc1a4530 con 0x7f93bc1019f0 2026-03-09T19:24:44.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.857+0000 7f93c1fae640 1 -- 192.168.123.107:0/16533560 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9388005350 con 0x7f93bc1019f0 2026-03-09T19:24:44.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.861+0000 7f93b8ff9640 1 -- 192.168.123.107:0/16533560 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f93a80040d0 con 0x7f93bc1019f0 2026-03-09T19:24:44.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.862+0000 7f93b8ff9640 1 --2- 
192.168.123.107:0/16533560 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9394076290 0x7f9394078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:44.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.862+0000 7f93bb7fe640 1 --2- 192.168.123.107:0/16533560 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9394076290 0x7f9394078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:44.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.863+0000 7f93b8ff9640 1 -- 192.168.123.107:0/16533560 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f93a801d070 con 0x7f93bc1019f0 2026-03-09T19:24:44.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.863+0000 7f93bb7fe640 1 --2- 192.168.123.107:0/16533560 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9394076290 0x7f9394078750 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f93ac002410 tx=0x7f93ac03a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:44.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.863+0000 7f93b8ff9640 1 -- 192.168.123.107:0/16533560 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f93a80c58c0 con 0x7f93bc1019f0 2026-03-09T19:24:44.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.966+0000 7f93c1fae640 1 -- 192.168.123.107:0/16533560 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9388002bf0 con 0x7f9394076290 2026-03-09T19:24:44.965 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.967+0000 7f93b8ff9640 1 -- 192.168.123.107:0/16533560 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f9388002bf0 con 0x7f9394076290 2026-03-09T19:24:44.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.970+0000 7f93c1fae640 1 -- 192.168.123.107:0/16533560 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9394076290 msgr2=0x7f9394078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:44.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.970+0000 7f93c1fae640 1 --2- 192.168.123.107:0/16533560 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9394076290 0x7f9394078750 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f93ac002410 tx=0x7f93ac03a040 comp rx=0 tx=0).stop 2026-03-09T19:24:44.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.970+0000 7f93c1fae640 1 -- 192.168.123.107:0/16533560 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93bc1019f0 msgr2=0x7f93bc19eeb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:44.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.970+0000 7f93c1fae640 1 --2- 192.168.123.107:0/16533560 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93bc1019f0 0x7f93bc19eeb0 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7f93a800e9c0 tx=0x7f93a800ee90 comp rx=0 tx=0).stop 2026-03-09T19:24:44.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.970+0000 7f93c1fae640 1 -- 192.168.123.107:0/16533560 shutdown_connections 2026-03-09T19:24:44.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.970+0000 7f93c1fae640 1 --2- 192.168.123.107:0/16533560 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9394076290 0x7f9394078750 unknown :-1 
s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:44.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.970+0000 7f93c1fae640 1 --2- 192.168.123.107:0/16533560 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93bc1019f0 0x7f93bc19eeb0 unknown :-1 s=CLOSED pgs=285 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:44.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.970+0000 7f93c1fae640 1 --2- 192.168.123.107:0/16533560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f93bc1007f0 0x7f93bc19e970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:44.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.970+0000 7f93c1fae640 1 -- 192.168.123.107:0/16533560 >> 192.168.123.107:0/16533560 conn(0x7f93bc0fbf80 msgr2=0x7f93bc0fdad0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:44.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.971+0000 7f93c1fae640 1 -- 192.168.123.107:0/16533560 shutdown_connections 2026-03-09T19:24:44.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:44.971+0000 7f93c1fae640 1 -- 192.168.123.107:0/16533560 wait complete. 
2026-03-09T19:24:44.978 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:24:45.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.034+0000 7f21477fe640 1 -- 192.168.123.107:0/921882077 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2140038470 con 0x7f2150076df0 2026-03-09T19:24:45.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.035+0000 7f2156fb3640 1 -- 192.168.123.107:0/921882077 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2150076df0 msgr2=0x7f2150077250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.035+0000 7f2156fb3640 1 --2- 192.168.123.107:0/921882077 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2150076df0 0x7f2150077250 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f21400099b0 tx=0x7f214002f220 comp rx=0 tx=0).stop 2026-03-09T19:24:45.033 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.035+0000 7f2156fb3640 1 -- 192.168.123.107:0/921882077 shutdown_connections 2026-03-09T19:24:45.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.035+0000 7f2156fb3640 1 --2- 192.168.123.107:0/921882077 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2150076df0 0x7f2150077250 secure :-1 s=CLOSED pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f21400099b0 tx=0x7f214002f220 comp rx=0 tx=0).stop 2026-03-09T19:24:45.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.035+0000 7f2156fb3640 1 --2- 192.168.123.107:0/921882077 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2150075ba0 0x7f2150075fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.035+0000 7f2156fb3640 1 -- 192.168.123.107:0/921882077 >> 192.168.123.107:0/921882077 conn(0x7f21500fe250 
msgr2=0x7f2150100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:45.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.036+0000 7f2156fb3640 1 -- 192.168.123.107:0/921882077 shutdown_connections 2026-03-09T19:24:45.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.036+0000 7f2156fb3640 1 -- 192.168.123.107:0/921882077 wait complete. 2026-03-09T19:24:45.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.036+0000 7f2156fb3640 1 Processor -- start 2026-03-09T19:24:45.034 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.036+0000 7f2156fb3640 1 -- start start 2026-03-09T19:24:45.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.036+0000 7f2156fb3640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2150075ba0 0x7f215019eb00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.036+0000 7f2156fb3640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f215019f040 0x7f21501a40b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.036+0000 7f2156fb3640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f215019f4c0 con 0x7f2150075ba0 2026-03-09T19:24:45.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.036+0000 7f2156fb3640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f215019f630 con 0x7f215019f040 2026-03-09T19:24:45.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.037+0000 7f2154d28640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2150075ba0 0x7f215019eb00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.037+0000 7f2154d28640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2150075ba0 0x7f215019eb00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47594/0 (socket says 192.168.123.107:47594) 2026-03-09T19:24:45.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.037+0000 7f2154d28640 1 -- 192.168.123.107:0/2135763204 learned_addr learned my addr 192.168.123.107:0/2135763204 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:45.035 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.037+0000 7f2154d28640 1 -- 192.168.123.107:0/2135763204 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f215019f040 msgr2=0x7f21501a40b0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T19:24:45.036 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.037+0000 7f2154d28640 1 --2- 192.168.123.107:0/2135763204 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f215019f040 0x7f21501a40b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.036 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.037+0000 7f2154d28640 1 -- 192.168.123.107:0/2135763204 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2140009660 con 0x7f2150075ba0 2026-03-09T19:24:45.036 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.037+0000 7f2154d28640 1 --2- 192.168.123.107:0/2135763204 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2150075ba0 0x7f215019eb00 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f213800ca30 tx=0x7f213800cf00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-09T19:24:45.036 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.038+0000 7f2145ffb640 1 -- 192.168.123.107:0/2135763204 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2138004430 con 0x7f2150075ba0 2026-03-09T19:24:45.036 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.038+0000 7f2156fb3640 1 -- 192.168.123.107:0/2135763204 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f21501a45f0 con 0x7f2150075ba0 2026-03-09T19:24:45.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.038+0000 7f2156fb3640 1 -- 192.168.123.107:0/2135763204 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f21501a4ac0 con 0x7f2150075ba0 2026-03-09T19:24:45.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.038+0000 7f2145ffb640 1 -- 192.168.123.107:0/2135763204 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2138004590 con 0x7f2150075ba0 2026-03-09T19:24:45.037 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.039+0000 7f2145ffb640 1 -- 192.168.123.107:0/2135763204 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f213800f660 con 0x7f2150075ba0 2026-03-09T19:24:45.038 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.040+0000 7f2145ffb640 1 -- 192.168.123.107:0/2135763204 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f213800f880 con 0x7f2150075ba0 2026-03-09T19:24:45.042 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.040+0000 7f2156fb3640 1 -- 192.168.123.107:0/2135763204 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2124005350 con 0x7f2150075ba0 2026-03-09T19:24:45.043 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.041+0000 7f2145ffb640 1 --2- 192.168.123.107:0/2135763204 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2120075fb0 0x7f2120078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.041+0000 7f2147fff640 1 --2- 192.168.123.107:0/2135763204 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2120075fb0 0x7f2120078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.041+0000 7f2145ffb640 1 -- 192.168.123.107:0/2135763204 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f2138097c30 con 0x7f2150075ba0 2026-03-09T19:24:45.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.042+0000 7f2147fff640 1 --2- 192.168.123.107:0/2135763204 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2120075fb0 0x7f2120078470 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f2140002c20 tx=0x7f21400023d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:45.048 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.047+0000 7f2145ffb640 1 -- 192.168.123.107:0/2135763204 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f21380611f0 con 0x7f2150075ba0 2026-03-09T19:24:45.155 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.156+0000 7f2156fb3640 1 -- 192.168.123.107:0/2135763204 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade 
status", "target": ["mon-mgr", ""]}) v1 -- 0x7f2124002bf0 con 0x7f2120075fb0 2026-03-09T19:24:45.155 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.157+0000 7f2145ffb640 1 -- 192.168.123.107:0/2135763204 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f2124002bf0 con 0x7f2120075fb0 2026-03-09T19:24:45.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.160+0000 7f2156fb3640 1 -- 192.168.123.107:0/2135763204 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2120075fb0 msgr2=0x7f2120078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.160+0000 7f2156fb3640 1 --2- 192.168.123.107:0/2135763204 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2120075fb0 0x7f2120078470 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f2140002c20 tx=0x7f21400023d0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.160+0000 7f2156fb3640 1 -- 192.168.123.107:0/2135763204 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2150075ba0 msgr2=0x7f215019eb00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.160+0000 7f2156fb3640 1 --2- 192.168.123.107:0/2135763204 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2150075ba0 0x7f215019eb00 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f213800ca30 tx=0x7f213800cf00 comp rx=0 tx=0).stop 2026-03-09T19:24:45.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.161+0000 7f2156fb3640 1 -- 192.168.123.107:0/2135763204 shutdown_connections 2026-03-09T19:24:45.159 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.161+0000 7f2156fb3640 1 --2- 192.168.123.107:0/2135763204 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2120075fb0 0x7f2120078470 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.159 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.161+0000 7f2156fb3640 1 --2- 192.168.123.107:0/2135763204 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f215019f040 0x7f21501a40b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.159 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.161+0000 7f2156fb3640 1 --2- 192.168.123.107:0/2135763204 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2150075ba0 0x7f215019eb00 unknown :-1 s=CLOSED pgs=287 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.159 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.161+0000 7f2156fb3640 1 -- 192.168.123.107:0/2135763204 >> 192.168.123.107:0/2135763204 conn(0x7f21500fe250 msgr2=0x7f21500ffec0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:45.159 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.161+0000 7f2156fb3640 1 -- 192.168.123.107:0/2135763204 shutdown_connections 2026-03-09T19:24:45.159 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.161+0000 7f2156fb3640 1 -- 192.168.123.107:0/2135763204 wait complete. 
2026-03-09T19:24:45.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.230+0000 7f3155799640 1 -- 192.168.123.107:0/3452539577 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3150101a10 msgr2=0x7f3150101e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.230+0000 7f3155799640 1 --2- 192.168.123.107:0/3452539577 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3150101a10 0x7f3150101e90 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f31440098e0 tx=0x7f314402f190 comp rx=0 tx=0).stop 2026-03-09T19:24:45.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.230+0000 7f3155799640 1 -- 192.168.123.107:0/3452539577 shutdown_connections 2026-03-09T19:24:45.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.230+0000 7f3155799640 1 --2- 192.168.123.107:0/3452539577 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3150101a10 0x7f3150101e90 secure :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f31440098e0 tx=0x7f314402f190 comp rx=0 tx=0).stop 2026-03-09T19:24:45.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.230+0000 7f3155799640 1 --2- 192.168.123.107:0/3452539577 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3150100810 0x7f3150100c10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.230+0000 7f3155799640 1 -- 192.168.123.107:0/3452539577 >> 192.168.123.107:0/3452539577 conn(0x7f31500fbf80 msgr2=0x7f31500fe3e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:45.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.231+0000 7f3155799640 1 -- 192.168.123.107:0/3452539577 shutdown_connections 2026-03-09T19:24:45.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.231+0000 7f3155799640 1 -- 
192.168.123.107:0/3452539577 wait complete. 2026-03-09T19:24:45.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f3155799640 1 Processor -- start 2026-03-09T19:24:45.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f3155799640 1 -- start start 2026-03-09T19:24:45.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f3155799640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3150100810 0x7f3150194ca0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f3155799640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31501951e0 0x7f3150191700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f3155799640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3150195660 con 0x7f31501951e0 2026-03-09T19:24:45.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f3155799640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3150191c40 con 0x7f3150100810 2026-03-09T19:24:45.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f314e7fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31501951e0 0x7f3150191700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f314e7fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31501951e0 0x7f3150191700 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47622/0 (socket says 192.168.123.107:47622) 2026-03-09T19:24:45.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f314e7fc640 1 -- 192.168.123.107:0/388590334 learned_addr learned my addr 192.168.123.107:0/388590334 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:45.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f314effd640 1 --2- 192.168.123.107:0/388590334 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3150100810 0x7f3150194ca0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f314e7fc640 1 -- 192.168.123.107:0/388590334 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3150100810 msgr2=0x7f3150194ca0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f314e7fc640 1 --2- 192.168.123.107:0/388590334 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3150100810 0x7f3150194ca0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.232+0000 7f314e7fc640 1 -- 192.168.123.107:0/388590334 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3138009660 con 0x7f31501951e0 2026-03-09T19:24:45.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.233+0000 7f314e7fc640 1 --2- 192.168.123.107:0/388590334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31501951e0 0x7f3150191700 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7f314402f6a0 tx=0x7f3144031de0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:45.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.233+0000 7f312ffff640 1 -- 192.168.123.107:0/388590334 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f314403d070 con 0x7f31501951e0 2026-03-09T19:24:45.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.234+0000 7f312ffff640 1 -- 192.168.123.107:0/388590334 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f314402fdc0 con 0x7f31501951e0 2026-03-09T19:24:45.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.234+0000 7f3155799640 1 -- 192.168.123.107:0/388590334 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3144009590 con 0x7f31501951e0 2026-03-09T19:24:45.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.234+0000 7f3155799640 1 -- 192.168.123.107:0/388590334 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3150192220 con 0x7f31501951e0 2026-03-09T19:24:45.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.235+0000 7f312ffff640 1 -- 192.168.123.107:0/388590334 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f31440319e0 con 0x7f31501951e0 2026-03-09T19:24:45.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.235+0000 7f312ffff640 1 -- 192.168.123.107:0/388590334 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f3144038470 con 0x7f31501951e0 2026-03-09T19:24:45.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.235+0000 7f3155799640 1 -- 192.168.123.107:0/388590334 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3150101e90 con 0x7f31501951e0 2026-03-09T19:24:45.234 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.236+0000 7f312ffff640 1 --2- 192.168.123.107:0/388590334 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3124075fb0 0x7f3124078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.236+0000 7f312ffff640 1 -- 192.168.123.107:0/388590334 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f31440bc170 con 0x7f31501951e0 2026-03-09T19:24:45.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.238+0000 7f314effd640 1 --2- 192.168.123.107:0/388590334 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3124075fb0 0x7f3124078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.238+0000 7f312ffff640 1 -- 192.168.123.107:0/388590334 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f3144085700 con 0x7f31501951e0 2026-03-09T19:24:45.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.239+0000 7f314effd640 1 --2- 192.168.123.107:0/388590334 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3124075fb0 0x7f3124078470 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f3150101870 tx=0x7f3138009340 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:45.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.335+0000 7f3155799640 1 -- 192.168.123.107:0/388590334 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch ps", "target": 
["mon-mgr", ""]}) v1 -- 0x7f31500746d0 con 0x7f3124075fb0 2026-03-09T19:24:45.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.342+0000 7f312ffff640 1 -- 192.168.123.107:0/388590334 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3624 (secure 0 0 0) 0x7f31500746d0 con 0x7f3124075fb0 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (98s) 13s ago 2m 22.6M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (2m) 13s ago 2m 8284k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (112s) 14s ago 112s 8644k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (2m) 13s ago 2m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2799ea3e4bf3 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (111s) 14s ago 111s 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 81fc95c210b6 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (97s) 13s ago 2m 79.7M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (20s) 13s ago 20s 12.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (18s) 13s ago 18s 14.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (17s) 14s ago 17s 16.0M - 18.2.7-1055-gab47f43c 
b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (19s) 14s ago 19s 17.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:9283,8765,8443 running (2m) 13s ago 2m 540M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 706e626ecd10 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (108s) 14s ago 108s 485M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 880604c16b45 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (2m) 13s ago 3m 53.6M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ccb644205fb3 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (106s) 14s ago 106s 49.4M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8d7b1da9e1e2 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (2m) 13s ago 2m 13.8M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (108s) 14s ago 108s 15.3M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (89s) 13s ago 89s 46.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d7417e3377af 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (78s) 13s ago 78s 67.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (69s) 13s ago 69s 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (60s) 14s ago 60s 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (51s) 14s ago 51s 67.3M 4096M 
18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (43s) 14s ago 43s 65.0M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:24:45.341 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (96s) 13s ago 2m 39.2M - 2.43.0 a07b618ecd1d 238baaac36ff 2026-03-09T19:24:45.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.345+0000 7f3155799640 1 -- 192.168.123.107:0/388590334 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3124075fb0 msgr2=0x7f3124078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.346+0000 7f3155799640 1 --2- 192.168.123.107:0/388590334 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3124075fb0 0x7f3124078470 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f3150101870 tx=0x7f3138009340 comp rx=0 tx=0).stop 2026-03-09T19:24:45.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.346+0000 7f3155799640 1 -- 192.168.123.107:0/388590334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31501951e0 msgr2=0x7f3150191700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.346+0000 7f3155799640 1 --2- 192.168.123.107:0/388590334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31501951e0 0x7f3150191700 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7f314402f6a0 tx=0x7f3144031de0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.346+0000 7f3155799640 1 -- 192.168.123.107:0/388590334 shutdown_connections 2026-03-09T19:24:45.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.347+0000 7f3155799640 1 --2- 192.168.123.107:0/388590334 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f3124075fb0 0x7f3124078470 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:45 vm08.local ceph-mon[57794]: from='client.14552 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:45 vm08.local ceph-mon[57794]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T19:24:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:45 vm08.local ceph-mon[57794]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T19:24:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:45 vm08.local ceph-mon[57794]: pgmap v87: 65 pgs: 65 active+clean; 459 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 5.9 KiB/s rd, 1.2 KiB/s wr, 7 op/s 2026-03-09T19:24:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:45 vm08.local ceph-mon[57794]: from='client.14556 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:45 vm08.local ceph-mon[57794]: from='client.14560 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:45.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.347+0000 7f3155799640 1 --2- 192.168.123.107:0/388590334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31501951e0 0x7f3150191700 unknown :-1 s=CLOSED pgs=288 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.345 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.347+0000 7f3155799640 1 --2- 192.168.123.107:0/388590334 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3150100810 0x7f3150194ca0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.346 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.348+0000 7f3155799640 1 -- 192.168.123.107:0/388590334 >> 192.168.123.107:0/388590334 conn(0x7f31500fbf80 msgr2=0x7f31500fd8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:45.346 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.348+0000 7f3155799640 1 -- 192.168.123.107:0/388590334 shutdown_connections 2026-03-09T19:24:45.346 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.348+0000 7f3155799640 1 -- 192.168.123.107:0/388590334 wait complete. 2026-03-09T19:24:45.411 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:45 vm07.local ceph-mon[48545]: from='client.14552 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:45.411 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:45 vm07.local ceph-mon[48545]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T19:24:45.411 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:45 vm07.local ceph-mon[48545]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T19:24:45.411 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:45 vm07.local ceph-mon[48545]: pgmap v87: 65 pgs: 65 active+clean; 459 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 5.9 KiB/s rd, 1.2 KiB/s wr, 7 op/s 2026-03-09T19:24:45.411 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:45 vm07.local ceph-mon[48545]: from='client.14556 -' entity='client.admin' cmd=[{"prefix": "orch upgrade 
status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:45.411 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:45 vm07.local ceph-mon[48545]: from='client.14560 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:45.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.413+0000 7f5ec967d640 1 -- 192.168.123.107:0/2568747073 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5ec4102a60 msgr2=0x7f5ec4102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.413+0000 7f5ec967d640 1 --2- 192.168.123.107:0/2568747073 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5ec4102a60 0x7f5ec4102e60 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7f5eac0099b0 tx=0x7f5eac02f220 comp rx=0 tx=0).stop 2026-03-09T19:24:45.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.413+0000 7f5ec967d640 1 -- 192.168.123.107:0/2568747073 shutdown_connections 2026-03-09T19:24:45.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.413+0000 7f5ec967d640 1 --2- 192.168.123.107:0/2568747073 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ec4103c60 0x7f5ec41040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.413+0000 7f5ec967d640 1 --2- 192.168.123.107:0/2568747073 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5ec4102a60 0x7f5ec4102e60 unknown :-1 s=CLOSED pgs=289 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.413+0000 7f5ec967d640 1 -- 192.168.123.107:0/2568747073 >> 192.168.123.107:0/2568747073 conn(0x7f5ec40fe250 msgr2=0x7f5ec4100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:45.412 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.415+0000 7f5ec967d640 1 -- 192.168.123.107:0/2568747073 shutdown_connections 2026-03-09T19:24:45.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.416+0000 7f5ec967d640 1 -- 192.168.123.107:0/2568747073 wait complete. 2026-03-09T19:24:45.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.417+0000 7f5ec967d640 1 Processor -- start 2026-03-09T19:24:45.415 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.417+0000 7f5ec967d640 1 -- start start 2026-03-09T19:24:45.415 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.417+0000 7f5ec967d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ec4102a60 0x7f5ec4071690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.415 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.417+0000 7f5ec967d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5ec4103c60 0x7f5ec4071bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.415 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.417+0000 7f5ec967d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ec4073080 con 0x7f5ec4103c60 2026-03-09T19:24:45.415 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.417+0000 7f5ec967d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ec40731f0 con 0x7f5ec4102a60 2026-03-09T19:24:45.416 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.417+0000 7f5ec27fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5ec4103c60 0x7f5ec4071bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.416 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.417+0000 7f5ec2ffd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ec4102a60 0x7f5ec4071690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.416 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.417+0000 7f5ec27fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5ec4103c60 0x7f5ec4071bd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47636/0 (socket says 192.168.123.107:47636) 2026-03-09T19:24:45.416 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.417+0000 7f5ec27fc640 1 -- 192.168.123.107:0/3001395540 learned_addr learned my addr 192.168.123.107:0/3001395540 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:45.416 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.418+0000 7f5ec27fc640 1 -- 192.168.123.107:0/3001395540 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ec4102a60 msgr2=0x7f5ec4071690 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.416 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.418+0000 7f5ec27fc640 1 --2- 192.168.123.107:0/3001395540 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ec4102a60 0x7f5ec4071690 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.416 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.418+0000 7f5ec27fc640 1 -- 192.168.123.107:0/3001395540 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5eac009660 con 0x7f5ec4103c60 2026-03-09T19:24:45.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.418+0000 7f5ec27fc640 1 --2- 
192.168.123.107:0/3001395540 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5ec4103c60 0x7f5ec4071bd0 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7f5eb800d8d0 tx=0x7f5eb800dda0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:45.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.418+0000 7f5ea3fff640 1 -- 192.168.123.107:0/3001395540 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5eb8004490 con 0x7f5ec4103c60 2026-03-09T19:24:45.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.418+0000 7f5ea3fff640 1 -- 192.168.123.107:0/3001395540 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5eb800bd00 con 0x7f5ec4103c60 2026-03-09T19:24:45.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.418+0000 7f5ea3fff640 1 -- 192.168.123.107:0/3001395540 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5eb8010460 con 0x7f5ec4103c60 2026-03-09T19:24:45.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.419+0000 7f5ec967d640 1 -- 192.168.123.107:0/3001395540 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5ec4072260 con 0x7f5ec4103c60 2026-03-09T19:24:45.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.420+0000 7f5ec967d640 1 -- 192.168.123.107:0/3001395540 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5ec41a8b80 con 0x7f5ec4103c60 2026-03-09T19:24:45.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.421+0000 7f5ea3fff640 1 -- 192.168.123.107:0/3001395540 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f5eb8010610 con 0x7f5ec4103c60 2026-03-09T19:24:45.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.421+0000 
7f5ec967d640 1 -- 192.168.123.107:0/3001395540 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5ec410b690 con 0x7f5ec4103c60 2026-03-09T19:24:45.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.421+0000 7f5ea3fff640 1 --2- 192.168.123.107:0/3001395540 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5e98076170 0x7f5e98078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.422+0000 7f5ec2ffd640 1 --2- 192.168.123.107:0/3001395540 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5e98076170 0x7f5e98078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.422+0000 7f5ea3fff640 1 -- 192.168.123.107:0/3001395540 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f5eb8061280 con 0x7f5ec4103c60 2026-03-09T19:24:45.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.425+0000 7f5ec2ffd640 1 --2- 192.168.123.107:0/3001395540 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5e98076170 0x7f5e98078630 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f5eac002c20 tx=0x7f5eac03a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:45.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.426+0000 7f5ea3fff640 1 -- 192.168.123.107:0/3001395540 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f5eb809c050 con 0x7f5ec4103c60 2026-03-09T19:24:45.561 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.563+0000 7f5ec967d640 1 -- 192.168.123.107:0/3001395540 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f5ec41a8dd0 con 0x7f5ec4103c60 2026-03-09T19:24:45.562 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.564+0000 7f5ea3fff640 1 -- 192.168.123.107:0/3001395540 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f5eb8014070 con 0x7f5ec4103c60 2026-03-09T19:24:45.562 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:24:45.562 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:24:45.562 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:24:45.562 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:24:45.562 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:24:45.562 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:24:45.562 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:24:45.562 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:24:45.562 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T19:24:45.563 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:24:45.563 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:24:45.563 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:24:45.563 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:24:45.563 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:24:45.563 
INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 14 2026-03-09T19:24:45.563 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:24:45.563 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:24:45.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.567+0000 7f5ec967d640 1 -- 192.168.123.107:0/3001395540 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5e98076170 msgr2=0x7f5e98078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.567+0000 7f5ec967d640 1 --2- 192.168.123.107:0/3001395540 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5e98076170 0x7f5e98078630 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f5eac002c20 tx=0x7f5eac03a040 comp rx=0 tx=0).stop 2026-03-09T19:24:45.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.567+0000 7f5ec967d640 1 -- 192.168.123.107:0/3001395540 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5ec4103c60 msgr2=0x7f5ec4071bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.567+0000 7f5ec967d640 1 --2- 192.168.123.107:0/3001395540 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5ec4103c60 0x7f5ec4071bd0 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7f5eb800d8d0 tx=0x7f5eb800dda0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.568+0000 7f5ec967d640 1 -- 192.168.123.107:0/3001395540 shutdown_connections 2026-03-09T19:24:45.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.568+0000 7f5ec967d640 1 --2- 192.168.123.107:0/3001395540 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f5e98076170 
0x7f5e98078630 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.568+0000 7f5ec967d640 1 --2- 192.168.123.107:0/3001395540 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5ec4103c60 0x7f5ec4071bd0 unknown :-1 s=CLOSED pgs=290 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.568+0000 7f5ec967d640 1 --2- 192.168.123.107:0/3001395540 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ec4102a60 0x7f5ec4071690 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.568+0000 7f5ec967d640 1 -- 192.168.123.107:0/3001395540 >> 192.168.123.107:0/3001395540 conn(0x7f5ec40fe250 msgr2=0x7f5ec40ffa10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:45.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.568+0000 7f5ec967d640 1 -- 192.168.123.107:0/3001395540 shutdown_connections 2026-03-09T19:24:45.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.568+0000 7f5ec967d640 1 -- 192.168.123.107:0/3001395540 wait complete. 
2026-03-09T19:24:45.628 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.630+0000 7f6dd5060640 1 -- 192.168.123.107:0/690148757 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6dd0103c60 msgr2=0x7f6dd01040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.632 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.630+0000 7f6dd5060640 1 --2- 192.168.123.107:0/690148757 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6dd0103c60 0x7f6dd01040e0 secure :-1 s=READY pgs=291 cs=0 l=1 rev1=1 crypto rx=0x7f6dc40099b0 tx=0x7f6dc402f220 comp rx=0 tx=0).stop 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.630+0000 7f6dd5060640 1 -- 192.168.123.107:0/690148757 shutdown_connections 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.630+0000 7f6dd5060640 1 --2- 192.168.123.107:0/690148757 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6dd0103c60 0x7f6dd01040e0 unknown :-1 s=CLOSED pgs=291 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.630+0000 7f6dd5060640 1 --2- 192.168.123.107:0/690148757 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6dd0102a60 0x7f6dd0102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.630+0000 7f6dd5060640 1 -- 192.168.123.107:0/690148757 >> 192.168.123.107:0/690148757 conn(0x7f6dd00fe250 msgr2=0x7f6dd0100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.631+0000 7f6dd5060640 1 -- 192.168.123.107:0/690148757 shutdown_connections 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.631+0000 7f6dd5060640 1 -- 192.168.123.107:0/690148757 wait 
complete. 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.631+0000 7f6dd5060640 1 Processor -- start 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.631+0000 7f6dd5060640 1 -- start start 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.632+0000 7f6dd5060640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6dd0102a60 0x7f6dd019a430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.632+0000 7f6dd5060640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6dd0103c60 0x7f6dd019a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.632+0000 7f6dd5060640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6dd019af40 con 0x7f6dd0103c60 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.632+0000 7f6dd5060640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6dd019b0b0 con 0x7f6dd0102a60 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.632+0000 7f6dce575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6dd0103c60 0x7f6dd019a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.632+0000 7f6dce575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6dd0103c60 0x7f6dd019a970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I 
am v2:192.168.123.107:47648/0 (socket says 192.168.123.107:47648) 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.632+0000 7f6dce575640 1 -- 192.168.123.107:0/680778487 learned_addr learned my addr 192.168.123.107:0/680778487 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.632+0000 7f6dce575640 1 -- 192.168.123.107:0/680778487 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6dd0102a60 msgr2=0x7f6dd019a430 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.632+0000 7f6dce575640 1 --2- 192.168.123.107:0/680778487 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6dd0102a60 0x7f6dd019a430 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.632+0000 7f6dce575640 1 -- 192.168.123.107:0/680778487 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6dc4009660 con 0x7f6dd0103c60 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.633+0000 7f6dce575640 1 --2- 192.168.123.107:0/680778487 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6dd0103c60 0x7f6dd019a970 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7f6dc4009ae0 tx=0x7f6dc40026e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.633+0000 7f6daffff640 1 -- 192.168.123.107:0/680778487 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6dc403d070 con 0x7f6dd0103c60 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.633+0000 7f6daffff640 1 -- 
192.168.123.107:0/680778487 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6dc40028c0 con 0x7f6dd0103c60 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.633+0000 7f6daffff640 1 -- 192.168.123.107:0/680778487 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6dc4041770 con 0x7f6dd0103c60 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.634+0000 7f6dd5060640 1 -- 192.168.123.107:0/680778487 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6dd019faf0 con 0x7f6dd0103c60 2026-03-09T19:24:45.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.634+0000 7f6dd5060640 1 -- 192.168.123.107:0/680778487 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6dd019ff60 con 0x7f6dd0103c60 2026-03-09T19:24:45.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.635+0000 7f6dd5060640 1 -- 192.168.123.107:0/680778487 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6d94005350 con 0x7f6dd0103c60 2026-03-09T19:24:45.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.635+0000 7f6daffff640 1 -- 192.168.123.107:0/680778487 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6dc402fc90 con 0x7f6dd0103c60 2026-03-09T19:24:45.635 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.637+0000 7f6daffff640 1 --2- 192.168.123.107:0/680778487 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6da4075fb0 0x7f6da4078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.637+0000 7f6daffff640 1 -- 192.168.123.107:0/680778487 <== 
mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f6dc40bc410 con 0x7f6dd0103c60 2026-03-09T19:24:45.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.638+0000 7f6dced76640 1 --2- 192.168.123.107:0/680778487 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6da4075fb0 0x7f6da4078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.636 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.638+0000 7f6dced76640 1 --2- 192.168.123.107:0/680778487 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6da4075fb0 0x7f6da4078470 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f6dd0103ac0 tx=0x7f6db8009290 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:45.637 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.639+0000 7f6daffff640 1 -- 192.168.123.107:0/680778487 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6dc40859d0 con 0x7f6dd0103c60 2026-03-09T19:24:45.752 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.754+0000 7f6dd5060640 1 -- 192.168.123.107:0/680778487 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f6d94005e10 con 0x7f6dd0103c60 2026-03-09T19:24:45.753 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.755+0000 7f6daffff640 1 -- 192.168.123.107:0/680778487 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1856 (secure 0 0 0) 0x7f6dc4085370 con 0x7f6dd0103c60 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:24:45.754 
INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:24:45.754 INFO:teuthology.orchestra.run.vm07.stdout:compat 
compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout: 
2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:24:45.755 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:24:45.756 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.759+0000 7f6dd5060640 1 -- 192.168.123.107:0/680778487 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6da4075fb0 msgr2=0x7f6da4078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.759+0000 7f6dd5060640 1 --2- 192.168.123.107:0/680778487 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6da4075fb0 0x7f6da4078470 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f6dd0103ac0 tx=0x7f6db8009290 comp rx=0 tx=0).stop 2026-03-09T19:24:45.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.759+0000 7f6dd5060640 1 -- 192.168.123.107:0/680778487 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6dd0103c60 msgr2=0x7f6dd019a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.759+0000 7f6dd5060640 1 --2- 192.168.123.107:0/680778487 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6dd0103c60 0x7f6dd019a970 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7f6dc4009ae0 tx=0x7f6dc40026e0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.759+0000 7f6dd5060640 1 -- 
192.168.123.107:0/680778487 shutdown_connections 2026-03-09T19:24:45.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.759+0000 7f6dd5060640 1 --2- 192.168.123.107:0/680778487 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6da4075fb0 0x7f6da4078470 secure :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f6dd0103ac0 tx=0x7f6db8009290 comp rx=0 tx=0).stop 2026-03-09T19:24:45.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.759+0000 7f6dd5060640 1 --2- 192.168.123.107:0/680778487 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6dd0103c60 0x7f6dd019a970 secure :-1 s=CLOSED pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7f6dc4009ae0 tx=0x7f6dc40026e0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.759+0000 7f6dd5060640 1 --2- 192.168.123.107:0/680778487 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6dd0102a60 0x7f6dd019a430 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.759+0000 7f6dd5060640 1 -- 192.168.123.107:0/680778487 >> 192.168.123.107:0/680778487 conn(0x7f6dd00fe250 msgr2=0x7f6dd00ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:45.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.759+0000 7f6dd5060640 1 -- 192.168.123.107:0/680778487 shutdown_connections 2026-03-09T19:24:45.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.760+0000 7f6dd5060640 1 -- 192.168.123.107:0/680778487 wait complete. 
2026-03-09T19:24:45.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.819+0000 7fa88684c640 1 -- 192.168.123.107:0/846731147 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa880075ba0 msgr2=0x7fa880075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.819+0000 7fa88684c640 1 --2- 192.168.123.107:0/846731147 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa880075ba0 0x7fa880075fa0 secure :-1 s=READY pgs=293 cs=0 l=1 rev1=1 crypto rx=0x7fa8700099b0 tx=0x7fa87002f220 comp rx=0 tx=0).stop 2026-03-09T19:24:45.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.819+0000 7fa88684c640 1 -- 192.168.123.107:0/846731147 shutdown_connections 2026-03-09T19:24:45.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.819+0000 7fa88684c640 1 --2- 192.168.123.107:0/846731147 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa880076df0 0x7fa880077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.819+0000 7fa88684c640 1 --2- 192.168.123.107:0/846731147 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa880075ba0 0x7fa880075fa0 unknown :-1 s=CLOSED pgs=293 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.819+0000 7fa88684c640 1 -- 192.168.123.107:0/846731147 >> 192.168.123.107:0/846731147 conn(0x7fa8800fe250 msgr2=0x7fa880100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:45.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.820+0000 7fa88684c640 1 -- 192.168.123.107:0/846731147 shutdown_connections 2026-03-09T19:24:45.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.820+0000 7fa88684c640 1 -- 192.168.123.107:0/846731147 wait 
complete. 2026-03-09T19:24:45.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.820+0000 7fa88684c640 1 Processor -- start 2026-03-09T19:24:45.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.820+0000 7fa88684c640 1 -- start start 2026-03-09T19:24:45.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.821+0000 7fa88684c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa880075ba0 0x7fa88019e920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.821+0000 7fa88684c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa880076df0 0x7fa88019ee60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.821+0000 7fa88684c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa88019f430 con 0x7fa880076df0 2026-03-09T19:24:45.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.821+0000 7fa88684c640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa88019f5a0 con 0x7fa880075ba0 2026-03-09T19:24:45.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.821+0000 7fa87ffff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa880075ba0 0x7fa88019e920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.821+0000 7fa87ffff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa880075ba0 0x7fa88019e920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I 
am v2:192.168.123.107:57738/0 (socket says 192.168.123.107:57738) 2026-03-09T19:24:45.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.821+0000 7fa87ffff640 1 -- 192.168.123.107:0/1782081795 learned_addr learned my addr 192.168.123.107:0/1782081795 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:45.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.821+0000 7fa87f7fe640 1 --2- 192.168.123.107:0/1782081795 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa880076df0 0x7fa88019ee60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.821+0000 7fa87ffff640 1 -- 192.168.123.107:0/1782081795 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa880076df0 msgr2=0x7fa88019ee60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.821+0000 7fa87ffff640 1 --2- 192.168.123.107:0/1782081795 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa880076df0 0x7fa88019ee60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.821+0000 7fa87ffff640 1 -- 192.168.123.107:0/1782081795 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa870009660 con 0x7fa880075ba0 2026-03-09T19:24:45.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.822+0000 7fa87ffff640 1 --2- 192.168.123.107:0/1782081795 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa880075ba0 0x7fa88019e920 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fa870002c20 tx=0x7fa870002910 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:24:45.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.822+0000 7fa87d7fa640 1 -- 192.168.123.107:0/1782081795 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa87003d070 con 0x7fa880075ba0 2026-03-09T19:24:45.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.822+0000 7fa88684c640 1 -- 192.168.123.107:0/1782081795 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa8801a3fe0 con 0x7fa880075ba0 2026-03-09T19:24:45.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.822+0000 7fa88684c640 1 -- 192.168.123.107:0/1782081795 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa8801a4450 con 0x7fa880075ba0 2026-03-09T19:24:45.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.822+0000 7fa87d7fa640 1 -- 192.168.123.107:0/1782081795 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa870002e20 con 0x7fa880075ba0 2026-03-09T19:24:45.821 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.822+0000 7fa87d7fa640 1 -- 192.168.123.107:0/1782081795 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa870041690 con 0x7fa880075ba0 2026-03-09T19:24:45.821 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.823+0000 7fa88684c640 1 -- 192.168.123.107:0/1782081795 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa84c005350 con 0x7fa880075ba0 2026-03-09T19:24:45.821 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.823+0000 7fa87d7fa640 1 -- 192.168.123.107:0/1782081795 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fa870038730 con 0x7fa880075ba0 2026-03-09T19:24:45.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.824+0000 
7fa87d7fa640 1 --2- 192.168.123.107:0/1782081795 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa8540761c0 0x7fa854078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.824+0000 7fa87d7fa640 1 -- 192.168.123.107:0/1782081795 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fa8700bc620 con 0x7fa880075ba0 2026-03-09T19:24:45.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.824+0000 7fa87f7fe640 1 --2- 192.168.123.107:0/1782081795 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa8540761c0 0x7fa854078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.824+0000 7fa87f7fe640 1 --2- 192.168.123.107:0/1782081795 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa8540761c0 0x7fa854078680 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fa88019fe40 tx=0x7fa86c009290 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:45.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.826+0000 7fa87d7fa640 1 -- 192.168.123.107:0/1782081795 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa870085be0 con 0x7fa880075ba0 2026-03-09T19:24:45.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.925+0000 7fa88684c640 1 -- 192.168.123.107:0/1782081795 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fa84c002bf0 con 
0x7fa8540761c0 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.926+0000 7fa87d7fa640 1 -- 192.168.123.107:0/1782081795 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fa84c002bf0 con 0x7fa8540761c0 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [], 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "", 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.929+0000 7fa88684c640 1 -- 192.168.123.107:0/1782081795 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa8540761c0 msgr2=0x7fa854078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.929+0000 7fa88684c640 1 --2- 192.168.123.107:0/1782081795 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa8540761c0 0x7fa854078680 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fa88019fe40 tx=0x7fa86c009290 comp rx=0 tx=0).stop 2026-03-09T19:24:45.933 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.929+0000 7fa88684c640 1 -- 192.168.123.107:0/1782081795 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa880075ba0 msgr2=0x7fa88019e920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.929+0000 7fa88684c640 1 --2- 192.168.123.107:0/1782081795 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa880075ba0 0x7fa88019e920 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fa870002c20 tx=0x7fa870002910 comp rx=0 tx=0).stop 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.929+0000 7fa88684c640 1 -- 192.168.123.107:0/1782081795 shutdown_connections 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.929+0000 7fa88684c640 1 --2- 192.168.123.107:0/1782081795 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fa8540761c0 0x7fa854078680 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.929+0000 7fa88684c640 1 --2- 192.168.123.107:0/1782081795 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa880076df0 0x7fa88019ee60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.929+0000 7fa88684c640 1 --2- 192.168.123.107:0/1782081795 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa880075ba0 0x7fa88019e920 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.929+0000 7fa88684c640 1 -- 192.168.123.107:0/1782081795 >> 192.168.123.107:0/1782081795 conn(0x7fa8800fe250 msgr2=0x7fa8800ffa20 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.929+0000 7fa88684c640 1 -- 192.168.123.107:0/1782081795 shutdown_connections 2026-03-09T19:24:45.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.929+0000 7fa88684c640 1 -- 192.168.123.107:0/1782081795 wait complete. 2026-03-09T19:24:45.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.986+0000 7f07bcdc1640 1 -- 192.168.123.107:0/3882347940 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f07b8101a00 msgr2=0x7f07b8101e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.986+0000 7f07bcdc1640 1 --2- 192.168.123.107:0/3882347940 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f07b8101a00 0x7f07b8101e80 secure :-1 s=READY pgs=294 cs=0 l=1 rev1=1 crypto rx=0x7f07ac0099b0 tx=0x7f07ac02f240 comp rx=0 tx=0).stop 2026-03-09T19:24:45.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.987+0000 7f07bcdc1640 1 -- 192.168.123.107:0/3882347940 shutdown_connections 2026-03-09T19:24:45.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.987+0000 7f07bcdc1640 1 --2- 192.168.123.107:0/3882347940 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f07b8101a00 0x7f07b8101e80 unknown :-1 s=CLOSED pgs=294 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.987+0000 7f07bcdc1640 1 --2- 192.168.123.107:0/3882347940 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07b8100800 0x7f07b8100c00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.987+0000 7f07bcdc1640 1 -- 192.168.123.107:0/3882347940 >> 192.168.123.107:0/3882347940 conn(0x7f07b80fbfb0 msgr2=0x7f07b80fe3d0 unknown 
:-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:45.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.987+0000 7f07bcdc1640 1 -- 192.168.123.107:0/3882347940 shutdown_connections 2026-03-09T19:24:45.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.988+0000 7f07bcdc1640 1 -- 192.168.123.107:0/3882347940 wait complete. 2026-03-09T19:24:45.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.988+0000 7f07bcdc1640 1 Processor -- start 2026-03-09T19:24:45.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.988+0000 7f07bcdc1640 1 -- start start 2026-03-09T19:24:45.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.988+0000 7f07bcdc1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f07b8100800 0x7f07b8195f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.989+0000 7f07b6575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f07b8100800 0x7f07b8195f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.989+0000 7f07bcdc1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07b8101a00 0x7f07b8196460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.989+0000 7f07bcdc1640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f07b8196a30 con 0x7f07b8100800 2026-03-09T19:24:45.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.989+0000 7f07b6575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f07b8100800 0x7f07b8195f20 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:47668/0 (socket says 192.168.123.107:47668) 2026-03-09T19:24:45.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.989+0000 7f07b6575640 1 -- 192.168.123.107:0/1214859177 learned_addr learned my addr 192.168.123.107:0/1214859177 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:24:45.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.989+0000 7f07bcdc1640 1 -- 192.168.123.107:0/1214859177 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f07b8196ba0 con 0x7f07b8101a00 2026-03-09T19:24:45.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.989+0000 7f07b5d74640 1 --2- 192.168.123.107:0/1214859177 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07b8101a00 0x7f07b8196460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.989+0000 7f07b6575640 1 -- 192.168.123.107:0/1214859177 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07b8101a00 msgr2=0x7f07b8196460 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:45.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.989+0000 7f07b6575640 1 --2- 192.168.123.107:0/1214859177 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07b8101a00 0x7f07b8196460 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:45.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.989+0000 7f07b6575640 1 -- 192.168.123.107:0/1214859177 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f07ac009660 con 0x7f07b8100800 2026-03-09T19:24:45.987 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.989+0000 7f07b6575640 1 --2- 192.168.123.107:0/1214859177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f07b8100800 0x7f07b8195f20 secure :-1 s=READY pgs=295 cs=0 l=1 rev1=1 crypto rx=0x7f07a000e990 tx=0x7f07a000ee60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:45.988 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.990+0000 7f079f7fe640 1 -- 192.168.123.107:0/1214859177 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f07a0009800 con 0x7f07b8100800 2026-03-09T19:24:45.988 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.990+0000 7f079f7fe640 1 -- 192.168.123.107:0/1214859177 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f07a0004590 con 0x7f07b8100800 2026-03-09T19:24:45.988 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.990+0000 7f07bcdc1640 1 -- 192.168.123.107:0/1214859177 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f07b819b640 con 0x7f07b8100800 2026-03-09T19:24:45.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.990+0000 7f07bcdc1640 1 -- 192.168.123.107:0/1214859177 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f07b819bb90 con 0x7f07b8100800 2026-03-09T19:24:45.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.990+0000 7f079f7fe640 1 -- 192.168.123.107:0/1214859177 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f07a0010640 con 0x7f07b8100800 2026-03-09T19:24:45.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.991+0000 7f079f7fe640 1 -- 192.168.123.107:0/1214859177 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f07a00107a0 con 0x7f07b8100800 
2026-03-09T19:24:45.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.992+0000 7f07bcdc1640 1 -- 192.168.123.107:0/1214859177 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f07b81071d0 con 0x7f07b8100800 2026-03-09T19:24:45.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.992+0000 7f079f7fe640 1 --2- 192.168.123.107:0/1214859177 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f07880761c0 0x7f0788078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:24:45.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.992+0000 7f079f7fe640 1 -- 192.168.123.107:0/1214859177 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f07a0014070 con 0x7f07b8100800 2026-03-09T19:24:45.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.995+0000 7f07b5d74640 1 --2- 192.168.123.107:0/1214859177 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f07880761c0 0x7f0788078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:24:45.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.995+0000 7f079f7fe640 1 -- 192.168.123.107:0/1214859177 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f07a0062680 con 0x7f07b8100800 2026-03-09T19:24:45.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:45.995+0000 7f07b5d74640 1 --2- 192.168.123.107:0/1214859177 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f07880761c0 0x7f0788078680 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f07ac002c30 tx=0x7f07ac03a040 comp rx=0 tx=0).ready 
entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:24:46.119 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.121+0000 7f07bcdc1640 1 -- 192.168.123.107:0/1214859177 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f07b8107420 con 0x7f07b8100800 2026-03-09T19:24:46.125 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.126+0000 7f079f7fe640 1 -- 192.168.123.107:0/1214859177 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f07a0062020 con 0x7f07b8100800 2026-03-09T19:24:46.125 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:24:46.127 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.129+0000 7f07bcdc1640 1 -- 192.168.123.107:0/1214859177 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f07880761c0 msgr2=0x7f0788078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:46.127 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.129+0000 7f07bcdc1640 1 --2- 192.168.123.107:0/1214859177 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f07880761c0 0x7f0788078680 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f07ac002c30 tx=0x7f07ac03a040 comp rx=0 tx=0).stop 2026-03-09T19:24:46.127 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.129+0000 7f07bcdc1640 1 -- 192.168.123.107:0/1214859177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f07b8100800 msgr2=0x7f07b8195f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:24:46.127 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.129+0000 7f07bcdc1640 1 --2- 192.168.123.107:0/1214859177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f07b8100800 0x7f07b8195f20 secure :-1 s=READY pgs=295 
cs=0 l=1 rev1=1 crypto rx=0x7f07a000e990 tx=0x7f07a000ee60 comp rx=0 tx=0).stop 2026-03-09T19:24:46.127 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.129+0000 7f07bcdc1640 1 -- 192.168.123.107:0/1214859177 shutdown_connections 2026-03-09T19:24:46.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.129+0000 7f07bcdc1640 1 --2- 192.168.123.107:0/1214859177 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f07880761c0 0x7f0788078680 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:46.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.129+0000 7f07bcdc1640 1 --2- 192.168.123.107:0/1214859177 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07b8101a00 0x7f07b8196460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:46.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.129+0000 7f07bcdc1640 1 --2- 192.168.123.107:0/1214859177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f07b8100800 0x7f07b8195f20 unknown :-1 s=CLOSED pgs=295 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:24:46.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.129+0000 7f07bcdc1640 1 -- 192.168.123.107:0/1214859177 >> 192.168.123.107:0/1214859177 conn(0x7f07b80fbfb0 msgr2=0x7f07b80fda90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:24:46.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.130+0000 7f07bcdc1640 1 -- 192.168.123.107:0/1214859177 shutdown_connections 2026-03-09T19:24:46.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:24:46.130+0000 7f07bcdc1640 1 -- 192.168.123.107:0/1214859177 wait complete. 
2026-03-09T19:24:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:46 vm07.local ceph-mon[48545]: from='client.14564 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:46 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/3001395540' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:24:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:46 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/680778487' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:24:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:46 vm07.local ceph-mon[48545]: from='client.24339 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:46 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/1214859177' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:24:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:46 vm08.local ceph-mon[57794]: from='client.14564 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:46 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/3001395540' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:24:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:46 vm08.local ceph-mon[57794]: from='client.? 
192.168.123.107:0/680778487' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:24:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:46 vm08.local ceph-mon[57794]: from='client.24339 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:24:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:46 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/1214859177' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:24:47.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:47 vm07.local ceph-mon[48545]: pgmap v88: 65 pgs: 65 active+clean; 459 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 3.5 KiB/s rd, 1.1 KiB/s wr, 4 op/s 2026-03-09T19:24:47.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:47 vm08.local ceph-mon[57794]: pgmap v88: 65 pgs: 65 active+clean; 459 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 3.5 KiB/s rd, 1.1 KiB/s wr, 4 op/s 2026-03-09T19:24:48.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:48 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:24:48.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:48 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:24:49.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:49 vm07.local ceph-mon[48545]: pgmap v89: 65 pgs: 65 active+clean; 459 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 1023 B/s rd, 1.1 KiB/s wr, 2 op/s 2026-03-09T19:24:49.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:49 vm08.local ceph-mon[57794]: pgmap v89: 65 pgs: 65 active+clean; 459 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 1023 B/s rd, 1.1 KiB/s wr, 2 op/s 
2026-03-09T19:24:51.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:51 vm07.local ceph-mon[48545]: pgmap v90: 65 pgs: 65 active+clean; 459 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 767 B/s wr, 0 op/s 2026-03-09T19:24:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:51 vm08.local ceph-mon[57794]: pgmap v90: 65 pgs: 65 active+clean; 459 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 767 B/s wr, 0 op/s 2026-03-09T19:24:53.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:53 vm07.local ceph-mon[48545]: pgmap v91: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 1.1 KiB/s wr, 0 op/s 2026-03-09T19:24:54.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:53 vm08.local ceph-mon[57794]: pgmap v91: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 1.1 KiB/s wr, 0 op/s 2026-03-09T19:24:55.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:55 vm07.local ceph-mon[48545]: pgmap v92: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-09T19:24:56.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:55 vm08.local ceph-mon[57794]: pgmap v92: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-09T19:24:58.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:24:57 vm08.local ceph-mon[57794]: pgmap v93: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-09T19:24:58.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:24:57 vm07.local ceph-mon[48545]: pgmap v93: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-09T19:25:00.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:00 vm07.local ceph-mon[48545]: pgmap v94: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-09T19:25:00.595 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:00 vm08.local ceph-mon[57794]: pgmap v94: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-09T19:25:02.137 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:02 vm07.local ceph-mon[48545]: pgmap v95: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-09T19:25:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:02 vm08.local ceph-mon[57794]: pgmap v95: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-09T19:25:04.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:04 vm08.local ceph-mon[57794]: pgmap v96: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-09T19:25:04.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:04 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:25:04.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:04 vm07.local ceph-mon[48545]: pgmap v96: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-09T19:25:04.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:04 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:25:05.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:05 vm07.local ceph-mon[48545]: pgmap v97: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s 2026-03-09T19:25:05.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:05 vm08.local ceph-mon[57794]: pgmap v97: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s 
2026-03-09T19:25:07.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:07 vm08.local ceph-mon[57794]: pgmap v98: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:07.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:07 vm07.local ceph-mon[48545]: pgmap v98: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:09.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:09 vm08.local ceph-mon[57794]: pgmap v99: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:09.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:09 vm07.local ceph-mon[48545]: pgmap v99: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:11.836 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:11 vm07.local ceph-mon[48545]: pgmap v100: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:11.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:11 vm08.local ceph-mon[57794]: pgmap v100: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:13.933 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:13 vm08.local ceph-mon[57794]: pgmap v101: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:13.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:13 vm07.local ceph-mon[48545]: pgmap v101: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:15.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:15 vm07.local ceph-mon[48545]: pgmap v102: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:16.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:15 vm08.local ceph-mon[57794]: pgmap v102: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 
2026-03-09T19:25:16.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.232+0000 7f67a9c15640 1 -- 192.168.123.107:0/1961413278 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67a41028b0 msgr2=0x7f67a4102cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.232+0000 7f67a9c15640 1 --2- 192.168.123.107:0/1961413278 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67a41028b0 0x7f67a4102cb0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f67940098e0 tx=0x7f679402f1d0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.233+0000 7f67a9c15640 1 -- 192.168.123.107:0/1961413278 shutdown_connections 2026-03-09T19:25:16.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.233+0000 7f67a9c15640 1 --2- 192.168.123.107:0/1961413278 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67a4103ab0 0x7f67a4103f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.233+0000 7f67a9c15640 1 --2- 192.168.123.107:0/1961413278 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67a41028b0 0x7f67a4102cb0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.233+0000 7f67a9c15640 1 -- 192.168.123.107:0/1961413278 >> 192.168.123.107:0/1961413278 conn(0x7f67a40fe060 msgr2=0x7f67a4100480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:16.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.234+0000 7f67a9c15640 1 -- 192.168.123.107:0/1961413278 shutdown_connections 2026-03-09T19:25:16.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.234+0000 7f67a9c15640 1 -- 192.168.123.107:0/1961413278 
wait complete. 2026-03-09T19:25:16.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.235+0000 7f67a9c15640 1 Processor -- start 2026-03-09T19:25:16.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.235+0000 7f67a9c15640 1 -- start start 2026-03-09T19:25:16.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.235+0000 7f67a9c15640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67a41028b0 0x7f67a40716b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:16.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.235+0000 7f67a9c15640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67a4103ab0 0x7f67a4071bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:16.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.235+0000 7f67a9c15640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67a40730f0 con 0x7f67a41028b0 2026-03-09T19:25:16.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.235+0000 7f67a9c15640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67a4073260 con 0x7f67a4103ab0 2026-03-09T19:25:16.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.235+0000 7f679bfff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67a4103ab0 0x7f67a4071bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:16.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.236+0000 7f679bfff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67a4103ab0 0x7f67a4071bf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.107:48544/0 (socket says 192.168.123.107:48544) 2026-03-09T19:25:16.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.236+0000 7f679bfff640 1 -- 192.168.123.107:0/697401877 learned_addr learned my addr 192.168.123.107:0/697401877 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:16.235 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.237+0000 7f67a8c13640 1 --2- 192.168.123.107:0/697401877 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67a41028b0 0x7f67a40716b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:16.235 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.237+0000 7f679bfff640 1 -- 192.168.123.107:0/697401877 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67a41028b0 msgr2=0x7f67a40716b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.235 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.237+0000 7f679bfff640 1 --2- 192.168.123.107:0/697401877 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67a41028b0 0x7f67a40716b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.235 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.237+0000 7f679bfff640 1 -- 192.168.123.107:0/697401877 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6794009590 con 0x7f67a4103ab0 2026-03-09T19:25:16.235 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.237+0000 7f679bfff640 1 --2- 192.168.123.107:0/697401877 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67a4103ab0 0x7f67a4071bf0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f678c002fd0 tx=0x7f678c00da70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:25:16.235 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.238+0000 7f6799ffb640 1 -- 192.168.123.107:0/697401877 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f678c0098e0 con 0x7f67a4103ab0 2026-03-09T19:25:16.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.238+0000 7f67a9c15640 1 -- 192.168.123.107:0/697401877 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f67a40722d0 con 0x7f67a4103ab0 2026-03-09T19:25:16.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.238+0000 7f67a9c15640 1 -- 192.168.123.107:0/697401877 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f67a41a89d0 con 0x7f67a4103ab0 2026-03-09T19:25:16.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.238+0000 7f6799ffb640 1 -- 192.168.123.107:0/697401877 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f678c010460 con 0x7f67a4103ab0 2026-03-09T19:25:16.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.238+0000 7f6799ffb640 1 -- 192.168.123.107:0/697401877 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f678c00f5d0 con 0x7f67a4103ab0 2026-03-09T19:25:16.237 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.239+0000 7f67737fe640 1 -- 192.168.123.107:0/697401877 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f67a410b4e0 con 0x7f67a4103ab0 2026-03-09T19:25:16.238 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.240+0000 7f6799ffb640 1 -- 192.168.123.107:0/697401877 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f678c0105d0 con 0x7f67a4103ab0 2026-03-09T19:25:16.238 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.240+0000 
7f6799ffb640 1 --2- 192.168.123.107:0/697401877 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6780076170 0x7f6780078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:16.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.241+0000 7f67a8c13640 1 --2- 192.168.123.107:0/697401877 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6780076170 0x7f6780078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:16.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.241+0000 7f67a8c13640 1 --2- 192.168.123.107:0/697401877 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6780076170 0x7f6780078630 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f67940098e0 tx=0x7f67940023d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:16.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.241+0000 7f6799ffb640 1 -- 192.168.123.107:0/697401877 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f678c0611a0 con 0x7f67a4103ab0 2026-03-09T19:25:16.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.245+0000 7f6799ffb640 1 -- 192.168.123.107:0/697401877 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f678c060fe0 con 0x7f67a4103ab0 2026-03-09T19:25:16.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.420+0000 7f67737fe640 1 -- 192.168.123.107:0/697401877 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f67a410b720 con 0x7f6780076170 
2026-03-09T19:25:16.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.421+0000 7f6799ffb640 1 -- 192.168.123.107:0/697401877 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f67a410b720 con 0x7f6780076170 2026-03-09T19:25:16.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.423+0000 7f67737fe640 1 -- 192.168.123.107:0/697401877 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6780076170 msgr2=0x7f6780078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.424+0000 7f67737fe640 1 --2- 192.168.123.107:0/697401877 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6780076170 0x7f6780078630 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f67940098e0 tx=0x7f67940023d0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.424+0000 7f67737fe640 1 -- 192.168.123.107:0/697401877 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67a4103ab0 msgr2=0x7f67a4071bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.424+0000 7f67737fe640 1 --2- 192.168.123.107:0/697401877 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67a4103ab0 0x7f67a4071bf0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f678c002fd0 tx=0x7f678c00da70 comp rx=0 tx=0).stop 2026-03-09T19:25:16.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.424+0000 7f67737fe640 1 -- 192.168.123.107:0/697401877 shutdown_connections 2026-03-09T19:25:16.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.424+0000 7f67737fe640 1 --2- 192.168.123.107:0/697401877 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] 
conn(0x7f6780076170 0x7f6780078630 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.424+0000 7f67737fe640 1 --2- 192.168.123.107:0/697401877 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67a4103ab0 0x7f67a4071bf0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.425+0000 7f67737fe640 1 --2- 192.168.123.107:0/697401877 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67a41028b0 0x7f67a40716b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.425+0000 7f67737fe640 1 -- 192.168.123.107:0/697401877 >> 192.168.123.107:0/697401877 conn(0x7f67a40fe060 msgr2=0x7f67a40ffbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:16.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.426+0000 7f67737fe640 1 -- 192.168.123.107:0/697401877 shutdown_connections 2026-03-09T19:25:16.424 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.426+0000 7f67737fe640 1 -- 192.168.123.107:0/697401877 wait complete. 
2026-03-09T19:25:16.432 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:25:16.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.495+0000 7ff992f28640 1 -- 192.168.123.107:0/3840759643 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff98c072370 msgr2=0x7ff98c10c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.495+0000 7ff992f28640 1 --2- 192.168.123.107:0/3840759643 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff98c072370 0x7ff98c10c590 secure :-1 s=READY pgs=296 cs=0 l=1 rev1=1 crypto rx=0x7ff98400b0a0 tx=0x7ff98402f4c0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.496+0000 7ff992f28640 1 -- 192.168.123.107:0/3840759643 shutdown_connections 2026-03-09T19:25:16.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.496+0000 7ff992f28640 1 --2- 192.168.123.107:0/3840759643 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff98c072370 0x7ff98c10c590 unknown :-1 s=CLOSED pgs=296 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.496+0000 7ff992f28640 1 --2- 192.168.123.107:0/3840759643 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff98c0719a0 0x7ff98c071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.496+0000 7ff992f28640 1 -- 192.168.123.107:0/3840759643 >> 192.168.123.107:0/3840759643 conn(0x7ff98c06d4f0 msgr2=0x7ff98c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:16.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.497+0000 7ff992f28640 1 -- 192.168.123.107:0/3840759643 shutdown_connections 2026-03-09T19:25:16.498 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.497+0000 7ff992f28640 1 -- 192.168.123.107:0/3840759643 wait complete. 2026-03-09T19:25:16.498 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.499+0000 7ff992f28640 1 Processor -- start 2026-03-09T19:25:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.501+0000 7ff992f28640 1 -- start start 2026-03-09T19:25:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.501+0000 7ff992f28640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff98c0719a0 0x7ff98c115970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.501+0000 7ff992f28640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff98c072370 0x7ff98c115eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.501+0000 7ff992f28640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff98c1173b0 con 0x7ff98c072370 2026-03-09T19:25:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.501+0000 7ff992f28640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff98c117520 con 0x7ff98c0719a0 2026-03-09T19:25:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.501+0000 7ff991725640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff98c072370 0x7ff98c115eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.501+0000 7ff991725640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff98c072370 0x7ff98c115eb0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:34954/0 (socket says 192.168.123.107:34954) 2026-03-09T19:25:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.501+0000 7ff991725640 1 -- 192.168.123.107:0/2552533347 learned_addr learned my addr 192.168.123.107:0/2552533347 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.501+0000 7ff991f26640 1 --2- 192.168.123.107:0/2552533347 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff98c0719a0 0x7ff98c115970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.502+0000 7ff991725640 1 -- 192.168.123.107:0/2552533347 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff98c0719a0 msgr2=0x7ff98c115970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.502+0000 7ff991725640 1 --2- 192.168.123.107:0/2552533347 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff98c0719a0 0x7ff98c115970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.502+0000 7ff991725640 1 -- 192.168.123.107:0/2552533347 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff984009d00 con 0x7ff98c072370 2026-03-09T19:25:16.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.502+0000 7ff991725640 1 --2- 192.168.123.107:0/2552533347 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff98c072370 0x7ff98c115eb0 secure :-1 s=READY pgs=297 cs=0 l=1 rev1=1 crypto 
rx=0x7ff984002790 tx=0x7ff984002a60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:16.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.502+0000 7ff97affd640 1 -- 192.168.123.107:0/2552533347 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff984002e60 con 0x7ff98c072370 2026-03-09T19:25:16.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.502+0000 7ff992f28640 1 -- 192.168.123.107:0/2552533347 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff98c1164b0 con 0x7ff98c072370 2026-03-09T19:25:16.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.502+0000 7ff992f28640 1 -- 192.168.123.107:0/2552533347 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff98c1b5990 con 0x7ff98c072370 2026-03-09T19:25:16.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.502+0000 7ff97affd640 1 -- 192.168.123.107:0/2552533347 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff984007b20 con 0x7ff98c072370 2026-03-09T19:25:16.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.503+0000 7ff97affd640 1 -- 192.168.123.107:0/2552533347 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff984040cd0 con 0x7ff98c072370 2026-03-09T19:25:16.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.504+0000 7ff992f28640 1 -- 192.168.123.107:0/2552533347 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff954005350 con 0x7ff98c072370 2026-03-09T19:25:16.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.504+0000 7ff97affd640 1 -- 192.168.123.107:0/2552533347 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 
0x7ff984007660 con 0x7ff98c072370 2026-03-09T19:25:16.509 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.507+0000 7ff97affd640 1 --2- 192.168.123.107:0/2552533347 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff960075f60 0x7ff960078420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:16.509 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.507+0000 7ff97affd640 1 -- 192.168.123.107:0/2552533347 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7ff9840bc840 con 0x7ff98c072370 2026-03-09T19:25:16.509 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.509+0000 7ff991f26640 1 --2- 192.168.123.107:0/2552533347 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff960075f60 0x7ff960078420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:16.509 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.511+0000 7ff991f26640 1 --2- 192.168.123.107:0/2552533347 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff960075f60 0x7ff960078420 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7ff98c071800 tx=0x7ff980009210 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:16.513 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.511+0000 7ff97affd640 1 -- 192.168.123.107:0/2552533347 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff984085e30 con 0x7ff98c072370 2026-03-09T19:25:16.648 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.649+0000 7ff992f28640 1 -- 192.168.123.107:0/2552533347 --> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff954002bf0 con 0x7ff960075f60 2026-03-09T19:25:16.649 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.651+0000 7ff97affd640 1 -- 192.168.123.107:0/2552533347 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7ff954002bf0 con 0x7ff960075f60 2026-03-09T19:25:16.652 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.653+0000 7ff992f28640 1 -- 192.168.123.107:0/2552533347 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff960075f60 msgr2=0x7ff960078420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.652 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.653+0000 7ff992f28640 1 --2- 192.168.123.107:0/2552533347 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff960075f60 0x7ff960078420 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7ff98c071800 tx=0x7ff980009210 comp rx=0 tx=0).stop 2026-03-09T19:25:16.652 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.654+0000 7ff992f28640 1 -- 192.168.123.107:0/2552533347 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff98c072370 msgr2=0x7ff98c115eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.652 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.654+0000 7ff992f28640 1 --2- 192.168.123.107:0/2552533347 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff98c072370 0x7ff98c115eb0 secure :-1 s=READY pgs=297 cs=0 l=1 rev1=1 crypto rx=0x7ff984002790 tx=0x7ff984002a60 comp rx=0 tx=0).stop 2026-03-09T19:25:16.652 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.654+0000 7ff992f28640 1 -- 192.168.123.107:0/2552533347 shutdown_connections 2026-03-09T19:25:16.652 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.654+0000 7ff992f28640 1 --2- 192.168.123.107:0/2552533347 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff960075f60 0x7ff960078420 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.653 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.654+0000 7ff992f28640 1 --2- 192.168.123.107:0/2552533347 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff98c072370 0x7ff98c115eb0 unknown :-1 s=CLOSED pgs=297 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.653 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.654+0000 7ff992f28640 1 --2- 192.168.123.107:0/2552533347 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff98c0719a0 0x7ff98c115970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.653 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.654+0000 7ff992f28640 1 -- 192.168.123.107:0/2552533347 >> 192.168.123.107:0/2552533347 conn(0x7ff98c06d4f0 msgr2=0x7ff98c10a810 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:16.653 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.655+0000 7ff992f28640 1 -- 192.168.123.107:0/2552533347 shutdown_connections 2026-03-09T19:25:16.653 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.655+0000 7ff992f28640 1 -- 192.168.123.107:0/2552533347 wait complete. 
2026-03-09T19:25:16.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.742+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/4057307681 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f981028d0 msgr2=0x7f8f98102cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.742+0000 7f8f9f0a9640 1 --2- 192.168.123.107:0/4057307681 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f981028d0 0x7f8f98102cd0 secure :-1 s=READY pgs=298 cs=0 l=1 rev1=1 crypto rx=0x7f8f88009a00 tx=0x7f8f8802f290 comp rx=0 tx=0).stop 2026-03-09T19:25:16.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.744+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/4057307681 shutdown_connections 2026-03-09T19:25:16.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.744+0000 7f8f9f0a9640 1 --2- 192.168.123.107:0/4057307681 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f98103ad0 0x7f8f98103f50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.744+0000 7f8f9f0a9640 1 --2- 192.168.123.107:0/4057307681 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f981028d0 0x7f8f98102cd0 unknown :-1 s=CLOSED pgs=298 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.744+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/4057307681 >> 192.168.123.107:0/4057307681 conn(0x7f8f980fe060 msgr2=0x7f8f981004a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:16.743 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.744+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/4057307681 shutdown_connections 2026-03-09T19:25:16.743 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.744+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/4057307681 
wait complete. 2026-03-09T19:25:16.744 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.746+0000 7f8f9f0a9640 1 Processor -- start 2026-03-09T19:25:16.745 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.746+0000 7f8f9f0a9640 1 -- start start 2026-03-09T19:25:16.746 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.746+0000 7f8f9f0a9640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f981028d0 0x7f8f98078fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:16.746 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.746+0000 7f8f9f0a9640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f98103ad0 0x7f8f980794e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:16.746 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.746+0000 7f8f9f0a9640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f98075a00 con 0x7f8f98103ad0 2026-03-09T19:25:16.746 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.746+0000 7f8f9f0a9640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f98075b70 con 0x7f8f981028d0 2026-03-09T19:25:16.746 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.748+0000 7f8f9ce1e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f981028d0 0x7f8f98078fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:16.747 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.748+0000 7f8f8ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f98103ad0 0x7f8f980794e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T19:25:16.747 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.748+0000 7f8f9ce1e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f981028d0 0x7f8f98078fa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:48582/0 (socket says 192.168.123.107:48582) 2026-03-09T19:25:16.747 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.748+0000 7f8f9ce1e640 1 -- 192.168.123.107:0/2747860228 learned_addr learned my addr 192.168.123.107:0/2747860228 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:16.747 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.748+0000 7f8f8ffff640 1 -- 192.168.123.107:0/2747860228 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f981028d0 msgr2=0x7f8f98078fa0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.747 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.748+0000 7f8f8ffff640 1 --2- 192.168.123.107:0/2747860228 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f981028d0 0x7f8f98078fa0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.747 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.748+0000 7f8f8ffff640 1 -- 192.168.123.107:0/2747860228 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8f88009660 con 0x7f8f98103ad0 2026-03-09T19:25:16.748 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.749+0000 7f8f8ffff640 1 --2- 192.168.123.107:0/2747860228 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f98103ad0 0x7f8f980794e0 secure :-1 s=READY pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7f8f8000b730 tx=0x7f8f8000bc00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:16.748 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.749+0000 7f8f8dffb640 1 -- 192.168.123.107:0/2747860228 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8f800040f0 con 0x7f8f98103ad0 2026-03-09T19:25:16.748 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.750+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/2747860228 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8f98075dd0 con 0x7f8f98103ad0 2026-03-09T19:25:16.748 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.750+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/2747860228 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8f980763a0 con 0x7f8f98103ad0 2026-03-09T19:25:16.748 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.750+0000 7f8f8dffb640 1 -- 192.168.123.107:0/2747860228 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8f800027a0 con 0x7f8f98103ad0 2026-03-09T19:25:16.748 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.750+0000 7f8f8dffb640 1 -- 192.168.123.107:0/2747860228 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8f8000ca50 con 0x7f8f98103ad0 2026-03-09T19:25:16.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.752+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/2747860228 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8f98102cd0 con 0x7f8f98103ad0 2026-03-09T19:25:16.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.754+0000 7f8f8dffb640 1 -- 192.168.123.107:0/2747860228 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f8f8000cc90 con 0x7f8f98103ad0 2026-03-09T19:25:16.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.754+0000 7f8f8dffb640 1 --2- 
192.168.123.107:0/2747860228 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8f6c076290 0x7f8f6c078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:16.756 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.754+0000 7f8f9ce1e640 1 --2- 192.168.123.107:0/2747860228 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8f6c076290 0x7f8f6c078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:16.756 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.755+0000 7f8f8dffb640 1 -- 192.168.123.107:0/2747860228 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f8f80096ca0 con 0x7f8f98103ad0 2026-03-09T19:25:16.756 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.755+0000 7f8f9ce1e640 1 --2- 192.168.123.107:0/2747860228 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8f6c076290 0x7f8f6c078750 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f8f88002410 tx=0x7f8f88002c20 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:16.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.760+0000 7f8f8dffb640 1 -- 192.168.123.107:0/2747860228 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f8f80060370 con 0x7f8f98103ad0 2026-03-09T19:25:16.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.889+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/2747860228 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f8f98108420 con 0x7f8f6c076290 2026-03-09T19:25:16.903 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.905+0000 7f8f8dffb640 1 -- 192.168.123.107:0/2747860228 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3624 (secure 0 0 0) 0x7f8f98108420 con 0x7f8f6c076290 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (2m) 45s ago 2m 22.6M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (3m) 45s ago 3m 8284k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (2m) 46s ago 2m 8644k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (3m) 45s ago 3m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2799ea3e4bf3 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (2m) 46s ago 2m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 81fc95c210b6 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (2m) 45s ago 2m 79.7M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (52s) 45s ago 52s 12.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (50s) 45s ago 49s 14.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (49s) 46s ago 49s 16.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:25:16.905 
INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (51s) 46s ago 51s 17.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:9283,8765,8443 running (3m) 45s ago 3m 540M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 706e626ecd10 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (2m) 46s ago 2m 485M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 880604c16b45 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (3m) 45s ago 3m 53.6M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ccb644205fb3 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (2m) 46s ago 2m 49.4M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8d7b1da9e1e2 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (2m) 45s ago 2m 13.8M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (2m) 46s ago 2m 15.3M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (2m) 45s ago 2m 46.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d7417e3377af 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (110s) 45s ago 110s 67.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (100s) 45s ago 100s 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (92s) 46s ago 92s 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (83s) 46s ago 83s 67.3M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 
2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (74s) 46s ago 74s 65.0M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:25:16.905 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (2m) 45s ago 2m 39.2M - 2.43.0 a07b618ecd1d 238baaac36ff 2026-03-09T19:25:16.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.907+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/2747860228 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8f6c076290 msgr2=0x7f8f6c078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.907+0000 7f8f9f0a9640 1 --2- 192.168.123.107:0/2747860228 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8f6c076290 0x7f8f6c078750 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f8f88002410 tx=0x7f8f88002c20 comp rx=0 tx=0).stop 2026-03-09T19:25:16.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.907+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/2747860228 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f98103ad0 msgr2=0x7f8f980794e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.907+0000 7f8f9f0a9640 1 --2- 192.168.123.107:0/2747860228 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f98103ad0 0x7f8f980794e0 secure :-1 s=READY pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7f8f8000b730 tx=0x7f8f8000bc00 comp rx=0 tx=0).stop 2026-03-09T19:25:16.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.909+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/2747860228 shutdown_connections 2026-03-09T19:25:16.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.909+0000 7f8f9f0a9640 1 --2- 192.168.123.107:0/2747860228 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f8f6c076290 0x7f8f6c078750 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.909+0000 7f8f9f0a9640 1 --2- 192.168.123.107:0/2747860228 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f98103ad0 0x7f8f980794e0 unknown :-1 s=CLOSED pgs=299 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.909+0000 7f8f9f0a9640 1 --2- 192.168.123.107:0/2747860228 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f981028d0 0x7f8f98078fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.909+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/2747860228 >> 192.168.123.107:0/2747860228 conn(0x7f8f980fe060 msgr2=0x7f8f98106d00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:16.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.911+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/2747860228 shutdown_connections 2026-03-09T19:25:16.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.911+0000 7f8f9f0a9640 1 -- 192.168.123.107:0/2747860228 wait complete. 
2026-03-09T19:25:16.991 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.993+0000 7f4387fff640 1 -- 192.168.123.107:0/1529567356 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43880719a0 msgr2=0x7f4388071da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.991 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.993+0000 7f4387fff640 1 --2- 192.168.123.107:0/1529567356 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43880719a0 0x7f4388071da0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f4380009f90 tx=0x7f438002f440 comp rx=0 tx=0).stop 2026-03-09T19:25:16.991 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.993+0000 7f4387fff640 1 -- 192.168.123.107:0/1529567356 shutdown_connections 2026-03-09T19:25:16.992 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.993+0000 7f4387fff640 1 --2- 192.168.123.107:0/1529567356 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4388072370 0x7f438810c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.992 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.993+0000 7f4387fff640 1 --2- 192.168.123.107:0/1529567356 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43880719a0 0x7f4388071da0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.992 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.993+0000 7f4387fff640 1 -- 192.168.123.107:0/1529567356 >> 192.168.123.107:0/1529567356 conn(0x7f438806d4f0 msgr2=0x7f438806f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:16.992 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.994+0000 7f4387fff640 1 -- 192.168.123.107:0/1529567356 shutdown_connections 2026-03-09T19:25:16.992 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.994+0000 7f4387fff640 1 -- 192.168.123.107:0/1529567356 
wait complete. 2026-03-09T19:25:16.992 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.995+0000 7f4387fff640 1 Processor -- start 2026-03-09T19:25:16.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.995+0000 7f4387fff640 1 -- start start 2026-03-09T19:25:16.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.995+0000 7f4387fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4388072370 0x7f4388115940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:16.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.995+0000 7f4387fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43881172f0 0x7f4388115e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:16.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.995+0000 7f4387fff640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4388116450 con 0x7f4388072370 2026-03-09T19:25:16.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.995+0000 7f4387fff640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f43881165c0 con 0x7f43881172f0 2026-03-09T19:25:16.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.996+0000 7f4386ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4388072370 0x7f4388115940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:16.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.996+0000 7f4386ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4388072370 0x7f4388115940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.107:34988/0 (socket says 192.168.123.107:34988) 2026-03-09T19:25:16.995 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.996+0000 7f4386ffd640 1 -- 192.168.123.107:0/500824625 learned_addr learned my addr 192.168.123.107:0/500824625 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:16.995 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.997+0000 7f43867fc640 1 --2- 192.168.123.107:0/500824625 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43881172f0 0x7f4388115e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:16.995 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.997+0000 7f43867fc640 1 -- 192.168.123.107:0/500824625 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4388072370 msgr2=0x7f4388115940 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:16.995 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.997+0000 7f43867fc640 1 --2- 192.168.123.107:0/500824625 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4388072370 0x7f4388115940 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:16.995 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.997+0000 7f43867fc640 1 -- 192.168.123.107:0/500824625 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4380009c40 con 0x7f43881172f0 2026-03-09T19:25:16.995 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.998+0000 7f43867fc640 1 --2- 192.168.123.107:0/500824625 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43881172f0 0x7f4388115e80 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f437000e970 tx=0x7f437000ee40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:25:16.996 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.998+0000 7f4367fff640 1 -- 192.168.123.107:0/500824625 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f437000ccb0 con 0x7f43881172f0 2026-03-09T19:25:16.996 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.998+0000 7f4387fff640 1 -- 192.168.123.107:0/500824625 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f43881b5700 con 0x7f43881172f0 2026-03-09T19:25:16.996 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.998+0000 7f4367fff640 1 -- 192.168.123.107:0/500824625 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4370004590 con 0x7f43881172f0 2026-03-09T19:25:16.996 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.998+0000 7f4367fff640 1 -- 192.168.123.107:0/500824625 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4370010640 con 0x7f43881172f0 2026-03-09T19:25:16.998 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:16.999+0000 7f4387fff640 1 -- 192.168.123.107:0/500824625 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f43881b5b70 con 0x7f43881172f0 2026-03-09T19:25:16.999 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.001+0000 7f4367fff640 1 -- 192.168.123.107:0/500824625 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f4370002850 con 0x7f43881172f0 2026-03-09T19:25:16.999 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.001+0000 7f4367fff640 1 --2- 192.168.123.107:0/500824625 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f43600761c0 0x7f4360078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:17.000 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.002+0000 7f4386ffd640 1 --2- 192.168.123.107:0/500824625 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f43600761c0 0x7f4360078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:17.000 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.002+0000 7f4386ffd640 1 --2- 192.168.123.107:0/500824625 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f43600761c0 0x7f4360078680 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f43800092b0 tx=0x7f438003a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:17.000 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.002+0000 7f4367fff640 1 -- 192.168.123.107:0/500824625 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f4370014070 con 0x7f43881172f0 2026-03-09T19:25:17.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.003+0000 7f4387fff640 1 -- 192.168.123.107:0/500824625 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f43880719a0 con 0x7f43881172f0 2026-03-09T19:25:17.008 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.007+0000 7f4367fff640 1 -- 192.168.123.107:0/500824625 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f437009c050 con 0x7f43881172f0 2026-03-09T19:25:17.143 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.144+0000 7f4387fff640 1 -- 192.168.123.107:0/500824625 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f43881185c0 con 0x7f43881172f0 2026-03-09T19:25:17.144 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.146+0000 7f4367fff640 1 -- 192.168.123.107:0/500824625 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f4370060730 con 0x7f43881172f0 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 14 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:25:17.145 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:25:17.147 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.149+0000 7f4387fff640 1 -- 192.168.123.107:0/500824625 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f43600761c0 msgr2=0x7f4360078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.149+0000 7f4387fff640 1 --2- 192.168.123.107:0/500824625 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f43600761c0 0x7f4360078680 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f43800092b0 tx=0x7f438003a040 comp rx=0 tx=0).stop 2026-03-09T19:25:17.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.149+0000 7f4387fff640 1 -- 192.168.123.107:0/500824625 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43881172f0 msgr2=0x7f4388115e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.150+0000 7f4387fff640 1 --2- 192.168.123.107:0/500824625 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f43881172f0 0x7f4388115e80 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f437000e970 tx=0x7f437000ee40 comp rx=0 tx=0).stop 2026-03-09T19:25:17.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.150+0000 7f4387fff640 1 -- 192.168.123.107:0/500824625 shutdown_connections 2026-03-09T19:25:17.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.150+0000 7f4387fff640 1 --2- 192.168.123.107:0/500824625 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f43600761c0 0x7f4360078680 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.150+0000 7f4387fff640 1 --2- 192.168.123.107:0/500824625 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f43881172f0 0x7f4388115e80 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.150+0000 7f4387fff640 1 --2- 192.168.123.107:0/500824625 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4388072370 0x7f4388115940 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.150+0000 7f4387fff640 1 -- 192.168.123.107:0/500824625 >> 192.168.123.107:0/500824625 conn(0x7f438806d4f0 msgr2=0x7f4388070410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:17.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.151+0000 7f4387fff640 1 -- 192.168.123.107:0/500824625 shutdown_connections 2026-03-09T19:25:17.149 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.151+0000 7f4387fff640 1 -- 192.168.123.107:0/500824625 wait complete. 
2026-03-09T19:25:17.217 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.219+0000 7f626811f640 1 -- 192.168.123.107:0/188115922 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f62601028d0 msgr2=0x7f6260102cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.217 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.219+0000 7f626811f640 1 --2- 192.168.123.107:0/188115922 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f62601028d0 0x7f6260102cd0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f62500099b0 tx=0x7f625002f220 comp rx=0 tx=0).stop 2026-03-09T19:25:17.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.219+0000 7f626811f640 1 -- 192.168.123.107:0/188115922 shutdown_connections 2026-03-09T19:25:17.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.219+0000 7f626811f640 1 --2- 192.168.123.107:0/188115922 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6260103ad0 0x7f6260103f50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.219+0000 7f626811f640 1 --2- 192.168.123.107:0/188115922 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f62601028d0 0x7f6260102cd0 secure :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f62500099b0 tx=0x7f625002f220 comp rx=0 tx=0).stop 2026-03-09T19:25:17.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.219+0000 7f626811f640 1 -- 192.168.123.107:0/188115922 >> 192.168.123.107:0/188115922 conn(0x7f62600fe060 msgr2=0x7f62601004a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:17.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.220+0000 7f626811f640 1 -- 192.168.123.107:0/188115922 shutdown_connections 2026-03-09T19:25:17.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.220+0000 7f626811f640 1 -- 
192.168.123.107:0/188115922 wait complete. 2026-03-09T19:25:17.219 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.221+0000 7f626811f640 1 Processor -- start 2026-03-09T19:25:17.219 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.221+0000 7f626811f640 1 -- start start 2026-03-09T19:25:17.219 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.221+0000 7f626811f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6260103ad0 0x7f626006f980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:17.219 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.221+0000 7f626811f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6260072f20 0x7f626006fec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:17.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.221+0000 7f626811f640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6260070400 con 0x7f6260103ad0 2026-03-09T19:25:17.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.221+0000 7f626811f640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6260070570 con 0x7f6260072f20 2026-03-09T19:25:17.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.222+0000 7f6265693640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6260072f20 0x7f626006fec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:17.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.222+0000 7f6265693640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6260072f20 0x7f626006fec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:48614/0 (socket says 192.168.123.107:48614) 2026-03-09T19:25:17.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.222+0000 7f6265693640 1 -- 192.168.123.107:0/2356262913 learned_addr learned my addr 192.168.123.107:0/2356262913 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:17.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.222+0000 7f6265e94640 1 --2- 192.168.123.107:0/2356262913 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6260103ad0 0x7f626006f980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:17.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.222+0000 7f6265e94640 1 -- 192.168.123.107:0/2356262913 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6260072f20 msgr2=0x7f626006fec0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.222+0000 7f6265e94640 1 --2- 192.168.123.107:0/2356262913 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6260072f20 0x7f626006fec0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.222+0000 7f6265e94640 1 -- 192.168.123.107:0/2356262913 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6250009660 con 0x7f6260103ad0 2026-03-09T19:25:17.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.223+0000 7f6265e94640 1 --2- 192.168.123.107:0/2356262913 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6260103ad0 0x7f626006f980 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7f625002f730 tx=0x7f6250002ed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:17.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.225+0000 7f624effd640 1 -- 192.168.123.107:0/2356262913 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f625003d070 con 0x7f6260103ad0 2026-03-09T19:25:17.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.225+0000 7f624effd640 1 -- 192.168.123.107:0/2356262913 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6250004510 con 0x7f6260103ad0 2026-03-09T19:25:17.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.225+0000 7f624effd640 1 -- 192.168.123.107:0/2356262913 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6250038c60 con 0x7f6260103ad0 2026-03-09T19:25:17.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.225+0000 7f626811f640 1 -- 192.168.123.107:0/2356262913 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f62601acfb0 con 0x7f6260103ad0 2026-03-09T19:25:17.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.225+0000 7f626811f640 1 -- 192.168.123.107:0/2356262913 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f62601ad3c0 con 0x7f6260103ad0 2026-03-09T19:25:17.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.226+0000 7f626811f640 1 -- 192.168.123.107:0/2356262913 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6234005350 con 0x7f6260103ad0 2026-03-09T19:25:17.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.230+0000 7f624effd640 1 -- 192.168.123.107:0/2356262913 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f625002fc90 con 0x7f6260103ad0 2026-03-09T19:25:17.229 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.231+0000 7f624effd640 1 --2- 192.168.123.107:0/2356262913 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f623c076000 0x7f623c0784c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:17.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.231+0000 7f6265693640 1 --2- 192.168.123.107:0/2356262913 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f623c076000 0x7f623c0784c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:17.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.232+0000 7f6265693640 1 --2- 192.168.123.107:0/2356262913 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f623c076000 0x7f623c0784c0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f6260070ea0 tx=0x7f6254009290 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:17.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.232+0000 7f624effd640 1 -- 192.168.123.107:0/2356262913 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f62500bc840 con 0x7f6260103ad0 2026-03-09T19:25:17.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.232+0000 7f624effd640 1 -- 192.168.123.107:0/2356262913 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6250085d70 con 0x7f6260103ad0 2026-03-09T19:25:17.389 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.389+0000 7f626811f640 1 -- 192.168.123.107:0/2356262913 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f62340058d0 con 
0x7f6260103ad0 2026-03-09T19:25:17.397 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:25:17.397 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:25:17.397 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:25:17.397 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:25:17.397 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:25:17.397 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:25:17.397 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:25:17.398 
INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout: 
2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.396+0000 7f624effd640 1 -- 192.168.123.107:0/2356262913 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1856 (secure 0 0 0) 0x7f6250085710 con 0x7f6260103ad0 2026-03-09T19:25:17.398 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:25:17.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.401+0000 7f626811f640 1 -- 192.168.123.107:0/2356262913 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f623c076000 msgr2=0x7f623c0784c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.401+0000 7f626811f640 1 --2- 192.168.123.107:0/2356262913 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f623c076000 0x7f623c0784c0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f6260070ea0 tx=0x7f6254009290 comp rx=0 tx=0).stop 2026-03-09T19:25:17.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.401+0000 7f626811f640 1 -- 192.168.123.107:0/2356262913 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6260103ad0 msgr2=0x7f626006f980 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.401+0000 7f626811f640 1 --2- 192.168.123.107:0/2356262913 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6260103ad0 0x7f626006f980 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7f625002f730 tx=0x7f6250002ed0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.402+0000 7f626811f640 1 -- 192.168.123.107:0/2356262913 shutdown_connections 2026-03-09T19:25:17.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.402+0000 7f626811f640 1 --2- 192.168.123.107:0/2356262913 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f623c076000 0x7f623c0784c0 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.402+0000 7f626811f640 1 --2- 192.168.123.107:0/2356262913 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6260072f20 0x7f626006fec0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.402+0000 7f626811f640 1 --2- 192.168.123.107:0/2356262913 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6260103ad0 0x7f626006f980 unknown :-1 s=CLOSED pgs=300 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.402+0000 7f626811f640 1 -- 192.168.123.107:0/2356262913 >> 192.168.123.107:0/2356262913 conn(0x7f62600fe060 msgr2=0x7f6260104cf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:17.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.402+0000 7f626811f640 1 -- 192.168.123.107:0/2356262913 shutdown_connections 2026-03-09T19:25:17.400 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.402+0000 7f626811f640 1 -- 192.168.123.107:0/2356262913 wait complete. 2026-03-09T19:25:17.467 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.469+0000 7f41a0f0d640 1 -- 192.168.123.107:0/1127327254 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f419c072390 msgr2=0x7f419c10c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.467 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.469+0000 7f41a0f0d640 1 --2- 192.168.123.107:0/1127327254 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f419c072390 0x7f419c10c590 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f419400b0a0 tx=0x7f419402f4c0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.468 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.469+0000 7f41a0f0d640 1 -- 192.168.123.107:0/1127327254 shutdown_connections 2026-03-09T19:25:17.468 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.469+0000 7f41a0f0d640 1 --2- 192.168.123.107:0/1127327254 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f419c072390 0x7f419c10c590 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.468 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.469+0000 7f41a0f0d640 1 --2- 192.168.123.107:0/1127327254 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f419c0719c0 0x7f419c071dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.468 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.469+0000 7f41a0f0d640 1 -- 192.168.123.107:0/1127327254 >> 192.168.123.107:0/1127327254 conn(0x7f419c06d4f0 msgr2=0x7f419c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:17.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.470+0000 7f41a0f0d640 1 -- 192.168.123.107:0/1127327254 shutdown_connections 
2026-03-09T19:25:17.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.470+0000 7f41a0f0d640 1 -- 192.168.123.107:0/1127327254 wait complete. 2026-03-09T19:25:17.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.471+0000 7f41a0f0d640 1 Processor -- start 2026-03-09T19:25:17.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.471+0000 7f41a0f0d640 1 -- start start 2026-03-09T19:25:17.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.471+0000 7f41a0f0d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f419c0719c0 0x7f419c115a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:17.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.471+0000 7f41a0f0d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f419c072390 0x7f419c115f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:17.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.471+0000 7f41a0f0d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f419c117390 con 0x7f419c072390 2026-03-09T19:25:17.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.471+0000 7f41a0f0d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f419c117500 con 0x7f419c0719c0 2026-03-09T19:25:17.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.471+0000 7f419a575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f419c0719c0 0x7f419c115a10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:17.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.471+0000 7f419a575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f419c0719c0 
0x7f419c115a10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:48634/0 (socket says 192.168.123.107:48634) 2026-03-09T19:25:17.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.471+0000 7f419a575640 1 -- 192.168.123.107:0/3005269256 learned_addr learned my addr 192.168.123.107:0/3005269256 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:17.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.472+0000 7f4199d74640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f419c072390 0x7f419c115f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:17.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.472+0000 7f419a575640 1 -- 192.168.123.107:0/3005269256 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f419c072390 msgr2=0x7f419c115f50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.472+0000 7f419a575640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f419c072390 0x7f419c115f50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.471 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.472+0000 7f419a575640 1 -- 192.168.123.107:0/3005269256 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4194009d00 con 0x7f419c0719c0 2026-03-09T19:25:17.472 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.472+0000 7f4199d74640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f419c072390 0x7f419c115f50 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:25:17.472 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.473+0000 7f419a575640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f419c0719c0 0x7f419c115a10 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f418400b4f0 tx=0x7f418400b9c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:17.472 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.473+0000 7f41837fe640 1 -- 192.168.123.107:0/3005269256 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4184004280 con 0x7f419c0719c0 2026-03-09T19:25:17.472 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.473+0000 7f41837fe640 1 -- 192.168.123.107:0/3005269256 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4184002c60 con 0x7f419c0719c0 2026-03-09T19:25:17.472 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.474+0000 7f41837fe640 1 -- 192.168.123.107:0/3005269256 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4184010b80 con 0x7f419c0719c0 2026-03-09T19:25:17.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.474+0000 7f41a0f0d640 1 -- 192.168.123.107:0/3005269256 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f419c116550 con 0x7f419c0719c0 2026-03-09T19:25:17.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.474+0000 7f41a0f0d640 1 -- 192.168.123.107:0/3005269256 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f419c11d740 con 0x7f419c0719c0 2026-03-09T19:25:17.473 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.475+0000 7f41a0f0d640 1 -- 192.168.123.107:0/3005269256 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f419c071dc0 con 0x7f419c0719c0 2026-03-09T19:25:17.475 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.476+0000 7f41837fe640 1 -- 192.168.123.107:0/3005269256 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f41840043e0 con 0x7f419c0719c0 2026-03-09T19:25:17.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.482+0000 7f41837fe640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4174076080 0x7f4174078540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:17.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.482+0000 7f4199d74640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4174076080 0x7f4174078540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:17.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.483+0000 7f4199d74640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4174076080 0x7f4174078540 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f419c117080 tx=0x7f419403a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:17.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.483+0000 7f41837fe640 1 -- 192.168.123.107:0/3005269256 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f4184096ce0 con 0x7f419c0719c0 2026-03-09T19:25:17.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.483+0000 7f41837fe640 1 -- 192.168.123.107:0/3005269256 
<== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f418409d280 con 0x7f419c0719c0 2026-03-09T19:25:17.598 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.600+0000 7f41a0f0d640 1 -- 192.168.123.107:0/3005269256 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f419c10bf80 con 0x7f4174076080 2026-03-09T19:25:17.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.601+0000 7f41837fe640 1 -- 192.168.123.107:0/3005269256 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f419c10bf80 con 0x7f4174076080 2026-03-09T19:25:17.599 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:25:17.599 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-09T19:25:17.599 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:25:17.599 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:25:17.599 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [], 2026-03-09T19:25:17.599 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "", 2026-03-09T19:25:17.599 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-09T19:25:17.599 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:25:17.599 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:25:17.603 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.605+0000 7f41817fa640 1 -- 192.168.123.107:0/3005269256 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4174076080 msgr2=0x7f4174078540 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.606+0000 7f41817fa640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4174076080 0x7f4174078540 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f419c117080 tx=0x7f419403a040 comp rx=0 tx=0).stop 2026-03-09T19:25:17.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.606+0000 7f41817fa640 1 -- 192.168.123.107:0/3005269256 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f419c0719c0 msgr2=0x7f419c115a10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.606+0000 7f41817fa640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f419c0719c0 0x7f419c115a10 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f418400b4f0 tx=0x7f418400b9c0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.606+0000 7f41817fa640 1 -- 192.168.123.107:0/3005269256 shutdown_connections 2026-03-09T19:25:17.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.606+0000 7f41817fa640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4174076080 0x7f4174078540 secure :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f419c117080 tx=0x7f419403a040 comp rx=0 tx=0).stop 2026-03-09T19:25:17.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.606+0000 7f41817fa640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f419c072390 0x7f419c115f50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.606+0000 
7f41817fa640 1 --2- 192.168.123.107:0/3005269256 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f419c0719c0 0x7f419c115a10 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.605 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.606+0000 7f41817fa640 1 -- 192.168.123.107:0/3005269256 >> 192.168.123.107:0/3005269256 conn(0x7f419c06d4f0 msgr2=0x7f419c10a860 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:17.605 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.607+0000 7f41817fa640 1 -- 192.168.123.107:0/3005269256 shutdown_connections 2026-03-09T19:25:17.605 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.607+0000 7f41817fa640 1 -- 192.168.123.107:0/3005269256 wait complete. 2026-03-09T19:25:17.669 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.671+0000 7f132c11e640 1 -- 192.168.123.107:0/1352563476 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f13240719c0 msgr2=0x7f1324071dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.670 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.671+0000 7f132c11e640 1 --2- 192.168.123.107:0/1352563476 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f13240719c0 0x7f1324071dc0 secure :-1 s=READY pgs=301 cs=0 l=1 rev1=1 crypto rx=0x7f13200099b0 tx=0x7f132002f240 comp rx=0 tx=0).stop 2026-03-09T19:25:17.670 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.671+0000 7f132c11e640 1 -- 192.168.123.107:0/1352563476 shutdown_connections 2026-03-09T19:25:17.670 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.671+0000 7f132c11e640 1 --2- 192.168.123.107:0/1352563476 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1324072390 0x7f132410c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.670 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.671+0000 7f132c11e640 1 --2- 192.168.123.107:0/1352563476 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f13240719c0 0x7f1324071dc0 unknown :-1 s=CLOSED pgs=301 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.670 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.671+0000 7f132c11e640 1 -- 192.168.123.107:0/1352563476 >> 192.168.123.107:0/1352563476 conn(0x7f132406d4f0 msgr2=0x7f132406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:17.672 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.674+0000 7f132c11e640 1 -- 192.168.123.107:0/1352563476 shutdown_connections 2026-03-09T19:25:17.672 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.674+0000 7f132c11e640 1 -- 192.168.123.107:0/1352563476 wait complete. 2026-03-09T19:25:17.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.674+0000 7f132c11e640 1 Processor -- start 2026-03-09T19:25:17.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.674+0000 7f132c11e640 1 -- start start 2026-03-09T19:25:17.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.674+0000 7f132c11e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f13240719c0 0x7f132411a420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:17.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.674+0000 7f132c11e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1324072390 0x7f132411c970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:17.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.674+0000 7f132c11e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f132411ceb0 con 0x7f13240719c0 2026-03-09T19:25:17.674 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.674+0000 7f132c11e640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f132411d020 con 0x7f1324072390 2026-03-09T19:25:17.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.674+0000 7f1329692640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1324072390 0x7f132411c970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:17.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.674+0000 7f1329692640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1324072390 0x7f132411c970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:48654/0 (socket says 192.168.123.107:48654) 2026-03-09T19:25:17.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.674+0000 7f1329692640 1 -- 192.168.123.107:0/1108447029 learned_addr learned my addr 192.168.123.107:0/1108447029 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:17.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.675+0000 7f1329692640 1 -- 192.168.123.107:0/1108447029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f13240719c0 msgr2=0x7f132411a420 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.675+0000 7f1329692640 1 --2- 192.168.123.107:0/1108447029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f13240719c0 0x7f132411a420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.675+0000 7f1329692640 1 -- 192.168.123.107:0/1108447029 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1320009660 con 0x7f1324072390 2026-03-09T19:25:17.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.675+0000 7f1329692640 1 --2- 192.168.123.107:0/1108447029 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1324072390 0x7f132411c970 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f131c00ef10 tx=0x7f131c00c560 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:17.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.676+0000 7f1312ffd640 1 -- 192.168.123.107:0/1108447029 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f131c009280 con 0x7f1324072390 2026-03-09T19:25:17.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.676+0000 7f132c11e640 1 -- 192.168.123.107:0/1108447029 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f132411d2a0 con 0x7f1324072390 2026-03-09T19:25:17.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.676+0000 7f132c11e640 1 -- 192.168.123.107:0/1108447029 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f132411d7f0 con 0x7f1324072390 2026-03-09T19:25:17.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.676+0000 7f1312ffd640 1 -- 192.168.123.107:0/1108447029 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f131c00f040 con 0x7f1324072390 2026-03-09T19:25:17.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.676+0000 7f1312ffd640 1 -- 192.168.123.107:0/1108447029 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f131c0048f0 con 0x7f1324072390 2026-03-09T19:25:17.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.678+0000 7f1312ffd640 1 -- 
192.168.123.107:0/1108447029 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f131c0040d0 con 0x7f1324072390 2026-03-09T19:25:17.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.679+0000 7f132c11e640 1 -- 192.168.123.107:0/1108447029 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f12ec005350 con 0x7f1324072390 2026-03-09T19:25:17.694 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.696+0000 7f1312ffd640 1 --2- 192.168.123.107:0/1108447029 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1300076170 0x7f1300078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:17.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.700+0000 7f1329e93640 1 --2- 192.168.123.107:0/1108447029 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1300076170 0x7f1300078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:17.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.700+0000 7f1312ffd640 1 -- 192.168.123.107:0/1108447029 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f131c097490 con 0x7f1324072390 2026-03-09T19:25:17.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.700+0000 7f1329e93640 1 --2- 192.168.123.107:0/1108447029 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1300076170 0x7f1300078630 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f1320002c30 tx=0x7f13200047c0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:17.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.701+0000 7f1312ffd640 1 
-- 192.168.123.107:0/1108447029 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f131c099a30 con 0x7f1324072390 2026-03-09T19:25:17.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.873+0000 7f132c11e640 1 -- 192.168.123.107:0/1108447029 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f12ec0051c0 con 0x7f1324072390 2026-03-09T19:25:17.872 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:25:17.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.873+0000 7f1312ffd640 1 -- 192.168.123.107:0/1108447029 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f131c060a80 con 0x7f1324072390 2026-03-09T19:25:17.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.876+0000 7f132c11e640 1 -- 192.168.123.107:0/1108447029 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1300076170 msgr2=0x7f1300078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.876+0000 7f132c11e640 1 --2- 192.168.123.107:0/1108447029 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1300076170 0x7f1300078630 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f1320002c30 tx=0x7f13200047c0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.876+0000 7f132c11e640 1 -- 192.168.123.107:0/1108447029 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1324072390 msgr2=0x7f132411c970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:17.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.876+0000 7f132c11e640 1 --2- 
192.168.123.107:0/1108447029 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1324072390 0x7f132411c970 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f131c00ef10 tx=0x7f131c00c560 comp rx=0 tx=0).stop 2026-03-09T19:25:17.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.876+0000 7f132c11e640 1 -- 192.168.123.107:0/1108447029 shutdown_connections 2026-03-09T19:25:17.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.876+0000 7f132c11e640 1 --2- 192.168.123.107:0/1108447029 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1300076170 0x7f1300078630 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.876+0000 7f132c11e640 1 --2- 192.168.123.107:0/1108447029 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1324072390 0x7f132411c970 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.876+0000 7f132c11e640 1 --2- 192.168.123.107:0/1108447029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f13240719c0 0x7f132411a420 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:17.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.876+0000 7f132c11e640 1 -- 192.168.123.107:0/1108447029 >> 192.168.123.107:0/1108447029 conn(0x7f132406d4f0 msgr2=0x7f132410a860 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:17.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.876+0000 7f132c11e640 1 -- 192.168.123.107:0/1108447029 shutdown_connections 2026-03-09T19:25:17.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:17.876+0000 7f132c11e640 1 -- 192.168.123.107:0/1108447029 wait complete. 
2026-03-09T19:25:17.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:17 vm07.local ceph-mon[48545]: pgmap v103: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:17.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:17 vm07.local ceph-mon[48545]: from='client.24347 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:17.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:17 vm07.local ceph-mon[48545]: from='client.14586 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:17.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:17 vm07.local ceph-mon[48545]: from='client.14590 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:17.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:17 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/500824625' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:25:17.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:17 vm07.local ceph-mon[48545]: from='client.? 
192.168.123.107:0/2356262913' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:25:18.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:17 vm08.local ceph-mon[57794]: pgmap v103: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:18.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:17 vm08.local ceph-mon[57794]: from='client.24347 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:18.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:17 vm08.local ceph-mon[57794]: from='client.14586 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:18.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:17 vm08.local ceph-mon[57794]: from='client.14590 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:18.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:17 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/500824625' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:25:18.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:17 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/2356262913' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:25:19.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:19 vm08.local ceph-mon[57794]: from='client.24363 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:19.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:19 vm08.local ceph-mon[57794]: from='client.? 
192.168.123.107:0/1108447029' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:25:19.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:19 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:25:19.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:19 vm07.local ceph-mon[48545]: from='client.24363 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:19.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:19 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/1108447029' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:25:19.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:19 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:25:20.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:20 vm08.local ceph-mon[57794]: pgmap v104: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:20.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:20 vm07.local ceph-mon[48545]: pgmap v104: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:22.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:22 vm07.local ceph-mon[48545]: pgmap v105: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:22.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:22 vm08.local ceph-mon[57794]: pgmap v105: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:24.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:24 vm08.local ceph-mon[57794]: pgmap v106: 65 pgs: 65 
active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:24.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:24 vm07.local ceph-mon[48545]: pgmap v106: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:26.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:26 vm07.local ceph-mon[48545]: pgmap v107: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:26.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:26 vm08.local ceph-mon[57794]: pgmap v107: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:28.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:28 vm07.local ceph-mon[48545]: pgmap v108: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:28.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:28 vm08.local ceph-mon[57794]: pgmap v108: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:30.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:30 vm07.local ceph-mon[48545]: pgmap v109: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:30.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:30 vm08.local ceph-mon[57794]: pgmap v109: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:32.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:32 vm07.local ceph-mon[48545]: pgmap v110: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:32.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:32 vm08.local ceph-mon[57794]: pgmap v110: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:34.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:34 vm07.local ceph-mon[48545]: pgmap v111: 65 pgs: 65 
active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:34.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:34 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:25:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:34 vm08.local ceph-mon[57794]: pgmap v111: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:34 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:25:36.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:36 vm07.local ceph-mon[48545]: pgmap v112: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:36.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:36 vm08.local ceph-mon[57794]: pgmap v112: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:38.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:38 vm08.local ceph-mon[57794]: pgmap v113: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:38.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:38 vm07.local ceph-mon[48545]: pgmap v113: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:40.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:40 vm08.local ceph-mon[57794]: pgmap v114: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:40.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:40 vm07.local ceph-mon[48545]: pgmap v114: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:41.595 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:41 vm08.local ceph-mon[57794]: pgmap v115: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:41.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:41 vm07.local ceph-mon[48545]: pgmap v115: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:43.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:43 vm07.local ceph-mon[48545]: pgmap v116: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:43.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:43 vm08.local ceph-mon[57794]: pgmap v116: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:45.426 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:45 vm07.local ceph-mon[48545]: pgmap v117: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:45.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:45 vm08.local ceph-mon[57794]: pgmap v117: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:47.580 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:47 vm07.local ceph-mon[48545]: pgmap v118: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:47.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:47 vm08.local ceph-mon[57794]: pgmap v118: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.984+0000 7f695ed4e640 1 -- 192.168.123.107:0/4026136416 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6958072370 msgr2=0x7f695810c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.984+0000 7f695ed4e640 1 --2- 
192.168.123.107:0/4026136416 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6958072370 0x7f695810c590 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f69540099b0 tx=0x7f695402f240 comp rx=0 tx=0).stop 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.984+0000 7f695ed4e640 1 -- 192.168.123.107:0/4026136416 shutdown_connections 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.984+0000 7f695ed4e640 1 --2- 192.168.123.107:0/4026136416 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6958072370 0x7f695810c590 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.984+0000 7f695ed4e640 1 --2- 192.168.123.107:0/4026136416 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69580719a0 0x7f6958071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.984+0000 7f695ed4e640 1 -- 192.168.123.107:0/4026136416 >> 192.168.123.107:0/4026136416 conn(0x7f695806d4f0 msgr2=0x7f695806f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.984+0000 7f695ed4e640 1 -- 192.168.123.107:0/4026136416 shutdown_connections 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.984+0000 7f695ed4e640 1 -- 192.168.123.107:0/4026136416 wait complete. 
2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.984+0000 7f695ed4e640 1 Processor -- start 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.984+0000 7f695ed4e640 1 -- start start 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695ed4e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69580719a0 0x7f69581a70d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695ed4e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69581a7610 0x7f69581ac680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:47.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695ed4e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f69581a7a90 con 0x7f69581a7610 2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695ed4e640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f69581a7c00 con 0x7f69580719a0 2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695d54b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69581a7610 0x7f69581ac680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695dd4c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69580719a0 0x7f69581a70d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695dd4c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69580719a0 0x7f69581a70d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:59660/0 (socket says 192.168.123.107:59660) 2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695dd4c640 1 -- 192.168.123.107:0/3876945744 learned_addr learned my addr 192.168.123.107:0/3876945744 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695d54b640 1 -- 192.168.123.107:0/3876945744 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69580719a0 msgr2=0x7f69581a70d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695d54b640 1 --2- 192.168.123.107:0/3876945744 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69580719a0 0x7f69581a70d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695d54b640 1 -- 192.168.123.107:0/3876945744 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6954009660 con 0x7f69581a7610 2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695d54b640 1 --2- 192.168.123.107:0/3876945744 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69581a7610 0x7f69581ac680 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7f69540099b0 tx=0x7f6954004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:47.986 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f6946ffd640 1 -- 192.168.123.107:0/3876945744 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f695403d070 con 0x7f69581a7610 2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f6946ffd640 1 -- 192.168.123.107:0/3876945744 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6954004590 con 0x7f69581a7610 2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.985+0000 7f695ed4e640 1 -- 192.168.123.107:0/3876945744 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f695810ed10 con 0x7f69581a7610 2026-03-09T19:25:47.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.986+0000 7f695ed4e640 1 -- 192.168.123.107:0/3876945744 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f695810f1b0 con 0x7f69581a7610 2026-03-09T19:25:47.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.987+0000 7f6946ffd640 1 -- 192.168.123.107:0/3876945744 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6954031070 con 0x7f69581a7610 2026-03-09T19:25:47.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.987+0000 7f695ed4e640 1 -- 192.168.123.107:0/3876945744 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6920005350 con 0x7f69581a7610 2026-03-09T19:25:47.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.988+0000 7f6946ffd640 1 -- 192.168.123.107:0/3876945744 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6954038690 con 0x7f69581a7610 2026-03-09T19:25:47.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.991+0000 7f6946ffd640 1 --2- 
192.168.123.107:0/3876945744 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6934075fb0 0x7f6934078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:47.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.991+0000 7f695dd4c640 1 --2- 192.168.123.107:0/3876945744 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6934075fb0 0x7f6934078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:47.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.991+0000 7f6946ffd640 1 -- 192.168.123.107:0/3876945744 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f69540bc260 con 0x7f69581a7610 2026-03-09T19:25:47.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.991+0000 7f695dd4c640 1 --2- 192.168.123.107:0/3876945744 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6934075fb0 0x7f6934078470 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f694c005f40 tx=0x7f694c005ed0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:47.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:47.991+0000 7f6946ffd640 1 -- 192.168.123.107:0/3876945744 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f69540be770 con 0x7f69581a7610 2026-03-09T19:25:48.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.131+0000 7f695ed4e640 1 -- 192.168.123.107:0/3876945744 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6920002bf0 con 0x7f6934075fb0 
2026-03-09T19:25:48.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.133+0000 7f6946ffd640 1 -- 192.168.123.107:0/3876945744 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f6920002bf0 con 0x7f6934075fb0 2026-03-09T19:25:48.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.137+0000 7f6944ff9640 1 -- 192.168.123.107:0/3876945744 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6934075fb0 msgr2=0x7f6934078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.137+0000 7f6944ff9640 1 --2- 192.168.123.107:0/3876945744 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6934075fb0 0x7f6934078470 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f694c005f40 tx=0x7f694c005ed0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.137+0000 7f6944ff9640 1 -- 192.168.123.107:0/3876945744 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69581a7610 msgr2=0x7f69581ac680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.137+0000 7f6944ff9640 1 --2- 192.168.123.107:0/3876945744 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69581a7610 0x7f69581ac680 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7f69540099b0 tx=0x7f6954004290 comp rx=0 tx=0).stop 2026-03-09T19:25:48.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.137+0000 7f6944ff9640 1 -- 192.168.123.107:0/3876945744 shutdown_connections 2026-03-09T19:25:48.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.137+0000 7f6944ff9640 1 --2- 192.168.123.107:0/3876945744 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] 
conn(0x7f6934075fb0 0x7f6934078470 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.137+0000 7f6944ff9640 1 --2- 192.168.123.107:0/3876945744 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69581a7610 0x7f69581ac680 unknown :-1 s=CLOSED pgs=302 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.137+0000 7f6944ff9640 1 --2- 192.168.123.107:0/3876945744 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69580719a0 0x7f69581a70d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.137+0000 7f6944ff9640 1 -- 192.168.123.107:0/3876945744 >> 192.168.123.107:0/3876945744 conn(0x7f695806d4f0 msgr2=0x7f6958070390 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:48.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.137+0000 7f6944ff9640 1 -- 192.168.123.107:0/3876945744 shutdown_connections 2026-03-09T19:25:48.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.137+0000 7f6944ff9640 1 -- 192.168.123.107:0/3876945744 wait complete. 
2026-03-09T19:25:48.145 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:25:48.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.200+0000 7fef2addf640 1 -- 192.168.123.107:0/3194300186 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fef24072440 msgr2=0x7fef240771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.200+0000 7fef2addf640 1 --2- 192.168.123.107:0/3194300186 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fef24072440 0x7fef240771b0 secure :-1 s=READY pgs=303 cs=0 l=1 rev1=1 crypto rx=0x7fef1c00b0a0 tx=0x7fef1c02f4c0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.200+0000 7fef2addf640 1 -- 192.168.123.107:0/3194300186 shutdown_connections 2026-03-09T19:25:48.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.200+0000 7fef2addf640 1 --2- 192.168.123.107:0/3194300186 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fef24072440 0x7fef240771b0 unknown :-1 s=CLOSED pgs=303 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.200+0000 7fef2addf640 1 --2- 192.168.123.107:0/3194300186 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fef24071a70 0x7fef24071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.200+0000 7fef2addf640 1 -- 192.168.123.107:0/3194300186 >> 192.168.123.107:0/3194300186 conn(0x7fef2406d4f0 msgr2=0x7fef2406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.200+0000 7fef2addf640 1 -- 192.168.123.107:0/3194300186 shutdown_connections 2026-03-09T19:25:48.200 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.200+0000 7fef2addf640 1 -- 192.168.123.107:0/3194300186 wait complete. 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef2addf640 1 Processor -- start 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef2addf640 1 -- start start 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef2addf640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fef24071a70 0x7fef24084160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef2addf640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fef240827b0 0x7fef24082c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef2addf640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fef24083170 con 0x7fef24071a70 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef2addf640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fef240832e0 con 0x7fef240827b0 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef28b54640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fef24071a70 0x7fef24084160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef28b54640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fef24071a70 0x7fef24084160 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:57410/0 (socket says 192.168.123.107:57410) 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef28b54640 1 -- 192.168.123.107:0/1924370262 learned_addr learned my addr 192.168.123.107:0/1924370262 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef28b54640 1 -- 192.168.123.107:0/1924370262 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fef240827b0 msgr2=0x7fef24082c30 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef28b54640 1 --2- 192.168.123.107:0/1924370262 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fef240827b0 0x7fef24082c30 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef28b54640 1 -- 192.168.123.107:0/1924370262 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fef1c009d00 con 0x7fef24071a70 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef28b54640 1 --2- 192.168.123.107:0/1924370262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fef24071a70 0x7fef24084160 secure :-1 s=READY pgs=304 cs=0 l=1 rev1=1 crypto rx=0x7fef1400b4f0 tx=0x7fef1400b9c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef21ffb640 1 -- 192.168.123.107:0/1924370262 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fef14004280 con 0x7fef24071a70 
2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.201+0000 7fef2addf640 1 -- 192.168.123.107:0/1924370262 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fef240835c0 con 0x7fef24071a70 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.202+0000 7fef2addf640 1 -- 192.168.123.107:0/1924370262 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fef241b5c70 con 0x7fef24071a70 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.202+0000 7fef21ffb640 1 -- 192.168.123.107:0/1924370262 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fef140043e0 con 0x7fef24071a70 2026-03-09T19:25:48.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.202+0000 7fef21ffb640 1 -- 192.168.123.107:0/1924370262 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fef14010b10 con 0x7fef24071a70 2026-03-09T19:25:48.201 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.203+0000 7fef21ffb640 1 -- 192.168.123.107:0/1924370262 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fef1401a460 con 0x7fef24071a70 2026-03-09T19:25:48.201 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.204+0000 7fef21ffb640 1 --2- 192.168.123.107:0/1924370262 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fef04076290 0x7fef04078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.202 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.204+0000 7fef23fff640 1 --2- 192.168.123.107:0/1924370262 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fef04076290 0x7fef04078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 
comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:48.202 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.204+0000 7fef21ffb640 1 -- 192.168.123.107:0/1924370262 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fef14097f00 con 0x7fef24071a70 2026-03-09T19:25:48.202 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.204+0000 7fef23fff640 1 --2- 192.168.123.107:0/1924370262 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fef04076290 0x7fef04078750 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fef1c00b070 tx=0x7fef1c002750 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:48.206 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.204+0000 7fef2addf640 1 -- 192.168.123.107:0/1924370262 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feee8005350 con 0x7fef24071a70 2026-03-09T19:25:48.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.213+0000 7fef21ffb640 1 -- 192.168.123.107:0/1924370262 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fef14061430 con 0x7fef24071a70 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.332+0000 7fef2addf640 1 -- 192.168.123.107:0/1924370262 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7feee8002bf0 con 0x7fef04076290 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.333+0000 7fef21ffb640 1 -- 192.168.123.107:0/1924370262 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 
0x7feee8002bf0 con 0x7fef04076290 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.336+0000 7fef2addf640 1 -- 192.168.123.107:0/1924370262 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fef04076290 msgr2=0x7fef04078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.336+0000 7fef2addf640 1 --2- 192.168.123.107:0/1924370262 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fef04076290 0x7fef04078750 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fef1c00b070 tx=0x7fef1c002750 comp rx=0 tx=0).stop 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.336+0000 7fef2addf640 1 -- 192.168.123.107:0/1924370262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fef24071a70 msgr2=0x7fef24084160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.336+0000 7fef2addf640 1 --2- 192.168.123.107:0/1924370262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fef24071a70 0x7fef24084160 secure :-1 s=READY pgs=304 cs=0 l=1 rev1=1 crypto rx=0x7fef1400b4f0 tx=0x7fef1400b9c0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.337+0000 7fef2addf640 1 -- 192.168.123.107:0/1924370262 shutdown_connections 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.337+0000 7fef2addf640 1 --2- 192.168.123.107:0/1924370262 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fef04076290 0x7fef04078750 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.337+0000 7fef2addf640 1 --2- 192.168.123.107:0/1924370262 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fef240827b0 0x7fef24082c30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.337+0000 7fef2addf640 1 --2- 192.168.123.107:0/1924370262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fef24071a70 0x7fef24084160 unknown :-1 s=CLOSED pgs=304 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.337+0000 7fef2addf640 1 -- 192.168.123.107:0/1924370262 >> 192.168.123.107:0/1924370262 conn(0x7fef2406d4f0 msgr2=0x7fef240704b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.337+0000 7fef2addf640 1 -- 192.168.123.107:0/1924370262 shutdown_connections 2026-03-09T19:25:48.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.338+0000 7fef2addf640 1 -- 192.168.123.107:0/1924370262 wait complete. 
2026-03-09T19:25:48.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.412+0000 7fc6aa403640 1 -- 192.168.123.107:0/3505183144 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6a4072420 msgr2=0x7fc6a4077190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.412+0000 7fc6aa403640 1 --2- 192.168.123.107:0/3505183144 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6a4072420 0x7fc6a4077190 secure :-1 s=READY pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7fc69c00caa0 tx=0x7fc69c030580 comp rx=0 tx=0).stop 2026-03-09T19:25:48.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 -- 192.168.123.107:0/3505183144 shutdown_connections 2026-03-09T19:25:48.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 --2- 192.168.123.107:0/3505183144 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6a4072420 0x7fc6a4077190 unknown :-1 s=CLOSED pgs=305 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 --2- 192.168.123.107:0/3505183144 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6a4071a50 0x7fc6a4071e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 -- 192.168.123.107:0/3505183144 >> 192.168.123.107:0/3505183144 conn(0x7fc6a406d4f0 msgr2=0x7fc6a406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:48.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 -- 192.168.123.107:0/3505183144 shutdown_connections 2026-03-09T19:25:48.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 -- 192.168.123.107:0/3505183144 
wait complete. 2026-03-09T19:25:48.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 Processor -- start 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 -- start start 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6a4071a50 0x7fc6a40840b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6a4082700 0x7fc6a4082b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6a40845f0 con 0x7fc6a4082700 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.414+0000 7fc6aa403640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6a40830c0 con 0x7fc6a4071a50 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.415+0000 7fc6a8c00640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6a4082700 0x7fc6a4082b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.415+0000 7fc6a8c00640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6a4082700 0x7fc6a4082b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.107:57436/0 (socket says 192.168.123.107:57436) 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.415+0000 7fc6a8c00640 1 -- 192.168.123.107:0/1830296582 learned_addr learned my addr 192.168.123.107:0/1830296582 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.415+0000 7fc6a9401640 1 --2- 192.168.123.107:0/1830296582 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6a4071a50 0x7fc6a40840b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.415+0000 7fc6a8c00640 1 -- 192.168.123.107:0/1830296582 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6a4071a50 msgr2=0x7fc6a40840b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.415+0000 7fc6a8c00640 1 --2- 192.168.123.107:0/1830296582 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6a4071a50 0x7fc6a40840b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.415+0000 7fc6a8c00640 1 -- 192.168.123.107:0/1830296582 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc69c009d00 con 0x7fc6a4082700 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.415+0000 7fc6a8c00640 1 --2- 192.168.123.107:0/1830296582 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6a4082700 0x7fc6a4082b80 secure :-1 s=READY pgs=306 cs=0 l=1 rev1=1 crypto rx=0x7fc69c009bb0 tx=0x7fc69c009c60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-09T19:25:48.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.416+0000 7fc69a7fc640 1 -- 192.168.123.107:0/1830296582 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc69c007db0 con 0x7fc6a4082700 2026-03-09T19:25:48.416 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.417+0000 7fc6aa403640 1 -- 192.168.123.107:0/1830296582 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc6a40832f0 con 0x7fc6a4082700 2026-03-09T19:25:48.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.417+0000 7fc6aa403640 1 -- 192.168.123.107:0/1830296582 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc6a41b5bc0 con 0x7fc6a4082700 2026-03-09T19:25:48.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.418+0000 7fc6aa403640 1 -- 192.168.123.107:0/1830296582 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc6a407a870 con 0x7fc6a4082700 2026-03-09T19:25:48.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.420+0000 7fc69a7fc640 1 -- 192.168.123.107:0/1830296582 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc69c043070 con 0x7fc6a4082700 2026-03-09T19:25:48.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.420+0000 7fc69a7fc640 1 -- 192.168.123.107:0/1830296582 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc69c00c320 con 0x7fc6a4082700 2026-03-09T19:25:48.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.421+0000 7fc69a7fc640 1 -- 192.168.123.107:0/1830296582 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fc69c00c540 con 0x7fc6a4082700 2026-03-09T19:25:48.420 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.421+0000 7fc69a7fc640 1 --2- 192.168.123.107:0/1830296582 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc678076360 0x7fc678078820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.421+0000 7fc69a7fc640 1 -- 192.168.123.107:0/1830296582 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fc69c0c1680 con 0x7fc6a4082700 2026-03-09T19:25:48.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.421+0000 7fc6a9401640 1 --2- 192.168.123.107:0/1830296582 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc678076360 0x7fc678078820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:48.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.422+0000 7fc6a9401640 1 --2- 192.168.123.107:0/1830296582 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc678076360 0x7fc678078820 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fc6a0004640 tx=0x7fc6a0009290 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:48.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.422+0000 7fc69a7fc640 1 -- 192.168.123.107:0/1830296582 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fc69c08ac60 con 0x7fc6a4082700 2026-03-09T19:25:48.558 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.552+0000 7fc6aa403640 1 -- 192.168.123.107:0/1830296582 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch ps", 
"target": ["mon-mgr", ""]}) v1 -- 0x7fc6a4076190 con 0x7fc678076360 2026-03-09T19:25:48.567 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.564+0000 7fc69a7fc640 1 -- 192.168.123.107:0/1830296582 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3624 (secure 0 0 0) 0x7fc6a4076190 con 0x7fc678076360 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (2m) 76s ago 3m 22.6M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (3m) 76s ago 3m 8284k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (2m) 77s ago 2m 8644k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (3m) 76s ago 3m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2799ea3e4bf3 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (2m) 77s ago 2m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 81fc95c210b6 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (2m) 76s ago 3m 79.7M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (83s) 76s ago 83s 12.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (81s) 76s ago 81s 14.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (80s) 77s ago 80s 16.0M - 18.2.7-1055-gab47f43c 
b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (83s) 77s ago 82s 17.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:9283,8765,8443 running (4m) 76s ago 4m 540M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 706e626ecd10 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (2m) 77s ago 2m 485M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 880604c16b45 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (4m) 76s ago 4m 53.6M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ccb644205fb3 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (2m) 77s ago 2m 49.4M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8d7b1da9e1e2 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (3m) 76s ago 3m 13.8M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (2m) 77s ago 2m 15.3M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (2m) 76s ago 2m 46.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d7417e3377af 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (2m) 76s ago 2m 67.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (2m) 76s ago 2m 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (2m) 77s ago 2m 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (115s) 77s ago 114s 67.3M 4096M 
18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (106s) 77s ago 106s 65.0M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:25:48.568 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (2m) 76s ago 3m 39.2M - 2.43.0 a07b618ecd1d 238baaac36ff 2026-03-09T19:25:48.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.574+0000 7fc6aa403640 1 -- 192.168.123.107:0/1830296582 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc678076360 msgr2=0x7fc678078820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.574+0000 7fc6aa403640 1 --2- 192.168.123.107:0/1830296582 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc678076360 0x7fc678078820 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fc6a0004640 tx=0x7fc6a0009290 comp rx=0 tx=0).stop 2026-03-09T19:25:48.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.574+0000 7fc6aa403640 1 -- 192.168.123.107:0/1830296582 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6a4082700 msgr2=0x7fc6a4082b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.574+0000 7fc6aa403640 1 --2- 192.168.123.107:0/1830296582 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6a4082700 0x7fc6a4082b80 secure :-1 s=READY pgs=306 cs=0 l=1 rev1=1 crypto rx=0x7fc69c009bb0 tx=0x7fc69c009c60 comp rx=0 tx=0).stop 2026-03-09T19:25:48.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.574+0000 7fc6aa403640 1 -- 192.168.123.107:0/1830296582 shutdown_connections 2026-03-09T19:25:48.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.574+0000 7fc6aa403640 1 --2- 192.168.123.107:0/1830296582 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc678076360 0x7fc678078820 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.574+0000 7fc6aa403640 1 --2- 192.168.123.107:0/1830296582 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6a4082700 0x7fc6a4082b80 unknown :-1 s=CLOSED pgs=306 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.575+0000 7fc6aa403640 1 --2- 192.168.123.107:0/1830296582 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6a4071a50 0x7fc6a40840b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.575+0000 7fc6aa403640 1 -- 192.168.123.107:0/1830296582 >> 192.168.123.107:0/1830296582 conn(0x7fc6a406d4f0 msgr2=0x7fc6a4070430 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:48.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.575+0000 7fc6aa403640 1 -- 192.168.123.107:0/1830296582 shutdown_connections 2026-03-09T19:25:48.573 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.575+0000 7fc6aa403640 1 -- 192.168.123.107:0/1830296582 wait complete. 
2026-03-09T19:25:48.600 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:48 vm07.local ceph-mon[48545]: from='client.14604 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:48.600 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:48 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:25:48.659 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.661+0000 7f37d8c60640 1 -- 192.168.123.107:0/3804459936 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37d4075ba0 msgr2=0x7f37d4075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.659 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.661+0000 7f37d8c60640 1 --2- 192.168.123.107:0/3804459936 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37d4075ba0 0x7f37d4075fa0 secure :-1 s=READY pgs=307 cs=0 l=1 rev1=1 crypto rx=0x7f37c00099b0 tx=0x7f37c002f240 comp rx=0 tx=0).stop 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.662+0000 7f37d8c60640 1 -- 192.168.123.107:0/3804459936 shutdown_connections 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.662+0000 7f37d8c60640 1 --2- 192.168.123.107:0/3804459936 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37d4076df0 0x7f37d4077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.662+0000 7f37d8c60640 1 --2- 192.168.123.107:0/3804459936 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37d4075ba0 0x7f37d4075fa0 unknown :-1 s=CLOSED pgs=307 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.662 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.662+0000 7f37d8c60640 1 -- 192.168.123.107:0/3804459936 >> 192.168.123.107:0/3804459936 conn(0x7f37d40fe060 msgr2=0x7f37d4100480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.662+0000 7f37d8c60640 1 -- 192.168.123.107:0/3804459936 shutdown_connections 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.662+0000 7f37d8c60640 1 -- 192.168.123.107:0/3804459936 wait complete. 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.663+0000 7f37d8c60640 1 Processor -- start 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.663+0000 7f37d8c60640 1 -- start start 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.663+0000 7f37d8c60640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37d4075ba0 0x7f37d419e880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.663+0000 7f37d8c60640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37d4076df0 0x7f37d419edc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.663+0000 7f37d8c60640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f37d419f390 con 0x7f37d4076df0 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.663+0000 7f37d8c60640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f37d419f500 con 0x7f37d4075ba0 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.663+0000 7f37d37fe640 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37d4075ba0 0x7f37d419e880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.663+0000 7f37d37fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37d4075ba0 0x7f37d419e880 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:41286/0 (socket says 192.168.123.107:41286) 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.663+0000 7f37d37fe640 1 -- 192.168.123.107:0/2898450976 learned_addr learned my addr 192.168.123.107:0/2898450976 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.664+0000 7f37d2ffd640 1 --2- 192.168.123.107:0/2898450976 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37d4076df0 0x7f37d419edc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.664+0000 7f37d37fe640 1 -- 192.168.123.107:0/2898450976 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37d4076df0 msgr2=0x7f37d419edc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.664+0000 7f37d37fe640 1 --2- 192.168.123.107:0/2898450976 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37d4076df0 0x7f37d419edc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.664+0000 7f37d37fe640 1 -- 
192.168.123.107:0/2898450976 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f37c0009660 con 0x7f37d4075ba0 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.664+0000 7f37d2ffd640 1 --2- 192.168.123.107:0/2898450976 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37d4076df0 0x7f37d419edc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.664+0000 7f37d37fe640 1 --2- 192.168.123.107:0/2898450976 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37d4075ba0 0x7f37d419e880 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f37c0002410 tx=0x7f37c0002980 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.664+0000 7f37d0ff9640 1 -- 192.168.123.107:0/2898450976 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f37c003d070 con 0x7f37d4075ba0 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.664+0000 7f37d8c60640 1 -- 192.168.123.107:0/2898450976 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f37d41a3f40 con 0x7f37d4075ba0 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.664+0000 7f37d0ff9640 1 -- 192.168.123.107:0/2898450976 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f37c002fd70 con 0x7f37d4075ba0 2026-03-09T19:25:48.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.664+0000 7f37d8c60640 1 -- 192.168.123.107:0/2898450976 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f37d41a4430 con 0x7f37d4075ba0 
2026-03-09T19:25:48.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.665+0000 7f37d0ff9640 1 -- 192.168.123.107:0/2898450976 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f37c00419d0 con 0x7f37d4075ba0 2026-03-09T19:25:48.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.666+0000 7f37d0ff9640 1 -- 192.168.123.107:0/2898450976 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f37c0049050 con 0x7f37d4075ba0 2026-03-09T19:25:48.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.666+0000 7f37d8c60640 1 -- 192.168.123.107:0/2898450976 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3798005350 con 0x7f37d4075ba0 2026-03-09T19:25:48.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.666+0000 7f37d0ff9640 1 --2- 192.168.123.107:0/2898450976 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f37a8075fb0 0x7f37a8078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.666+0000 7f37d0ff9640 1 -- 192.168.123.107:0/2898450976 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f37c00bc570 con 0x7f37d4075ba0 2026-03-09T19:25:48.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.667+0000 7f37d2ffd640 1 --2- 192.168.123.107:0/2898450976 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f37a8075fb0 0x7f37a8078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:48.666 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.668+0000 7f37d2ffd640 1 --2- 192.168.123.107:0/2898450976 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f37a8075fb0 0x7f37a8078470 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f37d419fda0 tx=0x7f37c4008040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:48.669 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.671+0000 7f37d0ff9640 1 -- 192.168.123.107:0/2898450976 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f37c0085ab0 con 0x7f37d4075ba0 2026-03-09T19:25:48.812 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.813+0000 7f37d8c60640 1 -- 192.168.123.107:0/2898450976 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f3798005e10 con 0x7f37d4075ba0 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:25:48.813 
INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 14 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:25:48.813 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.814+0000 7f37d0ff9640 1 -- 192.168.123.107:0/2898450976 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f37c0085450 con 0x7f37d4075ba0 2026-03-09T19:25:48.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.819+0000 7f37ae7fc640 1 -- 192.168.123.107:0/2898450976 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f37a8075fb0 msgr2=0x7f37a8078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.819+0000 7f37ae7fc640 1 --2- 192.168.123.107:0/2898450976 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f37a8075fb0 0x7f37a8078470 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f37d419fda0 tx=0x7f37c4008040 comp rx=0 tx=0).stop 2026-03-09T19:25:48.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.819+0000 7f37ae7fc640 1 -- 192.168.123.107:0/2898450976 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37d4075ba0 msgr2=0x7f37d419e880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.819+0000 7f37ae7fc640 1 --2- 
192.168.123.107:0/2898450976 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37d4075ba0 0x7f37d419e880 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f37c0002410 tx=0x7f37c0002980 comp rx=0 tx=0).stop 2026-03-09T19:25:48.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.819+0000 7f37ae7fc640 1 -- 192.168.123.107:0/2898450976 shutdown_connections 2026-03-09T19:25:48.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.819+0000 7f37ae7fc640 1 --2- 192.168.123.107:0/2898450976 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f37a8075fb0 0x7f37a8078470 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.819+0000 7f37ae7fc640 1 --2- 192.168.123.107:0/2898450976 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37d4076df0 0x7f37d419edc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.819+0000 7f37ae7fc640 1 --2- 192.168.123.107:0/2898450976 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37d4075ba0 0x7f37d419e880 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.819+0000 7f37ae7fc640 1 -- 192.168.123.107:0/2898450976 >> 192.168.123.107:0/2898450976 conn(0x7f37d40fe060 msgr2=0x7f37d40ffbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:48.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.819+0000 7f37ae7fc640 1 -- 192.168.123.107:0/2898450976 shutdown_connections 2026-03-09T19:25:48.817 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.819+0000 7f37ae7fc640 1 -- 192.168.123.107:0/2898450976 wait complete. 
2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.885+0000 7ff9d13fe640 1 -- 192.168.123.107:0/630877563 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9cc072420 msgr2=0x7ff9cc077190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.885+0000 7ff9d13fe640 1 --2- 192.168.123.107:0/630877563 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9cc072420 0x7ff9cc077190 secure :-1 s=READY pgs=308 cs=0 l=1 rev1=1 crypto rx=0x7ff9bc008880 tx=0x7ff9bc02f120 comp rx=0 tx=0).stop 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.885+0000 7ff9d13fe640 1 -- 192.168.123.107:0/630877563 shutdown_connections 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.885+0000 7ff9d13fe640 1 --2- 192.168.123.107:0/630877563 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9cc072420 0x7ff9cc077190 unknown :-1 s=CLOSED pgs=308 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.885+0000 7ff9d13fe640 1 --2- 192.168.123.107:0/630877563 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9cc071a50 0x7ff9cc071e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.885+0000 7ff9d13fe640 1 -- 192.168.123.107:0/630877563 >> 192.168.123.107:0/630877563 conn(0x7ff9cc06d4f0 msgr2=0x7ff9cc06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.885+0000 7ff9d13fe640 1 -- 192.168.123.107:0/630877563 shutdown_connections 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.887+0000 7ff9d13fe640 1 -- 192.168.123.107:0/630877563 wait 
complete. 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9d13fe640 1 Processor -- start 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9d13fe640 1 -- start start 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9d13fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9cc071a50 0x7ff9cc081ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9d13fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9cc080110 0x7ff9cc080590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9d13fe640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9cc082000 con 0x7ff9cc071a50 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9d13fe640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9cc080ad0 con 0x7ff9cc080110 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9cbfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9cc071a50 0x7ff9cc081ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9cbfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9cc071a50 0x7ff9cc081ac0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I 
am v2:192.168.123.107:34160/0 (socket says 192.168.123.107:34160) 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9cbfff640 1 -- 192.168.123.107:0/3132872967 learned_addr learned my addr 192.168.123.107:0/3132872967 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9cbfff640 1 -- 192.168.123.107:0/3132872967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9cc080110 msgr2=0x7ff9cc080590 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9cbfff640 1 --2- 192.168.123.107:0/3132872967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9cc080110 0x7ff9cc080590 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9cbfff640 1 -- 192.168.123.107:0/3132872967 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff9bc008530 con 0x7ff9cc071a50 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.888+0000 7ff9cbfff640 1 --2- 192.168.123.107:0/3132872967 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9cc071a50 0x7ff9cc081ac0 secure :-1 s=READY pgs=309 cs=0 l=1 rev1=1 crypto rx=0x7ff9c40060b0 tx=0x7ff9c40087c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:48.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.889+0000 7ff9c97fa640 1 -- 192.168.123.107:0/3132872967 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff9c4004940 con 0x7ff9cc071a50 2026-03-09T19:25:48.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.889+0000 7ff9c97fa640 1 -- 
192.168.123.107:0/3132872967 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff9c400bbc0 con 0x7ff9cc071a50 2026-03-09T19:25:48.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.889+0000 7ff9c97fa640 1 -- 192.168.123.107:0/3132872967 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff9c4005390 con 0x7ff9cc071a50 2026-03-09T19:25:48.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.890+0000 7ff9d13fe640 1 -- 192.168.123.107:0/3132872967 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff9cc080db0 con 0x7ff9cc071a50 2026-03-09T19:25:48.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.890+0000 7ff9d13fe640 1 -- 192.168.123.107:0/3132872967 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff9cc1b5bc0 con 0x7ff9cc071a50 2026-03-09T19:25:48.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.892+0000 7ff9c97fa640 1 -- 192.168.123.107:0/3132872967 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7ff9c401c050 con 0x7ff9cc071a50 2026-03-09T19:25:48.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.892+0000 7ff9aaffd640 1 -- 192.168.123.107:0/3132872967 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff9cc072420 con 0x7ff9cc071a50 2026-03-09T19:25:48.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.893+0000 7ff9c97fa640 1 --2- 192.168.123.107:0/3132872967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff9a4076170 0x7ff9a4078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:48.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.893+0000 7ff9c97fa640 1 -- 
192.168.123.107:0/3132872967 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7ff9c4097400 con 0x7ff9cc071a50 2026-03-09T19:25:48.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.895+0000 7ff9c97fa640 1 -- 192.168.123.107:0/3132872967 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7ff9c4060960 con 0x7ff9cc071a50 2026-03-09T19:25:48.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.895+0000 7ff9cb7fe640 1 --2- 192.168.123.107:0/3132872967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff9a4076170 0x7ff9a4078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:48.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:48.899+0000 7ff9cb7fe640 1 --2- 192.168.123.107:0/3132872967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff9a4076170 0x7ff9a4078630 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7ff9cc081840 tx=0x7ff9bc0023d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:49.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.015+0000 7ff9aaffd640 1 -- 192.168.123.107:0/3132872967 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7ff9cc061c80 con 0x7ff9cc071a50 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses 
versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:25:49.017 
INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:25:49.017 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:25:49.018 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:25:49.018 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:25:49.018 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:25:49.018 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:25:49.018 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:25:49.018 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:25:49.018 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:25:49.018 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:25:49.018 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:25:49.018 
INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:25:49.018 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.018+0000 7ff9c97fa640 1 -- 192.168.123.107:0/3132872967 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1856 (secure 0 0 0) 0x7ff9c400bd30 con 0x7ff9cc071a50 2026-03-09T19:25:49.018 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:25:49.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.025+0000 7ff9d13fe640 1 -- 192.168.123.107:0/3132872967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff9a4076170 msgr2=0x7ff9a4078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:49.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.025+0000 7ff9d13fe640 1 --2- 192.168.123.107:0/3132872967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff9a4076170 0x7ff9a4078630 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7ff9cc081840 tx=0x7ff9bc0023d0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.025+0000 7ff9d13fe640 1 -- 192.168.123.107:0/3132872967 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9cc071a50 msgr2=0x7ff9cc081ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:49.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.025+0000 7ff9d13fe640 1 --2- 192.168.123.107:0/3132872967 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9cc071a50 0x7ff9cc081ac0 secure :-1 s=READY pgs=309 cs=0 l=1 rev1=1 crypto rx=0x7ff9c40060b0 tx=0x7ff9c40087c0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.024 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.026+0000 7ff9d13fe640 1 -- 192.168.123.107:0/3132872967 shutdown_connections 2026-03-09T19:25:49.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.026+0000 7ff9d13fe640 1 --2- 192.168.123.107:0/3132872967 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7ff9a4076170 0x7ff9a4078630 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.026+0000 7ff9d13fe640 1 --2- 192.168.123.107:0/3132872967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9cc080110 0x7ff9cc080590 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.026+0000 7ff9d13fe640 1 --2- 192.168.123.107:0/3132872967 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9cc071a50 0x7ff9cc081ac0 unknown :-1 s=CLOSED pgs=309 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.026+0000 7ff9d13fe640 1 -- 192.168.123.107:0/3132872967 >> 192.168.123.107:0/3132872967 conn(0x7ff9cc06d4f0 msgr2=0x7ff9cc06fd80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:49.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.026+0000 7ff9d13fe640 1 -- 192.168.123.107:0/3132872967 shutdown_connections 2026-03-09T19:25:49.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.026+0000 7ff9d13fe640 1 -- 192.168.123.107:0/3132872967 wait complete. 
2026-03-09T19:25:49.094 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.096+0000 7f9ca3fff640 1 -- 192.168.123.107:0/3383500096 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ca40719a0 msgr2=0x7f9ca4071da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:49.094 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.096+0000 7f9ca3fff640 1 --2- 192.168.123.107:0/3383500096 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ca40719a0 0x7f9ca4071da0 secure :-1 s=READY pgs=310 cs=0 l=1 rev1=1 crypto rx=0x7f9c980099b0 tx=0x7f9c9802f240 comp rx=0 tx=0).stop 2026-03-09T19:25:49.094 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.096+0000 7f9ca3fff640 1 -- 192.168.123.107:0/3383500096 shutdown_connections 2026-03-09T19:25:49.094 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.096+0000 7f9ca3fff640 1 --2- 192.168.123.107:0/3383500096 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ca40722e0 0x7f9ca4110d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.094 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.096+0000 7f9ca3fff640 1 --2- 192.168.123.107:0/3383500096 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ca40719a0 0x7f9ca4071da0 unknown :-1 s=CLOSED pgs=310 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.094 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.096+0000 7f9ca3fff640 1 -- 192.168.123.107:0/3383500096 >> 192.168.123.107:0/3383500096 conn(0x7f9ca406d4f0 msgr2=0x7f9ca406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:48 vm08.local ceph-mon[57794]: from='client.14604 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:25:48 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:25:49.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.096+0000 7f9ca3fff640 1 -- 192.168.123.107:0/3383500096 shutdown_connections 2026-03-09T19:25:49.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.096+0000 7f9ca3fff640 1 -- 192.168.123.107:0/3383500096 wait complete. 2026-03-09T19:25:49.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.097+0000 7f9ca3fff640 1 Processor -- start 2026-03-09T19:25:49.106 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.108+0000 7f9ca3fff640 1 -- start start 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.108+0000 7f9ca3fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ca40719a0 0x7f9ca41a2c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.108+0000 7f9ca3fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ca40722e0 0x7f9ca41a31a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.108+0000 7f9ca3fff640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9ca41a37a0 con 0x7f9ca40722e0 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.108+0000 7f9ca3fff640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9ca41a3910 con 0x7f9ca40719a0 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.108+0000 7f9ca27fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ca40722e0 0x7f9ca41a31a0 
unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.108+0000 7f9ca27fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ca40722e0 0x7f9ca41a31a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:34180/0 (socket says 192.168.123.107:34180) 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.108+0000 7f9ca27fc640 1 -- 192.168.123.107:0/204699646 learned_addr learned my addr 192.168.123.107:0/204699646 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.108+0000 7f9ca2ffd640 1 --2- 192.168.123.107:0/204699646 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ca40719a0 0x7f9ca41a2c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.109+0000 7f9ca27fc640 1 -- 192.168.123.107:0/204699646 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ca40719a0 msgr2=0x7f9ca41a2c60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.109+0000 7f9ca27fc640 1 --2- 192.168.123.107:0/204699646 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ca40719a0 0x7f9ca41a2c60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.109+0000 7f9ca27fc640 1 -- 192.168.123.107:0/204699646 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9c98009660 con 0x7f9ca40722e0 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.109+0000 7f9ca27fc640 1 --2- 192.168.123.107:0/204699646 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ca40722e0 0x7f9ca41a31a0 secure :-1 s=READY pgs=311 cs=0 l=1 rev1=1 crypto rx=0x7f9c8c00d8d0 tx=0x7f9c8c00dda0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.109+0000 7f9ca8a59640 1 -- 192.168.123.107:0/204699646 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c8c004490 con 0x7f9ca40722e0 2026-03-09T19:25:49.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.109+0000 7f9ca8a59640 1 -- 192.168.123.107:0/204699646 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9c8c0076c0 con 0x7f9ca40722e0 2026-03-09T19:25:49.117 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.110+0000 7f9ca8a59640 1 -- 192.168.123.107:0/204699646 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c8c002e90 con 0x7f9ca40722e0 2026-03-09T19:25:49.117 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.110+0000 7f9ca3fff640 1 -- 192.168.123.107:0/204699646 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9ca41a83a0 con 0x7f9ca40722e0 2026-03-09T19:25:49.117 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.110+0000 7f9ca3fff640 1 -- 192.168.123.107:0/204699646 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9ca41a8870 con 0x7f9ca40722e0 2026-03-09T19:25:49.117 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.111+0000 7f9ca3fff640 1 -- 192.168.123.107:0/204699646 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9c68005350 con 0x7f9ca40722e0 2026-03-09T19:25:49.117 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.111+0000 7f9ca8a59640 1 -- 192.168.123.107:0/204699646 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f9c8c00b840 con 0x7f9ca40722e0 2026-03-09T19:25:49.117 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.112+0000 7f9ca8a59640 1 --2- 192.168.123.107:0/204699646 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9c78075fb0 0x7f9c78078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:49.117 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.115+0000 7f9ca2ffd640 1 --2- 192.168.123.107:0/204699646 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9c78075fb0 0x7f9c78078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:49.117 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.115+0000 7f9ca8a59640 1 -- 192.168.123.107:0/204699646 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f9c8c0600a0 con 0x7f9ca40722e0 2026-03-09T19:25:49.117 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.115+0000 7f9ca8a59640 1 -- 192.168.123.107:0/204699646 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f9c8c09c050 con 0x7f9ca40722e0 2026-03-09T19:25:49.121 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.119+0000 7f9ca2ffd640 1 --2- 192.168.123.107:0/204699646 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9c78075fb0 0x7f9c78078470 secure 
:-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f9c98002c30 tx=0x7f9c9803a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:49.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.222+0000 7f9ca3fff640 1 -- 192.168.123.107:0/204699646 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9c68002bf0 con 0x7f9c78075fb0 2026-03-09T19:25:49.227 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.229+0000 7f9ca8a59640 1 -- 192.168.123.107:0/204699646 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f9c68002bf0 con 0x7f9c78075fb0 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [], 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "", 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.231+0000 7f9ca3fff640 1 -- 192.168.123.107:0/204699646 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9c78075fb0 msgr2=0x7f9c78078470 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.231+0000 7f9ca3fff640 1 --2- 192.168.123.107:0/204699646 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9c78075fb0 0x7f9c78078470 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f9c98002c30 tx=0x7f9c9803a040 comp rx=0 tx=0).stop 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.231+0000 7f9ca3fff640 1 -- 192.168.123.107:0/204699646 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ca40722e0 msgr2=0x7f9ca41a31a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.231+0000 7f9ca3fff640 1 --2- 192.168.123.107:0/204699646 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ca40722e0 0x7f9ca41a31a0 secure :-1 s=READY pgs=311 cs=0 l=1 rev1=1 crypto rx=0x7f9c8c00d8d0 tx=0x7f9c8c00dda0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.231+0000 7f9ca3fff640 1 -- 192.168.123.107:0/204699646 shutdown_connections 2026-03-09T19:25:49.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.231+0000 7f9ca3fff640 1 --2- 192.168.123.107:0/204699646 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9c78075fb0 0x7f9c78078470 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.231+0000 7f9ca3fff640 1 --2- 192.168.123.107:0/204699646 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ca40722e0 0x7f9ca41a31a0 unknown :-1 s=CLOSED pgs=311 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.231+0000 7f9ca3fff640 1 --2- 
192.168.123.107:0/204699646 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ca40719a0 0x7f9ca41a2c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.231+0000 7f9ca3fff640 1 -- 192.168.123.107:0/204699646 >> 192.168.123.107:0/204699646 conn(0x7f9ca406d4f0 msgr2=0x7f9ca410f1f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:49.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.231+0000 7f9ca3fff640 1 -- 192.168.123.107:0/204699646 shutdown_connections 2026-03-09T19:25:49.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.231+0000 7f9ca3fff640 1 -- 192.168.123.107:0/204699646 wait complete. 2026-03-09T19:25:49.285 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 -- 192.168.123.107:0/858986981 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4204071a70 msgr2=0x7f4204071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:49.285 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 --2- 192.168.123.107:0/858986981 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4204071a70 0x7f4204071e70 secure :-1 s=READY pgs=312 cs=0 l=1 rev1=1 crypto rx=0x7f41f40099b0 tx=0x7f41f402f220 comp rx=0 tx=0).stop 2026-03-09T19:25:49.285 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 -- 192.168.123.107:0/858986981 shutdown_connections 2026-03-09T19:25:49.285 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 --2- 192.168.123.107:0/858986981 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4204072440 0x7f42040771b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.285 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 
--2- 192.168.123.107:0/858986981 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4204071a70 0x7f4204071e70 unknown :-1 s=CLOSED pgs=312 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.285 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 -- 192.168.123.107:0/858986981 >> 192.168.123.107:0/858986981 conn(0x7f420406d4f0 msgr2=0x7f420406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:49.286 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 -- 192.168.123.107:0/858986981 shutdown_connections 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 -- 192.168.123.107:0/858986981 wait complete. 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 Processor -- start 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 -- start start 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4204072440 0x7f42040840c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4204082710 0x7f4204082b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4204084600 con 0x7f4204082710 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4208d0a640 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f42040830d0 con 0x7f4204072440 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4201d74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4204082710 0x7f4204082b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4201d74640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4204082710 0x7f4204082b90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:34196/0 (socket says 192.168.123.107:34196) 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.287+0000 7f4201d74640 1 -- 192.168.123.107:0/2160823718 learned_addr learned my addr 192.168.123.107:0/2160823718 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.288+0000 7f4201d74640 1 -- 192.168.123.107:0/2160823718 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4204072440 msgr2=0x7f42040840c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.288+0000 7f4201d74640 1 --2- 192.168.123.107:0/2160823718 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4204072440 0x7f42040840c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.288+0000 7f4201d74640 1 -- 192.168.123.107:0/2160823718 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f41f4009660 con 
0x7f4204082710 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.288+0000 7f4201d74640 1 --2- 192.168.123.107:0/2160823718 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4204082710 0x7f4204082b90 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7f41fc0047b0 tx=0x7f41fc00d4a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:49.287 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.288+0000 7f41f37fe640 1 -- 192.168.123.107:0/2160823718 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f41fc0090d0 con 0x7f4204082710 2026-03-09T19:25:49.289 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.288+0000 7f4208d0a640 1 -- 192.168.123.107:0/2160823718 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f42040833b0 con 0x7f4204082710 2026-03-09T19:25:49.289 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.288+0000 7f4208d0a640 1 -- 192.168.123.107:0/2160823718 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f42041b5c10 con 0x7f4204082710 2026-03-09T19:25:49.289 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.290+0000 7f41f37fe640 1 -- 192.168.123.107:0/2160823718 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f41fc00f040 con 0x7f4204082710 2026-03-09T19:25:49.289 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.290+0000 7f41f37fe640 1 -- 192.168.123.107:0/2160823718 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f41fc013600 con 0x7f4204082710 2026-03-09T19:25:49.289 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.291+0000 7f41f37fe640 1 -- 192.168.123.107:0/2160823718 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 
0x7f41fc0137e0 con 0x7f4204082710 2026-03-09T19:25:49.289 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.291+0000 7f41f37fe640 1 --2- 192.168.123.107:0/2160823718 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f41e4076390 0x7f41e4078850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:25:49.290 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.292+0000 7f4202575640 1 --2- 192.168.123.107:0/2160823718 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f41e4076390 0x7f41e4078850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:25:49.290 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.292+0000 7f41f37fe640 1 -- 192.168.123.107:0/2160823718 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f41fc098240 con 0x7f4204082710 2026-03-09T19:25:49.290 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.292+0000 7f4202575640 1 --2- 192.168.123.107:0/2160823718 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f41e4076390 0x7f41e4078850 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f41f40099b0 tx=0x7f41f40023d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:25:49.290 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.292+0000 7f4208d0a640 1 -- 192.168.123.107:0/2160823718 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f41d0005350 con 0x7f4204082710 2026-03-09T19:25:49.294 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.295+0000 7f41f37fe640 1 -- 192.168.123.107:0/2160823718 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f41fc061820 con 0x7f4204082710 2026-03-09T19:25:49.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.420+0000 7f4208d0a640 1 -- 192.168.123.107:0/2160823718 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f41d0005600 con 0x7f4204082710 2026-03-09T19:25:49.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.420+0000 7f41f37fe640 1 -- 192.168.123.107:0/2160823718 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f41fc018090 con 0x7f4204082710 2026-03-09T19:25:49.419 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:25:49.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.423+0000 7f41f17fa640 1 -- 192.168.123.107:0/2160823718 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f41e4076390 msgr2=0x7f41e4078850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:49.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.423+0000 7f41f17fa640 1 --2- 192.168.123.107:0/2160823718 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f41e4076390 0x7f41e4078850 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f41f40099b0 tx=0x7f41f40023d0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.423+0000 7f41f17fa640 1 -- 192.168.123.107:0/2160823718 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4204082710 msgr2=0x7f4204082b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:25:49.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.423+0000 7f41f17fa640 1 --2- 192.168.123.107:0/2160823718 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4204082710 
0x7f4204082b90 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7f41fc0047b0 tx=0x7f41fc00d4a0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.423+0000 7f41f17fa640 1 -- 192.168.123.107:0/2160823718 shutdown_connections 2026-03-09T19:25:49.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.423+0000 7f41f17fa640 1 --2- 192.168.123.107:0/2160823718 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f41e4076390 0x7f41e4078850 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.423+0000 7f41f17fa640 1 --2- 192.168.123.107:0/2160823718 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4204082710 0x7f4204082b90 unknown :-1 s=CLOSED pgs=313 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.423+0000 7f41f17fa640 1 --2- 192.168.123.107:0/2160823718 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4204072440 0x7f42040840c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:25:49.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.423+0000 7f41f17fa640 1 -- 192.168.123.107:0/2160823718 >> 192.168.123.107:0/2160823718 conn(0x7f420406d4f0 msgr2=0x7f42040753f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:25:49.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.423+0000 7f41f17fa640 1 -- 192.168.123.107:0/2160823718 shutdown_connections 2026-03-09T19:25:49.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:25:49.423+0000 7f41f17fa640 1 -- 192.168.123.107:0/2160823718 wait complete. 
2026-03-09T19:25:49.859 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:49 vm08.local ceph-mon[57794]: from='client.14608 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:49.859 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:49 vm08.local ceph-mon[57794]: pgmap v119: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:49.859 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:49 vm08.local ceph-mon[57794]: from='client.14612 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:49.859 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:49 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/2898450976' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:25:49.859 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:49 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/3132872967' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:25:49.859 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:49 vm08.local ceph-mon[57794]: from='client.14622 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:49.859 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:49 vm08.local ceph-mon[57794]: from='client.? 
192.168.123.107:0/2160823718' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:25:50.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:49 vm07.local ceph-mon[48545]: from='client.14608 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:50.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:49 vm07.local ceph-mon[48545]: pgmap v119: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:50.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:49 vm07.local ceph-mon[48545]: from='client.14612 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:50.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:49 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/2898450976' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:25:50.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:49 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/3132872967' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:25:50.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:49 vm07.local ceph-mon[48545]: from='client.14622 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:25:50.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:49 vm07.local ceph-mon[48545]: from='client.? 
192.168.123.107:0/2160823718' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:25:52.670 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:52 vm07.local ceph-mon[48545]: pgmap v120: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:53.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:52 vm08.local ceph-mon[57794]: pgmap v120: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:53.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:53 vm07.local ceph-mon[48545]: pgmap v121: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:54.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:53 vm08.local ceph-mon[57794]: pgmap v121: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:56.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:55 vm08.local ceph-mon[57794]: pgmap v122: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:56.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:55 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:25:56.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:55 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:25:56.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:55 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:25:56.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:55 vm07.local ceph-mon[48545]: pgmap v122: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:56.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:55 vm07.local 
ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:25:56.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:55 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:25:56.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:55 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:25:57.334 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:57 vm07.local ceph-mon[48545]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-09T19:25:57.334 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:57 vm07.local ceph-mon[48545]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-09T19:25:57.334 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:57 vm07.local ceph-mon[48545]: Upgrade: Need to upgrade myself (mgr.vm07.xacuym) 2026-03-09T19:25:57.334 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:57 vm07.local ceph-mon[48545]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm08 2026-03-09T19:25:57.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:57 vm08.local ceph-mon[57794]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-09T19:25:57.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:57 vm08.local ceph-mon[57794]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-09T19:25:57.345 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:57 vm08.local ceph-mon[57794]: Upgrade: Need to upgrade myself (mgr.vm07.xacuym) 2026-03-09T19:25:57.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:57 vm08.local ceph-mon[57794]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm08 2026-03-09T19:25:57.787 INFO:tasks.workunit.client.1.vm08.stderr:Updating files: 96% (13479/13941) Updating files: 97% (13523/13941) Updating files: 98% (13663/13941) Updating files: 99% (13802/13941) Updating files: 100% (13941/13941) Updating files: 100% (13941/13941), done. 2026-03-09T19:25:58.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:58 vm08.local ceph-mon[57794]: pgmap v123: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:58.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:58 vm07.local ceph-mon[48545]: pgmap v123: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr:state without impacting any branches by switching back to a branch. 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr: git switch -c 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr:Or undo this operation with: 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr: git switch - 2026-03-09T19:25:58.810 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-09T19:25:58.811 INFO:tasks.workunit.client.1.vm08.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-09T19:25:58.811 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-09T19:25:58.811 INFO:tasks.workunit.client.1.vm08.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-09T19:25:58.816 DEBUG:teuthology.orchestra.run.vm08:> cd -- /home/ubuntu/cephtest/clone.client.1/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.1 2026-03-09T19:25:58.881 INFO:tasks.workunit.client.1.vm08.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-09T19:25:58.883 INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-09T19:25:58.883 INFO:tasks.workunit.client.1.vm08.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-09T19:25:58.947 INFO:tasks.workunit.client.1.vm08.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-09T19:25:58.996 INFO:tasks.workunit.client.1.vm08.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-09T19:25:59.030 INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-09T19:25:59.031 
INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-09T19:25:59.031 INFO:tasks.workunit.client.1.vm08.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-09T19:25:59.060 INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-09T19:25:59.063 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-09T19:25:59.064 DEBUG:teuthology.orchestra.run.vm08:> dd if=/home/ubuntu/cephtest/workunits.list.client.1 of=/dev/stdout 2026-03-09T19:25:59.122 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.1... 2026-03-09T19:25:59.123 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 2026-03-09T19:25:59.123 DEBUG:teuthology.orchestra.run.vm08:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && cd -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="1" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.1 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.1 CEPH_MNT=/home/ubuntu/cephtest/mnt.1 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.1/qa/workunits/suites/fsstress.sh 2026-03-09T19:25:59.215 INFO:tasks.workunit.client.1.vm08.stderr:+ mkdir -p fsstress 2026-03-09T19:25:59.217 INFO:tasks.workunit.client.1.vm08.stderr:+ pushd fsstress 2026-03-09T19:25:59.218 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-09T19:25:59.218 INFO:tasks.workunit.client.1.vm08.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-09T19:25:59.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:25:59 vm07.local ceph-mon[48545]: pgmap v124: 
65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:25:59.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:25:59 vm08.local ceph-mon[57794]: pgmap v124: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:26:00.499 INFO:tasks.workunit.client.0.vm07.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-09T19:26:00.499 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-09T19:26:00.499 INFO:tasks.workunit.client.0.vm07.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-09T19:26:00.499 INFO:tasks.workunit.client.0.vm07.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-09T19:26:00.499 INFO:tasks.workunit.client.0.vm07.stderr:state without impacting any branches by switching back to a branch. 2026-03-09T19:26:00.499 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-09T19:26:00.499 INFO:tasks.workunit.client.0.vm07.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-09T19:26:00.500 INFO:tasks.workunit.client.0.vm07.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-09T19:26:00.500 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-09T19:26:00.500 INFO:tasks.workunit.client.0.vm07.stderr: git switch -c 2026-03-09T19:26:00.500 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-09T19:26:00.500 INFO:tasks.workunit.client.0.vm07.stderr:Or undo this operation with: 2026-03-09T19:26:00.500 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-09T19:26:00.500 INFO:tasks.workunit.client.0.vm07.stderr: git switch - 2026-03-09T19:26:00.500 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-09T19:26:00.500 INFO:tasks.workunit.client.0.vm07.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-09T19:26:00.500 INFO:tasks.workunit.client.0.vm07.stderr: 2026-03-09T19:26:00.500 INFO:tasks.workunit.client.0.vm07.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-09T19:26:00.507 DEBUG:teuthology.orchestra.run.vm07:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-09T19:26:00.566 INFO:tasks.workunit.client.0.vm07.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-09T19:26:00.568 INFO:tasks.workunit.client.0.vm07.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-09T19:26:00.568 INFO:tasks.workunit.client.0.vm07.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-09T19:26:00.677 INFO:tasks.workunit.client.0.vm07.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-09T19:26:00.713 INFO:tasks.workunit.client.0.vm07.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-09T19:26:00.741 INFO:tasks.workunit.client.0.vm07.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-09T19:26:00.742 
INFO:tasks.workunit.client.0.vm07.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-09T19:26:00.742 INFO:tasks.workunit.client.0.vm07.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-09T19:26:00.770 INFO:tasks.workunit.client.0.vm07.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-09T19:26:00.773 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:26:00.773 DEBUG:teuthology.orchestra.run.vm07:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-09T19:26:00.823 INFO:tasks.workunit.client.1.vm08.stderr:+ tar xzf ltp-full.tgz 2026-03-09T19:26:00.843 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.0... 2026-03-09T19:26:00.844 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 2026-03-09T19:26:00.844 DEBUG:teuthology.orchestra.run.vm07:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh 2026-03-09T19:26:00.919 INFO:tasks.workunit.client.0.vm07.stderr:+ mkdir -p fsstress 2026-03-09T19:26:00.922 INFO:tasks.workunit.client.0.vm07.stderr:+ pushd fsstress 2026-03-09T19:26:00.923 INFO:tasks.workunit.client.0.vm07.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-09T19:26:00.923 INFO:tasks.workunit.client.0.vm07.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-09T19:26:01.845 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:01 vm08.local ceph-mon[57794]: pgmap v125: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:26:01.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:01 vm07.local ceph-mon[48545]: pgmap v125: 65 pgs: 65 active+clean; 462 KiB data, 165 MiB used, 120 GiB / 120 GiB avail 2026-03-09T19:26:02.537 INFO:tasks.workunit.client.0.vm07.stderr:+ tar xzf ltp-full.tgz 2026-03-09T19:26:03.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:03 vm08.local ceph-mon[57794]: pgmap v126: 65 pgs: 65 active+clean; 868 KiB data, 169 MiB used, 120 GiB / 120 GiB avail; 39 KiB/s wr, 8 op/s 2026-03-09T19:26:03.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:03 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:26:03.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:03 vm07.local ceph-mon[48545]: pgmap v126: 65 pgs: 65 active+clean; 868 KiB data, 169 MiB used, 120 GiB / 120 GiB avail; 39 KiB/s wr, 8 op/s 2026-03-09T19:26:03.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:03 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:26:06.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:05 vm07.local ceph-mon[48545]: pgmap v127: 65 pgs: 65 active+clean; 17 MiB data, 311 MiB used, 120 GiB / 120 GiB avail; 1.4 MiB/s wr, 54 op/s 2026-03-09T19:26:06.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:05 vm08.local ceph-mon[57794]: pgmap v127: 65 pgs: 65 active+clean; 17 MiB data, 311 MiB used, 120 GiB / 120 GiB avail; 1.4 MiB/s wr, 54 op/s 2026-03-09T19:26:08.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:08 vm08.local ceph-mon[57794]: pgmap v128: 65 pgs: 65 active+clean; 25 MiB data, 333 
MiB used, 120 GiB / 120 GiB avail; 2.1 MiB/s wr, 84 op/s 2026-03-09T19:26:08.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:08 vm07.local ceph-mon[48545]: pgmap v128: 65 pgs: 65 active+clean; 25 MiB data, 333 MiB used, 120 GiB / 120 GiB avail; 2.1 MiB/s wr, 84 op/s 2026-03-09T19:26:10.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:10 vm07.local ceph-mon[48545]: pgmap v129: 65 pgs: 65 active+clean; 37 MiB data, 407 MiB used, 120 GiB / 120 GiB avail; 3.2 MiB/s wr, 188 op/s 2026-03-09T19:26:10.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:10 vm08.local ceph-mon[57794]: pgmap v129: 65 pgs: 65 active+clean; 37 MiB data, 407 MiB used, 120 GiB / 120 GiB avail; 3.2 MiB/s wr, 188 op/s 2026-03-09T19:26:11.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:11 vm07.local ceph-mon[48545]: pgmap v130: 65 pgs: 65 active+clean; 38 MiB data, 419 MiB used, 120 GiB / 120 GiB avail; 3.3 MiB/s wr, 232 op/s 2026-03-09T19:26:12.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:11 vm08.local ceph-mon[57794]: pgmap v130: 65 pgs: 65 active+clean; 38 MiB data, 419 MiB used, 120 GiB / 120 GiB avail; 3.3 MiB/s wr, 232 op/s 2026-03-09T19:26:13.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:13 vm07.local ceph-mon[48545]: pgmap v131: 65 pgs: 65 active+clean; 47 MiB data, 441 MiB used, 120 GiB / 120 GiB avail; 4.1 MiB/s wr, 320 op/s 2026-03-09T19:26:14.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:13 vm08.local ceph-mon[57794]: pgmap v131: 65 pgs: 65 active+clean; 47 MiB data, 441 MiB used, 120 GiB / 120 GiB avail; 4.1 MiB/s wr, 320 op/s 2026-03-09T19:26:16.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:16 vm07.local ceph-mon[48545]: pgmap v132: 65 pgs: 65 active+clean; 51 MiB data, 480 MiB used, 120 GiB / 120 GiB avail; 4.4 MiB/s wr, 384 op/s 2026-03-09T19:26:16.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:16 vm08.local ceph-mon[57794]: pgmap v132: 65 pgs: 65 active+clean; 51 MiB data, 480 MiB 
used, 120 GiB / 120 GiB avail; 4.4 MiB/s wr, 384 op/s 2026-03-09T19:26:17.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:17 vm07.local ceph-mon[48545]: pgmap v133: 65 pgs: 65 active+clean; 57 MiB data, 539 MiB used, 119 GiB / 120 GiB avail; 3.5 MiB/s wr, 361 op/s 2026-03-09T19:26:17.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:17 vm08.local ceph-mon[57794]: pgmap v133: 65 pgs: 65 active+clean; 57 MiB data, 539 MiB used, 119 GiB / 120 GiB avail; 3.5 MiB/s wr, 361 op/s 2026-03-09T19:26:18.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:18 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:26:18.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:18 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:26:19.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.505+0000 7fab7127c640 1 -- 192.168.123.107:0/1186754111 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab6c071a70 msgr2=0x7fab6c071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:19.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.505+0000 7fab7127c640 1 --2- 192.168.123.107:0/1186754111 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab6c071a70 0x7fab6c071e70 secure :-1 s=READY pgs=314 cs=0 l=1 rev1=1 crypto rx=0x7fab5c007980 tx=0x7fab5c031160 comp rx=0 tx=0).stop 2026-03-09T19:26:19.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.505+0000 7fab7127c640 1 -- 192.168.123.107:0/1186754111 shutdown_connections 2026-03-09T19:26:19.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.505+0000 7fab7127c640 1 --2- 192.168.123.107:0/1186754111 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7fab6c072440 0x7fab6c0771b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.505+0000 7fab7127c640 1 --2- 192.168.123.107:0/1186754111 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab6c071a70 0x7fab6c071e70 unknown :-1 s=CLOSED pgs=314 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.505+0000 7fab7127c640 1 -- 192.168.123.107:0/1186754111 >> 192.168.123.107:0/1186754111 conn(0x7fab6c06d4f0 msgr2=0x7fab6c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:19.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.507+0000 7fab7127c640 1 -- 192.168.123.107:0/1186754111 shutdown_connections 2026-03-09T19:26:19.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.507+0000 7fab7127c640 1 -- 192.168.123.107:0/1186754111 wait complete. 
2026-03-09T19:26:19.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.507+0000 7fab7127c640 1 Processor -- start 2026-03-09T19:26:19.507 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.508+0000 7fab7127c640 1 -- start start 2026-03-09T19:26:19.507 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.508+0000 7fab7127c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fab6c072440 0x7fab6c084130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:19.507 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.508+0000 7fab7127c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab6c082780 0x7fab6c082c00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:19.507 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.508+0000 7fab7127c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab6c084670 con 0x7fab6c082780 2026-03-09T19:26:19.507 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.508+0000 7fab7127c640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab6c083170 con 0x7fab6c072440 2026-03-09T19:26:19.508 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.508+0000 7fab6a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab6c082780 0x7fab6c082c00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:19.508 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.508+0000 7fab6a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab6c082780 0x7fab6c082c00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:39716/0 (socket says 192.168.123.107:39716) 2026-03-09T19:26:19.508 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.508+0000 7fab6a575640 1 -- 192.168.123.107:0/1878625777 learned_addr learned my addr 192.168.123.107:0/1878625777 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:19.508 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.508+0000 7fab6ad76640 1 --2- 192.168.123.107:0/1878625777 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fab6c072440 0x7fab6c084130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:19.508 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.509+0000 7fab6a575640 1 -- 192.168.123.107:0/1878625777 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fab6c072440 msgr2=0x7fab6c084130 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:19.508 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.509+0000 7fab6a575640 1 --2- 192.168.123.107:0/1878625777 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fab6c072440 0x7fab6c084130 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.508 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.509+0000 7fab6a575640 1 -- 192.168.123.107:0/1878625777 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fab5c0075d0 con 0x7fab6c082780 2026-03-09T19:26:19.508 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.509+0000 7fab6a575640 1 --2- 192.168.123.107:0/1878625777 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab6c082780 0x7fab6c082c00 secure :-1 s=READY pgs=315 cs=0 l=1 rev1=1 crypto rx=0x7fab640047b0 tx=0x7fab6400d4a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:26:19.509 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.510+0000 7fab4bfff640 1 -- 192.168.123.107:0/1878625777 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fab640090d0 con 0x7fab6c082780 2026-03-09T19:26:19.510 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.510+0000 7fab7127c640 1 -- 192.168.123.107:0/1878625777 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fab6c083450 con 0x7fab6c082780 2026-03-09T19:26:19.510 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.510+0000 7fab7127c640 1 -- 192.168.123.107:0/1878625777 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fab6c12efc0 con 0x7fab6c082780 2026-03-09T19:26:19.510 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.511+0000 7fab4bfff640 1 -- 192.168.123.107:0/1878625777 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fab6400f040 con 0x7fab6c082780 2026-03-09T19:26:19.510 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.511+0000 7fab4bfff640 1 -- 192.168.123.107:0/1878625777 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fab64013600 con 0x7fab6c082780 2026-03-09T19:26:19.511 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.512+0000 7fab4bfff640 1 -- 192.168.123.107:0/1878625777 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fab64013760 con 0x7fab6c082780 2026-03-09T19:26:19.511 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.513+0000 7fab4bfff640 1 --2- 192.168.123.107:0/1878625777 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fab58076290 0x7fab58078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:19.511 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.514+0000 7fab6ad76640 1 --2- 192.168.123.107:0/1878625777 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fab58076290 0x7fab58078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:19.513 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.515+0000 7fab6ad76640 1 --2- 192.168.123.107:0/1878625777 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fab58076290 0x7fab58078750 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fab5c002410 tx=0x7fab5c004040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:19.513 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.515+0000 7fab4bfff640 1 -- 192.168.123.107:0/1878625777 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fab64098170 con 0x7fab6c082780 2026-03-09T19:26:19.513 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.515+0000 7fab7127c640 1 -- 192.168.123.107:0/1878625777 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fab38005350 con 0x7fab6c082780 2026-03-09T19:26:19.517 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.519+0000 7fab4bfff640 1 -- 192.168.123.107:0/1878625777 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fab64061750 con 0x7fab6c082780 2026-03-09T19:26:19.639 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.640+0000 7fab7127c640 1 -- 192.168.123.107:0/1878625777 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 
-- 0x7fab38002bf0 con 0x7fab58076290 2026-03-09T19:26:19.641 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.642+0000 7fab4bfff640 1 -- 192.168.123.107:0/1878625777 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fab38002bf0 con 0x7fab58076290 2026-03-09T19:26:19.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.646+0000 7fab7127c640 1 -- 192.168.123.107:0/1878625777 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fab58076290 msgr2=0x7fab58078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:19.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.646+0000 7fab7127c640 1 --2- 192.168.123.107:0/1878625777 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fab58076290 0x7fab58078750 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fab5c002410 tx=0x7fab5c004040 comp rx=0 tx=0).stop 2026-03-09T19:26:19.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.646+0000 7fab7127c640 1 -- 192.168.123.107:0/1878625777 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab6c082780 msgr2=0x7fab6c082c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:19.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.646+0000 7fab7127c640 1 --2- 192.168.123.107:0/1878625777 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab6c082780 0x7fab6c082c00 secure :-1 s=READY pgs=315 cs=0 l=1 rev1=1 crypto rx=0x7fab640047b0 tx=0x7fab6400d4a0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.646+0000 7fab7127c640 1 -- 192.168.123.107:0/1878625777 shutdown_connections 2026-03-09T19:26:19.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.646+0000 7fab7127c640 1 --2- 192.168.123.107:0/1878625777 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fab58076290 0x7fab58078750 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.646+0000 7fab7127c640 1 --2- 192.168.123.107:0/1878625777 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab6c082780 0x7fab6c082c00 unknown :-1 s=CLOSED pgs=315 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.646+0000 7fab7127c640 1 --2- 192.168.123.107:0/1878625777 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fab6c072440 0x7fab6c084130 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.646+0000 7fab7127c640 1 -- 192.168.123.107:0/1878625777 >> 192.168.123.107:0/1878625777 conn(0x7fab6c06d4f0 msgr2=0x7fab6c07b430 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:19.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.647+0000 7fab7127c640 1 -- 192.168.123.107:0/1878625777 shutdown_connections 2026-03-09T19:26:19.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.647+0000 7fab7127c640 1 -- 192.168.123.107:0/1878625777 wait complete. 
2026-03-09T19:26:19.655 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:26:19.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:19 vm07.local ceph-mon[48545]: pgmap v134: 65 pgs: 65 active+clean; 69 MiB data, 671 MiB used, 119 GiB / 120 GiB avail; 3.9 MiB/s wr, 415 op/s 2026-03-09T19:26:19.729 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.731+0000 7f005c87e640 1 -- 192.168.123.107:0/110748031 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0050093bb0 msgr2=0x7f0050093fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:19.729 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.731+0000 7f005c87e640 1 --2- 192.168.123.107:0/110748031 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0050093bb0 0x7f0050093fb0 secure :-1 s=READY pgs=316 cs=0 l=1 rev1=1 crypto rx=0x7f004c0099b0 tx=0x7f004c02f240 comp rx=0 tx=0).stop 2026-03-09T19:26:19.730 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.732+0000 7f005c87e640 1 -- 192.168.123.107:0/110748031 shutdown_connections 2026-03-09T19:26:19.730 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.732+0000 7f005c87e640 1 --2- 192.168.123.107:0/110748031 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0050094db0 0x7f0050095230 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.730 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.732+0000 7f005c87e640 1 --2- 192.168.123.107:0/110748031 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0050093bb0 0x7f0050093fb0 unknown :-1 s=CLOSED pgs=316 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.730 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.732+0000 7f005c87e640 1 -- 192.168.123.107:0/110748031 >> 192.168.123.107:0/110748031 conn(0x7f005008f320 msgr2=0x7f0050091780 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T19:26:19.730 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.732+0000 7f005c87e640 1 -- 192.168.123.107:0/110748031 shutdown_connections 2026-03-09T19:26:19.730 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.732+0000 7f005c87e640 1 -- 192.168.123.107:0/110748031 wait complete. 2026-03-09T19:26:19.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.734+0000 7f005c87e640 1 Processor -- start 2026-03-09T19:26:19.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.737+0000 7f005c87e640 1 -- start start 2026-03-09T19:26:19.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.738+0000 7f005c87e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0050093bb0 0x7f005012cf70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:19.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.738+0000 7f005c87e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0050094db0 0x7f005012d4b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:19.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.739+0000 7f0056575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0050094db0 0x7f005012d4b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:19.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.739+0000 7f0056d76640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0050093bb0 0x7f005012cf70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:19.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.739+0000 7f0056d76640 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0050093bb0 0x7f005012cf70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:39734/0 (socket says 192.168.123.107:39734) 2026-03-09T19:26:19.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.739+0000 7f0056d76640 1 -- 192.168.123.107:0/3915617187 learned_addr learned my addr 192.168.123.107:0/3915617187 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:19.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.738+0000 7f005c87e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f005012da80 con 0x7f0050093bb0 2026-03-09T19:26:19.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.739+0000 7f005c87e640 1 -- 192.168.123.107:0/3915617187 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f005012dbf0 con 0x7f0050094db0 2026-03-09T19:26:19.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.740+0000 7f0056d76640 1 -- 192.168.123.107:0/3915617187 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0050094db0 msgr2=0x7f005012d4b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:19.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.740+0000 7f0056d76640 1 --2- 192.168.123.107:0/3915617187 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0050094db0 0x7f005012d4b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.740+0000 7f0056d76640 1 -- 192.168.123.107:0/3915617187 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f004c009660 con 0x7f0050093bb0 2026-03-09T19:26:19.739 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.741+0000 7f0056d76640 1 --2- 192.168.123.107:0/3915617187 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0050093bb0 0x7f005012cf70 secure :-1 s=READY pgs=317 cs=0 l=1 rev1=1 crypto rx=0x7f004c002fd0 tx=0x7f004c004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:19.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.742+0000 7f0037fff640 1 -- 192.168.123.107:0/3915617187 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f004c03d070 con 0x7f0050093bb0 2026-03-09T19:26:19.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.742+0000 7f0037fff640 1 -- 192.168.123.107:0/3915617187 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f004c0043f0 con 0x7f0050093bb0 2026-03-09T19:26:19.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.742+0000 7f0037fff640 1 -- 192.168.123.107:0/3915617187 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f004c0418b0 con 0x7f0050093bb0 2026-03-09T19:26:19.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.742+0000 7f005c87e640 1 -- 192.168.123.107:0/3915617187 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0050132630 con 0x7f0050093bb0 2026-03-09T19:26:19.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.743+0000 7f005c87e640 1 -- 192.168.123.107:0/3915617187 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0050132af0 con 0x7f0050093bb0 2026-03-09T19:26:19.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.743+0000 7f005c87e640 1 -- 192.168.123.107:0/3915617187 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f001c005350 con 0x7f0050093bb0 2026-03-09T19:26:19.744 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.746+0000 7f0037fff640 1 -- 192.168.123.107:0/3915617187 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f004c02fcb0 con 0x7f0050093bb0 2026-03-09T19:26:19.744 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.746+0000 7f0037fff640 1 --2- 192.168.123.107:0/3915617187 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f002c075f60 0x7f002c078420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:19.745 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.748+0000 7f0056575640 1 --2- 192.168.123.107:0/3915617187 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f002c075f60 0x7f002c078420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:19.746 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.748+0000 7f0037fff640 1 -- 192.168.123.107:0/3915617187 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f004c0858e0 con 0x7f0050093bb0 2026-03-09T19:26:19.746 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.748+0000 7f0056575640 1 --2- 192.168.123.107:0/3915617187 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f002c075f60 0x7f002c078420 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f005012e490 tx=0x7f00400073d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:19.748 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.750+0000 7f0037fff640 1 -- 192.168.123.107:0/3915617187 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+186382 (secure 0 0 0) 0x7f004c085ba0 con 0x7f0050093bb0 2026-03-09T19:26:19.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:19 vm08.local ceph-mon[57794]: pgmap v134: 65 pgs: 65 active+clean; 69 MiB data, 671 MiB used, 119 GiB / 120 GiB avail; 3.9 MiB/s wr, 415 op/s 2026-03-09T19:26:19.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.858+0000 7f005c87e640 1 -- 192.168.123.107:0/3915617187 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f001c002bf0 con 0x7f002c075f60 2026-03-09T19:26:19.857 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.859+0000 7f0037fff640 1 -- 192.168.123.107:0/3915617187 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f001c002bf0 con 0x7f002c075f60 2026-03-09T19:26:19.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.862+0000 7f005c87e640 1 -- 192.168.123.107:0/3915617187 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f002c075f60 msgr2=0x7f002c078420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:19.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.862+0000 7f005c87e640 1 --2- 192.168.123.107:0/3915617187 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f002c075f60 0x7f002c078420 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f005012e490 tx=0x7f00400073d0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.862+0000 7f005c87e640 1 -- 192.168.123.107:0/3915617187 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0050093bb0 msgr2=0x7f005012cf70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:19.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.862+0000 7f005c87e640 1 
--2- 192.168.123.107:0/3915617187 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0050093bb0 0x7f005012cf70 secure :-1 s=READY pgs=317 cs=0 l=1 rev1=1 crypto rx=0x7f004c002fd0 tx=0x7f004c004290 comp rx=0 tx=0).stop 2026-03-09T19:26:19.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.863+0000 7f005c87e640 1 -- 192.168.123.107:0/3915617187 shutdown_connections 2026-03-09T19:26:19.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.863+0000 7f005c87e640 1 --2- 192.168.123.107:0/3915617187 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f002c075f60 0x7f002c078420 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.863+0000 7f005c87e640 1 --2- 192.168.123.107:0/3915617187 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0050094db0 0x7f005012d4b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.863+0000 7f005c87e640 1 --2- 192.168.123.107:0/3915617187 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0050093bb0 0x7f005012cf70 unknown :-1 s=CLOSED pgs=317 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.863+0000 7f005c87e640 1 -- 192.168.123.107:0/3915617187 >> 192.168.123.107:0/3915617187 conn(0x7f005008f320 msgr2=0x7f0050090bc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:19.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.863+0000 7f005c87e640 1 -- 192.168.123.107:0/3915617187 shutdown_connections 2026-03-09T19:26:19.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.863+0000 7f005c87e640 1 -- 192.168.123.107:0/3915617187 wait complete. 
2026-03-09T19:26:19.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.932+0000 7f6de9126640 1 -- 192.168.123.107:0/1275804977 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6de4103a20 msgr2=0x7f6de4105e10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:19.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.932+0000 7f6de9126640 1 --2- 192.168.123.107:0/1275804977 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6de4103a20 0x7f6de4105e10 secure :-1 s=READY pgs=318 cs=0 l=1 rev1=1 crypto rx=0x7f6dd40099b0 tx=0x7f6dd402f220 comp rx=0 tx=0).stop 2026-03-09T19:26:19.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.932+0000 7f6de9126640 1 -- 192.168.123.107:0/1275804977 shutdown_connections 2026-03-09T19:26:19.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.932+0000 7f6de9126640 1 --2- 192.168.123.107:0/1275804977 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6de4103a20 0x7f6de4105e10 secure :-1 s=CLOSED pgs=318 cs=0 l=1 rev1=1 crypto rx=0x7f6dd40099b0 tx=0x7f6dd402f220 comp rx=0 tx=0).stop 2026-03-09T19:26:19.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.932+0000 7f6de9126640 1 --2- 192.168.123.107:0/1275804977 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6de41010f0 0x7f6de41034e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.932+0000 7f6de9126640 1 -- 192.168.123.107:0/1275804977 >> 192.168.123.107:0/1275804977 conn(0x7f6de40fac90 msgr2=0x7f6de40fd0b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:19.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.934+0000 7f6de9126640 1 -- 192.168.123.107:0/1275804977 shutdown_connections 2026-03-09T19:26:19.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.935+0000 7f6de9126640 1 -- 
192.168.123.107:0/1275804977 wait complete. 2026-03-09T19:26:19.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.936+0000 7f6de9126640 1 Processor -- start 2026-03-09T19:26:19.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.936+0000 7f6de9126640 1 -- start start 2026-03-09T19:26:19.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.937+0000 7f6de9126640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6de41010f0 0x7f6de4198260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:19.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.937+0000 7f6de9126640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6de41987a0 0x7f6de419d810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:19.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.937+0000 7f6de9126640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6de4198c20 con 0x7f6de41010f0 2026-03-09T19:26:19.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.938+0000 7f6de9126640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6de4198d90 con 0x7f6de41987a0 2026-03-09T19:26:19.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.938+0000 7f6de37fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6de41987a0 0x7f6de419d810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:19.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.938+0000 7f6de37fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6de41987a0 0x7f6de419d810 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:45024/0 (socket says 192.168.123.107:45024) 2026-03-09T19:26:19.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.938+0000 7f6de37fe640 1 -- 192.168.123.107:0/366103062 learned_addr learned my addr 192.168.123.107:0/366103062 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:19.937 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.938+0000 7f6de37fe640 1 -- 192.168.123.107:0/366103062 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6de41010f0 msgr2=0x7f6de4198260 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:19.937 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.938+0000 7f6de37fe640 1 --2- 192.168.123.107:0/366103062 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6de41010f0 0x7f6de4198260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:19.937 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.938+0000 7f6de37fe640 1 -- 192.168.123.107:0/366103062 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6dd4009660 con 0x7f6de41987a0 2026-03-09T19:26:19.937 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.939+0000 7f6de37fe640 1 --2- 192.168.123.107:0/366103062 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6de41987a0 0x7f6de419d810 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f6dd4002fd0 tx=0x7f6dd4005b40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:19.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.940+0000 7f6de17fa640 1 -- 192.168.123.107:0/366103062 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6dd403d070 con 0x7f6de41987a0 2026-03-09T19:26:19.938 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.940+0000 7f6de9126640 1 -- 192.168.123.107:0/366103062 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6de419dd50 con 0x7f6de41987a0 2026-03-09T19:26:19.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.940+0000 7f6de9126640 1 -- 192.168.123.107:0/366103062 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6de419e1c0 con 0x7f6de41987a0 2026-03-09T19:26:19.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.941+0000 7f6de17fa640 1 -- 192.168.123.107:0/366103062 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6dd402fc90 con 0x7f6de41987a0 2026-03-09T19:26:19.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.941+0000 7f6de17fa640 1 -- 192.168.123.107:0/366103062 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6dd4041620 con 0x7f6de41987a0 2026-03-09T19:26:19.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.943+0000 7f6de17fa640 1 -- 192.168.123.107:0/366103062 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6dd404b430 con 0x7f6de41987a0 2026-03-09T19:26:19.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.943+0000 7f6de9126640 1 -- 192.168.123.107:0/366103062 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6db0005350 con 0x7f6de41987a0 2026-03-09T19:26:19.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.944+0000 7f6de17fa640 1 --2- 192.168.123.107:0/366103062 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6dc00761f0 0x7f6dc00786b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:19.943 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.945+0000 7f6de17fa640 1 -- 192.168.123.107:0/366103062 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f6dd40bd270 con 0x7f6de41987a0 2026-03-09T19:26:19.943 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.945+0000 7f6de3fff640 1 --2- 192.168.123.107:0/366103062 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6dc00761f0 0x7f6dc00786b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:19.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.951+0000 7f6de3fff640 1 --2- 192.168.123.107:0/366103062 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6dc00761f0 0x7f6dc00786b0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f6dd00059c0 tx=0x7f6dd0005950 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:19.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:19.951+0000 7f6de17fa640 1 -- 192.168.123.107:0/366103062 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6dd4086720 con 0x7f6de41987a0 2026-03-09T19:26:20.063 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.064+0000 7f6de9126640 1 -- 192.168.123.107:0/366103062 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6db0002bf0 con 0x7f6dc00761f0 2026-03-09T19:26:20.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.071+0000 7f6de17fa640 1 -- 192.168.123.107:0/366103062 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3624 (secure 0 0 0) 0x7f6db0002bf0 con 
0x7f6dc00761f0 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (3m) 108s ago 3m 22.6M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (4m) 108s ago 4m 8284k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (3m) 109s ago 3m 8644k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (4m) 108s ago 4m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2799ea3e4bf3 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (3m) 109s ago 3m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 81fc95c210b6 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (3m) 108s ago 3m 79.7M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (115s) 108s ago 115s 12.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (113s) 108s ago 113s 14.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (112s) 109s ago 112s 16.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (114s) 109s ago 114s 17.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:9283,8765,8443 running (4m) 108s ago 4m 
540M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 706e626ecd10 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (3m) 109s ago 3m 485M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 880604c16b45 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (4m) 108s ago 4m 53.6M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ccb644205fb3 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (3m) 109s ago 3m 49.4M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8d7b1da9e1e2 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (4m) 108s ago 4m 13.8M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (3m) 109s ago 3m 15.3M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (3m) 108s ago 3m 46.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d7417e3377af 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (2m) 108s ago 2m 67.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (2m) 108s ago 2m 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (2m) 109s ago 2m 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (2m) 109s ago 2m 67.3M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (2m) 109s ago 2m 65.0M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:26:20.070 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (3m) 108s ago 3m 39.2M - 2.43.0 
a07b618ecd1d 238baaac36ff 2026-03-09T19:26:20.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.073+0000 7f6de9126640 1 -- 192.168.123.107:0/366103062 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6dc00761f0 msgr2=0x7f6dc00786b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.073+0000 7f6de9126640 1 --2- 192.168.123.107:0/366103062 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6dc00761f0 0x7f6dc00786b0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f6dd00059c0 tx=0x7f6dd0005950 comp rx=0 tx=0).stop 2026-03-09T19:26:20.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.073+0000 7f6de9126640 1 -- 192.168.123.107:0/366103062 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6de41987a0 msgr2=0x7f6de419d810 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.073+0000 7f6de9126640 1 --2- 192.168.123.107:0/366103062 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6de41987a0 0x7f6de419d810 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f6dd4002fd0 tx=0x7f6dd4005b40 comp rx=0 tx=0).stop 2026-03-09T19:26:20.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.073+0000 7f6de9126640 1 -- 192.168.123.107:0/366103062 shutdown_connections 2026-03-09T19:26:20.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.073+0000 7f6de9126640 1 --2- 192.168.123.107:0/366103062 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6dc00761f0 0x7f6dc00786b0 secure :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f6dd00059c0 tx=0x7f6dd0005950 comp rx=0 tx=0).stop 2026-03-09T19:26:20.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.073+0000 7f6de9126640 1 --2- 
192.168.123.107:0/366103062 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6de41987a0 0x7f6de419d810 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.073+0000 7f6de9126640 1 --2- 192.168.123.107:0/366103062 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6de41010f0 0x7f6de4198260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.073+0000 7f6de9126640 1 -- 192.168.123.107:0/366103062 >> 192.168.123.107:0/366103062 conn(0x7f6de40fac90 msgr2=0x7f6de40fd080 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:20.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.073+0000 7f6de9126640 1 -- 192.168.123.107:0/366103062 shutdown_connections 2026-03-09T19:26:20.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.073+0000 7f6de9126640 1 -- 192.168.123.107:0/366103062 wait complete. 
2026-03-09T19:26:20.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.132+0000 7fce08b48640 1 -- 192.168.123.107:0/3953698488 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fce041006e0 msgr2=0x7fce04100b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.132+0000 7fce08b48640 1 --2- 192.168.123.107:0/3953698488 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fce041006e0 0x7fce04100b60 secure :-1 s=READY pgs=319 cs=0 l=1 rev1=1 crypto rx=0x7fcdf00099b0 tx=0x7fcdf002f220 comp rx=0 tx=0).stop 2026-03-09T19:26:20.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.132+0000 7fce08b48640 1 -- 192.168.123.107:0/3953698488 shutdown_connections 2026-03-09T19:26:20.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.132+0000 7fce08b48640 1 --2- 192.168.123.107:0/3953698488 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fce041006e0 0x7fce04100b60 unknown :-1 s=CLOSED pgs=319 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.132+0000 7fce08b48640 1 --2- 192.168.123.107:0/3953698488 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fce040ff4e0 0x7fce040ff8e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.132+0000 7fce08b48640 1 -- 192.168.123.107:0/3953698488 >> 192.168.123.107:0/3953698488 conn(0x7fce040fac90 msgr2=0x7fce040fd0b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:20.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.133+0000 7fce08b48640 1 -- 192.168.123.107:0/3953698488 shutdown_connections 2026-03-09T19:26:20.131 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.133+0000 7fce08b48640 1 -- 192.168.123.107:0/3953698488 
wait complete. 2026-03-09T19:26:20.131 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.133+0000 7fce08b48640 1 Processor -- start 2026-03-09T19:26:20.131 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.134+0000 7fce08b48640 1 -- start start 2026-03-09T19:26:20.134 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.136+0000 7fce08b48640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fce040ff4e0 0x7fce041980c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:20.134 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.136+0000 7fce08b48640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fce041006e0 0x7fce04198600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:20.134 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.136+0000 7fce02ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fce041006e0 0x7fce04198600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:20.134 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.136+0000 7fce02ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fce041006e0 0x7fce04198600 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:39762/0 (socket says 192.168.123.107:39762) 2026-03-09T19:26:20.134 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.136+0000 7fce02ffd640 1 -- 192.168.123.107:0/1712769736 learned_addr learned my addr 192.168.123.107:0/1712769736 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:20.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.137+0000 7fce037fe640 1 --2- 192.168.123.107:0/1712769736 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fce040ff4e0 0x7fce041980c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:20.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.137+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fce04198bd0 con 0x7fce041006e0 2026-03-09T19:26:20.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.137+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fce04198d40 con 0x7fce040ff4e0 2026-03-09T19:26:20.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.137+0000 7fce02ffd640 1 -- 192.168.123.107:0/1712769736 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fce040ff4e0 msgr2=0x7fce041980c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.137+0000 7fce02ffd640 1 --2- 192.168.123.107:0/1712769736 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fce040ff4e0 0x7fce041980c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.137+0000 7fce02ffd640 1 -- 192.168.123.107:0/1712769736 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcdf0009660 con 0x7fce041006e0 2026-03-09T19:26:20.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.137+0000 7fce02ffd640 1 --2- 192.168.123.107:0/1712769736 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fce041006e0 0x7fce04198600 secure :-1 s=READY pgs=320 cs=0 l=1 rev1=1 crypto rx=0x7fcdf0009ae0 tx=0x7fcdf0002980 comp rx=0 
tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:20.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.138+0000 7fce00ff9640 1 -- 192.168.123.107:0/1712769736 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcdf003d070 con 0x7fce041006e0 2026-03-09T19:26:20.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.138+0000 7fce00ff9640 1 -- 192.168.123.107:0/1712769736 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fcdf0031df0 con 0x7fce041006e0 2026-03-09T19:26:20.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.138+0000 7fce00ff9640 1 -- 192.168.123.107:0/1712769736 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcdf0031280 con 0x7fce041006e0 2026-03-09T19:26:20.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.138+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fce04101900 con 0x7fce041006e0 2026-03-09T19:26:20.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.138+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fce04101e70 con 0x7fce041006e0 2026-03-09T19:26:20.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.139+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcdc8005350 con 0x7fce041006e0 2026-03-09T19:26:20.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.143+0000 7fce00ff9640 1 -- 192.168.123.107:0/1712769736 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fcdf00388c0 con 0x7fce041006e0 
2026-03-09T19:26:20.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.143+0000 7fce00ff9640 1 --2- 192.168.123.107:0/1712769736 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcdd80761c0 0x7fcdd8078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:20.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.143+0000 7fce00ff9640 1 -- 192.168.123.107:0/1712769736 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fcdf00bc950 con 0x7fce041006e0 2026-03-09T19:26:20.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.143+0000 7fce00ff9640 1 -- 192.168.123.107:0/1712769736 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fcdf00ea8c0 con 0x7fce041006e0 2026-03-09T19:26:20.143 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.144+0000 7fce037fe640 1 --2- 192.168.123.107:0/1712769736 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcdd80761c0 0x7fcdd8078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:20.143 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.145+0000 7fce037fe640 1 --2- 192.168.123.107:0/1712769736 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcdd80761c0 0x7fcdd8078680 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7fce04100540 tx=0x7fcdf4009210 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:20.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.273+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) 
v1 -- 0x7fcdc80058d0 con 0x7fce041006e0 2026-03-09T19:26:20.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.274+0000 7fce00ff9640 1 -- 192.168.123.107:0/1712769736 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fcdf0085f10 con 0x7fce041006e0 2026-03-09T19:26:20.272 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:26:20.272 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:26:20.272 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:26:20.272 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:26:20.272 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:26:20.273 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:26:20.273 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:26:20.273 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:26:20.273 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T19:26:20.273 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:26:20.273 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:26:20.273 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:26:20.273 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:26:20.273 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:26:20.273 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 14 2026-03-09T19:26:20.273 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:26:20.273 
INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:26:20.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.277+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcdd80761c0 msgr2=0x7fcdd8078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.277+0000 7fce08b48640 1 --2- 192.168.123.107:0/1712769736 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcdd80761c0 0x7fcdd8078680 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7fce04100540 tx=0x7fcdf4009210 comp rx=0 tx=0).stop 2026-03-09T19:26:20.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.277+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fce041006e0 msgr2=0x7fce04198600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.277+0000 7fce08b48640 1 --2- 192.168.123.107:0/1712769736 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fce041006e0 0x7fce04198600 secure :-1 s=READY pgs=320 cs=0 l=1 rev1=1 crypto rx=0x7fcdf0009ae0 tx=0x7fcdf0002980 comp rx=0 tx=0).stop 2026-03-09T19:26:20.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.277+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 shutdown_connections 2026-03-09T19:26:20.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.277+0000 7fce08b48640 1 --2- 192.168.123.107:0/1712769736 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcdd80761c0 0x7fcdd8078680 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.277+0000 7fce08b48640 1 --2- 
192.168.123.107:0/1712769736 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fce041006e0 0x7fce04198600 unknown :-1 s=CLOSED pgs=320 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.277+0000 7fce08b48640 1 --2- 192.168.123.107:0/1712769736 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fce040ff4e0 0x7fce041980c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.277+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 >> 192.168.123.107:0/1712769736 conn(0x7fce040fac90 msgr2=0x7fce040696d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:20.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.277+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 shutdown_connections 2026-03-09T19:26:20.275 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.277+0000 7fce08b48640 1 -- 192.168.123.107:0/1712769736 wait complete. 
2026-03-09T19:26:20.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.341+0000 7fc723685640 1 -- 192.168.123.107:0/1841006178 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc71c107f50 msgr2=0x7fc71c1083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.341+0000 7fc723685640 1 --2- 192.168.123.107:0/1841006178 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc71c107f50 0x7fc71c1083d0 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7fc70c0099b0 tx=0x7fc70c02f220 comp rx=0 tx=0).stop 2026-03-09T19:26:20.342 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.344+0000 7fc723685640 1 -- 192.168.123.107:0/1841006178 shutdown_connections 2026-03-09T19:26:20.342 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.344+0000 7fc723685640 1 --2- 192.168.123.107:0/1841006178 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc71c107f50 0x7fc71c1083d0 unknown :-1 s=CLOSED pgs=321 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.342 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.344+0000 7fc723685640 1 --2- 192.168.123.107:0/1841006178 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc71c107580 0x7fc71c107980 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.342 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.344+0000 7fc723685640 1 -- 192.168.123.107:0/1841006178 >> 192.168.123.107:0/1841006178 conn(0x7fc71c075fb0 msgr2=0x7fc71c0783f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.345+0000 7fc723685640 1 -- 192.168.123.107:0/1841006178 shutdown_connections 2026-03-09T19:26:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.345+0000 7fc723685640 1 -- 192.168.123.107:0/1841006178 
wait complete. 2026-03-09T19:26:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.345+0000 7fc723685640 1 Processor -- start 2026-03-09T19:26:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.345+0000 7fc723685640 1 -- start start 2026-03-09T19:26:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.345+0000 7fc723685640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc71c107580 0x7fc71c19e8c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:20.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.345+0000 7fc723685640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc71c107f50 0x7fc71c19ee00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:20.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.345+0000 7fc723685640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc71c19f340 con 0x7fc71c107580 2026-03-09T19:26:20.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.345+0000 7fc723685640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc71c19f4b0 con 0x7fc71c107f50 2026-03-09T19:26:20.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.346+0000 7fc7213fa640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc71c107580 0x7fc71c19e8c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:20.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.346+0000 7fc720bf9640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc71c107f50 0x7fc71c19ee00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T19:26:20.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.346+0000 7fc720bf9640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc71c107f50 0x7fc71c19ee00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:45044/0 (socket says 192.168.123.107:45044) 2026-03-09T19:26:20.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.346+0000 7fc720bf9640 1 -- 192.168.123.107:0/2198853017 learned_addr learned my addr 192.168.123.107:0/2198853017 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:20.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.346+0000 7fc7213fa640 1 -- 192.168.123.107:0/2198853017 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc71c107f50 msgr2=0x7fc71c19ee00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.346+0000 7fc7213fa640 1 --2- 192.168.123.107:0/2198853017 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc71c107f50 0x7fc71c19ee00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.346+0000 7fc7213fa640 1 -- 192.168.123.107:0/2198853017 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc70c009660 con 0x7fc71c107580 2026-03-09T19:26:20.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.346+0000 7fc7213fa640 1 --2- 192.168.123.107:0/2198853017 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc71c107580 0x7fc71c19e8c0 secure :-1 s=READY pgs=322 cs=0 l=1 rev1=1 crypto rx=0x7fc70400c8c0 tx=0x7fc70400cd90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:20.345 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.347+0000 7fc7127fc640 1 -- 192.168.123.107:0/2198853017 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc7040044f0 con 0x7fc71c107580 2026-03-09T19:26:20.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.347+0000 7fc7127fc640 1 -- 192.168.123.107:0/2198853017 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc704007c60 con 0x7fc71c107580 2026-03-09T19:26:20.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.347+0000 7fc723685640 1 -- 192.168.123.107:0/2198853017 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc71c1a3f90 con 0x7fc71c107580 2026-03-09T19:26:20.346 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.347+0000 7fc723685640 1 -- 192.168.123.107:0/2198853017 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc71c1a4400 con 0x7fc71c107580 2026-03-09T19:26:20.346 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.348+0000 7fc7127fc640 1 -- 192.168.123.107:0/2198853017 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc70400f660 con 0x7fc71c107580 2026-03-09T19:26:20.347 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.349+0000 7fc723685640 1 -- 192.168.123.107:0/2198853017 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc71c107980 con 0x7fc71c107580 2026-03-09T19:26:20.347 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.349+0000 7fc7127fc640 1 -- 192.168.123.107:0/2198853017 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fc70400f880 con 0x7fc71c107580 2026-03-09T19:26:20.347 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.349+0000 7fc7127fc640 1 --2- 
192.168.123.107:0/2198853017 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc6f8075fb0 0x7fc6f8078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:20.347 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.349+0000 7fc7127fc640 1 -- 192.168.123.107:0/2198853017 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fc704097830 con 0x7fc71c107580 2026-03-09T19:26:20.350 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.352+0000 7fc720bf9640 1 --2- 192.168.123.107:0/2198853017 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc6f8075fb0 0x7fc6f8078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:20.350 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.352+0000 7fc7127fc640 1 -- 192.168.123.107:0/2198853017 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fc704060ea0 con 0x7fc71c107580 2026-03-09T19:26:20.350 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.353+0000 7fc720bf9640 1 --2- 192.168.123.107:0/2198853017 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc6f8075fb0 0x7fc6f8078470 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fc70c002b60 tx=0x7fc70c03a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:20.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.498+0000 7fc723685640 1 -- 192.168.123.107:0/2198853017 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fc71c10fbb0 con 0x7fc71c107580 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:e13 
2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:26:20.694 
INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:26:20.694 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:26:20.695 
INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.693+0000 7fc7127fc640 1 -- 192.168.123.107:0/2198853017 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1856 (secure 0 0 0) 0x7fc704060840 con 0x7fc71c107580 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.696+0000 7fc6ebfff640 1 -- 192.168.123.107:0/2198853017 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc6f8075fb0 msgr2=0x7fc6f8078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.696+0000 7fc6ebfff640 1 --2- 192.168.123.107:0/2198853017 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc6f8075fb0 0x7fc6f8078470 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fc70c002b60 tx=0x7fc70c03a040 comp rx=0 tx=0).stop 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.696+0000 7fc6ebfff640 1 -- 192.168.123.107:0/2198853017 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc71c107580 msgr2=0x7fc71c19e8c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.695 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.696+0000 7fc6ebfff640 1 --2- 192.168.123.107:0/2198853017 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc71c107580 0x7fc71c19e8c0 secure :-1 s=READY pgs=322 cs=0 l=1 rev1=1 crypto rx=0x7fc70400c8c0 tx=0x7fc70400cd90 comp rx=0 tx=0).stop 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.696+0000 7fc6ebfff640 1 -- 192.168.123.107:0/2198853017 shutdown_connections 2026-03-09T19:26:20.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.696+0000 7fc6ebfff640 1 --2- 192.168.123.107:0/2198853017 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fc6f8075fb0 0x7fc6f8078470 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.696+0000 7fc6ebfff640 1 --2- 192.168.123.107:0/2198853017 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc71c107f50 0x7fc71c19ee00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.696+0000 7fc6ebfff640 1 --2- 192.168.123.107:0/2198853017 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc71c107580 0x7fc71c19e8c0 unknown :-1 s=CLOSED pgs=322 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.696+0000 7fc6ebfff640 1 -- 192.168.123.107:0/2198853017 >> 192.168.123.107:0/2198853017 conn(0x7fc71c075fb0 msgr2=0x7fc71c077730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:20.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.696+0000 7fc6ebfff640 1 -- 192.168.123.107:0/2198853017 shutdown_connections 2026-03-09T19:26:20.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.696+0000 7fc6ebfff640 1 -- 
192.168.123.107:0/2198853017 wait complete. 2026-03-09T19:26:20.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.761+0000 7fa02ef8d640 1 -- 192.168.123.107:0/4064241646 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa028107580 msgr2=0x7fa028107980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.761+0000 7fa02ef8d640 1 --2- 192.168.123.107:0/4064241646 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa028107580 0x7fa028107980 secure :-1 s=READY pgs=323 cs=0 l=1 rev1=1 crypto rx=0x7fa014009a00 tx=0x7fa01402f290 comp rx=0 tx=0).stop 2026-03-09T19:26:20.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.761+0000 7fa02ef8d640 1 -- 192.168.123.107:0/4064241646 shutdown_connections 2026-03-09T19:26:20.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.761+0000 7fa02ef8d640 1 --2- 192.168.123.107:0/4064241646 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa028107f50 0x7fa0281083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.761+0000 7fa02ef8d640 1 --2- 192.168.123.107:0/4064241646 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa028107580 0x7fa028107980 unknown :-1 s=CLOSED pgs=323 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.761+0000 7fa02ef8d640 1 -- 192.168.123.107:0/4064241646 >> 192.168.123.107:0/4064241646 conn(0x7fa028075f50 msgr2=0x7fa028078370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:20.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.761+0000 7fa02ef8d640 1 -- 192.168.123.107:0/4064241646 shutdown_connections 2026-03-09T19:26:20.760 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.761+0000 
7fa02ef8d640 1 -- 192.168.123.107:0/4064241646 wait complete. 2026-03-09T19:26:20.760 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.762+0000 7fa02ef8d640 1 Processor -- start 2026-03-09T19:26:20.760 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.762+0000 7fa02ef8d640 1 -- start start 2026-03-09T19:26:20.760 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.763+0000 7fa02ef8d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa028107580 0x7fa02819e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:20.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.763+0000 7fa02ef8d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa028107f50 0x7fa02819ee70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:20.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.763+0000 7fa02ef8d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa02819f440 con 0x7fa028107580 2026-03-09T19:26:20.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.763+0000 7fa02ef8d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa02819f5b0 con 0x7fa028107f50 2026-03-09T19:26:20.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.763+0000 7fa02cd02640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa028107580 0x7fa02819e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:20.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.763+0000 7fa02cd02640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa028107580 0x7fa02819e930 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:39798/0 (socket says 192.168.123.107:39798) 2026-03-09T19:26:20.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.763+0000 7fa02cd02640 1 -- 192.168.123.107:0/1982983950 learned_addr learned my addr 192.168.123.107:0/1982983950 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:20.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.765+0000 7fa027fff640 1 --2- 192.168.123.107:0/1982983950 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa028107f50 0x7fa02819ee70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:20.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.765+0000 7fa02cd02640 1 -- 192.168.123.107:0/1982983950 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa028107f50 msgr2=0x7fa02819ee70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.765+0000 7fa02cd02640 1 --2- 192.168.123.107:0/1982983950 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa028107f50 0x7fa02819ee70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.764 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.765+0000 7fa02cd02640 1 -- 192.168.123.107:0/1982983950 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa014009660 con 0x7fa028107580 2026-03-09T19:26:20.764 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.766+0000 7fa02cd02640 1 --2- 192.168.123.107:0/1982983950 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa028107580 0x7fa02819e930 secure :-1 s=READY pgs=324 cs=0 l=1 rev1=1 crypto rx=0x7fa014002c80 tx=0x7fa014004290 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:20.765 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.767+0000 7fa025ffb640 1 -- 192.168.123.107:0/1982983950 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa01402fd00 con 0x7fa028107580 2026-03-09T19:26:20.765 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.767+0000 7fa02ef8d640 1 -- 192.168.123.107:0/1982983950 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa0281a3ff0 con 0x7fa028107580 2026-03-09T19:26:20.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.767+0000 7fa02ef8d640 1 -- 192.168.123.107:0/1982983950 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa0281a4460 con 0x7fa028107580 2026-03-09T19:26:20.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.767+0000 7fa025ffb640 1 -- 192.168.123.107:0/1982983950 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fa01402fe60 con 0x7fa028107580 2026-03-09T19:26:20.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.767+0000 7fa025ffb640 1 -- 192.168.123.107:0/1982983950 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa014041a30 con 0x7fa028107580 2026-03-09T19:26:20.767 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.769+0000 7fa025ffb640 1 -- 192.168.123.107:0/1982983950 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fa01403f070 con 0x7fa028107580 2026-03-09T19:26:20.767 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.769+0000 7fa025ffb640 1 --2- 192.168.123.107:0/1982983950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9ff4076290 0x7f9ff4078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-09T19:26:20.767 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.770+0000 7fa025ffb640 1 -- 192.168.123.107:0/1982983950 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fa0140bc5e0 con 0x7fa028107580 2026-03-09T19:26:20.768 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.770+0000 7fa02ef8d640 1 -- 192.168.123.107:0/1982983950 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa02810fb90 con 0x7fa028107580 2026-03-09T19:26:20.769 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.771+0000 7fa027fff640 1 --2- 192.168.123.107:0/1982983950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9ff4076290 0x7f9ff4078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:20.772 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.774+0000 7fa025ffb640 1 -- 192.168.123.107:0/1982983950 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fa014046080 con 0x7fa028107580 2026-03-09T19:26:20.776 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.774+0000 7fa027fff640 1 --2- 192.168.123.107:0/1982983950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9ff4076290 0x7f9ff4078750 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fa02819fe50 tx=0x7fa01c00b040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:20.888 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.890+0000 7fa02ef8d640 1 -- 192.168.123.107:0/1982983950 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}) v1 -- 0x7fa02810c720 con 0x7f9ff4076290 2026-03-09T19:26:20.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.891+0000 7fa025ffb640 1 -- 192.168.123.107:0/1982983950 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fa02810c720 con 0x7f9ff4076290 2026-03-09T19:26:20.890 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:26:20.890 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:26:20.890 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:26:20.890 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:26:20.890 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [], 2026-03-09T19:26:20.890 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "0/23 daemons upgraded", 2026-03-09T19:26:20.890 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm08", 2026-03-09T19:26:20.890 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:26:20.890 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:26:20.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.894+0000 7fa02ef8d640 1 -- 192.168.123.107:0/1982983950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9ff4076290 msgr2=0x7f9ff4078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.894+0000 7fa02ef8d640 1 --2- 192.168.123.107:0/1982983950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9ff4076290 0x7f9ff4078750 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto 
rx=0x7fa02819fe50 tx=0x7fa01c00b040 comp rx=0 tx=0).stop 2026-03-09T19:26:20.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.895+0000 7fa02ef8d640 1 -- 192.168.123.107:0/1982983950 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa028107580 msgr2=0x7fa02819e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.895+0000 7fa02ef8d640 1 --2- 192.168.123.107:0/1982983950 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa028107580 0x7fa02819e930 secure :-1 s=READY pgs=324 cs=0 l=1 rev1=1 crypto rx=0x7fa014002c80 tx=0x7fa014004290 comp rx=0 tx=0).stop 2026-03-09T19:26:20.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.895+0000 7fa02ef8d640 1 -- 192.168.123.107:0/1982983950 shutdown_connections 2026-03-09T19:26:20.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.895+0000 7fa02ef8d640 1 --2- 192.168.123.107:0/1982983950 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9ff4076290 0x7f9ff4078750 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.895+0000 7fa02ef8d640 1 --2- 192.168.123.107:0/1982983950 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa028107f50 0x7fa02819ee70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.895+0000 7fa02ef8d640 1 --2- 192.168.123.107:0/1982983950 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa028107580 0x7fa02819e930 unknown :-1 s=CLOSED pgs=324 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.895+0000 7fa02ef8d640 1 -- 192.168.123.107:0/1982983950 >> 
192.168.123.107:0/1982983950 conn(0x7fa028075f50 msgr2=0x7fa028077790 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:20.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.895+0000 7fa02ef8d640 1 -- 192.168.123.107:0/1982983950 shutdown_connections 2026-03-09T19:26:20.894 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.895+0000 7fa02ef8d640 1 -- 192.168.123.107:0/1982983950 wait complete. 2026-03-09T19:26:20.973 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:20 vm07.local ceph-mon[48545]: from='client.14630 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:20.974 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:20 vm07.local ceph-mon[48545]: from='client.14634 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:20.974 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:20 vm07.local ceph-mon[48545]: from='client.24395 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:20.974 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:20 vm07.local ceph-mon[48545]: from='client.? 
192.168.123.107:0/1712769736' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:26:20.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.975+0000 7fd718630640 1 -- 192.168.123.107:0/4217802754 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd71010c850 msgr2=0x7fd71010cc50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.975+0000 7fd718630640 1 --2- 192.168.123.107:0/4217802754 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd71010c850 0x7fd71010cc50 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fd7000098e0 tx=0x7fd70002f1b0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.978+0000 7fd718630640 1 -- 192.168.123.107:0/4217802754 shutdown_connections 2026-03-09T19:26:20.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.978+0000 7fd718630640 1 --2- 192.168.123.107:0/4217802754 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd71010b590 0x7fd71010ba10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.978+0000 7fd718630640 1 --2- 192.168.123.107:0/4217802754 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd71010c850 0x7fd71010cc50 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.978+0000 7fd718630640 1 -- 192.168.123.107:0/4217802754 >> 192.168.123.107:0/4217802754 conn(0x7fd71006a700 msgr2=0x7fd71006ab10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:20.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.978+0000 7fd718630640 1 -- 192.168.123.107:0/4217802754 shutdown_connections 2026-03-09T19:26:20.976 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.978+0000 7fd718630640 1 -- 192.168.123.107:0/4217802754 wait complete. 2026-03-09T19:26:20.977 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.979+0000 7fd718630640 1 Processor -- start 2026-03-09T19:26:20.977 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.979+0000 7fd718630640 1 -- start start 2026-03-09T19:26:20.978 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.980+0000 7fd718630640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd71010b590 0x7fd7101a2c50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:20.978 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.980+0000 7fd718630640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd71010c850 0x7fd7101a3190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:20.978 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.980+0000 7fd718630640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd7101a3760 con 0x7fd71010c850 2026-03-09T19:26:20.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.980+0000 7fd715ba4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd71010c850 0x7fd7101a3190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:20.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.980+0000 7fd715ba4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd71010c850 0x7fd7101a3190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:39818/0 (socket says 192.168.123.107:39818) 2026-03-09T19:26:20.979 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.980+0000 7fd715ba4640 1 -- 192.168.123.107:0/3965169419 learned_addr learned my addr 192.168.123.107:0/3965169419 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:20.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.980+0000 7fd7163a5640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd71010b590 0x7fd7101a2c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:20.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.981+0000 7fd718630640 1 -- 192.168.123.107:0/3965169419 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd7101a38d0 con 0x7fd71010b590 2026-03-09T19:26:20.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.981+0000 7fd7163a5640 1 -- 192.168.123.107:0/3965169419 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd71010c850 msgr2=0x7fd7101a3190 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:20.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.981+0000 7fd7163a5640 1 --2- 192.168.123.107:0/3965169419 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd71010c850 0x7fd7101a3190 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:20.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.981+0000 7fd7163a5640 1 -- 192.168.123.107:0/3965169419 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd700009590 con 0x7fd71010b590 2026-03-09T19:26:20.981 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.983+0000 7fd7163a5640 1 --2- 192.168.123.107:0/3965169419 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd71010b590 0x7fd7101a2c50 secure :-1 s=READY pgs=68 cs=0 
l=1 rev1=1 crypto rx=0x7fd7000098b0 tx=0x7fd700031c20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:20.984 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.986+0000 7fd6f77fe640 1 -- 192.168.123.107:0/3965169419 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd70003d070 con 0x7fd71010b590 2026-03-09T19:26:20.984 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.986+0000 7fd718630640 1 -- 192.168.123.107:0/3965169419 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd7101a8300 con 0x7fd71010b590 2026-03-09T19:26:20.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.986+0000 7fd718630640 1 -- 192.168.123.107:0/3965169419 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd7101a87f0 con 0x7fd71010b590 2026-03-09T19:26:20.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.987+0000 7fd718630640 1 -- 192.168.123.107:0/3965169419 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd710113ee0 con 0x7fd71010b590 2026-03-09T19:26:20.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.987+0000 7fd6f77fe640 1 -- 192.168.123.107:0/3965169419 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd70002fe90 con 0x7fd71010b590 2026-03-09T19:26:20.985 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.987+0000 7fd6f77fe640 1 -- 192.168.123.107:0/3965169419 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd700031770 con 0x7fd71010b590 2026-03-09T19:26:20.986 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.989+0000 7fd6f77fe640 1 -- 192.168.123.107:0/3965169419 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 
(secure 0 0 0) 0x7fd70002f9d0 con 0x7fd71010b590 2026-03-09T19:26:20.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.989+0000 7fd6f77fe640 1 --2- 192.168.123.107:0/3965169419 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd6fc076170 0x7fd6fc078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:20.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.989+0000 7fd6f77fe640 1 -- 192.168.123.107:0/3965169419 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fd7000bc880 con 0x7fd71010b590 2026-03-09T19:26:20.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.989+0000 7fd715ba4640 1 --2- 192.168.123.107:0/3965169419 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd6fc076170 0x7fd6fc078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:20.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.990+0000 7fd715ba4640 1 --2- 192.168.123.107:0/3965169419 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd6fc076170 0x7fd6fc078630 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7fd704009ea0 tx=0x7fd704009340 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:20.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:20.991+0000 7fd6f77fe640 1 -- 192.168.123.107:0/3965169419 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fd700085ef0 con 0x7fd71010b590 2026-03-09T19:26:21.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:20 vm08.local ceph-mon[57794]: from='client.14630 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:21.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:20 vm08.local ceph-mon[57794]: from='client.14634 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:21.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:20 vm08.local ceph-mon[57794]: from='client.24395 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:21.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:20 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/1712769736' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:26:21.125 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.127+0000 7fd718630640 1 -- 192.168.123.107:0/3965169419 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fd7101a8ad0 con 0x7fd71010b590 2026-03-09T19:26:21.125 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.127+0000 7fd6f77fe640 1 -- 192.168.123.107:0/3965169419 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fd700085890 con 0x7fd71010b590 2026-03-09T19:26:21.125 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:26:21.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.130+0000 7fd718630640 1 -- 192.168.123.107:0/3965169419 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd6fc076170 msgr2=0x7fd6fc078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:21.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.130+0000 7fd718630640 1 --2- 192.168.123.107:0/3965169419 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd6fc076170 0x7fd6fc078630 secure :-1 s=READY pgs=147 cs=0 
l=1 rev1=1 crypto rx=0x7fd704009ea0 tx=0x7fd704009340 comp rx=0 tx=0).stop 2026-03-09T19:26:21.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.130+0000 7fd718630640 1 -- 192.168.123.107:0/3965169419 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd71010b590 msgr2=0x7fd7101a2c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:21.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.130+0000 7fd718630640 1 --2- 192.168.123.107:0/3965169419 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd71010b590 0x7fd7101a2c50 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fd7000098b0 tx=0x7fd700031c20 comp rx=0 tx=0).stop 2026-03-09T19:26:21.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.131+0000 7fd718630640 1 -- 192.168.123.107:0/3965169419 shutdown_connections 2026-03-09T19:26:21.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.131+0000 7fd718630640 1 --2- 192.168.123.107:0/3965169419 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd6fc076170 0x7fd6fc078630 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:21.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.131+0000 7fd718630640 1 --2- 192.168.123.107:0/3965169419 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd71010c850 0x7fd7101a3190 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:21.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.131+0000 7fd718630640 1 --2- 192.168.123.107:0/3965169419 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd71010b590 0x7fd7101a2c50 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:21.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.131+0000 7fd718630640 1 -- 192.168.123.107:0/3965169419 >> 
192.168.123.107:0/3965169419 conn(0x7fd71006a700 msgr2=0x7fd71010a640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:21.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.132+0000 7fd718630640 1 -- 192.168.123.107:0/3965169419 shutdown_connections 2026-03-09T19:26:21.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:21.132+0000 7fd718630640 1 -- 192.168.123.107:0/3965169419 wait complete. 2026-03-09T19:26:22.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:21 vm08.local ceph-mon[57794]: pgmap v135: 65 pgs: 65 active+clean; 73 MiB data, 708 MiB used, 119 GiB / 120 GiB avail; 3.2 MiB/s wr, 356 op/s 2026-03-09T19:26:22.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:21 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/2198853017' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:26:22.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:21 vm08.local ceph-mon[57794]: from='client.14648 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:22.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:21 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/3965169419' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:26:22.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:21 vm07.local ceph-mon[48545]: pgmap v135: 65 pgs: 65 active+clean; 73 MiB data, 708 MiB used, 119 GiB / 120 GiB avail; 3.2 MiB/s wr, 356 op/s 2026-03-09T19:26:22.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:21 vm07.local ceph-mon[48545]: from='client.? 
192.168.123.107:0/2198853017' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:26:22.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:21 vm07.local ceph-mon[48545]: from='client.14648 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:22.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:21 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/3965169419' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:26:24.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:23 vm07.local ceph-mon[48545]: pgmap v136: 65 pgs: 65 active+clean; 85 MiB data, 811 MiB used, 119 GiB / 120 GiB avail; 4.1 MiB/s wr, 411 op/s 2026-03-09T19:26:24.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:23 vm08.local ceph-mon[57794]: pgmap v136: 65 pgs: 65 active+clean; 85 MiB data, 811 MiB used, 119 GiB / 120 GiB avail; 4.1 MiB/s wr, 411 op/s 2026-03-09T19:26:25.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:25 vm07.local ceph-mon[48545]: pgmap v137: 65 pgs: 65 active+clean; 90 MiB data, 853 MiB used, 119 GiB / 120 GiB avail; 3.8 MiB/s wr, 414 op/s 2026-03-09T19:26:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:25 vm08.local ceph-mon[57794]: pgmap v137: 65 pgs: 65 active+clean; 90 MiB data, 853 MiB used, 119 GiB / 120 GiB avail; 3.8 MiB/s wr, 414 op/s 2026-03-09T19:26:27.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:27 vm07.local ceph-mon[48545]: pgmap v138: 65 pgs: 65 active+clean; 99 MiB data, 895 MiB used, 119 GiB / 120 GiB avail; 4.2 MiB/s wr, 377 op/s 2026-03-09T19:26:28.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:27 vm08.local ceph-mon[57794]: pgmap v138: 65 pgs: 65 active+clean; 99 MiB data, 895 MiB used, 119 GiB / 120 GiB avail; 4.2 MiB/s wr, 377 op/s 2026-03-09T19:26:30.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:29 vm08.local ceph-mon[57794]: 
pgmap v139: 65 pgs: 65 active+clean; 106 MiB data, 1020 MiB used, 119 GiB / 120 GiB avail; 4.3 MiB/s wr, 451 op/s 2026-03-09T19:26:30.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:29 vm07.local ceph-mon[48545]: pgmap v139: 65 pgs: 65 active+clean; 106 MiB data, 1020 MiB used, 119 GiB / 120 GiB avail; 4.3 MiB/s wr, 451 op/s 2026-03-09T19:26:32.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:32 vm08.local ceph-mon[57794]: pgmap v140: 65 pgs: 65 active+clean; 108 MiB data, 1013 MiB used, 119 GiB / 120 GiB avail; 3.5 MiB/s wr, 413 op/s 2026-03-09T19:26:32.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:32 vm07.local ceph-mon[48545]: pgmap v140: 65 pgs: 65 active+clean; 108 MiB data, 1013 MiB used, 119 GiB / 120 GiB avail; 3.5 MiB/s wr, 413 op/s 2026-03-09T19:26:33.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:33 vm08.local ceph-mon[57794]: pgmap v141: 65 pgs: 65 active+clean; 116 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 3.8 MiB/s wr, 466 op/s 2026-03-09T19:26:33.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:33 vm07.local ceph-mon[48545]: pgmap v141: 65 pgs: 65 active+clean; 116 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 3.8 MiB/s wr, 466 op/s 2026-03-09T19:26:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:34 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:26:34.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:34 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:26:35.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:35 vm08.local ceph-mon[57794]: pgmap v142: 65 pgs: 65 active+clean; 125 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 3.7 MiB/s wr, 464 op/s 2026-03-09T19:26:35.728 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:35 vm07.local ceph-mon[48545]: pgmap v142: 65 pgs: 65 active+clean; 125 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 3.7 MiB/s wr, 464 op/s 2026-03-09T19:26:38.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:38 vm07.local ceph-mon[48545]: pgmap v143: 65 pgs: 65 active+clean; 130 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 3.5 MiB/s wr, 407 op/s 2026-03-09T19:26:38.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:38 vm08.local ceph-mon[57794]: pgmap v143: 65 pgs: 65 active+clean; 130 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 3.5 MiB/s wr, 407 op/s 2026-03-09T19:26:39.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:39 vm08.local ceph-mon[57794]: pgmap v144: 65 pgs: 65 active+clean; 138 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 3.5 MiB/s wr, 474 op/s 2026-03-09T19:26:39.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:39 vm07.local ceph-mon[48545]: pgmap v144: 65 pgs: 65 active+clean; 138 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 3.5 MiB/s wr, 474 op/s 2026-03-09T19:26:42.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:42 vm07.local ceph-mon[48545]: pgmap v145: 65 pgs: 65 active+clean; 144 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 3.4 MiB/s wr, 423 op/s 2026-03-09T19:26:42.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:42 vm08.local ceph-mon[57794]: pgmap v145: 65 pgs: 65 active+clean; 144 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 3.4 MiB/s wr, 423 op/s 2026-03-09T19:26:44.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:44 vm07.local ceph-mon[48545]: pgmap v146: 65 pgs: 65 active+clean; 153 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 4.0 MiB/s wr, 458 op/s 2026-03-09T19:26:44.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:44 vm08.local ceph-mon[57794]: pgmap v146: 65 pgs: 65 active+clean; 153 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 4.0 MiB/s wr, 458 op/s 
2026-03-09T19:26:45.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:45 vm07.local ceph-mon[48545]: pgmap v147: 65 pgs: 65 active+clean; 170 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 4.8 MiB/s wr, 445 op/s 2026-03-09T19:26:45.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:45 vm08.local ceph-mon[57794]: pgmap v147: 65 pgs: 65 active+clean; 170 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 4.8 MiB/s wr, 445 op/s 2026-03-09T19:26:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:47 vm08.local ceph-mon[57794]: pgmap v148: 65 pgs: 65 active+clean; 174 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 4.2 MiB/s wr, 378 op/s 2026-03-09T19:26:48.120 INFO:tasks.workunit.client.1.vm08.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-09T19:26:48.128 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-09T19:26:48.128 INFO:tasks.workunit.client.1.vm08.stderr:+ make 2026-03-09T19:26:48.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:47 vm07.local ceph-mon[48545]: pgmap v148: 65 pgs: 65 active+clean; 174 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 4.2 MiB/s wr, 378 op/s 2026-03-09T19:26:48.729 INFO:tasks.workunit.client.1.vm08.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-09T19:26:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:49 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:26:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:49 vm08.local ceph-mon[57794]: from='mgr.14227 
192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:26:49.774 INFO:tasks.workunit.client.1.vm08.stderr:++ readlink -f fsstress 2026-03-09T19:26:49.777 INFO:tasks.workunit.client.1.vm08.stderr:+ BIN=/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-09T19:26:49.777 INFO:tasks.workunit.client.1.vm08.stderr:+ popd 2026-03-09T19:26:49.778 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-09T19:26:49.778 INFO:tasks.workunit.client.1.vm08.stderr:+ popd 2026-03-09T19:26:49.780 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp 2026-03-09T19:26:49.780 INFO:tasks.workunit.client.1.vm08.stderr:++ mktemp -d -p . 2026-03-09T19:26:49.783 INFO:tasks.workunit.client.1.vm08.stderr:+ T=./tmp.V6OdvJ4d0F 2026-03-09T19:26:49.783 INFO:tasks.workunit.client.1.vm08.stderr:+ /home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.V6OdvJ4d0F -l 1 -n 1000 -p 10 -v 2026-03-09T19:26:49.790 INFO:tasks.workunit.client.1.vm08.stdout:seed = 1772292446 2026-03-09T19:26:49.797 INFO:tasks.workunit.client.1.vm08.stdout:0/0: truncate - no filename 2026-03-09T19:26:49.797 INFO:tasks.workunit.client.1.vm08.stdout:0/1: dwrite - no filename 2026-03-09T19:26:49.802 INFO:tasks.workunit.client.1.vm08.stdout:1/0: chown . 133 1 2026-03-09T19:26:49.803 INFO:tasks.workunit.client.1.vm08.stdout:3/0: readlink - no filename 2026-03-09T19:26:49.804 INFO:tasks.workunit.client.1.vm08.stdout:0/2: creat f0 x:0 0 0 2026-03-09T19:26:49.807 INFO:tasks.workunit.client.1.vm08.stdout:4/0: chown . 
10464 1 2026-03-09T19:26:49.807 INFO:tasks.workunit.client.1.vm08.stdout:4/1: truncate - no filename 2026-03-09T19:26:49.807 INFO:tasks.workunit.client.1.vm08.stdout:4/2: truncate - no filename 2026-03-09T19:26:49.813 INFO:tasks.workunit.client.1.vm08.stdout:1/1: mknod c0 0 2026-03-09T19:26:49.813 INFO:tasks.workunit.client.1.vm08.stdout:1/2: dread - no filename 2026-03-09T19:26:49.813 INFO:tasks.workunit.client.1.vm08.stdout:1/3: dread - no filename 2026-03-09T19:26:49.817 INFO:tasks.workunit.client.1.vm08.stdout:3/1: mkdir d0 0 2026-03-09T19:26:49.818 INFO:tasks.workunit.client.1.vm08.stdout:5/0: readlink - no filename 2026-03-09T19:26:49.818 INFO:tasks.workunit.client.1.vm08.stdout:5/1: rmdir - no directory 2026-03-09T19:26:49.820 INFO:tasks.workunit.client.1.vm08.stdout:1/4: symlink l1 0 2026-03-09T19:26:49.821 INFO:tasks.workunit.client.1.vm08.stdout:4/3: symlink l0 0 2026-03-09T19:26:49.822 INFO:tasks.workunit.client.1.vm08.stdout:7/0: dwrite - no filename 2026-03-09T19:26:49.825 INFO:tasks.workunit.client.1.vm08.stdout:3/2: mknod d0/c1 0 2026-03-09T19:26:49.825 INFO:tasks.workunit.client.1.vm08.stdout:1/5: creat f2 x:0 0 0 2026-03-09T19:26:49.826 INFO:tasks.workunit.client.1.vm08.stdout:3/3: stat d0/c1 0 2026-03-09T19:26:49.826 INFO:tasks.workunit.client.1.vm08.stdout:1/6: truncate f2 107805 0 2026-03-09T19:26:49.826 INFO:tasks.workunit.client.1.vm08.stdout:1/7: rmdir - no directory 2026-03-09T19:26:49.827 INFO:tasks.workunit.client.1.vm08.stdout:1/8: dread f2 [0,4194304] 0 2026-03-09T19:26:49.827 INFO:tasks.workunit.client.1.vm08.stdout:3/4: stat d0 0 2026-03-09T19:26:49.830 INFO:tasks.workunit.client.1.vm08.stdout:5/2: creat f0 x:0 0 0 2026-03-09T19:26:49.831 INFO:tasks.workunit.client.1.vm08.stdout:5/3: write f0 [928793,37423] 0 2026-03-09T19:26:49.831 INFO:tasks.workunit.client.1.vm08.stdout:8/0: link - no file 2026-03-09T19:26:49.831 INFO:tasks.workunit.client.1.vm08.stdout:8/1: chown . 
5 1 2026-03-09T19:26:49.831 INFO:tasks.workunit.client.1.vm08.stdout:8/2: dwrite - no filename 2026-03-09T19:26:49.832 INFO:tasks.workunit.client.1.vm08.stdout:8/3: chown . 1301504 1 2026-03-09T19:26:49.832 INFO:tasks.workunit.client.1.vm08.stdout:8/4: dread - no filename 2026-03-09T19:26:49.832 INFO:tasks.workunit.client.1.vm08.stdout:8/5: write - no filename 2026-03-09T19:26:49.842 INFO:tasks.workunit.client.1.vm08.stdout:1/9: symlink l3 0 2026-03-09T19:26:49.842 INFO:tasks.workunit.client.1.vm08.stdout:1/10: stat f2 0 2026-03-09T19:26:49.844 INFO:tasks.workunit.client.1.vm08.stdout:3/5: mkdir d0/d2 0 2026-03-09T19:26:49.844 INFO:tasks.workunit.client.1.vm08.stdout:3/6: readlink - no filename 2026-03-09T19:26:49.844 INFO:tasks.workunit.client.1.vm08.stdout:3/7: read - no filename 2026-03-09T19:26:49.845 INFO:tasks.workunit.client.1.vm08.stdout:3/8: read - no filename 2026-03-09T19:26:49.849 INFO:tasks.workunit.client.1.vm08.stdout:5/4: creat f1 x:0 0 0 2026-03-09T19:26:49.855 INFO:tasks.workunit.client.1.vm08.stdout:7/1: mknod c0 0 2026-03-09T19:26:49.855 INFO:tasks.workunit.client.1.vm08.stdout:7/2: dwrite - no filename 2026-03-09T19:26:49.855 INFO:tasks.workunit.client.1.vm08.stdout:7/3: chown c0 4469353 1 2026-03-09T19:26:49.855 INFO:tasks.workunit.client.1.vm08.stdout:7/4: dread - no filename 2026-03-09T19:26:49.861 INFO:tasks.workunit.client.1.vm08.stdout:1/11: symlink l4 0 2026-03-09T19:26:49.863 INFO:tasks.workunit.client.1.vm08.stdout:3/9: rename d0/c1 to d0/c3 0 2026-03-09T19:26:49.863 INFO:tasks.workunit.client.1.vm08.stdout:6/0: dread - no filename 2026-03-09T19:26:49.863 INFO:tasks.workunit.client.1.vm08.stdout:6/1: truncate - no filename 2026-03-09T19:26:49.863 INFO:tasks.workunit.client.1.vm08.stdout:6/2: stat - no entries 2026-03-09T19:26:49.867 INFO:tasks.workunit.client.1.vm08.stdout:5/5: creat f2 x:0 0 0 2026-03-09T19:26:49.868 INFO:tasks.workunit.client.1.vm08.stdout:8/6: creat f0 x:0 0 0 2026-03-09T19:26:49.868 
INFO:tasks.workunit.client.1.vm08.stdout:8/7: dread - f0 zero size 2026-03-09T19:26:49.869 INFO:tasks.workunit.client.1.vm08.stdout:8/8: truncate f0 322255 0 2026-03-09T19:26:49.870 INFO:tasks.workunit.client.1.vm08.stdout:7/5: mknod c1 0 2026-03-09T19:26:49.875 INFO:tasks.workunit.client.1.vm08.stdout:1/12: write f2 [894489,43876] 0 2026-03-09T19:26:49.881 INFO:tasks.workunit.client.1.vm08.stdout:2/0: write - no filename 2026-03-09T19:26:49.890 INFO:tasks.workunit.client.1.vm08.stdout:3/10: rename d0/d2 to d0/d4 0 2026-03-09T19:26:49.890 INFO:tasks.workunit.client.1.vm08.stdout:3/11: chown d0/d4 5 1 2026-03-09T19:26:49.892 INFO:tasks.workunit.client.1.vm08.stdout:5/6: mknod c3 0 2026-03-09T19:26:49.893 INFO:tasks.workunit.client.1.vm08.stdout:5/7: truncate f1 441120 0 2026-03-09T19:26:49.895 INFO:tasks.workunit.client.1.vm08.stdout:8/9: creat f1 x:0 0 0 2026-03-09T19:26:49.895 INFO:tasks.workunit.client.1.vm08.stdout:5/8: dread f1 [0,4194304] 0 2026-03-09T19:26:49.906 INFO:tasks.workunit.client.1.vm08.stdout:6/3: creat f0 x:0 0 0 2026-03-09T19:26:49.912 INFO:tasks.workunit.client.1.vm08.stdout:8/10: dwrite f1 [0,4194304] 0 2026-03-09T19:26:49.932 INFO:tasks.workunit.client.1.vm08.stdout:5/9: mknod c4 0 2026-03-09T19:26:49.935 INFO:tasks.workunit.client.1.vm08.stdout:5/10: dread f0 [0,4194304] 0 2026-03-09T19:26:49.941 INFO:tasks.workunit.client.1.vm08.stdout:2/1: symlink l0 0 2026-03-09T19:26:49.941 INFO:tasks.workunit.client.1.vm08.stdout:2/2: write - no filename 2026-03-09T19:26:49.941 INFO:tasks.workunit.client.1.vm08.stdout:2/3: write - no filename 2026-03-09T19:26:49.946 INFO:tasks.workunit.client.1.vm08.stdout:8/11: symlink l2 0 2026-03-09T19:26:49.950 INFO:tasks.workunit.client.1.vm08.stdout:8/12: dread f1 [0,4194304] 0 2026-03-09T19:26:49.953 INFO:tasks.workunit.client.1.vm08.stdout:5/11: creat f5 x:0 0 0 2026-03-09T19:26:49.953 INFO:tasks.workunit.client.1.vm08.stdout:5/12: readlink - no filename 2026-03-09T19:26:49.953 
INFO:tasks.workunit.client.1.vm08.stdout:7/6: link c0 c2 0 2026-03-09T19:26:49.953 INFO:tasks.workunit.client.1.vm08.stdout:7/7: write - no filename 2026-03-09T19:26:49.953 INFO:tasks.workunit.client.1.vm08.stdout:7/8: dwrite - no filename 2026-03-09T19:26:49.955 INFO:tasks.workunit.client.1.vm08.stdout:1/13: link l1 l5 0 2026-03-09T19:26:49.961 INFO:tasks.workunit.client.1.vm08.stdout:2/4: creat f1 x:0 0 0 2026-03-09T19:26:49.961 INFO:tasks.workunit.client.1.vm08.stdout:1/14: dread f2 [0,4194304] 0 2026-03-09T19:26:49.962 INFO:tasks.workunit.client.1.vm08.stdout:1/15: read f2 [222927,34509] 0 2026-03-09T19:26:49.970 INFO:tasks.workunit.client.1.vm08.stdout:8/13: creat f3 x:0 0 0 2026-03-09T19:26:49.978 INFO:tasks.workunit.client.1.vm08.stdout:7/9: creat f3 x:0 0 0 2026-03-09T19:26:49.981 INFO:tasks.workunit.client.1.vm08.stdout:2/5: creat f2 x:0 0 0 2026-03-09T19:26:49.993 INFO:tasks.workunit.client.1.vm08.stdout:2/6: dwrite f1 [0,4194304] 0 2026-03-09T19:26:50.002 INFO:tasks.workunit.client.1.vm08.stdout:8/14: mkdir d4 0 2026-03-09T19:26:50.011 INFO:tasks.workunit.client.1.vm08.stdout:7/10: mknod c4 0 2026-03-09T19:26:50.018 INFO:tasks.workunit.client.1.vm08.stdout:1/16: link f2 f6 0 2026-03-09T19:26:50.025 INFO:tasks.workunit.client.1.vm08.stdout:2/7: mkdir d3 0 2026-03-09T19:26:50.050 INFO:tasks.workunit.client.1.vm08.stdout:2/8: mkdir d3/d4 0 2026-03-09T19:26:50.050 INFO:tasks.workunit.client.1.vm08.stdout:1/17: dwrite f2 [0,4194304] 0 2026-03-09T19:26:50.054 INFO:tasks.workunit.client.1.vm08.stdout:2/9: read - f2 zero size 2026-03-09T19:26:50.054 INFO:tasks.workunit.client.1.vm08.stdout:2/10: dread - f2 zero size 2026-03-09T19:26:50.082 INFO:tasks.workunit.client.1.vm08.stdout:1/18: mknod c7 0 2026-03-09T19:26:50.082 INFO:tasks.workunit.client.1.vm08.stdout:2/11: mknod d3/c5 0 2026-03-09T19:26:50.083 INFO:tasks.workunit.client.1.vm08.stdout:2/12: read - f2 zero size 2026-03-09T19:26:50.088 INFO:tasks.workunit.client.1.vm08.stdout:1/19: fdatasync f6 0 
2026-03-09T19:26:50.090 INFO:tasks.workunit.client.1.vm08.stdout:1/20: dread f2 [0,4194304] 0 2026-03-09T19:26:50.092 INFO:tasks.workunit.client.1.vm08.stdout:1/21: dread f6 [0,4194304] 0 2026-03-09T19:26:50.094 INFO:tasks.workunit.client.1.vm08.stdout:2/13: creat d3/d4/f6 x:0 0 0 2026-03-09T19:26:50.094 INFO:tasks.workunit.client.1.vm08.stdout:2/14: stat l0 0 2026-03-09T19:26:50.100 INFO:tasks.workunit.client.1.vm08.stdout:1/22: unlink c7 0 2026-03-09T19:26:50.102 INFO:tasks.workunit.client.1.vm08.stdout:1/23: dread f2 [0,4194304] 0 2026-03-09T19:26:50.110 INFO:tasks.workunit.client.1.vm08.stdout:2/15: creat d3/f7 x:0 0 0 2026-03-09T19:26:50.114 INFO:tasks.workunit.client.1.vm08.stdout:1/24: rename l1 to l8 0 2026-03-09T19:26:50.118 INFO:tasks.workunit.client.1.vm08.stdout:1/25: dread f2 [0,4194304] 0 2026-03-09T19:26:50.123 INFO:tasks.workunit.client.1.vm08.stdout:2/16: link d3/d4/f6 d3/d4/f8 0 2026-03-09T19:26:50.127 INFO:tasks.workunit.client.1.vm08.stdout:2/17: mkdir d3/d9 0 2026-03-09T19:26:50.130 INFO:tasks.workunit.client.1.vm08.stdout:1/26: chown l5 1100 1 2026-03-09T19:26:50.133 INFO:tasks.workunit.client.1.vm08.stdout:1/27: dwrite f6 [0,4194304] 0 2026-03-09T19:26:50.151 INFO:tasks.workunit.client.1.vm08.stdout:2/18: dwrite d3/d4/f8 [0,4194304] 0 2026-03-09T19:26:50.165 INFO:tasks.workunit.client.1.vm08.stdout:2/19: creat d3/d9/fa x:0 0 0 2026-03-09T19:26:50.171 INFO:tasks.workunit.client.1.vm08.stdout:2/20: readlink l0 0 2026-03-09T19:26:50.171 INFO:tasks.workunit.client.1.vm08.stdout:2/21: chown l0 1 1 2026-03-09T19:26:50.377 INFO:tasks.workunit.client.1.vm08.stdout:0/3: fsync f0 0 2026-03-09T19:26:50.383 INFO:tasks.workunit.client.1.vm08.stdout:4/4: getdents . 
0 2026-03-09T19:26:50.383 INFO:tasks.workunit.client.1.vm08.stdout:4/5: readlink l0 0 2026-03-09T19:26:50.396 INFO:tasks.workunit.client.1.vm08.stdout:3/12: rename d0/d4 to d0/d5 0 2026-03-09T19:26:50.396 INFO:tasks.workunit.client.1.vm08.stdout:3/13: dwrite - no filename 2026-03-09T19:26:50.397 INFO:tasks.workunit.client.1.vm08.stdout:3/14: chown d0/c3 1822 1 2026-03-09T19:26:50.399 INFO:tasks.workunit.client.1.vm08.stdout:6/4: getdents . 0 2026-03-09T19:26:50.406 INFO:tasks.workunit.client.1.vm08.stdout:8/15: dwrite f1 [4194304,4194304] 0 2026-03-09T19:26:50.407 INFO:tasks.workunit.client.1.vm08.stdout:8/16: read f1 [3750682,89950] 0 2026-03-09T19:26:50.408 INFO:tasks.workunit.client.1.vm08.stdout:6/5: link f0 f1 0 2026-03-09T19:26:50.411 INFO:tasks.workunit.client.1.vm08.stdout:2/22: fdatasync d3/d4/f8 0 2026-03-09T19:26:50.428 INFO:tasks.workunit.client.1.vm08.stdout:6/6: mknod c2 0 2026-03-09T19:26:50.428 INFO:tasks.workunit.client.1.vm08.stdout:2/23: creat d3/d4/fb x:0 0 0 2026-03-09T19:26:50.428 INFO:tasks.workunit.client.1.vm08.stdout:6/7: mkdir d3 0 2026-03-09T19:26:50.428 INFO:tasks.workunit.client.1.vm08.stdout:6/8: read - f0 zero size 2026-03-09T19:26:50.428 INFO:tasks.workunit.client.1.vm08.stdout:6/9: dread - f0 zero size 2026-03-09T19:26:50.428 INFO:tasks.workunit.client.1.vm08.stdout:6/10: rename d3 to d3/d4 22 2026-03-09T19:26:50.428 INFO:tasks.workunit.client.1.vm08.stdout:2/24: mkdir d3/d9/dc 0 2026-03-09T19:26:50.428 INFO:tasks.workunit.client.1.vm08.stdout:2/25: write d3/d4/f6 [3914443,112329] 0 2026-03-09T19:26:50.428 INFO:tasks.workunit.client.1.vm08.stdout:2/26: chown d3/d9/dc 211 1 2026-03-09T19:26:50.428 INFO:tasks.workunit.client.1.vm08.stdout:8/17: rmdir d4 0 2026-03-09T19:26:50.428 INFO:tasks.workunit.client.1.vm08.stdout:8/18: rmdir - no directory 2026-03-09T19:26:50.432 INFO:tasks.workunit.client.1.vm08.stdout:6/11: dwrite f1 [0,4194304] 0 2026-03-09T19:26:50.433 INFO:tasks.workunit.client.1.vm08.stdout:2/27: creat d3/d4/fd x:0 0 0 
2026-03-09T19:26:50.433 INFO:tasks.workunit.client.1.vm08.stdout:8/19: symlink l5 0 2026-03-09T19:26:50.435 INFO:tasks.workunit.client.1.vm08.stdout:8/20: dread f0 [0,4194304] 0 2026-03-09T19:26:50.436 INFO:tasks.workunit.client.1.vm08.stdout:5/13: truncate f0 841305 0 2026-03-09T19:26:50.444 INFO:tasks.workunit.client.1.vm08.stdout:2/28: mkdir d3/d9/dc/de 0 2026-03-09T19:26:50.447 INFO:tasks.workunit.client.1.vm08.stdout:2/29: read - d3/d9/fa zero size 2026-03-09T19:26:50.447 INFO:tasks.workunit.client.1.vm08.stdout:8/21: write f0 [158764,23948] 0 2026-03-09T19:26:50.447 INFO:tasks.workunit.client.1.vm08.stdout:5/14: creat f6 x:0 0 0 2026-03-09T19:26:50.450 INFO:tasks.workunit.client.1.vm08.stdout:6/12: link f0 d3/f5 0 2026-03-09T19:26:50.452 INFO:tasks.workunit.client.1.vm08.stdout:6/13: dread f0 [0,4194304] 0 2026-03-09T19:26:50.460 INFO:tasks.workunit.client.1.vm08.stdout:6/14: dwrite f1 [0,4194304] 0 2026-03-09T19:26:50.463 INFO:tasks.workunit.client.1.vm08.stdout:6/15: dread f0 [0,4194304] 0 2026-03-09T19:26:50.467 INFO:tasks.workunit.client.1.vm08.stdout:6/16: dread f0 [0,4194304] 0 2026-03-09T19:26:50.476 INFO:tasks.workunit.client.1.vm08.stdout:1/28: truncate f6 3450945 0 2026-03-09T19:26:50.756 INFO:tasks.workunit.client.1.vm08.stdout:3/15: rename d0/d5 to d0/d6 0 2026-03-09T19:26:50.757 INFO:tasks.workunit.client.1.vm08.stdout:3/16: rename d0/d6 to d0/d6/d7 22 2026-03-09T19:26:50.757 INFO:tasks.workunit.client.1.vm08.stdout:3/17: chown d0/d6 14 1 2026-03-09T19:26:50.759 INFO:tasks.workunit.client.1.vm08.stdout:3/18: unlink d0/c3 0 2026-03-09T19:26:50.759 INFO:tasks.workunit.client.1.vm08.stdout:3/19: stat d0 0 2026-03-09T19:26:50.759 INFO:tasks.workunit.client.1.vm08.stdout:3/20: truncate - no filename 2026-03-09T19:26:50.759 INFO:tasks.workunit.client.1.vm08.stdout:3/21: chown d0/d6 603169 1 2026-03-09T19:26:50.759 INFO:tasks.workunit.client.1.vm08.stdout:3/22: chown d0 11716819 1 2026-03-09T19:26:50.759 INFO:tasks.workunit.client.1.vm08.stdout:3/23: 
dread - no filename 2026-03-09T19:26:50.759 INFO:tasks.workunit.client.1.vm08.stdout:3/24: write - no filename 2026-03-09T19:26:50.759 INFO:tasks.workunit.client.1.vm08.stdout:3/25: dread - no filename 2026-03-09T19:26:50.761 INFO:tasks.workunit.client.1.vm08.stdout:3/26: mkdir d0/d8 0 2026-03-09T19:26:50.761 INFO:tasks.workunit.client.1.vm08.stdout:3/27: fdatasync - no filename 2026-03-09T19:26:50.761 INFO:tasks.workunit.client.1.vm08.stdout:3/28: dread - no filename 2026-03-09T19:26:50.762 INFO:tasks.workunit.client.1.vm08.stdout:3/29: mkdir d0/d6/d9 0 2026-03-09T19:26:50.762 INFO:tasks.workunit.client.1.vm08.stdout:3/30: write - no filename 2026-03-09T19:26:50.762 INFO:tasks.workunit.client.1.vm08.stdout:3/31: dwrite - no filename 2026-03-09T19:26:50.762 INFO:tasks.workunit.client.1.vm08.stdout:3/32: link - no file 2026-03-09T19:26:50.765 INFO:tasks.workunit.client.1.vm08.stdout:3/33: symlink d0/d6/d9/la 0 2026-03-09T19:26:50.766 INFO:tasks.workunit.client.1.vm08.stdout:3/34: mkdir d0/d8/db 0 2026-03-09T19:26:50.766 INFO:tasks.workunit.client.1.vm08.stdout:3/35: dwrite - no filename 2026-03-09T19:26:50.766 INFO:tasks.workunit.client.1.vm08.stdout:3/36: dread - no filename 2026-03-09T19:26:50.768 INFO:tasks.workunit.client.1.vm08.stdout:3/37: creat d0/d6/d9/fc x:0 0 0 2026-03-09T19:26:50.769 INFO:tasks.workunit.client.1.vm08.stdout:3/38: creat d0/d8/db/fd x:0 0 0 2026-03-09T19:26:50.773 INFO:tasks.workunit.client.1.vm08.stdout:3/39: dwrite d0/d8/db/fd [0,4194304] 0 2026-03-09T19:26:50.775 INFO:tasks.workunit.client.1.vm08.stdout:3/40: readlink d0/d6/d9/la 0 2026-03-09T19:26:50.786 INFO:tasks.workunit.client.1.vm08.stdout:6/17: fdatasync f0 0 2026-03-09T19:26:50.786 INFO:tasks.workunit.client.1.vm08.stdout:8/22: getdents . 
0 2026-03-09T19:26:50.786 INFO:tasks.workunit.client.1.vm08.stdout:3/41: mkdir d0/d6/de 0 2026-03-09T19:26:50.786 INFO:tasks.workunit.client.1.vm08.stdout:3/42: dread - d0/d6/d9/fc zero size 2026-03-09T19:26:50.794 INFO:tasks.workunit.client.1.vm08.stdout:8/23: creat f6 x:0 0 0 2026-03-09T19:26:50.815 INFO:tasks.workunit.client.1.vm08.stdout:3/43: mknod d0/cf 0 2026-03-09T19:26:50.815 INFO:tasks.workunit.client.1.vm08.stdout:8/24: rename f3 to f7 0 2026-03-09T19:26:50.815 INFO:tasks.workunit.client.1.vm08.stdout:8/25: mknod c8 0 2026-03-09T19:26:50.815 INFO:tasks.workunit.client.1.vm08.stdout:8/26: read - f7 zero size 2026-03-09T19:26:50.815 INFO:tasks.workunit.client.1.vm08.stdout:3/44: mknod d0/d6/de/c10 0 2026-03-09T19:26:50.815 INFO:tasks.workunit.client.1.vm08.stdout:3/45: stat d0/d6/d9 0 2026-03-09T19:26:50.815 INFO:tasks.workunit.client.1.vm08.stdout:8/27: dwrite f7 [0,4194304] 0 2026-03-09T19:26:50.936 INFO:tasks.workunit.client.1.vm08.stdout:6/18: truncate f1 2872429 0 2026-03-09T19:26:50.940 INFO:tasks.workunit.client.1.vm08.stdout:1/29: write f2 [277112,20062] 0 2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/30: chown f2 698 1 2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/31: dwrite f6 [0,4194304] 0 2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/32: write f6 [1667327,86776] 0 2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/33: chown l4 78798914 1 2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/34: mkdir d9 0 2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/35: mkdir d9/da 0 2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/36: mkdir d9/da/db 0 2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/37: mkdir d9/da/dc 0 2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/38: symlink d9/da/dc/ld 0 2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/39: dwrite f2 [0,4194304] 0 
2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/40: dread f2 [0,4194304] 0 2026-03-09T19:26:50.961 INFO:tasks.workunit.client.1.vm08.stdout:1/41: creat d9/da/db/fe x:0 0 0 2026-03-09T19:26:50.970 INFO:tasks.workunit.client.1.vm08.stdout:1/42: creat d9/da/dc/ff x:0 0 0 2026-03-09T19:26:50.976 INFO:tasks.workunit.client.1.vm08.stdout:1/43: write f6 [1162069,48849] 0 2026-03-09T19:26:50.982 INFO:tasks.workunit.client.1.vm08.stdout:1/44: dwrite d9/da/db/fe [0,4194304] 0 2026-03-09T19:26:50.990 INFO:tasks.workunit.client.1.vm08.stdout:1/45: dwrite f6 [0,4194304] 0 2026-03-09T19:26:50.992 INFO:tasks.workunit.client.1.vm08.stdout:1/46: write d9/da/dc/ff [77328,52760] 0 2026-03-09T19:26:51.004 INFO:tasks.workunit.client.1.vm08.stdout:1/47: creat d9/da/dc/f10 x:0 0 0 2026-03-09T19:26:51.006 INFO:tasks.workunit.client.1.vm08.stdout:1/48: mkdir d9/d11 0 2026-03-09T19:26:51.018 INFO:tasks.workunit.client.1.vm08.stdout:1/49: dread - d9/da/dc/f10 zero size 2026-03-09T19:26:51.021 INFO:tasks.workunit.client.1.vm08.stdout:7/11: sync 2026-03-09T19:26:51.021 INFO:tasks.workunit.client.1.vm08.stdout:9/0: sync 2026-03-09T19:26:51.021 INFO:tasks.workunit.client.1.vm08.stdout:9/1: dwrite - no filename 2026-03-09T19:26:51.021 INFO:tasks.workunit.client.1.vm08.stdout:7/12: write f3 [940446,34579] 0 2026-03-09T19:26:51.022 INFO:tasks.workunit.client.1.vm08.stdout:7/13: stat f3 0 2026-03-09T19:26:51.022 INFO:tasks.workunit.client.1.vm08.stdout:7/14: write f3 [1172391,37624] 0 2026-03-09T19:26:51.043 INFO:tasks.workunit.client.1.vm08.stdout:7/15: mkdir d5 0 2026-03-09T19:26:51.045 INFO:tasks.workunit.client.1.vm08.stdout:9/2: mkdir d0 0 2026-03-09T19:26:51.046 INFO:tasks.workunit.client.1.vm08.stdout:9/3: rename d0 to d0/d1 22 2026-03-09T19:26:51.046 INFO:tasks.workunit.client.1.vm08.stdout:9/4: write - no filename 2026-03-09T19:26:51.051 INFO:tasks.workunit.client.1.vm08.stdout:9/5: mkdir d0/d2 0 2026-03-09T19:26:51.051 INFO:tasks.workunit.client.1.vm08.stdout:9/6: dread 
- no filename 2026-03-09T19:26:51.051 INFO:tasks.workunit.client.1.vm08.stdout:9/7: fdatasync - no filename 2026-03-09T19:26:51.051 INFO:tasks.workunit.client.1.vm08.stdout:9/8: fdatasync - no filename 2026-03-09T19:26:51.054 INFO:tasks.workunit.client.1.vm08.stdout:7/16: link c0 d5/c6 0 2026-03-09T19:26:51.058 INFO:tasks.workunit.client.1.vm08.stdout:7/17: rmdir d5 39 2026-03-09T19:26:51.114 INFO:tasks.workunit.client.1.vm08.stdout:7/18: dread f3 [0,4194304] 0 2026-03-09T19:26:51.114 INFO:tasks.workunit.client.1.vm08.stdout:7/19: readlink - no filename 2026-03-09T19:26:51.121 INFO:tasks.workunit.client.1.vm08.stdout:7/20: unlink c0 0 2026-03-09T19:26:51.125 INFO:tasks.workunit.client.1.vm08.stdout:7/21: rename f3 to d5/f7 0 2026-03-09T19:26:51.133 INFO:tasks.workunit.client.1.vm08.stdout:7/22: mknod d5/c8 0 2026-03-09T19:26:51.133 INFO:tasks.workunit.client.1.vm08.stdout:7/23: dread d5/f7 [0,4194304] 0 2026-03-09T19:26:51.141 INFO:tasks.workunit.client.1.vm08.stdout:7/24: dwrite d5/f7 [0,4194304] 0 2026-03-09T19:26:51.154 INFO:tasks.workunit.client.1.vm08.stdout:7/25: creat d5/f9 x:0 0 0 2026-03-09T19:26:51.160 INFO:tasks.workunit.client.1.vm08.stdout:7/26: link d5/f7 d5/fa 0 2026-03-09T19:26:51.161 INFO:tasks.workunit.client.1.vm08.stdout:2/30: sync 2026-03-09T19:26:51.161 INFO:tasks.workunit.client.1.vm08.stdout:4/6: sync 2026-03-09T19:26:51.161 INFO:tasks.workunit.client.1.vm08.stdout:4/7: write - no filename 2026-03-09T19:26:51.161 INFO:tasks.workunit.client.1.vm08.stdout:4/8: read - no filename 2026-03-09T19:26:51.161 INFO:tasks.workunit.client.1.vm08.stdout:4/9: write - no filename 2026-03-09T19:26:51.161 INFO:tasks.workunit.client.1.vm08.stdout:0/4: sync 2026-03-09T19:26:51.162 INFO:tasks.workunit.client.1.vm08.stdout:0/5: truncate f0 648993 0 2026-03-09T19:26:51.168 INFO:tasks.workunit.client.1.vm08.stdout:7/27: creat d5/fb x:0 0 0 2026-03-09T19:26:51.172 INFO:tasks.workunit.client.1.vm08.stdout:4/10: creat f1 x:0 0 0 2026-03-09T19:26:51.172 
INFO:tasks.workunit.client.1.vm08.stdout:4/11: write f1 [900610,104212] 0 2026-03-09T19:26:51.172 INFO:tasks.workunit.client.1.vm08.stdout:4/12: chown l0 2393 1 2026-03-09T19:26:51.174 INFO:tasks.workunit.client.1.vm08.stdout:0/6: rename f0 to f1 0 2026-03-09T19:26:51.181 INFO:tasks.workunit.client.1.vm08.stdout:4/13: dwrite f1 [0,4194304] 0 2026-03-09T19:26:51.202 INFO:tasks.workunit.client.1.vm08.stdout:0/7: write f1 [308966,66781] 0 2026-03-09T19:26:51.206 INFO:tasks.workunit.client.1.vm08.stdout:7/28: chown c2 841 1 2026-03-09T19:26:51.226 INFO:tasks.workunit.client.1.vm08.stdout:0/8: write f1 [369884,117658] 0 2026-03-09T19:26:51.226 INFO:tasks.workunit.client.1.vm08.stdout:7/29: link d5/f9 d5/fc 0 2026-03-09T19:26:51.226 INFO:tasks.workunit.client.1.vm08.stdout:7/30: creat d5/fd x:0 0 0 2026-03-09T19:26:51.226 INFO:tasks.workunit.client.1.vm08.stdout:7/31: chown c1 64468 1 2026-03-09T19:26:51.226 INFO:tasks.workunit.client.1.vm08.stdout:7/32: mknod d5/ce 0 2026-03-09T19:26:51.226 INFO:tasks.workunit.client.1.vm08.stdout:7/33: write d5/fd [789678,85738] 0 2026-03-09T19:26:51.230 INFO:tasks.workunit.client.1.vm08.stdout:7/34: mknod d5/cf 0 2026-03-09T19:26:51.231 INFO:tasks.workunit.client.1.vm08.stdout:7/35: write d5/f7 [64630,17972] 0 2026-03-09T19:26:51.231 INFO:tasks.workunit.client.1.vm08.stdout:7/36: truncate d5/fb 936624 0 2026-03-09T19:26:51.235 INFO:tasks.workunit.client.1.vm08.stdout:7/37: dwrite d5/fb [0,4194304] 0 2026-03-09T19:26:51.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.239+0000 7febe8a09640 1 -- 192.168.123.107:0/435263112 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7febd4038470 con 0x7febe40719a0 2026-03-09T19:26:51.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.240+0000 7febeaa0d640 1 -- 192.168.123.107:0/435263112 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febe40719a0 msgr2=0x7febe4071da0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:51.240 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.240+0000 7febeaa0d640 1 --2- 192.168.123.107:0/435263112 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febe40719a0 0x7febe4071da0 secure :-1 s=READY pgs=326 cs=0 l=1 rev1=1 crypto rx=0x7febd40099b0 tx=0x7febd402f240 comp rx=0 tx=0).stop 2026-03-09T19:26:51.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.243+0000 7febeaa0d640 1 -- 192.168.123.107:0/435263112 shutdown_connections 2026-03-09T19:26:51.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.244+0000 7febeaa0d640 1 --2- 192.168.123.107:0/435263112 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7febe4072370 0x7febe410c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.244+0000 7febeaa0d640 1 --2- 192.168.123.107:0/435263112 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febe40719a0 0x7febe4071da0 unknown :-1 s=CLOSED pgs=326 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.244+0000 7febeaa0d640 1 -- 192.168.123.107:0/435263112 >> 192.168.123.107:0/435263112 conn(0x7febe406d4f0 msgr2=0x7febe406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:51.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.245+0000 7febeaa0d640 1 -- 192.168.123.107:0/435263112 shutdown_connections 2026-03-09T19:26:51.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.245+0000 7febeaa0d640 1 -- 192.168.123.107:0/435263112 wait complete. 
2026-03-09T19:26:51.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.245+0000 7febeaa0d640 1 Processor -- start 2026-03-09T19:26:51.244 INFO:tasks.workunit.client.1.vm08.stdout:7/38: stat c2 0 2026-03-09T19:26:51.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.245+0000 7febeaa0d640 1 -- start start 2026-03-09T19:26:51.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febeaa0d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7febe40719a0 0x7febe4115970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:51.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febeaa0d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febe4072370 0x7febe4115eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:51.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febeaa0d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7febe41173b0 con 0x7febe4072370 2026-03-09T19:26:51.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febeaa0d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7febe4117520 con 0x7febe40719a0 2026-03-09T19:26:51.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febe920a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febe4072370 0x7febe4115eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:51.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febe920a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febe4072370 0x7febe4115eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:35984/0 (socket says 192.168.123.107:35984) 2026-03-09T19:26:51.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febe920a640 1 -- 192.168.123.107:0/2586313821 learned_addr learned my addr 192.168.123.107:0/2586313821 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:51.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febe920a640 1 -- 192.168.123.107:0/2586313821 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7febe40719a0 msgr2=0x7febe4115970 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:26:51.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febe920a640 1 --2- 192.168.123.107:0/2586313821 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7febe40719a0 0x7febe4115970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febe920a640 1 -- 192.168.123.107:0/2586313821 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7febd4009660 con 0x7febe4072370 2026-03-09T19:26:51.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febe920a640 1 --2- 192.168.123.107:0/2586313821 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febe4072370 0x7febe4115eb0 secure :-1 s=READY pgs=327 cs=0 l=1 rev1=1 crypto rx=0x7febd800d8d0 tx=0x7febd800dda0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:51.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.246+0000 7febd2ffd640 1 -- 192.168.123.107:0/2586313821 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7febd8004490 con 0x7febe4072370 2026-03-09T19:26:51.246 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.247+0000 7febd2ffd640 1 -- 192.168.123.107:0/2586313821 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7febd800bd00 con 0x7febe4072370 2026-03-09T19:26:51.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.247+0000 7febd2ffd640 1 -- 192.168.123.107:0/2586313821 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7febd8010460 con 0x7febe4072370 2026-03-09T19:26:51.257 INFO:tasks.workunit.client.1.vm08.stdout:7/39: dwrite d5/f9 [0,4194304] 0 2026-03-09T19:26:51.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.247+0000 7febeaa0d640 1 -- 192.168.123.107:0/2586313821 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7febe4116510 con 0x7febe4072370 2026-03-09T19:26:51.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.247+0000 7febeaa0d640 1 -- 192.168.123.107:0/2586313821 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7febe41b58d0 con 0x7febe4072370 2026-03-09T19:26:51.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.248+0000 7febeaa0d640 1 -- 192.168.123.107:0/2586313821 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7febb8005350 con 0x7febe4072370 2026-03-09T19:26:51.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.249+0000 7febd2ffd640 1 -- 192.168.123.107:0/2586313821 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7febd80027e0 con 0x7febe4072370 2026-03-09T19:26:51.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.250+0000 7febd2ffd640 1 --2- 192.168.123.107:0/2586313821 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7febc40761f0 0x7febc40786b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:51.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.250+0000 7febd2ffd640 1 -- 192.168.123.107:0/2586313821 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7febd8097e90 con 0x7febe4072370 2026-03-09T19:26:51.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.252+0000 7febe9a0b640 1 --2- 192.168.123.107:0/2586313821 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7febc40761f0 0x7febc40786b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:51.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.252+0000 7febd2ffd640 1 -- 192.168.123.107:0/2586313821 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7febd8061450 con 0x7febe4072370 2026-03-09T19:26:51.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.255+0000 7febe9a0b640 1 --2- 192.168.123.107:0/2586313821 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7febc40761f0 0x7febc40786b0 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7febd402f750 tx=0x7febd40023d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:51.258 INFO:tasks.workunit.client.1.vm08.stdout:7/40: dwrite d5/f7 [0,4194304] 0 2026-03-09T19:26:51.263 INFO:tasks.workunit.client.1.vm08.stdout:7/41: write d5/fd [1756929,56712] 0 2026-03-09T19:26:51.314 INFO:tasks.workunit.client.1.vm08.stdout:3/46: rmdir d0/d8/db 39 2026-03-09T19:26:51.314 INFO:tasks.workunit.client.1.vm08.stdout:3/47: chown d0/d6/d9/fc 11186894 1 2026-03-09T19:26:51.315 INFO:tasks.workunit.client.1.vm08.stdout:3/48: dread - d0/d6/d9/fc zero size 2026-03-09T19:26:51.317 
INFO:tasks.workunit.client.1.vm08.stdout:5/15: write f0 [240403,114251] 0 2026-03-09T19:26:51.318 INFO:tasks.workunit.client.1.vm08.stdout:4/14: sync 2026-03-09T19:26:51.323 INFO:tasks.workunit.client.1.vm08.stdout:4/15: dwrite f1 [0,4194304] 0 2026-03-09T19:26:51.335 INFO:tasks.workunit.client.1.vm08.stdout:3/49: dread d0/d8/db/fd [0,4194304] 0 2026-03-09T19:26:51.336 INFO:tasks.workunit.client.1.vm08.stdout:3/50: write d0/d6/d9/fc [982943,49558] 0 2026-03-09T19:26:51.336 INFO:tasks.workunit.client.1.vm08.stdout:3/51: chown d0/d6/de/c10 6548 1 2026-03-09T19:26:51.337 INFO:tasks.workunit.client.1.vm08.stdout:3/52: readlink d0/d6/d9/la 0 2026-03-09T19:26:51.338 INFO:tasks.workunit.client.1.vm08.stdout:3/53: write d0/d6/d9/fc [94522,89828] 0 2026-03-09T19:26:51.340 INFO:tasks.workunit.client.1.vm08.stdout:5/16: fdatasync f1 0 2026-03-09T19:26:51.341 INFO:tasks.workunit.client.1.vm08.stdout:8/28: getdents . 0 2026-03-09T19:26:51.341 INFO:tasks.workunit.client.1.vm08.stdout:8/29: fsync f6 0 2026-03-09T19:26:51.341 INFO:tasks.workunit.client.1.vm08.stdout:8/30: write f0 [16938,97656] 0 2026-03-09T19:26:51.347 INFO:tasks.workunit.client.1.vm08.stdout:4/16: creat f2 x:0 0 0 2026-03-09T19:26:51.352 INFO:tasks.workunit.client.1.vm08.stdout:4/17: dwrite f2 [0,4194304] 0 2026-03-09T19:26:51.354 INFO:tasks.workunit.client.1.vm08.stdout:3/54: rmdir d0 39 2026-03-09T19:26:51.367 INFO:tasks.workunit.client.1.vm08.stdout:5/17: mknod c7 0 2026-03-09T19:26:51.367 INFO:tasks.workunit.client.1.vm08.stdout:5/18: chown f1 95622801 1 2026-03-09T19:26:51.371 INFO:tasks.workunit.client.1.vm08.stdout:8/31: creat f9 x:0 0 0 2026-03-09T19:26:51.375 INFO:tasks.workunit.client.1.vm08.stdout:4/18: symlink l3 0 2026-03-09T19:26:51.379 INFO:tasks.workunit.client.1.vm08.stdout:8/32: write f1 [7137265,8420] 0 2026-03-09T19:26:51.385 INFO:tasks.workunit.client.1.vm08.stdout:8/33: chown f1 1179 1 2026-03-09T19:26:51.385 INFO:tasks.workunit.client.1.vm08.stdout:3/55: mkdir d0/d8/db/d11 0 
2026-03-09T19:26:51.385 INFO:tasks.workunit.client.1.vm08.stdout:3/56: chown d0/cf 686 1 2026-03-09T19:26:51.385 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.386+0000 7febeaa0d640 1 -- 192.168.123.107:0/2586313821 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7febb8002bf0 con 0x7febc40761f0 2026-03-09T19:26:51.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.387+0000 7febd2ffd640 1 -- 192.168.123.107:0/2586313821 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7febb8002bf0 con 0x7febc40761f0 2026-03-09T19:26:51.390 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.391+0000 7febeaa0d640 1 -- 192.168.123.107:0/2586313821 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7febc40761f0 msgr2=0x7febc40786b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:51.390 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.391+0000 7febeaa0d640 1 --2- 192.168.123.107:0/2586313821 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7febc40761f0 0x7febc40786b0 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7febd402f750 tx=0x7febd40023d0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.390 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.391+0000 7febeaa0d640 1 -- 192.168.123.107:0/2586313821 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febe4072370 msgr2=0x7febe4115eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:51.390 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.391+0000 7febeaa0d640 1 --2- 192.168.123.107:0/2586313821 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febe4072370 0x7febe4115eb0 secure :-1 s=READY pgs=327 cs=0 l=1 rev1=1 crypto rx=0x7febd800d8d0 
tx=0x7febd800dda0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.390 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.391+0000 7febeaa0d640 1 -- 192.168.123.107:0/2586313821 shutdown_connections 2026-03-09T19:26:51.390 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.391+0000 7febeaa0d640 1 --2- 192.168.123.107:0/2586313821 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7febc40761f0 0x7febc40786b0 secure :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7febd402f750 tx=0x7febd40023d0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.390 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.391+0000 7febeaa0d640 1 --2- 192.168.123.107:0/2586313821 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febe4072370 0x7febe4115eb0 unknown :-1 s=CLOSED pgs=327 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.391+0000 7febeaa0d640 1 --2- 192.168.123.107:0/2586313821 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7febe40719a0 0x7febe4115970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.391+0000 7febeaa0d640 1 -- 192.168.123.107:0/2586313821 >> 192.168.123.107:0/2586313821 conn(0x7febe406d4f0 msgr2=0x7febe40706f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:51.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.392+0000 7febeaa0d640 1 -- 192.168.123.107:0/2586313821 shutdown_connections 2026-03-09T19:26:51.392 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.393+0000 7febeaa0d640 1 -- 192.168.123.107:0/2586313821 wait complete. 
2026-03-09T19:26:51.399 INFO:tasks.workunit.client.1.vm08.stdout:8/34: unlink f9 0 2026-03-09T19:26:51.399 INFO:tasks.workunit.client.1.vm08.stdout:8/35: write f1 [2532754,30798] 0 2026-03-09T19:26:51.400 INFO:tasks.workunit.client.1.vm08.stdout:8/36: chown c8 12 1 2026-03-09T19:26:51.400 INFO:tasks.workunit.client.1.vm08.stdout:8/37: chown l5 214155016 1 2026-03-09T19:26:51.404 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:26:51.414 INFO:tasks.workunit.client.1.vm08.stdout:3/57: dwrite d0/d8/db/fd [0,4194304] 0 2026-03-09T19:26:51.429 INFO:tasks.workunit.client.1.vm08.stdout:6/19: write d3/f5 [802996,50547] 0 2026-03-09T19:26:51.437 INFO:tasks.workunit.client.1.vm08.stdout:1/50: rename d9/da/db to d9/da/d12 0 2026-03-09T19:26:51.442 INFO:tasks.workunit.client.1.vm08.stdout:6/20: creat d3/f6 x:0 0 0 2026-03-09T19:26:51.443 INFO:tasks.workunit.client.1.vm08.stdout:8/38: link c8 ca 0 2026-03-09T19:26:51.445 INFO:tasks.workunit.client.1.vm08.stdout:1/51: symlink d9/da/l13 0 2026-03-09T19:26:51.453 INFO:tasks.workunit.client.1.vm08.stdout:8/39: dwrite f7 [0,4194304] 0 2026-03-09T19:26:51.456 INFO:tasks.workunit.client.1.vm08.stdout:3/58: rmdir d0/d8/db/d11 0 2026-03-09T19:26:51.457 INFO:tasks.workunit.client.1.vm08.stdout:3/59: rename d0 to d0/d6/de/d12 22 2026-03-09T19:26:51.459 INFO:tasks.workunit.client.1.vm08.stdout:9/9: getdents d0 0 2026-03-09T19:26:51.465 INFO:tasks.workunit.client.1.vm08.stdout:3/60: dwrite d0/d6/d9/fc [0,4194304] 0 2026-03-09T19:26:51.475 INFO:tasks.workunit.client.1.vm08.stdout:1/52: creat d9/da/d12/f14 x:0 0 0 2026-03-09T19:26:51.477 INFO:tasks.workunit.client.1.vm08.stdout:6/21: fdatasync d3/f5 0 2026-03-09T19:26:51.477 INFO:tasks.workunit.client.1.vm08.stdout:6/22: chown f1 70965751 1 2026-03-09T19:26:51.480 INFO:tasks.workunit.client.1.vm08.stdout:3/61: rmdir d0/d8/db 39 2026-03-09T19:26:51.481 INFO:tasks.workunit.client.1.vm08.stdout:1/53: symlink d9/l15 0 2026-03-09T19:26:51.483 
INFO:tasks.workunit.client.1.vm08.stdout:8/40: chown c8 4942 1 2026-03-09T19:26:51.484 INFO:tasks.workunit.client.1.vm08.stdout:6/23: rename f0 to d3/f7 0 2026-03-09T19:26:51.485 INFO:tasks.workunit.client.1.vm08.stdout:6/24: write d3/f6 [913800,86524] 0 2026-03-09T19:26:51.487 INFO:tasks.workunit.client.1.vm08.stdout:3/62: mknod d0/d6/de/c13 0 2026-03-09T19:26:51.495 INFO:tasks.workunit.client.1.vm08.stdout:3/63: readlink d0/d6/d9/la 0 2026-03-09T19:26:51.495 INFO:tasks.workunit.client.1.vm08.stdout:9/10: getdents d0 0 2026-03-09T19:26:51.497 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:51 vm07.local ceph-mon[48545]: pgmap v149: 65 pgs: 65 active+clean; 185 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 4.9 MiB/s wr, 436 op/s 2026-03-09T19:26:51.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.498+0000 7f6cdb47b640 1 -- 192.168.123.107:0/2455968253 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ccc0a4830 msgr2=0x7f6ccc0a4c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:51.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.498+0000 7f6cdb47b640 1 --2- 192.168.123.107:0/2455968253 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ccc0a4830 0x7f6ccc0a4c30 secure :-1 s=READY pgs=328 cs=0 l=1 rev1=1 crypto rx=0x7f6cc40099b0 tx=0x7f6cc402f240 comp rx=0 tx=0).stop 2026-03-09T19:26:51.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.501+0000 7f6cdb47b640 1 -- 192.168.123.107:0/2455968253 shutdown_connections 2026-03-09T19:26:51.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.501+0000 7f6cdb47b640 1 --2- 192.168.123.107:0/2455968253 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ccc0a5920 0x7f6ccc0a5d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.501+0000 7f6cdb47b640 1 --2- 
192.168.123.107:0/2455968253 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ccc0a4830 0x7f6ccc0a4c30 unknown :-1 s=CLOSED pgs=328 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.501+0000 7f6cdb47b640 1 -- 192.168.123.107:0/2455968253 >> 192.168.123.107:0/2455968253 conn(0x7f6ccc09fe80 msgr2=0x7f6ccc0a22e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:51.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.503+0000 7f6cdb47b640 1 -- 192.168.123.107:0/2455968253 shutdown_connections 2026-03-09T19:26:51.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.503+0000 7f6cdb47b640 1 -- 192.168.123.107:0/2455968253 wait complete. 2026-03-09T19:26:51.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.504+0000 7f6cdb47b640 1 Processor -- start 2026-03-09T19:26:51.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.504+0000 7f6cdb47b640 1 -- start start 2026-03-09T19:26:51.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.504+0000 7f6cdb47b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ccc0a4830 0x7f6ccc0b32b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:51.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.504+0000 7f6cdb47b640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ccc0a5920 0x7f6ccc0b37f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:51.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.504+0000 7f6cdb47b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ccc0b4cf0 con 0x7f6ccc0a4830 2026-03-09T19:26:51.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.504+0000 7f6cdb47b640 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ccc0b4e60 con 0x7f6ccc0a5920 2026-03-09T19:26:51.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.505+0000 7f6cda479640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ccc0a4830 0x7f6ccc0b32b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:51.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.505+0000 7f6cd9c78640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ccc0a5920 0x7f6ccc0b37f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:51.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.505+0000 7f6cd9c78640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ccc0a5920 0x7f6ccc0b37f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:46696/0 (socket says 192.168.123.107:46696) 2026-03-09T19:26:51.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.505+0000 7f6cd9c78640 1 -- 192.168.123.107:0/3811263896 learned_addr learned my addr 192.168.123.107:0/3811263896 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:51.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.505+0000 7f6cda479640 1 -- 192.168.123.107:0/3811263896 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ccc0a5920 msgr2=0x7f6ccc0b37f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:51.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.505+0000 7f6cda479640 1 --2- 192.168.123.107:0/3811263896 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ccc0a5920 0x7f6ccc0b37f0 
unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.505+0000 7f6cda479640 1 -- 192.168.123.107:0/3811263896 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6cc4009660 con 0x7f6ccc0a4830 2026-03-09T19:26:51.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.505+0000 7f6cda479640 1 --2- 192.168.123.107:0/3811263896 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ccc0a4830 0x7f6ccc0b32b0 secure :-1 s=READY pgs=329 cs=0 l=1 rev1=1 crypto rx=0x7f6cc4005bb0 tx=0x7f6cc40026e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:51.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.506+0000 7f6cc37fe640 1 -- 192.168.123.107:0/3811263896 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6cc403d070 con 0x7f6ccc0a4830 2026-03-09T19:26:51.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.506+0000 7f6cc37fe640 1 -- 192.168.123.107:0/3811263896 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6cc4038730 con 0x7f6ccc0a4830 2026-03-09T19:26:51.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.506+0000 7f6cc37fe640 1 -- 192.168.123.107:0/3811263896 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6cc4041820 con 0x7f6ccc0a4830 2026-03-09T19:26:51.505 INFO:tasks.workunit.client.1.vm08.stdout:8/41: rename ca to cb 0 2026-03-09T19:26:51.505 INFO:tasks.workunit.client.1.vm08.stdout:8/42: chown l2 116503365 1 2026-03-09T19:26:51.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.506+0000 7f6cdb47b640 1 -- 192.168.123.107:0/3811263896 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 
0x7f6ccc0b3e20 con 0x7f6ccc0a4830 2026-03-09T19:26:51.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.506+0000 7f6cdb47b640 1 -- 192.168.123.107:0/3811263896 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6ccc153240 con 0x7f6ccc0a4830 2026-03-09T19:26:51.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.507+0000 7f6cdb47b640 1 -- 192.168.123.107:0/3811263896 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6ca4005350 con 0x7f6ccc0a4830 2026-03-09T19:26:51.509 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.508+0000 7f6cc37fe640 1 -- 192.168.123.107:0/3811263896 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6cc40388a0 con 0x7f6ccc0a4830 2026-03-09T19:26:51.509 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.509+0000 7f6cc37fe640 1 --2- 192.168.123.107:0/3811263896 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6ca80761c0 0x7f6ca8078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:51.509 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.509+0000 7f6cc37fe640 1 -- 192.168.123.107:0/3811263896 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f6cc40bc030 con 0x7f6ccc0a4830 2026-03-09T19:26:51.509 INFO:tasks.workunit.client.1.vm08.stdout:1/54: link d9/l15 d9/da/d12/l16 0 2026-03-09T19:26:51.512 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.510+0000 7f6cd9c78640 1 --2- 192.168.123.107:0/3811263896 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6ca80761c0 0x7f6ca8078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:51.512 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.510+0000 7f6cd9c78640 1 --2- 192.168.123.107:0/3811263896 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6ca80761c0 0x7f6ca8078680 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f6ccc0b49e0 tx=0x7f6cc8009290 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:51.512 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.511+0000 7f6cc37fe640 1 -- 192.168.123.107:0/3811263896 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6cc40f0820 con 0x7f6ccc0a4830 2026-03-09T19:26:51.512 INFO:tasks.workunit.client.1.vm08.stdout:1/55: dwrite f2 [0,4194304] 0 2026-03-09T19:26:51.513 INFO:tasks.workunit.client.1.vm08.stdout:1/56: chown d9/d11 680 1 2026-03-09T19:26:51.513 INFO:tasks.workunit.client.1.vm08.stdout:1/57: write d9/da/dc/ff [566993,13825] 0 2026-03-09T19:26:51.519 INFO:tasks.workunit.client.1.vm08.stdout:1/58: dread d9/da/d12/fe [0,4194304] 0 2026-03-09T19:26:51.526 INFO:tasks.workunit.client.1.vm08.stdout:8/43: mknod cc 0 2026-03-09T19:26:51.527 INFO:tasks.workunit.client.1.vm08.stdout:0/9: unlink f1 0 2026-03-09T19:26:51.527 INFO:tasks.workunit.client.1.vm08.stdout:9/11: getdents d0/d2 0 2026-03-09T19:26:51.536 INFO:tasks.workunit.client.1.vm08.stdout:8/44: mknod cd 0 2026-03-09T19:26:51.545 INFO:tasks.workunit.client.1.vm08.stdout:0/10: creat f2 x:0 0 0 2026-03-09T19:26:51.545 INFO:tasks.workunit.client.1.vm08.stdout:9/12: mkdir d0/d3 0 2026-03-09T19:26:51.546 INFO:tasks.workunit.client.1.vm08.stdout:1/59: fdatasync f6 0 2026-03-09T19:26:51.549 INFO:tasks.workunit.client.1.vm08.stdout:2/31: truncate d3/d4/f6 2271386 0 2026-03-09T19:26:51.561 INFO:tasks.workunit.client.1.vm08.stdout:7/42: getdents d5 0 2026-03-09T19:26:51.591 INFO:tasks.workunit.client.1.vm08.stdout:0/11: sync 2026-03-09T19:26:51.591 
INFO:tasks.workunit.client.1.vm08.stdout:8/45: sync 2026-03-09T19:26:51.591 INFO:tasks.workunit.client.1.vm08.stdout:8/46: fsync f1 0 2026-03-09T19:26:51.592 INFO:tasks.workunit.client.1.vm08.stdout:8/47: dread f0 [0,4194304] 0 2026-03-09T19:26:51.593 INFO:tasks.workunit.client.1.vm08.stdout:8/48: write f6 [943657,55650] 0 2026-03-09T19:26:51.597 INFO:tasks.workunit.client.1.vm08.stdout:8/49: dwrite f6 [0,4194304] 0 2026-03-09T19:26:51.614 INFO:tasks.workunit.client.1.vm08.stdout:5/19: getdents . 0 2026-03-09T19:26:51.620 INFO:tasks.workunit.client.1.vm08.stdout:4/19: getdents . 0 2026-03-09T19:26:51.620 INFO:tasks.workunit.client.1.vm08.stdout:4/20: dread f1 [0,4194304] 0 2026-03-09T19:26:51.620 INFO:tasks.workunit.client.1.vm08.stdout:4/21: readlink l3 0 2026-03-09T19:26:51.624 INFO:tasks.workunit.client.1.vm08.stdout:4/22: dwrite f2 [0,4194304] 0 2026-03-09T19:26:51.643 INFO:tasks.workunit.client.1.vm08.stdout:6/25: getdents d3 0 2026-03-09T19:26:51.646 INFO:tasks.workunit.client.1.vm08.stdout:3/64: write d0/d6/d9/fc [4398249,69964] 0 2026-03-09T19:26:51.658 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.658+0000 7f6cdb47b640 1 -- 192.168.123.107:0/3811263896 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6ca4002bf0 con 0x7f6ca80761c0 2026-03-09T19:26:51.664 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.660+0000 7f6cc37fe640 1 -- 192.168.123.107:0/3811263896 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f6ca4002bf0 con 0x7f6ca80761c0 2026-03-09T19:26:51.664 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.663+0000 7f6cc17fa640 1 -- 192.168.123.107:0/3811263896 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6ca80761c0 msgr2=0x7f6ca8078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-09T19:26:51.664 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.663+0000 7f6cc17fa640 1 --2- 192.168.123.107:0/3811263896 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6ca80761c0 0x7f6ca8078680 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f6ccc0b49e0 tx=0x7f6cc8009290 comp rx=0 tx=0).stop 2026-03-09T19:26:51.664 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.664+0000 7f6cc17fa640 1 -- 192.168.123.107:0/3811263896 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ccc0a4830 msgr2=0x7f6ccc0b32b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:51.664 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.664+0000 7f6cc17fa640 1 --2- 192.168.123.107:0/3811263896 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ccc0a4830 0x7f6ccc0b32b0 secure :-1 s=READY pgs=329 cs=0 l=1 rev1=1 crypto rx=0x7f6cc4005bb0 tx=0x7f6cc40026e0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.664+0000 7f6cc17fa640 1 -- 192.168.123.107:0/3811263896 shutdown_connections 2026-03-09T19:26:51.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.664+0000 7f6cc17fa640 1 --2- 192.168.123.107:0/3811263896 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6ca80761c0 0x7f6ca8078680 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.664+0000 7f6cc17fa640 1 --2- 192.168.123.107:0/3811263896 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ccc0a5920 0x7f6ccc0b37f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.665+0000 7f6cc17fa640 1 --2- 192.168.123.107:0/3811263896 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ccc0a4830 0x7f6ccc0b32b0 unknown :-1 s=CLOSED pgs=329 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.665+0000 7f6cc17fa640 1 -- 192.168.123.107:0/3811263896 >> 192.168.123.107:0/3811263896 conn(0x7f6ccc09fe80 msgr2=0x7f6ccc0a1770 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:51.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.666+0000 7f6cc17fa640 1 -- 192.168.123.107:0/3811263896 shutdown_connections 2026-03-09T19:26:51.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.666+0000 7f6cc17fa640 1 -- 192.168.123.107:0/3811263896 wait complete. 2026-03-09T19:26:51.718 INFO:tasks.workunit.client.1.vm08.stdout:9/13: creat d0/f4 x:0 0 0 2026-03-09T19:26:51.724 INFO:tasks.workunit.client.1.vm08.stdout:1/60: stat d9/l15 0 2026-03-09T19:26:51.724 INFO:tasks.workunit.client.1.vm08.stdout:9/14: dwrite d0/f4 [0,4194304] 0 2026-03-09T19:26:51.727 INFO:tasks.workunit.client.1.vm08.stdout:9/15: dwrite d0/f4 [0,4194304] 0 2026-03-09T19:26:51.728 INFO:tasks.workunit.client.1.vm08.stdout:9/16: truncate d0/f4 4368377 0 2026-03-09T19:26:51.728 INFO:tasks.workunit.client.1.vm08.stdout:9/17: write d0/f4 [3584090,75121] 0 2026-03-09T19:26:51.729 INFO:tasks.workunit.client.1.vm08.stdout:9/18: write d0/f4 [1795968,56862] 0 2026-03-09T19:26:51.739 INFO:tasks.workunit.client.1.vm08.stdout:3/65: mknod d0/c14 0 2026-03-09T19:26:51.739 INFO:tasks.workunit.client.1.vm08.stdout:1/61: mkdir d9/da/d17 0 2026-03-09T19:26:51.740 INFO:tasks.workunit.client.1.vm08.stdout:1/62: stat d9/da/dc/ld 0 2026-03-09T19:26:51.746 INFO:tasks.workunit.client.1.vm08.stdout:0/12: rename f2 to f3 0 2026-03-09T19:26:51.747 INFO:tasks.workunit.client.1.vm08.stdout:0/13: truncate f3 12929 0 2026-03-09T19:26:51.749 INFO:tasks.workunit.client.1.vm08.stdout:3/66: mkdir d0/d6/de/d15 0 2026-03-09T19:26:51.750 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.751+0000 7f494aece640 1 -- 192.168.123.107:0/3715111383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4944103ab0 msgr2=0x7f4944103f30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:51.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.751+0000 7f494aece640 1 --2- 192.168.123.107:0/3715111383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4944103ab0 0x7f4944103f30 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f49380098e0 tx=0x7f493802f190 comp rx=0 tx=0).stop 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:1/63: rename l4 to d9/d11/l18 0 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:1/64: chown d9/d11 0 1 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:9/19: creat d0/d3/f5 x:0 0 0 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:4/23: link l3 l4 0 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:3/67: mkdir d0/d6/d9/d16 0 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:1/65: mknod d9/c19 0 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:9/20: mkdir d0/d3/d6 0 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:9/21: write d0/f4 [2113332,9474] 0 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:8/50: getdents . 
0 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:4/24: creat f5 x:0 0 0 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:8/51: dread f6 [0,4194304] 0 2026-03-09T19:26:51.770 INFO:tasks.workunit.client.1.vm08.stdout:3/68: mkdir d0/d6/d9/d16/d17 0 2026-03-09T19:26:51.771 INFO:tasks.workunit.client.1.vm08.stdout:9/22: mkdir d0/d3/d6/d7 0 2026-03-09T19:26:51.771 INFO:tasks.workunit.client.1.vm08.stdout:9/23: write d0/d3/f5 [680293,56914] 0 2026-03-09T19:26:51.771 INFO:tasks.workunit.client.1.vm08.stdout:8/52: mkdir de 0 2026-03-09T19:26:51.771 INFO:tasks.workunit.client.1.vm08.stdout:3/69: fsync d0/d8/db/fd 0 2026-03-09T19:26:51.771 INFO:tasks.workunit.client.1.vm08.stdout:3/70: write d0/d6/d9/fc [4843131,14214] 0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.753+0000 7f494aece640 1 -- 192.168.123.107:0/3715111383 shutdown_connections 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.753+0000 7f494aece640 1 --2- 192.168.123.107:0/3715111383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4944103ab0 0x7f4944103f30 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.753+0000 7f494aece640 1 --2- 192.168.123.107:0/3715111383 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49441028b0 0x7f4944102cb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.754+0000 7f494aece640 1 -- 192.168.123.107:0/3715111383 >> 192.168.123.107:0/3715111383 conn(0x7f49440fe060 msgr2=0x7f4944100480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.754+0000 7f494aece640 1 -- 192.168.123.107:0/3715111383 shutdown_connections 2026-03-09T19:26:51.771 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.754+0000 7f494aece640 1 -- 192.168.123.107:0/3715111383 wait complete. 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.755+0000 7f494aece640 1 Processor -- start 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.755+0000 7f494aece640 1 -- start start 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.755+0000 7f494aece640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49441028b0 0x7f494419e740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.755+0000 7f494aece640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4944103ab0 0x7f494419ec80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.755+0000 7f494aece640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f494419f250 con 0x7f49441028b0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.755+0000 7f494aece640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f494419f3c0 con 0x7f4944103ab0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.756+0000 7f49496cb640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4944103ab0 0x7f494419ec80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.756+0000 7f4949ecc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49441028b0 0x7f494419e740 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.756+0000 7f4949ecc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49441028b0 0x7f494419e740 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:36016/0 (socket says 192.168.123.107:36016) 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.756+0000 7f4949ecc640 1 -- 192.168.123.107:0/3167294905 learned_addr learned my addr 192.168.123.107:0/3167294905 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.757+0000 7f49496cb640 1 -- 192.168.123.107:0/3167294905 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49441028b0 msgr2=0x7f494419e740 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.757+0000 7f49496cb640 1 --2- 192.168.123.107:0/3167294905 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49441028b0 0x7f494419e740 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.757+0000 7f49496cb640 1 -- 192.168.123.107:0/3167294905 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4938009590 con 0x7f4944103ab0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.758+0000 7f49496cb640 1 --2- 192.168.123.107:0/3167294905 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4944103ab0 0x7f494419ec80 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f49380098b0 
tx=0x7f4938031c00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.758+0000 7f4932ffd640 1 -- 192.168.123.107:0/3167294905 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f493803d070 con 0x7f4944103ab0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.758+0000 7f494aece640 1 -- 192.168.123.107:0/3167294905 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f49441a3e00 con 0x7f4944103ab0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.758+0000 7f494aece640 1 -- 192.168.123.107:0/3167294905 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f49441a42c0 con 0x7f4944103ab0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.759+0000 7f4932ffd640 1 -- 192.168.123.107:0/3167294905 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4938031f00 con 0x7f4944103ab0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.759+0000 7f4932ffd640 1 -- 192.168.123.107:0/3167294905 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f49380317a0 con 0x7f4944103ab0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.760+0000 7f494aece640 1 -- 192.168.123.107:0/3167294905 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f490c005350 con 0x7f4944103ab0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.760+0000 7f4932ffd640 1 -- 192.168.123.107:0/3167294905 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f4938049050 con 
0x7f4944103ab0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.762+0000 7f4932ffd640 1 --2- 192.168.123.107:0/3167294905 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4920076170 0x7f4920078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.762+0000 7f4932ffd640 1 -- 192.168.123.107:0/3167294905 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f49380bc0e0 con 0x7f4944103ab0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.762+0000 7f4949ecc640 1 --2- 192.168.123.107:0/3167294905 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4920076170 0x7f4920078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.763+0000 7f4949ecc640 1 --2- 192.168.123.107:0/3167294905 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4920076170 0x7f4920078630 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f49340059c0 tx=0x7f4934009340 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:51.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.765+0000 7f4932ffd640 1 -- 192.168.123.107:0/3167294905 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f4938085620 con 0x7f4944103ab0 2026-03-09T19:26:51.771 INFO:tasks.workunit.client.1.vm08.stdout:3/71: dread d0/d8/db/fd [0,4194304] 0 2026-03-09T19:26:51.772 INFO:tasks.workunit.client.1.vm08.stdout:8/53: creat de/ff x:0 0 0 2026-03-09T19:26:51.773 
INFO:tasks.workunit.client.1.vm08.stdout:3/72: mkdir d0/d6/d9/d16/d18 0 2026-03-09T19:26:51.776 INFO:tasks.workunit.client.1.vm08.stdout:3/73: rename d0/d8/db to d0/d8/d19 0 2026-03-09T19:26:51.777 INFO:tasks.workunit.client.1.vm08.stdout:3/74: write d0/d8/d19/fd [5206893,60886] 0 2026-03-09T19:26:51.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:51 vm08.local ceph-mon[57794]: pgmap v149: 65 pgs: 65 active+clean; 185 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 4.9 MiB/s wr, 436 op/s 2026-03-09T19:26:51.866 INFO:tasks.workunit.client.1.vm08.stdout:6/26: sync 2026-03-09T19:26:51.866 INFO:tasks.workunit.client.1.vm08.stdout:1/66: sync 2026-03-09T19:26:51.867 INFO:tasks.workunit.client.1.vm08.stdout:1/67: read - d9/da/dc/f10 zero size 2026-03-09T19:26:51.868 INFO:tasks.workunit.client.1.vm08.stdout:1/68: truncate d9/da/dc/f10 519398 0 2026-03-09T19:26:51.873 INFO:tasks.workunit.client.1.vm08.stdout:6/27: mknod d3/c8 0 2026-03-09T19:26:51.874 INFO:tasks.workunit.client.1.vm08.stdout:1/69: mknod d9/da/c1a 0 2026-03-09T19:26:51.875 INFO:tasks.workunit.client.1.vm08.stdout:1/70: write d9/da/d12/f14 [443541,81852] 0 2026-03-09T19:26:51.876 INFO:tasks.workunit.client.1.vm08.stdout:1/71: write f6 [4977867,59819] 0 2026-03-09T19:26:51.883 INFO:tasks.workunit.client.1.vm08.stdout:6/28: creat d3/f9 x:0 0 0 2026-03-09T19:26:51.912 INFO:tasks.workunit.client.1.vm08.stdout:1/72: dread d9/da/d12/f14 [0,4194304] 0 2026-03-09T19:26:51.915 INFO:tasks.workunit.client.1.vm08.stdout:1/73: creat d9/da/d12/f1b x:0 0 0 2026-03-09T19:26:51.916 INFO:tasks.workunit.client.1.vm08.stdout:1/74: symlink d9/d11/l1c 0 2026-03-09T19:26:51.916 INFO:tasks.workunit.client.1.vm08.stdout:1/75: chown d9/da 267977926 1 2026-03-09T19:26:51.920 INFO:tasks.workunit.client.1.vm08.stdout:1/76: dwrite d9/da/d12/f1b [0,4194304] 0 2026-03-09T19:26:51.922 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.923+0000 7f494aece640 1 -- 192.168.123.107:0/3167294905 --> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f490c002bf0 con 0x7f4920076170 2026-03-09T19:26:51.924 INFO:tasks.workunit.client.1.vm08.stdout:1/77: dread d9/da/d12/fe [0,4194304] 0 2026-03-09T19:26:51.926 INFO:tasks.workunit.client.1.vm08.stdout:1/78: unlink d9/da/d12/f14 0 2026-03-09T19:26:51.932 INFO:tasks.workunit.client.1.vm08.stdout:1/79: creat d9/da/dc/f1d x:0 0 0 2026-03-09T19:26:51.972 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:26:51.972 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (3m) 2m ago 4m 22.6M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:26:51.972 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (4m) 2m ago 4m 8284k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:26:51.972 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (3m) 2m ago 3m 8644k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:26:51.972 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (4m) 2m ago 4m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2799ea3e4bf3 2026-03-09T19:26:51.972 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (3m) 2m ago 3m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 81fc95c210b6 2026-03-09T19:26:51.972 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (3m) 2m ago 4m 79.7M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (2m) 2m ago 2m 12.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (2m) 2m ago 2m 14.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:26:51.973 
INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (2m) 2m ago 2m 16.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (2m) 2m ago 2m 17.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:9283,8765,8443 running (5m) 2m ago 5m 540M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 706e626ecd10 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (3m) 2m ago 3m 485M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 880604c16b45 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (5m) 2m ago 5m 53.6M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ccb644205fb3 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (3m) 2m ago 3m 49.4M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8d7b1da9e1e2 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (4m) 2m ago 4m 13.8M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (3m) 2m ago 3m 15.3M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (3m) 2m ago 3m 46.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d7417e3377af 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (3m) 2m ago 3m 67.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (3m) 2m ago 3m 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (3m) 2m ago 3m 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:26:51.973 
INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (2m) 2m ago 2m 67.3M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (2m) 2m ago 2m 65.0M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (3m) 2m ago 4m 39.2M - 2.43.0 a07b618ecd1d 238baaac36ff 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.945+0000 7f4932ffd640 1 -- 192.168.123.107:0/3167294905 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f490c002bf0 con 0x7f4920076170 2026-03-09T19:26:51.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.948+0000 7f494aece640 1 -- 192.168.123.107:0/3167294905 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4920076170 msgr2=0x7f4920078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:51.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.948+0000 7f494aece640 1 --2- 192.168.123.107:0/3167294905 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4920076170 0x7f4920078630 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f49340059c0 tx=0x7f4934009340 comp rx=0 tx=0).stop 2026-03-09T19:26:51.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.948+0000 7f494aece640 1 -- 192.168.123.107:0/3167294905 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4944103ab0 msgr2=0x7f494419ec80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:51.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.948+0000 7f494aece640 1 --2- 192.168.123.107:0/3167294905 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4944103ab0 0x7f494419ec80 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 
crypto rx=0x7f49380098b0 tx=0x7f4938031c00 comp rx=0 tx=0).stop 2026-03-09T19:26:51.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.949+0000 7f494aece640 1 -- 192.168.123.107:0/3167294905 shutdown_connections 2026-03-09T19:26:51.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.949+0000 7f494aece640 1 --2- 192.168.123.107:0/3167294905 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f4920076170 0x7f4920078630 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.949+0000 7f494aece640 1 --2- 192.168.123.107:0/3167294905 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4944103ab0 0x7f494419ec80 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.949+0000 7f494aece640 1 --2- 192.168.123.107:0/3167294905 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f49441028b0 0x7f494419e740 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:51.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.949+0000 7f494aece640 1 -- 192.168.123.107:0/3167294905 >> 192.168.123.107:0/3167294905 conn(0x7f49440fe060 msgr2=0x7f49440ffb20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:51.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.949+0000 7f494aece640 1 -- 192.168.123.107:0/3167294905 shutdown_connections 2026-03-09T19:26:51.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:51.949+0000 7f494aece640 1 -- 192.168.123.107:0/3167294905 wait complete. 
2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:1/80: creat d9/da/f1e x:0 0 0 2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:1/81: dread - d9/da/dc/f1d zero size 2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:1/82: unlink d9/da/dc/ff 0 2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:1/83: dwrite f6 [0,4194304] 0 2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:3/75: fsync d0/d8/d19/fd 0 2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:3/76: rmdir d0 39 2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:2/32: rmdir d3/d4 39 2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:3/77: mkdir d0/d6/de/d1a 0 2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:3/78: dread d0/d6/d9/fc [0,4194304] 0 2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:3/79: dread d0/d8/d19/fd [0,4194304] 0 2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:3/80: dwrite d0/d8/d19/fd [0,4194304] 0 2026-03-09T19:26:51.974 INFO:tasks.workunit.client.1.vm08.stdout:3/81: truncate d0/d8/d19/fd 6291509 0 2026-03-09T19:26:51.981 INFO:tasks.workunit.client.1.vm08.stdout:3/82: rmdir d0/d6 39 2026-03-09T19:26:51.986 INFO:tasks.workunit.client.1.vm08.stdout:3/83: dwrite d0/d8/d19/fd [0,4194304] 0 2026-03-09T19:26:51.996 INFO:tasks.workunit.client.1.vm08.stdout:3/84: dwrite d0/d8/d19/fd [4194304,4194304] 0 2026-03-09T19:26:52.029 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.030+0000 7f9a43127640 1 -- 192.168.123.107:0/820557667 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a3c071a50 msgr2=0x7f9a3c071e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.030+0000 7f9a43127640 1 --2- 192.168.123.107:0/820557667 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a3c071a50 0x7f9a3c071e50 secure :-1 
s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f9a3800bb70 tx=0x7f9a38030fe0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.030+0000 7f9a43127640 1 -- 192.168.123.107:0/820557667 shutdown_connections 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.030+0000 7f9a43127640 1 --2- 192.168.123.107:0/820557667 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9a3c072420 0x7f9a3c077190 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.030+0000 7f9a43127640 1 --2- 192.168.123.107:0/820557667 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a3c071a50 0x7f9a3c071e50 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.030+0000 7f9a43127640 1 -- 192.168.123.107:0/820557667 >> 192.168.123.107:0/820557667 conn(0x7f9a3c06d4f0 msgr2=0x7f9a3c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.033+0000 7f9a43127640 1 -- 192.168.123.107:0/820557667 shutdown_connections 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.033+0000 7f9a43127640 1 -- 192.168.123.107:0/820557667 wait complete. 
2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.034+0000 7f9a43127640 1 Processor -- start 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.034+0000 7f9a43127640 1 -- start start 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.034+0000 7f9a43127640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a3c071a50 0x7f9a3c1b4050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.034+0000 7f9a43127640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9a3c072420 0x7f9a3c1b4590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.034+0000 7f9a43127640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a3c1b4b60 con 0x7f9a3c072420 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.034+0000 7f9a43127640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a3c1b4cd0 con 0x7f9a3c071a50 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.035+0000 7f9a41924640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9a3c072420 0x7f9a3c1b4590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.035+0000 7f9a41924640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9a3c072420 0x7f9a3c1b4590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:36036/0 (socket says 192.168.123.107:36036) 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.035+0000 7f9a41924640 1 -- 192.168.123.107:0/3334380433 learned_addr learned my addr 192.168.123.107:0/3334380433 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.035+0000 7f9a42125640 1 --2- 192.168.123.107:0/3334380433 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a3c071a50 0x7f9a3c1b4050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.035+0000 7f9a41924640 1 -- 192.168.123.107:0/3334380433 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a3c071a50 msgr2=0x7f9a3c1b4050 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.035+0000 7f9a41924640 1 --2- 192.168.123.107:0/3334380433 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a3c071a50 0x7f9a3c1b4050 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.035+0000 7f9a41924640 1 -- 192.168.123.107:0/3334380433 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9a3800b820 con 0x7f9a3c072420 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.036+0000 7f9a41924640 1 --2- 192.168.123.107:0/3334380433 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9a3c072420 0x7f9a3c1b4590 secure :-1 s=READY pgs=330 cs=0 l=1 rev1=1 crypto rx=0x7f9a3400eaa0 tx=0x7f9a3400ef70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.036+0000 7f9a2b7fe640 1 -- 192.168.123.107:0/3334380433 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9a34002c10 con 0x7f9a3c072420 2026-03-09T19:26:52.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.036+0000 7f9a43127640 1 -- 192.168.123.107:0/3334380433 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9a3c1b9770 con 0x7f9a3c072420 2026-03-09T19:26:52.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.036+0000 7f9a43127640 1 -- 192.168.123.107:0/3334380433 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9a3c1b9cc0 con 0x7f9a3c072420 2026-03-09T19:26:52.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.037+0000 7f9a2b7fe640 1 -- 192.168.123.107:0/3334380433 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9a3400be10 con 0x7f9a3c072420 2026-03-09T19:26:52.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.037+0000 7f9a2b7fe640 1 -- 192.168.123.107:0/3334380433 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9a340082c0 con 0x7f9a3c072420 2026-03-09T19:26:52.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.037+0000 7f9a43127640 1 -- 192.168.123.107:0/3334380433 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9a3c07a810 con 0x7f9a3c072420 2026-03-09T19:26:52.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.039+0000 7f9a2b7fe640 1 -- 192.168.123.107:0/3334380433 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f9a34007a00 con 0x7f9a3c072420 2026-03-09T19:26:52.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.039+0000 
7f9a2b7fe640 1 --2- 192.168.123.107:0/3334380433 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9a1c0761c0 0x7f9a1c078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.039+0000 7f9a2b7fe640 1 -- 192.168.123.107:0/3334380433 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f9a340983a0 con 0x7f9a3c072420 2026-03-09T19:26:52.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.039+0000 7f9a42125640 1 --2- 192.168.123.107:0/3334380433 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9a1c0761c0 0x7f9a1c078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:52.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.040+0000 7f9a42125640 1 --2- 192.168.123.107:0/3334380433 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9a1c0761c0 0x7f9a1c078680 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f9a3800b440 tx=0x7f9a380023d0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:52.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.043+0000 7f9a2b7fe640 1 -- 192.168.123.107:0/3334380433 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f9a34061a10 con 0x7f9a3c072420 2026-03-09T19:26:52.066 INFO:tasks.workunit.client.1.vm08.stdout:5/20: dwrite f1 [0,4194304] 0 2026-03-09T19:26:52.153 INFO:tasks.workunit.client.1.vm08.stdout:5/21: sync 2026-03-09T19:26:52.154 INFO:tasks.workunit.client.1.vm08.stdout:5/22: write f2 [965700,81146] 0 2026-03-09T19:26:52.156 INFO:tasks.workunit.client.1.vm08.stdout:5/23: 
rename c3 to c8 0 2026-03-09T19:26:52.157 INFO:tasks.workunit.client.1.vm08.stdout:5/24: mknod c9 0 2026-03-09T19:26:52.157 INFO:tasks.workunit.client.1.vm08.stdout:5/25: mknod ca 0 2026-03-09T19:26:52.157 INFO:tasks.workunit.client.1.vm08.stdout:5/26: read - f5 zero size 2026-03-09T19:26:52.169 INFO:tasks.workunit.client.1.vm08.stdout:0/14: getdents . 0 2026-03-09T19:26:52.170 INFO:tasks.workunit.client.1.vm08.stdout:4/25: truncate f1 689147 0 2026-03-09T19:26:52.171 INFO:tasks.workunit.client.1.vm08.stdout:4/26: dread - f5 zero size 2026-03-09T19:26:52.171 INFO:tasks.workunit.client.1.vm08.stdout:0/15: dread f3 [0,4194304] 0 2026-03-09T19:26:52.172 INFO:tasks.workunit.client.1.vm08.stdout:4/27: truncate f5 201032 0 2026-03-09T19:26:52.172 INFO:tasks.workunit.client.1.vm08.stdout:0/16: dread f3 [0,4194304] 0 2026-03-09T19:26:52.174 INFO:tasks.workunit.client.1.vm08.stdout:1/84: fdatasync f6 0 2026-03-09T19:26:52.174 INFO:tasks.workunit.client.1.vm08.stdout:1/85: stat d9/da/f1e 0 2026-03-09T19:26:52.177 INFO:tasks.workunit.client.1.vm08.stdout:1/86: read - d9/da/dc/f1d zero size 2026-03-09T19:26:52.177 INFO:tasks.workunit.client.1.vm08.stdout:7/43: truncate d5/f9 2965738 0 2026-03-09T19:26:52.187 INFO:tasks.workunit.client.1.vm08.stdout:9/24: rename d0/d3/d6 to d0/d2/d8 0 2026-03-09T19:26:52.187 INFO:tasks.workunit.client.1.vm08.stdout:9/25: write d0/f4 [4166814,30720] 0 2026-03-09T19:26:52.192 INFO:tasks.workunit.client.1.vm08.stdout:9/26: dwrite d0/f4 [0,4194304] 0 2026-03-09T19:26:52.202 INFO:tasks.workunit.client.1.vm08.stdout:8/54: fsync de/ff 0 2026-03-09T19:26:52.209 INFO:tasks.workunit.client.1.vm08.stdout:0/17: symlink l4 0 2026-03-09T19:26:52.216 INFO:tasks.workunit.client.1.vm08.stdout:3/85: rename d0/d6/d9 to d0/d6/de/d1b 0 2026-03-09T19:26:52.217 INFO:tasks.workunit.client.1.vm08.stdout:3/86: chown d0/d6/de/d15 15 1 2026-03-09T19:26:52.225 INFO:tasks.workunit.client.1.vm08.stdout:9/27: creat d0/d3/f9 x:0 0 0 2026-03-09T19:26:52.226 
INFO:tasks.workunit.client.1.vm08.stdout:9/28: write d0/d3/f5 [165566,1481] 0 2026-03-09T19:26:52.233 INFO:tasks.workunit.client.1.vm08.stdout:6/29: fsync d3/f9 0 2026-03-09T19:26:52.233 INFO:tasks.workunit.client.1.vm08.stdout:6/30: readlink - no filename 2026-03-09T19:26:52.235 INFO:tasks.workunit.client.1.vm08.stdout:7/44: getdents d5 0 2026-03-09T19:26:52.237 INFO:tasks.workunit.client.1.vm08.stdout:6/31: dwrite d3/f9 [0,4194304] 0 2026-03-09T19:26:52.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.242+0000 7f9a43127640 1 -- 192.168.123.107:0/3334380433 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f9a3c1b9fa0 con 0x7f9a3c072420 2026-03-09T19:26:52.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.246+0000 7f9a2b7fe640 1 -- 192.168.123.107:0/3334380433 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f9a3c1b9fa0 con 0x7f9a3c072420 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T19:26:52.246 
INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 14 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:26:52.246 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:26:52.246 INFO:tasks.workunit.client.1.vm08.stdout:3/87: write d0/d6/de/d1b/fc [5730766,111838] 0 2026-03-09T19:26:52.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.250+0000 7f9a297fa640 1 -- 192.168.123.107:0/3334380433 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9a1c0761c0 msgr2=0x7f9a1c078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.250+0000 7f9a297fa640 1 --2- 192.168.123.107:0/3334380433 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9a1c0761c0 0x7f9a1c078680 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f9a3800b440 tx=0x7f9a380023d0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.250+0000 7f9a297fa640 1 -- 192.168.123.107:0/3334380433 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9a3c072420 msgr2=0x7f9a3c1b4590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.250+0000 7f9a297fa640 1 --2- 192.168.123.107:0/3334380433 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9a3c072420 0x7f9a3c1b4590 secure :-1 s=READY pgs=330 cs=0 l=1 rev1=1 crypto rx=0x7f9a3400eaa0 tx=0x7f9a3400ef70 comp rx=0 tx=0).stop 2026-03-09T19:26:52.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.250+0000 7f9a297fa640 1 -- 192.168.123.107:0/3334380433 shutdown_connections 2026-03-09T19:26:52.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.250+0000 7f9a297fa640 1 --2- 192.168.123.107:0/3334380433 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f9a1c0761c0 0x7f9a1c078680 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.250+0000 7f9a297fa640 1 --2- 192.168.123.107:0/3334380433 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9a3c072420 0x7f9a3c1b4590 unknown :-1 s=CLOSED pgs=330 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.250+0000 7f9a297fa640 1 --2- 192.168.123.107:0/3334380433 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a3c071a50 0x7f9a3c1b4050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.250+0000 7f9a297fa640 1 -- 192.168.123.107:0/3334380433 >> 192.168.123.107:0/3334380433 conn(0x7f9a3c06d4f0 msgr2=0x7f9a3c075570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:52.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.250+0000 7f9a297fa640 1 -- 192.168.123.107:0/3334380433 shutdown_connections 2026-03-09T19:26:52.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.250+0000 7f9a297fa640 1 -- 192.168.123.107:0/3334380433 wait complete. 
2026-03-09T19:26:52.252 INFO:tasks.workunit.client.1.vm08.stdout:9/29: creat d0/fa x:0 0 0 2026-03-09T19:26:52.253 INFO:tasks.workunit.client.1.vm08.stdout:9/30: write d0/f4 [1487030,124892] 0 2026-03-09T19:26:52.253 INFO:tasks.workunit.client.1.vm08.stdout:9/31: chown d0/d2 64 1 2026-03-09T19:26:52.255 INFO:tasks.workunit.client.1.vm08.stdout:0/18: link f3 f5 0 2026-03-09T19:26:52.255 INFO:tasks.workunit.client.1.vm08.stdout:0/19: chown f3 53695 1 2026-03-09T19:26:52.261 INFO:tasks.workunit.client.1.vm08.stdout:6/32: mknod d3/ca 0 2026-03-09T19:26:52.266 INFO:tasks.workunit.client.1.vm08.stdout:9/32: mknod d0/cb 0 2026-03-09T19:26:52.267 INFO:tasks.workunit.client.1.vm08.stdout:9/33: stat d0/fa 0 2026-03-09T19:26:52.267 INFO:tasks.workunit.client.1.vm08.stdout:9/34: truncate d0/d3/f9 159523 0 2026-03-09T19:26:52.271 INFO:tasks.workunit.client.1.vm08.stdout:9/35: dwrite d0/d3/f5 [0,4194304] 0 2026-03-09T19:26:52.271 INFO:tasks.workunit.client.1.vm08.stdout:9/36: chown d0 79 1 2026-03-09T19:26:52.272 INFO:tasks.workunit.client.1.vm08.stdout:9/37: chown d0 1 1 2026-03-09T19:26:52.275 INFO:tasks.workunit.client.1.vm08.stdout:6/33: fsync d3/f9 0 2026-03-09T19:26:52.281 INFO:tasks.workunit.client.1.vm08.stdout:9/38: dwrite d0/fa [0,4194304] 0 2026-03-09T19:26:52.286 INFO:tasks.workunit.client.1.vm08.stdout:7/45: link d5/fb d5/f10 0 2026-03-09T19:26:52.299 INFO:tasks.workunit.client.1.vm08.stdout:3/88: symlink d0/d6/de/d15/l1c 0 2026-03-09T19:26:52.300 INFO:tasks.workunit.client.1.vm08.stdout:3/89: read d0/d6/de/d1b/fc [73024,91200] 0 2026-03-09T19:26:52.300 INFO:tasks.workunit.client.1.vm08.stdout:3/90: fsync d0/d6/de/d1b/fc 0 2026-03-09T19:26:52.301 INFO:tasks.workunit.client.1.vm08.stdout:3/91: fdatasync d0/d8/d19/fd 0 2026-03-09T19:26:52.303 INFO:tasks.workunit.client.1.vm08.stdout:6/34: mkdir d3/db 0 2026-03-09T19:26:52.317 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.313+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/4230915229 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0bf4101840 msgr2=0x7f0bf4101cc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.317 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.313+0000 7f0bf8e7d640 1 --2- 192.168.123.107:0/4230915229 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0bf4101840 0x7f0bf4101cc0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f0bdc0099b0 tx=0x7f0bdc02f220 comp rx=0 tx=0).stop 2026-03-09T19:26:52.317 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.316+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/4230915229 shutdown_connections 2026-03-09T19:26:52.317 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.316+0000 7f0bf8e7d640 1 --2- 192.168.123.107:0/4230915229 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0bf4101840 0x7f0bf4101cc0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.317 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.316+0000 7f0bf8e7d640 1 --2- 192.168.123.107:0/4230915229 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bf4100640 0x7f0bf4100a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.317 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.316+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/4230915229 >> 192.168.123.107:0/4230915229 conn(0x7f0bf40fbdf0 msgr2=0x7f0bf40fe210 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:52.317 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.316+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/4230915229 shutdown_connections 2026-03-09T19:26:52.317 INFO:tasks.workunit.client.1.vm08.stdout:6/35: read d3/f5 [1666196,55765] 0 2026-03-09T19:26:52.317 INFO:tasks.workunit.client.1.vm08.stdout:5/27: write f1 [4790540,127665] 0 2026-03-09T19:26:52.317 INFO:tasks.workunit.client.1.vm08.stdout:9/39: mknod d0/d2/d8/cc 0 
2026-03-09T19:26:52.317 INFO:tasks.workunit.client.1.vm08.stdout:2/33: dwrite d3/d4/f8 [0,4194304] 0 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.317+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/4230915229 wait complete. 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.317+0000 7f0bf8e7d640 1 Processor -- start 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf8e7d640 1 -- start start 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf8e7d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bf4100640 0x7f0bf4195e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf8e7d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0bf4101840 0x7f0bf4196390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf8e7d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0bf4196960 con 0x7f0bf4100640 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf8e7d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0bf4196ad0 con 0x7f0bf4101840 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf2575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bf4100640 0x7f0bf4195e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 
7f0bf2575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bf4100640 0x7f0bf4195e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:36044/0 (socket says 192.168.123.107:36044) 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf2575640 1 -- 192.168.123.107:0/1548666271 learned_addr learned my addr 192.168.123.107:0/1548666271 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf1d74640 1 --2- 192.168.123.107:0/1548666271 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0bf4101840 0x7f0bf4196390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf2575640 1 -- 192.168.123.107:0/1548666271 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0bf4101840 msgr2=0x7f0bf4196390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf2575640 1 --2- 192.168.123.107:0/1548666271 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0bf4101840 0x7f0bf4196390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf2575640 1 -- 192.168.123.107:0/1548666271 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0bdc009660 con 0x7f0bf4100640 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf2575640 1 --2- 192.168.123.107:0/1548666271 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bf4100640 0x7f0bf4195e50 secure :-1 s=READY pgs=331 cs=0 l=1 rev1=1 crypto rx=0x7f0be800b4f0 tx=0x7f0be800b9c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bdb7fe640 1 -- 192.168.123.107:0/1548666271 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0be8004280 con 0x7f0bf4100640 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/1548666271 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0bf419b5a0 con 0x7f0bf4100640 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.320+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/1548666271 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0bf419bb70 con 0x7f0bf4100640 2026-03-09T19:26:52.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.321+0000 7f0bdb7fe640 1 -- 192.168.123.107:0/1548666271 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0be80043e0 con 0x7f0bf4100640 2026-03-09T19:26:52.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.321+0000 7f0bdb7fe640 1 -- 192.168.123.107:0/1548666271 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0be8010bf0 con 0x7f0bf4100640 2026-03-09T19:26:52.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.321+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/1548666271 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0bb8005350 con 0x7f0bf4100640 2026-03-09T19:26:52.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.323+0000 
7f0bdb7fe640 1 -- 192.168.123.107:0/1548666271 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f0be801a460 con 0x7f0bf4100640 2026-03-09T19:26:52.325 INFO:tasks.workunit.client.1.vm08.stdout:5/28: symlink lb 0 2026-03-09T19:26:52.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.326+0000 7f0bdb7fe640 1 --2- 192.168.123.107:0/1548666271 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f0bc40761c0 0x7f0bc4078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.327+0000 7f0bf1d74640 1 --2- 192.168.123.107:0/1548666271 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f0bc40761c0 0x7f0bc4078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:52.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.327+0000 7f0bdb7fe640 1 -- 192.168.123.107:0/1548666271 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f0be80989a0 con 0x7f0bf4100640 2026-03-09T19:26:52.326 INFO:tasks.workunit.client.1.vm08.stdout:6/36: dwrite f1 [0,4194304] 0 2026-03-09T19:26:52.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.327+0000 7f0bf1d74640 1 --2- 192.168.123.107:0/1548666271 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f0bc40761c0 0x7f0bc4078680 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f0bdc002c20 tx=0x7f0bdc03a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:52.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.327+0000 7f0bdb7fe640 1 -- 192.168.123.107:0/1548666271 <== mon.0 v2:192.168.123.107:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f0be80c6820 con 0x7f0bf4100640 2026-03-09T19:26:52.327 INFO:tasks.workunit.client.1.vm08.stdout:5/29: write f6 [613968,26407] 0 2026-03-09T19:26:52.332 INFO:tasks.workunit.client.1.vm08.stdout:5/30: dread f2 [0,4194304] 0 2026-03-09T19:26:52.332 INFO:tasks.workunit.client.1.vm08.stdout:5/31: write f6 [900682,9610] 0 2026-03-09T19:26:52.332 INFO:tasks.workunit.client.1.vm08.stdout:5/32: write f2 [1553951,101134] 0 2026-03-09T19:26:52.349 INFO:tasks.workunit.client.1.vm08.stdout:9/40: creat d0/d3/fd x:0 0 0 2026-03-09T19:26:52.350 INFO:tasks.workunit.client.1.vm08.stdout:9/41: read d0/d3/f5 [3151912,33316] 0 2026-03-09T19:26:52.362 INFO:tasks.workunit.client.1.vm08.stdout:6/37: creat d3/fc x:0 0 0 2026-03-09T19:26:52.363 INFO:tasks.workunit.client.1.vm08.stdout:0/20: write f3 [878205,60593] 0 2026-03-09T19:26:52.363 INFO:tasks.workunit.client.1.vm08.stdout:0/21: readlink l4 0 2026-03-09T19:26:52.365 INFO:tasks.workunit.client.1.vm08.stdout:6/38: dread d3/f9 [0,4194304] 0 2026-03-09T19:26:52.369 INFO:tasks.workunit.client.1.vm08.stdout:9/42: creat d0/d2/d8/fe x:0 0 0 2026-03-09T19:26:52.370 INFO:tasks.workunit.client.1.vm08.stdout:9/43: write d0/fa [2708372,121186] 0 2026-03-09T19:26:52.372 INFO:tasks.workunit.client.1.vm08.stdout:2/34: mknod d3/d9/dc/de/cf 0 2026-03-09T19:26:52.376 INFO:tasks.workunit.client.1.vm08.stdout:0/22: creat f6 x:0 0 0 2026-03-09T19:26:52.376 INFO:tasks.workunit.client.1.vm08.stdout:0/23: write f6 [235245,37962] 0 2026-03-09T19:26:52.379 INFO:tasks.workunit.client.1.vm08.stdout:8/55: write f0 [474612,59394] 0 2026-03-09T19:26:52.383 INFO:tasks.workunit.client.1.vm08.stdout:2/35: mkdir d3/d4/d10 0 2026-03-09T19:26:52.385 INFO:tasks.workunit.client.1.vm08.stdout:1/87: truncate d9/da/d12/f1b 236682 0 2026-03-09T19:26:52.386 INFO:tasks.workunit.client.1.vm08.stdout:0/24: unlink f6 0 2026-03-09T19:26:52.388 
INFO:tasks.workunit.client.1.vm08.stdout:6/39: mknod d3/db/cd 0 2026-03-09T19:26:52.388 INFO:tasks.workunit.client.1.vm08.stdout:6/40: chown d3/f5 226342 1 2026-03-09T19:26:52.391 INFO:tasks.workunit.client.1.vm08.stdout:3/92: dwrite d0/d6/de/d1b/fc [0,4194304] 0 2026-03-09T19:26:52.400 INFO:tasks.workunit.client.1.vm08.stdout:3/93: write d0/d8/d19/fd [2352736,7203] 0 2026-03-09T19:26:52.400 INFO:tasks.workunit.client.1.vm08.stdout:8/56: rename f0 to de/f10 0 2026-03-09T19:26:52.409 INFO:tasks.workunit.client.1.vm08.stdout:1/88: creat d9/da/f1f x:0 0 0 2026-03-09T19:26:52.442 INFO:tasks.workunit.client.1.vm08.stdout:0/25: creat f7 x:0 0 0 2026-03-09T19:26:52.442 INFO:tasks.workunit.client.1.vm08.stdout:6/41: creat d3/fe x:0 0 0 2026-03-09T19:26:52.442 INFO:tasks.workunit.client.1.vm08.stdout:8/57: read f6 [1894171,99625] 0 2026-03-09T19:26:52.442 INFO:tasks.workunit.client.1.vm08.stdout:8/58: dwrite f1 [4194304,4194304] 0 2026-03-09T19:26:52.442 INFO:tasks.workunit.client.1.vm08.stdout:1/89: creat d9/da/dc/f20 x:0 0 0 2026-03-09T19:26:52.442 INFO:tasks.workunit.client.1.vm08.stdout:0/26: symlink l8 0 2026-03-09T19:26:52.442 INFO:tasks.workunit.client.1.vm08.stdout:0/27: dread - f7 zero size 2026-03-09T19:26:52.442 INFO:tasks.workunit.client.1.vm08.stdout:7/46: truncate d5/f10 2111564 0 2026-03-09T19:26:52.442 INFO:tasks.workunit.client.1.vm08.stdout:8/59: creat de/f11 x:0 0 0 2026-03-09T19:26:52.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.443+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/1548666271 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f0bb80058d0 con 0x7f0bf4100640 2026-03-09T19:26:52.446 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.447+0000 7f0bdb7fe640 1 -- 192.168.123.107:0/1548666271 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1856 (secure 0 0 0) 0x7f0be8061f60 con 
0x7f0bf4100640 2026-03-09T19:26:52.447 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:26:52.478 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:26:52.478 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:26:52.478 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:26:52.478 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:26:52.479 
INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout: 
2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:26:52.479 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.450+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/1548666271 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f0bc40761c0 msgr2=0x7f0bc4078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.450+0000 7f0bf8e7d640 1 --2- 192.168.123.107:0/1548666271 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f0bc40761c0 0x7f0bc4078680 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f0bdc002c20 tx=0x7f0bdc03a040 comp rx=0 tx=0).stop 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.450+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/1548666271 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bf4100640 msgr2=0x7f0bf4195e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.450+0000 7f0bf8e7d640 1 --2- 192.168.123.107:0/1548666271 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bf4100640 0x7f0bf4195e50 secure :-1 s=READY pgs=331 cs=0 l=1 rev1=1 crypto rx=0x7f0be800b4f0 
tx=0x7f0be800b9c0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.450+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/1548666271 shutdown_connections 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.450+0000 7f0bf8e7d640 1 --2- 192.168.123.107:0/1548666271 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f0bc40761c0 0x7f0bc4078680 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.450+0000 7f0bf8e7d640 1 --2- 192.168.123.107:0/1548666271 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0bf4101840 0x7f0bf4196390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.450+0000 7f0bf8e7d640 1 --2- 192.168.123.107:0/1548666271 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0bf4100640 0x7f0bf4195e50 unknown :-1 s=CLOSED pgs=331 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.450+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/1548666271 >> 192.168.123.107:0/1548666271 conn(0x7f0bf40fbdf0 msgr2=0x7f0bf40fd930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.450+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/1548666271 shutdown_connections 2026-03-09T19:26:52.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.451+0000 7f0bf8e7d640 1 -- 192.168.123.107:0/1548666271 wait complete. 
2026-03-09T19:26:52.528 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.529+0000 7fd5fe21a640 1 -- 192.168.123.107:0/1098214811 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5f81091a0 msgr2=0x7fd5f8071c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.528 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.529+0000 7fd5fe21a640 1 --2- 192.168.123.107:0/1098214811 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5f81091a0 0x7fd5f8071c60 secure :-1 s=READY pgs=332 cs=0 l=1 rev1=1 crypto rx=0x7fd5ec0099b0 tx=0x7fd5ec02f220 comp rx=0 tx=0).stop 2026-03-09T19:26:52.528 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.529+0000 7fd5fe21a640 1 -- 192.168.123.107:0/1098214811 shutdown_connections 2026-03-09T19:26:52.528 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.529+0000 7fd5fe21a640 1 --2- 192.168.123.107:0/1098214811 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5f81091a0 0x7fd5f8071c60 unknown :-1 s=CLOSED pgs=332 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.528 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.529+0000 7fd5fe21a640 1 --2- 192.168.123.107:0/1098214811 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5f81087d0 0x7fd5f8108bd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.529 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.529+0000 7fd5fe21a640 1 -- 192.168.123.107:0/1098214811 >> 192.168.123.107:0/1098214811 conn(0x7fd5f806d7b0 msgr2=0x7fd5f806fbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:52.529 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.530+0000 7fd5fe21a640 1 -- 192.168.123.107:0/1098214811 shutdown_connections 2026-03-09T19:26:52.529 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.530+0000 7fd5fe21a640 1 -- 192.168.123.107:0/1098214811 
wait complete. 2026-03-09T19:26:52.529 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.530+0000 7fd5fe21a640 1 Processor -- start 2026-03-09T19:26:52.530 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.531+0000 7fd5fe21a640 1 -- start start 2026-03-09T19:26:52.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.531+0000 7fd5fe21a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5f81087d0 0x7fd5f81a2c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.531+0000 7fd5fe21a640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5f81091a0 0x7fd5f81a3150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.531+0000 7fd5fe21a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5f81a3720 con 0x7fd5f81087d0 2026-03-09T19:26:52.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.531+0000 7fd5fe21a640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5f81a3890 con 0x7fd5f81091a0 2026-03-09T19:26:52.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.532+0000 7fd5fd218640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5f81087d0 0x7fd5f81a2c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:52.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.532+0000 7fd5fd218640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5f81087d0 0x7fd5f81a2c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.107:36060/0 (socket says 192.168.123.107:36060) 2026-03-09T19:26:52.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.532+0000 7fd5fd218640 1 -- 192.168.123.107:0/3326364771 learned_addr learned my addr 192.168.123.107:0/3326364771 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:52.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.532+0000 7fd5fd218640 1 -- 192.168.123.107:0/3326364771 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5f81091a0 msgr2=0x7fd5f81a3150 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.532+0000 7fd5fd218640 1 --2- 192.168.123.107:0/3326364771 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5f81091a0 0x7fd5f81a3150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.532+0000 7fd5fd218640 1 -- 192.168.123.107:0/3326364771 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd5e80071c0 con 0x7fd5f81087d0 2026-03-09T19:26:52.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.532+0000 7fd5fd218640 1 --2- 192.168.123.107:0/3326364771 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5f81087d0 0x7fd5f81a2c10 secure :-1 s=READY pgs=333 cs=0 l=1 rev1=1 crypto rx=0x7fd5e80079e0 tx=0x7fd5e8007eb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:52.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.533+0000 7fd5e67fc640 1 -- 192.168.123.107:0/3326364771 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5e8004490 con 0x7fd5f81087d0 2026-03-09T19:26:52.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.533+0000 
7fd5fe21a640 1 -- 192.168.123.107:0/3326364771 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd5ec009660 con 0x7fd5f81087d0 2026-03-09T19:26:52.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.533+0000 7fd5fe21a640 1 -- 192.168.123.107:0/3326364771 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd5f81a86e0 con 0x7fd5f81087d0 2026-03-09T19:26:52.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.534+0000 7fd5e67fc640 1 -- 192.168.123.107:0/3326364771 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd5e8010040 con 0x7fd5f81087d0 2026-03-09T19:26:52.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.534+0000 7fd5e67fc640 1 -- 192.168.123.107:0/3326364771 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5e800ec80 con 0x7fd5f81087d0 2026-03-09T19:26:52.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.536+0000 7fd5fe21a640 1 -- 192.168.123.107:0/3326364771 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd5c0005350 con 0x7fd5f81087d0 2026-03-09T19:26:52.536 INFO:tasks.workunit.client.1.vm08.stdout:5/33: sync 2026-03-09T19:26:52.536 INFO:tasks.workunit.client.1.vm08.stdout:8/60: sync 2026-03-09T19:26:52.536 INFO:tasks.workunit.client.1.vm08.stdout:0/28: sync 2026-03-09T19:26:52.536 INFO:tasks.workunit.client.1.vm08.stdout:9/44: sync 2026-03-09T19:26:52.536 INFO:tasks.workunit.client.1.vm08.stdout:9/45: readlink - no filename 2026-03-09T19:26:52.537 INFO:tasks.workunit.client.1.vm08.stdout:8/61: chown de/ff 45400 1 2026-03-09T19:26:52.537 INFO:tasks.workunit.client.1.vm08.stdout:5/34: write f0 [885260,60018] 0 2026-03-09T19:26:52.537 INFO:tasks.workunit.client.1.vm08.stdout:5/35: rmdir - no directory 2026-03-09T19:26:52.541 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.541+0000 7fd5e67fc640 1 -- 192.168.123.107:0/3326364771 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fd5e80143f0 con 0x7fd5f81087d0 2026-03-09T19:26:52.541 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.542+0000 7fd5e67fc640 1 --2- 192.168.123.107:0/3326364771 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5cc076290 0x7fd5cc078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.541 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.542+0000 7fd5e67fc640 1 -- 192.168.123.107:0/3326364771 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fd5e8097010 con 0x7fd5f81087d0 2026-03-09T19:26:52.542 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.543+0000 7fd5fca17640 1 --2- 192.168.123.107:0/3326364771 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5cc076290 0x7fd5cc078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:52.542 INFO:tasks.workunit.client.1.vm08.stdout:0/29: sync 2026-03-09T19:26:52.542 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.543+0000 7fd5fca17640 1 --2- 192.168.123.107:0/3326364771 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5cc076290 0x7fd5cc078750 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7fd5ec002af0 tx=0x7fd5ec005c50 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:52.544 INFO:tasks.workunit.client.1.vm08.stdout:5/36: dread f6 [0,4194304] 0 2026-03-09T19:26:52.545 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.546+0000 7fd5e67fc640 1 -- 192.168.123.107:0/3326364771 <== 
mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fd5e80cb820 con 0x7fd5f81087d0 2026-03-09T19:26:52.551 INFO:tasks.workunit.client.1.vm08.stdout:3/94: fdatasync d0/d8/d19/fd 0 2026-03-09T19:26:52.560 INFO:tasks.workunit.client.1.vm08.stdout:9/46: dread d0/d3/f5 [0,4194304] 0 2026-03-09T19:26:52.564 INFO:tasks.workunit.client.1.vm08.stdout:2/36: fsync d3/d4/f6 0 2026-03-09T19:26:52.564 INFO:tasks.workunit.client.1.vm08.stdout:2/37: readlink l0 0 2026-03-09T19:26:52.564 INFO:tasks.workunit.client.1.vm08.stdout:2/38: dread - d3/d4/fb zero size 2026-03-09T19:26:52.565 INFO:tasks.workunit.client.1.vm08.stdout:9/47: sync 2026-03-09T19:26:52.565 INFO:tasks.workunit.client.1.vm08.stdout:2/39: fdatasync d3/d4/f6 0 2026-03-09T19:26:52.566 INFO:tasks.workunit.client.1.vm08.stdout:9/48: chown d0/fa 20299 1 2026-03-09T19:26:52.575 INFO:tasks.workunit.client.1.vm08.stdout:8/62: write de/f10 [83995,110592] 0 2026-03-09T19:26:52.581 INFO:tasks.workunit.client.1.vm08.stdout:8/63: dwrite de/f10 [0,4194304] 0 2026-03-09T19:26:52.583 INFO:tasks.workunit.client.1.vm08.stdout:8/64: chown de/f11 1 1 2026-03-09T19:26:52.584 INFO:tasks.workunit.client.1.vm08.stdout:9/49: fsync d0/d3/fd 0 2026-03-09T19:26:52.595 INFO:tasks.workunit.client.1.vm08.stdout:5/37: rename ca to cc 0 2026-03-09T19:26:52.597 INFO:tasks.workunit.client.1.vm08.stdout:4/28: write f1 [671671,84581] 0 2026-03-09T19:26:52.598 INFO:tasks.workunit.client.1.vm08.stdout:4/29: write f5 [1072692,130555] 0 2026-03-09T19:26:52.599 INFO:tasks.workunit.client.1.vm08.stdout:4/30: truncate f5 2019745 0 2026-03-09T19:26:52.606 INFO:tasks.workunit.client.1.vm08.stdout:4/31: dwrite f5 [0,4194304] 0 2026-03-09T19:26:52.642 INFO:tasks.workunit.client.1.vm08.stdout:6/42: getdents d3/db 0 2026-03-09T19:26:52.685 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:52 vm07.local ceph-mon[48545]: pgmap v150: 65 pgs: 65 active+clean; 190 MiB data, 1.6 
GiB used, 118 GiB / 120 GiB avail; 4.6 MiB/s wr, 374 op/s 2026-03-09T19:26:52.686 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:52 vm07.local ceph-mon[48545]: from='client.14656 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:52.686 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:52 vm07.local ceph-mon[48545]: from='client.14660 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:52.686 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:52 vm07.local ceph-mon[48545]: from='client.24417 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:52.686 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:52 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/3334380433' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:26:52.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.685+0000 7fd5fe21a640 1 -- 192.168.123.107:0/3326364771 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd5c0002bf0 con 0x7fd5cc076290 2026-03-09T19:26:52.689 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.690+0000 7fd5e67fc640 1 -- 192.168.123.107:0/3326364771 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fd5c0002bf0 con 0x7fd5cc076290 2026-03-09T19:26:52.689 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:26:52.689 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:26:52.689 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:26:52.689 INFO:teuthology.orchestra.run.vm07.stdout: 
"which": "Upgrading all daemon types on all hosts", 2026-03-09T19:26:52.690 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [], 2026-03-09T19:26:52.690 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "0/23 daemons upgraded", 2026-03-09T19:26:52.690 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm08", 2026-03-09T19:26:52.690 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:26:52.690 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:26:52.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.693+0000 7fd5fe21a640 1 -- 192.168.123.107:0/3326364771 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5cc076290 msgr2=0x7fd5cc078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.693+0000 7fd5fe21a640 1 --2- 192.168.123.107:0/3326364771 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5cc076290 0x7fd5cc078750 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7fd5ec002af0 tx=0x7fd5ec005c50 comp rx=0 tx=0).stop 2026-03-09T19:26:52.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.693+0000 7fd5fe21a640 1 -- 192.168.123.107:0/3326364771 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5f81087d0 msgr2=0x7fd5f81a2c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.693+0000 7fd5fe21a640 1 --2- 192.168.123.107:0/3326364771 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5f81087d0 0x7fd5f81a2c10 secure :-1 s=READY pgs=333 cs=0 l=1 rev1=1 crypto rx=0x7fd5e80079e0 tx=0x7fd5e8007eb0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.692 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.693+0000 7fd5fe21a640 1 -- 192.168.123.107:0/3326364771 shutdown_connections 2026-03-09T19:26:52.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.693+0000 7fd5fe21a640 1 --2- 192.168.123.107:0/3326364771 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fd5cc076290 0x7fd5cc078750 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.693+0000 7fd5fe21a640 1 --2- 192.168.123.107:0/3326364771 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5f81091a0 0x7fd5f81a3150 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.693+0000 7fd5fe21a640 1 --2- 192.168.123.107:0/3326364771 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd5f81087d0 0x7fd5f81a2c10 unknown :-1 s=CLOSED pgs=333 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.693+0000 7fd5fe21a640 1 -- 192.168.123.107:0/3326364771 >> 192.168.123.107:0/3326364771 conn(0x7fd5f806d7b0 msgr2=0x7fd5f806f290 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:52.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.693+0000 7fd5fe21a640 1 -- 192.168.123.107:0/3326364771 shutdown_connections 2026-03-09T19:26:52.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.693+0000 7fd5fe21a640 1 -- 192.168.123.107:0/3326364771 wait complete. 
2026-03-09T19:26:52.749 INFO:tasks.workunit.client.1.vm08.stdout:1/90: dwrite d9/da/d12/f1b [0,4194304] 0 2026-03-09T19:26:52.782 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.783+0000 7f64677fe640 1 -- 192.168.123.107:0/1525657297 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6458038470 con 0x7f6470103c60 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.785+0000 7f6476c95640 1 -- 192.168.123.107:0/1525657297 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6470103c60 msgr2=0x7f64701040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.785+0000 7f6476c95640 1 --2- 192.168.123.107:0/1525657297 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6470103c60 0x7f64701040e0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f64580099b0 tx=0x7f645802f220 comp rx=0 tx=0).stop 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.785+0000 7f6476c95640 1 -- 192.168.123.107:0/1525657297 shutdown_connections 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.785+0000 7f6476c95640 1 --2- 192.168.123.107:0/1525657297 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6470103c60 0x7f64701040e0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.785+0000 7f6476c95640 1 --2- 192.168.123.107:0/1525657297 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6470102a60 0x7f6470102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.785+0000 7f6476c95640 1 -- 192.168.123.107:0/1525657297 >> 192.168.123.107:0/1525657297 
conn(0x7f64700fe250 msgr2=0x7f6470100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.785+0000 7f6476c95640 1 -- 192.168.123.107:0/1525657297 shutdown_connections 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.785+0000 7f6476c95640 1 -- 192.168.123.107:0/1525657297 wait complete. 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.785+0000 7f6476c95640 1 Processor -- start 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.785+0000 7f6476c95640 1 -- start start 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.787+0000 7f6476c95640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6470102a60 0x7f647019e8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.787+0000 7f6476c95640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6470103c60 0x7f647019ee30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.787+0000 7f6476c95640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f647019f370 con 0x7f6470102a60 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.787+0000 7f6476c95640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f647019f4e0 con 0x7f6470103c60 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.788+0000 7f6467fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6470103c60 0x7f647019ee30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.788+0000 7f6467fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6470103c60 0x7f647019ee30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:46808/0 (socket says 192.168.123.107:46808) 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.788+0000 7f6467fff640 1 -- 192.168.123.107:0/70297874 learned_addr learned my addr 192.168.123.107:0/70297874 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.788+0000 7f6467fff640 1 -- 192.168.123.107:0/70297874 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6470102a60 msgr2=0x7f647019e8f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.788+0000 7f6474a0a640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6470102a60 0x7f647019e8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.789+0000 7f6467fff640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6470102a60 0x7f647019e8f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.789+0000 7f6467fff640 1 -- 192.168.123.107:0/70297874 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6458009660 con 0x7f6470103c60 2026-03-09T19:26:52.792 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.789+0000 7f6474a0a640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6470102a60 0x7f647019e8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.789+0000 7f6467fff640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6470103c60 0x7f647019ee30 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f6458002410 tx=0x7f6458002f60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.790+0000 7f6465ffb640 1 -- 192.168.123.107:0/70297874 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f645803d070 con 0x7f6470103c60 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.790+0000 7f6476c95640 1 -- 192.168.123.107:0/70297874 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f64701a3f60 con 0x7f6470103c60 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.790+0000 7f6476c95640 1 -- 192.168.123.107:0/70297874 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f64701a43d0 con 0x7f6470103c60 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.790+0000 7f6465ffb640 1 -- 192.168.123.107:0/70297874 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6458031db0 con 0x7f6470103c60 2026-03-09T19:26:52.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.790+0000 7f6465ffb640 1 -- 192.168.123.107:0/70297874 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 
327+0+0 (secure 0 0 0) 0x7f6458031280 con 0x7f6470103c60 2026-03-09T19:26:52.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.793+0000 7f645f7fe640 1 -- 192.168.123.107:0/70297874 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f647010b690 con 0x7f6470103c60 2026-03-09T19:26:52.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.793+0000 7f6465ffb640 1 -- 192.168.123.107:0/70297874 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6458038910 con 0x7f6470103c60 2026-03-09T19:26:52.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.794+0000 7f6465ffb640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f644c0761c0 0x7f644c078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:26:52.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.794+0000 7f6465ffb640 1 -- 192.168.123.107:0/70297874 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f64580bc680 con 0x7f6470103c60 2026-03-09T19:26:52.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.794+0000 7f6474a0a640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f644c0761c0 0x7f644c078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:26:52.793 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.794+0000 7f6474a0a640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f644c0761c0 0x7f644c078680 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f6470103ac0 tx=0x7f6460005eb0 comp rx=0 tx=0).ready entity=mgr.14227 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:26:52.795 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.796+0000 7f6465ffb640 1 -- 192.168.123.107:0/70297874 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6458085bc0 con 0x7f6470103c60 2026-03-09T19:26:52.812 INFO:tasks.workunit.client.1.vm08.stdout:3/95: rmdir d0/d6/de/d1b/d16 39 2026-03-09T19:26:52.817 INFO:tasks.workunit.client.1.vm08.stdout:3/96: read d0/d6/de/d1b/fc [5576000,94194] 0 2026-03-09T19:26:52.819 INFO:tasks.workunit.client.1.vm08.stdout:9/50: creat d0/ff x:0 0 0 2026-03-09T19:26:52.828 INFO:tasks.workunit.client.1.vm08.stdout:9/51: fdatasync d0/d3/f9 0 2026-03-09T19:26:52.828 INFO:tasks.workunit.client.1.vm08.stdout:9/52: dread - d0/d3/fd zero size 2026-03-09T19:26:52.828 INFO:tasks.workunit.client.1.vm08.stdout:9/53: stat d0/d2/d8/cc 0 2026-03-09T19:26:52.828 INFO:tasks.workunit.client.1.vm08.stdout:6/43: symlink d3/db/lf 0 2026-03-09T19:26:52.828 INFO:tasks.workunit.client.1.vm08.stdout:6/44: dread f1 [0,4194304] 0 2026-03-09T19:26:52.828 INFO:tasks.workunit.client.1.vm08.stdout:2/40: mkdir d3/d4/d10/d11 0 2026-03-09T19:26:52.830 INFO:tasks.workunit.client.1.vm08.stdout:9/54: symlink d0/d2/d8/l10 0 2026-03-09T19:26:52.833 INFO:tasks.workunit.client.1.vm08.stdout:4/32: unlink l3 0 2026-03-09T19:26:52.835 INFO:tasks.workunit.client.1.vm08.stdout:4/33: dread f2 [0,4194304] 0 2026-03-09T19:26:52.836 INFO:tasks.workunit.client.1.vm08.stdout:6/45: creat d3/f10 x:0 0 0 2026-03-09T19:26:52.837 INFO:tasks.workunit.client.1.vm08.stdout:1/91: rename l5 to d9/da/l21 0 2026-03-09T19:26:52.839 INFO:tasks.workunit.client.1.vm08.stdout:0/30: getdents . 
0 2026-03-09T19:26:52.840 INFO:tasks.workunit.client.1.vm08.stdout:9/55: creat d0/d3/f11 x:0 0 0 2026-03-09T19:26:52.841 INFO:tasks.workunit.client.1.vm08.stdout:9/56: truncate d0/d3/f9 287173 0 2026-03-09T19:26:52.842 INFO:tasks.workunit.client.1.vm08.stdout:4/34: symlink l6 0 2026-03-09T19:26:52.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:52 vm08.local ceph-mon[57794]: pgmap v150: 65 pgs: 65 active+clean; 190 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 4.6 MiB/s wr, 374 op/s 2026-03-09T19:26:52.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:52 vm08.local ceph-mon[57794]: from='client.14656 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:52.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:52 vm08.local ceph-mon[57794]: from='client.14660 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:52.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:52 vm08.local ceph-mon[57794]: from='client.24417 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:26:52.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:52 vm08.local ceph-mon[57794]: from='client.? 
192.168.123.107:0/3334380433' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:26:52.845 INFO:tasks.workunit.client.1.vm08.stdout:1/92: readlink d9/d11/l18 0 2026-03-09T19:26:52.846 INFO:tasks.workunit.client.1.vm08.stdout:2/41: rename d3/d4/d10/d11 to d3/d4/d12 0 2026-03-09T19:26:52.864 INFO:tasks.workunit.client.1.vm08.stdout:2/42: fdatasync d3/d4/f6 0 2026-03-09T19:26:52.864 INFO:tasks.workunit.client.1.vm08.stdout:3/97: creat d0/d6/de/d1b/d16/d17/f1d x:0 0 0 2026-03-09T19:26:52.864 INFO:tasks.workunit.client.1.vm08.stdout:8/65: getdents de 0 2026-03-09T19:26:52.864 INFO:tasks.workunit.client.1.vm08.stdout:2/43: dwrite d3/d4/f6 [0,4194304] 0 2026-03-09T19:26:52.864 INFO:tasks.workunit.client.1.vm08.stdout:0/31: rename l4 to l9 0 2026-03-09T19:26:52.864 INFO:tasks.workunit.client.1.vm08.stdout:5/38: getdents . 0 2026-03-09T19:26:52.864 INFO:tasks.workunit.client.1.vm08.stdout:4/35: rename l6 to l7 0 2026-03-09T19:26:52.864 INFO:tasks.workunit.client.1.vm08.stdout:4/36: dwrite f1 [0,4194304] 0 2026-03-09T19:26:52.873 INFO:tasks.workunit.client.1.vm08.stdout:6/46: link d3/f9 d3/f11 0 2026-03-09T19:26:52.873 INFO:tasks.workunit.client.1.vm08.stdout:9/57: rename d0/d2/d8/l10 to d0/l12 0 2026-03-09T19:26:52.874 INFO:tasks.workunit.client.1.vm08.stdout:8/66: mknod de/c12 0 2026-03-09T19:26:52.874 INFO:tasks.workunit.client.1.vm08.stdout:8/67: stat f6 0 2026-03-09T19:26:52.874 INFO:tasks.workunit.client.1.vm08.stdout:8/68: stat l2 0 2026-03-09T19:26:52.878 INFO:tasks.workunit.client.1.vm08.stdout:0/32: chown f3 14249783 1 2026-03-09T19:26:52.880 INFO:tasks.workunit.client.1.vm08.stdout:5/39: rename f6 to fd 0 2026-03-09T19:26:52.881 INFO:tasks.workunit.client.1.vm08.stdout:4/37: mknod c8 0 2026-03-09T19:26:52.883 INFO:tasks.workunit.client.1.vm08.stdout:4/38: dread f5 [0,4194304] 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:4/39: write f5 [3528609,81539] 0 2026-03-09T19:26:52.923 
INFO:tasks.workunit.client.1.vm08.stdout:6/47: creat d3/f12 x:0 0 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:1/93: rename f6 to d9/f22 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:1/94: write d9/da/dc/f20 [809838,74810] 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:3/98: truncate d0/d8/d19/fd 1379855 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:0/33: write f7 [751489,92210] 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:0/34: fsync f3 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:6/48: creat d3/f13 x:0 0 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:4/40: dwrite f2 [0,4194304] 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:4/41: dwrite f2 [0,4194304] 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:9/58: rename d0/d3/f11 to d0/f13 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:1/95: unlink d9/da/d12/fe 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:1/96: chown d9/da/dc/f20 28 1 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:1/97: read d9/f22 [1599915,118570] 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:8/69: symlink de/l13 0 2026-03-09T19:26:52.923 INFO:tasks.workunit.client.1.vm08.stdout:8/70: dread f7 [0,4194304] 0 2026-03-09T19:26:52.927 INFO:tasks.workunit.client.1.vm08.stdout:6/49: write d3/f9 [1457133,67522] 0 2026-03-09T19:26:52.931 INFO:tasks.workunit.client.1.vm08.stdout:3/99: mknod d0/d6/de/d1b/c1e 0 2026-03-09T19:26:52.932 INFO:tasks.workunit.client.1.vm08.stdout:8/71: rmdir de 39 2026-03-09T19:26:52.934 INFO:tasks.workunit.client.1.vm08.stdout:6/50: creat d3/db/f14 x:0 0 0 2026-03-09T19:26:52.939 INFO:tasks.workunit.client.1.vm08.stdout:0/35: creat fa x:0 0 0 2026-03-09T19:26:52.965 INFO:tasks.workunit.client.1.vm08.stdout:5/40: rename cc to ce 0 2026-03-09T19:26:52.965 
INFO:tasks.workunit.client.1.vm08.stdout:1/98: link d9/da/dc/ld d9/da/l23 0 2026-03-09T19:26:52.965 INFO:tasks.workunit.client.1.vm08.stdout:5/41: creat ff x:0 0 0 2026-03-09T19:26:52.965 INFO:tasks.workunit.client.1.vm08.stdout:8/72: rename c8 to de/c14 0 2026-03-09T19:26:52.965 INFO:tasks.workunit.client.1.vm08.stdout:1/99: getdents d9/d11 0 2026-03-09T19:26:52.965 INFO:tasks.workunit.client.1.vm08.stdout:1/100: stat d9 0 2026-03-09T19:26:52.965 INFO:tasks.workunit.client.1.vm08.stdout:1/101: write d9/da/dc/f20 [95294,14511] 0 2026-03-09T19:26:52.965 INFO:tasks.workunit.client.1.vm08.stdout:5/42: creat f10 x:0 0 0 2026-03-09T19:26:52.965 INFO:tasks.workunit.client.1.vm08.stdout:1/102: dwrite d9/f22 [4194304,4194304] 0 2026-03-09T19:26:52.965 INFO:tasks.workunit.client.1.vm08.stdout:8/73: rename f7 to de/f15 0 2026-03-09T19:26:52.966 INFO:tasks.workunit.client.1.vm08.stdout:5/43: fdatasync f2 0 2026-03-09T19:26:52.966 INFO:tasks.workunit.client.1.vm08.stdout:8/74: chown de/f11 28570889 1 2026-03-09T19:26:52.967 INFO:tasks.workunit.client.1.vm08.stdout:1/103: link l8 d9/da/l24 0 2026-03-09T19:26:52.968 INFO:tasks.workunit.client.1.vm08.stdout:1/104: chown f2 83 1 2026-03-09T19:26:52.971 INFO:tasks.workunit.client.1.vm08.stdout:1/105: rmdir d9 39 2026-03-09T19:26:52.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.972+0000 7f645f7fe640 1 -- 192.168.123.107:0/70297874 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f647010b820 con 0x7f6470103c60 2026-03-09T19:26:52.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.973+0000 7f6465ffb640 1 -- 192.168.123.107:0/70297874 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f6458085560 con 0x7f6470103c60 2026-03-09T19:26:52.973 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:26:52.974 
INFO:tasks.workunit.client.1.vm08.stdout:1/106: mknod d9/da/dc/c25 0 2026-03-09T19:26:52.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.977+0000 7f645f7fe640 1 -- 192.168.123.107:0/70297874 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f644c0761c0 msgr2=0x7f644c078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.977+0000 7f645f7fe640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f644c0761c0 0x7f644c078680 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f6470103ac0 tx=0x7f6460005eb0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.977+0000 7f645f7fe640 1 -- 192.168.123.107:0/70297874 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6470103c60 msgr2=0x7f647019ee30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:26:52.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.977+0000 7f645f7fe640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6470103c60 0x7f647019ee30 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f6458002410 tx=0x7f6458002f60 comp rx=0 tx=0).stop 2026-03-09T19:26:52.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.977+0000 7f645f7fe640 1 -- 192.168.123.107:0/70297874 shutdown_connections 2026-03-09T19:26:52.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.977+0000 7f645f7fe640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f644c0761c0 0x7f644c078680 secure :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f6470103ac0 tx=0x7f6460005eb0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.977+0000 
7f645f7fe640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6470103c60 0x7f647019ee30 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.977+0000 7f645f7fe640 1 --2- 192.168.123.107:0/70297874 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6470102a60 0x7f647019e8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:26:52.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.977+0000 7f645f7fe640 1 -- 192.168.123.107:0/70297874 >> 192.168.123.107:0/70297874 conn(0x7f64700fe250 msgr2=0x7f64700ffd10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:26:52.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.977+0000 7f645f7fe640 1 -- 192.168.123.107:0/70297874 shutdown_connections 2026-03-09T19:26:52.977 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:26:52.977+0000 7f645f7fe640 1 -- 192.168.123.107:0/70297874 wait complete. 
2026-03-09T19:26:52.979 INFO:tasks.workunit.client.1.vm08.stdout:1/107: mknod d9/d11/c26 0 2026-03-09T19:26:52.979 INFO:tasks.workunit.client.1.vm08.stdout:1/108: chown d9/da/l13 819 1 2026-03-09T19:26:52.982 INFO:tasks.workunit.client.1.vm08.stdout:1/109: unlink d9/da/f1f 0 2026-03-09T19:26:52.984 INFO:tasks.workunit.client.1.vm08.stdout:1/110: mknod d9/d11/c27 0 2026-03-09T19:26:52.984 INFO:tasks.workunit.client.1.vm08.stdout:1/111: truncate d9/da/dc/f1d 968778 0 2026-03-09T19:26:52.988 INFO:tasks.workunit.client.1.vm08.stdout:1/112: dwrite d9/da/dc/f10 [0,4194304] 0 2026-03-09T19:26:52.992 INFO:tasks.workunit.client.1.vm08.stdout:1/113: write d9/da/d12/f1b [2084548,110292] 0 2026-03-09T19:26:53.013 INFO:tasks.workunit.client.1.vm08.stdout:1/114: symlink d9/da/dc/l28 0 2026-03-09T19:26:53.034 INFO:tasks.workunit.client.1.vm08.stdout:1/115: creat d9/d11/f29 x:0 0 0 2026-03-09T19:26:53.034 INFO:tasks.workunit.client.1.vm08.stdout:1/116: chown d9/da/c1a 136571640 1 2026-03-09T19:26:53.035 INFO:tasks.workunit.client.1.vm08.stdout:1/117: stat d9/da/d17 0 2026-03-09T19:26:53.035 INFO:tasks.workunit.client.1.vm08.stdout:1/118: rename d9/da/d12/f1b to d9/da/d17/f2a 0 2026-03-09T19:26:53.035 INFO:tasks.workunit.client.1.vm08.stdout:1/119: mknod d9/da/dc/c2b 0 2026-03-09T19:26:53.035 INFO:tasks.workunit.client.1.vm08.stdout:1/120: dwrite d9/d11/f29 [0,4194304] 0 2026-03-09T19:26:53.035 INFO:tasks.workunit.client.1.vm08.stdout:1/121: unlink d9/da/c1a 0 2026-03-09T19:26:53.035 INFO:tasks.workunit.client.1.vm08.stdout:1/122: chown d9/d11/c26 3 1 2026-03-09T19:26:53.035 INFO:tasks.workunit.client.1.vm08.stdout:1/123: chown d9/da/d12 4026055 1 2026-03-09T19:26:53.038 INFO:tasks.workunit.client.1.vm08.stdout:2/44: sync 2026-03-09T19:26:53.106 INFO:tasks.workunit.client.1.vm08.stdout:2/45: fsync d3/d4/f6 0 2026-03-09T19:26:53.114 INFO:tasks.workunit.client.1.vm08.stdout:4/42: fsync f5 0 2026-03-09T19:26:53.119 INFO:tasks.workunit.client.1.vm08.stdout:4/43: creat f9 x:0 0 0 
2026-03-09T19:26:53.120 INFO:tasks.workunit.client.1.vm08.stdout:2/46: symlink d3/l13 0 2026-03-09T19:26:53.120 INFO:tasks.workunit.client.1.vm08.stdout:2/47: chown d3/d9/dc/de 596248 1 2026-03-09T19:26:53.122 INFO:tasks.workunit.client.1.vm08.stdout:4/44: mkdir da 0 2026-03-09T19:26:53.127 INFO:tasks.workunit.client.1.vm08.stdout:2/48: mkdir d3/d9/dc/d14 0 2026-03-09T19:26:53.131 INFO:tasks.workunit.client.1.vm08.stdout:2/49: mknod d3/d9/c15 0 2026-03-09T19:26:53.134 INFO:tasks.workunit.client.1.vm08.stdout:4/45: mkdir da/db 0 2026-03-09T19:26:53.138 INFO:tasks.workunit.client.1.vm08.stdout:2/50: symlink d3/d9/dc/de/l16 0 2026-03-09T19:26:53.138 INFO:tasks.workunit.client.1.vm08.stdout:2/51: write d3/f7 [413400,22407] 0 2026-03-09T19:26:53.143 INFO:tasks.workunit.client.1.vm08.stdout:2/52: dwrite d3/d4/fd [0,4194304] 0 2026-03-09T19:26:53.145 INFO:tasks.workunit.client.1.vm08.stdout:4/46: symlink da/lc 0 2026-03-09T19:26:53.157 INFO:tasks.workunit.client.1.vm08.stdout:4/47: rmdir da/db 0 2026-03-09T19:26:53.161 INFO:tasks.workunit.client.1.vm08.stdout:4/48: link f5 da/fd 0 2026-03-09T19:26:53.188 INFO:tasks.workunit.client.1.vm08.stdout:6/51: fsync d3/f11 0 2026-03-09T19:26:53.190 INFO:tasks.workunit.client.1.vm08.stdout:9/59: getdents d0/d2/d8 0 2026-03-09T19:26:53.193 INFO:tasks.workunit.client.1.vm08.stdout:6/52: mkdir d3/d15 0 2026-03-09T19:26:53.196 INFO:tasks.workunit.client.1.vm08.stdout:9/60: unlink d0/ff 0 2026-03-09T19:26:53.197 INFO:tasks.workunit.client.1.vm08.stdout:2/53: rmdir d3/d4/d12 0 2026-03-09T19:26:53.198 INFO:tasks.workunit.client.1.vm08.stdout:2/54: write d3/f7 [1270737,102839] 0 2026-03-09T19:26:53.206 INFO:tasks.workunit.client.1.vm08.stdout:6/53: symlink d3/db/l16 0 2026-03-09T19:26:53.206 INFO:tasks.workunit.client.1.vm08.stdout:6/54: dread - d3/fc zero size 2026-03-09T19:26:53.210 INFO:tasks.workunit.client.1.vm08.stdout:9/61: mkdir d0/d2/d14 0 2026-03-09T19:26:53.215 INFO:tasks.workunit.client.1.vm08.stdout:6/55: dwrite d3/f9 
[0,4194304] 0 2026-03-09T19:26:53.223 INFO:tasks.workunit.client.1.vm08.stdout:2/55: rename d3/d9/fa to d3/d9/dc/de/f17 0 2026-03-09T19:26:53.224 INFO:tasks.workunit.client.1.vm08.stdout:2/56: fdatasync f1 0 2026-03-09T19:26:53.224 INFO:tasks.workunit.client.1.vm08.stdout:0/36: getdents . 0 2026-03-09T19:26:53.231 INFO:tasks.workunit.client.1.vm08.stdout:6/56: unlink d3/c8 0 2026-03-09T19:26:53.231 INFO:tasks.workunit.client.1.vm08.stdout:6/57: chown d3/f7 64 1 2026-03-09T19:26:53.232 INFO:tasks.workunit.client.1.vm08.stdout:6/58: write d3/fe [88071,121007] 0 2026-03-09T19:26:53.239 INFO:tasks.workunit.client.1.vm08.stdout:6/59: dwrite d3/db/f14 [0,4194304] 0 2026-03-09T19:26:53.240 INFO:tasks.workunit.client.1.vm08.stdout:2/57: mkdir d3/d9/dc/de/d18 0 2026-03-09T19:26:53.240 INFO:tasks.workunit.client.1.vm08.stdout:6/60: chown d3/d15 441301 1 2026-03-09T19:26:53.255 INFO:tasks.workunit.client.1.vm08.stdout:8/75: dwrite f6 [4194304,4194304] 0 2026-03-09T19:26:53.260 INFO:tasks.workunit.client.1.vm08.stdout:8/76: dwrite de/f11 [0,4194304] 0 2026-03-09T19:26:53.273 INFO:tasks.workunit.client.1.vm08.stdout:5/44: truncate f1 2096392 0 2026-03-09T19:26:53.280 INFO:tasks.workunit.client.1.vm08.stdout:6/61: creat d3/f17 x:0 0 0 2026-03-09T19:26:53.280 INFO:tasks.workunit.client.1.vm08.stdout:6/62: write d3/f11 [844155,67781] 0 2026-03-09T19:26:53.283 INFO:tasks.workunit.client.1.vm08.stdout:0/37: getdents . 
0 2026-03-09T19:26:53.283 INFO:tasks.workunit.client.1.vm08.stdout:0/38: readlink l9 0 2026-03-09T19:26:53.294 INFO:tasks.workunit.client.1.vm08.stdout:1/124: truncate d9/da/d17/f2a 221291 0 2026-03-09T19:26:53.295 INFO:tasks.workunit.client.1.vm08.stdout:0/39: creat fb x:0 0 0 2026-03-09T19:26:53.295 INFO:tasks.workunit.client.1.vm08.stdout:0/40: dread - fb zero size 2026-03-09T19:26:53.295 INFO:tasks.workunit.client.1.vm08.stdout:9/62: link d0/l12 d0/d2/d8/d7/l15 0 2026-03-09T19:26:53.300 INFO:tasks.workunit.client.1.vm08.stdout:2/58: rename d3/d4/fb to d3/f19 0 2026-03-09T19:26:53.302 INFO:tasks.workunit.client.1.vm08.stdout:2/59: dread f1 [0,4194304] 0 2026-03-09T19:26:53.303 INFO:tasks.workunit.client.1.vm08.stdout:6/63: getdents d3/d15 0 2026-03-09T19:26:53.304 INFO:tasks.workunit.client.1.vm08.stdout:6/64: write d3/f11 [5065169,130180] 0 2026-03-09T19:26:53.308 INFO:tasks.workunit.client.1.vm08.stdout:1/125: mkdir d9/da/d2c 0 2026-03-09T19:26:53.310 INFO:tasks.workunit.client.1.vm08.stdout:0/41: dread f3 [0,4194304] 0 2026-03-09T19:26:53.313 INFO:tasks.workunit.client.1.vm08.stdout:9/63: rmdir d0/d2/d8 39 2026-03-09T19:26:53.314 INFO:tasks.workunit.client.1.vm08.stdout:8/77: creat de/f16 x:0 0 0 2026-03-09T19:26:53.317 INFO:tasks.workunit.client.1.vm08.stdout:4/49: getdents da 0 2026-03-09T19:26:53.319 INFO:tasks.workunit.client.1.vm08.stdout:7/47: write d5/f9 [31638,48420] 0 2026-03-09T19:26:53.323 INFO:tasks.workunit.client.1.vm08.stdout:6/65: dread d3/f7 [0,4194304] 0 2026-03-09T19:26:53.342 INFO:tasks.workunit.client.1.vm08.stdout:9/64: creat d0/f16 x:0 0 0 2026-03-09T19:26:53.343 INFO:tasks.workunit.client.1.vm08.stdout:3/100: dwrite d0/d8/d19/fd [0,4194304] 0 2026-03-09T19:26:53.343 INFO:tasks.workunit.client.1.vm08.stdout:4/50: fdatasync f9 0 2026-03-09T19:26:53.343 INFO:tasks.workunit.client.1.vm08.stdout:3/101: dwrite d0/d6/de/d1b/d16/d17/f1d [0,4194304] 0 2026-03-09T19:26:53.343 INFO:tasks.workunit.client.1.vm08.stdout:2/60: creat 
d3/d9/dc/de/d18/f1a x:0 0 0 2026-03-09T19:26:53.343 INFO:tasks.workunit.client.1.vm08.stdout:2/61: readlink l0 0 2026-03-09T19:26:53.347 INFO:tasks.workunit.client.1.vm08.stdout:6/66: mknod d3/db/c18 0 2026-03-09T19:26:53.353 INFO:tasks.workunit.client.1.vm08.stdout:6/67: dwrite d3/f13 [0,4194304] 0 2026-03-09T19:26:53.375 INFO:tasks.workunit.client.1.vm08.stdout:8/78: mknod de/c17 0 2026-03-09T19:26:53.375 INFO:tasks.workunit.client.1.vm08.stdout:8/79: chown l2 669 1 2026-03-09T19:26:53.375 INFO:tasks.workunit.client.1.vm08.stdout:5/45: link c7 c11 0 2026-03-09T19:26:53.375 INFO:tasks.workunit.client.1.vm08.stdout:7/48: rename c1 to d5/c11 0 2026-03-09T19:26:53.375 INFO:tasks.workunit.client.1.vm08.stdout:3/102: rename d0 to d0/d6/de/d1f 22 2026-03-09T19:26:53.375 INFO:tasks.workunit.client.1.vm08.stdout:2/62: mknod d3/d4/d10/c1b 0 2026-03-09T19:26:53.375 INFO:tasks.workunit.client.1.vm08.stdout:2/63: readlink d3/d9/dc/de/l16 0 2026-03-09T19:26:53.375 INFO:tasks.workunit.client.1.vm08.stdout:4/51: symlink da/le 0 2026-03-09T19:26:53.376 INFO:tasks.workunit.client.1.vm08.stdout:3/103: unlink d0/d6/de/d1b/la 0 2026-03-09T19:26:53.379 INFO:tasks.workunit.client.1.vm08.stdout:2/64: creat d3/d9/dc/de/f1c x:0 0 0 2026-03-09T19:26:53.381 INFO:tasks.workunit.client.1.vm08.stdout:2/65: dread d3/d4/fd [0,4194304] 0 2026-03-09T19:26:53.382 INFO:tasks.workunit.client.1.vm08.stdout:2/66: chown d3/d4/d10/c1b 33303 1 2026-03-09T19:26:53.383 INFO:tasks.workunit.client.1.vm08.stdout:6/68: creat d3/d15/f19 x:0 0 0 2026-03-09T19:26:53.384 INFO:tasks.workunit.client.1.vm08.stdout:2/67: read d3/f7 [298820,60638] 0 2026-03-09T19:26:53.384 INFO:tasks.workunit.client.1.vm08.stdout:6/69: chown d3/ca 1 1 2026-03-09T19:26:53.388 INFO:tasks.workunit.client.1.vm08.stdout:7/49: dread d5/fb [0,4194304] 0 2026-03-09T19:26:53.389 INFO:tasks.workunit.client.1.vm08.stdout:6/70: dwrite d3/fc [0,4194304] 0 2026-03-09T19:26:53.389 INFO:tasks.workunit.client.1.vm08.stdout:6/71: stat d3/f11 0 
2026-03-09T19:26:53.391 INFO:tasks.workunit.client.1.vm08.stdout:2/68: fdatasync d3/d9/dc/de/f17 0 2026-03-09T19:26:53.391 INFO:tasks.workunit.client.1.vm08.stdout:5/46: creat f12 x:0 0 0 2026-03-09T19:26:53.395 INFO:tasks.workunit.client.1.vm08.stdout:6/72: symlink d3/db/l1a 0 2026-03-09T19:26:53.415 INFO:tasks.workunit.client.1.vm08.stdout:6/73: write d3/f10 [695491,98257] 0 2026-03-09T19:26:53.415 INFO:tasks.workunit.client.1.vm08.stdout:2/69: mknod d3/d9/dc/d14/c1d 0 2026-03-09T19:26:53.415 INFO:tasks.workunit.client.1.vm08.stdout:2/70: write f2 [154818,121041] 0 2026-03-09T19:26:53.415 INFO:tasks.workunit.client.1.vm08.stdout:5/47: rename f12 to f13 0 2026-03-09T19:26:53.415 INFO:tasks.workunit.client.1.vm08.stdout:2/71: truncate d3/d4/f8 1579655 0 2026-03-09T19:26:53.415 INFO:tasks.workunit.client.1.vm08.stdout:5/48: write f13 [48134,28810] 0 2026-03-09T19:26:53.415 INFO:tasks.workunit.client.1.vm08.stdout:2/72: creat d3/d9/f1e x:0 0 0 2026-03-09T19:26:53.415 INFO:tasks.workunit.client.1.vm08.stdout:2/73: write d3/d9/dc/de/d18/f1a [844120,26605] 0 2026-03-09T19:26:53.415 INFO:tasks.workunit.client.1.vm08.stdout:2/74: mkdir d3/d9/dc/de/d18/d1f 0 2026-03-09T19:26:53.415 INFO:tasks.workunit.client.1.vm08.stdout:5/49: dread f0 [0,4194304] 0 2026-03-09T19:26:53.416 INFO:tasks.workunit.client.1.vm08.stdout:5/50: creat f14 x:0 0 0 2026-03-09T19:26:53.417 INFO:tasks.workunit.client.1.vm08.stdout:5/51: creat f15 x:0 0 0 2026-03-09T19:26:53.419 INFO:tasks.workunit.client.1.vm08.stdout:5/52: mkdir d16 0 2026-03-09T19:26:53.423 INFO:tasks.workunit.client.1.vm08.stdout:5/53: dwrite f10 [0,4194304] 0 2026-03-09T19:26:53.424 INFO:tasks.workunit.client.1.vm08.stdout:5/54: dread - f14 zero size 2026-03-09T19:26:53.426 INFO:tasks.workunit.client.1.vm08.stdout:5/55: creat d16/f17 x:0 0 0 2026-03-09T19:26:53.426 INFO:tasks.workunit.client.1.vm08.stdout:5/56: fdatasync f15 0 2026-03-09T19:26:53.427 INFO:tasks.workunit.client.1.vm08.stdout:5/57: write f15 [476912,43516] 0 
2026-03-09T19:26:53.602 INFO:tasks.workunit.client.1.vm08.stdout:1/126: fsync d9/da/d17/f2a 0 2026-03-09T19:26:53.603 INFO:tasks.workunit.client.1.vm08.stdout:1/127: write d9/d11/f29 [4597935,1553] 0 2026-03-09T19:26:53.607 INFO:tasks.workunit.client.1.vm08.stdout:1/128: dwrite d9/da/dc/f20 [0,4194304] 0 2026-03-09T19:26:53.623 INFO:tasks.workunit.client.1.vm08.stdout:0/42: getdents . 0 2026-03-09T19:26:53.664 INFO:tasks.workunit.client.1.vm08.stdout:7/50: getdents d5 0 2026-03-09T19:26:53.664 INFO:tasks.workunit.client.1.vm08.stdout:7/51: fsync d5/fa 0 2026-03-09T19:26:53.667 INFO:tasks.workunit.client.1.vm08.stdout:9/65: truncate d0/fa 1426174 0 2026-03-09T19:26:53.673 INFO:tasks.workunit.client.1.vm08.stdout:8/80: dwrite de/f15 [0,4194304] 0 2026-03-09T19:26:53.689 INFO:tasks.workunit.client.1.vm08.stdout:4/52: write f5 [1156092,88212] 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:3/104: truncate d0/d8/d19/fd 51677 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:6/74: dwrite d3/f7 [0,4194304] 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:2/75: dread d3/d4/f6 [0,4194304] 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:5/58: dread f1 [0,4194304] 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:7/52: mkdir d5/d12 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:4/53: dread f2 [0,4194304] 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:3/105: symlink d0/d6/l20 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:4/54: dwrite f5 [0,4194304] 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:2/76: creat d3/d9/f20 x:0 0 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:2/77: dread - d3/f19 zero size 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:9/66: rmdir d0/d2/d8/d7 39 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:2/78: read - d3/f19 zero size 
2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:8/81: mknod de/c18 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:8/82: stat de/c12 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:2/79: write d3/d9/dc/de/d18/f1a [1876361,82005] 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:5/59: dwrite f5 [0,4194304] 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:2/80: chown d3/d4/d10 18078 1 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:3/106: creat d0/d6/de/d1b/d16/f21 x:0 0 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:6/75: link d3/fc d3/f1b 0 2026-03-09T19:26:53.743 INFO:tasks.workunit.client.1.vm08.stdout:6/76: write d3/d15/f19 [337119,112610] 0 2026-03-09T19:26:53.745 INFO:tasks.workunit.client.1.vm08.stdout:6/77: chown d3/f9 1 1 2026-03-09T19:26:53.749 INFO:tasks.workunit.client.1.vm08.stdout:4/55: dwrite f1 [0,4194304] 0 2026-03-09T19:26:53.761 INFO:tasks.workunit.client.1.vm08.stdout:5/60: write f1 [316574,893] 0 2026-03-09T19:26:53.761 INFO:tasks.workunit.client.1.vm08.stdout:4/56: dread da/fd [0,4194304] 0 2026-03-09T19:26:53.761 INFO:tasks.workunit.client.1.vm08.stdout:9/67: creat d0/d2/d14/f17 x:0 0 0 2026-03-09T19:26:53.761 INFO:tasks.workunit.client.1.vm08.stdout:6/78: creat d3/d15/f1c x:0 0 0 2026-03-09T19:26:53.761 INFO:tasks.workunit.client.1.vm08.stdout:5/61: creat d16/f18 x:0 0 0 2026-03-09T19:26:53.779 INFO:tasks.workunit.client.1.vm08.stdout:2/81: mknod d3/d9/dc/de/d18/d1f/c21 0 2026-03-09T19:26:53.779 INFO:tasks.workunit.client.1.vm08.stdout:4/57: mknod da/cf 0 2026-03-09T19:26:53.779 INFO:tasks.workunit.client.1.vm08.stdout:9/68: mknod d0/d2/d8/d7/c18 0 2026-03-09T19:26:53.779 INFO:tasks.workunit.client.1.vm08.stdout:6/79: mknod d3/c1d 0 2026-03-09T19:26:53.782 INFO:tasks.workunit.client.1.vm08.stdout:4/58: dread da/fd [0,4194304] 0 2026-03-09T19:26:53.787 INFO:tasks.workunit.client.1.vm08.stdout:8/83: creat de/f19 x:0 0 0 
2026-03-09T19:26:53.789 INFO:tasks.workunit.client.1.vm08.stdout:8/84: write f1 [8629817,63837] 0 2026-03-09T19:26:53.789 INFO:tasks.workunit.client.1.vm08.stdout:8/85: chown f1 3 1 2026-03-09T19:26:53.797 INFO:tasks.workunit.client.1.vm08.stdout:6/80: creat d3/d15/f1e x:0 0 0 2026-03-09T19:26:53.797 INFO:tasks.workunit.client.1.vm08.stdout:2/82: symlink d3/d9/dc/de/d18/d1f/l22 0 2026-03-09T19:26:53.797 INFO:tasks.workunit.client.1.vm08.stdout:9/69: creat d0/d2/d14/f19 x:0 0 0 2026-03-09T19:26:53.798 INFO:tasks.workunit.client.1.vm08.stdout:4/59: readlink l4 0 2026-03-09T19:26:53.800 INFO:tasks.workunit.client.1.vm08.stdout:6/81: symlink d3/l1f 0 2026-03-09T19:26:53.800 INFO:tasks.workunit.client.1.vm08.stdout:9/70: fsync d0/f13 0 2026-03-09T19:26:53.800 INFO:tasks.workunit.client.1.vm08.stdout:2/83: rmdir d3/d4/d10 39 2026-03-09T19:26:53.801 INFO:tasks.workunit.client.1.vm08.stdout:9/71: write d0/d2/d8/fe [211667,15125] 0 2026-03-09T19:26:53.807 INFO:tasks.workunit.client.1.vm08.stdout:0/43: sync 2026-03-09T19:26:53.811 INFO:tasks.workunit.client.1.vm08.stdout:0/44: dwrite f7 [0,4194304] 0 2026-03-09T19:26:53.819 INFO:tasks.workunit.client.1.vm08.stdout:7/53: sync 2026-03-09T19:26:53.823 INFO:tasks.workunit.client.1.vm08.stdout:3/107: sync 2026-03-09T19:26:53.829 INFO:tasks.workunit.client.1.vm08.stdout:4/60: write f2 [3813457,71937] 0 2026-03-09T19:26:53.829 INFO:tasks.workunit.client.1.vm08.stdout:0/45: creat fc x:0 0 0 2026-03-09T19:26:53.829 INFO:tasks.workunit.client.1.vm08.stdout:7/54: fdatasync d5/f7 0 2026-03-09T19:26:53.835 INFO:tasks.workunit.client.1.vm08.stdout:8/86: link l5 de/l1a 0 2026-03-09T19:26:53.842 INFO:tasks.workunit.client.1.vm08.stdout:2/84: mkdir d3/d4/d23 0 2026-03-09T19:26:53.842 INFO:tasks.workunit.client.1.vm08.stdout:0/46: readlink l8 0 2026-03-09T19:26:53.842 INFO:tasks.workunit.client.1.vm08.stdout:2/85: dread f2 [0,4194304] 0 2026-03-09T19:26:53.843 INFO:tasks.workunit.client.1.vm08.stdout:2/86: write d3/d9/dc/de/f1c [908566,25005] 
0
2026-03-09T19:26:53.852 INFO:tasks.workunit.client.1.vm08.stdout:7/55: fsync d5/fd 0
2026-03-09T19:26:53.859 INFO:tasks.workunit.client.1.vm08.stdout:8/87: dwrite de/ff [0,4194304] 0
2026-03-09T19:26:53.861 INFO:tasks.workunit.client.1.vm08.stdout:9/72: link d0/d3/f5 d0/d2/f1a 0
2026-03-09T19:26:53.865 INFO:tasks.workunit.client.1.vm08.stdout:7/56: rmdir d5 39
2026-03-09T19:26:53.884 INFO:tasks.workunit.client.1.vm08.stdout:8/88: readlink l5 0
2026-03-09T19:26:53.885 INFO:tasks.workunit.client.1.vm08.stdout:3/108: dread d0/d6/de/d1b/fc [4194304,4194304] 0
2026-03-09T19:26:53.886 INFO:tasks.workunit.client.1.vm08.stdout:3/109: write d0/d6/de/d1b/d16/d17/f1d [2695391,42262] 0
2026-03-09T19:26:53.887 INFO:tasks.workunit.client.1.vm08.stdout:3/110: read d0/d6/de/d1b/fc [1694317,73476] 0
2026-03-09T19:26:53.893 INFO:tasks.workunit.client.1.vm08.stdout:0/47: mkdir dd 0
2026-03-09T19:26:53.893 INFO:tasks.workunit.client.1.vm08.stdout:0/48: truncate fc 635689 0
2026-03-09T19:26:53.897 INFO:tasks.workunit.client.1.vm08.stdout:8/89: write de/f10 [1349854,80277] 0
2026-03-09T19:26:53.906 INFO:tasks.workunit.client.1.vm08.stdout:7/57: fdatasync d5/fa 0
2026-03-09T19:26:53.909 INFO:tasks.workunit.client.1.vm08.stdout:0/49: dwrite fb [0,4194304] 0
2026-03-09T19:26:53.926 INFO:tasks.workunit.client.1.vm08.stdout:7/58: creat d5/d12/f13 x:0 0 0
2026-03-09T19:26:53.928 INFO:tasks.workunit.client.1.vm08.stdout:0/50: creat dd/fe x:0 0 0
2026-03-09T19:26:53.934 INFO:tasks.workunit.client.1.vm08.stdout:8/90: rename de/f15 to de/f1b 0
2026-03-09T19:26:53.935 INFO:tasks.workunit.client.1.vm08.stdout:7/59: dread d5/f9 [0,4194304] 0
2026-03-09T19:26:53.937 INFO:tasks.workunit.client.1.vm08.stdout:0/51: symlink dd/lf 0
2026-03-09T19:26:53.937 INFO:tasks.workunit.client.1.vm08.stdout:0/52: dread - dd/fe zero size
2026-03-09T19:26:53.940 INFO:tasks.workunit.client.1.vm08.stdout:0/53: write dd/fe [657305,75775] 0
2026-03-09T19:26:53.941 INFO:tasks.workunit.client.1.vm08.stdout:0/54: dread fc [0,4194304] 0
2026-03-09T19:26:53.948 INFO:tasks.workunit.client.1.vm08.stdout:8/91: dwrite de/f1b [0,4194304] 0
2026-03-09T19:26:53.950 INFO:tasks.workunit.client.1.vm08.stdout:8/92: dread - de/f16 zero size
2026-03-09T19:26:53.968 INFO:tasks.workunit.client.1.vm08.stdout:7/60: dwrite d5/fc [0,4194304] 0
2026-03-09T19:26:53.970 INFO:tasks.workunit.client.1.vm08.stdout:7/61: truncate d5/d12/f13 564361 0
2026-03-09T19:26:53.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:53 vm07.local ceph-mon[48545]: pgmap v151: 65 pgs: 65 active+clean; 230 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 8.0 MiB/s wr, 399 op/s
2026-03-09T19:26:53.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:53 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/1548666271' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T19:26:53.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:53 vm07.local ceph-mon[48545]: from='client.14674 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:26:53.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:53 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/70297874' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T19:26:53.979 INFO:tasks.workunit.client.1.vm08.stdout:8/93: creat de/f1c x:0 0 0
2026-03-09T19:26:53.979 INFO:tasks.workunit.client.1.vm08.stdout:8/94: dread - de/f1c zero size
2026-03-09T19:26:53.981 INFO:tasks.workunit.client.1.vm08.stdout:7/62: mkdir d5/d14 0
2026-03-09T19:26:53.985 INFO:tasks.workunit.client.1.vm08.stdout:8/95: mkdir de/d1d 0
2026-03-09T19:26:53.986 INFO:tasks.workunit.client.1.vm08.stdout:8/96: write f1 [5417120,110527] 0
2026-03-09T19:26:53.988 INFO:tasks.workunit.client.1.vm08.stdout:7/63: dread d5/fd [0,4194304] 0
2026-03-09T19:26:53.989 INFO:tasks.workunit.client.1.vm08.stdout:7/64: read d5/fa [439118,44289] 0
2026-03-09T19:26:54.018 INFO:tasks.workunit.client.1.vm08.stdout:7/65: sync
2026-03-09T19:26:54.025 INFO:tasks.workunit.client.1.vm08.stdout:7/66: rename c2 to d5/c15 0
2026-03-09T19:26:54.026 INFO:tasks.workunit.client.1.vm08.stdout:7/67: write d5/fc [4829191,80682] 0
2026-03-09T19:26:54.033 INFO:tasks.workunit.client.1.vm08.stdout:7/68: mkdir d5/d16 0
2026-03-09T19:26:54.036 INFO:tasks.workunit.client.1.vm08.stdout:7/69: symlink d5/d14/l17 0
2026-03-09T19:26:54.037 INFO:tasks.workunit.client.1.vm08.stdout:7/70: read d5/fd [849711,89124] 0
2026-03-09T19:26:54.042 INFO:tasks.workunit.client.1.vm08.stdout:7/71: mknod d5/c18 0
2026-03-09T19:26:54.043 INFO:tasks.workunit.client.1.vm08.stdout:7/72: creat d5/d12/f19 x:0 0 0
2026-03-09T19:26:54.046 INFO:tasks.workunit.client.1.vm08.stdout:7/73: dread d5/fc [0,4194304] 0
2026-03-09T19:26:54.050 INFO:tasks.workunit.client.1.vm08.stdout:7/74: dread d5/fa [0,4194304] 0
2026-03-09T19:26:54.061 INFO:tasks.workunit.client.1.vm08.stdout:7/75: creat d5/f1a x:0 0 0
2026-03-09T19:26:54.061 INFO:tasks.workunit.client.1.vm08.stdout:7/76: dread - d5/d12/f19 zero size
2026-03-09T19:26:54.064 INFO:tasks.workunit.client.1.vm08.stdout:7/77: symlink d5/d16/l1b 0
2026-03-09T19:26:54.067 INFO:tasks.workunit.client.1.vm08.stdout:7/78: dread d5/fc [0,4194304] 0
2026-03-09T19:26:54.070 INFO:tasks.workunit.client.1.vm08.stdout:7/79: mkdir d5/d16/d1c 0
2026-03-09T19:26:54.076 INFO:tasks.workunit.client.1.vm08.stdout:2/87: write d3/d4/f8 [155086,43745] 0
2026-03-09T19:26:54.080 INFO:tasks.workunit.client.1.vm08.stdout:2/88: chown d3/d4/d10 820 1
2026-03-09T19:26:54.082 INFO:tasks.workunit.client.1.vm08.stdout:7/80: rename c4 to d5/d16/c1d 0
2026-03-09T19:26:54.083 INFO:tasks.workunit.client.1.vm08.stdout:7/81: dread d5/fd [0,4194304] 0
2026-03-09T19:26:54.084 INFO:tasks.workunit.client.1.vm08.stdout:7/82: truncate d5/f1a 426563 0
2026-03-09T19:26:54.087 INFO:tasks.workunit.client.1.vm08.stdout:7/83: dread d5/fc [0,4194304] 0
2026-03-09T19:26:54.087 INFO:tasks.workunit.client.1.vm08.stdout:2/89: symlink d3/d9/dc/de/l24 0
2026-03-09T19:26:54.088 INFO:tasks.workunit.client.1.vm08.stdout:7/84: write d5/f7 [1284003,111550] 0
2026-03-09T19:26:54.088 INFO:tasks.workunit.client.1.vm08.stdout:2/90: chown d3/d9/dc/de/f1c 15 1
2026-03-09T19:26:54.089 INFO:tasks.workunit.client.1.vm08.stdout:2/91: fdatasync d3/d9/f1e 0
2026-03-09T19:26:54.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:53 vm08.local ceph-mon[57794]: pgmap v151: 65 pgs: 65 active+clean; 230 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 8.0 MiB/s wr, 399 op/s
2026-03-09T19:26:54.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:53 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/1548666271' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T19:26:54.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:53 vm08.local ceph-mon[57794]: from='client.14674 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:26:54.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:53 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/70297874' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T19:26:54.095 INFO:tasks.workunit.client.1.vm08.stdout:7/85: dwrite d5/fd [0,4194304] 0
2026-03-09T19:26:54.098 INFO:tasks.workunit.client.1.vm08.stdout:2/92: mknod d3/d9/dc/c25 0
2026-03-09T19:26:54.099 INFO:tasks.workunit.client.1.vm08.stdout:1/129: truncate d9/da/dc/f20 3586156 0
2026-03-09T19:26:54.105 INFO:tasks.workunit.client.1.vm08.stdout:7/86: dwrite d5/d12/f19 [0,4194304] 0
2026-03-09T19:26:54.115 INFO:tasks.workunit.client.1.vm08.stdout:2/93: mkdir d3/d9/d26 0
2026-03-09T19:26:54.118 INFO:tasks.workunit.client.1.vm08.stdout:1/130: mkdir d9/da/d2d 0
2026-03-09T19:26:54.124 INFO:tasks.workunit.client.1.vm08.stdout:2/94: dread f2 [0,4194304] 0
2026-03-09T19:26:54.125 INFO:tasks.workunit.client.1.vm08.stdout:4/61: truncate f5 2984037 0
2026-03-09T19:26:54.128 INFO:tasks.workunit.client.1.vm08.stdout:1/131: chown d9/da/dc/ld 1796 1
2026-03-09T19:26:54.129 INFO:tasks.workunit.client.1.vm08.stdout:5/62: getdents d16 0
2026-03-09T19:26:54.130 INFO:tasks.workunit.client.1.vm08.stdout:4/62: mkdir da/d10 0
2026-03-09T19:26:54.131 INFO:tasks.workunit.client.1.vm08.stdout:4/63: readlink da/lc 0
2026-03-09T19:26:54.133 INFO:tasks.workunit.client.1.vm08.stdout:1/132: rmdir d9/da/d12 39
2026-03-09T19:26:54.134 INFO:tasks.workunit.client.1.vm08.stdout:1/133: write d9/da/d17/f2a [62481,28243] 0
2026-03-09T19:26:54.143 INFO:tasks.workunit.client.1.vm08.stdout:6/82: rmdir d3/d15 39
2026-03-09T19:26:54.144 INFO:tasks.workunit.client.1.vm08.stdout:6/83: truncate d3/f1b 4380334 0
2026-03-09T19:26:54.144 INFO:tasks.workunit.client.1.vm08.stdout:1/134: dwrite d9/f22 [8388608,4194304] 0
2026-03-09T19:26:54.147 INFO:tasks.workunit.client.1.vm08.stdout:1/135: dwrite d9/d11/f29 [0,4194304] 0
2026-03-09T19:26:54.160 INFO:tasks.workunit.client.1.vm08.stdout:5/63: fdatasync f0 0
2026-03-09T19:26:54.161 INFO:tasks.workunit.client.1.vm08.stdout:6/84: rename d3/f11 to d3/db/f20 0
2026-03-09T19:26:54.167 INFO:tasks.workunit.client.1.vm08.stdout:6/85: dwrite d3/f5 [0,4194304] 0
2026-03-09T19:26:54.170 INFO:tasks.workunit.client.1.vm08.stdout:9/73: rmdir d0 39
2026-03-09T19:26:54.172 INFO:tasks.workunit.client.1.vm08.stdout:2/95: sync
2026-03-09T19:26:54.172 INFO:tasks.workunit.client.1.vm08.stdout:6/86: dread d3/f9 [0,4194304] 0
2026-03-09T19:26:54.180 INFO:tasks.workunit.client.1.vm08.stdout:1/136: creat d9/da/dc/f2e x:0 0 0
2026-03-09T19:26:54.181 INFO:tasks.workunit.client.1.vm08.stdout:1/137: write d9/da/dc/f1d [1124151,36629] 0
2026-03-09T19:26:54.190 INFO:tasks.workunit.client.1.vm08.stdout:3/111: write d0/d6/de/d1b/fc [1351820,23942] 0
2026-03-09T19:26:54.195 INFO:tasks.workunit.client.1.vm08.stdout:5/64: unlink fd 0
2026-03-09T19:26:54.199 INFO:tasks.workunit.client.1.vm08.stdout:9/74: stat d0/d2/d8/fe 0
2026-03-09T19:26:54.207 INFO:tasks.workunit.client.1.vm08.stdout:0/55: truncate fb 3120392 0
2026-03-09T19:26:54.210 INFO:tasks.workunit.client.1.vm08.stdout:8/97: fsync de/f1c 0
2026-03-09T19:26:54.210 INFO:tasks.workunit.client.1.vm08.stdout:8/98: stat de/l13 0
2026-03-09T19:26:54.211 INFO:tasks.workunit.client.1.vm08.stdout:8/99: write f1 [2896457,81639] 0
2026-03-09T19:26:54.211 INFO:tasks.workunit.client.1.vm08.stdout:8/100: truncate de/f16 807596 0
2026-03-09T19:26:54.218 INFO:tasks.workunit.client.1.vm08.stdout:3/112: symlink d0/d6/de/d1b/d16/d17/l22 0
2026-03-09T19:26:54.219 INFO:tasks.workunit.client.1.vm08.stdout:5/65: mkdir d16/d19 0
2026-03-09T19:26:54.221 INFO:tasks.workunit.client.1.vm08.stdout:9/75: mkdir d0/d1b 0
2026-03-09T19:26:54.224 INFO:tasks.workunit.client.1.vm08.stdout:6/87: mknod d3/d15/c21 0
2026-03-09T19:26:54.227 INFO:tasks.workunit.client.1.vm08.stdout:0/56: symlink dd/l10 0
2026-03-09T19:26:54.228 INFO:tasks.workunit.client.1.vm08.stdout:2/96: creat d3/d4/d23/f27 x:0 0 0
2026-03-09T19:26:54.229 INFO:tasks.workunit.client.1.vm08.stdout:2/97: dread - d3/d9/dc/de/f17 zero size
2026-03-09T19:26:54.231 INFO:tasks.workunit.client.1.vm08.stdout:8/101: unlink cc 0
2026-03-09T19:26:54.232 INFO:tasks.workunit.client.1.vm08.stdout:8/102: stat de/d1d 0
2026-03-09T19:26:54.237 INFO:tasks.workunit.client.1.vm08.stdout:3/113: write d0/d8/d19/fd [701874,85176] 0
2026-03-09T19:26:54.237 INFO:tasks.workunit.client.1.vm08.stdout:7/87: getdents d5 0
2026-03-09T19:26:54.245 INFO:tasks.workunit.client.1.vm08.stdout:6/88: fdatasync d3/f9 0
2026-03-09T19:26:54.254 INFO:tasks.workunit.client.1.vm08.stdout:0/57: truncate fa 137809 0
2026-03-09T19:26:54.254 INFO:tasks.workunit.client.1.vm08.stdout:0/58: fdatasync fa 0
2026-03-09T19:26:54.254 INFO:tasks.workunit.client.1.vm08.stdout:0/59: chown dd/l10 2200 1
2026-03-09T19:26:54.258 INFO:tasks.workunit.client.1.vm08.stdout:2/98: creat d3/d9/f28 x:0 0 0
2026-03-09T19:26:54.268 INFO:tasks.workunit.client.1.vm08.stdout:4/64: dwrite f5 [0,4194304] 0
2026-03-09T19:26:54.273 INFO:tasks.workunit.client.1.vm08.stdout:7/88: rename d5/fd to d5/d14/f1e 0
2026-03-09T19:26:54.282 INFO:tasks.workunit.client.1.vm08.stdout:5/66: write f0 [1302804,17767] 0
2026-03-09T19:26:54.284 INFO:tasks.workunit.client.1.vm08.stdout:2/99: sync
2026-03-09T19:26:54.284 INFO:tasks.workunit.client.1.vm08.stdout:2/100: dread - d3/d9/f28 zero size
2026-03-09T19:26:54.287 INFO:tasks.workunit.client.1.vm08.stdout:6/89: mknod d3/d15/c22 0
2026-03-09T19:26:54.294 INFO:tasks.workunit.client.1.vm08.stdout:1/138: fsync d9/da/dc/f2e 0
2026-03-09T19:26:54.294 INFO:tasks.workunit.client.1.vm08.stdout:0/60: dread fc [0,4194304] 0
2026-03-09T19:26:54.294 INFO:tasks.workunit.client.1.vm08.stdout:8/103: link f1 de/d1d/f1e 0
2026-03-09T19:26:54.298 INFO:tasks.workunit.client.1.vm08.stdout:3/114: mknod d0/d6/de/d1b/d16/d18/c23 0
2026-03-09T19:26:54.299 INFO:tasks.workunit.client.1.vm08.stdout:3/115: fdatasync d0/d6/de/d1b/fc 0
2026-03-09T19:26:54.306 INFO:tasks.workunit.client.1.vm08.stdout:3/116: dwrite d0/d6/de/d1b/d16/d17/f1d [0,4194304] 0
2026-03-09T19:26:54.313 INFO:tasks.workunit.client.1.vm08.stdout:3/117: readlink d0/d6/l20 0
2026-03-09T19:26:54.313 INFO:tasks.workunit.client.1.vm08.stdout:3/118: chown d0/d6/de/d1b/c1e 0 1
2026-03-09T19:26:54.323 INFO:tasks.workunit.client.1.vm08.stdout:5/67: symlink d16/l1a 0
2026-03-09T19:26:54.331 INFO:tasks.workunit.client.1.vm08.stdout:5/68: dwrite d16/f18 [0,4194304] 0
2026-03-09T19:26:54.331 INFO:tasks.workunit.client.1.vm08.stdout:6/90: symlink d3/d15/l23 0
2026-03-09T19:26:54.331 INFO:tasks.workunit.client.1.vm08.stdout:6/91: dread d3/fe [0,4194304] 0
2026-03-09T19:26:54.338 INFO:tasks.workunit.client.1.vm08.stdout:0/61: mknod dd/c11 0
2026-03-09T19:26:54.344 INFO:tasks.workunit.client.1.vm08.stdout:8/104: dwrite de/d1d/f1e [0,4194304] 0
2026-03-09T19:26:54.359 INFO:tasks.workunit.client.1.vm08.stdout:8/105: dread f6 [4194304,4194304] 0
2026-03-09T19:26:54.359 INFO:tasks.workunit.client.1.vm08.stdout:8/106: read f1 [5713939,76348] 0
2026-03-09T19:26:54.359 INFO:tasks.workunit.client.1.vm08.stdout:8/107: stat cd 0
2026-03-09T19:26:54.359 INFO:tasks.workunit.client.1.vm08.stdout:8/108: dwrite de/f11 [0,4194304] 0
2026-03-09T19:26:54.370 INFO:tasks.workunit.client.1.vm08.stdout:6/92: mkdir d3/db/d24 0
2026-03-09T19:26:54.372 INFO:tasks.workunit.client.1.vm08.stdout:0/62: creat dd/f12 x:0 0 0
2026-03-09T19:26:54.373 INFO:tasks.workunit.client.1.vm08.stdout:9/76: truncate d0/fa 2024683 0
2026-03-09T19:26:54.374 INFO:tasks.workunit.client.1.vm08.stdout:9/77: write d0/d2/d14/f17 [157570,60062] 0
2026-03-09T19:26:54.379 INFO:tasks.workunit.client.1.vm08.stdout:3/119: mkdir d0/d8/d24 0
2026-03-09T19:26:54.438 INFO:tasks.workunit.client.1.vm08.stdout:3/120: dread d0/d6/de/d1b/d16/d17/f1d [0,4194304] 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:1/139: creat d9/da/f2f x:0 0 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:0/63: rename fa to dd/f13 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:9/78: unlink d0/f16 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:9/79: truncate d0/d3/fd 386745 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:4/65: getdents da 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:3/121: rmdir d0 39
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:5/69: rmdir d16/d19 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:5/70: fsync f15 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:5/71: dread f2 [0,4194304] 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:0/64: chown f3 89 1
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:0/65: chown dd/lf 22768 1
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:0/66: chown dd/l10 1054 1
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:8/109: link de/f1b de/f1f 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:6/93: creat d3/f25 x:0 0 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:6/94: write d3/f9 [543415,120140] 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:6/95: chown d3/ca 203 1
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:7/89: rename d5/f7 to d5/d16/f1f 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:5/72: mknod d16/c1b 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:5/73: stat f5 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:5/74: dwrite f13 [0,4194304] 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:0/67: write f5 [1480586,121181] 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:4/66: symlink da/d10/l11 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:4/67: dread f5 [0,4194304] 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:4/68: dwrite f5 [0,4194304] 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:3/122: mkdir d0/d6/d25 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:0/68: symlink dd/l14 0
2026-03-09T19:26:54.439 INFO:tasks.workunit.client.1.vm08.stdout:0/69: creat dd/f15 x:0 0 0
2026-03-09T19:26:54.685 INFO:tasks.workunit.client.1.vm08.stdout:1/140: sync
2026-03-09T19:26:54.685 INFO:tasks.workunit.client.1.vm08.stdout:3/123: sync
2026-03-09T19:26:54.685 INFO:tasks.workunit.client.1.vm08.stdout:5/75: sync
2026-03-09T19:26:54.685 INFO:tasks.workunit.client.1.vm08.stdout:9/80: sync
2026-03-09T19:26:54.685 INFO:tasks.workunit.client.1.vm08.stdout:6/96: sync
2026-03-09T19:26:54.686 INFO:tasks.workunit.client.1.vm08.stdout:3/124: chown d0/d6/de/d1b/d16 505281 1
2026-03-09T19:26:54.692 INFO:tasks.workunit.client.1.vm08.stdout:5/76: rename f13 to d16/f1c 0
2026-03-09T19:26:54.695 INFO:tasks.workunit.client.1.vm08.stdout:3/125: dwrite d0/d6/de/d1b/fc [0,4194304] 0
2026-03-09T19:26:54.696 INFO:tasks.workunit.client.1.vm08.stdout:3/126: read d0/d8/d19/fd [208703,30376] 0
2026-03-09T19:26:54.697 INFO:tasks.workunit.client.1.vm08.stdout:6/97: mknod d3/c26 0
2026-03-09T19:26:54.697 INFO:tasks.workunit.client.1.vm08.stdout:6/98: stat d3/f7 0
2026-03-09T19:26:54.697 INFO:tasks.workunit.client.1.vm08.stdout:3/127: write d0/d6/de/d1b/fc [5175232,67821] 0
2026-03-09T19:26:54.700 INFO:tasks.workunit.client.1.vm08.stdout:3/128: truncate d0/d6/de/d1b/fc 6008635 0
2026-03-09T19:26:54.702 INFO:tasks.workunit.client.1.vm08.stdout:6/99: dwrite d3/f9 [0,4194304] 0
2026-03-09T19:26:54.704 INFO:tasks.workunit.client.1.vm08.stdout:1/141: creat d9/da/f30 x:0 0 0
2026-03-09T19:26:54.706 INFO:tasks.workunit.client.1.vm08.stdout:1/142: write d9/da/dc/f2e [617233,88888] 0
2026-03-09T19:26:54.706 INFO:tasks.workunit.client.1.vm08.stdout:6/100: readlink d3/db/lf 0
2026-03-09T19:26:54.707 INFO:tasks.workunit.client.1.vm08.stdout:6/101: write d3/f6 [1674132,129494] 0
2026-03-09T19:26:54.710 INFO:tasks.workunit.client.1.vm08.stdout:3/129: rename d0/d6/l20 to d0/d6/de/d1b/d16/l26 0
2026-03-09T19:26:54.717 INFO:tasks.workunit.client.1.vm08.stdout:1/143: creat d9/da/dc/f31 x:0 0 0
2026-03-09T19:26:54.719 INFO:tasks.workunit.client.1.vm08.stdout:6/102: symlink d3/db/l27 0
2026-03-09T19:26:54.722 INFO:tasks.workunit.client.1.vm08.stdout:3/130: mknod d0/c27 0
2026-03-09T19:26:54.727 INFO:tasks.workunit.client.1.vm08.stdout:9/81: dread d0/d3/fd [0,4194304] 0
2026-03-09T19:26:54.728 INFO:tasks.workunit.client.1.vm08.stdout:9/82: chown d0/d2/f1a 3496 1
2026-03-09T19:26:54.729 INFO:tasks.workunit.client.1.vm08.stdout:9/83: write d0/d3/f9 [486160,3283] 0
2026-03-09T19:26:54.743 INFO:tasks.workunit.client.1.vm08.stdout:2/101: write d3/d4/fd [955216,121027] 0
2026-03-09T19:26:54.751 INFO:tasks.workunit.client.1.vm08.stdout:2/102: creat d3/d9/dc/de/d18/f29 x:0 0 0
2026-03-09T19:26:54.752 INFO:tasks.workunit.client.1.vm08.stdout:2/103: dread f2 [0,4194304] 0
2026-03-09T19:26:54.752 INFO:tasks.workunit.client.1.vm08.stdout:3/131: creat d0/f28 x:0 0 0
2026-03-09T19:26:54.753 INFO:tasks.workunit.client.1.vm08.stdout:1/144: link c0 d9/da/d12/c32 0
2026-03-09T19:26:54.757 INFO:tasks.workunit.client.1.vm08.stdout:1/145: dwrite f2 [8388608,4194304] 0
2026-03-09T19:26:54.761 INFO:tasks.workunit.client.1.vm08.stdout:3/132: getdents d0/d6/de/d1b/d16/d17 0
2026-03-09T19:26:54.765 INFO:tasks.workunit.client.1.vm08.stdout:3/133: dread d0/d6/de/d1b/d16/d17/f1d [0,4194304] 0
2026-03-09T19:26:54.769 INFO:tasks.workunit.client.1.vm08.stdout:3/134: symlink d0/l29 0
2026-03-09T19:26:54.769 INFO:tasks.workunit.client.1.vm08.stdout:3/135: chown d0/d6/de/d1b/d16/d17/l22 116 1
2026-03-09T19:26:54.773 INFO:tasks.workunit.client.1.vm08.stdout:6/103: sync
2026-03-09T19:26:54.777 INFO:tasks.workunit.client.1.vm08.stdout:2/104: sync
2026-03-09T19:26:54.817 INFO:tasks.workunit.client.1.vm08.stdout:4/69: fsync f5 0
2026-03-09T19:26:54.817 INFO:tasks.workunit.client.1.vm08.stdout:1/146: read d9/da/dc/f1d [392361,37653] 0
2026-03-09T19:26:54.818 INFO:tasks.workunit.client.1.vm08.stdout:4/70: dread - f9 zero size
2026-03-09T19:26:54.818 INFO:tasks.workunit.client.1.vm08.stdout:1/147: fdatasync d9/da/f1e 0
2026-03-09T19:26:54.818 INFO:tasks.workunit.client.1.vm08.stdout:4/71: write f1 [1358086,3047] 0
2026-03-09T19:26:54.820 INFO:tasks.workunit.client.1.vm08.stdout:8/110: rmdir de 39
2026-03-09T19:26:54.820 INFO:tasks.workunit.client.1.vm08.stdout:4/72: read - f9 zero size
2026-03-09T19:26:54.826 INFO:tasks.workunit.client.1.vm08.stdout:1/148: dwrite d9/f22 [4194304,4194304] 0
2026-03-09T19:26:54.828 INFO:tasks.workunit.client.1.vm08.stdout:4/73: dread da/fd [0,4194304] 0
2026-03-09T19:26:54.833 INFO:tasks.workunit.client.1.vm08.stdout:8/111: chown de 297 1
2026-03-09T19:26:54.838 INFO:tasks.workunit.client.1.vm08.stdout:1/149: symlink d9/l33 0
2026-03-09T19:26:54.838 INFO:tasks.workunit.client.1.vm08.stdout:8/112: dread - de/f1c zero size
2026-03-09T19:26:54.840 INFO:tasks.workunit.client.1.vm08.stdout:4/74: dwrite f1 [0,4194304] 0
2026-03-09T19:26:54.846 INFO:tasks.workunit.client.1.vm08.stdout:1/150: dread d9/f22 [4194304,4194304] 0
2026-03-09T19:26:54.850 INFO:tasks.workunit.client.1.vm08.stdout:1/151: write d9/da/f2f [460711,111715] 0
2026-03-09T19:26:54.852 INFO:tasks.workunit.client.1.vm08.stdout:1/152: chown d9/da/d17/f2a 856591022 1
2026-03-09T19:26:54.855 INFO:tasks.workunit.client.1.vm08.stdout:7/90: rmdir d5 39
2026-03-09T19:26:54.865 INFO:tasks.workunit.client.1.vm08.stdout:4/75: creat da/f12 x:0 0 0
2026-03-09T19:26:54.866 INFO:tasks.workunit.client.1.vm08.stdout:4/76: dread - da/f12 zero size
2026-03-09T19:26:54.867 INFO:tasks.workunit.client.1.vm08.stdout:4/77: write da/f12 [439554,45004] 0
2026-03-09T19:26:54.868 INFO:tasks.workunit.client.1.vm08.stdout:4/78: readlink l4 0
2026-03-09T19:26:54.870 INFO:tasks.workunit.client.1.vm08.stdout:0/70: dwrite fc [0,4194304] 0
2026-03-09T19:26:54.874 INFO:tasks.workunit.client.1.vm08.stdout:0/71: fsync f3 0
2026-03-09T19:26:54.874 INFO:tasks.workunit.client.1.vm08.stdout:1/153: mknod d9/d11/c34 0
2026-03-09T19:26:54.876 INFO:tasks.workunit.client.1.vm08.stdout:4/79: dwrite f5 [0,4194304] 0
2026-03-09T19:26:54.883 INFO:tasks.workunit.client.1.vm08.stdout:4/80: dwrite f1 [0,4194304] 0
2026-03-09T19:26:54.883 INFO:tasks.workunit.client.1.vm08.stdout:4/81: write f5 [122441,86215] 0
2026-03-09T19:26:54.889 INFO:tasks.workunit.client.1.vm08.stdout:4/82: dwrite f2 [0,4194304] 0
2026-03-09T19:26:54.894 INFO:tasks.workunit.client.1.vm08.stdout:4/83: write f9 [611063,38885] 0
2026-03-09T19:26:54.914 INFO:tasks.workunit.client.1.vm08.stdout:0/72: readlink l9 0
2026-03-09T19:26:54.922 INFO:tasks.workunit.client.1.vm08.stdout:7/91: dread d5/f9 [0,4194304] 0
2026-03-09T19:26:54.922 INFO:tasks.workunit.client.1.vm08.stdout:0/73: dwrite dd/f12 [0,4194304] 0
2026-03-09T19:26:54.926 INFO:tasks.workunit.client.1.vm08.stdout:0/74: readlink dd/l10 0
2026-03-09T19:26:54.926 INFO:tasks.workunit.client.1.vm08.stdout:8/113: creat de/f20 x:0 0 0
2026-03-09T19:26:54.934 INFO:tasks.workunit.client.1.vm08.stdout:4/84: creat da/d10/f13 x:0 0 0
2026-03-09T19:26:54.941 INFO:tasks.workunit.client.1.vm08.stdout:1/154: fdatasync d9/da/dc/f20 0
2026-03-09T19:26:54.950 INFO:tasks.workunit.client.1.vm08.stdout:0/75: creat dd/f16 x:0 0 0
2026-03-09T19:26:54.961 INFO:tasks.workunit.client.1.vm08.stdout:8/114: mkdir de/d1d/d21 0
2026-03-09T19:26:54.965 INFO:tasks.workunit.client.1.vm08.stdout:8/115: creat de/d1d/f22 x:0 0 0
2026-03-09T19:26:54.965 INFO:tasks.workunit.client.1.vm08.stdout:8/116: chown de/l13 3584565 1
2026-03-09T19:26:54.969 INFO:tasks.workunit.client.1.vm08.stdout:8/117: write de/d1d/f1e [8310899,50880] 0
2026-03-09T19:26:54.972 INFO:tasks.workunit.client.1.vm08.stdout:0/76: link dd/lf dd/l17 0
2026-03-09T19:26:54.975 INFO:tasks.workunit.client.1.vm08.stdout:0/77: creat dd/f18 x:0 0 0
2026-03-09T19:26:54.976 INFO:tasks.workunit.client.1.vm08.stdout:0/78: stat dd/f12 0
2026-03-09T19:26:54.977 INFO:tasks.workunit.client.1.vm08.stdout:8/118: creat de/d1d/d21/f23 x:0 0 0
2026-03-09T19:26:54.977 INFO:tasks.workunit.client.1.vm08.stdout:8/119: write de/d1d/f1e [4662943,22143] 0
2026-03-09T19:26:54.979 INFO:tasks.workunit.client.1.vm08.stdout:0/79: creat dd/f19 x:0 0 0
2026-03-09T19:26:54.983 INFO:tasks.workunit.client.1.vm08.stdout:8/120: symlink de/d1d/l24 0
2026-03-09T19:26:54.984 INFO:tasks.workunit.client.1.vm08.stdout:0/80: symlink dd/l1a 0
2026-03-09T19:26:54.987 INFO:tasks.workunit.client.1.vm08.stdout:8/121: mkdir de/d25 0
2026-03-09T19:26:54.989 INFO:tasks.workunit.client.1.vm08.stdout:8/122: symlink de/d25/l26 0
2026-03-09T19:26:55.004 INFO:tasks.workunit.client.1.vm08.stdout:8/123: creat de/d1d/f27 x:0 0 0
2026-03-09T19:26:55.006 INFO:tasks.workunit.client.1.vm08.stdout:8/124: rename l2 to de/d1d/d21/l28 0
2026-03-09T19:26:55.011 INFO:tasks.workunit.client.1.vm08.stdout:8/125: symlink de/d1d/d21/l29 0
2026-03-09T19:26:55.016 INFO:tasks.workunit.client.1.vm08.stdout:8/126: write de/f11 [2253926,69307] 0
2026-03-09T19:26:55.019 INFO:tasks.workunit.client.1.vm08.stdout:8/127: fsync de/f16 0
2026-03-09T19:26:55.027 INFO:tasks.workunit.client.1.vm08.stdout:8/128: mknod de/d1d/c2a 0
2026-03-09T19:26:55.027 INFO:tasks.workunit.client.1.vm08.stdout:8/129: write de/f1c [656016,83612] 0
2026-03-09T19:26:55.030 INFO:tasks.workunit.client.1.vm08.stdout:8/130: dread de/f1f [0,4194304] 0
2026-03-09T19:26:55.036 INFO:tasks.workunit.client.1.vm08.stdout:8/131: mknod de/d1d/c2b 0
2026-03-09T19:26:55.037 INFO:tasks.workunit.client.1.vm08.stdout:8/132: read de/f16 [160291,74244] 0
2026-03-09T19:26:55.037 INFO:tasks.workunit.client.1.vm08.stdout:8/133: chown de/f1b 3610 1
2026-03-09T19:26:55.043 INFO:tasks.workunit.client.1.vm08.stdout:8/134: mkdir de/d2c 0
2026-03-09T19:26:55.045 INFO:tasks.workunit.client.1.vm08.stdout:8/135: creat de/d2c/f2d x:0 0 0
2026-03-09T19:26:55.098 INFO:tasks.workunit.client.1.vm08.stdout:0/81: fdatasync fb 0
2026-03-09T19:26:55.098 INFO:tasks.workunit.client.1.vm08.stdout:0/82: dread - dd/f19 zero size
2026-03-09T19:26:55.101 INFO:tasks.workunit.client.1.vm08.stdout:0/83: dread dd/f13 [0,4194304] 0
2026-03-09T19:26:55.105 INFO:tasks.workunit.client.1.vm08.stdout:0/84: mknod dd/c1b 0
2026-03-09T19:26:55.105 INFO:tasks.workunit.client.1.vm08.stdout:0/85: readlink dd/l14 0
2026-03-09T19:26:55.107 INFO:tasks.workunit.client.1.vm08.stdout:0/86: mkdir dd/d1c 0
2026-03-09T19:26:55.107 INFO:tasks.workunit.client.1.vm08.stdout:0/87: dread - dd/f18 zero size
2026-03-09T19:26:55.108 INFO:tasks.workunit.client.1.vm08.stdout:0/88: dread dd/f13 [0,4194304] 0
2026-03-09T19:26:55.109 INFO:tasks.workunit.client.1.vm08.stdout:0/89: mknod dd/c1d 0
2026-03-09T19:26:55.115 INFO:tasks.workunit.client.1.vm08.stdout:5/77: getdents d16 0
2026-03-09T19:26:55.119 INFO:tasks.workunit.client.1.vm08.stdout:5/78: dwrite f15 [0,4194304] 0
2026-03-09T19:26:55.126 INFO:tasks.workunit.client.1.vm08.stdout:5/79: dwrite d16/f17 [0,4194304] 0
2026-03-09T19:26:55.127 INFO:tasks.workunit.client.1.vm08.stdout:5/80: write f10 [1261633,110483] 0
2026-03-09T19:26:55.139 INFO:tasks.workunit.client.1.vm08.stdout:6/104: getdents d3 0
2026-03-09T19:26:55.139 INFO:tasks.workunit.client.1.vm08.stdout:6/105: chown d3/f13 70348 1
2026-03-09T19:26:55.145 INFO:tasks.workunit.client.1.vm08.stdout:5/81: rename c7 to d16/c1d 0
2026-03-09T19:26:55.147 INFO:tasks.workunit.client.1.vm08.stdout:5/82: readlink lb 0
2026-03-09T19:26:55.148 INFO:tasks.workunit.client.1.vm08.stdout:5/83: mkdir d16/d1e 0
2026-03-09T19:26:55.152 INFO:tasks.workunit.client.1.vm08.stdout:5/84: rename ce to d16/c1f 0
2026-03-09T19:26:55.156 INFO:tasks.workunit.client.1.vm08.stdout:5/85: link ff d16/f20 0
2026-03-09T19:26:55.156 INFO:tasks.workunit.client.1.vm08.stdout:5/86: write d16/f20 [316843,22984] 0
2026-03-09T19:26:55.156 INFO:tasks.workunit.client.1.vm08.stdout:5/87: write d16/f20 [1270401,3874] 0
2026-03-09T19:26:55.161 INFO:tasks.workunit.client.1.vm08.stdout:9/84: dwrite d0/d3/f5 [0,4194304] 0
2026-03-09T19:26:55.172 INFO:tasks.workunit.client.1.vm08.stdout:5/88: mknod d16/d1e/c21 0
2026-03-09T19:26:55.181 INFO:tasks.workunit.client.1.vm08.stdout:5/89: mknod d16/d1e/c22 0
2026-03-09T19:26:55.185 INFO:tasks.workunit.client.1.vm08.stdout:9/85: link d0/d2/f1a d0/d2/d8/f1c 0
2026-03-09T19:26:55.188 INFO:tasks.workunit.client.1.vm08.stdout:9/86: stat d0/d2/d8/cc 0
2026-03-09T19:26:55.188 INFO:tasks.workunit.client.1.vm08.stdout:2/105: write d3/f7 [1640568,41335] 0
2026-03-09T19:26:55.188 INFO:tasks.workunit.client.1.vm08.stdout:3/136: truncate d0/d6/de/d1b/d16/d17/f1d 1245712 0
2026-03-09T19:26:55.188 INFO:tasks.workunit.client.1.vm08.stdout:2/106: truncate d3/f19 636931 0
2026-03-09T19:26:55.188 INFO:tasks.workunit.client.1.vm08.stdout:3/137: truncate d0/f28 786009 0
2026-03-09T19:26:55.188 INFO:tasks.workunit.client.1.vm08.stdout:2/107: chown d3/d4/d23/f27 1348 1
2026-03-09T19:26:55.189 INFO:tasks.workunit.client.1.vm08.stdout:2/108: write d3/d4/d23/f27 [189429,57769] 0
2026-03-09T19:26:55.189 INFO:tasks.workunit.client.1.vm08.stdout:2/109: fsync f1 0
2026-03-09T19:26:55.190 INFO:tasks.workunit.client.1.vm08.stdout:3/138: dread d0/f28 [0,4194304] 0
2026-03-09T19:26:55.191 INFO:tasks.workunit.client.1.vm08.stdout:3/139: stat d0/f28 0
2026-03-09T19:26:55.207 INFO:tasks.workunit.client.1.vm08.stdout:2/110: mknod d3/d4/d10/c2a 0
2026-03-09T19:26:55.207 INFO:tasks.workunit.client.1.vm08.stdout:2/111: stat d3/d9/dc/de/d18 0
2026-03-09T19:26:55.212 INFO:tasks.workunit.client.1.vm08.stdout:1/155: getdents d9/d11 0
2026-03-09T19:26:55.213 INFO:tasks.workunit.client.1.vm08.stdout:1/156: chown d9/da/dc/f1d 1 1
2026-03-09T19:26:55.216 INFO:tasks.workunit.client.1.vm08.stdout:3/140: read d0/d8/d19/fd [694564,58154] 0
2026-03-09T19:26:55.218 INFO:tasks.workunit.client.1.vm08.stdout:0/90: truncate f5 394102 0
2026-03-09T19:26:55.223 INFO:tasks.workunit.client.1.vm08.stdout:3/141: fsync d0/d8/d19/fd 0
2026-03-09T19:26:55.224 INFO:tasks.workunit.client.1.vm08.stdout:3/142: readlink d0/d6/de/d15/l1c 0
2026-03-09T19:26:55.226 INFO:tasks.workunit.client.1.vm08.stdout:9/87: creat d0/d2/f1d x:0 0 0
2026-03-09T19:26:55.226 INFO:tasks.workunit.client.1.vm08.stdout:4/85: rmdir da 39
2026-03-09T19:26:55.229 INFO:tasks.workunit.client.1.vm08.stdout:2/112: link d3/d4/d23/f27 d3/d9/dc/d14/f2b 0
2026-03-09T19:26:55.234 INFO:tasks.workunit.client.1.vm08.stdout:3/143: unlink d0/d6/de/d1b/fc 0
2026-03-09T19:26:55.240 INFO:tasks.workunit.client.1.vm08.stdout:4/86: read f2 [3175874,6334] 0
2026-03-09T19:26:55.241 INFO:tasks.workunit.client.1.vm08.stdout:4/87: stat f9 0
2026-03-09T19:26:55.241 INFO:tasks.workunit.client.1.vm08.stdout:5/90: sync
2026-03-09T19:26:55.242 INFO:tasks.workunit.client.1.vm08.stdout:1/157: sync
2026-03-09T19:26:55.248 INFO:tasks.workunit.client.1.vm08.stdout:1/158: dwrite d9/da/f1e [0,4194304] 0
2026-03-09T19:26:55.250 INFO:tasks.workunit.client.1.vm08.stdout:4/88: dwrite f9 [0,4194304] 0
2026-03-09T19:26:55.264 INFO:tasks.workunit.client.1.vm08.stdout:2/113: rmdir d3/d9/dc 39
2026-03-09T19:26:55.269 INFO:tasks.workunit.client.1.vm08.stdout:8/136: write de/f1f [3038791,63396] 0
2026-03-09T19:26:55.270 INFO:tasks.workunit.client.1.vm08.stdout:5/91: symlink d16/l23 0
2026-03-09T19:26:55.276 INFO:tasks.workunit.client.1.vm08.stdout:4/89: fsync da/d10/f13 0
2026-03-09T19:26:55.279 INFO:tasks.workunit.client.1.vm08.stdout:1/159: dwrite d9/f22 [8388608,4194304] 0
2026-03-09T19:26:55.280 INFO:tasks.workunit.client.1.vm08.stdout:4/90: dread f5 [0,4194304] 0
2026-03-09T19:26:55.288 INFO:tasks.workunit.client.1.vm08.stdout:3/144: getdents d0/d8/d24 0
2026-03-09T19:26:55.290 INFO:tasks.workunit.client.1.vm08.stdout:8/137: mkdir de/d1d/d2e 0
2026-03-09T19:26:55.295 INFO:tasks.workunit.client.1.vm08.stdout:8/138: dread - de/d1d/d21/f23 zero size
2026-03-09T19:26:55.295 INFO:tasks.workunit.client.1.vm08.stdout:8/139: write de/f1b [1023753,47438] 0
2026-03-09T19:26:55.300 INFO:tasks.workunit.client.1.vm08.stdout:1/160: dwrite d9/da/dc/f1d [0,4194304] 0
2026-03-09T19:26:55.300 INFO:tasks.workunit.client.1.vm08.stdout:4/91: mkdir da/d14 0
2026-03-09T19:26:55.309 INFO:tasks.workunit.client.1.vm08.stdout:1/161: dwrite d9/da/dc/f2e [0,4194304] 0
2026-03-09T19:26:55.312 INFO:tasks.workunit.client.1.vm08.stdout:6/106: truncate d3/db/f20 1526319 0
2026-03-09T19:26:55.312 INFO:tasks.workunit.client.1.vm08.stdout:1/162: write d9/da/dc/f2e [3841761,58291] 0
2026-03-09T19:26:55.312 INFO:tasks.workunit.client.1.vm08.stdout:2/114: write d3/d9/dc/d14/f2b [1235786,82410] 0
2026-03-09T19:26:55.319 INFO:tasks.workunit.client.1.vm08.stdout:2/115: dwrite d3/d4/fd [0,4194304] 0
2026-03-09T19:26:55.319 INFO:tasks.workunit.client.1.vm08.stdout:2/116: read d3/d4/f6 [193597,59490] 0
2026-03-09T19:26:55.327 INFO:tasks.workunit.client.1.vm08.stdout:2/117: dwrite d3/f7 [0,4194304] 0
2026-03-09T19:26:55.335 INFO:tasks.workunit.client.1.vm08.stdout:3/145: dread d0/f28 [0,4194304] 0
2026-03-09T19:26:55.335 INFO:tasks.workunit.client.1.vm08.stdout:3/146: chown d0/l29 15789 1
2026-03-09T19:26:55.339 INFO:tasks.workunit.client.1.vm08.stdout:2/118: dwrite d3/f7 [0,4194304] 0
2026-03-09T19:26:55.345 INFO:tasks.workunit.client.1.vm08.stdout:9/88: fsync d0/d2/f1a 0
2026-03-09T19:26:55.345 INFO:tasks.workunit.client.1.vm08.stdout:7/92: link d5/fc d5/f20 0
2026-03-09T19:26:55.346 INFO:tasks.workunit.client.1.vm08.stdout:9/89: write d0/d2/d8/fe [287601,18683] 0
2026-03-09T19:26:55.346 INFO:tasks.workunit.client.1.vm08.stdout:9/90: chown d0/d3 258897518 1
2026-03-09T19:26:55.351 INFO:tasks.workunit.client.1.vm08.stdout:5/92: rename d16/c1d to d16/d1e/c24 0
2026-03-09T19:26:55.358 INFO:tasks.workunit.client.1.vm08.stdout:4/92: dread da/fd [0,4194304] 0
2026-03-09T19:26:55.358 INFO:tasks.workunit.client.1.vm08.stdout:4/93: chown f5 0 1
2026-03-09T19:26:55.358 INFO:tasks.workunit.client.1.vm08.stdout:0/91: unlink f5 0
2026-03-09T19:26:55.362 INFO:tasks.workunit.client.1.vm08.stdout:8/140: mknod de/d1d/d2e/c2f 0
2026-03-09T19:26:55.362 INFO:tasks.workunit.client.1.vm08.stdout:8/141: write de/f16 [198162,43210] 0
2026-03-09T19:26:55.364 INFO:tasks.workunit.client.1.vm08.stdout:9/91: mknod d0/d2/d8/d7/c1e 0
2026-03-09T19:26:55.365 INFO:tasks.workunit.client.1.vm08.stdout:5/93: rmdir d16/d1e 39
2026-03-09T19:26:55.366 INFO:tasks.workunit.client.1.vm08.stdout:6/107: mknod d3/db/d24/c28 0
2026-03-09T19:26:55.367 INFO:tasks.workunit.client.1.vm08.stdout:1/163: symlink d9/da/d2c/l35 0
2026-03-09T19:26:55.368 INFO:tasks.workunit.client.1.vm08.stdout:0/92: creat dd/f1e x:0 0 0
2026-03-09T19:26:55.369 INFO:tasks.workunit.client.1.vm08.stdout:3/147: mkdir d0/d6/d25/d2a 0
2026-03-09T19:26:55.371 INFO:tasks.workunit.client.1.vm08.stdout:8/142: creat de/d1d/d21/f30 x:0 0 0
2026-03-09T19:26:55.382 INFO:tasks.workunit.client.1.vm08.stdout:8/143: fsync de/ff 0
2026-03-09T19:26:55.382 INFO:tasks.workunit.client.1.vm08.stdout:8/144: chown f1 3159 1
2026-03-09T19:26:55.382 INFO:tasks.workunit.client.1.vm08.stdout:9/92: mknod d0/c1f 0
2026-03-09T19:26:55.382 INFO:tasks.workunit.client.1.vm08.stdout:2/119: sync
2026-03-09T19:26:55.382 INFO:tasks.workunit.client.1.vm08.stdout:8/145: dwrite de/f1b [0,4194304] 0
2026-03-09T19:26:55.382 INFO:tasks.workunit.client.1.vm08.stdout:4/94: mknod da/d14/c15 0
2026-03-09T19:26:55.382 INFO:tasks.workunit.client.1.vm08.stdout:5/94: write d16/f1c [3182816,4389] 0
2026-03-09T19:26:55.383 INFO:tasks.workunit.client.1.vm08.stdout:8/146: chown de/d1d/d21/l29 544 1
2026-03-09T19:26:55.389 INFO:tasks.workunit.client.1.vm08.stdout:5/95: dwrite f5 [0,4194304] 0
2026-03-09T19:26:55.390 INFO:tasks.workunit.client.1.vm08.stdout:4/95: dread f9 [0,4194304] 0
2026-03-09T19:26:55.396 INFO:tasks.workunit.client.1.vm08.stdout:5/96: dread d16/f20 [0,4194304] 0
2026-03-09T19:26:55.410 INFO:tasks.workunit.client.1.vm08.stdout:0/93: mknod dd/c1f 0
2026-03-09T19:26:55.410 INFO:tasks.workunit.client.1.vm08.stdout:0/94: chown dd/d1c 21727 1
2026-03-09T19:26:55.411 INFO:tasks.workunit.client.1.vm08.stdout:3/148: rmdir d0/d6/de 39
2026-03-09T19:26:55.414 INFO:tasks.workunit.client.1.vm08.stdout:9/93: creat d0/d2/d14/f20 x:0 0 0
2026-03-09T19:26:55.418 INFO:tasks.workunit.client.1.vm08.stdout:9/94: write d0/d2/d8/fe [1256840,88343] 0
2026-03-09T19:26:55.418 INFO:tasks.workunit.client.1.vm08.stdout:2/120: rmdir d3/d9/dc/d14 39
2026-03-09T19:26:55.418 INFO:tasks.workunit.client.1.vm08.stdout:2/121: stat f1 0
2026-03-09T19:26:55.421 INFO:tasks.workunit.client.1.vm08.stdout:4/96: mkdir da/d10/d16 0
2026-03-09T19:26:55.435 INFO:tasks.workunit.client.1.vm08.stdout:3/149: mkdir d0/d6/d25/d2b 0
2026-03-09T19:26:55.435 INFO:tasks.workunit.client.1.vm08.stdout:0/95: fdatasync f7 0
2026-03-09T19:26:55.437 INFO:tasks.workunit.client.1.vm08.stdout:2/122: rename d3/d4/d10 to d3/d4/d23/d2c 0
2026-03-09T19:26:55.439 INFO:tasks.workunit.client.1.vm08.stdout:4/97: read f2 [2095654,11815] 0
2026-03-09T19:26:55.451 INFO:tasks.workunit.client.1.vm08.stdout:0/96: sync
2026-03-09T19:26:55.454 INFO:tasks.workunit.client.1.vm08.stdout:3/150: dread - d0/d6/de/d1b/d16/f21 zero size
2026-03-09T19:26:55.457 INFO:tasks.workunit.client.1.vm08.stdout:2/123: creat d3/d9/dc/de/d18/f2d x:0 0 0
2026-03-09T19:26:55.461 INFO:tasks.workunit.client.1.vm08.stdout:2/124: dread d3/d9/dc/de/f1c [0,4194304] 0
2026-03-09T19:26:55.464 INFO:tasks.workunit.client.1.vm08.stdout:2/125: dread f1 [0,4194304] 0
2026-03-09T19:26:55.464 INFO:tasks.workunit.client.1.vm08.stdout:4/98: mknod da/d10/c17 0
2026-03-09T19:26:55.467 INFO:tasks.workunit.client.1.vm08.stdout:3/151: creat d0/d6/de/d1b/d16/d18/f2c x:0 0 0
2026-03-09T19:26:55.468 INFO:tasks.workunit.client.1.vm08.stdout:3/152: chown d0/d6/de/d1b/d16/f21 108375220 1
2026-03-09T19:26:55.469 INFO:tasks.workunit.client.1.vm08.stdout:9/95: link d0/d2/f1a d0/d2/f21 0
2026-03-09T19:26:55.474 INFO:tasks.workunit.client.1.vm08.stdout:6/108: getdents d3/db/d24 0
2026-03-09T19:26:55.481 INFO:tasks.workunit.client.1.vm08.stdout:2/126: dwrite d3/f19 [0,4194304] 0
2026-03-09T19:26:55.484 INFO:tasks.workunit.client.1.vm08.stdout:4/99: creat da/f18 x:0 0 0
2026-03-09T19:26:55.484 INFO:tasks.workunit.client.1.vm08.stdout:4/100: stat f9 0
2026-03-09T19:26:55.491 INFO:tasks.workunit.client.1.vm08.stdout:7/93: write d5/fb [796845,32358] 0
2026-03-09T19:26:55.496 INFO:tasks.workunit.client.1.vm08.stdout:8/147: rmdir de/d1d/d21 39
2026-03-09T19:26:55.506 INFO:tasks.workunit.client.1.vm08.stdout:9/96: rename d0/d2/d8/f1c to d0/d2/d8/d7/f22 0
2026-03-09T19:26:55.506 INFO:tasks.workunit.client.1.vm08.stdout:6/109: rename d3/db/d24 to d3/db/d24/d29 22
2026-03-09T19:26:55.506 INFO:tasks.workunit.client.1.vm08.stdout:9/97: fsync d0/d2/d14/f17 0
2026-03-09T19:26:55.508 INFO:tasks.workunit.client.1.vm08.stdout:9/98: read d0/d2/d8/d7/f22 [2232400,9168] 0
2026-03-09T19:26:55.512 INFO:tasks.workunit.client.1.vm08.stdout:6/110: dwrite d3/d15/f1e [0,4194304] 0
2026-03-09T19:26:55.514 INFO:tasks.workunit.client.1.vm08.stdout:6/111: fsync d3/db/f14 0
2026-03-09T19:26:55.523 INFO:tasks.workunit.client.1.vm08.stdout:4/101: fsync da/fd 0
2026-03-09T19:26:55.524 INFO:tasks.workunit.client.1.vm08.stdout:4/102: write da/d10/f13 [355301,234] 0
2026-03-09T19:26:55.528 INFO:tasks.workunit.client.1.vm08.stdout:7/94: chown d5/d14/f1e 170 1
2026-03-09T19:26:55.535 INFO:tasks.workunit.client.1.vm08.stdout:8/148: mkdir de/d25/d31 0
2026-03-09T19:26:55.535 INFO:tasks.workunit.client.1.vm08.stdout:8/149: write de/d1d/f27 [913696,110359] 0
2026-03-09T19:26:55.535 INFO:tasks.workunit.client.1.vm08.stdout:8/150: chown de/d2c/f2d 0 1
2026-03-09T19:26:55.535 INFO:tasks.workunit.client.1.vm08.stdout:0/97: rmdir dd/d1c 0
2026-03-09T19:26:55.535 INFO:tasks.workunit.client.1.vm08.stdout:0/98: write dd/f18 [106198,14187] 0
2026-03-09T19:26:55.536 INFO:tasks.workunit.client.1.vm08.stdout:0/99: write fc [3683919,64129] 0
2026-03-09T19:26:55.536
INFO:tasks.workunit.client.1.vm08.stdout:0/100: write f7 [705725,15011] 0 2026-03-09T19:26:55.537 INFO:tasks.workunit.client.1.vm08.stdout:0/101: read dd/fe [457511,43492] 0 2026-03-09T19:26:55.538 INFO:tasks.workunit.client.1.vm08.stdout:0/102: truncate dd/f18 372135 0 2026-03-09T19:26:55.544 INFO:tasks.workunit.client.1.vm08.stdout:3/153: creat d0/d8/d24/f2d x:0 0 0 2026-03-09T19:26:55.549 INFO:tasks.workunit.client.1.vm08.stdout:3/154: dwrite d0/d8/d24/f2d [0,4194304] 0 2026-03-09T19:26:55.560 INFO:tasks.workunit.client.1.vm08.stdout:9/99: creat d0/d2/d8/d7/f23 x:0 0 0 2026-03-09T19:26:55.561 INFO:tasks.workunit.client.1.vm08.stdout:1/164: truncate d9/da/f2f 252179 0 2026-03-09T19:26:55.561 INFO:tasks.workunit.client.1.vm08.stdout:1/165: readlink d9/da/dc/ld 0 2026-03-09T19:26:55.565 INFO:tasks.workunit.client.1.vm08.stdout:5/97: truncate f10 1085288 0 2026-03-09T19:26:55.567 INFO:tasks.workunit.client.1.vm08.stdout:5/98: dread f15 [0,4194304] 0 2026-03-09T19:26:55.571 INFO:tasks.workunit.client.1.vm08.stdout:6/112: creat d3/f2a x:0 0 0 2026-03-09T19:26:55.572 INFO:tasks.workunit.client.1.vm08.stdout:2/127: mknod d3/d9/dc/c2e 0 2026-03-09T19:26:55.573 INFO:tasks.workunit.client.1.vm08.stdout:2/128: write d3/d9/dc/de/d18/f29 [735917,103574] 0 2026-03-09T19:26:55.573 INFO:tasks.workunit.client.1.vm08.stdout:2/129: write f1 [288448,57621] 0 2026-03-09T19:26:55.579 INFO:tasks.workunit.client.1.vm08.stdout:8/151: unlink de/d1d/l24 0 2026-03-09T19:26:55.584 INFO:tasks.workunit.client.1.vm08.stdout:4/103: dwrite da/fd [0,4194304] 0 2026-03-09T19:26:55.586 INFO:tasks.workunit.client.1.vm08.stdout:1/166: sync 2026-03-09T19:26:55.587 INFO:tasks.workunit.client.1.vm08.stdout:4/104: read da/f12 [254441,37988] 0 2026-03-09T19:26:55.606 INFO:tasks.workunit.client.1.vm08.stdout:9/100: truncate d0/d3/f5 582455 0 2026-03-09T19:26:55.667 INFO:tasks.workunit.client.1.vm08.stdout:2/130: truncate f2 329066 0 2026-03-09T19:26:55.668 INFO:tasks.workunit.client.1.vm08.stdout:2/131: 
write d3/d9/dc/de/f17 [17562,13059] 0 2026-03-09T19:26:55.672 INFO:tasks.workunit.client.1.vm08.stdout:2/132: dread d3/d9/dc/de/d18/f29 [0,4194304] 0 2026-03-09T19:26:55.673 INFO:tasks.workunit.client.1.vm08.stdout:2/133: write d3/d9/dc/de/f17 [855914,35347] 0 2026-03-09T19:26:55.676 INFO:tasks.workunit.client.1.vm08.stdout:2/134: dwrite d3/d4/d23/f27 [0,4194304] 0 2026-03-09T19:26:55.693 INFO:tasks.workunit.client.1.vm08.stdout:7/95: mknod d5/d12/c21 0 2026-03-09T19:26:55.708 INFO:tasks.workunit.client.1.vm08.stdout:1/167: fsync d9/da/dc/f20 0 2026-03-09T19:26:55.709 INFO:tasks.workunit.client.1.vm08.stdout:1/168: chown d9 129013874 1 2026-03-09T19:26:55.709 INFO:tasks.workunit.client.1.vm08.stdout:1/169: write d9/d11/f29 [559954,113047] 0 2026-03-09T19:26:55.713 INFO:tasks.workunit.client.1.vm08.stdout:3/155: symlink d0/d6/de/d1a/l2e 0 2026-03-09T19:26:55.715 INFO:tasks.workunit.client.1.vm08.stdout:6/113: creat d3/d15/f2b x:0 0 0 2026-03-09T19:26:55.715 INFO:tasks.workunit.client.1.vm08.stdout:6/114: fdatasync d3/d15/f19 0 2026-03-09T19:26:55.724 INFO:tasks.workunit.client.1.vm08.stdout:7/96: rename d5/d12/c21 to d5/d12/c22 0 2026-03-09T19:26:55.726 INFO:tasks.workunit.client.1.vm08.stdout:8/152: mkdir de/d32 0 2026-03-09T19:26:55.733 INFO:tasks.workunit.client.1.vm08.stdout:1/170: creat d9/f36 x:0 0 0 2026-03-09T19:26:55.733 INFO:tasks.workunit.client.1.vm08.stdout:1/171: chown d9/da/d17/f2a 0 1 2026-03-09T19:26:55.733 INFO:tasks.workunit.client.1.vm08.stdout:1/172: chown d9/f36 1333 1 2026-03-09T19:26:55.733 INFO:tasks.workunit.client.1.vm08.stdout:4/105: rename l4 to da/d14/l19 0 2026-03-09T19:26:55.733 INFO:tasks.workunit.client.1.vm08.stdout:4/106: write f5 [3994893,96873] 0 2026-03-09T19:26:55.739 INFO:tasks.workunit.client.1.vm08.stdout:3/156: rmdir d0/d6/de/d1b 39 2026-03-09T19:26:55.749 INFO:tasks.workunit.client.1.vm08.stdout:8/153: dread - de/d1d/d21/f30 zero size 2026-03-09T19:26:55.752 INFO:tasks.workunit.client.1.vm08.stdout:8/154: dread de/f11 
[0,4194304] 0 2026-03-09T19:26:55.753 INFO:tasks.workunit.client.1.vm08.stdout:8/155: write de/f1b [752664,108722] 0 2026-03-09T19:26:55.757 INFO:tasks.workunit.client.1.vm08.stdout:8/156: dwrite de/d1d/d21/f30 [0,4194304] 0 2026-03-09T19:26:55.758 INFO:tasks.workunit.client.1.vm08.stdout:8/157: chown de/c12 124 1 2026-03-09T19:26:55.758 INFO:tasks.workunit.client.1.vm08.stdout:8/158: write de/d1d/f1e [4002653,106214] 0 2026-03-09T19:26:55.759 INFO:tasks.workunit.client.1.vm08.stdout:8/159: write de/f16 [946787,12998] 0 2026-03-09T19:26:55.762 INFO:tasks.workunit.client.1.vm08.stdout:1/173: mknod d9/c37 0 2026-03-09T19:26:55.769 INFO:tasks.workunit.client.1.vm08.stdout:1/174: dwrite d9/da/dc/f10 [0,4194304] 0 2026-03-09T19:26:55.784 INFO:tasks.workunit.client.1.vm08.stdout:0/103: truncate dd/f12 1125980 0 2026-03-09T19:26:55.788 INFO:tasks.workunit.client.1.vm08.stdout:0/104: dwrite dd/f16 [0,4194304] 0 2026-03-09T19:26:55.792 INFO:tasks.workunit.client.1.vm08.stdout:0/105: dread fb [0,4194304] 0 2026-03-09T19:26:55.793 INFO:tasks.workunit.client.1.vm08.stdout:9/101: rename d0/cb to d0/d2/d14/c24 0 2026-03-09T19:26:55.793 INFO:tasks.workunit.client.1.vm08.stdout:9/102: dread - d0/d2/d14/f19 zero size 2026-03-09T19:26:55.799 INFO:tasks.workunit.client.1.vm08.stdout:3/157: dread - d0/d6/de/d1b/d16/f21 zero size 2026-03-09T19:26:55.800 INFO:tasks.workunit.client.1.vm08.stdout:4/107: dread da/d10/f13 [0,4194304] 0 2026-03-09T19:26:55.803 INFO:tasks.workunit.client.1.vm08.stdout:5/99: getdents d16 0 2026-03-09T19:26:55.808 INFO:tasks.workunit.client.1.vm08.stdout:6/115: mknod d3/c2c 0 2026-03-09T19:26:55.810 INFO:tasks.workunit.client.1.vm08.stdout:7/97: creat d5/d16/f23 x:0 0 0 2026-03-09T19:26:55.827 INFO:tasks.workunit.client.1.vm08.stdout:0/106: symlink dd/l20 0 2026-03-09T19:26:55.834 INFO:tasks.workunit.client.1.vm08.stdout:4/108: read f1 [4092937,35807] 0 2026-03-09T19:26:55.837 INFO:tasks.workunit.client.1.vm08.stdout:3/158: sync 2026-03-09T19:26:55.837 
INFO:tasks.workunit.client.1.vm08.stdout:9/103: sync 2026-03-09T19:26:55.839 INFO:tasks.workunit.client.1.vm08.stdout:9/104: dread d0/d3/f9 [0,4194304] 0 2026-03-09T19:26:55.840 INFO:tasks.workunit.client.1.vm08.stdout:1/175: rmdir d9 39 2026-03-09T19:26:55.847 INFO:tasks.workunit.client.1.vm08.stdout:2/135: link d3/d4/d23/d2c/c1b d3/d9/dc/c2f 0 2026-03-09T19:26:55.849 INFO:tasks.workunit.client.1.vm08.stdout:7/98: read d5/fa [977800,79941] 0 2026-03-09T19:26:55.852 INFO:tasks.workunit.client.1.vm08.stdout:0/107: fdatasync fb 0 2026-03-09T19:26:55.856 INFO:tasks.workunit.client.1.vm08.stdout:0/108: dwrite dd/f18 [0,4194304] 0 2026-03-09T19:26:55.859 INFO:tasks.workunit.client.1.vm08.stdout:0/109: chown dd/f18 12 1 2026-03-09T19:26:55.860 INFO:tasks.workunit.client.1.vm08.stdout:0/110: read fb [545341,83848] 0 2026-03-09T19:26:55.867 INFO:tasks.workunit.client.1.vm08.stdout:4/109: rename da/d14/c15 to da/c1a 0 2026-03-09T19:26:55.868 INFO:tasks.workunit.client.1.vm08.stdout:4/110: chown da/d10 293489 1 2026-03-09T19:26:55.870 INFO:tasks.workunit.client.1.vm08.stdout:6/116: dread d3/db/f20 [0,4194304] 0 2026-03-09T19:26:55.871 INFO:tasks.workunit.client.1.vm08.stdout:3/159: write d0/f28 [1375183,113086] 0 2026-03-09T19:26:55.871 INFO:tasks.workunit.client.1.vm08.stdout:3/160: chown d0/d6/d25/d2b 291192099 1 2026-03-09T19:26:55.872 INFO:tasks.workunit.client.1.vm08.stdout:3/161: write d0/d8/d24/f2d [917129,11928] 0 2026-03-09T19:26:55.875 INFO:tasks.workunit.client.1.vm08.stdout:1/176: fdatasync d9/da/dc/f1d 0 2026-03-09T19:26:55.881 INFO:tasks.workunit.client.1.vm08.stdout:1/177: chown d9/d11/f29 1212542411 1 2026-03-09T19:26:55.881 INFO:tasks.workunit.client.1.vm08.stdout:0/111: rename dd/l10 to dd/l21 0 2026-03-09T19:26:55.881 INFO:tasks.workunit.client.1.vm08.stdout:2/136: rename d3/d9/dc/de/l24 to d3/d9/dc/de/d18/d1f/l30 0 2026-03-09T19:26:55.884 INFO:tasks.workunit.client.1.vm08.stdout:3/162: fsync d0/f28 0 2026-03-09T19:26:55.885 
INFO:tasks.workunit.client.1.vm08.stdout:3/163: fdatasync d0/d6/de/d1b/d16/f21 0 2026-03-09T19:26:55.885 INFO:tasks.workunit.client.1.vm08.stdout:4/111: mkdir da/d10/d1b 0 2026-03-09T19:26:55.886 INFO:tasks.workunit.client.1.vm08.stdout:4/112: dread - da/f18 zero size 2026-03-09T19:26:55.887 INFO:tasks.workunit.client.1.vm08.stdout:5/100: getdents d16 0 2026-03-09T19:26:55.894 INFO:tasks.workunit.client.1.vm08.stdout:7/99: symlink d5/l24 0 2026-03-09T19:26:55.896 INFO:tasks.workunit.client.1.vm08.stdout:9/105: truncate d0/d2/f1a 628476 0 2026-03-09T19:26:55.897 INFO:tasks.workunit.client.1.vm08.stdout:9/106: dread - d0/d2/f1d zero size 2026-03-09T19:26:55.899 INFO:tasks.workunit.client.1.vm08.stdout:8/160: getdents de/d1d 0 2026-03-09T19:26:55.901 INFO:tasks.workunit.client.1.vm08.stdout:1/178: creat d9/da/d2c/f38 x:0 0 0 2026-03-09T19:26:55.904 INFO:tasks.workunit.client.1.vm08.stdout:0/112: mkdir dd/d22 0 2026-03-09T19:26:55.909 INFO:tasks.workunit.client.1.vm08.stdout:4/113: unlink da/d10/l11 0 2026-03-09T19:26:55.910 INFO:tasks.workunit.client.1.vm08.stdout:7/100: rmdir d5 39 2026-03-09T19:26:55.915 INFO:tasks.workunit.client.1.vm08.stdout:8/161: unlink de/ff 0 2026-03-09T19:26:55.918 INFO:tasks.workunit.client.1.vm08.stdout:1/179: mkdir d9/da/d12/d39 0 2026-03-09T19:26:55.920 INFO:tasks.workunit.client.1.vm08.stdout:0/113: fsync dd/f13 0 2026-03-09T19:26:55.925 INFO:tasks.workunit.client.1.vm08.stdout:0/114: dwrite dd/f1e [0,4194304] 0 2026-03-09T19:26:55.927 INFO:tasks.workunit.client.1.vm08.stdout:0/115: read fc [3038807,128602] 0 2026-03-09T19:26:55.939 INFO:tasks.workunit.client.1.vm08.stdout:2/137: link d3/d4/fd d3/d4/d23/d2c/f31 0 2026-03-09T19:26:55.940 INFO:tasks.workunit.client.1.vm08.stdout:4/114: truncate da/f12 322066 0 2026-03-09T19:26:55.942 INFO:tasks.workunit.client.1.vm08.stdout:8/162: rmdir de/d25 39 2026-03-09T19:26:55.943 INFO:tasks.workunit.client.1.vm08.stdout:8/163: write de/d1d/d21/f23 [858651,48918] 0 2026-03-09T19:26:55.944 
INFO:tasks.workunit.client.1.vm08.stdout:8/164: fsync de/f19 0 2026-03-09T19:26:55.944 INFO:tasks.workunit.client.1.vm08.stdout:7/101: dwrite d5/d12/f13 [0,4194304] 0 2026-03-09T19:26:55.946 INFO:tasks.workunit.client.1.vm08.stdout:1/180: unlink d9/d11/l1c 0 2026-03-09T19:26:55.947 INFO:tasks.workunit.client.1.vm08.stdout:1/181: dread d9/da/d17/f2a [0,4194304] 0 2026-03-09T19:26:55.949 INFO:tasks.workunit.client.1.vm08.stdout:1/182: write d9/da/dc/f31 [413911,25269] 0 2026-03-09T19:26:55.960 INFO:tasks.workunit.client.1.vm08.stdout:0/116: unlink dd/c1b 0 2026-03-09T19:26:55.960 INFO:tasks.workunit.client.1.vm08.stdout:6/117: rename c2 to d3/c2d 0 2026-03-09T19:26:55.963 INFO:tasks.workunit.client.1.vm08.stdout:0/117: dwrite dd/f18 [0,4194304] 0 2026-03-09T19:26:55.965 INFO:tasks.workunit.client.1.vm08.stdout:9/107: read d0/d2/f1a [38872,110884] 0 2026-03-09T19:26:55.976 INFO:tasks.workunit.client.1.vm08.stdout:7/102: dwrite d5/fa [0,4194304] 0 2026-03-09T19:26:55.990 INFO:tasks.workunit.client.1.vm08.stdout:2/138: link d3/f19 d3/d9/dc/de/f32 0 2026-03-09T19:26:55.997 INFO:tasks.workunit.client.1.vm08.stdout:8/165: mkdir de/d25/d33 0 2026-03-09T19:26:55.997 INFO:tasks.workunit.client.1.vm08.stdout:7/103: symlink d5/d12/l25 0 2026-03-09T19:26:55.997 INFO:tasks.workunit.client.1.vm08.stdout:7/104: read d5/d14/f1e [2781130,85724] 0 2026-03-09T19:26:56.000 INFO:tasks.workunit.client.1.vm08.stdout:3/164: rename d0/d6/de/d1a/l2e to d0/d6/de/d1b/l2f 0 2026-03-09T19:26:56.003 INFO:tasks.workunit.client.1.vm08.stdout:8/166: symlink de/d25/d33/l34 0 2026-03-09T19:26:56.010 INFO:tasks.workunit.client.1.vm08.stdout:2/139: rename d3/d9/dc/c2e to d3/d9/dc/c33 0 2026-03-09T19:26:56.015 INFO:tasks.workunit.client.1.vm08.stdout:8/167: mkdir de/d25/d35 0 2026-03-09T19:26:56.015 INFO:tasks.workunit.client.1.vm08.stdout:8/168: fdatasync de/f20 0 2026-03-09T19:26:56.015 INFO:tasks.workunit.client.1.vm08.stdout:8/169: readlink de/l13 0 2026-03-09T19:26:56.015 
INFO:tasks.workunit.client.1.vm08.stdout:8/170: dread de/f1c [0,4194304] 0 2026-03-09T19:26:56.015 INFO:tasks.workunit.client.1.vm08.stdout:8/171: dread - de/f20 zero size 2026-03-09T19:26:56.017 INFO:tasks.workunit.client.1.vm08.stdout:5/101: rename f10 to d16/d1e/f25 0 2026-03-09T19:26:56.017 INFO:tasks.workunit.client.1.vm08.stdout:5/102: dread - f14 zero size 2026-03-09T19:26:56.021 INFO:tasks.workunit.client.1.vm08.stdout:3/165: rename d0/d6/de/c13 to d0/d6/de/c30 0 2026-03-09T19:26:56.021 INFO:tasks.workunit.client.1.vm08.stdout:3/166: chown d0/d8/d24 318 1 2026-03-09T19:26:56.023 INFO:tasks.workunit.client.1.vm08.stdout:1/183: sync 2026-03-09T19:26:56.023 INFO:tasks.workunit.client.1.vm08.stdout:9/108: sync 2026-03-09T19:26:56.024 INFO:tasks.workunit.client.1.vm08.stdout:9/109: fdatasync d0/f4 0 2026-03-09T19:26:56.024 INFO:tasks.workunit.client.1.vm08.stdout:1/184: write d9/da/f30 [396999,82582] 0 2026-03-09T19:26:56.033 INFO:tasks.workunit.client.1.vm08.stdout:5/103: dread d16/f20 [0,4194304] 0 2026-03-09T19:26:56.033 INFO:tasks.workunit.client.1.vm08.stdout:5/104: write f5 [3470950,30936] 0 2026-03-09T19:26:56.034 INFO:tasks.workunit.client.1.vm08.stdout:5/105: chown d16/c1f 6 1 2026-03-09T19:26:56.040 INFO:tasks.workunit.client.1.vm08.stdout:0/118: rmdir dd 39 2026-03-09T19:26:56.045 INFO:tasks.workunit.client.1.vm08.stdout:7/105: write d5/f9 [2859631,102169] 0 2026-03-09T19:26:56.046 INFO:tasks.workunit.client.1.vm08.stdout:7/106: chown d5/d14/f1e 2279260 1 2026-03-09T19:26:56.047 INFO:tasks.workunit.client.1.vm08.stdout:2/140: symlink d3/d4/l34 0 2026-03-09T19:26:56.051 INFO:tasks.workunit.client.1.vm08.stdout:8/172: creat de/d25/d31/f36 x:0 0 0 2026-03-09T19:26:56.065 INFO:tasks.workunit.client.1.vm08.stdout:4/115: rename da/f12 to da/d10/f1c 0 2026-03-09T19:26:56.068 INFO:tasks.workunit.client.1.vm08.stdout:3/167: truncate d0/d8/d24/f2d 3497736 0 2026-03-09T19:26:56.069 INFO:tasks.workunit.client.1.vm08.stdout:9/110: symlink d0/d2/d8/l25 0 
2026-03-09T19:26:56.070 INFO:tasks.workunit.client.1.vm08.stdout:1/185: symlink d9/da/d12/d39/l3a 0 2026-03-09T19:26:56.070 INFO:tasks.workunit.client.1.vm08.stdout:1/186: read d9/f22 [11164746,113460] 0 2026-03-09T19:26:56.076 INFO:tasks.workunit.client.1.vm08.stdout:6/118: rename d3/d15/l23 to d3/db/d24/l2e 0 2026-03-09T19:26:56.076 INFO:tasks.workunit.client.1.vm08.stdout:0/119: rename dd to dd/d23 22 2026-03-09T19:26:56.076 INFO:tasks.workunit.client.1.vm08.stdout:9/111: rename d0/d3 to d0/d3/d26 22 2026-03-09T19:26:56.079 INFO:tasks.workunit.client.1.vm08.stdout:5/106: truncate d16/f18 2732163 0 2026-03-09T19:26:56.084 INFO:tasks.workunit.client.1.vm08.stdout:4/116: creat da/f1d x:0 0 0 2026-03-09T19:26:56.088 INFO:tasks.workunit.client.1.vm08.stdout:1/187: sync 2026-03-09T19:26:56.088 INFO:tasks.workunit.client.1.vm08.stdout:0/120: sync 2026-03-09T19:26:56.089 INFO:tasks.workunit.client.1.vm08.stdout:6/119: creat d3/db/d24/f2f x:0 0 0 2026-03-09T19:26:56.093 INFO:tasks.workunit.client.1.vm08.stdout:7/107: truncate d5/f1a 390133 0 2026-03-09T19:26:56.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:55 vm08.local ceph-mon[57794]: pgmap v152: 65 pgs: 65 active+clean; 304 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail; 436 KiB/s rd, 18 MiB/s wr, 370 op/s 2026-03-09T19:26:56.103 INFO:tasks.workunit.client.1.vm08.stdout:9/112: dwrite d0/d3/fd [0,4194304] 0 2026-03-09T19:26:56.105 INFO:tasks.workunit.client.1.vm08.stdout:9/113: write d0/f13 [334189,57957] 0 2026-03-09T19:26:56.105 INFO:tasks.workunit.client.1.vm08.stdout:9/114: chown d0/d2/d8/d7/f22 3506 1 2026-03-09T19:26:56.115 INFO:tasks.workunit.client.1.vm08.stdout:2/141: getdents d3/d4/d23 0 2026-03-09T19:26:56.116 INFO:tasks.workunit.client.1.vm08.stdout:8/173: getdents de/d1d 0 2026-03-09T19:26:56.117 INFO:tasks.workunit.client.1.vm08.stdout:1/188: mknod d9/da/d17/c3b 0 2026-03-09T19:26:56.119 INFO:tasks.workunit.client.1.vm08.stdout:5/107: write ff [72871,63471] 0 2026-03-09T19:26:56.123 
INFO:tasks.workunit.client.1.vm08.stdout:6/120: chown d3/ca 536367623 1 2026-03-09T19:26:56.125 INFO:tasks.workunit.client.1.vm08.stdout:7/108: symlink d5/d16/l26 0 2026-03-09T19:26:56.126 INFO:tasks.workunit.client.1.vm08.stdout:7/109: fsync d5/d16/f23 0 2026-03-09T19:26:56.133 INFO:tasks.workunit.client.1.vm08.stdout:3/168: rmdir d0/d6/d25/d2b 0 2026-03-09T19:26:56.141 INFO:tasks.workunit.client.1.vm08.stdout:1/189: rmdir d9/da/d12 39 2026-03-09T19:26:56.147 INFO:tasks.workunit.client.1.vm08.stdout:8/174: dwrite de/f11 [0,4194304] 0 2026-03-09T19:26:56.147 INFO:tasks.workunit.client.1.vm08.stdout:8/175: write de/f1f [485727,110617] 0 2026-03-09T19:26:56.163 INFO:tasks.workunit.client.1.vm08.stdout:5/108: rename d16/c1f to d16/d1e/c26 0 2026-03-09T19:26:56.164 INFO:tasks.workunit.client.1.vm08.stdout:0/121: mkdir dd/d22/d24 0 2026-03-09T19:26:56.164 INFO:tasks.workunit.client.1.vm08.stdout:7/110: rmdir d5 39 2026-03-09T19:26:56.164 INFO:tasks.workunit.client.1.vm08.stdout:6/121: creat d3/db/f30 x:0 0 0 2026-03-09T19:26:56.164 INFO:tasks.workunit.client.1.vm08.stdout:5/109: chown f5 144698 1 2026-03-09T19:26:56.168 INFO:tasks.workunit.client.1.vm08.stdout:3/169: unlink d0/d8/d19/fd 0 2026-03-09T19:26:56.170 INFO:tasks.workunit.client.1.vm08.stdout:2/142: creat d3/d9/d26/f35 x:0 0 0 2026-03-09T19:26:56.174 INFO:tasks.workunit.client.1.vm08.stdout:1/190: creat d9/d11/f3c x:0 0 0 2026-03-09T19:26:56.174 INFO:tasks.workunit.client.1.vm08.stdout:1/191: chown d9/da/d2c 659 1 2026-03-09T19:26:56.176 INFO:tasks.workunit.client.1.vm08.stdout:1/192: dread d9/da/dc/f20 [0,4194304] 0 2026-03-09T19:26:56.179 INFO:tasks.workunit.client.1.vm08.stdout:0/122: symlink dd/d22/l25 0 2026-03-09T19:26:56.182 INFO:tasks.workunit.client.1.vm08.stdout:7/111: dwrite d5/d16/f1f [0,4194304] 0 2026-03-09T19:26:56.184 INFO:tasks.workunit.client.1.vm08.stdout:9/115: link d0/d2/f1a d0/f27 0 2026-03-09T19:26:56.184 INFO:tasks.workunit.client.1.vm08.stdout:5/110: rmdir d16 39 
2026-03-09T19:26:56.192 INFO:tasks.workunit.client.1.vm08.stdout:4/117: getdents da 0 2026-03-09T19:26:56.193 INFO:tasks.workunit.client.1.vm08.stdout:3/170: rmdir d0/d6/de/d15 39 2026-03-09T19:26:56.194 INFO:tasks.workunit.client.1.vm08.stdout:2/143: symlink d3/d9/dc/de/l36 0 2026-03-09T19:26:56.198 INFO:tasks.workunit.client.1.vm08.stdout:6/122: symlink d3/l31 0 2026-03-09T19:26:56.211 INFO:tasks.workunit.client.1.vm08.stdout:9/116: creat d0/d2/d14/f28 x:0 0 0 2026-03-09T19:26:56.211 INFO:tasks.workunit.client.1.vm08.stdout:3/171: mknod d0/d6/de/d1a/c31 0 2026-03-09T19:26:56.212 INFO:tasks.workunit.client.1.vm08.stdout:3/172: fsync d0/d6/de/d1b/d16/f21 0 2026-03-09T19:26:56.212 INFO:tasks.workunit.client.1.vm08.stdout:2/144: symlink d3/d9/d26/l37 0 2026-03-09T19:26:56.212 INFO:tasks.workunit.client.1.vm08.stdout:2/145: stat d3/d4 0 2026-03-09T19:26:56.212 INFO:tasks.workunit.client.1.vm08.stdout:1/193: creat d9/da/d2d/f3d x:0 0 0 2026-03-09T19:26:56.212 INFO:tasks.workunit.client.1.vm08.stdout:8/176: creat de/f37 x:0 0 0 2026-03-09T19:26:56.213 INFO:tasks.workunit.client.1.vm08.stdout:7/112: sync 2026-03-09T19:26:56.213 INFO:tasks.workunit.client.1.vm08.stdout:4/118: sync 2026-03-09T19:26:56.219 INFO:tasks.workunit.client.1.vm08.stdout:4/119: dwrite da/f1d [0,4194304] 0 2026-03-09T19:26:56.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:55 vm07.local ceph-mon[48545]: pgmap v152: 65 pgs: 65 active+clean; 304 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail; 436 KiB/s rd, 18 MiB/s wr, 370 op/s 2026-03-09T19:26:56.233 INFO:tasks.workunit.client.1.vm08.stdout:3/173: chown d0/d6/de/d1b/d16/d17/f1d 43888 1 2026-03-09T19:26:56.237 INFO:tasks.workunit.client.1.vm08.stdout:1/194: truncate d9/da/d17/f2a 898942 0 2026-03-09T19:26:56.237 INFO:tasks.workunit.client.1.vm08.stdout:1/195: write d9/da/d2d/f3d [614349,916] 0 2026-03-09T19:26:56.240 INFO:tasks.workunit.client.1.vm08.stdout:7/113: mkdir d5/d14/d27 0 2026-03-09T19:26:56.240 
INFO:tasks.workunit.client.1.vm08.stdout:7/114: chown d5/l24 259616621 1 2026-03-09T19:26:56.243 INFO:tasks.workunit.client.1.vm08.stdout:7/115: dread d5/d14/f1e [0,4194304] 0 2026-03-09T19:26:56.246 INFO:tasks.workunit.client.1.vm08.stdout:4/120: mknod da/d14/c1e 0 2026-03-09T19:26:56.247 INFO:tasks.workunit.client.1.vm08.stdout:6/123: fsync d3/fe 0 2026-03-09T19:26:56.250 INFO:tasks.workunit.client.1.vm08.stdout:0/123: write f3 [1113940,74862] 0 2026-03-09T19:26:56.255 INFO:tasks.workunit.client.1.vm08.stdout:1/196: readlink d9/da/d12/d39/l3a 0 2026-03-09T19:26:56.258 INFO:tasks.workunit.client.1.vm08.stdout:7/116: creat d5/d16/f28 x:0 0 0 2026-03-09T19:26:56.260 INFO:tasks.workunit.client.1.vm08.stdout:4/121: creat da/d10/f1f x:0 0 0 2026-03-09T19:26:56.261 INFO:tasks.workunit.client.1.vm08.stdout:4/122: truncate da/d10/f1f 250710 0 2026-03-09T19:26:56.267 INFO:tasks.workunit.client.1.vm08.stdout:6/124: dwrite d3/fe [0,4194304] 0 2026-03-09T19:26:56.277 INFO:tasks.workunit.client.1.vm08.stdout:2/146: truncate d3/d9/dc/de/f17 131852 0 2026-03-09T19:26:56.281 INFO:tasks.workunit.client.1.vm08.stdout:9/117: rename d0/f4 to d0/d2/d8/f29 0 2026-03-09T19:26:56.282 INFO:tasks.workunit.client.1.vm08.stdout:1/197: mknod d9/da/d2d/c3e 0 2026-03-09T19:26:56.284 INFO:tasks.workunit.client.1.vm08.stdout:4/123: mknod da/d14/c20 0 2026-03-09T19:26:56.287 INFO:tasks.workunit.client.1.vm08.stdout:4/124: dread da/f1d [0,4194304] 0 2026-03-09T19:26:56.289 INFO:tasks.workunit.client.1.vm08.stdout:5/111: getdents d16/d1e 0 2026-03-09T19:26:56.291 INFO:tasks.workunit.client.1.vm08.stdout:9/118: unlink d0/c1f 0 2026-03-09T19:26:56.292 INFO:tasks.workunit.client.1.vm08.stdout:8/177: getdents de/d25 0 2026-03-09T19:26:56.292 INFO:tasks.workunit.client.1.vm08.stdout:8/178: chown de/d1d/d2e/c2f 147 1 2026-03-09T19:26:56.292 INFO:tasks.workunit.client.1.vm08.stdout:7/117: getdents d5/d14/d27 0 2026-03-09T19:26:56.294 INFO:tasks.workunit.client.1.vm08.stdout:4/125: creat da/f21 x:0 0 0 
2026-03-09T19:26:56.317 INFO:tasks.workunit.client.1.vm08.stdout:8/179: dwrite de/f1b [4194304,4194304] 0 2026-03-09T19:26:56.318 INFO:tasks.workunit.client.1.vm08.stdout:6/125: unlink d3/f17 0 2026-03-09T19:26:56.318 INFO:tasks.workunit.client.1.vm08.stdout:2/147: mknod d3/d4/c38 0 2026-03-09T19:26:56.319 INFO:tasks.workunit.client.1.vm08.stdout:2/148: write d3/d9/d26/f35 [82712,100800] 0 2026-03-09T19:26:56.319 INFO:tasks.workunit.client.1.vm08.stdout:4/126: mknod da/d14/c22 0 2026-03-09T19:26:56.319 INFO:tasks.workunit.client.1.vm08.stdout:4/127: stat da/lc 0 2026-03-09T19:26:56.319 INFO:tasks.workunit.client.1.vm08.stdout:3/174: rename d0/d6/de/d1b/d16/d18/c23 to d0/d6/c32 0 2026-03-09T19:26:56.319 INFO:tasks.workunit.client.1.vm08.stdout:3/175: chown d0/d6 11555 1 2026-03-09T19:26:56.319 INFO:tasks.workunit.client.1.vm08.stdout:7/118: unlink d5/f10 0 2026-03-09T19:26:56.319 INFO:tasks.workunit.client.1.vm08.stdout:4/128: truncate f1 4254690 0 2026-03-09T19:26:56.320 INFO:tasks.workunit.client.1.vm08.stdout:5/112: rename f0 to d16/d1e/f27 0 2026-03-09T19:26:56.324 INFO:tasks.workunit.client.1.vm08.stdout:5/113: dread d16/f17 [0,4194304] 0 2026-03-09T19:26:56.324 INFO:tasks.workunit.client.1.vm08.stdout:5/114: stat f5 0 2026-03-09T19:26:56.325 INFO:tasks.workunit.client.1.vm08.stdout:1/198: link d9/d11/c34 d9/da/c3f 0 2026-03-09T19:26:56.328 INFO:tasks.workunit.client.1.vm08.stdout:2/149: truncate f2 77067 0 2026-03-09T19:26:56.328 INFO:tasks.workunit.client.1.vm08.stdout:5/115: dread d16/f17 [0,4194304] 0 2026-03-09T19:26:56.336 INFO:tasks.workunit.client.1.vm08.stdout:7/119: rename d5/d12/f13 to d5/d16/d1c/f29 0 2026-03-09T19:26:56.340 INFO:tasks.workunit.client.1.vm08.stdout:6/126: creat d3/f32 x:0 0 0 2026-03-09T19:26:56.341 INFO:tasks.workunit.client.1.vm08.stdout:6/127: fsync d3/db/f14 0 2026-03-09T19:26:56.341 INFO:tasks.workunit.client.1.vm08.stdout:6/128: stat d3/d15/c22 0 2026-03-09T19:26:56.343 INFO:tasks.workunit.client.1.vm08.stdout:1/199: dread 
d9/f22 [4194304,4194304] 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:5/116: truncate d16/d1e/f25 355646 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:2/150: mkdir d3/d4/d23/d2c/d39 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:4/129: mkdir da/d10/d1b/d23 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:2/151: dread - d3/d9/f28 zero size 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:4/130: chown f2 360368 1 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:5/117: creat d16/d1e/f28 x:0 0 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:2/152: dread d3/d4/f8 [0,4194304] 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:2/153: dread - d3/d9/f1e zero size 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:4/131: rename da/cf to da/c24 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:5/118: unlink d16/d1e/c21 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:5/119: chown d16/d1e/f27 38084058 1 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:5/120: symlink d16/l29 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:5/121: write f14 [665777,79256] 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:1/200: getdents d9/da/d17 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:5/122: creat d16/f2a x:0 0 0 2026-03-09T19:26:56.368 INFO:tasks.workunit.client.1.vm08.stdout:1/201: mkdir d9/d40 0 2026-03-09T19:26:56.369 INFO:tasks.workunit.client.1.vm08.stdout:5/123: fdatasync d16/d1e/f27 0 2026-03-09T19:26:56.369 INFO:tasks.workunit.client.1.vm08.stdout:1/202: creat d9/da/d2d/f41 x:0 0 0 2026-03-09T19:26:56.369 INFO:tasks.workunit.client.1.vm08.stdout:1/203: fdatasync d9/d11/f29 0 2026-03-09T19:26:56.369 INFO:tasks.workunit.client.1.vm08.stdout:1/204: readlink d9/da/d12/l16 0 2026-03-09T19:26:56.380 
INFO:tasks.workunit.client.1.vm08.stdout:1/205: truncate d9/da/dc/f1d 1288991 0 2026-03-09T19:26:56.381 INFO:tasks.workunit.client.1.vm08.stdout:1/206: symlink d9/da/dc/l42 0 2026-03-09T19:26:56.381 INFO:tasks.workunit.client.1.vm08.stdout:1/207: chown d9/da/d12 0 1 2026-03-09T19:26:56.384 INFO:tasks.workunit.client.1.vm08.stdout:1/208: unlink d9/da/l13 0 2026-03-09T19:26:56.387 INFO:tasks.workunit.client.1.vm08.stdout:1/209: truncate d9/da/dc/f31 1041327 0 2026-03-09T19:26:56.387 INFO:tasks.workunit.client.1.vm08.stdout:1/210: readlink d9/da/l21 0 2026-03-09T19:26:56.387 INFO:tasks.workunit.client.1.vm08.stdout:1/211: stat d9/da/d2c/l35 0 2026-03-09T19:26:56.388 INFO:tasks.workunit.client.1.vm08.stdout:1/212: read d9/da/dc/f10 [3875736,119337] 0 2026-03-09T19:26:56.402 INFO:tasks.workunit.client.1.vm08.stdout:7/120: sync 2026-03-09T19:26:56.402 INFO:tasks.workunit.client.1.vm08.stdout:2/154: sync 2026-03-09T19:26:56.409 INFO:tasks.workunit.client.1.vm08.stdout:7/121: mknod d5/d14/c2a 0 2026-03-09T19:26:56.411 INFO:tasks.workunit.client.1.vm08.stdout:7/122: write d5/f9 [620735,31815] 0 2026-03-09T19:26:56.411 INFO:tasks.workunit.client.1.vm08.stdout:2/155: dwrite d3/d9/f20 [0,4194304] 0 2026-03-09T19:26:56.426 INFO:tasks.workunit.client.1.vm08.stdout:0/124: write fb [2098222,43639] 0 2026-03-09T19:26:56.428 INFO:tasks.workunit.client.1.vm08.stdout:0/125: dread dd/f1e [0,4194304] 0 2026-03-09T19:26:56.431 INFO:tasks.workunit.client.1.vm08.stdout:0/126: dwrite dd/f16 [0,4194304] 0 2026-03-09T19:26:56.439 INFO:tasks.workunit.client.1.vm08.stdout:7/123: mkdir d5/d14/d2b 0 2026-03-09T19:26:56.446 INFO:tasks.workunit.client.1.vm08.stdout:7/124: write d5/d16/d1c/f29 [4173843,39249] 0 2026-03-09T19:26:56.456 INFO:tasks.workunit.client.1.vm08.stdout:8/180: dwrite de/f1c [0,4194304] 0 2026-03-09T19:26:56.472 INFO:tasks.workunit.client.1.vm08.stdout:8/181: dwrite de/d1d/f22 [0,4194304] 0 2026-03-09T19:26:56.474 INFO:tasks.workunit.client.1.vm08.stdout:8/182: fsync de/d1d/f1e 
0 2026-03-09T19:26:56.475 INFO:tasks.workunit.client.1.vm08.stdout:2/156: link d3/f19 d3/d9/dc/de/d18/d1f/f3a 0 2026-03-09T19:26:56.482 INFO:tasks.workunit.client.1.vm08.stdout:9/119: dwrite d0/d3/f5 [0,4194304] 0 2026-03-09T19:26:56.487 INFO:tasks.workunit.client.1.vm08.stdout:9/120: write d0/f13 [340870,110960] 0 2026-03-09T19:26:56.496 INFO:tasks.workunit.client.1.vm08.stdout:8/183: creat de/d25/d33/f38 x:0 0 0 2026-03-09T19:26:56.497 INFO:tasks.workunit.client.1.vm08.stdout:3/176: write d0/d6/de/d1b/d16/d17/f1d [1958383,62893] 0 2026-03-09T19:26:56.498 INFO:tasks.workunit.client.1.vm08.stdout:9/121: dwrite d0/d2/d14/f20 [0,4194304] 0 2026-03-09T19:26:56.519 INFO:tasks.workunit.client.1.vm08.stdout:7/125: link d5/c11 d5/d14/d2b/c2c 0 2026-03-09T19:26:56.523 INFO:tasks.workunit.client.1.vm08.stdout:4/132: rmdir da/d10 39 2026-03-09T19:26:56.523 INFO:tasks.workunit.client.1.vm08.stdout:4/133: dread f2 [0,4194304] 0 2026-03-09T19:26:56.526 INFO:tasks.workunit.client.1.vm08.stdout:4/134: dread f2 [0,4194304] 0 2026-03-09T19:26:56.528 INFO:tasks.workunit.client.1.vm08.stdout:6/129: write d3/db/f20 [105786,103259] 0 2026-03-09T19:26:56.532 INFO:tasks.workunit.client.1.vm08.stdout:2/157: link d3/c5 d3/d9/dc/c3b 0 2026-03-09T19:26:56.532 INFO:tasks.workunit.client.1.vm08.stdout:2/158: chown d3/d9/dc/de/l36 298 1 2026-03-09T19:26:56.537 INFO:tasks.workunit.client.1.vm08.stdout:4/135: dread da/d10/f1f [0,4194304] 0 2026-03-09T19:26:56.538 INFO:tasks.workunit.client.1.vm08.stdout:9/122: creat d0/d2/f2a x:0 0 0 2026-03-09T19:26:56.539 INFO:tasks.workunit.client.1.vm08.stdout:5/124: write d16/f18 [2037472,113001] 0 2026-03-09T19:26:56.543 INFO:tasks.workunit.client.1.vm08.stdout:8/184: rmdir de/d32 0 2026-03-09T19:26:56.551 INFO:tasks.workunit.client.1.vm08.stdout:6/130: dread d3/fc [0,4194304] 0 2026-03-09T19:26:56.556 INFO:tasks.workunit.client.1.vm08.stdout:3/177: getdents d0/d6/de/d1b 0 2026-03-09T19:26:56.558 INFO:tasks.workunit.client.1.vm08.stdout:3/178: write 
d0/d6/de/d1b/d16/d17/f1d [564692,93961] 0 2026-03-09T19:26:56.558 INFO:tasks.workunit.client.1.vm08.stdout:3/179: write d0/d6/de/d1b/d16/f21 [898059,58735] 0 2026-03-09T19:26:56.560 INFO:tasks.workunit.client.1.vm08.stdout:3/180: fdatasync d0/f28 0 2026-03-09T19:26:56.566 INFO:tasks.workunit.client.1.vm08.stdout:4/136: read f1 [2725939,100489] 0 2026-03-09T19:26:56.567 INFO:tasks.workunit.client.1.vm08.stdout:4/137: chown da/d10/f1f 693907 1 2026-03-09T19:26:56.569 INFO:tasks.workunit.client.1.vm08.stdout:9/123: rename d0/f27 to d0/d3/f2b 0 2026-03-09T19:26:56.573 INFO:tasks.workunit.client.1.vm08.stdout:5/125: creat d16/f2b x:0 0 0 2026-03-09T19:26:56.577 INFO:tasks.workunit.client.1.vm08.stdout:1/213: write d9/da/f2f [474316,27229] 0 2026-03-09T19:26:56.586 INFO:tasks.workunit.client.1.vm08.stdout:1/214: dread d9/d11/f29 [0,4194304] 0 2026-03-09T19:26:56.590 INFO:tasks.workunit.client.1.vm08.stdout:2/159: mkdir d3/d4/d3c 0 2026-03-09T19:26:56.592 INFO:tasks.workunit.client.1.vm08.stdout:6/131: fdatasync d3/f10 0 2026-03-09T19:26:56.600 INFO:tasks.workunit.client.1.vm08.stdout:4/138: dread da/f1d [0,4194304] 0 2026-03-09T19:26:56.603 INFO:tasks.workunit.client.1.vm08.stdout:4/139: dread da/f1d [0,4194304] 0 2026-03-09T19:26:56.608 INFO:tasks.workunit.client.1.vm08.stdout:4/140: dwrite da/fd [0,4194304] 0 2026-03-09T19:26:56.623 INFO:tasks.workunit.client.1.vm08.stdout:1/215: symlink d9/d11/l43 0 2026-03-09T19:26:56.625 INFO:tasks.workunit.client.1.vm08.stdout:7/126: getdents d5/d16 0 2026-03-09T19:26:56.626 INFO:tasks.workunit.client.1.vm08.stdout:2/160: rename d3/d4/c38 to d3/d4/d23/d2c/c3d 0 2026-03-09T19:26:56.632 INFO:tasks.workunit.client.1.vm08.stdout:6/132: write d3/f25 [873730,61611] 0 2026-03-09T19:26:56.636 INFO:tasks.workunit.client.1.vm08.stdout:1/216: creat d9/d11/f44 x:0 0 0 2026-03-09T19:26:56.637 INFO:tasks.workunit.client.1.vm08.stdout:1/217: write d9/da/d2c/f38 [379444,81923] 0 2026-03-09T19:26:56.650 
INFO:tasks.workunit.client.1.vm08.stdout:8/185: getdents de/d25/d31 0 2026-03-09T19:26:56.655 INFO:tasks.workunit.client.1.vm08.stdout:0/127: dwrite dd/f13 [0,4194304] 0 2026-03-09T19:26:56.657 INFO:tasks.workunit.client.1.vm08.stdout:0/128: read dd/f18 [2827724,100904] 0 2026-03-09T19:26:56.666 INFO:tasks.workunit.client.1.vm08.stdout:7/127: fdatasync d5/fb 0 2026-03-09T19:26:56.670 INFO:tasks.workunit.client.1.vm08.stdout:8/186: rename de/d25/d35 to de/d25/d31/d39 0 2026-03-09T19:26:56.677 INFO:tasks.workunit.client.1.vm08.stdout:1/218: truncate d9/d11/f29 3404855 0 2026-03-09T19:26:56.682 INFO:tasks.workunit.client.1.vm08.stdout:1/219: fsync d9/da/d2d/f41 0 2026-03-09T19:26:56.683 INFO:tasks.workunit.client.1.vm08.stdout:9/124: unlink d0/d3/f2b 0 2026-03-09T19:26:56.683 INFO:tasks.workunit.client.1.vm08.stdout:9/125: fsync d0/d2/d14/f17 0 2026-03-09T19:26:56.684 INFO:tasks.workunit.client.1.vm08.stdout:9/126: fdatasync d0/d2/f2a 0 2026-03-09T19:26:56.690 INFO:tasks.workunit.client.1.vm08.stdout:7/128: symlink d5/d16/l2d 0 2026-03-09T19:26:56.692 INFO:tasks.workunit.client.1.vm08.stdout:6/133: dwrite d3/f1b [0,4194304] 0 2026-03-09T19:26:56.695 INFO:tasks.workunit.client.1.vm08.stdout:2/161: write d3/d9/dc/de/f17 [558322,26393] 0 2026-03-09T19:26:56.697 INFO:tasks.workunit.client.1.vm08.stdout:2/162: dread - d3/d9/f1e zero size 2026-03-09T19:26:56.698 INFO:tasks.workunit.client.1.vm08.stdout:6/134: dread d3/d15/f1e [0,4194304] 0 2026-03-09T19:26:56.712 INFO:tasks.workunit.client.1.vm08.stdout:5/126: write d16/f17 [3754155,29099] 0 2026-03-09T19:26:56.713 INFO:tasks.workunit.client.1.vm08.stdout:5/127: chown f15 1264278089 1 2026-03-09T19:26:56.719 INFO:tasks.workunit.client.1.vm08.stdout:3/181: truncate d0/d6/de/d1b/d16/d17/f1d 1021476 0 2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:3/182: read - d0/d6/de/d1b/d16/d18/f2c zero size 2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:3/183: chown d0/d6/de/d1a/c31 1865898 1 
2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:3/184: stat d0/cf 0 2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:6/135: write d3/f7 [3686662,75185] 0 2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:0/129: link fc dd/d22/d24/f26 0 2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:4/141: write f1 [1434604,62042] 0 2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:0/130: write fb [1672658,76298] 0 2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:5/128: creat d16/d1e/f2c x:0 0 0 2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:5/129: stat d16/f2b 0 2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:7/129: dwrite d5/fb [0,4194304] 0 2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:7/130: fsync d5/f20 0 2026-03-09T19:26:56.741 INFO:tasks.workunit.client.1.vm08.stdout:6/136: read d3/db/f14 [862679,122514] 0 2026-03-09T19:26:56.750 INFO:tasks.workunit.client.1.vm08.stdout:6/137: symlink d3/l33 0 2026-03-09T19:26:56.751 INFO:tasks.workunit.client.1.vm08.stdout:3/185: getdents d0/d6/de/d1b/d16 0 2026-03-09T19:26:56.751 INFO:tasks.workunit.client.1.vm08.stdout:3/186: stat d0/d6/de/d1b/d16/d18 0 2026-03-09T19:26:56.757 INFO:tasks.workunit.client.1.vm08.stdout:3/187: dwrite d0/d6/de/d1b/d16/f21 [0,4194304] 0 2026-03-09T19:26:56.780 INFO:tasks.workunit.client.1.vm08.stdout:0/131: truncate fc 678760 0 2026-03-09T19:26:56.780 INFO:tasks.workunit.client.1.vm08.stdout:3/188: mkdir d0/d6/de/d1a/d33 0 2026-03-09T19:26:56.780 INFO:tasks.workunit.client.1.vm08.stdout:7/131: link d5/ce d5/d14/d27/c2e 0 2026-03-09T19:26:56.780 INFO:tasks.workunit.client.1.vm08.stdout:0/132: mkdir dd/d22/d27 0 2026-03-09T19:26:56.780 INFO:tasks.workunit.client.1.vm08.stdout:0/133: chown dd/f1e 997389 1 2026-03-09T19:26:56.780 INFO:tasks.workunit.client.1.vm08.stdout:7/132: symlink d5/d16/d1c/l2f 0 2026-03-09T19:26:56.780 
INFO:tasks.workunit.client.1.vm08.stdout:0/134: rmdir dd/d22/d24 39 2026-03-09T19:26:56.780 INFO:tasks.workunit.client.1.vm08.stdout:7/133: dread d5/f1a [0,4194304] 0 2026-03-09T19:26:56.780 INFO:tasks.workunit.client.1.vm08.stdout:7/134: write d5/d12/f19 [2377315,28656] 0 2026-03-09T19:26:56.780 INFO:tasks.workunit.client.1.vm08.stdout:6/138: getdents d3/db 0 2026-03-09T19:26:56.782 INFO:tasks.workunit.client.1.vm08.stdout:6/139: dwrite d3/db/d24/f2f [0,4194304] 0 2026-03-09T19:26:56.785 INFO:tasks.workunit.client.1.vm08.stdout:7/135: creat d5/d14/d2b/f30 x:0 0 0 2026-03-09T19:26:56.786 INFO:tasks.workunit.client.1.vm08.stdout:6/140: rmdir d3/db 39 2026-03-09T19:26:56.788 INFO:tasks.workunit.client.1.vm08.stdout:7/136: dread d5/f20 [0,4194304] 0 2026-03-09T19:26:56.792 INFO:tasks.workunit.client.1.vm08.stdout:6/141: getdents d3/db 0 2026-03-09T19:26:56.792 INFO:tasks.workunit.client.1.vm08.stdout:6/142: fsync d3/fe 0 2026-03-09T19:26:56.793 INFO:tasks.workunit.client.1.vm08.stdout:6/143: write d3/f32 [779795,55507] 0 2026-03-09T19:26:56.794 INFO:tasks.workunit.client.1.vm08.stdout:6/144: chown d3/db/f30 7569 1 2026-03-09T19:26:56.794 INFO:tasks.workunit.client.1.vm08.stdout:6/145: chown d3/db/cd 1 1 2026-03-09T19:26:56.797 INFO:tasks.workunit.client.1.vm08.stdout:6/146: unlink d3/f13 0 2026-03-09T19:26:56.869 INFO:tasks.workunit.client.1.vm08.stdout:2/163: read f2 [18211,64639] 0 2026-03-09T19:26:56.876 INFO:tasks.workunit.client.1.vm08.stdout:2/164: mkdir d3/d4/d3e 0 2026-03-09T19:26:56.879 INFO:tasks.workunit.client.1.vm08.stdout:2/165: creat d3/d9/dc/de/d18/f3f x:0 0 0 2026-03-09T19:26:56.885 INFO:tasks.workunit.client.1.vm08.stdout:8/187: sync 2026-03-09T19:26:56.885 INFO:tasks.workunit.client.1.vm08.stdout:8/188: chown de/d1d/d21/f23 51214 1 2026-03-09T19:26:56.887 INFO:tasks.workunit.client.1.vm08.stdout:9/127: sync 2026-03-09T19:26:56.887 INFO:tasks.workunit.client.1.vm08.stdout:0/135: sync 2026-03-09T19:26:56.892 
INFO:tasks.workunit.client.1.vm08.stdout:8/189: unlink de/d25/d33/f38 0 2026-03-09T19:26:56.896 INFO:tasks.workunit.client.1.vm08.stdout:8/190: dread de/f1f [4194304,4194304] 0 2026-03-09T19:26:56.909 INFO:tasks.workunit.client.1.vm08.stdout:1/220: dwrite d9/da/dc/f1d [0,4194304] 0 2026-03-09T19:26:56.917 INFO:tasks.workunit.client.1.vm08.stdout:4/142: truncate f1 4131193 0 2026-03-09T19:26:56.919 INFO:tasks.workunit.client.1.vm08.stdout:0/136: rmdir dd 39 2026-03-09T19:26:56.923 INFO:tasks.workunit.client.1.vm08.stdout:9/128: mknod d0/d3/c2c 0 2026-03-09T19:26:56.932 INFO:tasks.workunit.client.1.vm08.stdout:7/137: truncate d5/fb 2891874 0 2026-03-09T19:26:56.932 INFO:tasks.workunit.client.1.vm08.stdout:7/138: stat d5/d16/f1f 0 2026-03-09T19:26:56.932 INFO:tasks.workunit.client.1.vm08.stdout:8/191: symlink de/d1d/l3a 0 2026-03-09T19:26:56.932 INFO:tasks.workunit.client.1.vm08.stdout:6/147: truncate d3/db/d24/f2f 1190745 0 2026-03-09T19:26:56.936 INFO:tasks.workunit.client.1.vm08.stdout:1/221: rmdir d9 39 2026-03-09T19:26:56.938 INFO:tasks.workunit.client.1.vm08.stdout:0/137: chown dd/d22/d27 50 1 2026-03-09T19:26:56.942 INFO:tasks.workunit.client.1.vm08.stdout:7/139: mknod d5/d14/c31 0 2026-03-09T19:26:56.943 INFO:tasks.workunit.client.1.vm08.stdout:7/140: read - d5/d16/f28 zero size 2026-03-09T19:26:56.946 INFO:tasks.workunit.client.1.vm08.stdout:7/141: dread d5/f9 [0,4194304] 0 2026-03-09T19:26:56.946 INFO:tasks.workunit.client.1.vm08.stdout:7/142: chown d5/d14/c2a 0 1 2026-03-09T19:26:56.947 INFO:tasks.workunit.client.1.vm08.stdout:7/143: truncate d5/fa 4978693 0 2026-03-09T19:26:56.955 INFO:tasks.workunit.client.1.vm08.stdout:7/144: dwrite d5/d12/f19 [4194304,4194304] 0 2026-03-09T19:26:56.955 INFO:tasks.workunit.client.1.vm08.stdout:7/145: chown d5/d12/f19 41576427 1 2026-03-09T19:26:56.968 INFO:tasks.workunit.client.1.vm08.stdout:6/148: fsync d3/f9 0 2026-03-09T19:26:56.972 INFO:tasks.workunit.client.1.vm08.stdout:6/149: dwrite d3/f25 [0,4194304] 0 
2026-03-09T19:26:56.982 INFO:tasks.workunit.client.1.vm08.stdout:1/222: dread - d9/d11/f44 zero size 2026-03-09T19:26:56.982 INFO:tasks.workunit.client.1.vm08.stdout:1/223: dwrite d9/da/d2d/f3d [0,4194304] 0 2026-03-09T19:26:56.982 INFO:tasks.workunit.client.1.vm08.stdout:1/224: read - d9/d11/f3c zero size 2026-03-09T19:26:56.988 INFO:tasks.workunit.client.1.vm08.stdout:4/143: link da/f21 da/d10/f25 0 2026-03-09T19:26:56.991 INFO:tasks.workunit.client.1.vm08.stdout:4/144: dread f2 [0,4194304] 0 2026-03-09T19:26:56.991 INFO:tasks.workunit.client.1.vm08.stdout:4/145: chown da/f21 32378 1 2026-03-09T19:26:57.002 INFO:tasks.workunit.client.1.vm08.stdout:5/130: dwrite d16/d1e/f25 [0,4194304] 0 2026-03-09T19:26:57.005 INFO:tasks.workunit.client.1.vm08.stdout:7/146: creat d5/d14/d2b/f32 x:0 0 0 2026-03-09T19:26:57.007 INFO:tasks.workunit.client.1.vm08.stdout:5/131: dread f2 [0,4194304] 0 2026-03-09T19:26:57.013 INFO:tasks.workunit.client.1.vm08.stdout:7/147: dwrite d5/fc [0,4194304] 0 2026-03-09T19:26:57.017 INFO:tasks.workunit.client.1.vm08.stdout:7/148: write d5/fc [1706413,15774] 0 2026-03-09T19:26:57.032 INFO:tasks.workunit.client.1.vm08.stdout:7/149: fsync d5/d14/d2b/f30 0 2026-03-09T19:26:57.032 INFO:tasks.workunit.client.1.vm08.stdout:3/189: truncate d0/d6/de/d1b/d16/d17/f1d 61582 0 2026-03-09T19:26:57.032 INFO:tasks.workunit.client.1.vm08.stdout:2/166: truncate d3/f7 1849840 0 2026-03-09T19:26:57.032 INFO:tasks.workunit.client.1.vm08.stdout:4/146: mkdir da/d10/d26 0 2026-03-09T19:26:57.032 INFO:tasks.workunit.client.1.vm08.stdout:4/147: truncate da/d10/f25 74958 0 2026-03-09T19:26:57.036 INFO:tasks.workunit.client.1.vm08.stdout:9/129: creat d0/d2/d8/f2d x:0 0 0 2026-03-09T19:26:57.041 INFO:tasks.workunit.client.1.vm08.stdout:5/132: symlink d16/d1e/l2d 0 2026-03-09T19:26:57.045 INFO:tasks.workunit.client.1.vm08.stdout:6/150: mkdir d3/d34 0 2026-03-09T19:26:57.046 INFO:tasks.workunit.client.1.vm08.stdout:6/151: dread d3/f9 [0,4194304] 0 2026-03-09T19:26:57.048 
INFO:tasks.workunit.client.1.vm08.stdout:4/148: dread da/d10/f13 [0,4194304] 0 2026-03-09T19:26:57.048 INFO:tasks.workunit.client.1.vm08.stdout:4/149: fdatasync da/fd 0 2026-03-09T19:26:57.050 INFO:tasks.workunit.client.1.vm08.stdout:4/150: dread da/f1d [0,4194304] 0 2026-03-09T19:26:57.050 INFO:tasks.workunit.client.1.vm08.stdout:4/151: chown da/d10/d1b/d23 233 1 2026-03-09T19:26:57.051 INFO:tasks.workunit.client.1.vm08.stdout:0/138: rename f3 to dd/d22/f28 0 2026-03-09T19:26:57.053 INFO:tasks.workunit.client.1.vm08.stdout:9/130: chown d0/d2/d8/d7/l15 384917315 1 2026-03-09T19:26:57.054 INFO:tasks.workunit.client.1.vm08.stdout:9/131: chown d0/d2/d8/d7/f23 54256 1 2026-03-09T19:26:57.055 INFO:tasks.workunit.client.1.vm08.stdout:5/133: creat d16/d1e/f2e x:0 0 0 2026-03-09T19:26:57.055 INFO:tasks.workunit.client.1.vm08.stdout:7/150: symlink d5/d14/d27/l33 0 2026-03-09T19:26:57.056 INFO:tasks.workunit.client.1.vm08.stdout:3/190: unlink d0/d6/de/d15/l1c 0 2026-03-09T19:26:57.058 INFO:tasks.workunit.client.1.vm08.stdout:2/167: truncate d3/d9/dc/de/d18/d1f/f3a 3085454 0 2026-03-09T19:26:57.059 INFO:tasks.workunit.client.1.vm08.stdout:1/225: link d9/c37 d9/d40/c45 0 2026-03-09T19:26:57.059 INFO:tasks.workunit.client.1.vm08.stdout:6/152: chown d3/c26 68517 1 2026-03-09T19:26:57.063 INFO:tasks.workunit.client.1.vm08.stdout:0/139: truncate dd/fe 583352 0 2026-03-09T19:26:57.063 INFO:tasks.workunit.client.1.vm08.stdout:0/140: dread - dd/f15 zero size 2026-03-09T19:26:57.064 INFO:tasks.workunit.client.1.vm08.stdout:6/153: dread d3/f10 [0,4194304] 0 2026-03-09T19:26:57.069 INFO:tasks.workunit.client.1.vm08.stdout:9/132: unlink d0/d2/d8/d7/l15 0 2026-03-09T19:26:57.073 INFO:tasks.workunit.client.1.vm08.stdout:9/133: dwrite d0/d2/f2a [0,4194304] 0 2026-03-09T19:26:57.088 INFO:tasks.workunit.client.1.vm08.stdout:6/154: dread f1 [0,4194304] 0 2026-03-09T19:26:57.088 INFO:tasks.workunit.client.1.vm08.stdout:6/155: fdatasync d3/f32 0 2026-03-09T19:26:57.094 
INFO:tasks.workunit.client.1.vm08.stdout:8/192: truncate de/f1c 3666134 0 2026-03-09T19:26:57.102 INFO:tasks.workunit.client.1.vm08.stdout:1/226: rename d9/d11/l18 to d9/l46 0 2026-03-09T19:26:57.103 INFO:tasks.workunit.client.1.vm08.stdout:4/152: mkdir da/d10/d26/d27 0 2026-03-09T19:26:57.108 INFO:tasks.workunit.client.1.vm08.stdout:9/134: symlink d0/l2e 0 2026-03-09T19:26:57.109 INFO:tasks.workunit.client.1.vm08.stdout:6/156: readlink d3/l1f 0 2026-03-09T19:26:57.109 INFO:tasks.workunit.client.1.vm08.stdout:6/157: fdatasync d3/f2a 0 2026-03-09T19:26:57.113 INFO:tasks.workunit.client.1.vm08.stdout:6/158: dwrite d3/fc [0,4194304] 0 2026-03-09T19:26:57.117 INFO:tasks.workunit.client.1.vm08.stdout:2/168: mknod d3/d9/dc/d14/c40 0 2026-03-09T19:26:57.125 INFO:tasks.workunit.client.1.vm08.stdout:8/193: rename de/d25/d33/l34 to de/d1d/d2e/l3b 0 2026-03-09T19:26:57.135 INFO:tasks.workunit.client.1.vm08.stdout:2/169: unlink d3/d9/f28 0 2026-03-09T19:26:57.140 INFO:tasks.workunit.client.1.vm08.stdout:2/170: dread f1 [0,4194304] 0 2026-03-09T19:26:57.140 INFO:tasks.workunit.client.1.vm08.stdout:2/171: dread - d3/d9/dc/de/d18/f3f zero size 2026-03-09T19:26:57.140 INFO:tasks.workunit.client.1.vm08.stdout:8/194: symlink de/d1d/d2e/l3c 0 2026-03-09T19:26:57.143 INFO:tasks.workunit.client.1.vm08.stdout:1/227: stat d9/d40/c45 0 2026-03-09T19:26:57.144 INFO:tasks.workunit.client.1.vm08.stdout:8/195: dread de/d1d/f22 [0,4194304] 0 2026-03-09T19:26:57.146 INFO:tasks.workunit.client.1.vm08.stdout:2/172: dread d3/d9/dc/de/f17 [0,4194304] 0 2026-03-09T19:26:57.151 INFO:tasks.workunit.client.1.vm08.stdout:4/153: mkdir da/d10/d16/d28 0 2026-03-09T19:26:57.157 INFO:tasks.workunit.client.1.vm08.stdout:0/141: creat dd/d22/f29 x:0 0 0 2026-03-09T19:26:57.166 INFO:tasks.workunit.client.1.vm08.stdout:3/191: write d0/d6/de/d1b/d16/f21 [5006771,68216] 0 2026-03-09T19:26:57.170 INFO:tasks.workunit.client.1.vm08.stdout:3/192: dwrite d0/d6/de/d1b/d16/d18/f2c [0,4194304] 0 2026-03-09T19:26:57.182 
INFO:tasks.workunit.client.1.vm08.stdout:6/159: link d3/db/d24/f2f d3/d34/f35 0 2026-03-09T19:26:57.186 INFO:tasks.workunit.client.1.vm08.stdout:5/134: getdents d16 0 2026-03-09T19:26:57.186 INFO:tasks.workunit.client.1.vm08.stdout:5/135: stat d16/d1e/c22 0 2026-03-09T19:26:57.186 INFO:tasks.workunit.client.1.vm08.stdout:5/136: chown d16/l29 465639 1 2026-03-09T19:26:57.188 INFO:tasks.workunit.client.1.vm08.stdout:7/151: getdents d5/d16/d1c 0 2026-03-09T19:26:57.189 INFO:tasks.workunit.client.1.vm08.stdout:7/152: fsync d5/d16/d1c/f29 0 2026-03-09T19:26:57.190 INFO:tasks.workunit.client.1.vm08.stdout:7/153: truncate d5/d14/d2b/f32 718555 0 2026-03-09T19:26:57.194 INFO:tasks.workunit.client.1.vm08.stdout:1/228: creat d9/da/d12/d39/f47 x:0 0 0 2026-03-09T19:26:57.195 INFO:tasks.workunit.client.1.vm08.stdout:1/229: stat d9/da/dc/f1d 0 2026-03-09T19:26:57.195 INFO:tasks.workunit.client.1.vm08.stdout:1/230: chown d9 1 1 2026-03-09T19:26:57.200 INFO:tasks.workunit.client.1.vm08.stdout:9/135: truncate d0/d2/d8/fe 325933 0 2026-03-09T19:26:57.204 INFO:tasks.workunit.client.1.vm08.stdout:4/154: stat da/c24 0 2026-03-09T19:26:57.205 INFO:tasks.workunit.client.1.vm08.stdout:2/173: dread d3/d9/dc/de/f1c [0,4194304] 0 2026-03-09T19:26:57.205 INFO:tasks.workunit.client.1.vm08.stdout:2/174: stat f2 0 2026-03-09T19:26:57.208 INFO:tasks.workunit.client.1.vm08.stdout:0/142: fsync dd/f1e 0 2026-03-09T19:26:57.216 INFO:tasks.workunit.client.1.vm08.stdout:3/193: mknod d0/d8/d19/c34 0 2026-03-09T19:26:57.220 INFO:tasks.workunit.client.1.vm08.stdout:7/154: creat d5/d12/f34 x:0 0 0 2026-03-09T19:26:57.230 INFO:tasks.workunit.client.1.vm08.stdout:6/160: dwrite d3/db/d24/f2f [0,4194304] 0 2026-03-09T19:26:57.244 INFO:tasks.workunit.client.1.vm08.stdout:1/231: creat d9/f48 x:0 0 0 2026-03-09T19:26:57.249 INFO:tasks.workunit.client.1.vm08.stdout:1/232: dwrite d9/da/dc/f31 [0,4194304] 0 2026-03-09T19:26:57.260 INFO:tasks.workunit.client.1.vm08.stdout:4/155: creat da/d10/d1b/f29 x:0 0 0 
2026-03-09T19:26:57.263 INFO:tasks.workunit.client.1.vm08.stdout:4/156: dread f2 [0,4194304] 0 2026-03-09T19:26:57.269 INFO:tasks.workunit.client.1.vm08.stdout:7/155: creat d5/d14/d27/f35 x:0 0 0 2026-03-09T19:26:57.269 INFO:tasks.workunit.client.1.vm08.stdout:7/156: read d5/d16/f1f [2093952,51383] 0 2026-03-09T19:26:57.275 INFO:tasks.workunit.client.1.vm08.stdout:8/196: creat de/f3d x:0 0 0 2026-03-09T19:26:57.277 INFO:tasks.workunit.client.1.vm08.stdout:1/233: unlink d9/f22 0 2026-03-09T19:26:57.277 INFO:tasks.workunit.client.1.vm08.stdout:1/234: read - d9/d11/f3c zero size 2026-03-09T19:26:57.280 INFO:tasks.workunit.client.1.vm08.stdout:0/143: symlink dd/d22/d27/l2a 0 2026-03-09T19:26:57.294 INFO:tasks.workunit.client.1.vm08.stdout:5/137: getdents d16 0 2026-03-09T19:26:57.294 INFO:tasks.workunit.client.1.vm08.stdout:5/138: readlink lb 0 2026-03-09T19:26:57.294 INFO:tasks.workunit.client.1.vm08.stdout:7/157: truncate d5/f1a 1073618 0 2026-03-09T19:26:57.294 INFO:tasks.workunit.client.1.vm08.stdout:0/144: dwrite fb [0,4194304] 0 2026-03-09T19:26:57.294 INFO:tasks.workunit.client.1.vm08.stdout:4/157: getdents da/d10/d1b/d23 0 2026-03-09T19:26:57.294 INFO:tasks.workunit.client.1.vm08.stdout:1/235: dread d9/d11/f29 [0,4194304] 0 2026-03-09T19:26:57.294 INFO:tasks.workunit.client.1.vm08.stdout:1/236: fdatasync d9/da/dc/f1d 0 2026-03-09T19:26:57.294 INFO:tasks.workunit.client.1.vm08.stdout:1/237: truncate d9/da/f30 1507425 0 2026-03-09T19:26:57.294 INFO:tasks.workunit.client.1.vm08.stdout:1/238: write d9/d11/f44 [202762,10090] 0 2026-03-09T19:26:57.294 INFO:tasks.workunit.client.1.vm08.stdout:1/239: dread - d9/f48 zero size 2026-03-09T19:26:57.302 INFO:tasks.workunit.client.1.vm08.stdout:2/175: rmdir d3/d4/d3c 0 2026-03-09T19:26:57.302 INFO:tasks.workunit.client.1.vm08.stdout:2/176: chown d3/d9/dc/d14/f2b 2940 1 2026-03-09T19:26:57.310 INFO:tasks.workunit.client.1.vm08.stdout:9/136: rename d0/d3/f5 to d0/d2/f2f 0 2026-03-09T19:26:57.311 
INFO:tasks.workunit.client.1.vm08.stdout:1/240: dread d9/da/f1e [0,4194304] 0 2026-03-09T19:26:57.314 INFO:tasks.workunit.client.1.vm08.stdout:2/177: mkdir d3/d4/d23/d2c/d41 0 2026-03-09T19:26:57.315 INFO:tasks.workunit.client.1.vm08.stdout:2/178: stat d3/d9/f20 0 2026-03-09T19:26:57.315 INFO:tasks.workunit.client.1.vm08.stdout:3/194: link d0/c27 d0/c35 0 2026-03-09T19:26:57.316 INFO:tasks.workunit.client.1.vm08.stdout:3/195: write d0/f28 [2253971,35727] 0 2026-03-09T19:26:57.319 INFO:tasks.workunit.client.1.vm08.stdout:3/196: dread d0/d6/de/d1b/d16/f21 [0,4194304] 0 2026-03-09T19:26:57.322 INFO:tasks.workunit.client.1.vm08.stdout:3/197: dread d0/d6/de/d1b/d16/f21 [0,4194304] 0 2026-03-09T19:26:57.323 INFO:tasks.workunit.client.1.vm08.stdout:8/197: link de/c12 de/d25/d33/c3e 0 2026-03-09T19:26:57.325 INFO:tasks.workunit.client.1.vm08.stdout:4/158: symlink da/l2a 0 2026-03-09T19:26:57.326 INFO:tasks.workunit.client.1.vm08.stdout:2/179: mknod d3/d9/d26/c42 0 2026-03-09T19:26:57.328 INFO:tasks.workunit.client.1.vm08.stdout:0/145: creat dd/d22/f2b x:0 0 0 2026-03-09T19:26:57.329 INFO:tasks.workunit.client.1.vm08.stdout:3/198: creat d0/d6/de/d1b/d16/d17/f36 x:0 0 0 2026-03-09T19:26:57.329 INFO:tasks.workunit.client.1.vm08.stdout:3/199: readlink d0/d6/de/d1b/d16/l26 0 2026-03-09T19:26:57.330 INFO:tasks.workunit.client.1.vm08.stdout:8/198: mknod de/d2c/c3f 0 2026-03-09T19:26:57.333 INFO:tasks.workunit.client.1.vm08.stdout:8/199: dwrite de/d1d/d21/f23 [0,4194304] 0 2026-03-09T19:26:57.338 INFO:tasks.workunit.client.1.vm08.stdout:1/241: mkdir d9/d40/d49 0 2026-03-09T19:26:57.338 INFO:tasks.workunit.client.1.vm08.stdout:2/180: symlink d3/d9/dc/de/d18/d1f/l43 0 2026-03-09T19:26:57.352 INFO:tasks.workunit.client.1.vm08.stdout:9/137: link d0/d2/d8/d7/c1e d0/d2/d14/c30 0 2026-03-09T19:26:57.354 INFO:tasks.workunit.client.1.vm08.stdout:1/242: mknod d9/d40/c4a 0 2026-03-09T19:26:57.355 INFO:tasks.workunit.client.1.vm08.stdout:0/146: truncate dd/d22/d24/f26 1417389 0 
2026-03-09T19:26:57.356 INFO:tasks.workunit.client.1.vm08.stdout:0/147: write dd/d22/f29 [127634,15047] 0 2026-03-09T19:26:57.358 INFO:tasks.workunit.client.1.vm08.stdout:9/138: creat d0/d2/d14/f31 x:0 0 0 2026-03-09T19:26:57.363 INFO:tasks.workunit.client.1.vm08.stdout:9/139: dread d0/d3/fd [0,4194304] 0 2026-03-09T19:26:57.363 INFO:tasks.workunit.client.1.vm08.stdout:4/159: rename da/d14/c20 to da/c2b 0 2026-03-09T19:26:57.363 INFO:tasks.workunit.client.1.vm08.stdout:3/200: rename d0 to d0/d6/de/d15/d37 22 2026-03-09T19:26:57.363 INFO:tasks.workunit.client.1.vm08.stdout:1/243: symlink d9/da/d12/d39/l4b 0 2026-03-09T19:26:57.366 INFO:tasks.workunit.client.1.vm08.stdout:0/148: dread dd/f18 [0,4194304] 0 2026-03-09T19:26:57.370 INFO:tasks.workunit.client.1.vm08.stdout:9/140: mkdir d0/d3/d32 0 2026-03-09T19:26:57.371 INFO:tasks.workunit.client.1.vm08.stdout:6/161: write d3/db/d24/f2f [4545696,107388] 0 2026-03-09T19:26:57.372 INFO:tasks.workunit.client.1.vm08.stdout:6/162: write d3/db/f14 [1220838,128274] 0 2026-03-09T19:26:57.376 INFO:tasks.workunit.client.1.vm08.stdout:2/181: rename d3/d9/dc/de/d18/f29 to d3/d9/dc/d14/f44 0 2026-03-09T19:26:57.377 INFO:tasks.workunit.client.1.vm08.stdout:3/201: creat d0/d8/d19/f38 x:0 0 0 2026-03-09T19:26:57.387 INFO:tasks.workunit.client.1.vm08.stdout:5/139: truncate d16/f18 229225 0 2026-03-09T19:26:57.387 INFO:tasks.workunit.client.1.vm08.stdout:5/140: fdatasync d16/f2a 0 2026-03-09T19:26:57.388 INFO:tasks.workunit.client.1.vm08.stdout:5/141: write d16/f2a [614309,85953] 0 2026-03-09T19:26:57.390 INFO:tasks.workunit.client.1.vm08.stdout:0/149: creat dd/f2c x:0 0 0 2026-03-09T19:26:57.392 INFO:tasks.workunit.client.1.vm08.stdout:9/141: write d0/d3/fd [1974866,65561] 0 2026-03-09T19:26:57.395 INFO:tasks.workunit.client.1.vm08.stdout:6/163: mknod d3/db/d24/c36 0 2026-03-09T19:26:57.399 INFO:tasks.workunit.client.1.vm08.stdout:6/164: dwrite d3/db/f14 [0,4194304] 0 2026-03-09T19:26:57.403 
INFO:tasks.workunit.client.1.vm08.stdout:8/200: sync 2026-03-09T19:26:57.408 INFO:tasks.workunit.client.1.vm08.stdout:5/142: write f15 [3493073,108465] 0 2026-03-09T19:26:57.410 INFO:tasks.workunit.client.1.vm08.stdout:0/150: symlink dd/d22/d27/l2d 0 2026-03-09T19:26:57.415 INFO:tasks.workunit.client.1.vm08.stdout:7/158: truncate d5/d14/f1e 567561 0 2026-03-09T19:26:57.420 INFO:tasks.workunit.client.1.vm08.stdout:9/142: mknod d0/d2/d8/d7/c33 0 2026-03-09T19:26:57.425 INFO:tasks.workunit.client.1.vm08.stdout:9/143: dwrite d0/d2/d14/f17 [0,4194304] 0 2026-03-09T19:26:57.444 INFO:tasks.workunit.client.1.vm08.stdout:8/201: mknod de/d1d/d21/c40 0 2026-03-09T19:26:57.447 INFO:tasks.workunit.client.1.vm08.stdout:2/182: fsync d3/d9/dc/de/f32 0 2026-03-09T19:26:57.451 INFO:tasks.workunit.client.1.vm08.stdout:8/202: dwrite de/d1d/d21/f30 [0,4194304] 0 2026-03-09T19:26:57.452 INFO:tasks.workunit.client.1.vm08.stdout:6/165: rmdir d3 39 2026-03-09T19:26:57.458 INFO:tasks.workunit.client.1.vm08.stdout:4/160: write da/d10/f1c [375249,116954] 0 2026-03-09T19:26:57.459 INFO:tasks.workunit.client.1.vm08.stdout:4/161: write da/f18 [867344,88592] 0 2026-03-09T19:26:57.465 INFO:tasks.workunit.client.1.vm08.stdout:0/151: unlink f7 0 2026-03-09T19:26:57.471 INFO:tasks.workunit.client.1.vm08.stdout:9/144: creat d0/d2/d8/d7/f34 x:0 0 0 2026-03-09T19:26:57.472 INFO:tasks.workunit.client.1.vm08.stdout:9/145: fsync d0/d2/d14/f17 0 2026-03-09T19:26:57.484 INFO:tasks.workunit.client.1.vm08.stdout:2/183: dwrite d3/d9/dc/de/d18/d1f/f3a [0,4194304] 0 2026-03-09T19:26:57.486 INFO:tasks.workunit.client.1.vm08.stdout:2/184: dread - d3/d9/f1e zero size 2026-03-09T19:26:57.495 INFO:tasks.workunit.client.1.vm08.stdout:1/244: getdents d9/da/d2c 0 2026-03-09T19:26:57.495 INFO:tasks.workunit.client.1.vm08.stdout:1/245: chown d9/da/dc/f1d 366171 1 2026-03-09T19:26:57.496 INFO:tasks.workunit.client.1.vm08.stdout:1/246: chown d9/da/l23 3148 1 2026-03-09T19:26:57.496 
INFO:tasks.workunit.client.1.vm08.stdout:1/247: write d9/da/dc/f31 [3787058,123354] 0 2026-03-09T19:26:57.500 INFO:tasks.workunit.client.1.vm08.stdout:4/162: mkdir da/d14/d2c 0 2026-03-09T19:26:57.503 INFO:tasks.workunit.client.1.vm08.stdout:3/202: rename d0/d6/de/d1b/d16/f21 to d0/d6/f39 0 2026-03-09T19:26:57.506 INFO:tasks.workunit.client.1.vm08.stdout:0/152: mkdir dd/d22/d27/d2e 0 2026-03-09T19:26:57.509 INFO:tasks.workunit.client.1.vm08.stdout:9/146: creat d0/d2/d8/d7/f35 x:0 0 0 2026-03-09T19:26:57.511 INFO:tasks.workunit.client.1.vm08.stdout:8/203: link de/f20 de/d25/d33/f41 0 2026-03-09T19:26:57.515 INFO:tasks.workunit.client.1.vm08.stdout:6/166: unlink d3/l33 0 2026-03-09T19:26:57.518 INFO:tasks.workunit.client.1.vm08.stdout:1/248: mknod d9/da/d12/c4c 0 2026-03-09T19:26:57.528 INFO:tasks.workunit.client.1.vm08.stdout:4/163: dread f9 [0,4194304] 0 2026-03-09T19:26:57.530 INFO:tasks.workunit.client.1.vm08.stdout:3/203: mknod d0/d8/d19/c3a 0 2026-03-09T19:26:57.531 INFO:tasks.workunit.client.1.vm08.stdout:5/143: dread d16/f18 [0,4194304] 0 2026-03-09T19:26:57.533 INFO:tasks.workunit.client.1.vm08.stdout:0/153: symlink dd/d22/d24/l2f 0 2026-03-09T19:26:57.535 INFO:tasks.workunit.client.1.vm08.stdout:0/154: dread dd/f13 [0,4194304] 0 2026-03-09T19:26:57.539 INFO:tasks.workunit.client.1.vm08.stdout:4/164: truncate f9 5113376 0 2026-03-09T19:26:57.541 INFO:tasks.workunit.client.1.vm08.stdout:7/159: rename d5/d14/d27/c2e to d5/d16/d1c/c36 0 2026-03-09T19:26:57.543 INFO:tasks.workunit.client.1.vm08.stdout:5/144: symlink d16/l2f 0 2026-03-09T19:26:57.548 INFO:tasks.workunit.client.1.vm08.stdout:6/167: link d3/d15/f2b d3/d34/f37 0 2026-03-09T19:26:57.555 INFO:tasks.workunit.client.1.vm08.stdout:2/185: creat d3/f45 x:0 0 0 2026-03-09T19:26:57.555 INFO:tasks.workunit.client.1.vm08.stdout:2/186: chown d3/d4/d3e 0 1 2026-03-09T19:26:57.556 INFO:tasks.workunit.client.1.vm08.stdout:2/187: chown d3/d9/d26/l37 269481052 1 2026-03-09T19:26:57.556 
INFO:tasks.workunit.client.1.vm08.stdout:8/204: truncate de/d1d/d21/f30 2541523 0 2026-03-09T19:26:57.557 INFO:tasks.workunit.client.1.vm08.stdout:8/205: write de/d25/d31/f36 [497912,43987] 0 2026-03-09T19:26:57.563 INFO:tasks.workunit.client.1.vm08.stdout:0/155: rename dd/d22/d27/l2a to dd/l30 0 2026-03-09T19:26:57.567 INFO:tasks.workunit.client.1.vm08.stdout:1/249: dwrite f2 [0,4194304] 0 2026-03-09T19:26:57.568 INFO:tasks.workunit.client.1.vm08.stdout:1/250: chown d9/l15 366737 1 2026-03-09T19:26:57.572 INFO:tasks.workunit.client.1.vm08.stdout:7/160: creat d5/d14/d2b/f37 x:0 0 0 2026-03-09T19:26:57.572 INFO:tasks.workunit.client.1.vm08.stdout:5/145: mkdir d16/d1e/d30 0 2026-03-09T19:26:57.583 INFO:tasks.workunit.client.1.vm08.stdout:9/147: creat d0/d2/f36 x:0 0 0 2026-03-09T19:26:57.585 INFO:tasks.workunit.client.1.vm08.stdout:6/168: rmdir d3/db/d24 39 2026-03-09T19:26:57.588 INFO:tasks.workunit.client.1.vm08.stdout:8/206: chown de/d25/d33/c3e 21691283 1 2026-03-09T19:26:57.588 INFO:tasks.workunit.client.1.vm08.stdout:8/207: fsync de/f19 0 2026-03-09T19:26:57.589 INFO:tasks.workunit.client.1.vm08.stdout:8/208: truncate de/f3d 1006829 0 2026-03-09T19:26:57.591 INFO:tasks.workunit.client.1.vm08.stdout:4/165: mkdir da/d10/d26/d27/d2d 0 2026-03-09T19:26:57.597 INFO:tasks.workunit.client.1.vm08.stdout:0/156: dwrite dd/d22/f28 [0,4194304] 0 2026-03-09T19:26:57.598 INFO:tasks.workunit.client.1.vm08.stdout:0/157: chown fb 5 1 2026-03-09T19:26:57.603 INFO:tasks.workunit.client.1.vm08.stdout:3/204: write d0/d8/d24/f2d [2656939,57146] 0 2026-03-09T19:26:57.610 INFO:tasks.workunit.client.1.vm08.stdout:5/146: unlink d16/f1c 0 2026-03-09T19:26:57.612 INFO:tasks.workunit.client.1.vm08.stdout:2/188: mkdir d3/d4/d23/d2c/d41/d46 0 2026-03-09T19:26:57.613 INFO:tasks.workunit.client.1.vm08.stdout:2/189: write d3/d9/dc/de/d18/f2d [989268,31747] 0 2026-03-09T19:26:57.614 INFO:tasks.workunit.client.1.vm08.stdout:4/166: readlink da/d14/l19 0 2026-03-09T19:26:57.614 
INFO:tasks.workunit.client.1.vm08.stdout:4/167: chown f5 33565057 1 2026-03-09T19:26:57.616 INFO:tasks.workunit.client.1.vm08.stdout:0/158: mkdir dd/d31 0 2026-03-09T19:26:57.621 INFO:tasks.workunit.client.1.vm08.stdout:9/148: mknod d0/d1b/c37 0 2026-03-09T19:26:57.637 INFO:tasks.workunit.client.1.vm08.stdout:6/169: symlink d3/db/d24/l38 0 2026-03-09T19:26:57.637 INFO:tasks.workunit.client.1.vm08.stdout:6/170: readlink d3/db/d24/l38 0 2026-03-09T19:26:57.637 INFO:tasks.workunit.client.1.vm08.stdout:4/168: creat da/d10/f2e x:0 0 0 2026-03-09T19:26:57.637 INFO:tasks.workunit.client.1.vm08.stdout:6/171: creat d3/db/d24/f39 x:0 0 0 2026-03-09T19:26:57.637 INFO:tasks.workunit.client.1.vm08.stdout:3/205: dread d0/d6/de/d1b/d16/d17/f1d [0,4194304] 0 2026-03-09T19:26:57.637 INFO:tasks.workunit.client.1.vm08.stdout:0/159: mknod dd/d31/c32 0 2026-03-09T19:26:57.637 INFO:tasks.workunit.client.1.vm08.stdout:3/206: dwrite d0/d6/de/d1b/d16/d18/f2c [0,4194304] 0 2026-03-09T19:26:57.637 INFO:tasks.workunit.client.1.vm08.stdout:5/147: link d16/d1e/c26 d16/d1e/c31 0 2026-03-09T19:26:57.637 INFO:tasks.workunit.client.1.vm08.stdout:5/148: chown d16 51601489 1 2026-03-09T19:26:57.637 INFO:tasks.workunit.client.1.vm08.stdout:3/207: dread - d0/d8/d19/f38 zero size 2026-03-09T19:26:57.638 INFO:tasks.workunit.client.1.vm08.stdout:5/149: write f15 [1273466,30900] 0 2026-03-09T19:26:57.638 INFO:tasks.workunit.client.1.vm08.stdout:3/208: write d0/d8/d19/f38 [981041,59593] 0 2026-03-09T19:26:57.641 INFO:tasks.workunit.client.1.vm08.stdout:2/190: creat d3/d4/f47 x:0 0 0 2026-03-09T19:26:57.646 INFO:tasks.workunit.client.1.vm08.stdout:3/209: dread d0/d6/de/d1b/d16/d17/f1d [0,4194304] 0 2026-03-09T19:26:57.647 INFO:tasks.workunit.client.1.vm08.stdout:5/150: dwrite d16/d1e/f27 [0,4194304] 0 2026-03-09T19:26:57.647 INFO:tasks.workunit.client.1.vm08.stdout:8/209: getdents de/d25/d31 0 2026-03-09T19:26:57.647 INFO:tasks.workunit.client.1.vm08.stdout:2/191: dread d3/d4/f6 [0,4194304] 0 
2026-03-09T19:26:57.651 INFO:tasks.workunit.client.1.vm08.stdout:5/151: read - d16/d1e/f2e zero size 2026-03-09T19:26:57.657 INFO:tasks.workunit.client.1.vm08.stdout:8/210: dread de/d1d/f27 [0,4194304] 0 2026-03-09T19:26:57.665 INFO:tasks.workunit.client.1.vm08.stdout:5/152: symlink d16/l32 0 2026-03-09T19:26:57.666 INFO:tasks.workunit.client.1.vm08.stdout:5/153: dread d16/f18 [0,4194304] 0 2026-03-09T19:26:57.668 INFO:tasks.workunit.client.1.vm08.stdout:0/160: fdatasync fc 0 2026-03-09T19:26:57.669 INFO:tasks.workunit.client.1.vm08.stdout:9/149: getdents d0/d2/d14 0 2026-03-09T19:26:57.669 INFO:tasks.workunit.client.1.vm08.stdout:9/150: chown d0/d2/f2a 352 1 2026-03-09T19:26:57.670 INFO:tasks.workunit.client.1.vm08.stdout:8/211: creat de/d25/f42 x:0 0 0 2026-03-09T19:26:57.673 INFO:tasks.workunit.client.1.vm08.stdout:6/172: rename d3/fe to d3/db/f3a 0 2026-03-09T19:26:57.676 INFO:tasks.workunit.client.1.vm08.stdout:0/161: mknod dd/d22/d24/c33 0 2026-03-09T19:26:57.678 INFO:tasks.workunit.client.1.vm08.stdout:8/212: mknod de/d1d/d21/c43 0 2026-03-09T19:26:57.684 INFO:tasks.workunit.client.1.vm08.stdout:2/192: creat d3/d4/f48 x:0 0 0 2026-03-09T19:26:57.684 INFO:tasks.workunit.client.1.vm08.stdout:2/193: dread - d3/f45 zero size 2026-03-09T19:26:57.691 INFO:tasks.workunit.client.1.vm08.stdout:1/251: write d9/da/f1e [5150346,5373] 0 2026-03-09T19:26:57.696 INFO:tasks.workunit.client.1.vm08.stdout:7/161: write d5/f1a [1388752,127683] 0 2026-03-09T19:26:57.698 INFO:tasks.workunit.client.1.vm08.stdout:5/154: creat d16/d1e/d30/f33 x:0 0 0 2026-03-09T19:26:57.709 INFO:tasks.workunit.client.1.vm08.stdout:0/162: creat dd/d31/f34 x:0 0 0 2026-03-09T19:26:57.715 INFO:tasks.workunit.client.1.vm08.stdout:9/151: creat d0/d3/d32/f38 x:0 0 0 2026-03-09T19:26:57.715 INFO:tasks.workunit.client.1.vm08.stdout:3/210: link d0/d6/de/c30 d0/d6/de/d1a/d33/c3b 0 2026-03-09T19:26:57.720 INFO:tasks.workunit.client.1.vm08.stdout:4/169: dwrite da/f21 [0,4194304] 0 2026-03-09T19:26:57.727 
INFO:tasks.workunit.client.1.vm08.stdout:1/252: mknod d9/d11/c4d 0 2026-03-09T19:26:57.727 INFO:tasks.workunit.client.1.vm08.stdout:7/162: mkdir d5/d14/d38 0 2026-03-09T19:26:57.732 INFO:tasks.workunit.client.1.vm08.stdout:1/253: dwrite d9/da/dc/f31 [4194304,4194304] 0 2026-03-09T19:26:57.754 INFO:tasks.workunit.client.1.vm08.stdout:1/254: write d9/f48 [663361,20228] 0 2026-03-09T19:26:57.754 INFO:tasks.workunit.client.1.vm08.stdout:9/152: unlink d0/d2/d14/f17 0 2026-03-09T19:26:57.754 INFO:tasks.workunit.client.1.vm08.stdout:9/153: dread d0/d2/f2f [0,4194304] 0 2026-03-09T19:26:57.754 INFO:tasks.workunit.client.1.vm08.stdout:3/211: symlink d0/d6/d25/l3c 0 2026-03-09T19:26:57.755 INFO:tasks.workunit.client.1.vm08.stdout:4/170: rename da/d10/d26/d27/d2d to da/d10/d16/d28/d2f 0 2026-03-09T19:26:57.755 INFO:tasks.workunit.client.1.vm08.stdout:4/171: chown da/d10/f25 827 1 2026-03-09T19:26:57.755 INFO:tasks.workunit.client.1.vm08.stdout:4/172: write da/d10/d1b/f29 [964656,121591] 0 2026-03-09T19:26:57.755 INFO:tasks.workunit.client.1.vm08.stdout:4/173: read da/d10/f1c [44091,70789] 0 2026-03-09T19:26:57.755 INFO:tasks.workunit.client.1.vm08.stdout:4/174: chown da/d10/f1c 6901625 1 2026-03-09T19:26:57.755 INFO:tasks.workunit.client.1.vm08.stdout:4/175: write da/f18 [1100104,93204] 0 2026-03-09T19:26:57.755 INFO:tasks.workunit.client.1.vm08.stdout:1/255: unlink d9/da/d2d/c3e 0 2026-03-09T19:26:57.756 INFO:tasks.workunit.client.1.vm08.stdout:9/154: symlink d0/d2/d8/d7/l39 0 2026-03-09T19:26:57.760 INFO:tasks.workunit.client.1.vm08.stdout:9/155: dwrite d0/d2/d14/f28 [0,4194304] 0 2026-03-09T19:26:57.768 INFO:tasks.workunit.client.1.vm08.stdout:9/156: dwrite d0/d2/f1d [0,4194304] 0 2026-03-09T19:26:57.770 INFO:tasks.workunit.client.1.vm08.stdout:1/256: mkdir d9/da/d2d/d4e 0 2026-03-09T19:26:57.770 INFO:tasks.workunit.client.1.vm08.stdout:0/163: link dd/c1d dd/d31/c35 0 2026-03-09T19:26:57.770 INFO:tasks.workunit.client.1.vm08.stdout:1/257: dread - d9/da/d2d/f41 zero size 
2026-03-09T19:26:57.773 INFO:tasks.workunit.client.1.vm08.stdout:1/258: dread d9/d11/f29 [0,4194304] 0 2026-03-09T19:26:57.784 INFO:tasks.workunit.client.1.vm08.stdout:3/212: symlink d0/d6/d25/d2a/l3d 0 2026-03-09T19:26:57.786 INFO:tasks.workunit.client.1.vm08.stdout:0/164: mknod dd/d31/c36 0 2026-03-09T19:26:57.789 INFO:tasks.workunit.client.1.vm08.stdout:9/157: symlink d0/d2/d8/l3a 0 2026-03-09T19:26:57.789 INFO:tasks.workunit.client.1.vm08.stdout:9/158: write d0/d2/d8/f2d [924419,15170] 0 2026-03-09T19:26:57.793 INFO:tasks.workunit.client.1.vm08.stdout:1/259: getdents d9/da/d2c 0 2026-03-09T19:26:57.799 INFO:tasks.workunit.client.1.vm08.stdout:1/260: write d9/da/f1e [5035741,5861] 0 2026-03-09T19:26:57.799 INFO:tasks.workunit.client.1.vm08.stdout:1/261: dread d9/f48 [0,4194304] 0 2026-03-09T19:26:57.799 INFO:tasks.workunit.client.1.vm08.stdout:5/155: sync 2026-03-09T19:26:57.807 INFO:tasks.workunit.client.1.vm08.stdout:9/159: dwrite d0/d2/d8/f29 [0,4194304] 0 2026-03-09T19:26:57.808 INFO:tasks.workunit.client.1.vm08.stdout:4/176: sync 2026-03-09T19:26:57.813 INFO:tasks.workunit.client.1.vm08.stdout:4/177: dwrite da/d10/d1b/f29 [0,4194304] 0 2026-03-09T19:26:57.829 INFO:tasks.workunit.client.1.vm08.stdout:1/262: mknod d9/d11/c4f 0 2026-03-09T19:26:57.829 INFO:tasks.workunit.client.1.vm08.stdout:1/263: read - d9/d11/f3c zero size 2026-03-09T19:26:57.834 INFO:tasks.workunit.client.1.vm08.stdout:1/264: dwrite d9/da/f30 [0,4194304] 0 2026-03-09T19:26:57.843 INFO:tasks.workunit.client.1.vm08.stdout:5/156: chown d16/d1e/c24 319582 1 2026-03-09T19:26:57.846 INFO:tasks.workunit.client.1.vm08.stdout:5/157: dwrite d16/f17 [0,4194304] 0 2026-03-09T19:26:57.847 INFO:tasks.workunit.client.1.vm08.stdout:5/158: stat d16 0 2026-03-09T19:26:57.847 INFO:tasks.workunit.client.1.vm08.stdout:5/159: readlink d16/d1e/l2d 0 2026-03-09T19:26:57.848 INFO:tasks.workunit.client.1.vm08.stdout:5/160: chown d16/l2f 678572 1 2026-03-09T19:26:57.848 
INFO:tasks.workunit.client.1.vm08.stdout:9/160: creat d0/d2/d14/f3b x:0 0 0 2026-03-09T19:26:57.849 INFO:tasks.workunit.client.1.vm08.stdout:9/161: write d0/d2/d8/f29 [1858176,30905] 0 2026-03-09T19:26:57.850 INFO:tasks.workunit.client.1.vm08.stdout:9/162: write d0/d3/d32/f38 [219634,95059] 0 2026-03-09T19:26:57.858 INFO:tasks.workunit.client.1.vm08.stdout:9/163: stat d0/l2e 0 2026-03-09T19:26:57.858 INFO:tasks.workunit.client.1.vm08.stdout:6/173: dwrite d3/d15/f1e [0,4194304] 0 2026-03-09T19:26:57.881 INFO:tasks.workunit.client.1.vm08.stdout:1/265: creat d9/da/d2d/f50 x:0 0 0 2026-03-09T19:26:57.881 INFO:tasks.workunit.client.1.vm08.stdout:1/266: chown d9/d11/c26 198309 1 2026-03-09T19:26:57.882 INFO:tasks.workunit.client.1.vm08.stdout:1/267: readlink d9/da/d2c/l35 0 2026-03-09T19:26:57.885 INFO:tasks.workunit.client.1.vm08.stdout:8/213: dwrite de/f1b [4194304,4194304] 0 2026-03-09T19:26:57.888 INFO:tasks.workunit.client.1.vm08.stdout:8/214: write de/d1d/d21/f23 [1132416,112322] 0 2026-03-09T19:26:57.891 INFO:tasks.workunit.client.1.vm08.stdout:8/215: write de/f37 [498390,24011] 0 2026-03-09T19:26:57.895 INFO:tasks.workunit.client.1.vm08.stdout:2/194: dwrite d3/f7 [0,4194304] 0 2026-03-09T19:26:57.902 INFO:tasks.workunit.client.1.vm08.stdout:5/161: creat d16/f34 x:0 0 0 2026-03-09T19:26:57.906 INFO:tasks.workunit.client.1.vm08.stdout:5/162: chown d16 976361978 1 2026-03-09T19:26:57.906 INFO:tasks.workunit.client.1.vm08.stdout:2/195: dread d3/f7 [0,4194304] 0 2026-03-09T19:26:57.906 INFO:tasks.workunit.client.1.vm08.stdout:9/164: chown d0/fa 2 1 2026-03-09T19:26:57.906 INFO:tasks.workunit.client.1.vm08.stdout:9/165: stat d0/d2/d8/d7 0 2026-03-09T19:26:57.914 INFO:tasks.workunit.client.1.vm08.stdout:9/166: dwrite d0/d2/d14/f31 [0,4194304] 0 2026-03-09T19:26:57.923 INFO:tasks.workunit.client.1.vm08.stdout:4/178: symlink da/d10/d1b/d23/l30 0 2026-03-09T19:26:57.923 INFO:tasks.workunit.client.1.vm08.stdout:4/179: stat da/lc 0 2026-03-09T19:26:57.923 
INFO:tasks.workunit.client.1.vm08.stdout:4/180: write da/d10/f2e [361076,81398] 0 2026-03-09T19:26:57.924 INFO:tasks.workunit.client.1.vm08.stdout:1/268: creat d9/da/d12/d39/f51 x:0 0 0 2026-03-09T19:26:57.929 INFO:tasks.workunit.client.1.vm08.stdout:7/163: truncate d5/f20 1146314 0 2026-03-09T19:26:57.940 INFO:tasks.workunit.client.1.vm08.stdout:3/213: write d0/d6/f39 [2941305,63190] 0 2026-03-09T19:26:57.945 INFO:tasks.workunit.client.1.vm08.stdout:0/165: truncate dd/f16 3789195 0 2026-03-09T19:26:57.946 INFO:tasks.workunit.client.1.vm08.stdout:3/214: truncate d0/d8/d24/f2d 4358587 0 2026-03-09T19:26:57.946 INFO:tasks.workunit.client.1.vm08.stdout:0/166: dwrite dd/f2c [0,4194304] 0 2026-03-09T19:26:57.947 INFO:tasks.workunit.client.1.vm08.stdout:5/163: rmdir d16/d1e 39 2026-03-09T19:26:57.954 INFO:tasks.workunit.client.1.vm08.stdout:9/167: write d0/d2/f21 [3264866,50870] 0 2026-03-09T19:26:57.954 INFO:tasks.workunit.client.1.vm08.stdout:9/168: chown d0/d2/f2f 58 1 2026-03-09T19:26:57.955 INFO:tasks.workunit.client.1.vm08.stdout:9/169: write d0/d2/d8/d7/f23 [435839,47170] 0 2026-03-09T19:26:57.959 INFO:tasks.workunit.client.1.vm08.stdout:6/174: mkdir d3/d34/d3b 0 2026-03-09T19:26:57.964 INFO:tasks.workunit.client.1.vm08.stdout:8/216: symlink de/l44 0 2026-03-09T19:26:57.972 INFO:tasks.workunit.client.1.vm08.stdout:4/181: sync 2026-03-09T19:26:57.972 INFO:tasks.workunit.client.1.vm08.stdout:3/215: sync 2026-03-09T19:26:57.972 INFO:tasks.workunit.client.1.vm08.stdout:1/269: sync 2026-03-09T19:26:57.973 INFO:tasks.workunit.client.1.vm08.stdout:1/270: truncate d9/da/f2f 1076696 0 2026-03-09T19:26:57.977 INFO:tasks.workunit.client.1.vm08.stdout:1/271: dwrite d9/da/dc/f31 [0,4194304] 0 2026-03-09T19:26:57.991 INFO:tasks.workunit.client.1.vm08.stdout:0/167: mkdir dd/d22/d27/d2e/d37 0 2026-03-09T19:26:57.991 INFO:tasks.workunit.client.1.vm08.stdout:1/272: sync 2026-03-09T19:26:57.994 INFO:tasks.workunit.client.1.vm08.stdout:5/164: dwrite d16/d1e/f25 [4194304,4194304] 0 
2026-03-09T19:26:57.994 INFO:tasks.workunit.client.1.vm08.stdout:5/165: fdatasync f15 0 2026-03-09T19:26:57.995 INFO:tasks.workunit.client.1.vm08.stdout:1/273: chown d9/da/dc/f1d 56 1 2026-03-09T19:26:57.995 INFO:tasks.workunit.client.1.vm08.stdout:4/182: symlink da/d10/d26/d27/l31 0 2026-03-09T19:26:57.995 INFO:tasks.workunit.client.1.vm08.stdout:3/216: symlink d0/d6/de/d1b/d16/d18/l3e 0 2026-03-09T19:26:57.997 INFO:tasks.workunit.client.1.vm08.stdout:2/196: rename f1 to d3/d4/f49 0 2026-03-09T19:26:57.999 INFO:tasks.workunit.client.1.vm08.stdout:6/175: mknod d3/d34/d3b/c3c 0 2026-03-09T19:26:58.006 INFO:tasks.workunit.client.1.vm08.stdout:2/197: write d3/d9/dc/de/d18/f2d [473416,19047] 0 2026-03-09T19:26:58.006 INFO:tasks.workunit.client.1.vm08.stdout:6/176: dread - d3/d34/f37 zero size 2026-03-09T19:26:58.006 INFO:tasks.workunit.client.1.vm08.stdout:6/177: write d3/db/f14 [228068,51411] 0 2026-03-09T19:26:58.007 INFO:tasks.workunit.client.1.vm08.stdout:9/170: symlink d0/d2/d8/l3c 0 2026-03-09T19:26:58.010 INFO:tasks.workunit.client.1.vm08.stdout:8/217: link de/f19 de/d1d/d21/f45 0 2026-03-09T19:26:58.027 INFO:tasks.workunit.client.1.vm08.stdout:6/178: dread d3/db/f3a [0,4194304] 0 2026-03-09T19:26:58.028 INFO:tasks.workunit.client.1.vm08.stdout:5/166: creat d16/d1e/f35 x:0 0 0 2026-03-09T19:26:58.029 INFO:tasks.workunit.client.1.vm08.stdout:5/167: write d16/f34 [328158,4717] 0 2026-03-09T19:26:58.033 INFO:tasks.workunit.client.1.vm08.stdout:4/183: mkdir da/d10/d26/d27/d32 0 2026-03-09T19:26:58.035 INFO:tasks.workunit.client.1.vm08.stdout:0/168: rename dd/d22/d24/l2f to dd/d22/d27/d2e/d37/l38 0 2026-03-09T19:26:58.038 INFO:tasks.workunit.client.1.vm08.stdout:6/179: rmdir d3/db/d24 39 2026-03-09T19:26:58.039 INFO:tasks.workunit.client.1.vm08.stdout:5/168: symlink d16/d1e/l36 0 2026-03-09T19:26:58.045 INFO:tasks.workunit.client.1.vm08.stdout:8/218: rename de/d2c to de/d25/d33/d46 0 2026-03-09T19:26:58.045 INFO:tasks.workunit.client.1.vm08.stdout:8/219: fdatasync 
de/f11 0 2026-03-09T19:26:58.051 INFO:tasks.workunit.client.1.vm08.stdout:0/169: creat dd/d22/d27/d2e/f39 x:0 0 0 2026-03-09T19:26:58.052 INFO:tasks.workunit.client.1.vm08.stdout:9/171: mknod d0/d2/c3d 0 2026-03-09T19:26:58.053 INFO:tasks.workunit.client.1.vm08.stdout:9/172: read d0/d2/d14/f28 [1749855,104639] 0 2026-03-09T19:26:58.055 INFO:tasks.workunit.client.1.vm08.stdout:0/170: dwrite dd/d22/f29 [0,4194304] 0 2026-03-09T19:26:58.058 INFO:tasks.workunit.client.1.vm08.stdout:5/169: creat d16/d1e/f37 x:0 0 0 2026-03-09T19:26:58.063 INFO:tasks.workunit.client.1.vm08.stdout:3/217: link d0/d6/de/d1b/d16/d17/f1d d0/d6/de/d1b/d16/d17/f3f 0 2026-03-09T19:26:58.064 INFO:tasks.workunit.client.1.vm08.stdout:3/218: fsync d0/d6/f39 0 2026-03-09T19:26:58.067 INFO:tasks.workunit.client.1.vm08.stdout:3/219: dread d0/d6/f39 [0,4194304] 0 2026-03-09T19:26:58.069 INFO:tasks.workunit.client.1.vm08.stdout:8/220: fdatasync de/d1d/d21/f45 0 2026-03-09T19:26:58.069 INFO:tasks.workunit.client.1.vm08.stdout:8/221: chown de/d25/f42 3788 1 2026-03-09T19:26:58.070 INFO:tasks.workunit.client.1.vm08.stdout:8/222: fdatasync de/f11 0 2026-03-09T19:26:58.086 INFO:tasks.workunit.client.1.vm08.stdout:9/173: creat d0/d1b/f3e x:0 0 0 2026-03-09T19:26:58.089 INFO:tasks.workunit.client.1.vm08.stdout:5/170: dread f1 [0,4194304] 0 2026-03-09T19:26:58.094 INFO:tasks.workunit.client.1.vm08.stdout:6/180: link d3/f25 d3/d15/f3d 0 2026-03-09T19:26:58.095 INFO:tasks.workunit.client.1.vm08.stdout:6/181: dread d3/f9 [0,4194304] 0 2026-03-09T19:26:58.096 INFO:tasks.workunit.client.1.vm08.stdout:0/171: stat dd/c1d 0 2026-03-09T19:26:58.106 INFO:tasks.workunit.client.1.vm08.stdout:2/198: getdents d3/d9/dc/de/d18/d1f 0 2026-03-09T19:26:58.108 INFO:tasks.workunit.client.1.vm08.stdout:1/274: getdents d9/da/d2d 0 2026-03-09T19:26:58.109 INFO:tasks.workunit.client.1.vm08.stdout:1/275: write d9/da/dc/f2e [1844919,57036] 0 2026-03-09T19:26:58.112 INFO:tasks.workunit.client.1.vm08.stdout:3/220: symlink d0/d8/d19/l40 0 
2026-03-09T19:26:58.112 INFO:tasks.workunit.client.1.vm08.stdout:1/276: dread d9/da/d2c/f38 [0,4194304] 0 2026-03-09T19:26:58.113 INFO:tasks.workunit.client.1.vm08.stdout:3/221: chown d0/d6/de/d1b/d16/d18/f2c 24587747 1 2026-03-09T19:26:58.118 INFO:tasks.workunit.client.1.vm08.stdout:7/164: fsync d5/fc 0 2026-03-09T19:26:58.130 INFO:tasks.workunit.client.1.vm08.stdout:9/174: creat d0/d2/d8/d7/f3f x:0 0 0 2026-03-09T19:26:58.131 INFO:tasks.workunit.client.1.vm08.stdout:5/171: mknod d16/d1e/c38 0 2026-03-09T19:26:58.134 INFO:tasks.workunit.client.1.vm08.stdout:0/172: creat dd/d22/d27/d2e/d37/f3a x:0 0 0 2026-03-09T19:26:58.151 INFO:tasks.workunit.client.1.vm08.stdout:1/277: creat d9/da/d12/d39/f52 x:0 0 0 2026-03-09T19:26:58.155 INFO:tasks.workunit.client.1.vm08.stdout:4/184: dwrite da/d10/d1b/f29 [4194304,4194304] 0 2026-03-09T19:26:58.169 INFO:tasks.workunit.client.1.vm08.stdout:7/165: fdatasync d5/f20 0 2026-03-09T19:26:58.174 INFO:tasks.workunit.client.1.vm08.stdout:8/223: mkdir de/d47 0 2026-03-09T19:26:58.174 INFO:tasks.workunit.client.1.vm08.stdout:9/175: creat d0/d3/d32/f40 x:0 0 0 2026-03-09T19:26:58.174 INFO:tasks.workunit.client.1.vm08.stdout:9/176: readlink d0/d2/d8/l3c 0 2026-03-09T19:26:58.179 INFO:tasks.workunit.client.1.vm08.stdout:9/177: dwrite d0/d2/d8/d7/f22 [0,4194304] 0 2026-03-09T19:26:58.179 INFO:tasks.workunit.client.1.vm08.stdout:9/178: readlink d0/d2/d8/l25 0 2026-03-09T19:26:58.187 INFO:tasks.workunit.client.1.vm08.stdout:0/173: write dd/f13 [4196795,122022] 0 2026-03-09T19:26:58.202 INFO:tasks.workunit.client.1.vm08.stdout:4/185: truncate da/f1d 2625922 0 2026-03-09T19:26:58.206 INFO:tasks.workunit.client.1.vm08.stdout:2/199: write d3/d9/dc/de/f17 [759309,110370] 0 2026-03-09T19:26:58.207 INFO:tasks.workunit.client.1.vm08.stdout:2/200: write d3/d4/f47 [20882,7505] 0 2026-03-09T19:26:58.212 INFO:tasks.workunit.client.1.vm08.stdout:6/182: creat d3/f3e x:0 0 0 2026-03-09T19:26:58.213 INFO:tasks.workunit.client.1.vm08.stdout:0/174: symlink 
dd/d31/l3b 0 2026-03-09T19:26:58.216 INFO:tasks.workunit.client.1.vm08.stdout:4/186: symlink da/d10/d26/l33 0 2026-03-09T19:26:58.216 INFO:tasks.workunit.client.1.vm08.stdout:8/224: symlink de/d47/l48 0 2026-03-09T19:26:58.221 INFO:tasks.workunit.client.1.vm08.stdout:6/183: creat d3/d34/d3b/f3f x:0 0 0 2026-03-09T19:26:58.221 INFO:tasks.workunit.client.1.vm08.stdout:0/175: fdatasync dd/d22/d24/f26 0 2026-03-09T19:26:58.222 INFO:tasks.workunit.client.1.vm08.stdout:0/176: stat dd/f13 0 2026-03-09T19:26:58.226 INFO:tasks.workunit.client.1.vm08.stdout:9/179: creat d0/d2/d8/f41 x:0 0 0 2026-03-09T19:26:58.237 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:26:57 vm07.local ceph-mon[48545]: pgmap v153: 65 pgs: 65 active+clean; 363 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 694 KiB/s rd, 27 MiB/s wr, 311 op/s 2026-03-09T19:26:58.237 INFO:tasks.workunit.client.1.vm08.stdout:9/180: readlink d0/l2e 0 2026-03-09T19:26:58.237 INFO:tasks.workunit.client.1.vm08.stdout:5/172: getdents d16 0 2026-03-09T19:26:58.238 INFO:tasks.workunit.client.1.vm08.stdout:5/173: dread - d16/d1e/d30/f33 zero size 2026-03-09T19:26:58.238 INFO:tasks.workunit.client.1.vm08.stdout:5/174: write d16/d1e/f28 [526806,74427] 0 2026-03-09T19:26:58.238 INFO:tasks.workunit.client.1.vm08.stdout:6/184: unlink d3/d15/f3d 0 2026-03-09T19:26:58.238 INFO:tasks.workunit.client.1.vm08.stdout:0/177: symlink dd/d22/d24/l3c 0 2026-03-09T19:26:58.238 INFO:tasks.workunit.client.1.vm08.stdout:0/178: dread - dd/d22/d27/d2e/d37/f3a zero size 2026-03-09T19:26:58.238 INFO:tasks.workunit.client.1.vm08.stdout:0/179: dwrite dd/f2c [0,4194304] 0 2026-03-09T19:26:58.245 INFO:tasks.workunit.client.1.vm08.stdout:4/187: link da/d10/f2e da/d10/d16/d28/f34 0 2026-03-09T19:26:58.245 INFO:tasks.workunit.client.1.vm08.stdout:7/166: link d5/c15 d5/d12/c39 0 2026-03-09T19:26:58.245 INFO:tasks.workunit.client.1.vm08.stdout:9/181: mknod d0/d3/c42 0 2026-03-09T19:26:58.246 INFO:tasks.workunit.client.1.vm08.stdout:9/182: read - 
d0/d2/d8/d7/f35 zero size 2026-03-09T19:26:58.247 INFO:tasks.workunit.client.1.vm08.stdout:9/183: chown d0/d2/f2f 905 1 2026-03-09T19:26:58.249 INFO:tasks.workunit.client.1.vm08.stdout:5/175: unlink d16/l2f 0 2026-03-09T19:26:58.250 INFO:tasks.workunit.client.1.vm08.stdout:6/185: creat d3/d15/f40 x:0 0 0 2026-03-09T19:26:58.251 INFO:tasks.workunit.client.1.vm08.stdout:4/188: dread da/f21 [0,4194304] 0 2026-03-09T19:26:58.251 INFO:tasks.workunit.client.1.vm08.stdout:6/186: write d3/db/f14 [359883,87121] 0 2026-03-09T19:26:58.251 INFO:tasks.workunit.client.1.vm08.stdout:7/167: dwrite d5/d12/f19 [4194304,4194304] 0 2026-03-09T19:26:58.252 INFO:tasks.workunit.client.1.vm08.stdout:6/187: read - d3/d15/f40 zero size 2026-03-09T19:26:58.261 INFO:tasks.workunit.client.1.vm08.stdout:7/168: dwrite d5/d16/f23 [0,4194304] 0 2026-03-09T19:26:58.277 INFO:tasks.workunit.client.1.vm08.stdout:7/169: dread d5/fa [0,4194304] 0 2026-03-09T19:26:58.278 INFO:tasks.workunit.client.1.vm08.stdout:9/184: mknod d0/d2/c43 0 2026-03-09T19:26:58.281 INFO:tasks.workunit.client.1.vm08.stdout:7/170: chown d5/ce 2104071 1 2026-03-09T19:26:58.282 INFO:tasks.workunit.client.1.vm08.stdout:9/185: dread d0/d2/f1a [0,4194304] 0 2026-03-09T19:26:58.293 INFO:tasks.workunit.client.1.vm08.stdout:4/189: link da/d10/f13 da/d10/d26/d27/f35 0 2026-03-09T19:26:58.293 INFO:tasks.workunit.client.1.vm08.stdout:6/188: getdents d3/db/d24 0 2026-03-09T19:26:58.293 INFO:tasks.workunit.client.1.vm08.stdout:4/190: getdents da/d10/d26/d27/d32 0 2026-03-09T19:26:58.293 INFO:tasks.workunit.client.1.vm08.stdout:4/191: creat da/d14/d2c/f36 x:0 0 0 2026-03-09T19:26:58.293 INFO:tasks.workunit.client.1.vm08.stdout:4/192: dread - da/d14/d2c/f36 zero size 2026-03-09T19:26:58.293 INFO:tasks.workunit.client.1.vm08.stdout:4/193: creat da/d10/d1b/f37 x:0 0 0 2026-03-09T19:26:58.293 INFO:tasks.workunit.client.1.vm08.stdout:4/194: mkdir da/d10/d26/d38 0 2026-03-09T19:26:58.342 INFO:tasks.workunit.client.1.vm08.stdout:2/201: sync 
2026-03-09T19:26:58.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:26:57 vm08.local ceph-mon[57794]: pgmap v153: 65 pgs: 65 active+clean; 363 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 694 KiB/s rd, 27 MiB/s wr, 311 op/s 2026-03-09T19:26:58.345 INFO:tasks.workunit.client.1.vm08.stdout:2/202: mkdir d3/d9/d4a 0 2026-03-09T19:26:58.346 INFO:tasks.workunit.client.1.vm08.stdout:2/203: chown d3/d4/d23/d2c/d41/d46 157 1 2026-03-09T19:26:58.347 INFO:tasks.workunit.client.1.vm08.stdout:2/204: write d3/d9/f1e [323103,110084] 0 2026-03-09T19:26:58.491 INFO:tasks.workunit.client.1.vm08.stdout:1/278: write d9/da/d2d/f3d [4608220,7952] 0 2026-03-09T19:26:58.504 INFO:tasks.workunit.client.1.vm08.stdout:3/222: dwrite d0/d6/de/d1b/d16/d17/f3f [0,4194304] 0 2026-03-09T19:26:58.506 INFO:tasks.workunit.client.1.vm08.stdout:3/223: fdatasync d0/f28 0 2026-03-09T19:26:58.508 INFO:tasks.workunit.client.1.vm08.stdout:3/224: write d0/d6/de/d1b/d16/d18/f2c [429202,39856] 0 2026-03-09T19:26:58.514 INFO:tasks.workunit.client.1.vm08.stdout:7/171: truncate d5/f9 2039140 0 2026-03-09T19:26:58.522 INFO:tasks.workunit.client.1.vm08.stdout:0/180: rename dd/f16 to dd/d22/d27/f3d 0 2026-03-09T19:26:58.529 INFO:tasks.workunit.client.1.vm08.stdout:7/172: mkdir d5/d16/d3a 0 2026-03-09T19:26:58.531 INFO:tasks.workunit.client.1.vm08.stdout:9/186: rename d0/d2/d14/f20 to d0/f44 0 2026-03-09T19:26:58.531 INFO:tasks.workunit.client.1.vm08.stdout:9/187: chown d0/d2/d8/d7/f22 14368012 1 2026-03-09T19:26:58.532 INFO:tasks.workunit.client.1.vm08.stdout:9/188: readlink d0/d2/d8/l3a 0 2026-03-09T19:26:58.537 INFO:tasks.workunit.client.1.vm08.stdout:8/225: truncate de/f1f 1322048 0 2026-03-09T19:26:58.541 INFO:tasks.workunit.client.1.vm08.stdout:6/189: rename d3/d15/f1e to d3/db/d24/f41 0 2026-03-09T19:26:58.542 INFO:tasks.workunit.client.1.vm08.stdout:6/190: chown d3/db/d24/l2e 5018373 1 2026-03-09T19:26:58.547 INFO:tasks.workunit.client.1.vm08.stdout:8/226: symlink de/d25/d31/d39/l49 0 
2026-03-09T19:26:58.551 INFO:tasks.workunit.client.1.vm08.stdout:2/205: rename l0 to d3/d4/d23/d2c/d39/l4b 0 2026-03-09T19:26:58.551 INFO:tasks.workunit.client.1.vm08.stdout:2/206: write d3/d9/f20 [3180867,7055] 0 2026-03-09T19:26:58.554 INFO:tasks.workunit.client.1.vm08.stdout:1/279: dread f2 [4194304,4194304] 0 2026-03-09T19:26:58.555 INFO:tasks.workunit.client.1.vm08.stdout:2/207: dread d3/d9/dc/de/f32 [0,4194304] 0 2026-03-09T19:26:58.565 INFO:tasks.workunit.client.1.vm08.stdout:3/225: fsync d0/d6/de/d1b/d16/d17/f1d 0 2026-03-09T19:26:58.570 INFO:tasks.workunit.client.1.vm08.stdout:5/176: dread - d16/f2b zero size 2026-03-09T19:26:58.574 INFO:tasks.workunit.client.1.vm08.stdout:8/227: mkdir de/d25/d31/d4a 0 2026-03-09T19:26:58.579 INFO:tasks.workunit.client.1.vm08.stdout:8/228: write de/f10 [578320,4900] 0 2026-03-09T19:26:58.589 INFO:tasks.workunit.client.1.vm08.stdout:3/226: creat d0/d8/d19/f41 x:0 0 0 2026-03-09T19:26:58.595 INFO:tasks.workunit.client.1.vm08.stdout:4/195: write f2 [4287227,63038] 0 2026-03-09T19:26:58.604 INFO:tasks.workunit.client.1.vm08.stdout:8/229: creat de/d1d/d21/f4b x:0 0 0 2026-03-09T19:26:58.604 INFO:tasks.workunit.client.1.vm08.stdout:1/280: mkdir d9/da/d53 0 2026-03-09T19:26:58.604 INFO:tasks.workunit.client.1.vm08.stdout:6/191: creat d3/db/f42 x:0 0 0 2026-03-09T19:26:58.605 INFO:tasks.workunit.client.1.vm08.stdout:5/177: rename d16/d1e/f28 to d16/d1e/d30/f39 0 2026-03-09T19:26:58.606 INFO:tasks.workunit.client.1.vm08.stdout:5/178: chown d16/d1e/f27 49781 1 2026-03-09T19:26:58.609 INFO:tasks.workunit.client.1.vm08.stdout:1/281: symlink d9/d40/l54 0 2026-03-09T19:26:58.617 INFO:tasks.workunit.client.1.vm08.stdout:3/227: symlink d0/d6/de/d1a/d33/l42 0 2026-03-09T19:26:58.618 INFO:tasks.workunit.client.1.vm08.stdout:3/228: write d0/d6/de/d1b/d16/d17/f36 [422746,127871] 0 2026-03-09T19:26:58.624 INFO:tasks.workunit.client.1.vm08.stdout:5/179: creat d16/d1e/d30/f3a x:0 0 0 2026-03-09T19:26:58.626 
INFO:tasks.workunit.client.1.vm08.stdout:5/180: readlink d16/d1e/l36 0 2026-03-09T19:26:58.626 INFO:tasks.workunit.client.1.vm08.stdout:5/181: truncate d16/d1e/f37 488246 0 2026-03-09T19:26:58.626 INFO:tasks.workunit.client.1.vm08.stdout:5/182: write d16/d1e/f25 [7180388,72302] 0 2026-03-09T19:26:58.629 INFO:tasks.workunit.client.1.vm08.stdout:4/196: creat da/d10/d26/d27/d32/f39 x:0 0 0 2026-03-09T19:26:58.629 INFO:tasks.workunit.client.1.vm08.stdout:4/197: chown da/d10/d16/d28/d2f 10 1 2026-03-09T19:26:58.638 INFO:tasks.workunit.client.1.vm08.stdout:6/192: mkdir d3/db/d43 0 2026-03-09T19:26:58.647 INFO:tasks.workunit.client.1.vm08.stdout:5/183: mkdir d16/d1e/d3b 0 2026-03-09T19:26:58.647 INFO:tasks.workunit.client.1.vm08.stdout:5/184: write ff [2281738,47701] 0 2026-03-09T19:26:58.647 INFO:tasks.workunit.client.1.vm08.stdout:5/185: dwrite d16/d1e/f2e [0,4194304] 0 2026-03-09T19:26:58.658 INFO:tasks.workunit.client.1.vm08.stdout:8/230: link de/d25/d33/d46/c3f de/d1d/c4c 0 2026-03-09T19:26:58.667 INFO:tasks.workunit.client.1.vm08.stdout:8/231: stat f6 0 2026-03-09T19:26:58.668 INFO:tasks.workunit.client.1.vm08.stdout:1/282: mknod d9/da/d53/c55 0 2026-03-09T19:26:58.668 INFO:tasks.workunit.client.1.vm08.stdout:1/283: dread d9/d11/f29 [0,4194304] 0 2026-03-09T19:26:58.668 INFO:tasks.workunit.client.1.vm08.stdout:1/284: readlink d9/d11/l43 0 2026-03-09T19:26:58.668 INFO:tasks.workunit.client.1.vm08.stdout:1/285: dwrite d9/da/f2f [0,4194304] 0 2026-03-09T19:26:58.668 INFO:tasks.workunit.client.1.vm08.stdout:1/286: write d9/d11/f3c [935869,62286] 0 2026-03-09T19:26:58.686 INFO:tasks.workunit.client.1.vm08.stdout:5/186: chown d16/c1b 8275834 1 2026-03-09T19:26:58.714 INFO:tasks.workunit.client.1.vm08.stdout:8/232: readlink de/l1a 0 2026-03-09T19:26:58.714 INFO:tasks.workunit.client.1.vm08.stdout:8/233: dread - de/f20 zero size 2026-03-09T19:26:58.714 INFO:tasks.workunit.client.1.vm08.stdout:3/229: rename d0/cf to d0/d8/c43 0 2026-03-09T19:26:58.714 
INFO:tasks.workunit.client.1.vm08.stdout:3/230: fdatasync d0/d6/f39 0 2026-03-09T19:26:58.714 INFO:tasks.workunit.client.1.vm08.stdout:3/231: write d0/d6/de/d1b/d16/d17/f36 [1005153,31218] 0 2026-03-09T19:26:58.714 INFO:tasks.workunit.client.1.vm08.stdout:3/232: dread d0/d6/de/d1b/d16/d18/f2c [0,4194304] 0 2026-03-09T19:26:58.714 INFO:tasks.workunit.client.1.vm08.stdout:5/187: rmdir d16 39 2026-03-09T19:26:58.714 INFO:tasks.workunit.client.1.vm08.stdout:8/234: mknod de/d1d/c4d 0 2026-03-09T19:26:58.714 INFO:tasks.workunit.client.1.vm08.stdout:6/193: creat d3/db/f44 x:0 0 0 2026-03-09T19:26:58.714 INFO:tasks.workunit.client.1.vm08.stdout:6/194: dwrite d3/db/f42 [0,4194304] 0 2026-03-09T19:26:58.714 INFO:tasks.workunit.client.1.vm08.stdout:6/195: stat d3/db/d24/f39 0 2026-03-09T19:26:58.716 INFO:tasks.workunit.client.1.vm08.stdout:3/233: creat d0/d8/d19/f44 x:0 0 0 2026-03-09T19:26:58.717 INFO:tasks.workunit.client.1.vm08.stdout:3/234: chown d0/d6/de/d1b/d16/d17/f3f 22280528 1 2026-03-09T19:26:58.720 INFO:tasks.workunit.client.1.vm08.stdout:5/188: dread - d16/d1e/f35 zero size 2026-03-09T19:26:58.726 INFO:tasks.workunit.client.1.vm08.stdout:6/196: rename d3/db/f3a to d3/d15/f45 0 2026-03-09T19:26:58.749 INFO:tasks.workunit.client.1.vm08.stdout:3/235: mknod d0/d6/de/d1b/d16/d18/c45 0 2026-03-09T19:26:58.749 INFO:tasks.workunit.client.1.vm08.stdout:3/236: write d0/d8/d24/f2d [4799943,106596] 0 2026-03-09T19:26:58.749 INFO:tasks.workunit.client.1.vm08.stdout:3/237: mknod d0/d6/de/d1b/d16/c46 0 2026-03-09T19:26:58.749 INFO:tasks.workunit.client.1.vm08.stdout:3/238: mknod d0/d6/de/d1b/d16/d17/c47 0 2026-03-09T19:26:58.749 INFO:tasks.workunit.client.1.vm08.stdout:3/239: write d0/d8/d24/f2d [955354,98662] 0 2026-03-09T19:26:58.749 INFO:tasks.workunit.client.1.vm08.stdout:8/235: getdents de/d25/d33 0 2026-03-09T19:26:58.749 INFO:tasks.workunit.client.1.vm08.stdout:6/197: rename d3/c2c to d3/db/c46 0 2026-03-09T19:26:58.749 INFO:tasks.workunit.client.1.vm08.stdout:8/236: 
creat de/d25/d31/d39/f4e x:0 0 0 2026-03-09T19:26:58.749 INFO:tasks.workunit.client.1.vm08.stdout:8/237: truncate de/d25/d33/d46/f2d 198772 0 2026-03-09T19:26:58.750 INFO:tasks.workunit.client.1.vm08.stdout:8/238: readlink de/l13 0 2026-03-09T19:26:58.753 INFO:tasks.workunit.client.1.vm08.stdout:3/240: rename d0/d6/d25/l3c to d0/d6/de/d15/l48 0 2026-03-09T19:26:58.754 INFO:tasks.workunit.client.1.vm08.stdout:3/241: read - d0/d8/d19/f41 zero size 2026-03-09T19:26:58.755 INFO:tasks.workunit.client.1.vm08.stdout:8/239: dwrite de/d1d/d21/f23 [0,4194304] 0 2026-03-09T19:26:58.759 INFO:tasks.workunit.client.1.vm08.stdout:6/198: mknod d3/d15/c47 0 2026-03-09T19:26:58.786 INFO:tasks.workunit.client.1.vm08.stdout:6/199: creat d3/d34/d3b/f48 x:0 0 0 2026-03-09T19:26:58.787 INFO:tasks.workunit.client.1.vm08.stdout:6/200: mknod d3/db/d24/c49 0 2026-03-09T19:26:58.787 INFO:tasks.workunit.client.1.vm08.stdout:6/201: dwrite d3/f32 [0,4194304] 0 2026-03-09T19:26:58.792 INFO:tasks.workunit.client.1.vm08.stdout:6/202: mkdir d3/d4a 0 2026-03-09T19:26:58.792 INFO:tasks.workunit.client.1.vm08.stdout:6/203: chown d3/db/d24/f2f 2291 1 2026-03-09T19:26:58.793 INFO:tasks.workunit.client.1.vm08.stdout:6/204: dread - d3/d15/f1c zero size 2026-03-09T19:26:58.796 INFO:tasks.workunit.client.1.vm08.stdout:6/205: mkdir d3/db/d24/d4b 0 2026-03-09T19:26:58.797 INFO:tasks.workunit.client.1.vm08.stdout:6/206: fsync d3/db/f30 0 2026-03-09T19:26:58.800 INFO:tasks.workunit.client.1.vm08.stdout:6/207: mknod d3/d34/d3b/c4c 0 2026-03-09T19:26:58.813 INFO:tasks.workunit.client.1.vm08.stdout:3/242: truncate d0/d6/de/d1b/d16/d17/f1d 2353465 0 2026-03-09T19:26:58.816 INFO:tasks.workunit.client.1.vm08.stdout:3/243: symlink d0/d6/de/d1b/l49 0 2026-03-09T19:26:58.821 INFO:tasks.workunit.client.1.vm08.stdout:0/181: dread dd/d22/d27/f3d [0,4194304] 0 2026-03-09T19:26:58.846 INFO:tasks.workunit.client.1.vm08.stdout:9/189: truncate d0/d2/f1a 113375 0 2026-03-09T19:26:58.904 
INFO:tasks.workunit.client.1.vm08.stdout:2/208: dwrite d3/d4/f8 [0,4194304] 0 2026-03-09T19:26:59.045 INFO:tasks.workunit.client.1.vm08.stdout:1/287: sync 2026-03-09T19:26:59.052 INFO:tasks.workunit.client.1.vm08.stdout:1/288: creat d9/d11/f56 x:0 0 0 2026-03-09T19:26:59.052 INFO:tasks.workunit.client.1.vm08.stdout:1/289: dread - d9/da/d12/d39/f52 zero size 2026-03-09T19:26:59.053 INFO:tasks.workunit.client.1.vm08.stdout:1/290: readlink d9/da/l23 0 2026-03-09T19:26:59.060 INFO:tasks.workunit.client.1.vm08.stdout:1/291: dread d9/da/d17/f2a [0,4194304] 0 2026-03-09T19:26:59.061 INFO:tasks.workunit.client.1.vm08.stdout:1/292: write d9/d11/f56 [838824,31363] 0 2026-03-09T19:26:59.062 INFO:tasks.workunit.client.1.vm08.stdout:1/293: creat d9/d40/f57 x:0 0 0 2026-03-09T19:26:59.063 INFO:tasks.workunit.client.1.vm08.stdout:1/294: creat d9/da/d2c/f58 x:0 0 0 2026-03-09T19:26:59.118 INFO:tasks.workunit.client.1.vm08.stdout:8/240: sync 2026-03-09T19:26:59.145 INFO:tasks.workunit.client.1.vm08.stdout:8/241: rename de/d25/d31/d4a to de/d1d/d4f 0 2026-03-09T19:26:59.145 INFO:tasks.workunit.client.1.vm08.stdout:8/242: creat de/d25/d33/d46/f50 x:0 0 0 2026-03-09T19:26:59.145 INFO:tasks.workunit.client.1.vm08.stdout:8/243: creat de/d1d/d4f/f51 x:0 0 0 2026-03-09T19:26:59.145 INFO:tasks.workunit.client.1.vm08.stdout:8/244: unlink de/d1d/d2e/l3b 0 2026-03-09T19:26:59.145 INFO:tasks.workunit.client.1.vm08.stdout:8/245: chown de/l1a 596705 1 2026-03-09T19:26:59.145 INFO:tasks.workunit.client.1.vm08.stdout:8/246: mknod de/d25/c52 0 2026-03-09T19:26:59.217 INFO:tasks.workunit.client.1.vm08.stdout:4/198: write da/d10/f13 [1329533,94534] 0 2026-03-09T19:26:59.221 INFO:tasks.workunit.client.1.vm08.stdout:4/199: dwrite da/d10/f13 [0,4194304] 0 2026-03-09T19:26:59.224 INFO:tasks.workunit.client.1.vm08.stdout:4/200: dread da/d10/f13 [0,4194304] 0 2026-03-09T19:26:59.225 INFO:tasks.workunit.client.1.vm08.stdout:4/201: readlink da/d10/d26/d27/l31 0 2026-03-09T19:26:59.226 
INFO:tasks.workunit.client.1.vm08.stdout:4/202: write da/d10/d26/d27/d32/f39 [84004,100402] 0 2026-03-09T19:26:59.229 INFO:tasks.workunit.client.1.vm08.stdout:4/203: dread da/d10/d26/d27/f35 [0,4194304] 0 2026-03-09T19:26:59.260 INFO:tasks.workunit.client.1.vm08.stdout:5/189: dwrite f1 [0,4194304] 0 2026-03-09T19:26:59.266 INFO:tasks.workunit.client.1.vm08.stdout:2/209: fdatasync d3/d4/f6 0 2026-03-09T19:26:59.290 INFO:tasks.workunit.client.1.vm08.stdout:2/210: symlink d3/l4c 0 2026-03-09T19:26:59.291 INFO:tasks.workunit.client.1.vm08.stdout:2/211: chown d3/d4/d23/d2c/d41 20924536 1 2026-03-09T19:26:59.300 INFO:tasks.workunit.client.1.vm08.stdout:5/190: link d16/f34 d16/d1e/d3b/f3c 0 2026-03-09T19:26:59.311 INFO:tasks.workunit.client.1.vm08.stdout:6/208: getdents d3 0 2026-03-09T19:26:59.315 INFO:tasks.workunit.client.1.vm08.stdout:2/212: unlink d3/d4/d23/d2c/c1b 0 2026-03-09T19:26:59.317 INFO:tasks.workunit.client.1.vm08.stdout:3/244: dread d0/d8/d24/f2d [0,4194304] 0 2026-03-09T19:26:59.317 INFO:tasks.workunit.client.1.vm08.stdout:5/191: symlink d16/d1e/l3d 0 2026-03-09T19:26:59.321 INFO:tasks.workunit.client.1.vm08.stdout:6/209: unlink f1 0 2026-03-09T19:26:59.327 INFO:tasks.workunit.client.1.vm08.stdout:2/213: fsync d3/d9/dc/de/f32 0 2026-03-09T19:26:59.327 INFO:tasks.workunit.client.1.vm08.stdout:2/214: dread - d3/f45 zero size 2026-03-09T19:26:59.337 INFO:tasks.workunit.client.1.vm08.stdout:5/192: write d16/f2a [1464288,86071] 0 2026-03-09T19:26:59.340 INFO:tasks.workunit.client.1.vm08.stdout:7/173: write d5/f9 [1607685,45402] 0 2026-03-09T19:26:59.359 INFO:tasks.workunit.client.1.vm08.stdout:6/210: rename d3/c26 to d3/d34/c4d 0 2026-03-09T19:26:59.363 INFO:tasks.workunit.client.1.vm08.stdout:0/182: write dd/d22/d24/f26 [1002824,85288] 0 2026-03-09T19:26:59.367 INFO:tasks.workunit.client.1.vm08.stdout:0/183: read - dd/d22/d27/d2e/d37/f3a zero size 2026-03-09T19:26:59.374 INFO:tasks.workunit.client.1.vm08.stdout:5/193: creat d16/d1e/d30/f3e x:0 0 0 
2026-03-09T19:26:59.374 INFO:tasks.workunit.client.1.vm08.stdout:5/194: stat d16/c1b 0
2026-03-09T19:26:59.374 INFO:tasks.workunit.client.1.vm08.stdout:5/195: truncate ff 2810764 0
2026-03-09T19:26:59.379 INFO:tasks.workunit.client.1.vm08.stdout:7/174: chown d5/c6 191923 1
2026-03-09T19:26:59.392 INFO:tasks.workunit.client.1.vm08.stdout:2/215: mknod d3/d9/c4d 0
2026-03-09T19:26:59.393 INFO:tasks.workunit.client.1.vm08.stdout:2/216: chown d3/d9/dc/de/l36 2774 1
2026-03-09T19:26:59.397 INFO:tasks.workunit.client.1.vm08.stdout:5/196: creat d16/d1e/d30/f3f x:0 0 0
2026-03-09T19:26:59.410 INFO:tasks.workunit.client.1.vm08.stdout:7/175: creat d5/d14/d38/f3b x:0 0 0
2026-03-09T19:26:59.410 INFO:tasks.workunit.client.1.vm08.stdout:7/176: stat d5/fb 0
2026-03-09T19:26:59.410 INFO:tasks.workunit.client.1.vm08.stdout:7/177: chown d5/d12/l25 728 1
2026-03-09T19:26:59.411 INFO:tasks.workunit.client.1.vm08.stdout:7/178: truncate d5/d12/f34 227918 0
2026-03-09T19:26:59.421 INFO:tasks.workunit.client.1.vm08.stdout:7/179: creat d5/d14/d38/f3c x:0 0 0
2026-03-09T19:26:59.422 INFO:tasks.workunit.client.1.vm08.stdout:0/184: getdents dd/d22/d27/d2e 0
2026-03-09T19:26:59.425 INFO:tasks.workunit.client.1.vm08.stdout:7/180: creat d5/d14/d27/f3d x:0 0 0
2026-03-09T19:26:59.426 INFO:tasks.workunit.client.1.vm08.stdout:7/181: chown d5/d14/d38/f3b 1968 1
2026-03-09T19:26:59.427 INFO:tasks.workunit.client.1.vm08.stdout:0/185: dwrite dd/d31/f34 [0,4194304] 0
2026-03-09T19:26:59.439 INFO:tasks.workunit.client.1.vm08.stdout:7/182: creat d5/d14/d2b/f3e x:0 0 0
2026-03-09T19:26:59.440 INFO:tasks.workunit.client.1.vm08.stdout:0/186: chown dd/d22/d27/d2e/d37/l38 7831863 1
2026-03-09T19:26:59.442 INFO:tasks.workunit.client.1.vm08.stdout:7/183: mkdir d5/d12/d3f 0
2026-03-09T19:26:59.444 INFO:tasks.workunit.client.1.vm08.stdout:0/187: rmdir dd/d22/d27/d2e 39
2026-03-09T19:26:59.448 INFO:tasks.workunit.client.1.vm08.stdout:7/184: creat d5/d14/d38/f40 x:0 0 0
2026-03-09T19:26:59.454 INFO:tasks.workunit.client.1.vm08.stdout:7/185: dwrite d5/d16/f23 [0,4194304] 0
2026-03-09T19:26:59.454 INFO:tasks.workunit.client.1.vm08.stdout:0/188: truncate dd/d22/d27/d2e/f39 209997 0
2026-03-09T19:26:59.466 INFO:tasks.workunit.client.1.vm08.stdout:7/186: mknod d5/d14/d38/c41 0
2026-03-09T19:26:59.471 INFO:tasks.workunit.client.1.vm08.stdout:0/189: truncate dd/fe 361329 0
2026-03-09T19:26:59.490 INFO:tasks.workunit.client.1.vm08.stdout:1/295: dwrite d9/da/dc/f20 [0,4194304] 0
2026-03-09T19:26:59.515 INFO:tasks.workunit.client.1.vm08.stdout:1/296: symlink d9/l59 0
2026-03-09T19:26:59.515 INFO:tasks.workunit.client.1.vm08.stdout:1/297: fsync d9/da/dc/f2e 0
2026-03-09T19:26:59.515 INFO:tasks.workunit.client.1.vm08.stdout:1/298: truncate d9/f48 622207 0
2026-03-09T19:26:59.515 INFO:tasks.workunit.client.1.vm08.stdout:1/299: unlink c0 0
2026-03-09T19:26:59.515 INFO:tasks.workunit.client.1.vm08.stdout:1/300: dread - d9/da/d12/d39/f51 zero size
2026-03-09T19:26:59.515 INFO:tasks.workunit.client.1.vm08.stdout:1/301: mknod d9/d40/d49/c5a 0
2026-03-09T19:26:59.515 INFO:tasks.workunit.client.1.vm08.stdout:1/302: write d9/da/d12/d39/f51 [56674,116093] 0
2026-03-09T19:26:59.515 INFO:tasks.workunit.client.1.vm08.stdout:1/303: dread d9/da/f30 [0,4194304] 0
2026-03-09T19:26:59.532 INFO:tasks.workunit.client.1.vm08.stdout:1/304: dread d9/da/dc/f10 [0,4194304] 0
2026-03-09T19:26:59.552 INFO:tasks.workunit.client.1.vm08.stdout:1/305: chown d9/d40/f57 159814574 1
2026-03-09T19:26:59.552 INFO:tasks.workunit.client.1.vm08.stdout:8/247: dwrite de/d1d/f27 [0,4194304] 0
2026-03-09T19:26:59.552 INFO:tasks.workunit.client.1.vm08.stdout:1/306: dwrite d9/da/dc/f2e [0,4194304] 0
2026-03-09T19:26:59.558 INFO:tasks.workunit.client.1.vm08.stdout:8/248: rename cd to de/d1d/c53 0
2026-03-09T19:26:59.602 INFO:tasks.workunit.client.1.vm08.stdout:1/307: symlink d9/da/d2d/l5b 0
2026-03-09T19:26:59.602 INFO:tasks.workunit.client.1.vm08.stdout:1/308: rename d9/da/dc/f20 to d9/da/d12/f5c 0
2026-03-09T19:26:59.602 INFO:tasks.workunit.client.1.vm08.stdout:4/204: write da/d10/f13 [5143123,22675] 0
2026-03-09T19:26:59.602 INFO:tasks.workunit.client.1.vm08.stdout:4/205: chown da/d10/d16/d28 94782125 1
2026-03-09T19:26:59.602 INFO:tasks.workunit.client.1.vm08.stdout:1/309: dread d9/d11/f56 [0,4194304] 0
2026-03-09T19:26:59.602 INFO:tasks.workunit.client.1.vm08.stdout:4/206: write f5 [3909131,31986] 0
2026-03-09T19:26:59.602 INFO:tasks.workunit.client.1.vm08.stdout:8/249: creat de/f54 x:0 0 0
2026-03-09T19:26:59.602 INFO:tasks.workunit.client.1.vm08.stdout:4/207: mkdir da/d10/d26/d3a 0
2026-03-09T19:26:59.602 INFO:tasks.workunit.client.1.vm08.stdout:4/208: dwrite da/d10/d1b/f37 [0,4194304] 0
2026-03-09T19:26:59.602 INFO:tasks.workunit.client.1.vm08.stdout:4/209: creat da/d10/d26/d27/f3b x:0 0 0
2026-03-09T19:26:59.605 INFO:tasks.workunit.client.1.vm08.stdout:9/190: write d0/d2/d8/d7/f22 [1063783,98157] 0
2026-03-09T19:26:59.608 INFO:tasks.workunit.client.1.vm08.stdout:3/245: truncate d0/d8/d19/f38 743052 0
2026-03-09T19:26:59.608 INFO:tasks.workunit.client.1.vm08.stdout:3/246: chown d0/d6/f39 85322221 1
2026-03-09T19:26:59.609 INFO:tasks.workunit.client.1.vm08.stdout:3/247: chown d0/d6/de/d1b/d16/d18/c45 129853581 1
2026-03-09T19:26:59.610 INFO:tasks.workunit.client.1.vm08.stdout:4/210: symlink da/d10/l3c 0
2026-03-09T19:26:59.614 INFO:tasks.workunit.client.1.vm08.stdout:5/197: rmdir d16/d1e/d30 39
2026-03-09T19:26:59.615 INFO:tasks.workunit.client.1.vm08.stdout:5/198: dread - d16/d1e/f35 zero size
2026-03-09T19:26:59.622 INFO:tasks.workunit.client.1.vm08.stdout:8/250: getdents de/d47 0
2026-03-09T19:26:59.626 INFO:tasks.workunit.client.1.vm08.stdout:2/217: truncate d3/d9/dc/de/d18/f2d 52882 0
2026-03-09T19:26:59.629 INFO:tasks.workunit.client.1.vm08.stdout:9/191: creat d0/d2/f45 x:0 0 0
2026-03-09T19:26:59.630 INFO:tasks.workunit.client.1.vm08.stdout:8/251: rename de/f3d to de/d25/d33/f55 0
2026-03-09T19:26:59.630 INFO:tasks.workunit.client.1.vm08.stdout:8/252: readlink de/l44 0
2026-03-09T19:26:59.634 INFO:tasks.workunit.client.1.vm08.stdout:3/248: creat d0/d8/f4a x:0 0 0
2026-03-09T19:26:59.634 INFO:tasks.workunit.client.1.vm08.stdout:3/249: stat d0/d6/de/d1b/d16/d18 0
2026-03-09T19:26:59.644 INFO:tasks.workunit.client.1.vm08.stdout:6/211: getdents d3/db/d24 0
2026-03-09T19:26:59.646 INFO:tasks.workunit.client.1.vm08.stdout:2/218: dread d3/d4/fd [0,4194304] 0
2026-03-09T19:26:59.650 INFO:tasks.workunit.client.1.vm08.stdout:9/192: creat d0/d2/d8/f46 x:0 0 0
2026-03-09T19:26:59.653 INFO:tasks.workunit.client.1.vm08.stdout:3/250: rename d0/d6/d25/d2a to d0/d4b 0
2026-03-09T19:26:59.653 INFO:tasks.workunit.client.1.vm08.stdout:2/219: dwrite d3/d4/f48 [0,4194304] 0
2026-03-09T19:26:59.654 INFO:tasks.workunit.client.1.vm08.stdout:6/212: rename d3/db to d3/db/d24/d4b/d4e 22
2026-03-09T19:26:59.655 INFO:tasks.workunit.client.1.vm08.stdout:9/193: mkdir d0/d2/d14/d47 0
2026-03-09T19:26:59.657 INFO:tasks.workunit.client.1.vm08.stdout:6/213: stat d3/f1b 0
2026-03-09T19:26:59.661 INFO:tasks.workunit.client.1.vm08.stdout:9/194: mkdir d0/d2/d8/d7/d48 0
2026-03-09T19:26:59.674 INFO:tasks.workunit.client.1.vm08.stdout:6/214: creat d3/db/d24/f4f x:0 0 0
2026-03-09T19:26:59.674 INFO:tasks.workunit.client.1.vm08.stdout:2/220: mkdir d3/d4/d3e/d4e 0
2026-03-09T19:26:59.674 INFO:tasks.workunit.client.1.vm08.stdout:3/251: dread d0/d6/de/d1b/d16/d17/f1d [0,4194304] 0
2026-03-09T19:26:59.674 INFO:tasks.workunit.client.1.vm08.stdout:3/252: dread - d0/d8/d19/f41 zero size
2026-03-09T19:26:59.675 INFO:tasks.workunit.client.1.vm08.stdout:6/215: dread d3/db/f14 [0,4194304] 0
2026-03-09T19:26:59.675 INFO:tasks.workunit.client.1.vm08.stdout:6/216: fsync d3/db/d24/f41 0
2026-03-09T19:26:59.675 INFO:tasks.workunit.client.1.vm08.stdout:3/253: creat d0/d8/f4c x:0 0 0
2026-03-09T19:26:59.679 INFO:tasks.workunit.client.1.vm08.stdout:3/254: readlink d0/d6/de/d1b/d16/d17/l22 0
2026-03-09T19:26:59.684 INFO:tasks.workunit.client.1.vm08.stdout:6/217: rmdir d3/d4a 0
2026-03-09T19:26:59.692 INFO:tasks.workunit.client.1.vm08.stdout:7/187: sync
2026-03-09T19:26:59.692 INFO:tasks.workunit.client.1.vm08.stdout:8/253: sync
2026-03-09T19:26:59.693 INFO:tasks.workunit.client.1.vm08.stdout:7/188: dread - d5/d14/d38/f40 zero size
2026-03-09T19:26:59.693 INFO:tasks.workunit.client.1.vm08.stdout:6/218: dwrite d3/d34/d3b/f3f [0,4194304] 0
2026-03-09T19:26:59.695 INFO:tasks.workunit.client.1.vm08.stdout:7/189: fsync d5/f1a 0
2026-03-09T19:26:59.695 INFO:tasks.workunit.client.1.vm08.stdout:7/190: stat d5 0
2026-03-09T19:26:59.696 INFO:tasks.workunit.client.1.vm08.stdout:7/191: fdatasync d5/f20 0
2026-03-09T19:26:59.699 INFO:tasks.workunit.client.1.vm08.stdout:9/195: fdatasync d0/d2/f1a 0
2026-03-09T19:26:59.701 INFO:tasks.workunit.client.1.vm08.stdout:9/196: chown d0/d2/d8/d7/f23 30185668 1
2026-03-09T19:26:59.714 INFO:tasks.workunit.client.1.vm08.stdout:9/197: rename d0/d3/f9 to d0/d1b/f49 0
2026-03-09T19:26:59.724 INFO:tasks.workunit.client.1.vm08.stdout:8/254: dread de/f10 [0,4194304] 0
2026-03-09T19:26:59.725 INFO:tasks.workunit.client.1.vm08.stdout:8/255: chown de/d25/d33/f41 7 1
2026-03-09T19:26:59.744 INFO:tasks.workunit.client.1.vm08.stdout:6/219: rename d3/d34/d3b/f48 to d3/db/d24/f50 0
2026-03-09T19:26:59.750 INFO:tasks.workunit.client.1.vm08.stdout:8/256: creat de/d1d/d2e/f56 x:0 0 0
2026-03-09T19:26:59.758 INFO:tasks.workunit.client.1.vm08.stdout:6/220: dwrite d3/fc [4194304,4194304] 0
2026-03-09T19:26:59.773 INFO:tasks.workunit.client.1.vm08.stdout:9/198: mknod d0/d2/d8/c4a 0
2026-03-09T19:26:59.776 INFO:tasks.workunit.client.1.vm08.stdout:7/192: getdents d5/d16/d1c 0
2026-03-09T19:26:59.777 INFO:tasks.workunit.client.1.vm08.stdout:8/257: rename de/d1d/f22 to de/d25/d31/d39/f57 0
2026-03-09T19:26:59.778 INFO:tasks.workunit.client.1.vm08.stdout:8/258: stat de/d1d/d21/c40 0
2026-03-09T19:26:59.780 INFO:tasks.workunit.client.1.vm08.stdout:9/199: creat d0/d1b/f4b x:0 0 0
2026-03-09T19:26:59.781 INFO:tasks.workunit.client.1.vm08.stdout:9/200: chown d0/d2/d8/l3a 5679207 1
2026-03-09T19:26:59.789 INFO:tasks.workunit.client.1.vm08.stdout:8/259: mknod de/d25/d33/c58 0
2026-03-09T19:26:59.790 INFO:tasks.workunit.client.1.vm08.stdout:6/221: creat d3/db/d43/f51 x:0 0 0
2026-03-09T19:26:59.791 INFO:tasks.workunit.client.1.vm08.stdout:9/201: mknod d0/d3/d32/c4c 0
2026-03-09T19:26:59.791 INFO:tasks.workunit.client.1.vm08.stdout:9/202: chown d0/d2/d8/f41 62 1
2026-03-09T19:26:59.792 INFO:tasks.workunit.client.1.vm08.stdout:7/193: mkdir d5/d16/d3a/d42 0
2026-03-09T19:26:59.796 INFO:tasks.workunit.client.1.vm08.stdout:7/194: dwrite d5/d12/f34 [0,4194304] 0
2026-03-09T19:26:59.806 INFO:tasks.workunit.client.1.vm08.stdout:8/260: creat de/d1d/f59 x:0 0 0
2026-03-09T19:26:59.808 INFO:tasks.workunit.client.1.vm08.stdout:9/203: creat d0/d2/d14/f4d x:0 0 0
2026-03-09T19:26:59.810 INFO:tasks.workunit.client.1.vm08.stdout:9/204: dread d0/f13 [0,4194304] 0
2026-03-09T19:26:59.821 INFO:tasks.workunit.client.1.vm08.stdout:7/195: rename d5/d14/c2a to d5/d16/c43 0
2026-03-09T19:26:59.821 INFO:tasks.workunit.client.1.vm08.stdout:1/310: write d9/da/dc/f10 [1672308,10881] 0
2026-03-09T19:26:59.828 INFO:tasks.workunit.client.1.vm08.stdout:8/261: dread de/f1c [0,4194304] 0
2026-03-09T19:26:59.829 INFO:tasks.workunit.client.1.vm08.stdout:4/211: write da/d10/f2e [546265,128879] 0
2026-03-09T19:26:59.832 INFO:tasks.workunit.client.1.vm08.stdout:6/222: rmdir d3/d15 39
2026-03-09T19:26:59.837 INFO:tasks.workunit.client.1.vm08.stdout:9/205: mkdir d0/d1b/d4e 0
2026-03-09T19:26:59.838 INFO:tasks.workunit.client.1.vm08.stdout:9/206: dread - d0/d1b/f4b zero size
2026-03-09T19:26:59.841 INFO:tasks.workunit.client.1.vm08.stdout:7/196: symlink d5/d16/d1c/l44 0
2026-03-09T19:26:59.844 INFO:tasks.workunit.client.1.vm08.stdout:9/207: dwrite d0/d3/d32/f40 [0,4194304] 0
2026-03-09T19:26:59.847 INFO:tasks.workunit.client.1.vm08.stdout:9/208: fdatasync d0/d2/f36 0
2026-03-09T19:26:59.858 INFO:tasks.workunit.client.1.vm08.stdout:8/262: symlink de/d25/l5a 0
2026-03-09T19:26:59.859 INFO:tasks.workunit.client.1.vm08.stdout:4/212: creat da/d10/f3d x:0 0 0
2026-03-09T19:26:59.862 INFO:tasks.workunit.client.1.vm08.stdout:5/199: truncate d16/d1e/f27 1381195 0
2026-03-09T19:26:59.869 INFO:tasks.workunit.client.1.vm08.stdout:2/221: dread d3/d9/dc/d14/f2b [0,4194304] 0
2026-03-09T19:26:59.873 INFO:tasks.workunit.client.1.vm08.stdout:7/197: rmdir d5 39
2026-03-09T19:26:59.881 INFO:tasks.workunit.client.1.vm08.stdout:8/263: unlink de/d25/d33/c58 0
2026-03-09T19:26:59.882 INFO:tasks.workunit.client.1.vm08.stdout:4/213: symlink da/d10/d1b/l3e 0
2026-03-09T19:26:59.885 INFO:tasks.workunit.client.1.vm08.stdout:6/223: mknod d3/db/d24/d4b/c52 0
2026-03-09T19:26:59.890 INFO:tasks.workunit.client.1.vm08.stdout:7/198: dread - d5/d16/f28 zero size
2026-03-09T19:26:59.894 INFO:tasks.workunit.client.1.vm08.stdout:7/199: dwrite d5/d14/d2b/f30 [0,4194304] 0
2026-03-09T19:26:59.896 INFO:tasks.workunit.client.1.vm08.stdout:9/209: mknod d0/d2/d8/d7/d48/c4f 0
2026-03-09T19:26:59.903 INFO:tasks.workunit.client.1.vm08.stdout:8/264: rmdir de/d25/d31 39
2026-03-09T19:26:59.906 INFO:tasks.workunit.client.1.vm08.stdout:5/200: symlink d16/d1e/l40 0
2026-03-09T19:26:59.911 INFO:tasks.workunit.client.1.vm08.stdout:6/224: dwrite d3/d15/f1c [0,4194304] 0
2026-03-09T19:26:59.917 INFO:tasks.workunit.client.1.vm08.stdout:2/222: symlink d3/d4/d3e/d4e/l4f 0
2026-03-09T19:26:59.918 INFO:tasks.workunit.client.1.vm08.stdout:2/223: truncate d3/f45 558199 0
2026-03-09T19:26:59.918 INFO:tasks.workunit.client.1.vm08.stdout:2/224: chown d3/f45 4870 1
2026-03-09T19:26:59.928 INFO:tasks.workunit.client.1.vm08.stdout:8/265: mkdir de/d1d/d2e/d5b 0
2026-03-09T19:26:59.932 INFO:tasks.workunit.client.1.vm08.stdout:5/201: chown d16/d1e/d30/f3a 25899550 1
2026-03-09T19:26:59.937 INFO:tasks.workunit.client.1.vm08.stdout:6/225: rmdir d3 39
2026-03-09T19:26:59.942 INFO:tasks.workunit.client.1.vm08.stdout:0/190: link dd/fe dd/d22/f3e 0
2026-03-09T19:26:59.942 INFO:tasks.workunit.client.1.vm08.stdout:0/191: fdatasync dd/d22/d27/d2e/d37/f3a 0
2026-03-09T19:26:59.948 INFO:tasks.workunit.client.1.vm08.stdout:8/266: mknod de/d1d/d21/c5c 0
2026-03-09T19:26:59.949 INFO:tasks.workunit.client.1.vm08.stdout:5/202: mknod d16/d1e/d30/c41 0
2026-03-09T19:26:59.954 INFO:tasks.workunit.client.1.vm08.stdout:2/225: truncate d3/d9/dc/de/d18/d1f/f3a 1614355 0
2026-03-09T19:26:59.955 INFO:tasks.workunit.client.1.vm08.stdout:0/192: readlink dd/l17 0
2026-03-09T19:26:59.966 INFO:tasks.workunit.client.1.vm08.stdout:5/203: dwrite d16/f17 [0,4194304] 0
2026-03-09T19:26:59.966 INFO:tasks.workunit.client.1.vm08.stdout:5/204: chown d16/d1e/l40 28371 1
2026-03-09T19:26:59.967 INFO:tasks.workunit.client.1.vm08.stdout:5/205: write d16/d1e/d30/f3f [587152,119062] 0
2026-03-09T19:26:59.974 INFO:tasks.workunit.client.1.vm08.stdout:2/226: write d3/d9/dc/d14/f2b [2262945,52096] 0
2026-03-09T19:26:59.982 INFO:tasks.workunit.client.1.vm08.stdout:3/255: write d0/d8/d19/f38 [39560,118822] 0
2026-03-09T19:26:59.987 INFO:tasks.workunit.client.1.vm08.stdout:7/200: getdents d5/d16 0
2026-03-09T19:26:59.989 INFO:tasks.workunit.client.1.vm08.stdout:9/210: getdents d0/d1b 0
2026-03-09T19:26:59.994 INFO:tasks.workunit.client.1.vm08.stdout:1/311: dwrite d9/da/d17/f2a [0,4194304] 0
2026-03-09T19:27:00.000 INFO:tasks.workunit.client.1.vm08.stdout:6/226: mknod d3/db/c53 0
2026-03-09T19:27:00.009 INFO:tasks.workunit.client.1.vm08.stdout:7/201: truncate d5/d16/f1f 5688617 0
2026-03-09T19:27:00.010 INFO:tasks.workunit.client.1.vm08.stdout:7/202: truncate d5/d14/d27/f35 688815 0
2026-03-09T19:27:00.013 INFO:tasks.workunit.client.1.vm08.stdout:8/267: rmdir de/d1d/d2e/d5b 0
2026-03-09T19:27:00.016 INFO:tasks.workunit.client.1.vm08.stdout:6/227: dread - d3/f12 zero size
2026-03-09T19:27:00.019 INFO:tasks.workunit.client.1.vm08.stdout:0/193: rename dd/d22/d27/d2e/d37/f3a to dd/d22/d27/f3f 0
2026-03-09T19:27:00.020 INFO:tasks.workunit.client.1.vm08.stdout:4/214: truncate da/d10/d1b/f37 1055974 0
2026-03-09T19:27:00.022 INFO:tasks.workunit.client.1.vm08.stdout:7/203: creat d5/d16/f45 x:0 0 0
2026-03-09T19:27:00.038 INFO:tasks.workunit.client.1.vm08.stdout:7/204: fsync d5/f1a 0
2026-03-09T19:27:00.038 INFO:tasks.workunit.client.1.vm08.stdout:1/312: symlink d9/da/d2d/d4e/l5d 0
2026-03-09T19:27:00.038 INFO:tasks.workunit.client.1.vm08.stdout:1/313: write d9/da/dc/f2e [3369113,25056] 0
2026-03-09T19:27:00.038 INFO:tasks.workunit.client.1.vm08.stdout:6/228: mknod d3/d34/c54 0
2026-03-09T19:27:00.038 INFO:tasks.workunit.client.1.vm08.stdout:0/194: creat dd/d22/d27/d2e/d37/f40 x:0 0 0
2026-03-09T19:27:00.038 INFO:tasks.workunit.client.1.vm08.stdout:0/195: write dd/d31/f34 [1176700,40973] 0
2026-03-09T19:27:00.039 INFO:tasks.workunit.client.1.vm08.stdout:4/215: mknod da/d10/d16/d28/c3f 0
2026-03-09T19:27:00.044 INFO:tasks.workunit.client.1.vm08.stdout:7/205: creat d5/d14/f46 x:0 0 0
2026-03-09T19:27:00.045 INFO:tasks.workunit.client.1.vm08.stdout:8/268: fdatasync de/f1b 0
2026-03-09T19:27:00.046 INFO:tasks.workunit.client.1.vm08.stdout:8/269: truncate de/f11 5182891 0
2026-03-09T19:27:00.047 INFO:tasks.workunit.client.1.vm08.stdout:8/270: write de/d1d/d21/f23 [1020346,31799] 0
2026-03-09T19:27:00.049 INFO:tasks.workunit.client.1.vm08.stdout:1/314: mknod d9/da/dc/c5e 0
2026-03-09T19:27:00.050 INFO:tasks.workunit.client.1.vm08.stdout:5/206: sync
2026-03-09T19:27:00.054 INFO:tasks.workunit.client.1.vm08.stdout:3/256: link d0/d6/de/d1b/l2f d0/d6/de/l4d 0
2026-03-09T19:27:00.056 INFO:tasks.workunit.client.1.vm08.stdout:3/257: chown d0/d6/de/d1b/d16/d17/f1d 209828252 1
2026-03-09T19:27:00.056 INFO:tasks.workunit.client.1.vm08.stdout:3/258: stat d0/d6/de/c10 0
2026-03-09T19:27:00.062 INFO:tasks.workunit.client.1.vm08.stdout:4/216: mkdir da/d14/d40 0
2026-03-09T19:27:00.065 INFO:tasks.workunit.client.1.vm08.stdout:4/217: dwrite f5 [0,4194304] 0
2026-03-09T19:27:00.067 INFO:tasks.workunit.client.1.vm08.stdout:4/218: fsync da/d10/d26/d27/f3b 0
2026-03-09T19:27:00.075 INFO:tasks.workunit.client.1.vm08.stdout:1/315: creat d9/d11/f5f x:0 0 0
2026-03-09T19:27:00.077 INFO:tasks.workunit.client.1.vm08.stdout:1/316: dread d9/d11/f29 [0,4194304] 0
2026-03-09T19:27:00.080 INFO:tasks.workunit.client.1.vm08.stdout:6/229: mkdir d3/d55 0
2026-03-09T19:27:00.087 INFO:tasks.workunit.client.1.vm08.stdout:2/227: dwrite d3/d9/dc/de/f1c [0,4194304] 0
2026-03-09T19:27:00.088 INFO:tasks.workunit.client.1.vm08.stdout:2/228: write d3/d4/d23/f27 [451786,6818] 0
2026-03-09T19:27:00.099 INFO:tasks.workunit.client.1.vm08.stdout:9/211: truncate d0/d2/d8/f29 2839947 0
2026-03-09T19:27:00.102 INFO:tasks.workunit.client.1.vm08.stdout:3/259: mkdir d0/d6/de/d1b/d16/d17/d4e 0
2026-03-09T19:27:00.106 INFO:tasks.workunit.client.1.vm08.stdout:7/206: fsync d5/fa 0
2026-03-09T19:27:00.108 INFO:tasks.workunit.client.1.vm08.stdout:1/317: mkdir d9/da/d17/d60 0
2026-03-09T19:27:00.111 INFO:tasks.workunit.client.1.vm08.stdout:0/196: write dd/f18 [3891813,128291] 0
2026-03-09T19:27:00.111 INFO:tasks.workunit.client.1.vm08.stdout:0/197: fsync fc 0
2026-03-09T19:27:00.115 INFO:tasks.workunit.client.1.vm08.stdout:0/198: dwrite dd/f15 [0,4194304] 0
2026-03-09T19:27:00.132 INFO:tasks.workunit.client.1.vm08.stdout:2/229: rename d3/d4/f47 to d3/d9/dc/de/d18/f50 0
2026-03-09T19:27:00.133 INFO:tasks.workunit.client.1.vm08.stdout:2/230: fdatasync d3/d4/f6 0
2026-03-09T19:27:00.149 INFO:tasks.workunit.client.1.vm08.stdout:7/207: creat d5/d14/d38/f47 x:0 0 0
2026-03-09T19:27:00.160 INFO:tasks.workunit.client.1.vm08.stdout:4/219: creat da/d14/d40/f41 x:0 0 0
2026-03-09T19:27:00.161 INFO:tasks.workunit.client.1.vm08.stdout:4/220: write da/d14/d2c/f36 [364176,90200] 0
2026-03-09T19:27:00.164 INFO:tasks.workunit.client.1.vm08.stdout:8/271: creat de/f5d x:0 0 0
2026-03-09T19:27:00.172 INFO:tasks.workunit.client.1.vm08.stdout:5/207: creat d16/d1e/f42 x:0 0 0
2026-03-09T19:27:00.172 INFO:tasks.workunit.client.1.vm08.stdout:5/208: chown d16/d1e/d30/f3e 66640558 1
2026-03-09T19:27:00.189 INFO:tasks.workunit.client.1.vm08.stdout:9/212: symlink d0/d2/d8/l50 0
2026-03-09T19:27:00.193 INFO:tasks.workunit.client.1.vm08.stdout:3/260: dwrite d0/d6/de/d1b/d16/d18/f2c [0,4194304] 0
2026-03-09T19:27:00.194 INFO:tasks.workunit.client.1.vm08.stdout:7/208: mknod d5/d16/c48 0
2026-03-09T19:27:00.194 INFO:tasks.workunit.client.1.vm08.stdout:7/209: readlink d5/d14/d27/l33 0
2026-03-09T19:27:00.204 INFO:tasks.workunit.client.1.vm08.stdout:8/272: creat de/d1d/d4f/f5e x:0 0 0
2026-03-09T19:27:00.213 INFO:tasks.workunit.client.1.vm08.stdout:1/318: dwrite d9/d11/f56 [0,4194304] 0
2026-03-09T19:27:00.225 INFO:tasks.workunit.client.1.vm08.stdout:7/210: creat d5/d16/f49 x:0 0 0
2026-03-09T19:27:00.228 INFO:tasks.workunit.client.1.vm08.stdout:7/211: dwrite d5/d16/f1f [4194304,4194304] 0
2026-03-09T19:27:00.228 INFO:tasks.workunit.client.1.vm08.stdout:7/212: fsync d5/d14/d2b/f32 0
2026-03-09T19:27:00.229 INFO:tasks.workunit.client.1.vm08.stdout:7/213: chown d5/d14/d27 428 1
2026-03-09T19:27:00.232 INFO:tasks.workunit.client.1.vm08.stdout:8/273: fsync de/d25/d33/f41 0
2026-03-09T19:27:00.238 INFO:tasks.workunit.client.1.vm08.stdout:5/209: creat d16/d1e/d3b/f43 x:0 0 0
2026-03-09T19:27:00.243 INFO:tasks.workunit.client.1.vm08.stdout:1/319: mknod d9/d40/c61 0
2026-03-09T19:27:00.244 INFO:tasks.workunit.client.1.vm08.stdout:1/320: stat d9/d11/f5f 0
2026-03-09T19:27:00.247 INFO:tasks.workunit.client.1.vm08.stdout:6/230: getdents d3/db/d24/d4b 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:7/214: dread d5/f9 [0,4194304] 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:7/215: dwrite d5/d14/d27/f35 [0,4194304] 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:9/213: symlink d0/d2/l51 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:7/216: stat d5/d14/d38/f47 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:9/214: dread - d0/d2/d8/f41 zero size
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:3/261: mknod d0/d6/de/d1b/d16/d17/d4e/c4f 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:3/262: write d0/d6/de/d1b/d16/d18/f2c [1366043,26025] 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:1/321: dwrite f2 [0,4194304] 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:6/231: creat d3/db/d43/f56 x:0 0 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:9/215: symlink d0/d1b/l52 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:6/232: dwrite d3/db/f44 [0,4194304] 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:3/263: unlink d0/d6/de/d1b/d16/d17/d4e/c4f 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:8/274: rename de/d25/d31/d39 to de/d1d/d2e/d5f 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:3/264: write d0/d8/f4a [201221,81948] 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:1/322: chown d9/l46 2 1
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:8/275: dread de/d25/d33/f55 [0,4194304] 0
2026-03-09T19:27:00.290 INFO:tasks.workunit.client.1.vm08.stdout:9/216: creat d0/d2/d8/d7/d48/f53 x:0 0 0
2026-03-09T19:27:00.293 INFO:tasks.workunit.client.1.vm08.stdout:7/217: link d5/d16/f45 d5/d16/f4a 0
2026-03-09T19:27:00.295 INFO:tasks.workunit.client.1.vm08.stdout:3/265: mknod d0/d8/d19/c50 0
2026-03-09T19:27:00.295 INFO:tasks.workunit.client.1.vm08.stdout:9/217: dwrite d0/d1b/f4b [0,4194304] 0
2026-03-09T19:27:00.296 INFO:tasks.workunit.client.1.vm08.stdout:5/210: creat d16/d1e/f44 x:0 0 0
2026-03-09T19:27:00.297 INFO:tasks.workunit.client.1.vm08.stdout:3/266: write d0/f28 [3030213,32805] 0
2026-03-09T19:27:00.297 INFO:tasks.workunit.client.1.vm08.stdout:5/211: write d16/d1e/f37 [801459,36628] 0
2026-03-09T19:27:00.300 INFO:tasks.workunit.client.1.vm08.stdout:9/218: fdatasync d0/d2/d14/f4d 0
2026-03-09T19:27:00.300 INFO:tasks.workunit.client.1.vm08.stdout:9/219: stat d0/d1b/f4b 0
2026-03-09T19:27:00.305 INFO:tasks.workunit.client.1.vm08.stdout:1/323: mkdir d9/da/d2d/d62 0
2026-03-09T19:27:00.307 INFO:tasks.workunit.client.1.vm08.stdout:6/233: symlink d3/l57 0
2026-03-09T19:27:00.310 INFO:tasks.workunit.client.1.vm08.stdout:7/218: rmdir d5/d14/d2b 39
2026-03-09T19:27:00.313 INFO:tasks.workunit.client.1.vm08.stdout:9/220: truncate d0/d1b/f49 979632 0
2026-03-09T19:27:00.313 INFO:tasks.workunit.client.1.vm08.stdout:9/221: chown d0/d3/c42 13407351 1
2026-03-09T19:27:00.315 INFO:tasks.workunit.client.1.vm08.stdout:1/324: dread d9/da/d2c/f38 [0,4194304] 0
2026-03-09T19:27:00.315 INFO:tasks.workunit.client.1.vm08.stdout:1/325: chown d9/d11/c4d 16163430 1
2026-03-09T19:27:00.318 INFO:tasks.workunit.client.1.vm08.stdout:6/234: creat d3/d34/d3b/f58 x:0 0 0
2026-03-09T19:27:00.320 INFO:tasks.workunit.client.1.vm08.stdout:6/235: dread d3/db/f44 [0,4194304] 0
2026-03-09T19:27:00.321 INFO:tasks.workunit.client.1.vm08.stdout:6/236: write d3/db/d24/f4f [724296,12126] 0
2026-03-09T19:27:00.323 INFO:tasks.workunit.client.1.vm08.stdout:5/212: mkdir d16/d45 0
2026-03-09T19:27:00.342 INFO:tasks.workunit.client.1.vm08.stdout:9/222: mknod d0/d2/d8/d7/d48/c54 0
2026-03-09T19:27:00.342 INFO:tasks.workunit.client.1.vm08.stdout:6/237: mknod d3/d34/c59 0
2026-03-09T19:27:00.342 INFO:tasks.workunit.client.1.vm08.stdout:6/238: dwrite d3/db/f20 [0,4194304] 0
2026-03-09T19:27:00.342 INFO:tasks.workunit.client.1.vm08.stdout:8/276: getdents de/d1d/d2e/d5f 0
2026-03-09T19:27:00.342 INFO:tasks.workunit.client.1.vm08.stdout:8/277: fdatasync de/f11 0
2026-03-09T19:27:00.342 INFO:tasks.workunit.client.1.vm08.stdout:8/278: dwrite de/d25/d31/f36 [0,4194304] 0
2026-03-09T19:27:00.342 INFO:tasks.workunit.client.1.vm08.stdout:5/213: creat d16/d45/f46 x:0 0 0
2026-03-09T19:27:00.345 INFO:tasks.workunit.client.1.vm08.stdout:6/239: mknod d3/d34/c5a 0
2026-03-09T19:27:00.347 INFO:tasks.workunit.client.1.vm08.stdout:6/240: chown d3/db/d24/f4f 3107217 1
2026-03-09T19:27:00.347 INFO:tasks.workunit.client.1.vm08.stdout:7/219: getdents d5/d12 0
2026-03-09T19:27:00.347 INFO:tasks.workunit.client.1.vm08.stdout:6/241: stat d3/d34/f37 0
2026-03-09T19:27:00.386 INFO:tasks.workunit.client.1.vm08.stdout:8/279: mknod de/d25/c60 0
2026-03-09T19:27:00.386 INFO:tasks.workunit.client.1.vm08.stdout:8/280: write de/f5d [870366,36605] 0
2026-03-09T19:27:00.390 INFO:tasks.workunit.client.1.vm08.stdout:5/214: mkdir d16/d1e/d47 0
2026-03-09T19:27:00.391 INFO:tasks.workunit.client.1.vm08.stdout:1/326: rename d9/da/d12/c4c to d9/da/c63 0
2026-03-09T19:27:00.393 INFO:tasks.workunit.client.1.vm08.stdout:9/223: getdents d0/d2/d8/d7 0
2026-03-09T19:27:00.394 INFO:tasks.workunit.client.1.vm08.stdout:8/281: write de/d1d/d2e/d5f/f57 [2506512,46299] 0
2026-03-09T19:27:00.395 INFO:tasks.workunit.client.1.vm08.stdout:8/282: chown de/d1d/f1e 605 1
2026-03-09T19:27:00.399 INFO:tasks.workunit.client.1.vm08.stdout:6/242: symlink d3/l5b 0
2026-03-09T19:27:00.399 INFO:tasks.workunit.client.1.vm08.stdout:6/243: fdatasync d3/d15/f19 0
2026-03-09T19:27:00.402 INFO:tasks.workunit.client.1.vm08.stdout:5/215: write d16/f20 [2297837,52118] 0
2026-03-09T19:27:00.408 INFO:tasks.workunit.client.1.vm08.stdout:6/244: mkdir d3/d34/d5c 0
2026-03-09T19:27:00.428 INFO:tasks.workunit.client.1.vm08.stdout:9/224: mknod d0/d2/d14/d47/c55 0
2026-03-09T19:27:00.428 INFO:tasks.workunit.client.1.vm08.stdout:9/225: dwrite d0/d2/d14/f4d [0,4194304] 0
2026-03-09T19:27:00.428 INFO:tasks.workunit.client.1.vm08.stdout:9/226: creat d0/d2/d14/f56 x:0 0 0
2026-03-09T19:27:00.430 INFO:tasks.workunit.client.1.vm08.stdout:9/227: mkdir d0/d3/d32/d57 0
2026-03-09T19:27:00.436 INFO:tasks.workunit.client.1.vm08.stdout:9/228: rename d0/d2/d8/f46 to d0/d2/d8/d7/f58 0
2026-03-09T19:27:00.525 INFO:tasks.workunit.client.1.vm08.stdout:5/216: sync
2026-03-09T19:27:00.525 INFO:tasks.workunit.client.1.vm08.stdout:8/283: sync
2026-03-09T19:27:00.526 INFO:tasks.workunit.client.1.vm08.stdout:8/284: write de/d25/d33/d46/f50 [716072,104377] 0
2026-03-09T19:27:00.526 INFO:tasks.workunit.client.1.vm08.stdout:5/217: write d16/f2a [1018003,86971] 0
2026-03-09T19:27:00.527 INFO:tasks.workunit.client.1.vm08.stdout:8/285: fsync de/d1d/f1e 0
2026-03-09T19:27:00.532 INFO:tasks.workunit.client.1.vm08.stdout:8/286: link de/f10 de/d1d/d2e/f61 0
2026-03-09T19:27:00.532 INFO:tasks.workunit.client.1.vm08.stdout:8/287: fsync f1 0
2026-03-09T19:27:00.534 INFO:tasks.workunit.client.1.vm08.stdout:8/288: creat de/d1d/d21/f62 x:0 0 0
2026-03-09T19:27:00.534 INFO:tasks.workunit.client.1.vm08.stdout:8/289: chown de/d1d/l3a 3795487 1
2026-03-09T19:27:00.556 INFO:tasks.workunit.client.1.vm08.stdout:8/290: sync
2026-03-09T19:27:00.574 INFO:tasks.workunit.client.1.vm08.stdout:8/291: rename de/l44 to de/d25/l63 0
2026-03-09T19:27:00.576 INFO:tasks.workunit.client.1.vm08.stdout:8/292: creat de/d25/f64 x:0 0 0
2026-03-09T19:27:00.592 INFO:tasks.workunit.client.1.vm08.stdout:1/327: read d9/d11/f3c [455806,12686] 0
2026-03-09T19:27:00.601 INFO:tasks.workunit.client.1.vm08.stdout:1/328: mknod d9/da/d2d/d4e/c64 0
2026-03-09T19:27:00.603 INFO:tasks.workunit.client.1.vm08.stdout:1/329: dread d9/da/dc/f2e [0,4194304] 0
2026-03-09T19:27:00.604 INFO:tasks.workunit.client.1.vm08.stdout:1/330: fsync d9/da/d2c/f58 0
2026-03-09T19:27:00.604 INFO:tasks.workunit.client.1.vm08.stdout:8/293: sync
2026-03-09T19:27:00.607 INFO:tasks.workunit.client.1.vm08.stdout:8/294: rename de/d1d/d2e/d5f/l49 to de/d1d/d21/l65 0
2026-03-09T19:27:00.610 INFO:tasks.workunit.client.1.vm08.stdout:1/331: symlink d9/da/l65 0
2026-03-09T19:27:00.610 INFO:tasks.workunit.client.1.vm08.stdout:1/332: chown d9/da/dc/f1d 9 1
2026-03-09T19:27:00.614 INFO:tasks.workunit.client.1.vm08.stdout:1/333: dwrite d9/da/f1e [4194304,4194304] 0
2026-03-09T19:27:00.617 INFO:tasks.workunit.client.1.vm08.stdout:1/334: chown d9/d11/c27 3348290 1
2026-03-09T19:27:00.635 INFO:tasks.workunit.client.1.vm08.stdout:8/295: mknod de/d1d/d4f/c66 0
2026-03-09T19:27:00.650 INFO:tasks.workunit.client.1.vm08.stdout:5/218: read d16/f20 [2339074,82021] 0
2026-03-09T19:27:00.677 INFO:tasks.workunit.client.1.vm08.stdout:4/221: dwrite da/d10/f1f [0,4194304] 0
2026-03-09T19:27:00.678 INFO:tasks.workunit.client.1.vm08.stdout:8/296: truncate f1 4904438 0
2026-03-09T19:27:00.680 INFO:tasks.workunit.client.1.vm08.stdout:0/199: write dd/d22/f3e [240007,45222] 0
2026-03-09T19:27:00.680 INFO:tasks.workunit.client.1.vm08.stdout:0/200: write dd/f18 [866406,64767] 0
2026-03-09T19:27:00.681 INFO:tasks.workunit.client.1.vm08.stdout:0/201: readlink dd/l20 0
2026-03-09T19:27:00.683 INFO:tasks.workunit.client.1.vm08.stdout:5/219: sync
2026-03-09T19:27:00.687 INFO:tasks.workunit.client.1.vm08.stdout:8/297: mkdir de/d25/d33/d46/d67 0
2026-03-09T19:27:00.695 INFO:tasks.workunit.client.1.vm08.stdout:4/222: dread da/f21 [0,4194304] 0
2026-03-09T19:27:00.698 INFO:tasks.workunit.client.1.vm08.stdout:1/335: getdents d9/da/d2c 0
2026-03-09T19:27:00.698 INFO:tasks.workunit.client.1.vm08.stdout:1/336: readlink d9/l33 0
2026-03-09T19:27:00.701 INFO:tasks.workunit.client.1.vm08.stdout:1/337: dread d9/da/f2f [0,4194304] 0
2026-03-09T19:27:00.719 INFO:tasks.workunit.client.1.vm08.stdout:0/202: unlink dd/d22/d24/l3c 0
2026-03-09T19:27:00.719 INFO:tasks.workunit.client.1.vm08.stdout:0/203: stat dd/d22/d27/d2e/d37 0
2026-03-09T19:27:00.729 INFO:tasks.workunit.client.1.vm08.stdout:5/220: chown d16/d1e/f27 12404346 1
2026-03-09T19:27:00.737 INFO:tasks.workunit.client.1.vm08.stdout:4/223: mkdir da/d10/d42 0
2026-03-09T19:27:00.746 INFO:tasks.workunit.client.1.vm08.stdout:1/338: fdatasync d9/d11/f29 0
2026-03-09T19:27:00.747 INFO:tasks.workunit.client.1.vm08.stdout:1/339: truncate d9/da/d2c/f58 579405 0
2026-03-09T19:27:00.761 INFO:tasks.workunit.client.1.vm08.stdout:4/224: creat da/d10/d26/d38/f43 x:0 0 0
2026-03-09T19:27:00.767 INFO:tasks.workunit.client.1.vm08.stdout:4/225: fdatasync da/f1d 0
2026-03-09T19:27:00.768 INFO:tasks.workunit.client.1.vm08.stdout:1/340: link d9/d11/c4d d9/da/d12/c66 0
2026-03-09T19:27:00.770 INFO:tasks.workunit.client.1.vm08.stdout:4/226: fsync f1 0
2026-03-09T19:27:00.771 INFO:tasks.workunit.client.1.vm08.stdout:1/341: mkdir d9/da/d53/d67 0
2026-03-09T19:27:00.772 INFO:tasks.workunit.client.1.vm08.stdout:4/227: creat da/d10/d16/d28/f44 x:0 0 0
2026-03-09T19:27:00.773 INFO:tasks.workunit.client.1.vm08.stdout:1/342: creat d9/da/dc/f68 x:0 0 0
2026-03-09T19:27:00.780 INFO:tasks.workunit.client.1.vm08.stdout:4/228: unlink da/d10/d26/d27/l31 0
2026-03-09T19:27:00.783 INFO:tasks.workunit.client.1.vm08.stdout:4/229: dwrite f1 [0,4194304] 0
2026-03-09T19:27:00.795 INFO:tasks.workunit.client.1.vm08.stdout:1/343: link d9/da/dc/c2b d9/da/d12/d39/c69 0
2026-03-09T19:27:00.799 INFO:tasks.workunit.client.1.vm08.stdout:1/344: mkdir d9/da/d2c/d6a 0
2026-03-09T19:27:00.802 INFO:tasks.workunit.client.1.vm08.stdout:1/345: dread d9/da/dc/f10 [0,4194304] 0
2026-03-09T19:27:00.814 INFO:tasks.workunit.client.1.vm08.stdout:1/346: dread d9/da/dc/f2e [0,4194304] 0
2026-03-09T19:27:00.831 INFO:tasks.workunit.client.1.vm08.stdout:3/267: write d0/d6/de/d1b/d16/d17/f3f [2549285,91374] 0
2026-03-09T19:27:00.833 INFO:tasks.workunit.client.1.vm08.stdout:3/268: mkdir d0/d6/de/d1b/d16/d18/d51 0
2026-03-09T19:27:00.837 INFO:tasks.workunit.client.1.vm08.stdout:3/269: dwrite d0/f28 [0,4194304] 0
2026-03-09T19:27:00.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:00 vm08.local ceph-mon[57794]: pgmap v154: 65 pgs: 65 active+clean; 479 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 48 MiB/s wr, 361 op/s
2026-03-09T19:27:00.863 INFO:tasks.workunit.client.1.vm08.stdout:0/204: read dd/d22/f3e [174247,62304] 0
2026-03-09T19:27:00.867 INFO:tasks.workunit.client.1.vm08.stdout:0/205: rmdir dd/d22/d27/d2e 39
2026-03-09T19:27:00.868 INFO:tasks.workunit.client.1.vm08.stdout:0/206: write dd/f2c [1537682,54371] 0
2026-03-09T19:27:00.872 INFO:tasks.workunit.client.1.vm08.stdout:0/207: creat dd/d22/f41 x:0 0 0
2026-03-09T19:27:00.873 INFO:tasks.workunit.client.1.vm08.stdout:0/208: creat dd/d22/d27/f42 x:0 0 0
2026-03-09T19:27:00.880 INFO:tasks.workunit.client.1.vm08.stdout:7/220: dwrite d5/d14/f1e [0,4194304] 0
2026-03-09T19:27:00.886 INFO:tasks.workunit.client.1.vm08.stdout:0/209: dread dd/f2c [0,4194304] 0
2026-03-09T19:27:00.893 INFO:tasks.workunit.client.1.vm08.stdout:6/245: dwrite d3/f12 [0,4194304] 0
2026-03-09T19:27:00.900 INFO:tasks.workunit.client.1.vm08.stdout:0/210: symlink dd/d31/l43 0
2026-03-09T19:27:00.901 INFO:tasks.workunit.client.1.vm08.stdout:0/211: dread - dd/d22/d27/f42 zero size
2026-03-09T19:27:00.905 INFO:tasks.workunit.client.1.vm08.stdout:7/221: dwrite d5/d14/d2b/f32 [0,4194304] 0
2026-03-09T19:27:00.909 INFO:tasks.workunit.client.1.vm08.stdout:6/246: symlink d3/d15/l5d 0
2026-03-09T19:27:00.915 INFO:tasks.workunit.client.1.vm08.stdout:7/222: mkdir d5/d14/d2b/d4b 0
2026-03-09T19:27:00.919 INFO:tasks.workunit.client.1.vm08.stdout:6/247: creat d3/db/d24/d4b/f5e x:0 0 0
2026-03-09T19:27:00.920 INFO:tasks.workunit.client.1.vm08.stdout:0/212: creat dd/d22/d27/d2e/d37/f44 x:0 0 0
2026-03-09T19:27:00.924 INFO:tasks.workunit.client.1.vm08.stdout:0/213: dwrite dd/d22/d27/f3f [0,4194304] 0
2026-03-09T19:27:00.933 INFO:tasks.workunit.client.1.vm08.stdout:0/214: creat dd/d22/f45 x:0 0 0
2026-03-09T19:27:00.937 INFO:tasks.workunit.client.1.vm08.stdout:0/215: unlink dd/f2c 0
2026-03-09T19:27:00.939 INFO:tasks.workunit.client.1.vm08.stdout:0/216: dread dd/d22/d27/f3d [0,4194304] 0
2026-03-09T19:27:00.940 INFO:tasks.workunit.client.1.vm08.stdout:0/217: creat dd/d22/d27/d2e/d37/f46 x:0 0 0
2026-03-09T19:27:00.941 INFO:tasks.workunit.client.1.vm08.stdout:0/218: write dd/d22/f29 [2279605,81464] 0
2026-03-09T19:27:00.942 INFO:tasks.workunit.client.1.vm08.stdout:0/219: chown dd/d22/l25 81321 1
2026-03-09T19:27:00.944 INFO:tasks.workunit.client.1.vm08.stdout:0/220: dread dd/d22/f3e [0,4194304] 0
2026-03-09T19:27:00.946 INFO:tasks.workunit.client.1.vm08.stdout:0/221: mknod dd/d22/d27/c47 0
2026-03-09T19:27:00.947 INFO:tasks.workunit.client.1.vm08.stdout:0/222: symlink dd/d22/d24/l48 0
2026-03-09T19:27:00.949 INFO:tasks.workunit.client.1.vm08.stdout:9/229: getdents d0/d2/d8/d7 0
2026-03-09T19:27:00.950 INFO:tasks.workunit.client.1.vm08.stdout:9/230: chown d0/d2/d8/d7/f34 1581 1
2026-03-09T19:27:00.951 INFO:tasks.workunit.client.1.vm08.stdout:0/223: mkdir dd/d22/d24/d49 0
2026-03-09T19:27:00.952 INFO:tasks.workunit.client.1.vm08.stdout:0/224: readlink dd/l30 0
2026-03-09T19:27:00.957 INFO:tasks.workunit.client.1.vm08.stdout:0/225: dwrite dd/d22/f28 [0,4194304] 0
2026-03-09T19:27:00.960 INFO:tasks.workunit.client.1.vm08.stdout:0/226: fsync dd/d22/d27/d2e/f39 0
2026-03-09T19:27:00.968 INFO:tasks.workunit.client.1.vm08.stdout:0/227: mknod dd/d22/d24/c4a 0
2026-03-09T19:27:00.969 INFO:tasks.workunit.client.1.vm08.stdout:0/228: fdatasync dd/d22/f3e 0
2026-03-09T19:27:00.973 INFO:tasks.workunit.client.1.vm08.stdout:0/229: dwrite dd/d22/d27/f42 [0,4194304] 0
2026-03-09T19:27:00.975 INFO:tasks.workunit.client.1.vm08.stdout:0/230: chown dd/l30 279 1
2026-03-09T19:27:00.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:00 vm07.local ceph-mon[48545]: pgmap v154: 65 pgs: 65 active+clean; 479 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 48 MiB/s wr, 361 op/s
2026-03-09T19:27:01.041 INFO:tasks.workunit.client.1.vm08.stdout:2/231: dwrite d3/f19 [0,4194304] 0
2026-03-09T19:27:01.071 INFO:tasks.workunit.client.1.vm08.stdout:5/221: write d16/f34 [1067677,14465] 0
2026-03-09T19:27:01.071 INFO:tasks.workunit.client.1.vm08.stdout:5/222: fsync d16/d1e/f42 0
2026-03-09T19:27:01.079 INFO:tasks.workunit.client.1.vm08.stdout:5/223: dread f1 [0,4194304] 0
2026-03-09T19:27:01.081 INFO:tasks.workunit.client.1.vm08.stdout:5/224: symlink d16/d45/l48 0
2026-03-09T19:27:01.082 INFO:tasks.workunit.client.1.vm08.stdout:5/225: fdatasync f1 0
2026-03-09T19:27:01.083 INFO:tasks.workunit.client.1.vm08.stdout:5/226: mknod d16/d1e/d30/c49 0
2026-03-09T19:27:01.087 INFO:tasks.workunit.client.1.vm08.stdout:5/227: dwrite d16/d1e/d3b/f3c [0,4194304] 0
2026-03-09T19:27:01.094 INFO:tasks.workunit.client.1.vm08.stdout:5/228: creat d16/d45/f4a x:0 0 0
2026-03-09T19:27:01.096 INFO:tasks.workunit.client.1.vm08.stdout:1/347: write d9/f48 [899378,16769] 0
2026-03-09T19:27:01.101 INFO:tasks.workunit.client.1.vm08.stdout:1/348: dread d9/da/f30 [0,4194304] 0
2026-03-09T19:27:01.105 INFO:tasks.workunit.client.1.vm08.stdout:1/349: dwrite d9/d11/f5f [0,4194304] 0
2026-03-09T19:27:01.118 INFO:tasks.workunit.client.1.vm08.stdout:1/350: dwrite d9/da/d2d/f41 [0,4194304] 0
2026-03-09T19:27:01.124 INFO:tasks.workunit.client.1.vm08.stdout:1/351: fdatasync d9/da/dc/f10 0
2026-03-09T19:27:01.129 INFO:tasks.workunit.client.1.vm08.stdout:9/231: write d0/d1b/f49 [631089,122555] 0
2026-03-09T19:27:01.135 INFO:tasks.workunit.client.1.vm08.stdout:9/232: mknod d0/d3/c59 0
2026-03-09T19:27:01.135 INFO:tasks.workunit.client.1.vm08.stdout:0/231: write dd/f12 [844087,119880] 0
2026-03-09T19:27:01.136 INFO:tasks.workunit.client.1.vm08.stdout:0/232: symlink dd/d22/d24/l4b 0
2026-03-09T19:27:01.140 INFO:tasks.workunit.client.1.vm08.stdout:0/233: dwrite dd/f18 [0,4194304] 0
2026-03-09T19:27:01.142 INFO:tasks.workunit.client.1.vm08.stdout:0/234: chown dd/d22/d27/f42 3 1
2026-03-09T19:27:01.163 INFO:tasks.workunit.client.1.vm08.stdout:8/298: rename de/d1d/c4c to de/d1d/d2e/c68 0
2026-03-09T19:27:01.170 INFO:tasks.workunit.client.1.vm08.stdout:0/235: dread dd/f1e [0,4194304] 0
2026-03-09T19:27:01.171 INFO:tasks.workunit.client.1.vm08.stdout:0/236: creat dd/d22/d24/d49/f4c x:0 0 0
2026-03-09T19:27:01.174 INFO:tasks.workunit.client.1.vm08.stdout:3/270: mkdir d0/d52 0
2026-03-09T19:27:01.175 INFO:tasks.workunit.client.1.vm08.stdout:0/237: getdents dd/d22/d27/d2e 0
2026-03-09T19:27:01.176 INFO:tasks.workunit.client.1.vm08.stdout:3/271: dread d0/d8/f4a [0,4194304] 0
2026-03-09T19:27:01.180 INFO:tasks.workunit.client.1.vm08.stdout:3/272: dwrite d0/d8/f4c [0,4194304] 0
2026-03-09T19:27:01.180 INFO:tasks.workunit.client.1.vm08.stdout:0/238: dwrite dd/d22/f29 [0,4194304] 0
2026-03-09T19:27:01.185 INFO:tasks.workunit.client.1.vm08.stdout:4/230: rmdir da/d10 39
2026-03-09T19:27:01.190 INFO:tasks.workunit.client.1.vm08.stdout:3/273: dwrite d0/d8/d19/f44 [0,4194304] 0
2026-03-09T19:27:01.199 INFO:tasks.workunit.client.1.vm08.stdout:3/274: read d0/d8/d19/f38 [283653,11056] 0
2026-03-09T19:27:01.199 INFO:tasks.workunit.client.1.vm08.stdout:3/275: dread d0/d8/f4c [0,4194304] 0
2026-03-09T19:27:01.209 INFO:tasks.workunit.client.1.vm08.stdout:4/231: write da/d10/f1f [2649710,21066] 0
2026-03-09T19:27:01.213 INFO:tasks.workunit.client.1.vm08.stdout:2/232: dwrite d3/d9/dc/de/d18/f1a [0,4194304] 0
2026-03-09T19:27:01.216 INFO:tasks.workunit.client.1.vm08.stdout:3/276: creat d0/d6/de/d15/f53 x:0 0 0
2026-03-09T19:27:01.217 INFO:tasks.workunit.client.1.vm08.stdout:3/277: truncate d0/d8/d19/f41 831521 0
2026-03-09T19:27:01.218 INFO:tasks.workunit.client.1.vm08.stdout:4/232: creat da/d10/d26/d27/d32/f45 x:0 0 0
2026-03-09T19:27:01.220 INFO:tasks.workunit.client.1.vm08.stdout:4/233: write da/d10/f2e [1244459,57627] 0
2026-03-09T19:27:01.224 INFO:tasks.workunit.client.1.vm08.stdout:2/233: symlink d3/d9/l51 0
2026-03-09T19:27:01.224 INFO:tasks.workunit.client.1.vm08.stdout:3/278: getdents d0/d52 0
2026-03-09T19:27:01.225 INFO:tasks.workunit.client.1.vm08.stdout:2/234: creat d3/d9/d26/f52 x:0 0 0
2026-03-09T19:27:01.226 INFO:tasks.workunit.client.1.vm08.stdout:2/235: chown d3/d4/d23/d2c 21620795 1
2026-03-09T19:27:01.226 INFO:tasks.workunit.client.1.vm08.stdout:2/236: chown d3/d9/dc/de/l16 0 1
2026-03-09T19:27:01.234 INFO:tasks.workunit.client.1.vm08.stdout:1/352: sync
2026-03-09T19:27:01.244 INFO:tasks.workunit.client.1.vm08.stdout:8/299: sync
2026-03-09T19:27:01.244 INFO:tasks.workunit.client.1.vm08.stdout:8/300: chown de/d1d/d2e/f56 82451 1
2026-03-09T19:27:01.246 INFO:tasks.workunit.client.1.vm08.stdout:8/301: mkdir de/d1d/d69 0
2026-03-09T19:27:01.249 INFO:tasks.workunit.client.1.vm08.stdout:8/302: dwrite de/d1d/d21/f23 [0,4194304] 0
2026-03-09T19:27:01.257 INFO:tasks.workunit.client.1.vm08.stdout:8/303: dwrite de/d1d/d2e/d5f/f4e [0,4194304] 0
2026-03-09T19:27:01.267 INFO:tasks.workunit.client.1.vm08.stdout:8/304: getdents de/d47 0
2026-03-09T19:27:01.270 INFO:tasks.workunit.client.1.vm08.stdout:8/305: creat de/d25/d33/f6a x:0 0 0
2026-03-09T19:27:01.272 INFO:tasks.workunit.client.1.vm08.stdout:8/306: creat de/d1d/d4f/f6b x:0 0 0
2026-03-09T19:27:01.273 INFO:tasks.workunit.client.1.vm08.stdout:8/307: symlink de/d1d/d2e/d5f/l6c 0
2026-03-09T19:27:01.323 INFO:tasks.workunit.client.1.vm08.stdout:7/223: rmdir d5/d14 39
2026-03-09T19:27:01.341 INFO:tasks.workunit.client.1.vm08.stdout:7/224: dread d5/d12/f19 [0,4194304] 0
2026-03-09T19:27:01.343 INFO:tasks.workunit.client.1.vm08.stdout:7/225: creat d5/d14/d38/f4c x:0 0 0
2026-03-09T19:27:01.344 INFO:tasks.workunit.client.1.vm08.stdout:5/229: rename d16/l32 to d16/d1e/d30/l4b 0
2026-03-09T19:27:01.346 INFO:tasks.workunit.client.1.vm08.stdout:5/230: read d16/d1e/f27 [209172,72283] 0
2026-03-09T19:27:01.346 INFO:tasks.workunit.client.1.vm08.stdout:5/231: read d16/f34 [1353863,61672] 0
2026-03-09T19:27:01.347 INFO:tasks.workunit.client.1.vm08.stdout:7/226: creat d5/f4d x:0 0 0
2026-03-09T19:27:01.348 INFO:tasks.workunit.client.1.vm08.stdout:6/248: rmdir d3/d34 39
2026-03-09T19:27:01.349 INFO:tasks.workunit.client.1.vm08.stdout:6/249: fdatasync d3/db/d24/f39 0
2026-03-09T19:27:01.349 INFO:tasks.workunit.client.1.vm08.stdout:6/250: chown d3/db/f14 7463 1
2026-03-09T19:27:01.350 INFO:tasks.workunit.client.1.vm08.stdout:6/251: chown d3/db/f42 46726270 1
2026-03-09T19:27:01.352 INFO:tasks.workunit.client.1.vm08.stdout:5/232: write d16/d1e/d30/f39 [485970,40625] 0
2026-03-09T19:27:01.353 INFO:tasks.workunit.client.1.vm08.stdout:5/233: write d16/d1e/f42 [58449,77215] 0
2026-03-09T19:27:01.358 INFO:tasks.workunit.client.1.vm08.stdout:9/233: rename d0/d3/c59 to d0/d2/d8/c5a 0
2026-03-09T19:27:01.361 INFO:tasks.workunit.client.1.vm08.stdout:9/234: dwrite d0/d2/f2a [0,4194304] 0
2026-03-09T19:27:01.363 INFO:tasks.workunit.client.1.vm08.stdout:9/235: write d0/d2/f1a [524582,25130] 0
2026-03-09T19:27:01.364 INFO:tasks.workunit.client.1.vm08.stdout:9/236: readlink d0/d2/d8/l3a 0
2026-03-09T19:27:01.380 INFO:tasks.workunit.client.1.vm08.stdout:5/234: unlink d16/d1e/f42 0
2026-03-09T19:27:01.381 INFO:tasks.workunit.client.1.vm08.stdout:7/227: mkdir d5/d14/d4e 0
2026-03-09T19:27:01.381 INFO:tasks.workunit.client.1.vm08.stdout:7/228: dread - d5/d14/d38/f47 zero size
2026-03-09T19:27:01.387 INFO:tasks.workunit.client.1.vm08.stdout:1/353: rename d9/da/d2c/f38 to d9/da/d2c/d6a/f6b 0
2026-03-09T19:27:01.403 INFO:tasks.workunit.client.1.vm08.stdout:1/354: write d9/da/f1e [9237758,119017] 0
2026-03-09T19:27:01.404 INFO:tasks.workunit.client.1.vm08.stdout:3/279: mkdir d0/d6/de/d54 0
2026-03-09T19:27:01.404 INFO:tasks.workunit.client.1.vm08.stdout:3/280: chown d0/d8/d19/f44 4 1
2026-03-09T19:27:01.404 INFO:tasks.workunit.client.1.vm08.stdout:7/229: chown d5/c18 47 1
2026-03-09T19:27:01.404 INFO:tasks.workunit.client.1.vm08.stdout:7/230: write d5/f9 [363655,291] 0
2026-03-09T19:27:01.404 INFO:tasks.workunit.client.1.vm08.stdout:7/231: chown d5/d14/d2b 13294199 1
2026-03-09T19:27:01.404 INFO:tasks.workunit.client.1.vm08.stdout:0/239: dwrite dd/f19 [0,4194304] 0
2026-03-09T19:27:01.404 INFO:tasks.workunit.client.1.vm08.stdout:0/240: rename dd/d22 to dd/d22/d27/d2e/d4d 22
2026-03-09T19:27:01.404 INFO:tasks.workunit.client.1.vm08.stdout:0/241: write dd/d22/d27/d2e/d37/f40 [164509,52795] 0
2026-03-09T19:27:01.404 INFO:tasks.workunit.client.1.vm08.stdout:9/237: sync
2026-03-09T19:27:01.412 INFO:tasks.workunit.client.1.vm08.stdout:0/242: dread dd/d22/d27/d2e/d37/f40 [0,4194304] 0
2026-03-09T19:27:01.424 INFO:tasks.workunit.client.1.vm08.stdout:4/234: getdents da/d10/d26/d27/d32 0
2026-03-09T19:27:01.425 INFO:tasks.workunit.client.1.vm08.stdout:4/235: chown da/d10/f1c 9404309 1
2026-03-09T19:27:01.425 INFO:tasks.workunit.client.1.vm08.stdout:4/236: readlink da/d10/d26/l33 0
2026-03-09T19:27:01.429 INFO:tasks.workunit.client.1.vm08.stdout:2/237: dwrite d3/d4/fd [0,4194304] 0
2026-03-09T19:27:01.438 INFO:tasks.workunit.client.1.vm08.stdout:5/235: link d16/d1e/f37 d16/d1e/d3b/f4c 0
2026-03-09T19:27:01.441 INFO:tasks.workunit.client.1.vm08.stdout:7/232: symlink d5/d16/d3a/l4f 0
2026-03-09T19:27:01.443 INFO:tasks.workunit.client.1.vm08.stdout:7/233: dread d5/d14/d2b/f32 [0,4194304] 0
2026-03-09T19:27:01.448 INFO:tasks.workunit.client.1.vm08.stdout:9/238: read d0/f13 [207024,89324] 0
2026-03-09T19:27:01.452 INFO:tasks.workunit.client.1.vm08.stdout:9/239: stat d0/d1b/c37 0
2026-03-09T19:27:01.452 INFO:tasks.workunit.client.1.vm08.stdout:9/240: chown d0/d2/d8/d7/f22 11028 1
2026-03-09T19:27:01.452 INFO:tasks.workunit.client.1.vm08.stdout:0/243: rename l9 to dd/d22/d24/d49/l4e 0
2026-03-09T19:27:01.452 INFO:tasks.workunit.client.1.vm08.stdout:0/244: dread - dd/d22/d27/d2e/d37/f46 zero size
2026-03-09T19:27:01.453 INFO:tasks.workunit.client.1.vm08.stdout:1/355: mkdir d9/da/d53/d67/d6c 0
2026-03-09T19:27:01.454 INFO:tasks.workunit.client.1.vm08.stdout:6/252: getdents d3/db/d43 0
2026-03-09T19:27:01.454 INFO:tasks.workunit.client.1.vm08.stdout:4/237: mkdir da/d10/d16/d28/d46 0
2026-03-09T19:27:01.455 INFO:tasks.workunit.client.1.vm08.stdout:6/253: write d3/db/d24/f39 [61546,69113] 0
2026-03-09T19:27:01.458 INFO:tasks.workunit.client.1.vm08.stdout:1/356: dwrite d9/d11/f44 [0,4194304] 0
2026-03-09T19:27:01.468 INFO:tasks.workunit.client.1.vm08.stdout:6/254: dread d3/db/f42 [0,4194304] 0
2026-03-09T19:27:01.473 INFO:tasks.workunit.client.1.vm08.stdout:2/238: mknod d3/d9/dc/de/d18/d1f/c53 0
2026-03-09T19:27:01.476 INFO:tasks.workunit.client.1.vm08.stdout:3/281: symlink d0/l55 0
2026-03-09T19:27:01.477 INFO:tasks.workunit.client.1.vm08.stdout:2/239: readlink d3/d9/dc/de/l16 0
2026-03-09T19:27:01.477 INFO:tasks.workunit.client.1.vm08.stdout:2/240: truncate d3/d9/d26/f52 774659 0
2026-03-09T19:27:01.477 INFO:tasks.workunit.client.1.vm08.stdout:7/234: creat d5/d16/f50 x:0 0 0
2026-03-09T19:27:01.482 INFO:tasks.workunit.client.1.vm08.stdout:9/241: rmdir d0/d2/d14 39
2026-03-09T19:27:01.483 INFO:tasks.workunit.client.1.vm08.stdout:9/242: write d0/d1b/f49 [816289,115432] 0
2026-03-09T19:27:01.488 INFO:tasks.workunit.client.1.vm08.stdout:8/308: write de/d1d/d2e/f61 [4898681,58342] 0
2026-03-09T19:27:01.489 INFO:tasks.workunit.client.1.vm08.stdout:8/309: dread - de/d1d/d21/f62 zero size
2026-03-09T19:27:01.493 INFO:tasks.workunit.client.1.vm08.stdout:8/310: dwrite de/d1d/d21/f62 [0,4194304] 0
2026-03-09T19:27:01.504 INFO:tasks.workunit.client.1.vm08.stdout:0/245: unlink dd/d31/c32 0
2026-03-09T19:27:01.514 INFO:tasks.workunit.client.1.vm08.stdout:3/282: truncate d0/d8/d19/f38 1437009 0
2026-03-09T19:27:01.517 INFO:tasks.workunit.client.1.vm08.stdout:3/283: chown d0/l55 218432 1
2026-03-09T19:27:01.517 INFO:tasks.workunit.client.1.vm08.stdout:6/255: dwrite d3/f3e [0,4194304] 0
2026-03-09T19:27:01.529 INFO:tasks.workunit.client.1.vm08.stdout:4/238: symlink da/l47 0
2026-03-09T19:27:01.535 INFO:tasks.workunit.client.1.vm08.stdout:1/357: symlink d9/da/d53/d67/d6c/l6d 0
2026-03-09T19:27:01.539 INFO:tasks.workunit.client.1.vm08.stdout:1/358: dwrite d9/da/d2c/f58 [0,4194304] 0
2026-03-09T19:27:01.543 INFO:tasks.workunit.client.1.vm08.stdout:1/359: chown d9/da/dc/f10 40389145 1
2026-03-09T19:27:01.553 INFO:tasks.workunit.client.1.vm08.stdout:7/235: symlink d5/d12/d3f/l51 0
2026-03-09T19:27:01.555 INFO:tasks.workunit.client.1.vm08.stdout:9/243: mknod d0/d1b/d4e/c5b 0
2026-03-09T19:27:01.556 INFO:tasks.workunit.client.1.vm08.stdout:9/244: read - d0/d1b/f3e zero size
2026-03-09T19:27:01.557 INFO:tasks.workunit.client.1.vm08.stdout:7/236: dwrite d5/f20 [0,4194304] 0
2026-03-09T19:27:01.559 INFO:tasks.workunit.client.1.vm08.stdout:8/311: mkdir de/d25/d33/d46/d67/d6d 0
2026-03-09T19:27:01.567 INFO:tasks.workunit.client.1.vm08.stdout:7/237: dread - d5/d14/d27/f3d zero size
2026-03-09T19:27:01.567 INFO:tasks.workunit.client.1.vm08.stdout:0/246: mkdir dd/d22/d27/d4f 0
2026-03-09T19:27:01.567 INFO:tasks.workunit.client.1.vm08.stdout:0/247: readlink dd/d31/l3b 0
2026-03-09T19:27:01.574 INFO:tasks.workunit.client.1.vm08.stdout:4/239: symlink da/d10/d16/l48 0
2026-03-09T19:27:01.577 INFO:tasks.workunit.client.1.vm08.stdout:2/241: truncate d3/d9/dc/de/d18/d1f/f3a 3043229 0
2026-03-09T19:27:01.580 INFO:tasks.workunit.client.1.vm08.stdout:1/360: fdatasync d9/d11/f3c 0
2026-03-09T19:27:01.588 INFO:tasks.workunit.client.1.vm08.stdout:3/284: unlink d0/c27 0
2026-03-09T19:27:01.588 INFO:tasks.workunit.client.1.vm08.stdout:1/361: dwrite d9/d11/f3c [0,4194304] 0
2026-03-09T19:27:01.592 INFO:tasks.workunit.client.1.vm08.stdout:9/245: dwrite d0/d2/d14/f31 [0,4194304] 0
2026-03-09T19:27:01.604 INFO:tasks.workunit.client.1.vm08.stdout:8/312: fsync de/f20 0
2026-03-09T19:27:01.604 INFO:tasks.workunit.client.1.vm08.stdout:7/238: mkdir d5/d14/d38/d52 0
2026-03-09T19:27:01.606 INFO:tasks.workunit.client.1.vm08.stdout:5/236: rename d16/d1e/d3b/f4c to d16/f4d 0
2026-03-09T19:27:01.617 INFO:tasks.workunit.client.1.vm08.stdout:2/242: mkdir d3/d9/dc/de/d18/d54 0
2026-03-09T19:27:01.617 INFO:tasks.workunit.client.1.vm08.stdout:2/243: chown d3/d9/dc/de/d18 116679 1
2026-03-09T19:27:01.621 INFO:tasks.workunit.client.1.vm08.stdout:6/256: creat d3/f5f x:0 0 0
2026-03-09T19:27:01.621 INFO:tasks.workunit.client.1.vm08.stdout:6/257: readlink d3/db/d24/l2e 0
2026-03-09T19:27:01.622 INFO:tasks.workunit.client.1.vm08.stdout:3/285: creat d0/d6/d25/f56 x:0 0 0
2026-03-09T19:27:01.622 INFO:tasks.workunit.client.1.vm08.stdout:3/286: fsync d0/d6/de/d1b/d16/d18/f2c 0
2026-03-09T19:27:01.623 INFO:tasks.workunit.client.1.vm08.stdout:3/287: chown d0/d8/f4a 2310 1
2026-03-09T19:27:01.624 INFO:tasks.workunit.client.1.vm08.stdout:1/362: symlink d9/da/d12/d39/l6e 0
2026-03-09T19:27:01.627 INFO:tasks.workunit.client.1.vm08.stdout:7/239: symlink d5/d12/d3f/l53 0
2026-03-09T19:27:01.629 INFO:tasks.workunit.client.1.vm08.stdout:8/313: creat de/d1d/d2e/f6e x:0 0 0
2026-03-09T19:27:01.637 INFO:tasks.workunit.client.1.vm08.stdout:9/246: rename d0/d3 to d0/d2/d14/d5c 0
2026-03-09T19:27:01.656 INFO:tasks.workunit.client.1.vm08.stdout:4/240: mkdir da/d10/d26/d3a/d49 0
2026-03-09T19:27:01.663 INFO:tasks.workunit.client.1.vm08.stdout:3/288: dwrite d0/d8/f4c [0,4194304] 0
2026-03-09T19:27:01.664 INFO:tasks.workunit.client.1.vm08.stdout:7/240: mkdir d5/d14/d27/d54 0
2026-03-09T19:27:01.667 INFO:tasks.workunit.client.1.vm08.stdout:0/248: truncate dd/d22/d27/f3f 3526078 0
2026-03-09T19:27:01.667 INFO:tasks.workunit.client.1.vm08.stdout:0/249: write dd/f12 [461833,117520] 0
2026-03-09T19:27:01.668 INFO:tasks.workunit.client.1.vm08.stdout:5/237: chown d16/d1e/f37 260652 1
2026-03-09T19:27:01.670 INFO:tasks.workunit.client.1.vm08.stdout:9/247: unlink d0/d2/d8/d7/f23 0
2026-03-09T19:27:01.672 INFO:tasks.workunit.client.1.vm08.stdout:7/241: dwrite d5/fa [0,4194304] 0
2026-03-09T19:27:01.684 INFO:tasks.workunit.client.1.vm08.stdout:4/241: creat da/d14/d2c/f4a x:0 0 0
2026-03-09T19:27:01.695 INFO:tasks.workunit.client.1.vm08.stdout:0/250: mkdir dd/d22/d24/d49/d50 0
2026-03-09T19:27:01.699 INFO:tasks.workunit.client.1.vm08.stdout:9/248: rename d0/d2/d14/d47 to d0/d2/d8/d7/d48/d5d 0
2026-03-09T19:27:01.701 INFO:tasks.workunit.client.1.vm08.stdout:4/242: creat da/d10/d16/f4b x:0 0 0
2026-03-09T19:27:01.703 INFO:tasks.workunit.client.1.vm08.stdout:2/244: rmdir d3/d9/dc/de/d18/d54 0
2026-03-09T19:27:01.705 INFO:tasks.workunit.client.1.vm08.stdout:6/258: symlink d3/d34/d5c/l60 0
2026-03-09T19:27:01.706 INFO:tasks.workunit.client.1.vm08.stdout:1/363: creat d9/da/f6f x:0 0 0
2026-03-09T19:27:01.710 INFO:tasks.workunit.client.1.vm08.stdout:8/314: link de/c18 de/d1d/d2e/d5f/c6f 0
2026-03-09T19:27:01.710 INFO:tasks.workunit.client.1.vm08.stdout:9/249: mkdir d0/d2/d8/d7/d48/d5e 0
2026-03-09T19:27:01.713 INFO:tasks.workunit.client.1.vm08.stdout:9/250: dwrite d0/d2/d14/d5c/fd [0,4194304] 0
2026-03-09T19:27:01.715 INFO:tasks.workunit.client.1.vm08.stdout:7/242: getdents d5/d14/d27/d54 0
2026-03-09T19:27:01.721 INFO:tasks.workunit.client.1.vm08.stdout:9/251: dwrite d0/d1b/f49 [0,4194304] 0
2026-03-09T19:27:01.723 INFO:tasks.workunit.client.1.vm08.stdout:9/252: write d0/d2/d8/d7/f34 [609840,42817] 0
2026-03-09T19:27:01.724 INFO:tasks.workunit.client.1.vm08.stdout:9/253: write d0/d2/d14/f3b [694016,106692] 0
2026-03-09T19:27:01.731 INFO:tasks.workunit.client.1.vm08.stdout:4/243: mknod da/d14/d2c/c4c 0
2026-03-09T19:27:01.733 INFO:tasks.workunit.client.1.vm08.stdout:1/364: creat d9/d40/d49/f70 x:0 0 0
2026-03-09T19:27:01.737 INFO:tasks.workunit.client.1.vm08.stdout:5/238: link d16/d45/f4a d16/f4e 0
2026-03-09T19:27:01.739 INFO:tasks.workunit.client.1.vm08.stdout:7/243: creat d5/d16/f55 x:0 0 0
2026-03-09T19:27:01.753 INFO:tasks.workunit.client.1.vm08.stdout:7/244: creat d5/d16/d3a/f56 x:0 0 0
2026-03-09T19:27:01.753 INFO:tasks.workunit.client.1.vm08.stdout:7/245: read - d5/d14/d38/f40 zero size
2026-03-09T19:27:01.754 INFO:tasks.workunit.client.1.vm08.stdout:2/245: creat d3/d4/f55 x:0 0 0
2026-03-09T19:27:01.755 INFO:tasks.workunit.client.1.vm08.stdout:2/246: readlink d3/d4/l34 0
2026-03-09T19:27:01.757 INFO:tasks.workunit.client.1.vm08.stdout:6/259: creat d3/f61 x:0 0 0
2026-03-09T19:27:01.759 INFO:tasks.workunit.client.1.vm08.stdout:0/251: getdents dd/d22/d24 0
2026-03-09T19:27:01.759 INFO:tasks.workunit.client.1.vm08.stdout:0/252: chown dd/d22/d27/c47 2474000 1
2026-03-09T19:27:01.760 INFO:tasks.workunit.client.1.vm08.stdout:0/253: write dd/d22/d24/f26 [1557202,108940] 0
2026-03-09T19:27:01.761 INFO:tasks.workunit.client.1.vm08.stdout:0/254: dread - dd/d22/d27/d2e/d37/f46 zero size
2026-03-09T19:27:01.767 INFO:tasks.workunit.client.1.vm08.stdout:6/260: rename d3/db/d24/d4b/c52 to d3/d34/d5c/c62 0
2026-03-09T19:27:01.768 INFO:tasks.workunit.client.1.vm08.stdout:0/255: creat dd/d22/d27/d2e/f51 x:0 0 0
2026-03-09T19:27:01.769 INFO:tasks.workunit.client.1.vm08.stdout:0/256: write dd/d22/d24/f26 [2124298,67505] 0
2026-03-09T19:27:01.772 INFO:tasks.workunit.client.1.vm08.stdout:0/257: dwrite dd/d22/d24/f26 [0,4194304] 0
2026-03-09T19:27:01.798 INFO:tasks.workunit.client.1.vm08.stdout:7/246: symlink d5/l57 0
2026-03-09T19:27:01.802 INFO:tasks.workunit.client.1.vm08.stdout:6/261: mknod d3/d34/d5c/c63 0
2026-03-09T19:27:01.806 INFO:tasks.workunit.client.1.vm08.stdout:6/262: stat d3/d15/c47 0
2026-03-09T19:27:01.809 INFO:tasks.workunit.client.1.vm08.stdout:8/315: dwrite de/f20 [0,4194304] 0
2026-03-09T19:27:01.817 INFO:tasks.workunit.client.1.vm08.stdout:9/254: write d0/f13 [1378147,79853] 0
2026-03-09T19:27:01.821 INFO:tasks.workunit.client.1.vm08.stdout:1/365: write d9/da/f2f [4152650,61119] 0
2026-03-09T19:27:01.825 INFO:tasks.workunit.client.1.vm08.stdout:5/239: write d16/f20 [1946273,33418] 0
2026-03-09T19:27:01.829 INFO:tasks.workunit.client.1.vm08.stdout:6/263: stat d3/l31 0
2026-03-09T19:27:01.830 INFO:tasks.workunit.client.1.vm08.stdout:6/264: write d3/d34/f35 [2922564,29547] 0
2026-03-09T19:27:01.835 INFO:tasks.workunit.client.1.vm08.stdout:8/316: symlink de/d25/d33/d46/d67/l70 0
2026-03-09T19:27:01.841 INFO:tasks.workunit.client.1.vm08.stdout:8/317: dwrite de/f37 [0,4194304] 0
2026-03-09T19:27:01.844 INFO:tasks.workunit.client.1.vm08.stdout:9/255: symlink d0/d2/d8/d7/l5f 0
2026-03-09T19:27:01.846 INFO:tasks.workunit.client.1.vm08.stdout:4/244: dwrite da/d10/d1b/f37 [0,4194304] 0
2026-03-09T19:27:01.847 INFO:tasks.workunit.client.1.vm08.stdout:4/245: write da/d10/f3d [700665,130617] 0
2026-03-09T19:27:01.858 INFO:tasks.workunit.client.1.vm08.stdout:1/366: symlink d9/da/dc/l71 0
2026-03-09T19:27:01.865 INFO:tasks.workunit.client.1.vm08.stdout:2/247: truncate d3/f7 3907563 0
2026-03-09T19:27:01.865 INFO:tasks.workunit.client.1.vm08.stdout:6/265: stat d3/c1d 0
2026-03-09T19:27:01.866 INFO:tasks.workunit.client.1.vm08.stdout:6/266: fsync d3/d34/d3b/f58 0
2026-03-09T19:27:01.871 INFO:tasks.workunit.client.1.vm08.stdout:4/246: mkdir da/d10/d16/d28/d4d 0
2026-03-09T19:27:01.872 INFO:tasks.workunit.client.1.vm08.stdout:0/258: link dd/l30 dd/d22/d27/d2e/l52 0
2026-03-09T19:27:01.872 INFO:tasks.workunit.client.1.vm08.stdout:4/247: truncate da/d10/d26/d27/d32/f39 1177660 0
2026-03-09T19:27:01.876 INFO:tasks.workunit.client.1.vm08.stdout:7/247: rmdir d5/d14/d38/d52 0
2026-03-09T19:27:01.887 INFO:tasks.workunit.client.1.vm08.stdout:7/248: fdatasync d5/d14/f1e 0
2026-03-09T19:27:01.887 INFO:tasks.workunit.client.1.vm08.stdout:7/249: chown d5/d16/f4a 28210454 1
2026-03-09T19:27:01.887 INFO:tasks.workunit.client.1.vm08.stdout:7/250: chown d5/d16/l2d 1186 1
2026-03-09T19:27:01.887 INFO:tasks.workunit.client.1.vm08.stdout:7/251: chown d5/d14/c31 14492 1
2026-03-09T19:27:01.887 INFO:tasks.workunit.client.1.vm08.stdout:6/267: creat d3/d15/f64 x:0 0 0
2026-03-09T19:27:01.887 INFO:tasks.workunit.client.1.vm08.stdout:8/318: truncate f1 150850 0
2026-03-09T19:27:01.887 INFO:tasks.workunit.client.1.vm08.stdout:0/259: creat dd/d22/d27/d2e/f53 x:0 0 0
2026-03-09T19:27:01.887 INFO:tasks.workunit.client.1.vm08.stdout:7/252: dread d5/f1a [0,4194304] 0
2026-03-09T19:27:01.891 INFO:tasks.workunit.client.1.vm08.stdout:6/268: creat d3/db/d24/d4b/f65 x:0 0 0
2026-03-09T19:27:01.895 INFO:tasks.workunit.client.1.vm08.stdout:5/240: sync
2026-03-09T19:27:01.896 INFO:tasks.workunit.client.1.vm08.stdout:6/269: dwrite d3/d15/f19 [0,4194304] 0
2026-03-09T19:27:01.898 INFO:tasks.workunit.client.1.vm08.stdout:6/270: chown d3/l57 57 1
2026-03-09T19:27:01.904 INFO:tasks.workunit.client.1.vm08.stdout:0/260: creat dd/d31/f54 x:0 0 0
2026-03-09T19:27:01.907 INFO:tasks.workunit.client.1.vm08.stdout:1/367: creat d9/da/d12/f72 x:0 0 0
2026-03-09T19:27:01.914 INFO:tasks.workunit.client.1.vm08.stdout:7/253: dread d5/d16/f23 [0,4194304] 0
2026-03-09T19:27:01.915 INFO:tasks.workunit.client.1.vm08.stdout:7/254: dread - d5/d16/f49 zero size
2026-03-09T19:27:01.921 INFO:tasks.workunit.client.1.vm08.stdout:9/256: link d0/l12 d0/d2/d14/d5c/d32/l60 0
2026-03-09T19:27:01.926 INFO:tasks.workunit.client.1.vm08.stdout:0/261: symlink dd/d31/l55 0
2026-03-09T19:27:01.926 INFO:tasks.workunit.client.1.vm08.stdout:0/262: dwrite dd/d22/d24/d49/f4c [0,4194304] 0
2026-03-09T19:27:01.929 INFO:tasks.workunit.client.1.vm08.stdout:1/368: creat d9/d11/f73 x:0 0 0
2026-03-09T19:27:01.934 INFO:tasks.workunit.client.1.vm08.stdout:6/271: fsync d3/f5 0
2026-03-09T19:27:01.943 INFO:tasks.workunit.client.1.vm08.stdout:8/319: link de/d25/f64 de/d25/f71 0
2026-03-09T19:27:01.943 INFO:tasks.workunit.client.1.vm08.stdout:8/320: stat de 0
2026-03-09T19:27:01.944 INFO:tasks.workunit.client.1.vm08.stdout:6/272: dread d3/d15/f1c [0,4194304] 0
2026-03-09T19:27:01.948 INFO:tasks.workunit.client.1.vm08.stdout:2/248: fsync d3/f7 0
2026-03-09T19:27:01.958 INFO:tasks.workunit.client.1.vm08.stdout:3/289: write d0/d8/d19/f38 [2470310,104011] 0
2026-03-09T19:27:01.958 INFO:tasks.workunit.client.1.vm08.stdout:4/248: getdents da/d10/d26/d3a 0
2026-03-09T19:27:01.958 INFO:tasks.workunit.client.1.vm08.stdout:0/263: symlink dd/d22/d24/d49/l56 0
2026-03-09T19:27:01.958 INFO:tasks.workunit.client.1.vm08.stdout:7/255: creat d5/d14/d27/d54/f58 x:0 0 0
2026-03-09T19:27:01.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:01 vm07.local ceph-mon[48545]: pgmap v155: 65 pgs: 65 active+clean; 535 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 2.7 MiB/s rd, 54 MiB/s wr, 287 op/s
2026-03-09T19:27:01.981 INFO:tasks.workunit.client.1.vm08.stdout:8/321: creat de/d1d/d21/f72 x:0 0 0
2026-03-09T19:27:01.982 INFO:tasks.workunit.client.1.vm08.stdout:2/249: mknod d3/d9/dc/de/c56 0
2026-03-09T19:27:01.988 INFO:tasks.workunit.client.1.vm08.stdout:5/241: getdents d16/d1e/d3b 0
2026-03-09T19:27:01.991 INFO:tasks.workunit.client.1.vm08.stdout:8/322: rmdir de/d1d 39
2026-03-09T19:27:01.996 INFO:tasks.workunit.client.1.vm08.stdout:9/257: link d0/d2/f21 d0/d2/d8/f61 0
2026-03-09T19:27:01.998 INFO:tasks.workunit.client.1.vm08.stdout:4/249: unlink l0 0
2026-03-09T19:27:02.003 INFO:tasks.workunit.client.1.vm08.stdout:1/369: link d9/da/d17/c3b d9/da/d53/d67/d6c/c74 0
2026-03-09T19:27:02.005 INFO:tasks.workunit.client.1.vm08.stdout:5/242: mknod d16/d1e/d3b/c4f 0
2026-03-09T19:27:02.006 INFO:tasks.workunit.client.1.vm08.stdout:8/323: truncate de/f16 902384 0
2026-03-09T19:27:02.013 INFO:tasks.workunit.client.1.vm08.stdout:0/264: dwrite dd/d22/d27/f3f [0,4194304] 0
2026-03-09T19:27:02.015 INFO:tasks.workunit.client.1.vm08.stdout:0/265: stat dd/d22/d24/d49/f4c 0
2026-03-09T19:27:02.016 INFO:tasks.workunit.client.1.vm08.stdout:0/266: truncate dd/d22/d27/d2e/d37/f44 1027131 0
2026-03-09T19:27:02.021 INFO:tasks.workunit.client.1.vm08.stdout:6/273: link d3/l31 d3/db/d24/d4b/l66 0
2026-03-09T19:27:02.021 INFO:tasks.workunit.client.1.vm08.stdout:2/250: mknod d3/d9/dc/c57 0
2026-03-09T19:27:02.022 INFO:tasks.workunit.client.1.vm08.stdout:6/274: write d3/d15/f40 [71510,3967] 0
2026-03-09T19:27:02.026 INFO:tasks.workunit.client.1.vm08.stdout:2/251: dwrite d3/d4/f6 [4194304,4194304] 0
2026-03-09T19:27:02.028 INFO:tasks.workunit.client.1.vm08.stdout:2/252: chown d3/d4/d23/d2c 118693 1
2026-03-09T19:27:02.030 INFO:tasks.workunit.client.1.vm08.stdout:2/253: read d3/d9/f1e [181627,43861] 0
2026-03-09T19:27:02.031 INFO:tasks.workunit.client.1.vm08.stdout:2/254: stat d3/d4/d23/d2c/c2a 0
2026-03-09T19:27:02.031 INFO:tasks.workunit.client.1.vm08.stdout:2/255: write d3/f45 [1572132,50881] 0
2026-03-09T19:27:02.036 INFO:tasks.workunit.client.1.vm08.stdout:2/256: dwrite d3/d4/f48 [0,4194304] 0
2026-03-09T19:27:02.044 INFO:tasks.workunit.client.1.vm08.stdout:9/258: unlink d0/d2/d8/l3a 0
2026-03-09T19:27:02.045 INFO:tasks.workunit.client.1.vm08.stdout:4/250: symlink da/d10/d26/d3a/l4e 0
2026-03-09T19:27:02.045 INFO:tasks.workunit.client.1.vm08.stdout:7/256: creat d5/d14/f59 x:0 0 0
2026-03-09T19:27:02.046 INFO:tasks.workunit.client.1.vm08.stdout:4/251: write da/d10/d16/f4b [120487,73570] 0
2026-03-09T19:27:02.046 INFO:tasks.workunit.client.1.vm08.stdout:4/252: write f1 [4331044,128594] 0
2026-03-09T19:27:02.058 INFO:tasks.workunit.client.1.vm08.stdout:5/243: dread d16/d1e/d3b/f3c [0,4194304] 0
2026-03-09T19:27:02.059 INFO:tasks.workunit.client.1.vm08.stdout:5/244: truncate d16/d1e/f44 794063 0
2026-03-09T19:27:02.060 INFO:tasks.workunit.client.1.vm08.stdout:5/245: chown d16/d1e/f44 3974065 1
2026-03-09T19:27:02.061 INFO:tasks.workunit.client.1.vm08.stdout:0/267: mkdir dd/d22/d24/d57 0
2026-03-09T19:27:02.064 INFO:tasks.workunit.client.1.vm08.stdout:6/275: fdatasync d3/f9 0
2026-03-09T19:27:02.071 INFO:tasks.workunit.client.1.vm08.stdout:6/276: write d3/fc [3804086,93569] 0
2026-03-09T19:27:02.071 INFO:tasks.workunit.client.1.vm08.stdout:6/277: fdatasync d3/f3e 0
2026-03-09T19:27:02.071 INFO:tasks.workunit.client.1.vm08.stdout:3/290: creat d0/d6/f57 x:0 0 0
2026-03-09T19:27:02.071 INFO:tasks.workunit.client.1.vm08.stdout:3/291: read - d0/d6/d25/f56 zero size
2026-03-09T19:27:02.071 INFO:tasks.workunit.client.1.vm08.stdout:9/259: stat d0/d2/d8/d7/c1e 0
2026-03-09T19:27:02.071 INFO:tasks.workunit.client.1.vm08.stdout:7/257: creat d5/d16/d1c/f5a x:0 0 0
2026-03-09T19:27:02.078 INFO:tasks.workunit.client.1.vm08.stdout:8/324: mkdir de/d1d/d21/d73 0
2026-03-09T19:27:02.082 INFO:tasks.workunit.client.1.vm08.stdout:5/246: creat d16/d1e/d3b/f50 x:0 0 0
2026-03-09T19:27:02.082 INFO:tasks.workunit.client.1.vm08.stdout:5/247: stat d16/f2a 0
2026-03-09T19:27:02.082 INFO:tasks.workunit.client.1.vm08.stdout:5/248: chown f2 0 1
2026-03-09T19:27:02.089 INFO:tasks.workunit.client.1.vm08.stdout:6/278: creat d3/d34/d3b/f67 x:0 0 0
2026-03-09T19:27:02.095 INFO:tasks.workunit.client.1.vm08.stdout:2/257: unlink d3/d9/dc/de/l16 0
2026-03-09T19:27:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:01 vm08.local ceph-mon[57794]: pgmap v155: 65 pgs: 65 active+clean; 535 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 2.7 MiB/s rd, 54 MiB/s wr, 287 op/s
2026-03-09T19:27:02.104 INFO:tasks.workunit.client.1.vm08.stdout:7/258: symlink d5/d16/d3a/l5b 0
2026-03-09T19:27:02.105 INFO:tasks.workunit.client.1.vm08.stdout:7/259: chown d5/d14/d2b/f3e 2089955743 1
2026-03-09T19:27:02.105 INFO:tasks.workunit.client.1.vm08.stdout:7/260: chown d5/fb 558103334 1
2026-03-09T19:27:02.109 INFO:tasks.workunit.client.1.vm08.stdout:8/325: symlink de/d25/d33/d46/d67/l74 0
2026-03-09T19:27:02.110 INFO:tasks.workunit.client.1.vm08.stdout:0/268: sync
2026-03-09T19:27:02.111 INFO:tasks.workunit.client.1.vm08.stdout:0/269: write dd/d22/d27/f42 [1954057,45611] 0
2026-03-09T19:27:02.114 INFO:tasks.workunit.client.1.vm08.stdout:5/249: symlink d16/d1e/d30/l51 0
2026-03-09T19:27:02.119 INFO:tasks.workunit.client.1.vm08.stdout:5/250: dwrite d16/d1e/d3b/f43 [0,4194304] 0
2026-03-09T19:27:02.123 INFO:tasks.workunit.client.1.vm08.stdout:6/279: rmdir d3/d34 39
2026-03-09T19:27:02.124 INFO:tasks.workunit.client.1.vm08.stdout:9/260: fdatasync d0/d2/d8/f29 0
2026-03-09T19:27:02.130 INFO:tasks.workunit.client.1.vm08.stdout:1/370: getdents d9 0
2026-03-09T19:27:02.130 INFO:tasks.workunit.client.1.vm08.stdout:1/371: readlink d9/l46 0
2026-03-09T19:27:02.130 INFO:tasks.workunit.client.1.vm08.stdout:1/372: stat d9/da/d2d/d62 0
2026-03-09T19:27:02.131 INFO:tasks.workunit.client.1.vm08.stdout:1/373: chown d9/da/d53 34371337 1
2026-03-09T19:27:02.138 INFO:tasks.workunit.client.1.vm08.stdout:8/326: rename de/d1d/d21/l29 to de/d25/d33/d46/d67/l75 0
2026-03-09T19:27:02.146 INFO:tasks.workunit.client.1.vm08.stdout:1/374: rename d9/da/d2d/d4e to d9/da/d2d/d4e/d75 22
2026-03-09T19:27:02.146 INFO:tasks.workunit.client.1.vm08.stdout:0/270: mknod dd/d22/d24/d49/c58 0
2026-03-09T19:27:02.149 INFO:tasks.workunit.client.1.vm08.stdout:5/251: unlink d16/d1e/d30/f39 0
2026-03-09T19:27:02.159 INFO:tasks.workunit.client.1.vm08.stdout:2/258: creat d3/d9/dc/d14/f58 x:0 0 0
2026-03-09T19:27:02.162 INFO:tasks.workunit.client.1.vm08.stdout:7/261: symlink d5/d14/d2b/d4b/l5c 0
2026-03-09T19:27:02.167 INFO:tasks.workunit.client.1.vm08.stdout:7/262: dwrite d5/d14/d2b/f37 [0,4194304] 0
2026-03-09T19:27:02.180 INFO:tasks.workunit.client.1.vm08.stdout:4/253: write da/f1d [1871625,94695] 0
2026-03-09T19:27:02.185 INFO:tasks.workunit.client.1.vm08.stdout:1/375: mkdir d9/da/d53/d67/d6c/d76 0
2026-03-09T19:27:02.189 INFO:tasks.workunit.client.1.vm08.stdout:1/376: dwrite f2 [4194304,4194304] 0
2026-03-09T19:27:02.206 INFO:tasks.workunit.client.1.vm08.stdout:5/252: symlink d16/d1e/d30/l52 0
2026-03-09T19:27:02.207 INFO:tasks.workunit.client.1.vm08.stdout:3/292: getdents d0/d8/d24 0
2026-03-09T19:27:02.209 INFO:tasks.workunit.client.1.vm08.stdout:6/280: mkdir d3/d68 0
2026-03-09T19:27:02.216 INFO:tasks.workunit.client.1.vm08.stdout:7/263: rename d5/d12/d3f to d5/d14/d2b/d5d 0
2026-03-09T19:27:02.217 INFO:tasks.workunit.client.1.vm08.stdout:7/264: chown d5/d14/d27/d54/f58 185 1
2026-03-09T19:27:02.225 INFO:tasks.workunit.client.1.vm08.stdout:4/254: mkdir da/d10/d16/d28/d2f/d4f 0
2026-03-09T19:27:02.225 INFO:tasks.workunit.client.1.vm08.stdout:4/255: fdatasync da/d10/f3d 0
2026-03-09T19:27:02.226 INFO:tasks.workunit.client.1.vm08.stdout:4/256: write da/d10/d26/d27/d32/f39 [1284963,35622] 0
2026-03-09T19:27:02.230 INFO:tasks.workunit.client.1.vm08.stdout:4/257: dwrite da/d10/f2e [0,4194304] 0
2026-03-09T19:27:02.231 INFO:tasks.workunit.client.1.vm08.stdout:4/258: fdatasync da/d14/d2c/f36 0
2026-03-09T19:27:02.232 INFO:tasks.workunit.client.1.vm08.stdout:4/259: readlink da/d10/l3c 0
2026-03-09T19:27:02.232 INFO:tasks.workunit.client.1.vm08.stdout:6/281: sync
2026-03-09T19:27:02.233 INFO:tasks.workunit.client.1.vm08.stdout:4/260: dread - da/d14/d40/f41 zero size
2026-03-09T19:27:02.234 INFO:tasks.workunit.client.1.vm08.stdout:4/261: write da/d14/d40/f41 [747104,120642] 0
2026-03-09T19:27:02.240 INFO:tasks.workunit.client.1.vm08.stdout:1/377: unlink d9/l33 0
2026-03-09T19:27:02.248 INFO:tasks.workunit.client.1.vm08.stdout:2/259: creat d3/d9/d4a/f59 x:0 0 0
2026-03-09T19:27:02.254 INFO:tasks.workunit.client.1.vm08.stdout:9/261: creat d0/d2/f62 x:0 0 0
2026-03-09T19:27:02.254 INFO:tasks.workunit.client.1.vm08.stdout:7/265: creat d5/d14/d27/d54/f5e x:0 0 0
2026-03-09T19:27:02.254 INFO:tasks.workunit.client.1.vm08.stdout:7/266: chown d5/d14/d38/f3c 63183 1
2026-03-09T19:27:02.256 INFO:tasks.workunit.client.1.vm08.stdout:8/327: dwrite de/d25/f71 [0,4194304] 0
2026-03-09T19:27:02.270 INFO:tasks.workunit.client.1.vm08.stdout:6/282: mkdir d3/db/d43/d69 0
2026-03-09T19:27:02.271 INFO:tasks.workunit.client.1.vm08.stdout:4/262: mkdir da/d10/d26/d50 0
2026-03-09T19:27:02.273 INFO:tasks.workunit.client.1.vm08.stdout:1/378: creat d9/da/d53/d67/f77 x:0 0 0
2026-03-09T19:27:02.273 INFO:tasks.workunit.client.1.vm08.stdout:1/379: write d9/da/f2f [1854762,20248] 0
2026-03-09T19:27:02.279 INFO:tasks.workunit.client.1.vm08.stdout:7/267: unlink d5/d14/d27/f3d 0
2026-03-09T19:27:02.280 INFO:tasks.workunit.client.1.vm08.stdout:7/268: dread - d5/d14/d38/f40 zero size
2026-03-09T19:27:02.294 INFO:tasks.workunit.client.1.vm08.stdout:6/283: dwrite d3/d34/f35 [0,4194304] 0
2026-03-09T19:27:02.302 INFO:tasks.workunit.client.1.vm08.stdout:1/380: creat d9/da/dc/f78 x:0 0 0
2026-03-09T19:27:02.302 INFO:tasks.workunit.client.1.vm08.stdout:1/381: dread - d9/da/d12/d39/f47 zero size
2026-03-09T19:27:02.303 INFO:tasks.workunit.client.1.vm08.stdout:1/382: stat d9/da 0
2026-03-09T19:27:02.305 INFO:tasks.workunit.client.1.vm08.stdout:0/271: getdents dd/d31 0
2026-03-09T19:27:02.314 INFO:tasks.workunit.client.1.vm08.stdout:2/260: mknod d3/d4/c5a 0
2026-03-09T19:27:02.320 INFO:tasks.workunit.client.1.vm08.stdout:1/383: dwrite d9/da/dc/f31 [4194304,4194304] 0
2026-03-09T19:27:02.322 INFO:tasks.workunit.client.1.vm08.stdout:3/293: rename d0/d6/de/d1b/d16/c46 to d0/c58 0
2026-03-09T19:27:02.333 INFO:tasks.workunit.client.1.vm08.stdout:7/269: unlink d5/f20 0
2026-03-09T19:27:02.336 INFO:tasks.workunit.client.1.vm08.stdout:7/270: dwrite d5/d14/d27/f35 [0,4194304] 0
2026-03-09T19:27:02.343 INFO:tasks.workunit.client.1.vm08.stdout:7/271: dwrite d5/d14/d2b/f30 [0,4194304] 0
2026-03-09T19:27:02.345 INFO:tasks.workunit.client.1.vm08.stdout:7/272: write d5/d14/f1e [1390190,31451] 0
2026-03-09T19:27:02.357 INFO:tasks.workunit.client.1.vm08.stdout:6/284: fsync d3/d15/f2b 0
2026-03-09T19:27:02.360 INFO:tasks.workunit.client.1.vm08.stdout:0/272: readlink dd/d22/d24/d49/l4e 0
2026-03-09T19:27:02.361 INFO:tasks.workunit.client.1.vm08.stdout:5/253: getdents d16/d45 0
2026-03-09T19:27:02.363 INFO:tasks.workunit.client.1.vm08.stdout:1/384: creat d9/da/d53/d67/f79 x:0 0 0
2026-03-09T19:27:02.363 INFO:tasks.workunit.client.1.vm08.stdout:1/385: chown d9/l59 919 1
2026-03-09T19:27:02.368 INFO:tasks.workunit.client.1.vm08.stdout:9/262: rename d0/d1b/f3e to d0/d2/d8/d7/f63 0
2026-03-09T19:27:02.381 INFO:tasks.workunit.client.1.vm08.stdout:3/294: chown d0/d8/d19/l40 557989144 1
2026-03-09T19:27:02.381 INFO:tasks.workunit.client.1.vm08.stdout:3/295: fdatasync d0/d8/f4c 0
2026-03-09T19:27:02.381 INFO:tasks.workunit.client.1.vm08.stdout:6/285: rmdir d3/db/d24/d4b 39
2026-03-09T19:27:02.381 INFO:tasks.workunit.client.1.vm08.stdout:0/273: read dd/d22/d27/d2e/d37/f40 [117321,111808] 0
2026-03-09T19:27:02.381 INFO:tasks.workunit.client.1.vm08.stdout:5/254: creat d16/d1e/d30/f53 x:0 0 0
2026-03-09T19:27:02.382 INFO:tasks.workunit.client.1.vm08.stdout:0/274: fdatasync dd/d22/f28 0
2026-03-09T19:27:02.383 INFO:tasks.workunit.client.1.vm08.stdout:5/255: dread d16/d1e/d3b/f3c [0,4194304] 0 2026-03-09T19:27:02.385 INFO:tasks.workunit.client.1.vm08.stdout:4/263: rename da/fd to da/d10/d26/d3a/f51 0 2026-03-09T19:27:02.386 INFO:tasks.workunit.client.1.vm08.stdout:4/264: read - da/d10/d26/d27/d32/f45 zero size 2026-03-09T19:27:02.390 INFO:tasks.workunit.client.1.vm08.stdout:9/263: mknod d0/d2/d8/d7/d48/c64 0 2026-03-09T19:27:02.394 INFO:tasks.workunit.client.1.vm08.stdout:7/273: sync 2026-03-09T19:27:02.395 INFO:tasks.workunit.client.1.vm08.stdout:9/264: dread d0/d2/d14/f3b [0,4194304] 0 2026-03-09T19:27:02.400 INFO:tasks.workunit.client.1.vm08.stdout:5/256: dread d16/d1e/f25 [0,4194304] 0 2026-03-09T19:27:02.401 INFO:tasks.workunit.client.1.vm08.stdout:6/286: creat d3/d15/f6a x:0 0 0 2026-03-09T19:27:02.404 INFO:tasks.workunit.client.1.vm08.stdout:9/265: dread d0/f13 [0,4194304] 0 2026-03-09T19:27:02.406 INFO:tasks.workunit.client.1.vm08.stdout:7/274: sync 2026-03-09T19:27:02.413 INFO:tasks.workunit.client.1.vm08.stdout:0/275: truncate dd/d22/d27/f3d 2537979 0 2026-03-09T19:27:02.414 INFO:tasks.workunit.client.1.vm08.stdout:0/276: write dd/d22/f3e [870580,14162] 0 2026-03-09T19:27:02.422 INFO:tasks.workunit.client.1.vm08.stdout:3/296: mknod d0/d6/de/d1b/c59 0 2026-03-09T19:27:02.422 INFO:tasks.workunit.client.1.vm08.stdout:3/297: write d0/d6/de/d1b/d16/d17/f36 [1362412,58578] 0 2026-03-09T19:27:02.426 INFO:tasks.workunit.client.1.vm08.stdout:3/298: dwrite d0/d8/d19/f44 [4194304,4194304] 0 2026-03-09T19:27:02.437 INFO:tasks.workunit.client.1.vm08.stdout:6/287: symlink d3/d15/l6b 0 2026-03-09T19:27:02.439 INFO:tasks.workunit.client.1.vm08.stdout:5/257: rename f14 to d16/d45/f54 0 2026-03-09T19:27:02.443 INFO:tasks.workunit.client.1.vm08.stdout:7/275: unlink d5/d16/f50 0 2026-03-09T19:27:02.445 INFO:tasks.workunit.client.1.vm08.stdout:8/328: write de/f16 [1503479,120408] 0 2026-03-09T19:27:02.448 INFO:tasks.workunit.client.1.vm08.stdout:8/329: dwrite 
de/d1d/d21/f62 [0,4194304] 0 2026-03-09T19:27:02.449 INFO:tasks.workunit.client.1.vm08.stdout:8/330: dread - de/d1d/f59 zero size 2026-03-09T19:27:02.461 INFO:tasks.workunit.client.1.vm08.stdout:2/261: getdents d3/d9/dc/de/d18 0 2026-03-09T19:27:02.462 INFO:tasks.workunit.client.1.vm08.stdout:4/265: getdents da/d10/d16/d28/d4d 0 2026-03-09T19:27:02.469 INFO:tasks.workunit.client.1.vm08.stdout:6/288: symlink d3/d34/d5c/l6c 0 2026-03-09T19:27:02.469 INFO:tasks.workunit.client.1.vm08.stdout:6/289: read - d3/f5f zero size 2026-03-09T19:27:02.470 INFO:tasks.workunit.client.1.vm08.stdout:6/290: readlink d3/d34/d5c/l60 0 2026-03-09T19:27:02.473 INFO:tasks.workunit.client.1.vm08.stdout:2/262: dread d3/d9/dc/de/f1c [0,4194304] 0 2026-03-09T19:27:02.473 INFO:tasks.workunit.client.1.vm08.stdout:7/276: rmdir d5/d12 39 2026-03-09T19:27:02.475 INFO:tasks.workunit.client.1.vm08.stdout:2/263: read d3/d9/f20 [2613996,18583] 0 2026-03-09T19:27:02.480 INFO:tasks.workunit.client.1.vm08.stdout:7/277: dwrite d5/d14/d38/f3b [0,4194304] 0 2026-03-09T19:27:02.483 INFO:tasks.workunit.client.1.vm08.stdout:7/278: chown d5/d16/f49 1512221 1 2026-03-09T19:27:02.492 INFO:tasks.workunit.client.1.vm08.stdout:8/331: mknod de/d25/d33/d46/c76 0 2026-03-09T19:27:02.502 INFO:tasks.workunit.client.1.vm08.stdout:8/332: dread de/d1d/d2e/d5f/f57 [0,4194304] 0 2026-03-09T19:27:02.506 INFO:tasks.workunit.client.1.vm08.stdout:1/386: truncate d9/d11/f3c 841126 0 2026-03-09T19:27:02.507 INFO:tasks.workunit.client.1.vm08.stdout:1/387: write d9/d40/d49/f70 [851523,51833] 0 2026-03-09T19:27:02.513 INFO:tasks.workunit.client.1.vm08.stdout:9/266: write d0/d2/f2f [500593,75] 0 2026-03-09T19:27:02.516 INFO:tasks.workunit.client.1.vm08.stdout:0/277: dread dd/d22/d27/f3d [0,4194304] 0 2026-03-09T19:27:02.516 INFO:tasks.workunit.client.1.vm08.stdout:0/278: write dd/fe [855937,69939] 0 2026-03-09T19:27:02.517 INFO:tasks.workunit.client.1.vm08.stdout:0/279: dread - dd/d22/f41 zero size 2026-03-09T19:27:02.518 
INFO:tasks.workunit.client.1.vm08.stdout:0/280: chown dd/d22/f41 914943209 1 2026-03-09T19:27:02.519 INFO:tasks.workunit.client.1.vm08.stdout:0/281: write dd/d22/f3e [119729,62939] 0 2026-03-09T19:27:02.520 INFO:tasks.workunit.client.1.vm08.stdout:0/282: chown dd/f13 3 1 2026-03-09T19:27:02.521 INFO:tasks.workunit.client.1.vm08.stdout:0/283: dread - dd/d22/f41 zero size 2026-03-09T19:27:02.523 INFO:tasks.workunit.client.1.vm08.stdout:4/266: rename da/d10/d42 to da/d10/d16/d28/d46/d52 0 2026-03-09T19:27:02.528 INFO:tasks.workunit.client.1.vm08.stdout:4/267: dwrite da/d10/d16/d28/f44 [0,4194304] 0 2026-03-09T19:27:02.544 INFO:tasks.workunit.client.1.vm08.stdout:5/258: dwrite d16/f18 [0,4194304] 0 2026-03-09T19:27:02.549 INFO:tasks.workunit.client.1.vm08.stdout:2/264: creat d3/d4/d23/d2c/f5b x:0 0 0 2026-03-09T19:27:02.557 INFO:tasks.workunit.client.1.vm08.stdout:2/265: readlink d3/d4/l34 0 2026-03-09T19:27:02.563 INFO:tasks.workunit.client.1.vm08.stdout:5/259: dread d16/d1e/f2e [0,4194304] 0 2026-03-09T19:27:02.567 INFO:tasks.workunit.client.1.vm08.stdout:5/260: read d16/d1e/f37 [274869,69455] 0 2026-03-09T19:27:02.569 INFO:tasks.workunit.client.1.vm08.stdout:5/261: dread d16/d1e/f44 [0,4194304] 0 2026-03-09T19:27:02.571 INFO:tasks.workunit.client.1.vm08.stdout:8/333: mknod de/d25/d33/d46/c77 0 2026-03-09T19:27:02.575 INFO:tasks.workunit.client.0.vm07.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-09T19:27:02.576 INFO:tasks.workunit.client.1.vm08.stdout:9/267: creat d0/d1b/f65 x:0 0 0 2026-03-09T19:27:02.578 INFO:tasks.workunit.client.1.vm08.stdout:7/279: rename d5/d16/f23 to d5/d16/d3a/d42/f5f 0 2026-03-09T19:27:02.579 INFO:tasks.workunit.client.0.vm07.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-09T19:27:02.579 INFO:tasks.workunit.client.0.vm07.stderr:+ make 2026-03-09T19:27:02.584 
INFO:tasks.workunit.client.1.vm08.stdout:4/268: creat da/d10/f53 x:0 0 0 2026-03-09T19:27:02.586 INFO:tasks.workunit.client.1.vm08.stdout:4/269: dread da/d10/d16/f4b [0,4194304] 0 2026-03-09T19:27:02.588 INFO:tasks.workunit.client.1.vm08.stdout:6/291: symlink d3/db/l6d 0 2026-03-09T19:27:02.589 INFO:tasks.workunit.client.1.vm08.stdout:6/292: readlink d3/db/l16 0 2026-03-09T19:27:02.595 INFO:tasks.workunit.client.1.vm08.stdout:2/266: mkdir d3/d4/d23/d5c 0 2026-03-09T19:27:02.595 INFO:tasks.workunit.client.1.vm08.stdout:2/267: readlink d3/l4c 0 2026-03-09T19:27:02.600 INFO:tasks.workunit.client.1.vm08.stdout:7/280: sync 2026-03-09T19:27:02.600 INFO:tasks.workunit.client.1.vm08.stdout:9/268: sync 2026-03-09T19:27:02.603 INFO:tasks.workunit.client.1.vm08.stdout:3/299: truncate d0/d8/d19/f38 636116 0 2026-03-09T19:27:02.604 INFO:tasks.workunit.client.1.vm08.stdout:3/300: write d0/d6/f39 [1397118,75827] 0 2026-03-09T19:27:02.604 INFO:tasks.workunit.client.1.vm08.stdout:3/301: fsync d0/d8/f4c 0 2026-03-09T19:27:02.609 INFO:tasks.workunit.client.1.vm08.stdout:1/388: write d9/da/dc/f2e [4718764,128357] 0 2026-03-09T19:27:02.628 INFO:tasks.workunit.client.1.vm08.stdout:0/284: mknod dd/d22/d27/d4f/c59 0 2026-03-09T19:27:02.629 INFO:tasks.workunit.client.1.vm08.stdout:0/285: dread dd/d22/d27/d2e/f39 [0,4194304] 0 2026-03-09T19:27:02.647 INFO:tasks.workunit.client.1.vm08.stdout:5/262: dwrite d16/d45/f4a [0,4194304] 0 2026-03-09T19:27:02.651 INFO:tasks.workunit.client.1.vm08.stdout:9/269: rename d0/d2/d8/d7/c1e to d0/c66 0 2026-03-09T19:27:02.651 INFO:tasks.workunit.client.1.vm08.stdout:9/270: chown d0 5453 1 2026-03-09T19:27:02.652 INFO:tasks.workunit.client.1.vm08.stdout:9/271: stat d0/d2/f21 0 2026-03-09T19:27:02.657 INFO:tasks.workunit.client.1.vm08.stdout:3/302: rmdir d0/d8 39 2026-03-09T19:27:02.658 INFO:tasks.workunit.client.1.vm08.stdout:3/303: chown d0/d6/de/d1b/d16/d17/f3f 868569 1 2026-03-09T19:27:02.662 INFO:tasks.workunit.client.1.vm08.stdout:1/389: mkdir d9/d11/d7a 
0 2026-03-09T19:27:02.663 INFO:tasks.workunit.client.1.vm08.stdout:1/390: dread - d9/da/dc/f78 zero size 2026-03-09T19:27:02.663 INFO:tasks.workunit.client.1.vm08.stdout:1/391: fdatasync d9/d11/f56 0 2026-03-09T19:27:02.669 INFO:tasks.workunit.client.1.vm08.stdout:1/392: dwrite d9/da/d12/f72 [0,4194304] 0 2026-03-09T19:27:02.676 INFO:tasks.workunit.client.0.vm07.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-09T19:27:02.680 INFO:tasks.workunit.client.1.vm08.stdout:8/334: symlink de/d25/d33/d46/d67/d6d/l78 0 2026-03-09T19:27:02.684 INFO:tasks.workunit.client.1.vm08.stdout:0/286: unlink dd/d31/l43 0 2026-03-09T19:27:02.694 INFO:tasks.workunit.client.1.vm08.stdout:7/281: rename d5/d14/d4e to d5/d14/d27/d54/d60 0 2026-03-09T19:27:02.698 INFO:tasks.workunit.client.1.vm08.stdout:9/272: unlink d0/d2/d8/f41 0 2026-03-09T19:27:02.702 INFO:tasks.workunit.client.1.vm08.stdout:9/273: dwrite d0/d2/d14/f56 [0,4194304] 0 2026-03-09T19:27:02.703 INFO:tasks.workunit.client.1.vm08.stdout:9/274: read d0/d2/d14/d5c/fd [3302397,119751] 0 2026-03-09T19:27:02.706 INFO:tasks.workunit.client.1.vm08.stdout:9/275: dread - d0/d2/d8/d7/f58 zero size 2026-03-09T19:27:02.707 INFO:tasks.workunit.client.1.vm08.stdout:9/276: readlink d0/d2/d8/d7/l5f 0 2026-03-09T19:27:02.709 INFO:tasks.workunit.client.1.vm08.stdout:4/270: truncate da/d10/d16/d28/f34 523144 0 2026-03-09T19:27:02.711 INFO:tasks.workunit.client.1.vm08.stdout:6/293: write d3/f6 [616124,57023] 0 2026-03-09T19:27:02.713 INFO:tasks.workunit.client.1.vm08.stdout:5/263: truncate d16/d1e/f44 5225 0 2026-03-09T19:27:02.717 INFO:tasks.workunit.client.1.vm08.stdout:2/268: creat d3/d9/f5d x:0 0 0 2026-03-09T19:27:02.730 INFO:tasks.workunit.client.1.vm08.stdout:2/269: chown d3/d4/fd 34 1 2026-03-09T19:27:02.730 
INFO:tasks.workunit.client.1.vm08.stdout:2/270: fsync d3/d9/dc/d14/f2b 0 2026-03-09T19:27:02.730 INFO:tasks.workunit.client.1.vm08.stdout:2/271: chown d3/d9/dc/de/c56 168389 1 2026-03-09T19:27:02.730 INFO:tasks.workunit.client.1.vm08.stdout:7/282: creat d5/d14/d27/d54/d60/f61 x:0 0 0 2026-03-09T19:27:02.730 INFO:tasks.workunit.client.1.vm08.stdout:5/264: dread f15 [0,4194304] 0 2026-03-09T19:27:02.730 INFO:tasks.workunit.client.1.vm08.stdout:4/271: dread da/d10/d16/f4b [0,4194304] 0 2026-03-09T19:27:02.730 INFO:tasks.workunit.client.1.vm08.stdout:3/304: dwrite d0/d8/d24/f2d [0,4194304] 0 2026-03-09T19:27:02.730 INFO:tasks.workunit.client.1.vm08.stdout:0/287: symlink dd/d22/d24/d57/l5a 0 2026-03-09T19:27:02.730 INFO:tasks.workunit.client.1.vm08.stdout:0/288: chown dd/f1e 20860 1 2026-03-09T19:27:02.730 INFO:tasks.workunit.client.1.vm08.stdout:3/305: stat d0/d6/de/d1b/l49 0 2026-03-09T19:27:02.734 INFO:tasks.workunit.client.1.vm08.stdout:9/277: read d0/d2/d8/fe [217047,33690] 0 2026-03-09T19:27:02.738 INFO:tasks.workunit.client.1.vm08.stdout:9/278: dwrite d0/d1b/f4b [0,4194304] 0 2026-03-09T19:27:02.744 INFO:tasks.workunit.client.1.vm08.stdout:2/272: truncate d3/d9/d26/f35 24697 0 2026-03-09T19:27:02.755 INFO:tasks.workunit.client.1.vm08.stdout:7/283: creat d5/d14/d27/d54/d60/f62 x:0 0 0 2026-03-09T19:27:02.755 INFO:tasks.workunit.client.1.vm08.stdout:7/284: write d5/d16/f1f [6425967,64222] 0 2026-03-09T19:27:02.755 INFO:tasks.workunit.client.1.vm08.stdout:0/289: unlink dd/d22/f45 0 2026-03-09T19:27:02.755 INFO:tasks.workunit.client.1.vm08.stdout:6/294: link d3/db/d24/f2f d3/f6e 0 2026-03-09T19:27:02.755 INFO:tasks.workunit.client.1.vm08.stdout:6/295: fdatasync d3/f9 0 2026-03-09T19:27:02.755 INFO:tasks.workunit.client.1.vm08.stdout:2/273: rename d3/d9/dc to d3/d4/d23/d2c/d39/d5e 0 2026-03-09T19:27:02.757 INFO:tasks.workunit.client.1.vm08.stdout:5/265: sync 2026-03-09T19:27:02.757 INFO:tasks.workunit.client.1.vm08.stdout:5/266: chown d16/d45/f46 86759 1 
2026-03-09T19:27:02.763 INFO:tasks.workunit.client.1.vm08.stdout:7/285: creat d5/d14/d2b/d5d/f63 x:0 0 0 2026-03-09T19:27:02.765 INFO:tasks.workunit.client.1.vm08.stdout:3/306: link d0/d6/de/d1b/d16/d17/f1d d0/d6/de/d1a/f5a 0 2026-03-09T19:27:02.767 INFO:tasks.workunit.client.1.vm08.stdout:9/279: symlink d0/d2/l67 0 2026-03-09T19:27:02.770 INFO:tasks.workunit.client.1.vm08.stdout:4/272: link da/d10/d26/l33 da/d10/d1b/l54 0 2026-03-09T19:27:02.773 INFO:tasks.workunit.client.1.vm08.stdout:0/290: symlink dd/d22/d27/d2e/l5b 0 2026-03-09T19:27:02.781 INFO:tasks.workunit.client.1.vm08.stdout:7/286: creat d5/d16/d3a/f64 x:0 0 0 2026-03-09T19:27:02.787 INFO:tasks.workunit.client.1.vm08.stdout:9/280: mkdir d0/d1b/d68 0 2026-03-09T19:27:02.793 INFO:tasks.workunit.client.1.vm08.stdout:7/287: unlink d5/d14/d2b/d5d/l51 0 2026-03-09T19:27:02.800 INFO:tasks.workunit.client.1.vm08.stdout:1/393: write d9/da/dc/f1d [617194,103900] 0 2026-03-09T19:27:02.802 INFO:tasks.workunit.client.1.vm08.stdout:8/335: dwrite de/f1b [0,4194304] 0 2026-03-09T19:27:02.812 INFO:tasks.workunit.client.1.vm08.stdout:4/273: symlink da/d10/d26/d50/l55 0 2026-03-09T19:27:02.818 INFO:tasks.workunit.client.1.vm08.stdout:7/288: fsync d5/d16/f45 0 2026-03-09T19:27:02.822 INFO:tasks.workunit.client.1.vm08.stdout:3/307: creat d0/d8/f5b x:0 0 0 2026-03-09T19:27:02.823 INFO:tasks.workunit.client.1.vm08.stdout:1/394: symlink d9/da/d17/l7b 0 2026-03-09T19:27:02.826 INFO:tasks.workunit.client.1.vm08.stdout:8/336: dread de/d1d/d21/f23 [0,4194304] 0 2026-03-09T19:27:02.828 INFO:tasks.workunit.client.1.vm08.stdout:1/395: dwrite d9/da/d2d/f3d [0,4194304] 0 2026-03-09T19:27:02.835 INFO:tasks.workunit.client.1.vm08.stdout:1/396: dwrite d9/da/d2d/f3d [0,4194304] 0 2026-03-09T19:27:02.844 INFO:tasks.workunit.client.1.vm08.stdout:6/296: rename d3/db/d24 to d3/d34/d6f 0 2026-03-09T19:27:02.844 INFO:tasks.workunit.client.1.vm08.stdout:3/308: sync 2026-03-09T19:27:02.845 INFO:tasks.workunit.client.1.vm08.stdout:3/309: stat 
d0/d6/de/d1b/d16 0 2026-03-09T19:27:02.849 INFO:tasks.workunit.client.1.vm08.stdout:6/297: dread d3/d34/d6f/f4f [0,4194304] 0 2026-03-09T19:27:02.850 INFO:tasks.workunit.client.1.vm08.stdout:0/291: creat dd/d22/f5c x:0 0 0 2026-03-09T19:27:02.852 INFO:tasks.workunit.client.1.vm08.stdout:6/298: dread d3/f6e [0,4194304] 0 2026-03-09T19:27:02.856 INFO:tasks.workunit.client.1.vm08.stdout:8/337: chown de/d1d/d21/f30 17935204 1 2026-03-09T19:27:02.866 INFO:tasks.workunit.client.1.vm08.stdout:1/397: chown d9/da/dc/c2b 192528432 1 2026-03-09T19:27:02.866 INFO:tasks.workunit.client.1.vm08.stdout:2/274: rename d3/d4/d23/d2c/d39/d5e/de/l36 to d3/d4/d23/d5c/l5f 0 2026-03-09T19:27:02.866 INFO:tasks.workunit.client.1.vm08.stdout:5/267: write d16/f4d [405707,111976] 0 2026-03-09T19:27:02.866 INFO:tasks.workunit.client.1.vm08.stdout:4/274: mkdir da/d10/d16/d28/d2f/d4f/d56 0 2026-03-09T19:27:02.866 INFO:tasks.workunit.client.1.vm08.stdout:7/289: read d5/d12/f34 [3613890,74529] 0 2026-03-09T19:27:02.870 INFO:tasks.workunit.client.1.vm08.stdout:0/292: creat dd/d22/d27/d4f/f5d x:0 0 0 2026-03-09T19:27:02.876 INFO:tasks.workunit.client.1.vm08.stdout:9/281: dwrite d0/d1b/f49 [4194304,4194304] 0 2026-03-09T19:27:02.879 INFO:tasks.workunit.client.1.vm08.stdout:2/275: fdatasync d3/d4/d23/d2c/d39/d5e/de/d18/f3f 0 2026-03-09T19:27:02.882 INFO:tasks.workunit.client.1.vm08.stdout:1/398: sync 2026-03-09T19:27:02.894 INFO:tasks.workunit.client.1.vm08.stdout:4/275: creat da/d10/d26/d38/f57 x:0 0 0 2026-03-09T19:27:02.896 INFO:tasks.workunit.client.1.vm08.stdout:7/290: creat d5/d16/d3a/d42/f65 x:0 0 0 2026-03-09T19:27:02.897 INFO:tasks.workunit.client.1.vm08.stdout:8/338: symlink de/d1d/d21/d73/l79 0 2026-03-09T19:27:02.899 INFO:tasks.workunit.client.1.vm08.stdout:6/299: rename d3/c2d to d3/d55/c70 0 2026-03-09T19:27:02.901 INFO:tasks.workunit.client.1.vm08.stdout:2/276: rename d3/d4/d23/d2c/d39/d5e to d3/d4/d23/d2c/d39/d5e/de/d18/d60 22 2026-03-09T19:27:02.901 
INFO:tasks.workunit.client.1.vm08.stdout:9/282: readlink d0/l12 0 2026-03-09T19:27:02.902 INFO:tasks.workunit.client.1.vm08.stdout:9/283: truncate d0/d2/d8/d7/f63 460443 0 2026-03-09T19:27:02.903 INFO:tasks.workunit.client.1.vm08.stdout:1/399: creat d9/d40/d49/f7c x:0 0 0 2026-03-09T19:27:02.908 INFO:tasks.workunit.client.1.vm08.stdout:5/268: unlink d16/d1e/c26 0 2026-03-09T19:27:02.909 INFO:tasks.workunit.client.1.vm08.stdout:5/269: write d16/f4e [2822188,1072] 0 2026-03-09T19:27:02.914 INFO:tasks.workunit.client.1.vm08.stdout:1/400: dread d9/da/f30 [0,4194304] 0 2026-03-09T19:27:02.915 INFO:tasks.workunit.client.1.vm08.stdout:5/270: dwrite d16/d1e/d3b/f50 [0,4194304] 0 2026-03-09T19:27:02.926 INFO:tasks.workunit.client.1.vm08.stdout:1/401: dwrite d9/da/f1e [4194304,4194304] 0 2026-03-09T19:27:02.930 INFO:tasks.workunit.client.1.vm08.stdout:5/271: dwrite d16/d1e/d30/f3f [0,4194304] 0 2026-03-09T19:27:02.930 INFO:tasks.workunit.client.1.vm08.stdout:4/276: mknod da/d10/d16/d28/d2f/d4f/c58 0 2026-03-09T19:27:02.935 INFO:tasks.workunit.client.1.vm08.stdout:7/291: creat d5/d14/d2b/d4b/f66 x:0 0 0 2026-03-09T19:27:02.939 INFO:tasks.workunit.client.1.vm08.stdout:6/300: fdatasync d3/f5 0 2026-03-09T19:27:02.941 INFO:tasks.workunit.client.1.vm08.stdout:7/292: dread d5/d14/d2b/f37 [0,4194304] 0 2026-03-09T19:27:02.944 INFO:tasks.workunit.client.1.vm08.stdout:7/293: write d5/d16/f28 [18420,33572] 0 2026-03-09T19:27:02.950 INFO:tasks.workunit.client.1.vm08.stdout:6/301: dread d3/f3e [0,4194304] 0 2026-03-09T19:27:02.951 INFO:tasks.workunit.client.1.vm08.stdout:6/302: read - d3/d34/d3b/f58 zero size 2026-03-09T19:27:02.951 INFO:tasks.workunit.client.1.vm08.stdout:6/303: chown d3/d55 4606 1 2026-03-09T19:27:02.952 INFO:tasks.workunit.client.1.vm08.stdout:9/284: rmdir d0/d1b/d4e 39 2026-03-09T19:27:02.957 INFO:tasks.workunit.client.1.vm08.stdout:9/285: dwrite d0/d2/d8/d7/f35 [0,4194304] 0 2026-03-09T19:27:02.972 INFO:tasks.workunit.client.1.vm08.stdout:3/310: getdents 
d0/d6/de/d1b/d16/d17 0 2026-03-09T19:27:02.987 INFO:tasks.workunit.client.1.vm08.stdout:8/339: symlink de/d1d/d69/l7a 0 2026-03-09T19:27:02.991 INFO:tasks.workunit.client.1.vm08.stdout:9/286: sync 2026-03-09T19:27:02.993 INFO:tasks.workunit.client.1.vm08.stdout:7/294: symlink d5/d14/d27/d54/l67 0 2026-03-09T19:27:02.997 INFO:tasks.workunit.client.1.vm08.stdout:6/304: creat d3/db/d43/f71 x:0 0 0 2026-03-09T19:27:03.003 INFO:tasks.workunit.client.1.vm08.stdout:4/277: getdents da/d10/d16/d28/d4d 0 2026-03-09T19:27:03.004 INFO:tasks.workunit.client.1.vm08.stdout:4/278: write da/d10/d16/d28/f44 [4906377,28364] 0 2026-03-09T19:27:03.004 INFO:tasks.workunit.client.0.vm07.stderr:++ readlink -f fsstress 2026-03-09T19:27:03.006 INFO:tasks.workunit.client.0.vm07.stderr:+ BIN=/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-09T19:27:03.006 INFO:tasks.workunit.client.0.vm07.stderr:+ popd 2026-03-09T19:27:03.008 INFO:tasks.workunit.client.0.vm07.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-09T19:27:03.008 INFO:tasks.workunit.client.0.vm07.stderr:+ popd 2026-03-09T19:27:03.008 INFO:tasks.workunit.client.1.vm08.stdout:4/279: dwrite da/d10/d26/d27/d32/f45 [0,4194304] 0 2026-03-09T19:27:03.009 INFO:tasks.workunit.client.0.vm07.stdout:~/cephtest/mnt.0/client.0/tmp 2026-03-09T19:27:03.009 INFO:tasks.workunit.client.0.vm07.stderr:++ mktemp -d -p . 
2026-03-09T19:27:03.013 INFO:tasks.workunit.client.1.vm08.stdout:0/293: getdents dd 0 2026-03-09T19:27:03.016 INFO:tasks.workunit.client.1.vm08.stdout:0/294: dread dd/d22/d24/d49/f4c [0,4194304] 0 2026-03-09T19:27:03.016 INFO:tasks.workunit.client.1.vm08.stdout:0/295: truncate dd/d22/f41 231686 0 2026-03-09T19:27:03.018 INFO:tasks.workunit.client.1.vm08.stdout:0/296: dread dd/d22/d27/d2e/f39 [0,4194304] 0 2026-03-09T19:27:03.023 INFO:tasks.workunit.client.1.vm08.stdout:0/297: dwrite dd/d22/f29 [0,4194304] 0 2026-03-09T19:27:03.028 INFO:tasks.workunit.client.1.vm08.stdout:0/298: truncate dd/d22/f28 4587715 0 2026-03-09T19:27:03.028 INFO:tasks.workunit.client.0.vm07.stderr:+ T=./tmp.h9HUznJztE 2026-03-09T19:27:03.028 INFO:tasks.workunit.client.0.vm07.stderr:+ /home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.h9HUznJztE -l 1 -n 1000 -p 10 -v 2026-03-09T19:27:03.034 INFO:tasks.workunit.client.1.vm08.stdout:2/277: creat d3/d4/f61 x:0 0 0 2026-03-09T19:27:03.044 INFO:tasks.workunit.client.1.vm08.stdout:7/295: creat d5/d16/d3a/d42/f68 x:0 0 0 2026-03-09T19:27:03.044 INFO:tasks.workunit.client.1.vm08.stdout:6/305: write d3/d15/f45 [4224619,55375] 0 2026-03-09T19:27:03.044 INFO:tasks.workunit.client.0.vm07.stdout:seed = 1773119459 2026-03-09T19:27:03.046 INFO:tasks.workunit.client.1.vm08.stdout:3/311: rename d0/d6/de/d1b/d16/d17/f36 to d0/d52/f5c 0 2026-03-09T19:27:03.049 INFO:tasks.workunit.client.1.vm08.stdout:3/312: dwrite d0/d6/f57 [0,4194304] 0 2026-03-09T19:27:03.051 INFO:tasks.workunit.client.1.vm08.stdout:3/313: truncate d0/d6/d25/f56 139862 0 2026-03-09T19:27:03.052 INFO:tasks.workunit.client.1.vm08.stdout:4/280: mknod da/d10/d26/d27/c59 0 2026-03-09T19:27:03.052 INFO:tasks.workunit.client.1.vm08.stdout:4/281: chown da 37533601 1 2026-03-09T19:27:03.056 INFO:tasks.workunit.client.1.vm08.stdout:3/314: dread d0/d8/f4c [0,4194304] 0 2026-03-09T19:27:03.059 INFO:tasks.workunit.client.1.vm08.stdout:0/299: 
rmdir dd/d22/d24 39 2026-03-09T19:27:03.061 INFO:tasks.workunit.client.1.vm08.stdout:3/315: read d0/d6/d25/f56 [38387,43895] 0 2026-03-09T19:27:03.061 INFO:tasks.workunit.client.1.vm08.stdout:8/340: truncate de/f11 1577900 0 2026-03-09T19:27:03.062 INFO:tasks.workunit.client.0.vm07.stdout:1/0: rmdir - no directory 2026-03-09T19:27:03.062 INFO:tasks.workunit.client.0.vm07.stdout:1/1: fdatasync - no filename 2026-03-09T19:27:03.062 INFO:tasks.workunit.client.1.vm08.stdout:0/300: fdatasync dd/d22/d27/f42 0 2026-03-09T19:27:03.062 INFO:tasks.workunit.client.1.vm08.stdout:9/287: mkdir d0/d2/d14/d5c/d32/d57/d69 0 2026-03-09T19:27:03.062 INFO:tasks.workunit.client.1.vm08.stdout:7/296: mknod d5/d12/c69 0 2026-03-09T19:27:03.066 INFO:tasks.workunit.client.1.vm08.stdout:7/297: dwrite d5/d14/f59 [0,4194304] 0 2026-03-09T19:27:03.073 INFO:tasks.workunit.client.0.vm07.stdout:8/0: chown . 6 1 2026-03-09T19:27:03.073 INFO:tasks.workunit.client.0.vm07.stdout:8/1: rmdir - no directory 2026-03-09T19:27:03.079 INFO:tasks.workunit.client.0.vm07.stdout:1/2: creat f0 x:0 0 0 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:1/3: write f0 [713500,73001] 0 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:4/0: creat f0 x:0 0 0 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:4/1: dread - f0 zero size 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:4/2: stat f0 0 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:4/3: readlink - no filename 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:4/4: dread - f0 zero size 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:4/5: readlink - no filename 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:9/0: truncate - no filename 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:9/1: fsync - no filename 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:9/2: dread - no filename 2026-03-09T19:27:03.091 
INFO:tasks.workunit.client.0.vm07.stdout:9/3: write - no filename 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:9/4: chown . 129 1 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:9/5: truncate - no filename 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:1/4: mkdir d1 0 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:1/5: truncate f0 1076560 0 2026-03-09T19:27:03.091 INFO:tasks.workunit.client.0.vm07.stdout:7/0: write - no filename 2026-03-09T19:27:03.092 INFO:tasks.workunit.client.1.vm08.stdout:5/272: dwrite d16/d1e/f2c [0,4194304] 0 2026-03-09T19:27:03.093 INFO:tasks.workunit.client.0.vm07.stdout:8/2: mknod c0 0 2026-03-09T19:27:03.094 INFO:tasks.workunit.client.0.vm07.stdout:8/3: fsync - no filename 2026-03-09T19:27:03.098 INFO:tasks.workunit.client.1.vm08.stdout:6/306: creat d3/d34/d6f/d4b/f72 x:0 0 0 2026-03-09T19:27:03.101 INFO:tasks.workunit.client.0.vm07.stdout:9/6: mkdir d0 0 2026-03-09T19:27:03.101 INFO:tasks.workunit.client.0.vm07.stdout:6/0: truncate - no filename 2026-03-09T19:27:03.101 INFO:tasks.workunit.client.1.vm08.stdout:2/278: rename d3/d4/d23/d2c/d39/d5e/c25 to d3/d4/d23/d2c/c62 0 2026-03-09T19:27:03.109 INFO:tasks.workunit.client.1.vm08.stdout:4/282: fsync da/d10/f2e 0 2026-03-09T19:27:03.109 INFO:tasks.workunit.client.1.vm08.stdout:4/283: stat da/d14/d40 0 2026-03-09T19:27:03.109 INFO:tasks.workunit.client.1.vm08.stdout:9/288: dread d0/d2/d8/fe [0,4194304] 0 2026-03-09T19:27:03.110 INFO:tasks.workunit.client.1.vm08.stdout:4/284: fsync da/d10/d26/d38/f57 0 2026-03-09T19:27:03.116 INFO:tasks.workunit.client.1.vm08.stdout:9/289: dwrite d0/d2/f2a [0,4194304] 0 2026-03-09T19:27:03.118 INFO:tasks.workunit.client.1.vm08.stdout:9/290: stat d0/d1b/f49 0 2026-03-09T19:27:03.124 INFO:tasks.workunit.client.1.vm08.stdout:9/291: dread d0/d2/f21 [0,4194304] 0 2026-03-09T19:27:03.126 INFO:tasks.workunit.client.1.vm08.stdout:4/285: dread da/d10/f13 [4194304,4194304] 0 
2026-03-09T19:27:03.127 INFO:tasks.workunit.client.1.vm08.stdout:4/286: truncate da/d10/d26/d38/f43 968642 0 2026-03-09T19:27:03.127 INFO:tasks.workunit.client.1.vm08.stdout:4/287: chown da/d14/d2c 51 1 2026-03-09T19:27:03.133 INFO:tasks.workunit.client.0.vm07.stdout:8/4: unlink c0 0 2026-03-09T19:27:03.134 INFO:tasks.workunit.client.1.vm08.stdout:1/402: write d9/da/d2c/d6a/f6b [716337,126843] 0 2026-03-09T19:27:03.135 INFO:tasks.workunit.client.0.vm07.stdout:1/6: creat d1/f2 x:0 0 0 2026-03-09T19:27:03.138 INFO:tasks.workunit.client.1.vm08.stdout:1/403: dwrite d9/d11/f44 [0,4194304] 0 2026-03-09T19:27:03.139 INFO:tasks.workunit.client.1.vm08.stdout:8/341: symlink de/l7b 0 2026-03-09T19:27:03.141 INFO:tasks.workunit.client.1.vm08.stdout:1/404: dread d9/d11/f44 [0,4194304] 0 2026-03-09T19:27:03.150 INFO:tasks.workunit.client.0.vm07.stdout:3/0: write - no filename 2026-03-09T19:27:03.152 INFO:tasks.workunit.client.1.vm08.stdout:7/298: rmdir d5/d16/d3a/d42 39 2026-03-09T19:27:03.158 INFO:tasks.workunit.client.1.vm08.stdout:0/301: rmdir dd/d22/d24/d49 39 2026-03-09T19:27:03.158 INFO:tasks.workunit.client.1.vm08.stdout:0/302: dwrite dd/f15 [4194304,4194304] 0 2026-03-09T19:27:03.171 INFO:tasks.workunit.client.1.vm08.stdout:7/299: dread d5/d16/f1f [0,4194304] 0 2026-03-09T19:27:03.173 INFO:tasks.workunit.client.0.vm07.stdout:7/1: mkdir d0 0 2026-03-09T19:27:03.175 INFO:tasks.workunit.client.1.vm08.stdout:7/300: dwrite d5/d14/f1e [0,4194304] 0 2026-03-09T19:27:03.175 INFO:tasks.workunit.client.1.vm08.stdout:6/307: symlink d3/d34/d3b/l73 0 2026-03-09T19:27:03.187 INFO:tasks.workunit.client.0.vm07.stdout:4/6: getdents . 
0 2026-03-09T19:27:03.187 INFO:tasks.workunit.client.0.vm07.stdout:4/7: truncate f0 514886 0 2026-03-09T19:27:03.187 INFO:tasks.workunit.client.0.vm07.stdout:4/8: rmdir - no directory 2026-03-09T19:27:03.187 INFO:tasks.workunit.client.0.vm07.stdout:4/9: fsync f0 0 2026-03-09T19:27:03.189 INFO:tasks.workunit.client.0.vm07.stdout:4/10: read f0 [431976,35082] 0 2026-03-09T19:27:03.191 INFO:tasks.workunit.client.0.vm07.stdout:4/11: dread f0 [0,4194304] 0 2026-03-09T19:27:03.197 INFO:tasks.workunit.client.1.vm08.stdout:3/316: rename d0/d8/d19/f41 to d0/d52/f5d 0 2026-03-09T19:27:03.197 INFO:tasks.workunit.client.1.vm08.stdout:2/279: unlink d3/f45 0 2026-03-09T19:27:03.198 INFO:tasks.workunit.client.0.vm07.stdout:8/5: creat f1 x:0 0 0 2026-03-09T19:27:03.198 INFO:tasks.workunit.client.0.vm07.stdout:1/7: mkdir d1/d3 0 2026-03-09T19:27:03.198 INFO:tasks.workunit.client.0.vm07.stdout:2/0: read - no filename 2026-03-09T19:27:03.198 INFO:tasks.workunit.client.1.vm08.stdout:9/292: rmdir d0/d2/d8/d7/d48 39 2026-03-09T19:27:03.200 INFO:tasks.workunit.client.1.vm08.stdout:4/288: creat da/d14/f5a x:0 0 0 2026-03-09T19:27:03.201 INFO:tasks.workunit.client.0.vm07.stdout:9/7: symlink d0/l1 0 2026-03-09T19:27:03.201 INFO:tasks.workunit.client.0.vm07.stdout:9/8: dwrite - no filename 2026-03-09T19:27:03.201 INFO:tasks.workunit.client.0.vm07.stdout:9/9: dwrite - no filename 2026-03-09T19:27:03.201 INFO:tasks.workunit.client.0.vm07.stdout:9/10: truncate - no filename 2026-03-09T19:27:03.201 INFO:tasks.workunit.client.0.vm07.stdout:6/1: mkdir d0 0 2026-03-09T19:27:03.201 INFO:tasks.workunit.client.0.vm07.stdout:6/2: write - no filename 2026-03-09T19:27:03.201 INFO:tasks.workunit.client.0.vm07.stdout:6/3: fdatasync - no filename 2026-03-09T19:27:03.202 INFO:tasks.workunit.client.0.vm07.stdout:8/6: creat f2 x:0 0 0 2026-03-09T19:27:03.204 INFO:tasks.workunit.client.0.vm07.stdout:0/0: write - no filename 2026-03-09T19:27:03.204 INFO:tasks.workunit.client.0.vm07.stdout:0/1: rmdir - no 
directory 2026-03-09T19:27:03.204 INFO:tasks.workunit.client.0.vm07.stdout:0/2: write - no filename 2026-03-09T19:27:03.204 INFO:tasks.workunit.client.0.vm07.stdout:0/3: write - no filename 2026-03-09T19:27:03.207 INFO:tasks.workunit.client.1.vm08.stdout:4/289: dwrite da/d10/f3d [0,4194304] 0 2026-03-09T19:27:03.210 INFO:tasks.workunit.client.1.vm08.stdout:1/405: mknod d9/da/d12/d39/c7d 0 2026-03-09T19:27:03.212 INFO:tasks.workunit.client.0.vm07.stdout:3/1: creat f0 x:0 0 0 2026-03-09T19:27:03.214 INFO:tasks.workunit.client.0.vm07.stdout:7/2: creat d0/f1 x:0 0 0 2026-03-09T19:27:03.215 INFO:tasks.workunit.client.0.vm07.stdout:4/12: link f0 f1 0 2026-03-09T19:27:03.215 INFO:tasks.workunit.client.0.vm07.stdout:7/3: dread - d0/f1 zero size 2026-03-09T19:27:03.215 INFO:tasks.workunit.client.1.vm08.stdout:6/308: symlink d3/d34/d5c/l74 0 2026-03-09T19:27:03.216 INFO:tasks.workunit.client.0.vm07.stdout:4/13: dread f0 [0,4194304] 0 2026-03-09T19:27:03.216 INFO:tasks.workunit.client.0.vm07.stdout:7/4: write d0/f1 [229136,81525] 0 2026-03-09T19:27:03.217 INFO:tasks.workunit.client.0.vm07.stdout:8/7: creat f3 x:0 0 0 2026-03-09T19:27:03.218 INFO:tasks.workunit.client.0.vm07.stdout:5/0: chown . 25553245 1 2026-03-09T19:27:03.218 INFO:tasks.workunit.client.0.vm07.stdout:5/1: chown . 
824967746 1 2026-03-09T19:27:03.218 INFO:tasks.workunit.client.0.vm07.stdout:5/2: dread - no filename 2026-03-09T19:27:03.220 INFO:tasks.workunit.client.0.vm07.stdout:2/1: creat f0 x:0 0 0 2026-03-09T19:27:03.220 INFO:tasks.workunit.client.0.vm07.stdout:1/8: link f0 d1/d3/f4 0 2026-03-09T19:27:03.220 INFO:tasks.workunit.client.0.vm07.stdout:2/2: chown f0 12002607 1 2026-03-09T19:27:03.220 INFO:tasks.workunit.client.1.vm08.stdout:2/280: symlink d3/d4/d3e/d4e/l63 0 2026-03-09T19:27:03.222 INFO:tasks.workunit.client.1.vm08.stdout:2/281: write d3/d4/d23/d2c/f31 [545459,47899] 0 2026-03-09T19:27:03.225 INFO:tasks.workunit.client.0.vm07.stdout:6/4: mkdir d0/d1 0 2026-03-09T19:27:03.225 INFO:tasks.workunit.client.0.vm07.stdout:6/5: dread - no filename 2026-03-09T19:27:03.225 INFO:tasks.workunit.client.0.vm07.stdout:6/6: link - no file 2026-03-09T19:27:03.225 INFO:tasks.workunit.client.0.vm07.stdout:6/7: read - no filename 2026-03-09T19:27:03.228 INFO:tasks.workunit.client.0.vm07.stdout:4/14: unlink f0 0 2026-03-09T19:27:03.228 INFO:tasks.workunit.client.1.vm08.stdout:9/293: sync 2026-03-09T19:27:03.228 INFO:tasks.workunit.client.1.vm08.stdout:7/301: sync 2026-03-09T19:27:03.229 INFO:tasks.workunit.client.1.vm08.stdout:9/294: readlink d0/l2e 0 2026-03-09T19:27:03.231 INFO:tasks.workunit.client.0.vm07.stdout:8/8: creat f4 x:0 0 0 2026-03-09T19:27:03.239 INFO:tasks.workunit.client.0.vm07.stdout:7/5: symlink d0/l2 0 2026-03-09T19:27:03.248 INFO:tasks.workunit.client.0.vm07.stdout:2/3: mknod c1 0 2026-03-09T19:27:03.250 INFO:tasks.workunit.client.1.vm08.stdout:0/303: mknod dd/d22/d24/c5e 0 2026-03-09T19:27:03.251 INFO:tasks.workunit.client.1.vm08.stdout:0/304: fsync dd/d22/d27/f42 0 2026-03-09T19:27:03.251 INFO:tasks.workunit.client.1.vm08.stdout:0/305: dread - dd/d22/f5c zero size 2026-03-09T19:27:03.252 INFO:tasks.workunit.client.1.vm08.stdout:0/306: stat dd/d22/d24/f26 0 2026-03-09T19:27:03.253 INFO:tasks.workunit.client.0.vm07.stdout:6/8: creat d0/f2 x:0 0 0 
2026-03-09T19:27:03.254 INFO:tasks.workunit.client.0.vm07.stdout:6/9: dread - d0/f2 zero size 2026-03-09T19:27:03.254 INFO:tasks.workunit.client.1.vm08.stdout:6/309: fdatasync d3/f32 0 2026-03-09T19:27:03.255 INFO:tasks.workunit.client.0.vm07.stdout:6/10: truncate d0/f2 1002779 0 2026-03-09T19:27:03.258 INFO:tasks.workunit.client.1.vm08.stdout:4/290: truncate da/d10/d16/d28/f34 596886 0 2026-03-09T19:27:03.260 INFO:tasks.workunit.client.0.vm07.stdout:8/9: creat f5 x:0 0 0 2026-03-09T19:27:03.260 INFO:tasks.workunit.client.0.vm07.stdout:8/10: write f2 [73985,88860] 0 2026-03-09T19:27:03.265 INFO:tasks.workunit.client.0.vm07.stdout:8/11: dwrite f4 [0,4194304] 0 2026-03-09T19:27:03.273 INFO:tasks.workunit.client.0.vm07.stdout:7/6: symlink d0/l3 0 2026-03-09T19:27:03.273 INFO:tasks.workunit.client.0.vm07.stdout:7/7: fsync d0/f1 0 2026-03-09T19:27:03.275 INFO:tasks.workunit.client.0.vm07.stdout:5/3: symlink l0 0 2026-03-09T19:27:03.306 INFO:tasks.workunit.client.0.vm07.stdout:5/4: write - no filename 2026-03-09T19:27:03.306 INFO:tasks.workunit.client.0.vm07.stdout:5/5: truncate - no filename 2026-03-09T19:27:03.306 INFO:tasks.workunit.client.0.vm07.stdout:5/6: dwrite - no filename 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:9/295: fsync d0/d2/f1a 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:4/291: fsync da/d10/d26/d27/f35 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:9/296: creat d0/d2/d14/d5c/d32/d57/f6a x:0 0 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:9/297: dread - d0/d2/d8/d7/f58 zero size 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:7/302: rename d5/d14/d27/d54/d60 to d5/d16/d3a/d42/d6a 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:0/307: creat dd/d22/d24/d49/f5f x:0 0 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:4/292: unlink da/d14/c1e 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:3/317: 
getdents d0/d6 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:3/318: fsync d0/d8/f5b 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:9/298: rmdir d0/d2/d14 39 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:4/293: dwrite da/f1d [0,4194304] 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:4/294: fsync da/d10/d1b/f29 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:7/303: dread d5/d14/d2b/f37 [0,4194304] 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:4/295: write f2 [4754186,7592] 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:4/296: stat f1 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:7/304: stat d5/d16/d1c/c36 0 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:4/297: chown f5 79559 1 2026-03-09T19:27:03.307 INFO:tasks.workunit.client.1.vm08.stdout:4/298: stat da/d14/l19 0 2026-03-09T19:27:03.311 INFO:tasks.workunit.client.1.vm08.stdout:7/305: dwrite d5/d16/f28 [0,4194304] 0 2026-03-09T19:27:03.317 INFO:tasks.workunit.client.1.vm08.stdout:4/299: dwrite da/d10/d26/d27/f3b [0,4194304] 0 2026-03-09T19:27:03.328 INFO:tasks.workunit.client.1.vm08.stdout:9/299: mknod d0/d2/d8/d7/c6b 0 2026-03-09T19:27:03.335 INFO:tasks.workunit.client.1.vm08.stdout:0/308: creat dd/d22/d24/f60 x:0 0 0 2026-03-09T19:27:03.335 INFO:tasks.workunit.client.1.vm08.stdout:3/319: getdents d0/d6/d25 0 2026-03-09T19:27:03.337 INFO:tasks.workunit.client.1.vm08.stdout:9/300: creat d0/d2/f6c x:0 0 0 2026-03-09T19:27:03.337 INFO:tasks.workunit.client.1.vm08.stdout:7/306: getdents d5/d16/d1c 0 2026-03-09T19:27:03.338 INFO:tasks.workunit.client.1.vm08.stdout:3/320: symlink d0/d6/de/d1a/l5e 0 2026-03-09T19:27:03.339 INFO:tasks.workunit.client.0.vm07.stdout:2/4: creat f2 x:0 0 0 2026-03-09T19:27:03.339 INFO:tasks.workunit.client.0.vm07.stdout:1/9: mknod d1/d3/c5 0 2026-03-09T19:27:03.339 INFO:tasks.workunit.client.1.vm08.stdout:9/301: 
truncate d0/f13 1525958 0 2026-03-09T19:27:03.340 INFO:tasks.workunit.client.1.vm08.stdout:7/307: creat d5/d16/f6b x:0 0 0 2026-03-09T19:27:03.340 INFO:tasks.workunit.client.1.vm08.stdout:7/308: fsync d5/f1a 0 2026-03-09T19:27:03.341 INFO:tasks.workunit.client.1.vm08.stdout:3/321: symlink d0/d6/l5f 0 2026-03-09T19:27:03.342 INFO:tasks.workunit.client.1.vm08.stdout:3/322: chown d0/d6/de/d1b/d16/l26 4674 1 2026-03-09T19:27:03.342 INFO:tasks.workunit.client.1.vm08.stdout:0/309: getdents dd/d31 0 2026-03-09T19:27:03.346 INFO:tasks.workunit.client.0.vm07.stdout:1/10: dread d1/d3/f4 [0,4194304] 0 2026-03-09T19:27:03.347 INFO:tasks.workunit.client.1.vm08.stdout:0/310: dwrite dd/f19 [0,4194304] 0 2026-03-09T19:27:03.348 INFO:tasks.workunit.client.1.vm08.stdout:9/302: dwrite d0/f44 [4194304,4194304] 0 2026-03-09T19:27:03.349 INFO:tasks.workunit.client.0.vm07.stdout:7/8: mkdir d0/d4 0 2026-03-09T19:27:03.351 INFO:tasks.workunit.client.0.vm07.stdout:5/7: mknod c1 0 2026-03-09T19:27:03.351 INFO:tasks.workunit.client.0.vm07.stdout:2/5: mkdir d3 0 2026-03-09T19:27:03.351 INFO:tasks.workunit.client.0.vm07.stdout:5/8: chown l0 81 1 2026-03-09T19:27:03.351 INFO:tasks.workunit.client.0.vm07.stdout:5/9: fsync - no filename 2026-03-09T19:27:03.352 INFO:tasks.workunit.client.0.vm07.stdout:6/11: link d0/f2 d0/f3 0 2026-03-09T19:27:03.352 INFO:tasks.workunit.client.0.vm07.stdout:5/10: chown l0 37561 1 2026-03-09T19:27:03.355 INFO:tasks.workunit.client.1.vm08.stdout:0/311: dwrite dd/d22/d27/d2e/d37/f46 [0,4194304] 0 2026-03-09T19:27:03.356 INFO:tasks.workunit.client.0.vm07.stdout:8/12: link f3 f6 0 2026-03-09T19:27:03.357 INFO:tasks.workunit.client.0.vm07.stdout:5/11: creat f2 x:0 0 0 2026-03-09T19:27:03.357 INFO:tasks.workunit.client.0.vm07.stdout:2/6: dwrite f0 [0,4194304] 0 2026-03-09T19:27:03.358 INFO:tasks.workunit.client.0.vm07.stdout:5/12: read - f2 zero size 2026-03-09T19:27:03.372 INFO:tasks.workunit.client.0.vm07.stdout:7/9: mkdir d0/d4/d5 0 2026-03-09T19:27:03.372 
INFO:tasks.workunit.client.0.vm07.stdout:8/13: write f3 [354325,11207] 0 2026-03-09T19:27:03.373 INFO:tasks.workunit.client.0.vm07.stdout:5/13: mkdir d3 0 2026-03-09T19:27:03.375 INFO:tasks.workunit.client.1.vm08.stdout:0/312: symlink dd/d22/d24/d49/l61 0 2026-03-09T19:27:03.375 INFO:tasks.workunit.client.1.vm08.stdout:3/323: mknod d0/d6/de/c60 0 2026-03-09T19:27:03.375 INFO:tasks.workunit.client.0.vm07.stdout:5/14: write f2 [503522,32380] 0 2026-03-09T19:27:03.375 INFO:tasks.workunit.client.0.vm07.stdout:2/7: creat d3/f4 x:0 0 0 2026-03-09T19:27:03.376 INFO:tasks.workunit.client.0.vm07.stdout:2/8: truncate d3/f4 159985 0 2026-03-09T19:27:03.376 INFO:tasks.workunit.client.1.vm08.stdout:7/309: creat d5/d14/f6c x:0 0 0 2026-03-09T19:27:03.377 INFO:tasks.workunit.client.1.vm08.stdout:7/310: chown d5/d16/d1c/f5a 499403024 1 2026-03-09T19:27:03.377 INFO:tasks.workunit.client.0.vm07.stdout:2/9: truncate d3/f4 498263 0 2026-03-09T19:27:03.377 INFO:tasks.workunit.client.1.vm08.stdout:7/311: chown d5/d14/d2b/d4b 37067 1 2026-03-09T19:27:03.377 INFO:tasks.workunit.client.0.vm07.stdout:2/10: stat f0 0 2026-03-09T19:27:03.378 INFO:tasks.workunit.client.0.vm07.stdout:2/11: chown d3 5295 1 2026-03-09T19:27:03.378 INFO:tasks.workunit.client.1.vm08.stdout:0/313: dread dd/f19 [0,4194304] 0 2026-03-09T19:27:03.385 INFO:tasks.workunit.client.0.vm07.stdout:7/10: dwrite d0/f1 [0,4194304] 0 2026-03-09T19:27:03.398 INFO:tasks.workunit.client.1.vm08.stdout:3/324: dwrite d0/d8/f4a [0,4194304] 0 2026-03-09T19:27:03.398 INFO:tasks.workunit.client.1.vm08.stdout:7/312: symlink d5/d16/d3a/d42/l6d 0 2026-03-09T19:27:03.398 INFO:tasks.workunit.client.1.vm08.stdout:0/314: creat dd/d22/d24/d49/f62 x:0 0 0 2026-03-09T19:27:03.398 INFO:tasks.workunit.client.0.vm07.stdout:5/15: mknod d3/c4 0 2026-03-09T19:27:03.398 INFO:tasks.workunit.client.0.vm07.stdout:6/12: dwrite d0/f3 [0,4194304] 0 2026-03-09T19:27:03.398 INFO:tasks.workunit.client.0.vm07.stdout:5/16: creat d3/f5 x:0 0 0 2026-03-09T19:27:03.398 
INFO:tasks.workunit.client.0.vm07.stdout:5/17: dread - d3/f5 zero size 2026-03-09T19:27:03.398 INFO:tasks.workunit.client.0.vm07.stdout:2/12: dwrite d3/f4 [0,4194304] 0 2026-03-09T19:27:03.399 INFO:tasks.workunit.client.1.vm08.stdout:0/315: dread dd/d22/f29 [0,4194304] 0 2026-03-09T19:27:03.401 INFO:tasks.workunit.client.0.vm07.stdout:6/13: dread d0/f2 [0,4194304] 0 2026-03-09T19:27:03.401 INFO:tasks.workunit.client.0.vm07.stdout:6/14: readlink - no filename 2026-03-09T19:27:03.403 INFO:tasks.workunit.client.0.vm07.stdout:5/18: mkdir d3/d6 0 2026-03-09T19:27:03.405 INFO:tasks.workunit.client.1.vm08.stdout:0/316: rename dd/d22/d24/d57 to dd/d22/d63 0 2026-03-09T19:27:03.406 INFO:tasks.workunit.client.1.vm08.stdout:0/317: truncate dd/d22/f5c 803503 0 2026-03-09T19:27:03.407 INFO:tasks.workunit.client.0.vm07.stdout:6/15: link d0/f2 d0/f4 0 2026-03-09T19:27:03.408 INFO:tasks.workunit.client.0.vm07.stdout:5/19: rmdir d3/d6 0 2026-03-09T19:27:03.409 INFO:tasks.workunit.client.1.vm08.stdout:0/318: getdents dd/d22/d27/d2e 0 2026-03-09T19:27:03.411 INFO:tasks.workunit.client.1.vm08.stdout:0/319: mknod dd/d22/d24/c64 0 2026-03-09T19:27:03.411 INFO:tasks.workunit.client.1.vm08.stdout:0/320: write dd/d22/d27/d2e/d37/f46 [3797604,61455] 0 2026-03-09T19:27:03.420 INFO:tasks.workunit.client.1.vm08.stdout:0/321: dread dd/f15 [0,4194304] 0 2026-03-09T19:27:03.421 INFO:tasks.workunit.client.1.vm08.stdout:0/322: dread dd/d22/d27/d2e/d37/f44 [0,4194304] 0 2026-03-09T19:27:03.434 INFO:tasks.workunit.client.1.vm08.stdout:3/325: sync 2026-03-09T19:27:03.434 INFO:tasks.workunit.client.1.vm08.stdout:3/326: chown d0/d6/de/d1b/d16/d18/l3e 12806791 1 2026-03-09T19:27:03.437 INFO:tasks.workunit.client.1.vm08.stdout:3/327: dread d0/d52/f5d [0,4194304] 0 2026-03-09T19:27:03.438 INFO:tasks.workunit.client.0.vm07.stdout:9/11: sync 2026-03-09T19:27:03.438 INFO:tasks.workunit.client.0.vm07.stdout:0/4: sync 2026-03-09T19:27:03.456 INFO:tasks.workunit.client.0.vm07.stdout:0/5: dread - no filename 
2026-03-09T19:27:03.456 INFO:tasks.workunit.client.0.vm07.stdout:9/12: dread - no filename 2026-03-09T19:27:03.456 INFO:tasks.workunit.client.0.vm07.stdout:3/2: sync 2026-03-09T19:27:03.456 INFO:tasks.workunit.client.0.vm07.stdout:3/3: readlink - no filename 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.1.vm08.stdout:3/328: rename d0/d6/l5f to d0/d6/de/d1b/d16/l61 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.1.vm08.stdout:0/323: dwrite dd/d22/d27/f3f [4194304,4194304] 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:3/4: mkdir d1 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:9/13: creat d0/f2 x:0 0 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:0/6: mkdir d0 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:0/7: write - no filename 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:9/14: write d0/f2 [640041,24470] 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:9/15: write d0/f2 [1134124,96821] 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:3/5: mknod d1/c2 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:3/6: chown d1/c2 3865 1 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:3/7: dread - f0 zero size 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:3/8: write f0 [630371,130997] 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:3/9: write f0 [551396,118167] 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:3/10: write f0 [1181938,97484] 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:9/16: dwrite d0/f2 [0,4194304] 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:3/11: dwrite f0 [0,4194304] 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:3/12: dread f0 [0,4194304] 0 2026-03-09T19:27:03.457 INFO:tasks.workunit.client.0.vm07.stdout:9/17: creat d0/f3 x:0 0 0 2026-03-09T19:27:03.457 
INFO:tasks.workunit.client.0.vm07.stdout:1/11: sync 2026-03-09T19:27:03.461 INFO:tasks.workunit.client.0.vm07.stdout:9/18: dread - d0/f3 zero size 2026-03-09T19:27:03.462 INFO:tasks.workunit.client.0.vm07.stdout:3/13: symlink d1/l3 0 2026-03-09T19:27:03.462 INFO:tasks.workunit.client.1.vm08.stdout:3/329: link d0/c35 d0/d52/c62 0 2026-03-09T19:27:03.463 INFO:tasks.workunit.client.1.vm08.stdout:3/330: write d0/d6/de/d15/f53 [157282,111535] 0 2026-03-09T19:27:03.464 INFO:tasks.workunit.client.0.vm07.stdout:1/12: creat d1/f6 x:0 0 0 2026-03-09T19:27:03.464 INFO:tasks.workunit.client.0.vm07.stdout:1/13: read d1/d3/f4 [267646,2640] 0 2026-03-09T19:27:03.466 INFO:tasks.workunit.client.0.vm07.stdout:9/19: creat d0/f4 x:0 0 0 2026-03-09T19:27:03.469 INFO:tasks.workunit.client.0.vm07.stdout:1/14: symlink d1/l7 0 2026-03-09T19:27:03.473 INFO:tasks.workunit.client.0.vm07.stdout:1/15: creat d1/d3/f8 x:0 0 0 2026-03-09T19:27:03.477 INFO:tasks.workunit.client.0.vm07.stdout:1/16: dwrite d1/d3/f8 [0,4194304] 0 2026-03-09T19:27:03.481 INFO:tasks.workunit.client.0.vm07.stdout:0/8: sync 2026-03-09T19:27:03.499 INFO:tasks.workunit.client.0.vm07.stdout:1/17: mkdir d1/d9 0 2026-03-09T19:27:03.501 INFO:tasks.workunit.client.1.vm08.stdout:8/342: write f1 [197498,86672] 0 2026-03-09T19:27:03.511 INFO:tasks.workunit.client.1.vm08.stdout:8/343: dread de/d1d/d21/f23 [0,4194304] 0 2026-03-09T19:27:03.524 INFO:tasks.workunit.client.0.vm07.stdout:0/9: creat d0/f1 x:0 0 0 2026-03-09T19:27:03.524 INFO:tasks.workunit.client.0.vm07.stdout:0/10: chown d0 59 1 2026-03-09T19:27:03.525 INFO:tasks.workunit.client.0.vm07.stdout:0/11: creat d0/f2 x:0 0 0 2026-03-09T19:27:03.534 INFO:tasks.workunit.client.1.vm08.stdout:8/344: mkdir de/d7c 0 2026-03-09T19:27:03.538 INFO:tasks.workunit.client.1.vm08.stdout:8/345: truncate de/d1d/f59 973963 0 2026-03-09T19:27:03.539 INFO:tasks.workunit.client.1.vm08.stdout:8/346: dwrite de/d25/f64 [0,4194304] 0 2026-03-09T19:27:03.540 
INFO:tasks.workunit.client.0.vm07.stdout:1/18: sync 2026-03-09T19:27:03.544 INFO:tasks.workunit.client.1.vm08.stdout:1/406: dwrite d9/da/dc/f10 [4194304,4194304] 0 2026-03-09T19:27:03.546 INFO:tasks.workunit.client.1.vm08.stdout:6/310: write d3/f5 [1592614,65558] 0 2026-03-09T19:27:03.561 INFO:tasks.workunit.client.1.vm08.stdout:2/282: dwrite d3/d9/d26/f35 [0,4194304] 0 2026-03-09T19:27:03.565 INFO:tasks.workunit.client.0.vm07.stdout:1/19: symlink d1/d3/la 0 2026-03-09T19:27:03.565 INFO:tasks.workunit.client.0.vm07.stdout:1/20: stat d1/f2 0 2026-03-09T19:27:03.565 INFO:tasks.workunit.client.0.vm07.stdout:1/21: fdatasync d1/f6 0 2026-03-09T19:27:03.566 INFO:tasks.workunit.client.0.vm07.stdout:1/22: truncate d1/f2 280854 0 2026-03-09T19:27:03.570 INFO:tasks.workunit.client.1.vm08.stdout:1/407: symlink d9/d11/l7e 0 2026-03-09T19:27:03.571 INFO:tasks.workunit.client.1.vm08.stdout:2/283: creat d3/d4/d23/d2c/f64 x:0 0 0 2026-03-09T19:27:03.571 INFO:tasks.workunit.client.1.vm08.stdout:1/408: dread - d9/d11/f73 zero size 2026-03-09T19:27:03.578 INFO:tasks.workunit.client.1.vm08.stdout:1/409: dwrite d9/da/d17/f2a [0,4194304] 0 2026-03-09T19:27:03.587 INFO:tasks.workunit.client.1.vm08.stdout:1/410: dwrite d9/da/dc/f78 [0,4194304] 0 2026-03-09T19:27:03.594 INFO:tasks.workunit.client.1.vm08.stdout:1/411: dwrite d9/da/dc/f31 [4194304,4194304] 0 2026-03-09T19:27:03.595 INFO:tasks.workunit.client.0.vm07.stdout:1/23: mkdir d1/db 0 2026-03-09T19:27:03.597 INFO:tasks.workunit.client.1.vm08.stdout:1/412: chown d9/l15 624 1 2026-03-09T19:27:03.599 INFO:tasks.workunit.client.1.vm08.stdout:1/413: truncate d9/da/d12/d39/f52 748075 0 2026-03-09T19:27:03.617 INFO:tasks.workunit.client.0.vm07.stdout:1/24: rename d1/d3/f8 to d1/db/fc 0 2026-03-09T19:27:03.644 INFO:tasks.workunit.client.1.vm08.stdout:2/284: rename d3/d4/d23/d2c/d39/d5e/de/d18/d1f/l30 to d3/d4/d23/d2c/d39/d5e/l65 0 2026-03-09T19:27:03.646 INFO:tasks.workunit.client.1.vm08.stdout:1/414: rename d9/da/d12/d39/c7d to d9/da/dc/c7f 
0 2026-03-09T19:27:03.647 INFO:tasks.workunit.client.1.vm08.stdout:1/415: fdatasync d9/d11/f56 0 2026-03-09T19:27:03.649 INFO:tasks.workunit.client.1.vm08.stdout:2/285: rename d3/d9/c4d to d3/d4/d23/d2c/d39/c66 0 2026-03-09T19:27:03.665 INFO:tasks.workunit.client.1.vm08.stdout:5/273: dwrite d16/d1e/f27 [0,4194304] 0 2026-03-09T19:27:03.673 INFO:tasks.workunit.client.1.vm08.stdout:5/274: rmdir d16/d1e/d47 0 2026-03-09T19:27:03.674 INFO:tasks.workunit.client.1.vm08.stdout:5/275: creat d16/d45/f55 x:0 0 0 2026-03-09T19:27:03.687 INFO:tasks.workunit.client.1.vm08.stdout:5/276: dread d16/d1e/f37 [0,4194304] 0 2026-03-09T19:27:03.696 INFO:tasks.workunit.client.1.vm08.stdout:5/277: creat d16/f56 x:0 0 0 2026-03-09T19:27:03.699 INFO:tasks.workunit.client.1.vm08.stdout:5/278: rename d16/d1e/f25 to d16/d1e/f57 0 2026-03-09T19:27:03.699 INFO:tasks.workunit.client.1.vm08.stdout:5/279: truncate d16/d45/f46 512862 0 2026-03-09T19:27:03.701 INFO:tasks.workunit.client.1.vm08.stdout:5/280: mknod d16/d1e/d3b/c58 0 2026-03-09T19:27:03.755 INFO:tasks.workunit.client.1.vm08.stdout:4/300: truncate da/d14/d40/f41 202772 0 2026-03-09T19:27:03.760 INFO:tasks.workunit.client.1.vm08.stdout:4/301: dwrite da/d10/f1f [0,4194304] 0 2026-03-09T19:27:03.761 INFO:tasks.workunit.client.0.vm07.stdout:2/13: fdatasync d3/f4 0 2026-03-09T19:27:03.762 INFO:tasks.workunit.client.1.vm08.stdout:4/302: rmdir da 39 2026-03-09T19:27:03.766 INFO:tasks.workunit.client.1.vm08.stdout:4/303: link da/d10/f2e da/d10/d16/d28/d46/d52/f5b 0 2026-03-09T19:27:03.766 INFO:tasks.workunit.client.1.vm08.stdout:4/304: fsync da/d10/f3d 0 2026-03-09T19:27:03.768 INFO:tasks.workunit.client.1.vm08.stdout:4/305: mknod da/d10/d1b/c5c 0 2026-03-09T19:27:03.768 INFO:tasks.workunit.client.1.vm08.stdout:4/306: rename da/d10/d16/d28 to da/d10/d16/d28/d5d 22 2026-03-09T19:27:03.768 INFO:tasks.workunit.client.1.vm08.stdout:4/307: chown da/f18 1229 1 2026-03-09T19:27:03.770 INFO:tasks.workunit.client.1.vm08.stdout:4/308: rename da/lc to 
da/d14/d2c/l5e 0 2026-03-09T19:27:03.771 INFO:tasks.workunit.client.1.vm08.stdout:4/309: mknod da/d10/d26/d3a/c5f 0 2026-03-09T19:27:03.774 INFO:tasks.workunit.client.1.vm08.stdout:5/281: read f2 [1653893,58290] 0 2026-03-09T19:27:03.777 INFO:tasks.workunit.client.1.vm08.stdout:5/282: dread d16/f4e [0,4194304] 0 2026-03-09T19:27:03.828 INFO:tasks.workunit.client.1.vm08.stdout:5/283: sync 2026-03-09T19:27:03.830 INFO:tasks.workunit.client.1.vm08.stdout:5/284: mknod d16/c59 0 2026-03-09T19:27:03.830 INFO:tasks.workunit.client.1.vm08.stdout:5/285: stat d16/d1e/d3b/f3c 0 2026-03-09T19:27:03.831 INFO:tasks.workunit.client.1.vm08.stdout:5/286: write d16/d1e/d30/f3f [4071500,30591] 0 2026-03-09T19:27:03.832 INFO:tasks.workunit.client.1.vm08.stdout:5/287: dread - d16/d1e/d30/f3a zero size 2026-03-09T19:27:03.834 INFO:tasks.workunit.client.1.vm08.stdout:5/288: creat d16/d1e/f5a x:0 0 0 2026-03-09T19:27:03.838 INFO:tasks.workunit.client.1.vm08.stdout:5/289: dwrite d16/d45/f4a [0,4194304] 0 2026-03-09T19:27:03.859 INFO:tasks.workunit.client.1.vm08.stdout:5/290: link d16/d1e/d30/f3f d16/d1e/f5b 0 2026-03-09T19:27:03.891 INFO:tasks.workunit.client.1.vm08.stdout:6/311: dread d3/f25 [0,4194304] 0 2026-03-09T19:27:03.891 INFO:tasks.workunit.client.1.vm08.stdout:6/312: write d3/db/f20 [2539845,10395] 0 2026-03-09T19:27:03.898 INFO:tasks.workunit.client.1.vm08.stdout:6/313: mknod d3/d34/d3b/c75 0 2026-03-09T19:27:03.911 INFO:tasks.workunit.client.1.vm08.stdout:6/314: rename d3/f5f to d3/d34/f76 0 2026-03-09T19:27:03.913 INFO:tasks.workunit.client.1.vm08.stdout:1/416: dread d9/da/d12/f5c [0,4194304] 0 2026-03-09T19:27:03.914 INFO:tasks.workunit.client.1.vm08.stdout:1/417: write d9/da/dc/f68 [445060,89877] 0 2026-03-09T19:27:03.915 INFO:tasks.workunit.client.1.vm08.stdout:1/418: stat d9/da/f2f 0 2026-03-09T19:27:03.915 INFO:tasks.workunit.client.1.vm08.stdout:1/419: truncate d9/d40/d49/f7c 729336 0 2026-03-09T19:27:03.941 INFO:tasks.workunit.client.0.vm07.stdout:8/14: getdents . 
0 2026-03-09T19:27:03.942 INFO:tasks.workunit.client.0.vm07.stdout:5/20: fsync f2 0 2026-03-09T19:27:03.942 INFO:tasks.workunit.client.0.vm07.stdout:5/21: rename d3 to d3/d7 22 2026-03-09T19:27:03.942 INFO:tasks.workunit.client.0.vm07.stdout:5/22: write d3/f5 [98949,43528] 0 2026-03-09T19:27:03.943 INFO:tasks.workunit.client.0.vm07.stdout:2/14: write f0 [4353865,17572] 0 2026-03-09T19:27:03.948 INFO:tasks.workunit.client.1.vm08.stdout:9/303: write d0/d2/d8/fe [1274334,39963] 0 2026-03-09T19:27:03.953 INFO:tasks.workunit.client.0.vm07.stdout:8/15: mkdir d7 0 2026-03-09T19:27:03.956 INFO:tasks.workunit.client.0.vm07.stdout:5/23: mknod d3/c8 0 2026-03-09T19:27:03.959 INFO:tasks.workunit.client.0.vm07.stdout:8/16: sync 2026-03-09T19:27:03.959 INFO:tasks.workunit.client.0.vm07.stdout:2/15: creat d3/f5 x:0 0 0 2026-03-09T19:27:03.960 INFO:tasks.workunit.client.0.vm07.stdout:5/24: mkdir d3/d9 0 2026-03-09T19:27:03.961 INFO:tasks.workunit.client.1.vm08.stdout:7/313: rmdir d5/d16 39 2026-03-09T19:27:03.962 INFO:tasks.workunit.client.1.vm08.stdout:0/324: fsync dd/d22/d24/d49/f62 0 2026-03-09T19:27:03.963 INFO:tasks.workunit.client.0.vm07.stdout:2/16: mknod d3/c6 0 2026-03-09T19:27:03.964 INFO:tasks.workunit.client.1.vm08.stdout:7/314: symlink d5/d14/d2b/d5d/l6e 0 2026-03-09T19:27:03.964 INFO:tasks.workunit.client.0.vm07.stdout:7/11: truncate d0/f1 2151417 0 2026-03-09T19:27:03.965 INFO:tasks.workunit.client.1.vm08.stdout:9/304: rename d0/c66 to d0/d2/d8/d7/c6d 0 2026-03-09T19:27:03.967 INFO:tasks.workunit.client.0.vm07.stdout:6/16: dwrite d0/f3 [4194304,4194304] 0 2026-03-09T19:27:03.969 INFO:tasks.workunit.client.0.vm07.stdout:8/17: rename f6 to d7/f8 0 2026-03-09T19:27:03.970 INFO:tasks.workunit.client.1.vm08.stdout:0/325: write dd/d22/d27/f3d [317647,18785] 0 2026-03-09T19:27:03.972 INFO:tasks.workunit.client.0.vm07.stdout:2/17: creat d3/f7 x:0 0 0 2026-03-09T19:27:03.973 INFO:tasks.workunit.client.0.vm07.stdout:2/18: write f0 [3152868,101428] 0 2026-03-09T19:27:03.973 
INFO:tasks.workunit.client.0.vm07.stdout:2/19: write d3/f7 [90824,110516] 0 2026-03-09T19:27:03.973 INFO:tasks.workunit.client.0.vm07.stdout:2/20: rename d3 to d3/d8 22 2026-03-09T19:27:03.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:03 vm07.local ceph-mon[48545]: pgmap v156: 65 pgs: 65 active+clean; 647 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 6.2 MiB/s rd, 74 MiB/s wr, 315 op/s 2026-03-09T19:27:03.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:03 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:27:03.980 INFO:tasks.workunit.client.0.vm07.stdout:7/12: symlink d0/l6 0 2026-03-09T19:27:03.985 INFO:tasks.workunit.client.1.vm08.stdout:9/305: chown d0/d2/d8/d7/d48/d5d/c55 2918623 1 2026-03-09T19:27:03.988 INFO:tasks.workunit.client.0.vm07.stdout:6/17: rmdir d0 39 2026-03-09T19:27:03.994 INFO:tasks.workunit.client.1.vm08.stdout:7/315: dread d5/d16/f1f [0,4194304] 0 2026-03-09T19:27:03.996 INFO:tasks.workunit.client.1.vm08.stdout:7/316: write d5/d16/f28 [2642955,21929] 0 2026-03-09T19:27:03.998 INFO:tasks.workunit.client.1.vm08.stdout:9/306: dwrite d0/d2/f1d [4194304,4194304] 0 2026-03-09T19:27:04.005 INFO:tasks.workunit.client.0.vm07.stdout:8/18: mkdir d7/d9 0 2026-03-09T19:27:04.009 INFO:tasks.workunit.client.1.vm08.stdout:0/326: mkdir dd/d22/d27/d65 0 2026-03-09T19:27:04.009 INFO:tasks.workunit.client.1.vm08.stdout:0/327: chown dd/d22/d24/f26 3 1 2026-03-09T19:27:04.012 INFO:tasks.workunit.client.0.vm07.stdout:3/14: truncate f0 185265 0 2026-03-09T19:27:04.023 INFO:tasks.workunit.client.0.vm07.stdout:9/20: rmdir d0 39 2026-03-09T19:27:04.023 INFO:tasks.workunit.client.0.vm07.stdout:2/21: symlink d3/l9 0 2026-03-09T19:27:04.023 INFO:tasks.workunit.client.0.vm07.stdout:4/15: write f1 [421232,32201] 0 2026-03-09T19:27:04.023 INFO:tasks.workunit.client.0.vm07.stdout:0/12: fsync d0/f2 0 2026-03-09T19:27:04.023 
INFO:tasks.workunit.client.1.vm08.stdout:3/331: write d0/d8/d19/f38 [1307125,32352] 0 2026-03-09T19:27:04.023 INFO:tasks.workunit.client.1.vm08.stdout:3/332: stat d0/d6 0 2026-03-09T19:27:04.023 INFO:tasks.workunit.client.1.vm08.stdout:7/317: symlink d5/d16/d1c/l6f 0 2026-03-09T19:27:04.023 INFO:tasks.workunit.client.1.vm08.stdout:7/318: write d5/d14/d38/f3c [330056,82607] 0 2026-03-09T19:27:04.023 INFO:tasks.workunit.client.1.vm08.stdout:7/319: truncate d5/d14/d2b/d4b/f66 513609 0 2026-03-09T19:27:04.023 INFO:tasks.workunit.client.1.vm08.stdout:8/347: getdents de 0 2026-03-09T19:27:04.027 INFO:tasks.workunit.client.0.vm07.stdout:0/13: dwrite d0/f1 [0,4194304] 0 2026-03-09T19:27:04.027 INFO:tasks.workunit.client.1.vm08.stdout:0/328: read dd/d22/d24/d49/f4c [46011,102753] 0 2026-03-09T19:27:04.033 INFO:tasks.workunit.client.0.vm07.stdout:6/18: dread d0/f2 [0,4194304] 0 2026-03-09T19:27:04.036 INFO:tasks.workunit.client.1.vm08.stdout:7/320: unlink d5/d14/d2b/f3e 0 2026-03-09T19:27:04.036 INFO:tasks.workunit.client.1.vm08.stdout:7/321: readlink d5/d14/l17 0 2026-03-09T19:27:04.038 INFO:tasks.workunit.client.0.vm07.stdout:1/25: write f0 [1620536,37041] 0 2026-03-09T19:27:04.050 INFO:tasks.workunit.client.0.vm07.stdout:4/16: chown f1 686 1 2026-03-09T19:27:04.051 INFO:tasks.workunit.client.0.vm07.stdout:9/21: dwrite d0/f4 [0,4194304] 0 2026-03-09T19:27:04.051 INFO:tasks.workunit.client.0.vm07.stdout:9/22: read d0/f4 [58159,121920] 0 2026-03-09T19:27:04.051 INFO:tasks.workunit.client.1.vm08.stdout:8/348: symlink de/d25/d33/l7d 0 2026-03-09T19:27:04.051 INFO:tasks.workunit.client.1.vm08.stdout:9/307: unlink d0/d2/d14/c24 0 2026-03-09T19:27:04.051 INFO:tasks.workunit.client.1.vm08.stdout:0/329: mknod dd/d22/d24/d49/c66 0 2026-03-09T19:27:04.051 INFO:tasks.workunit.client.1.vm08.stdout:2/286: dwrite d3/d4/d23/d2c/d39/d5e/de/f32 [0,4194304] 0 2026-03-09T19:27:04.051 INFO:tasks.workunit.client.0.vm07.stdout:0/14: sync 2026-03-09T19:27:04.052 
INFO:tasks.workunit.client.1.vm08.stdout:3/333: sync 2026-03-09T19:27:04.053 INFO:tasks.workunit.client.1.vm08.stdout:3/334: chown d0/d8/d19/c34 409797330 1 2026-03-09T19:27:04.054 INFO:tasks.workunit.client.1.vm08.stdout:3/335: write d0/d8/d24/f2d [218356,16686] 0 2026-03-09T19:27:04.055 INFO:tasks.workunit.client.1.vm08.stdout:3/336: dread d0/d6/d25/f56 [0,4194304] 0 2026-03-09T19:27:04.056 INFO:tasks.workunit.client.0.vm07.stdout:9/23: dread d0/f4 [0,4194304] 0 2026-03-09T19:27:04.064 INFO:tasks.workunit.client.1.vm08.stdout:7/322: symlink d5/d14/d2b/l70 0 2026-03-09T19:27:04.069 INFO:tasks.workunit.client.0.vm07.stdout:8/19: symlink d7/d9/la 0 2026-03-09T19:27:04.072 INFO:tasks.workunit.client.0.vm07.stdout:1/26: fsync f0 0 2026-03-09T19:27:04.074 INFO:tasks.workunit.client.1.vm08.stdout:8/349: creat de/d1d/d69/f7e x:0 0 0 2026-03-09T19:27:04.080 INFO:tasks.workunit.client.1.vm08.stdout:4/310: truncate da/f1d 3967016 0 2026-03-09T19:27:04.087 INFO:tasks.workunit.client.1.vm08.stdout:3/337: dwrite d0/d52/f5d [0,4194304] 0 2026-03-09T19:27:04.087 INFO:tasks.workunit.client.1.vm08.stdout:3/338: write d0/d8/f4c [1562051,79869] 0 2026-03-09T19:27:04.087 INFO:tasks.workunit.client.1.vm08.stdout:3/339: write d0/d8/f4c [2987051,94791] 0 2026-03-09T19:27:04.088 INFO:tasks.workunit.client.1.vm08.stdout:3/340: write d0/d8/d19/f44 [8815751,116305] 0 2026-03-09T19:27:04.088 INFO:tasks.workunit.client.1.vm08.stdout:3/341: write d0/d8/f5b [4667,99529] 0 2026-03-09T19:27:04.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:03 vm08.local ceph-mon[57794]: pgmap v156: 65 pgs: 65 active+clean; 647 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 6.2 MiB/s rd, 74 MiB/s wr, 315 op/s 2026-03-09T19:27:04.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:03 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:27:04.097 
INFO:tasks.workunit.client.1.vm08.stdout:7/323: creat d5/d16/d3a/d42/f71 x:0 0 0 2026-03-09T19:27:04.100 INFO:tasks.workunit.client.1.vm08.stdout:5/291: rmdir d16 39 2026-03-09T19:27:04.100 INFO:tasks.workunit.client.1.vm08.stdout:7/324: dwrite d5/d16/f6b [0,4194304] 0 2026-03-09T19:27:04.106 INFO:tasks.workunit.client.0.vm07.stdout:0/15: unlink d0/f1 0 2026-03-09T19:27:04.112 INFO:tasks.workunit.client.0.vm07.stdout:0/16: chown d0 796630808 1 2026-03-09T19:27:04.112 INFO:tasks.workunit.client.0.vm07.stdout:0/17: fdatasync d0/f2 0 2026-03-09T19:27:04.113 INFO:tasks.workunit.client.0.vm07.stdout:9/24: rmdir d0 39 2026-03-09T19:27:04.113 INFO:tasks.workunit.client.1.vm08.stdout:8/350: symlink de/d1d/d69/l7f 0 2026-03-09T19:27:04.113 INFO:tasks.workunit.client.1.vm08.stdout:9/308: creat d0/d2/d8/d7/d48/d5e/f6e x:0 0 0 2026-03-09T19:27:04.113 INFO:tasks.workunit.client.1.vm08.stdout:9/309: dread - d0/d2/d8/d7/d48/f53 zero size 2026-03-09T19:27:04.115 INFO:tasks.workunit.client.1.vm08.stdout:6/315: write d3/d34/f37 [539870,26619] 0 2026-03-09T19:27:04.116 INFO:tasks.workunit.client.1.vm08.stdout:6/316: write d3/d34/d6f/f39 [250710,55759] 0 2026-03-09T19:27:04.117 INFO:tasks.workunit.client.1.vm08.stdout:6/317: chown d3/d34/d3b 75177025 1 2026-03-09T19:27:04.117 INFO:tasks.workunit.client.1.vm08.stdout:6/318: truncate d3/d34/f76 1025016 0 2026-03-09T19:27:04.123 INFO:tasks.workunit.client.1.vm08.stdout:1/420: truncate d9/da/dc/f78 822860 0 2026-03-09T19:27:04.128 INFO:tasks.workunit.client.0.vm07.stdout:8/20: write d7/f8 [987620,11054] 0 2026-03-09T19:27:04.128 INFO:tasks.workunit.client.0.vm07.stdout:8/21: readlink d7/d9/la 0 2026-03-09T19:27:04.128 INFO:tasks.workunit.client.1.vm08.stdout:0/330: symlink dd/d22/d27/l67 0 2026-03-09T19:27:04.128 INFO:tasks.workunit.client.1.vm08.stdout:0/331: truncate dd/d22/d27/d2e/f39 910444 0 2026-03-09T19:27:04.128 INFO:tasks.workunit.client.1.vm08.stdout:9/310: dread d0/d2/d14/f4d [0,4194304] 0 2026-03-09T19:27:04.132 
INFO:tasks.workunit.client.1.vm08.stdout:4/311: dread da/d10/f2e [0,4194304] 0 2026-03-09T19:27:04.132 INFO:tasks.workunit.client.1.vm08.stdout:4/312: chown da/f21 889 1 2026-03-09T19:27:04.133 INFO:tasks.workunit.client.0.vm07.stdout:1/27: mknod d1/d3/cd 0 2026-03-09T19:27:04.135 INFO:tasks.workunit.client.0.vm07.stdout:5/25: rmdir d3 39 2026-03-09T19:27:04.140 INFO:tasks.workunit.client.1.vm08.stdout:4/313: dwrite da/d10/d26/d27/f35 [0,4194304] 0 2026-03-09T19:27:04.140 INFO:tasks.workunit.client.1.vm08.stdout:7/325: sync 2026-03-09T19:27:04.147 INFO:tasks.workunit.client.0.vm07.stdout:6/19: mknod d0/d1/c5 0 2026-03-09T19:27:04.147 INFO:tasks.workunit.client.0.vm07.stdout:7/13: write d0/f1 [922410,97104] 0 2026-03-09T19:27:04.157 INFO:tasks.workunit.client.1.vm08.stdout:3/342: readlink d0/l55 0 2026-03-09T19:27:04.159 INFO:tasks.workunit.client.0.vm07.stdout:0/18: mknod d0/c3 0 2026-03-09T19:27:04.162 INFO:tasks.workunit.client.0.vm07.stdout:9/25: rename d0 to d0/d5 22 2026-03-09T19:27:04.163 INFO:tasks.workunit.client.1.vm08.stdout:8/351: rename de/d1d/d2e/f6e to de/d1d/d2e/d5f/f80 0 2026-03-09T19:27:04.164 INFO:tasks.workunit.client.0.vm07.stdout:3/15: dwrite f0 [0,4194304] 0 2026-03-09T19:27:04.165 INFO:tasks.workunit.client.0.vm07.stdout:3/16: write f0 [4962257,5729] 0 2026-03-09T19:27:04.167 INFO:tasks.workunit.client.0.vm07.stdout:9/26: dread d0/f4 [0,4194304] 0 2026-03-09T19:27:04.175 INFO:tasks.workunit.client.0.vm07.stdout:9/27: read d0/f4 [3319517,60413] 0 2026-03-09T19:27:04.175 INFO:tasks.workunit.client.0.vm07.stdout:1/28: unlink d1/db/fc 0 2026-03-09T19:27:04.175 INFO:tasks.workunit.client.0.vm07.stdout:1/29: write d1/f2 [678388,102514] 0 2026-03-09T19:27:04.175 INFO:tasks.workunit.client.1.vm08.stdout:3/343: sync 2026-03-09T19:27:04.182 INFO:tasks.workunit.client.0.vm07.stdout:6/20: dwrite d0/f4 [4194304,4194304] 0 2026-03-09T19:27:04.182 INFO:tasks.workunit.client.0.vm07.stdout:5/26: dwrite d3/f5 [0,4194304] 0 2026-03-09T19:27:04.182 
INFO:tasks.workunit.client.1.vm08.stdout:0/332: unlink dd/d31/f34 0 2026-03-09T19:27:04.183 INFO:tasks.workunit.client.0.vm07.stdout:6/21: stat d0/d1/c5 0 2026-03-09T19:27:04.187 INFO:tasks.workunit.client.0.vm07.stdout:5/27: dread d3/f5 [0,4194304] 0 2026-03-09T19:27:04.198 INFO:tasks.workunit.client.0.vm07.stdout:5/28: dwrite d3/f5 [0,4194304] 0 2026-03-09T19:27:04.203 INFO:tasks.workunit.client.1.vm08.stdout:2/287: creat d3/d9/f67 x:0 0 0 2026-03-09T19:27:04.204 INFO:tasks.workunit.client.1.vm08.stdout:4/314: truncate da/f18 1733008 0 2026-03-09T19:27:04.205 INFO:tasks.workunit.client.0.vm07.stdout:2/22: getdents d3 0 2026-03-09T19:27:04.205 INFO:tasks.workunit.client.0.vm07.stdout:4/17: creat f2 x:0 0 0 2026-03-09T19:27:04.205 INFO:tasks.workunit.client.0.vm07.stdout:4/18: readlink - no filename 2026-03-09T19:27:04.205 INFO:tasks.workunit.client.0.vm07.stdout:0/19: creat d0/f4 x:0 0 0 2026-03-09T19:27:04.205 INFO:tasks.workunit.client.0.vm07.stdout:3/17: creat d1/f4 x:0 0 0 2026-03-09T19:27:04.205 INFO:tasks.workunit.client.0.vm07.stdout:9/28: mkdir d0/d6 0 2026-03-09T19:27:04.207 INFO:tasks.workunit.client.1.vm08.stdout:6/319: rename d3/d34/d6f/d4b/f65 to d3/d68/f77 0 2026-03-09T19:27:04.208 INFO:tasks.workunit.client.0.vm07.stdout:3/18: truncate d1/f4 797067 0 2026-03-09T19:27:04.210 INFO:tasks.workunit.client.0.vm07.stdout:3/19: write d1/f4 [1352960,92188] 0 2026-03-09T19:27:04.222 INFO:tasks.workunit.client.0.vm07.stdout:8/22: read d7/f8 [909331,50707] 0 2026-03-09T19:27:04.223 INFO:tasks.workunit.client.1.vm08.stdout:8/352: creat de/d47/f81 x:0 0 0 2026-03-09T19:27:04.223 INFO:tasks.workunit.client.0.vm07.stdout:0/20: dwrite d0/f4 [0,4194304] 0 2026-03-09T19:27:04.226 INFO:tasks.workunit.client.1.vm08.stdout:3/344: fdatasync d0/d6/d25/f56 0 2026-03-09T19:27:04.227 INFO:tasks.workunit.client.1.vm08.stdout:3/345: stat d0/d8/f4a 0 2026-03-09T19:27:04.237 INFO:tasks.workunit.client.1.vm08.stdout:9/311: truncate d0/d2/f21 526897 0 2026-03-09T19:27:04.237 
INFO:tasks.workunit.client.0.vm07.stdout:0/21: dwrite d0/f2 [0,4194304] 0 2026-03-09T19:27:04.237 INFO:tasks.workunit.client.0.vm07.stdout:0/22: dwrite d0/f2 [0,4194304] 0 2026-03-09T19:27:04.237 INFO:tasks.workunit.client.0.vm07.stdout:2/23: creat d3/fa x:0 0 0 2026-03-09T19:27:04.241 INFO:tasks.workunit.client.1.vm08.stdout:4/315: chown f9 222353 1 2026-03-09T19:27:04.257 INFO:tasks.workunit.client.0.vm07.stdout:8/23: symlink d7/d9/lb 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.1.vm08.stdout:1/421: dwrite d9/d11/f29 [0,4194304] 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.1.vm08.stdout:6/320: mknod d3/d55/c78 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.1.vm08.stdout:8/353: rename de/d25/d33/d46/d67 to de/d25/d31/d82 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.1.vm08.stdout:3/346: mkdir d0/d6/de/d1a/d63 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.1.vm08.stdout:4/316: creat da/d14/d2c/f60 x:0 0 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.1.vm08.stdout:4/317: truncate da/d14/d2c/f4a 538251 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.0.vm07.stdout:1/30: symlink d1/d9/le 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.0.vm07.stdout:7/14: link d0/l2 d0/l7 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.0.vm07.stdout:5/29: creat d3/d9/fa x:0 0 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.0.vm07.stdout:0/23: symlink d0/l5 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.0.vm07.stdout:4/19: mkdir d3 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.0.vm07.stdout:2/24: unlink d3/c6 0 2026-03-09T19:27:04.274 INFO:tasks.workunit.client.0.vm07.stdout:4/20: dwrite f2 [0,4194304] 0 2026-03-09T19:27:04.281 INFO:tasks.workunit.client.1.vm08.stdout:6/321: mkdir d3/d68/d79 0 2026-03-09T19:27:04.328 INFO:tasks.workunit.client.1.vm08.stdout:3/347: dread d0/d6/de/d1b/d16/d18/f2c [0,4194304] 0 2026-03-09T19:27:04.328 INFO:tasks.workunit.client.1.vm08.stdout:3/348: write d0/d8/d19/f44 [2317191,25536] 
0 2026-03-09T19:27:04.328 INFO:tasks.workunit.client.1.vm08.stdout:3/349: write d0/d8/f5b [638935,124345] 0 2026-03-09T19:27:04.328 INFO:tasks.workunit.client.1.vm08.stdout:8/354: rename de/f1b to de/d25/d33/f83 0 2026-03-09T19:27:04.328 INFO:tasks.workunit.client.1.vm08.stdout:7/326: link d5/d16/d1c/c36 d5/d16/d3a/c72 0 2026-03-09T19:27:04.328 INFO:tasks.workunit.client.1.vm08.stdout:7/327: dread d5/d14/f59 [0,4194304] 0 2026-03-09T19:27:04.328 INFO:tasks.workunit.client.1.vm08.stdout:3/350: dwrite d0/d52/f5c [0,4194304] 0 2026-03-09T19:27:04.328 INFO:tasks.workunit.client.1.vm08.stdout:7/328: mkdir d5/d16/d1c/d73 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:3/351: symlink d0/d6/de/d15/l64 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:7/329: dread d5/d16/d1c/f29 [0,4194304] 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:7/330: dread - d5/d16/d1c/f5a zero size 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:6/322: link d3/db/l1a d3/d34/d3b/l7a 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:7/331: symlink d5/d16/d1c/l74 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:6/323: creat d3/d15/f7b x:0 0 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:7/332: dread d5/d16/d1c/f29 [0,4194304] 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:6/324: chown d3/f32 81068879 1 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:7/333: read d5/d14/d2b/d4b/f66 [15435,69188] 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:7/334: stat d5/d14/d38 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:3/352: mknod d0/d8/c65 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:6/325: dwrite d3/fc [4194304,4194304] 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:7/335: creat d5/d14/d27/d54/f75 x:0 0 0 2026-03-09T19:27:04.329 
INFO:tasks.workunit.client.1.vm08.stdout:7/336: mknod d5/d14/d38/c76 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:7/337: read d5/d12/f19 [4052632,10223] 0 2026-03-09T19:27:04.329 INFO:tasks.workunit.client.1.vm08.stdout:7/338: write d5/d16/d3a/d42/f71 [799946,111656] 0 2026-03-09T19:27:04.337 INFO:tasks.workunit.client.1.vm08.stdout:1/422: dread d9/da/dc/f2e [0,4194304] 0 2026-03-09T19:27:04.341 INFO:tasks.workunit.client.1.vm08.stdout:3/353: creat d0/d8/f66 x:0 0 0 2026-03-09T19:27:04.347 INFO:tasks.workunit.client.1.vm08.stdout:6/326: dread d3/f5 [0,4194304] 0 2026-03-09T19:27:04.347 INFO:tasks.workunit.client.1.vm08.stdout:6/327: chown d3/f6 153 1 2026-03-09T19:27:04.349 INFO:tasks.workunit.client.1.vm08.stdout:7/339: chown d5/d16/d3a/c72 17045 1 2026-03-09T19:27:04.350 INFO:tasks.workunit.client.1.vm08.stdout:7/340: stat d5/d16 0 2026-03-09T19:27:04.350 INFO:tasks.workunit.client.1.vm08.stdout:7/341: fdatasync d5/d16/d3a/f64 0 2026-03-09T19:27:04.351 INFO:tasks.workunit.client.1.vm08.stdout:7/342: truncate d5/d14/d38/f3c 505149 0 2026-03-09T19:27:04.353 INFO:tasks.workunit.client.1.vm08.stdout:7/343: chown d5/d14/d2b/d4b/f66 147 1 2026-03-09T19:27:04.353 INFO:tasks.workunit.client.1.vm08.stdout:6/328: creat d3/d34/d5c/f7c x:0 0 0 2026-03-09T19:27:04.353 INFO:tasks.workunit.client.1.vm08.stdout:7/344: chown d5/d16/d3a/d42/f68 12493972 1 2026-03-09T19:27:04.354 INFO:tasks.workunit.client.1.vm08.stdout:7/345: stat d5/d16/d3a/d42/f65 0 2026-03-09T19:27:04.358 INFO:tasks.workunit.client.1.vm08.stdout:3/354: link d0/d6/de/d1b/l2f d0/d6/de/d1b/d16/l67 0 2026-03-09T19:27:04.360 INFO:tasks.workunit.client.1.vm08.stdout:6/329: write d3/d34/d6f/f2f [1145125,7942] 0 2026-03-09T19:27:04.360 INFO:tasks.workunit.client.1.vm08.stdout:7/346: symlink d5/d14/d38/l77 0 2026-03-09T19:27:04.363 INFO:tasks.workunit.client.1.vm08.stdout:3/355: creat d0/d6/de/d1a/d33/f68 x:0 0 0 2026-03-09T19:27:04.363 INFO:tasks.workunit.client.1.vm08.stdout:7/347: dwrite 
d5/d16/d1c/f5a [0,4194304] 0 2026-03-09T19:27:04.368 INFO:tasks.workunit.client.1.vm08.stdout:3/356: readlink d0/d4b/l3d 0 2026-03-09T19:27:04.368 INFO:tasks.workunit.client.1.vm08.stdout:7/348: mkdir d5/d14/d27/d78 0 2026-03-09T19:27:04.370 INFO:tasks.workunit.client.1.vm08.stdout:7/349: symlink d5/d14/d38/l79 0 2026-03-09T19:27:04.371 INFO:tasks.workunit.client.1.vm08.stdout:3/357: mknod d0/d6/de/c69 0 2026-03-09T19:27:04.377 INFO:tasks.workunit.client.1.vm08.stdout:7/350: link d5/d16/c1d d5/d14/c7a 0 2026-03-09T19:27:04.377 INFO:tasks.workunit.client.1.vm08.stdout:7/351: fdatasync d5/d16/d3a/f64 0 2026-03-09T19:27:04.378 INFO:tasks.workunit.client.1.vm08.stdout:7/352: write d5/d16/d3a/d42/f65 [176192,5170] 0 2026-03-09T19:27:04.382 INFO:tasks.workunit.client.1.vm08.stdout:1/423: sync 2026-03-09T19:27:04.418 INFO:tasks.workunit.client.1.vm08.stdout:1/424: sync 2026-03-09T19:27:04.426 INFO:tasks.workunit.client.1.vm08.stdout:3/358: dread d0/d8/d19/f38 [0,4194304] 0 2026-03-09T19:27:04.435 INFO:tasks.workunit.client.1.vm08.stdout:1/425: unlink d9/da/d17/f2a 0 2026-03-09T19:27:04.439 INFO:tasks.workunit.client.1.vm08.stdout:3/359: creat d0/d52/f6a x:0 0 0 2026-03-09T19:27:04.440 INFO:tasks.workunit.client.1.vm08.stdout:3/360: truncate d0/d52/f5d 4606532 0 2026-03-09T19:27:04.456 INFO:tasks.workunit.client.0.vm07.stdout:3/20: dread d1/f4 [0,4194304] 0 2026-03-09T19:27:04.456 INFO:tasks.workunit.client.0.vm07.stdout:3/21: fdatasync f0 0 2026-03-09T19:27:04.458 INFO:tasks.workunit.client.1.vm08.stdout:3/361: mknod d0/d6/de/d15/c6b 0 2026-03-09T19:27:04.459 INFO:tasks.workunit.client.0.vm07.stdout:6/22: fsync d0/f4 0 2026-03-09T19:27:04.462 INFO:tasks.workunit.client.1.vm08.stdout:6/330: rmdir d3/d34/d6f/d4b 39 2026-03-09T19:27:04.463 INFO:tasks.workunit.client.1.vm08.stdout:1/426: creat d9/d11/d7a/f80 x:0 0 0 2026-03-09T19:27:04.465 INFO:tasks.workunit.client.1.vm08.stdout:3/362: creat d0/d6/de/d1b/f6c x:0 0 0 2026-03-09T19:27:04.469 
INFO:tasks.workunit.client.1.vm08.stdout:1/427: mknod d9/d40/c81 0 2026-03-09T19:27:04.481 INFO:tasks.workunit.client.1.vm08.stdout:1/428: getdents d9/d11 0 2026-03-09T19:27:04.490 INFO:tasks.workunit.client.1.vm08.stdout:1/429: dread d9/da/dc/f31 [0,4194304] 0 2026-03-09T19:27:04.510 INFO:tasks.workunit.client.1.vm08.stdout:1/430: dread d9/da/dc/f1d [0,4194304] 0 2026-03-09T19:27:04.510 INFO:tasks.workunit.client.1.vm08.stdout:1/431: chown d9/d40 104799725 1 2026-03-09T19:27:04.514 INFO:tasks.workunit.client.1.vm08.stdout:0/333: write dd/f13 [894386,40391] 0 2026-03-09T19:27:04.517 INFO:tasks.workunit.client.1.vm08.stdout:5/292: truncate d16/f17 3618722 0 2026-03-09T19:27:04.518 INFO:tasks.workunit.client.1.vm08.stdout:2/288: write d3/d9/f20 [2711104,114792] 0 2026-03-09T19:27:04.524 INFO:tasks.workunit.client.0.vm07.stdout:9/29: truncate d0/f4 1819030 0 2026-03-09T19:27:04.524 INFO:tasks.workunit.client.0.vm07.stdout:9/30: chown d0/l1 1 1 2026-03-09T19:27:04.526 INFO:tasks.workunit.client.1.vm08.stdout:7/353: getdents d5/d16/d3a 0 2026-03-09T19:27:04.527 INFO:tasks.workunit.client.1.vm08.stdout:9/312: dwrite d0/fa [0,4194304] 0 2026-03-09T19:27:04.527 INFO:tasks.workunit.client.1.vm08.stdout:7/354: chown d5/d16/f49 37799196 1 2026-03-09T19:27:04.527 INFO:tasks.workunit.client.0.vm07.stdout:9/31: dwrite d0/f2 [0,4194304] 0 2026-03-09T19:27:04.531 INFO:tasks.workunit.client.1.vm08.stdout:1/432: mknod d9/da/d53/d67/d6c/c82 0 2026-03-09T19:27:04.532 INFO:tasks.workunit.client.1.vm08.stdout:1/433: write d9/da/f1e [4878539,70012] 0 2026-03-09T19:27:04.549 INFO:tasks.workunit.client.1.vm08.stdout:5/293: unlink f2 0 2026-03-09T19:27:04.554 INFO:tasks.workunit.client.1.vm08.stdout:4/318: truncate da/d10/d26/d38/f43 906439 0 2026-03-09T19:27:04.561 INFO:tasks.workunit.client.1.vm08.stdout:4/319: dread - da/d10/d26/d38/f57 zero size 2026-03-09T19:27:04.561 INFO:tasks.workunit.client.1.vm08.stdout:1/434: unlink d9/da/dc/l71 0 2026-03-09T19:27:04.561 
INFO:tasks.workunit.client.1.vm08.stdout:1/435: readlink d9/l59 0 2026-03-09T19:27:04.565 INFO:tasks.workunit.client.1.vm08.stdout:5/294: dwrite d16/d1e/f35 [0,4194304] 0 2026-03-09T19:27:04.566 INFO:tasks.workunit.client.1.vm08.stdout:5/295: dread - d16/d45/f55 zero size 2026-03-09T19:27:04.579 INFO:tasks.workunit.client.1.vm08.stdout:4/320: dread da/d10/f1c [0,4194304] 0 2026-03-09T19:27:04.603 INFO:tasks.workunit.client.1.vm08.stdout:2/289: creat d3/d4/d23/d2c/d39/d5e/f68 x:0 0 0 2026-03-09T19:27:04.609 INFO:tasks.workunit.client.1.vm08.stdout:9/313: truncate d0/d2/d8/f61 1369282 0 2026-03-09T19:27:04.611 INFO:tasks.workunit.client.1.vm08.stdout:7/355: rename d5/c6 to d5/d14/c7b 0 2026-03-09T19:27:04.612 INFO:tasks.workunit.client.1.vm08.stdout:4/321: symlink da/d14/l61 0 2026-03-09T19:27:04.613 INFO:tasks.workunit.client.1.vm08.stdout:4/322: fdatasync da/d10/d26/d27/d32/f39 0 2026-03-09T19:27:04.616 INFO:tasks.workunit.client.1.vm08.stdout:9/314: dread d0/d2/d8/f29 [0,4194304] 0 2026-03-09T19:27:04.617 INFO:tasks.workunit.client.1.vm08.stdout:2/290: creat d3/d9/d26/f69 x:0 0 0 2026-03-09T19:27:04.620 INFO:tasks.workunit.client.1.vm08.stdout:7/356: creat d5/d16/f7c x:0 0 0 2026-03-09T19:27:04.622 INFO:tasks.workunit.client.1.vm08.stdout:4/323: symlink da/d14/l62 0 2026-03-09T19:27:04.624 INFO:tasks.workunit.client.1.vm08.stdout:4/324: read da/d10/d1b/f29 [7976909,46431] 0 2026-03-09T19:27:04.624 INFO:tasks.workunit.client.1.vm08.stdout:4/325: write da/d10/d16/d28/f44 [3853647,11281] 0 2026-03-09T19:27:04.631 INFO:tasks.workunit.client.1.vm08.stdout:2/291: creat d3/d9/d26/f6a x:0 0 0 2026-03-09T19:27:04.634 INFO:tasks.workunit.client.1.vm08.stdout:5/296: link d16/d45/f46 d16/d1e/f5c 0 2026-03-09T19:27:04.635 INFO:tasks.workunit.client.1.vm08.stdout:2/292: dwrite d3/d9/d26/f35 [0,4194304] 0 2026-03-09T19:27:04.648 INFO:tasks.workunit.client.1.vm08.stdout:7/357: creat d5/d16/d3a/d42/d6a/f7d x:0 0 0 2026-03-09T19:27:04.651 
INFO:tasks.workunit.client.1.vm08.stdout:7/358: dread d5/f1a [0,4194304] 0 2026-03-09T19:27:04.652 INFO:tasks.workunit.client.1.vm08.stdout:7/359: stat d5/d14/c31 0 2026-03-09T19:27:04.654 INFO:tasks.workunit.client.1.vm08.stdout:4/326: mknod da/d10/d26/d50/c63 0 2026-03-09T19:27:04.662 INFO:tasks.workunit.client.1.vm08.stdout:5/297: creat d16/d45/f5d x:0 0 0 2026-03-09T19:27:04.663 INFO:tasks.workunit.client.0.vm07.stdout:8/24: mknod d7/d9/cc 0 2026-03-09T19:27:04.663 INFO:tasks.workunit.client.0.vm07.stdout:1/31: mknod d1/cf 0 2026-03-09T19:27:04.665 INFO:tasks.workunit.client.1.vm08.stdout:2/293: creat d3/d4/d23/d2c/d41/f6b x:0 0 0 2026-03-09T19:27:04.667 INFO:tasks.workunit.client.1.vm08.stdout:7/360: creat d5/d14/d2b/d4b/f7e x:0 0 0 2026-03-09T19:27:04.668 INFO:tasks.workunit.client.0.vm07.stdout:5/30: creat d3/d9/fb x:0 0 0 2026-03-09T19:27:04.668 INFO:tasks.workunit.client.0.vm07.stdout:5/31: dread - d3/d9/fb zero size 2026-03-09T19:27:04.669 INFO:tasks.workunit.client.0.vm07.stdout:0/24: unlink d0/l5 0 2026-03-09T19:27:04.670 INFO:tasks.workunit.client.1.vm08.stdout:2/294: mknod d3/d4/d23/c6c 0 2026-03-09T19:27:04.675 INFO:tasks.workunit.client.1.vm08.stdout:2/295: dwrite d3/d9/d4a/f59 [0,4194304] 0 2026-03-09T19:27:04.677 INFO:tasks.workunit.client.1.vm08.stdout:2/296: write d3/d4/f48 [1954861,106278] 0 2026-03-09T19:27:04.686 INFO:tasks.workunit.client.0.vm07.stdout:2/25: mknod d3/cb 0 2026-03-09T19:27:04.686 INFO:tasks.workunit.client.0.vm07.stdout:2/26: write f2 [989852,14353] 0 2026-03-09T19:27:04.694 INFO:tasks.workunit.client.1.vm08.stdout:7/361: creat d5/d14/d27/d78/f7f x:0 0 0 2026-03-09T19:27:04.695 INFO:tasks.workunit.client.1.vm08.stdout:7/362: chown d5/d14/d27/d54/f75 85518749 1 2026-03-09T19:27:04.696 INFO:tasks.workunit.client.0.vm07.stdout:6/23: creat d0/d1/f6 x:0 0 0 2026-03-09T19:27:04.696 INFO:tasks.workunit.client.1.vm08.stdout:2/297: symlink d3/l6d 0 2026-03-09T19:27:04.697 INFO:tasks.workunit.client.0.vm07.stdout:3/22: dwrite d1/f4 
[0,4194304] 0 2026-03-09T19:27:04.698 INFO:tasks.workunit.client.0.vm07.stdout:3/23: rename d1 to d1/d5 22 2026-03-09T19:27:04.698 INFO:tasks.workunit.client.0.vm07.stdout:3/24: readlink d1/l3 0 2026-03-09T19:27:04.708 INFO:tasks.workunit.client.0.vm07.stdout:3/25: dwrite f0 [4194304,4194304] 0 2026-03-09T19:27:04.708 INFO:tasks.workunit.client.1.vm08.stdout:7/363: write d5/fb [2139343,113255] 0 2026-03-09T19:27:04.708 INFO:tasks.workunit.client.1.vm08.stdout:7/364: fdatasync d5/d14/d27/f35 0 2026-03-09T19:27:04.713 INFO:tasks.workunit.client.1.vm08.stdout:2/298: mknod d3/d4/d23/d2c/d41/c6e 0 2026-03-09T19:27:04.715 INFO:tasks.workunit.client.0.vm07.stdout:9/32: symlink d0/l7 0 2026-03-09T19:27:04.716 INFO:tasks.workunit.client.0.vm07.stdout:8/25: creat d7/d9/fd x:0 0 0 2026-03-09T19:27:04.717 INFO:tasks.workunit.client.1.vm08.stdout:7/365: read d5/f9 [2359718,92209] 0 2026-03-09T19:27:04.718 INFO:tasks.workunit.client.1.vm08.stdout:7/366: truncate d5/d14/f6c 950312 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.1.vm08.stdout:2/299: mknod d3/d4/d23/d2c/d39/d5e/de/c6f 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.1.vm08.stdout:2/300: readlink d3/d4/d23/d2c/d39/d5e/de/d18/d1f/l43 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.1.vm08.stdout:2/301: fdatasync d3/d4/d23/d2c/f31 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.1.vm08.stdout:3/363: rmdir d0/d6/de/d15 39 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.1.vm08.stdout:3/364: write d0/d6/de/d1a/d33/f68 [497159,29866] 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.1.vm08.stdout:7/367: mknod d5/d12/c80 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.1.vm08.stdout:7/368: truncate d5/d14/d27/d78/f7f 6066 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.1.vm08.stdout:2/302: fdatasync d3/d4/d23/d2c/d39/d5e/de/d18/f3f 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.1.vm08.stdout:6/331: dwrite d3/db/f14 [0,4194304] 0 2026-03-09T19:27:04.736 
INFO:tasks.workunit.client.0.vm07.stdout:7/15: getdents d0/d4/d5 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.0.vm07.stdout:5/32: creat d3/d9/fc x:0 0 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.0.vm07.stdout:7/16: dread d0/f1 [0,4194304] 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.0.vm07.stdout:2/27: creat d3/fc x:0 0 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.0.vm07.stdout:2/28: chown d3/f7 3488224 1 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.0.vm07.stdout:2/29: chown d3/f4 3466 1 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.0.vm07.stdout:6/24: symlink d0/d1/l7 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.0.vm07.stdout:2/30: dwrite d3/fc [0,4194304] 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.0.vm07.stdout:6/25: write d0/f3 [7281748,105583] 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.0.vm07.stdout:3/26: mkdir d1/d6 0 2026-03-09T19:27:04.736 INFO:tasks.workunit.client.0.vm07.stdout:3/27: readlink d1/l3 0 2026-03-09T19:27:04.737 INFO:tasks.workunit.client.0.vm07.stdout:0/25: sync 2026-03-09T19:27:04.740 INFO:tasks.workunit.client.1.vm08.stdout:3/365: unlink d0/d8/d19/f38 0 2026-03-09T19:27:04.744 INFO:tasks.workunit.client.0.vm07.stdout:3/28: dread d1/f4 [0,4194304] 0 2026-03-09T19:27:04.749 INFO:tasks.workunit.client.0.vm07.stdout:8/26: creat d7/fe x:0 0 0 2026-03-09T19:27:04.754 INFO:tasks.workunit.client.0.vm07.stdout:5/33: mkdir d3/dd 0 2026-03-09T19:27:04.754 INFO:tasks.workunit.client.0.vm07.stdout:7/17: stat d0/l2 0 2026-03-09T19:27:04.755 INFO:tasks.workunit.client.0.vm07.stdout:5/34: dwrite d3/d9/fb [0,4194304] 0 2026-03-09T19:27:04.755 INFO:tasks.workunit.client.1.vm08.stdout:3/366: dwrite d0/d6/de/d15/f53 [0,4194304] 0 2026-03-09T19:27:04.762 INFO:tasks.workunit.client.0.vm07.stdout:5/35: dwrite f2 [0,4194304] 0 2026-03-09T19:27:04.764 INFO:tasks.workunit.client.0.vm07.stdout:5/36: chown d3/c8 1 1 2026-03-09T19:27:04.766 INFO:tasks.workunit.client.1.vm08.stdout:5/298: sync 
2026-03-09T19:27:04.773 INFO:tasks.workunit.client.1.vm08.stdout:5/299: write d16/d45/f4a [2479845,15519] 0 2026-03-09T19:27:04.773 INFO:tasks.workunit.client.1.vm08.stdout:5/300: stat d16/d1e/d30/l51 0 2026-03-09T19:27:04.773 INFO:tasks.workunit.client.1.vm08.stdout:7/369: symlink d5/d14/l81 0 2026-03-09T19:27:04.775 INFO:tasks.workunit.client.1.vm08.stdout:7/370: write d5/d16/d3a/d42/d6a/f61 [967480,60239] 0 2026-03-09T19:27:04.778 INFO:tasks.workunit.client.0.vm07.stdout:2/31: mkdir d3/dd 0 2026-03-09T19:27:04.793 INFO:tasks.workunit.client.1.vm08.stdout:7/371: truncate d5/d16/f4a 782552 0 2026-03-09T19:27:04.793 INFO:tasks.workunit.client.1.vm08.stdout:7/372: link d5/d14/d2b/d5d/l53 d5/d16/d3a/l82 0 2026-03-09T19:27:04.793 INFO:tasks.workunit.client.1.vm08.stdout:7/373: mkdir d5/d16/d1c/d83 0 2026-03-09T19:27:04.793 INFO:tasks.workunit.client.1.vm08.stdout:7/374: dwrite d5/d14/d27/d54/f5e [0,4194304] 0 2026-03-09T19:27:04.794 INFO:tasks.workunit.client.0.vm07.stdout:2/32: readlink d3/l9 0 2026-03-09T19:27:04.794 INFO:tasks.workunit.client.0.vm07.stdout:6/26: creat d0/d1/f8 x:0 0 0 2026-03-09T19:27:04.794 INFO:tasks.workunit.client.0.vm07.stdout:0/26: unlink d0/f4 0 2026-03-09T19:27:04.794 INFO:tasks.workunit.client.0.vm07.stdout:0/27: chown d0/c3 48924434 1 2026-03-09T19:27:04.794 INFO:tasks.workunit.client.0.vm07.stdout:9/33: creat d0/d6/f8 x:0 0 0 2026-03-09T19:27:04.794 INFO:tasks.workunit.client.0.vm07.stdout:3/29: creat d1/f7 x:0 0 0 2026-03-09T19:27:04.794 INFO:tasks.workunit.client.0.vm07.stdout:9/34: dread - d0/d6/f8 zero size 2026-03-09T19:27:04.794 INFO:tasks.workunit.client.0.vm07.stdout:2/33: dwrite d3/f7 [0,4194304] 0 2026-03-09T19:27:04.794 INFO:tasks.workunit.client.0.vm07.stdout:5/37: creat d3/fe x:0 0 0 2026-03-09T19:27:04.794 INFO:tasks.workunit.client.0.vm07.stdout:6/27: dwrite d0/f3 [0,4194304] 0 2026-03-09T19:27:04.794 INFO:tasks.workunit.client.1.vm08.stdout:7/375: write d5/d14/d2b/d4b/f7e [815914,6094] 0 2026-03-09T19:27:04.804 
INFO:tasks.workunit.client.0.vm07.stdout:0/28: mkdir d0/d6 0 2026-03-09T19:27:04.804 INFO:tasks.workunit.client.1.vm08.stdout:2/303: sync 2026-03-09T19:27:04.804 INFO:tasks.workunit.client.0.vm07.stdout:7/18: dwrite d0/f1 [0,4194304] 0 2026-03-09T19:27:04.807 INFO:tasks.workunit.client.0.vm07.stdout:3/30: rename d1/f4 to d1/f8 0 2026-03-09T19:27:04.809 INFO:tasks.workunit.client.0.vm07.stdout:0/29: write d0/f2 [1386318,141] 0 2026-03-09T19:27:04.809 INFO:tasks.workunit.client.0.vm07.stdout:3/31: chown d1/l3 1148891 1 2026-03-09T19:27:04.809 INFO:tasks.workunit.client.0.vm07.stdout:2/34: dread f0 [0,4194304] 0 2026-03-09T19:27:04.809 INFO:tasks.workunit.client.1.vm08.stdout:5/301: sync 2026-03-09T19:27:04.826 INFO:tasks.workunit.client.1.vm08.stdout:6/332: read d3/d15/f19 [3365622,81222] 0 2026-03-09T19:27:04.829 INFO:tasks.workunit.client.0.vm07.stdout:7/19: sync 2026-03-09T19:27:04.837 INFO:tasks.workunit.client.1.vm08.stdout:5/302: creat d16/d1e/d3b/f5e x:0 0 0 2026-03-09T19:27:04.837 INFO:tasks.workunit.client.1.vm08.stdout:5/303: chown d16/d1e/d30/l51 434784 1 2026-03-09T19:27:04.837 INFO:tasks.workunit.client.0.vm07.stdout:9/35: symlink d0/d6/l9 0 2026-03-09T19:27:04.838 INFO:tasks.workunit.client.0.vm07.stdout:9/36: stat d0/f3 0 2026-03-09T19:27:04.847 INFO:tasks.workunit.client.0.vm07.stdout:8/27: link f4 d7/ff 0 2026-03-09T19:27:04.848 INFO:tasks.workunit.client.1.vm08.stdout:2/304: link d3/d4/d23/d2c/f5b d3/d4/d23/d5c/f70 0 2026-03-09T19:27:04.852 INFO:tasks.workunit.client.1.vm08.stdout:5/304: chown d16/d1e/l36 64700 1 2026-03-09T19:27:04.852 INFO:tasks.workunit.client.1.vm08.stdout:0/334: write dd/d22/d24/d49/f4c [530128,104533] 0 2026-03-09T19:27:04.853 INFO:tasks.workunit.client.1.vm08.stdout:0/335: fsync dd/d31/f54 0 2026-03-09T19:27:04.853 INFO:tasks.workunit.client.1.vm08.stdout:0/336: write dd/f13 [2755762,3640] 0 2026-03-09T19:27:04.855 INFO:tasks.workunit.client.0.vm07.stdout:0/30: mkdir d0/d7 0 2026-03-09T19:27:04.858 
INFO:tasks.workunit.client.1.vm08.stdout:1/436: truncate d9/da/dc/f78 602570 0 2026-03-09T19:27:04.863 INFO:tasks.workunit.client.0.vm07.stdout:0/31: dwrite d0/f2 [4194304,4194304] 0 2026-03-09T19:27:04.863 INFO:tasks.workunit.client.1.vm08.stdout:0/337: write dd/d22/f29 [2141099,36589] 0 2026-03-09T19:27:04.865 INFO:tasks.workunit.client.1.vm08.stdout:1/437: symlink d9/da/dc/l83 0 2026-03-09T19:27:04.866 INFO:tasks.workunit.client.1.vm08.stdout:8/355: write de/d1d/d21/f30 [1236723,10839] 0 2026-03-09T19:27:04.867 INFO:tasks.workunit.client.0.vm07.stdout:8/28: unlink d7/fe 0 2026-03-09T19:27:04.869 INFO:tasks.workunit.client.1.vm08.stdout:1/438: symlink d9/da/d53/d67/l84 0 2026-03-09T19:27:04.870 INFO:tasks.workunit.client.1.vm08.stdout:1/439: fsync d9/d11/f73 0 2026-03-09T19:27:04.872 INFO:tasks.workunit.client.0.vm07.stdout:2/35: creat d3/dd/fe x:0 0 0 2026-03-09T19:27:04.876 INFO:tasks.workunit.client.0.vm07.stdout:3/32: creat d1/d6/f9 x:0 0 0 2026-03-09T19:27:04.876 INFO:tasks.workunit.client.1.vm08.stdout:5/305: creat d16/d1e/f5f x:0 0 0 2026-03-09T19:27:04.878 INFO:tasks.workunit.client.0.vm07.stdout:7/20: mkdir d0/d4/d5/d8 0 2026-03-09T19:27:04.879 INFO:tasks.workunit.client.1.vm08.stdout:0/338: mknod dd/d22/d24/c68 0 2026-03-09T19:27:04.880 INFO:tasks.workunit.client.0.vm07.stdout:7/21: dread d0/f1 [0,4194304] 0 2026-03-09T19:27:04.880 INFO:tasks.workunit.client.1.vm08.stdout:0/339: dread - dd/d22/d24/d49/f62 zero size 2026-03-09T19:27:04.880 INFO:tasks.workunit.client.1.vm08.stdout:5/306: dwrite d16/d1e/f5f [0,4194304] 0 2026-03-09T19:27:04.882 INFO:tasks.workunit.client.1.vm08.stdout:5/307: dread - d16/d45/f5d zero size 2026-03-09T19:27:04.885 INFO:tasks.workunit.client.0.vm07.stdout:7/22: dwrite d0/f1 [4194304,4194304] 0 2026-03-09T19:27:04.892 INFO:tasks.workunit.client.0.vm07.stdout:0/32: rename d0/f2 to d0/d7/f8 0 2026-03-09T19:27:04.893 INFO:tasks.workunit.client.1.vm08.stdout:2/305: sync 2026-03-09T19:27:04.898 
INFO:tasks.workunit.client.1.vm08.stdout:2/306: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/f1a [4194304,4194304] 0 2026-03-09T19:27:04.903 INFO:tasks.workunit.client.1.vm08.stdout:1/440: rename d9/da/l65 to d9/da/d53/l85 0 2026-03-09T19:27:04.905 INFO:tasks.workunit.client.1.vm08.stdout:1/441: dread d9/d11/f29 [0,4194304] 0 2026-03-09T19:27:04.912 INFO:tasks.workunit.client.1.vm08.stdout:8/356: unlink de/d47/f81 0 2026-03-09T19:27:04.915 INFO:tasks.workunit.client.0.vm07.stdout:3/33: sync 2026-03-09T19:27:04.915 INFO:tasks.workunit.client.1.vm08.stdout:5/308: dread ff [0,4194304] 0 2026-03-09T19:27:04.917 INFO:tasks.workunit.client.0.vm07.stdout:8/29: chown f2 3035870 1 2026-03-09T19:27:04.919 INFO:tasks.workunit.client.0.vm07.stdout:2/36: creat d3/ff x:0 0 0 2026-03-09T19:27:04.930 INFO:tasks.workunit.client.1.vm08.stdout:0/340: rename dd/d22/d27/l2d to dd/d22/d24/d49/d50/l69 0 2026-03-09T19:27:04.940 INFO:tasks.workunit.client.1.vm08.stdout:2/307: dread d3/f7 [0,4194304] 0 2026-03-09T19:27:04.951 INFO:tasks.workunit.client.1.vm08.stdout:1/442: write d9/da/dc/f1d [4951192,82544] 0 2026-03-09T19:27:04.952 INFO:tasks.workunit.client.1.vm08.stdout:1/443: fsync d9/f48 0 2026-03-09T19:27:04.952 INFO:tasks.workunit.client.1.vm08.stdout:1/444: fsync d9/da/f6f 0 2026-03-09T19:27:04.958 INFO:tasks.workunit.client.0.vm07.stdout:3/34: creat d1/d6/fa x:0 0 0 2026-03-09T19:27:04.964 INFO:tasks.workunit.client.0.vm07.stdout:8/30: dwrite f4 [0,4194304] 0 2026-03-09T19:27:04.965 INFO:tasks.workunit.client.1.vm08.stdout:9/315: dwrite d0/d2/d8/d7/f3f [0,4194304] 0 2026-03-09T19:27:04.973 INFO:tasks.workunit.client.0.vm07.stdout:0/33: link d0/d7/f8 d0/d7/f9 0 2026-03-09T19:27:04.977 INFO:tasks.workunit.client.1.vm08.stdout:0/341: mkdir dd/d6a 0 2026-03-09T19:27:04.977 INFO:tasks.workunit.client.1.vm08.stdout:0/342: readlink dd/l17 0 2026-03-09T19:27:04.977 INFO:tasks.workunit.client.1.vm08.stdout:0/343: stat dd/d22/d24/c5e 0 2026-03-09T19:27:04.977 
INFO:tasks.workunit.client.1.vm08.stdout:9/316: dread d0/d2/d14/d5c/d32/f40 [0,4194304] 0
2026-03-09T19:27:04.978 INFO:tasks.workunit.client.1.vm08.stdout:9/317: write d0/fa [4630688,125153] 0
2026-03-09T19:27:04.983 INFO:tasks.workunit.client.0.vm07.stdout:3/35: fdatasync f0 0
2026-03-09T19:27:04.985 INFO:tasks.workunit.client.0.vm07.stdout:3/36: dread f0 [4194304,4194304] 0
2026-03-09T19:27:04.986 INFO:tasks.workunit.client.0.vm07.stdout:3/37: write d1/d6/f9 [376336,22646] 0
2026-03-09T19:27:04.990 INFO:tasks.workunit.client.0.vm07.stdout:8/31: mkdir d7/d9/d10 0
2026-03-09T19:27:04.991 INFO:tasks.workunit.client.0.vm07.stdout:7/23: link d0/l3 d0/d4/l9 0
2026-03-09T19:27:04.991 INFO:tasks.workunit.client.1.vm08.stdout:1/445: fdatasync d9/da/dc/f31 0
2026-03-09T19:27:04.992 INFO:tasks.workunit.client.1.vm08.stdout:1/446: write d9/da/d2d/f3d [878665,38148] 0
2026-03-09T19:27:04.994 INFO:tasks.workunit.client.0.vm07.stdout:1/32: rmdir d1 39
2026-03-09T19:27:04.998 INFO:tasks.workunit.client.1.vm08.stdout:5/309: mknod d16/c60 0
2026-03-09T19:27:04.998 INFO:tasks.workunit.client.0.vm07.stdout:1/33: read f0 [163465,52246] 0
2026-03-09T19:27:04.998 INFO:tasks.workunit.client.0.vm07.stdout:4/21: write f2 [4987911,33041] 0
2026-03-09T19:27:05.000 INFO:tasks.workunit.client.0.vm07.stdout:7/24: readlink d0/l3 0
2026-03-09T19:27:05.002 INFO:tasks.workunit.client.1.vm08.stdout:7/376: rmdir d5/d14/d27 39
2026-03-09T19:27:05.008 INFO:tasks.workunit.client.1.vm08.stdout:2/308: symlink d3/d9/l71 0
2026-03-09T19:27:05.013 INFO:tasks.workunit.client.1.vm08.stdout:2/309: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/f1a [4194304,4194304] 0
2026-03-09T19:27:05.027 INFO:tasks.workunit.client.0.vm07.stdout:1/34: creat d1/f10 x:0 0 0
2026-03-09T19:27:05.027 INFO:tasks.workunit.client.1.vm08.stdout:5/310: chown d16/d1e/f27 2 1
2026-03-09T19:27:05.029 INFO:tasks.workunit.client.0.vm07.stdout:8/32: link f5 d7/d9/f11 0
2026-03-09T19:27:05.031 INFO:tasks.workunit.client.0.vm07.stdout:4/22: symlink d3/l4 0
2026-03-09T19:27:05.033 INFO:tasks.workunit.client.0.vm07.stdout:7/25: creat d0/d4/d5/d8/fa x:0 0 0
2026-03-09T19:27:05.035 INFO:tasks.workunit.client.0.vm07.stdout:1/35: unlink f0 0
2026-03-09T19:27:05.036 INFO:tasks.workunit.client.0.vm07.stdout:1/36: write d1/f2 [1145872,116612] 0
2026-03-09T19:27:05.036 INFO:tasks.workunit.client.0.vm07.stdout:1/37: readlink d1/d9/le 0
2026-03-09T19:27:05.040 INFO:tasks.workunit.client.0.vm07.stdout:4/23: creat d3/f5 x:0 0 0
2026-03-09T19:27:05.041 INFO:tasks.workunit.client.1.vm08.stdout:1/447: symlink d9/da/d17/d60/l86 0
2026-03-09T19:27:05.042 INFO:tasks.workunit.client.0.vm07.stdout:7/26: rmdir d0/d4 39
2026-03-09T19:27:05.044 INFO:tasks.workunit.client.0.vm07.stdout:1/38: dwrite d1/f10 [0,4194304] 0
2026-03-09T19:27:05.044 INFO:tasks.workunit.client.1.vm08.stdout:5/311: mkdir d16/d1e/d3b/d61 0
2026-03-09T19:27:05.045 INFO:tasks.workunit.client.0.vm07.stdout:8/33: dwrite d7/d9/f11 [0,4194304] 0
2026-03-09T19:27:05.049 INFO:tasks.workunit.client.0.vm07.stdout:6/28: fdatasync d0/f2 0
2026-03-09T19:27:05.049 INFO:tasks.workunit.client.0.vm07.stdout:8/34: write d7/ff [4306350,119433] 0
2026-03-09T19:27:05.049 INFO:tasks.workunit.client.0.vm07.stdout:6/29: chown d0/d1/l7 88888 1
2026-03-09T19:27:05.049 INFO:tasks.workunit.client.0.vm07.stdout:8/35: dread - d7/d9/fd zero size
2026-03-09T19:27:05.058 INFO:tasks.workunit.client.0.vm07.stdout:6/30: dwrite d0/d1/f8 [0,4194304] 0
2026-03-09T19:27:05.068 INFO:tasks.workunit.client.0.vm07.stdout:8/36: dwrite d7/d9/fd [0,4194304] 0
2026-03-09T19:27:05.070 INFO:tasks.workunit.client.1.vm08.stdout:0/344: rename dd/lf to dd/d22/d27/l6b 0
2026-03-09T19:27:05.075 INFO:tasks.workunit.client.0.vm07.stdout:4/24: fsync f1 0
2026-03-09T19:27:05.077 INFO:tasks.workunit.client.1.vm08.stdout:7/377: link d5/d14/f1e d5/d14/d2b/d5d/f84 0
2026-03-09T19:27:05.080 INFO:tasks.workunit.client.0.vm07.stdout:4/25: dwrite d3/f5 [0,4194304] 0
2026-03-09T19:27:05.083 INFO:tasks.workunit.client.1.vm08.stdout:2/310: creat d3/d4/d23/d2c/d41/d46/f72 x:0 0 0
2026-03-09T19:27:05.087 INFO:tasks.workunit.client.1.vm08.stdout:3/367: write d0/d6/d25/f56 [841981,56507] 0
2026-03-09T19:27:05.087 INFO:tasks.workunit.client.1.vm08.stdout:3/368: read d0/d6/de/d1b/d16/d18/f2c [3553523,114532] 0
2026-03-09T19:27:05.088 INFO:tasks.workunit.client.0.vm07.stdout:1/39: mkdir d1/d11 0
2026-03-09T19:27:05.089 INFO:tasks.workunit.client.1.vm08.stdout:3/369: dwrite d0/d52/f5d [0,4194304] 0
2026-03-09T19:27:05.090 INFO:tasks.workunit.client.1.vm08.stdout:3/370: write d0/d8/f66 [275949,122264] 0
2026-03-09T19:27:05.092 INFO:tasks.workunit.client.1.vm08.stdout:8/357: getdents de/d25/d31 0
2026-03-09T19:27:05.095 INFO:tasks.workunit.client.0.vm07.stdout:4/26: dwrite f2 [4194304,4194304] 0
2026-03-09T19:27:05.099 INFO:tasks.workunit.client.1.vm08.stdout:5/312: mknod d16/d45/c62 0
2026-03-09T19:27:05.101 INFO:tasks.workunit.client.0.vm07.stdout:6/31: rmdir d0 39
2026-03-09T19:27:05.111 INFO:tasks.workunit.client.1.vm08.stdout:7/378: unlink d5/d16/d1c/l44 0
2026-03-09T19:27:05.113 INFO:tasks.workunit.client.0.vm07.stdout:8/37: rename d7/d9/la to d7/d9/l12 0
2026-03-09T19:27:05.115 INFO:tasks.workunit.client.0.vm07.stdout:8/38: dread d7/d9/f11 [0,4194304] 0
2026-03-09T19:27:05.119 INFO:tasks.workunit.client.1.vm08.stdout:9/318: getdents d0/d2/d14/d5c/d32 0
2026-03-09T19:27:05.119 INFO:tasks.workunit.client.0.vm07.stdout:8/39: dwrite d7/d9/fd [0,4194304] 0
2026-03-09T19:27:05.124 INFO:tasks.workunit.client.0.vm07.stdout:3/38: fsync d1/f8 0
2026-03-09T19:27:05.125 INFO:tasks.workunit.client.1.vm08.stdout:6/333: write d3/db/f42 [4565426,114989] 0
2026-03-09T19:27:05.126 INFO:tasks.workunit.client.1.vm08.stdout:6/334: write d3/d34/d6f/f41 [4680474,52632] 0
2026-03-09T19:27:05.126 INFO:tasks.workunit.client.0.vm07.stdout:3/39: truncate d1/f7 757012 0
2026-03-09T19:27:05.131 INFO:tasks.workunit.client.0.vm07.stdout:4/27: sync
2026-03-09T19:27:05.132 INFO:tasks.workunit.client.0.vm07.stdout:3/40: dwrite d1/f7 [0,4194304] 0
2026-03-09T19:27:05.133 INFO:tasks.workunit.client.0.vm07.stdout:4/28: write f2 [2146723,28182] 0
2026-03-09T19:27:05.140 INFO:tasks.workunit.client.0.vm07.stdout:4/29: dwrite d3/f5 [0,4194304] 0
2026-03-09T19:27:05.142 INFO:tasks.workunit.client.1.vm08.stdout:4/327: write da/d10/d26/d38/f43 [1436358,117577] 0
2026-03-09T19:27:05.151 INFO:tasks.workunit.client.0.vm07.stdout:4/30: dwrite f1 [0,4194304] 0
2026-03-09T19:27:05.152 INFO:tasks.workunit.client.0.vm07.stdout:5/38: truncate d3/d9/fb 516038 0
2026-03-09T19:27:05.152 INFO:tasks.workunit.client.1.vm08.stdout:4/328: write da/d10/d1b/f29 [2132486,109831] 0
2026-03-09T19:27:05.152 INFO:tasks.workunit.client.1.vm08.stdout:2/311: stat d3/d4/d23/d2c/d39/l4b 0
2026-03-09T19:27:05.164 INFO:tasks.workunit.client.0.vm07.stdout:1/40: creat d1/d3/f12 x:0 0 0
2026-03-09T19:27:05.164 INFO:tasks.workunit.client.0.vm07.stdout:7/27: symlink d0/d4/d5/d8/lb 0
2026-03-09T19:27:05.164 INFO:tasks.workunit.client.1.vm08.stdout:0/345: mkdir dd/d22/d27/d6c 0
2026-03-09T19:27:05.169 INFO:tasks.workunit.client.0.vm07.stdout:6/32: dread d0/f2 [0,4194304] 0
2026-03-09T19:27:05.173 INFO:tasks.workunit.client.0.vm07.stdout:2/37: getdents d3/dd 0
2026-03-09T19:27:05.174 INFO:tasks.workunit.client.0.vm07.stdout:0/34: truncate d0/d7/f8 2168018 0
2026-03-09T19:27:05.174 INFO:tasks.workunit.client.0.vm07.stdout:6/33: write d0/f4 [5233526,97219] 0
2026-03-09T19:27:05.176 INFO:tasks.workunit.client.0.vm07.stdout:5/39: dwrite d3/fe [0,4194304] 0
2026-03-09T19:27:05.176 INFO:tasks.workunit.client.0.vm07.stdout:6/34: chown d0 0 1
2026-03-09T19:27:05.181 INFO:tasks.workunit.client.0.vm07.stdout:3/41: creat d1/d6/fb x:0 0 0
2026-03-09T19:27:05.183 INFO:tasks.workunit.client.1.vm08.stdout:1/448: truncate d9/f48 695128 0
2026-03-09T19:27:05.184 INFO:tasks.workunit.client.1.vm08.stdout:1/449: dread - d9/da/d12/d39/f47 zero size
2026-03-09T19:27:05.184 INFO:tasks.workunit.client.0.vm07.stdout:2/38: dwrite d3/ff [0,4194304] 0
2026-03-09T19:27:05.184 INFO:tasks.workunit.client.0.vm07.stdout:2/39: fdatasync d3/fc 0
2026-03-09T19:27:05.195 INFO:tasks.workunit.client.0.vm07.stdout:8/40: mknod d7/d9/d10/c13 0
2026-03-09T19:27:05.196 INFO:tasks.workunit.client.0.vm07.stdout:8/41: rename d7 to d7/d9/d14 22
2026-03-09T19:27:05.201 INFO:tasks.workunit.client.0.vm07.stdout:7/28: creat d0/d4/fc x:0 0 0
2026-03-09T19:27:05.207 INFO:tasks.workunit.client.0.vm07.stdout:0/35: fdatasync d0/d7/f8 0
2026-03-09T19:27:05.207 INFO:tasks.workunit.client.1.vm08.stdout:6/335: symlink d3/l7d 0
2026-03-09T19:27:05.207 INFO:tasks.workunit.client.1.vm08.stdout:0/346: dread fb [0,4194304] 0
2026-03-09T19:27:05.207 INFO:tasks.workunit.client.1.vm08.stdout:0/347: chown dd/c11 0 1
2026-03-09T19:27:05.207 INFO:tasks.workunit.client.0.vm07.stdout:4/31: rmdir d3 39
2026-03-09T19:27:05.213 INFO:tasks.workunit.client.0.vm07.stdout:6/35: creat d0/f9 x:0 0 0
2026-03-09T19:27:05.218 INFO:tasks.workunit.client.1.vm08.stdout:1/450: creat d9/d11/f87 x:0 0 0
2026-03-09T19:27:05.218 INFO:tasks.workunit.client.0.vm07.stdout:3/42: creat d1/fc x:0 0 0
2026-03-09T19:27:05.218 INFO:tasks.workunit.client.1.vm08.stdout:5/313: rename c11 to d16/c63 0
2026-03-09T19:27:05.222 INFO:tasks.workunit.client.1.vm08.stdout:6/336: fdatasync d3/d34/d6f/d4b/f5e 0
2026-03-09T19:27:05.225 INFO:tasks.workunit.client.1.vm08.stdout:0/348: unlink dd/f12 0
2026-03-09T19:27:05.225 INFO:tasks.workunit.client.1.vm08.stdout:0/349: write dd/d22/f3e [1106398,59200] 0
2026-03-09T19:27:05.227 INFO:tasks.workunit.client.1.vm08.stdout:1/451: mknod d9/da/d2c/c88 0
2026-03-09T19:27:05.228 INFO:tasks.workunit.client.1.vm08.stdout:2/312: rename d3/d4/f48 to d3/d4/d23/d2c/d39/d5e/d14/f73 0
2026-03-09T19:27:05.229 INFO:tasks.workunit.client.1.vm08.stdout:2/313: write d3/d9/d26/f69 [349572,121048] 0
2026-03-09T19:27:05.230 INFO:tasks.workunit.client.1.vm08.stdout:2/314: truncate d3/d4/d23/d2c/d39/d5e/d14/f58 609251 0
2026-03-09T19:27:05.230 INFO:tasks.workunit.client.0.vm07.stdout:2/40: dread f0 [0,4194304] 0
2026-03-09T19:27:05.232 INFO:tasks.workunit.client.0.vm07.stdout:8/42: dwrite f5 [0,4194304] 0
2026-03-09T19:27:05.232 INFO:tasks.workunit.client.0.vm07.stdout:7/29: stat d0/d4/l9 0
2026-03-09T19:27:05.233 INFO:tasks.workunit.client.1.vm08.stdout:6/337: mkdir d3/d68/d7e 0
2026-03-09T19:27:05.233 INFO:tasks.workunit.client.0.vm07.stdout:8/43: write f2 [824965,21188] 0
2026-03-09T19:27:05.235 INFO:tasks.workunit.client.0.vm07.stdout:0/36: creat d0/fa x:0 0 0
2026-03-09T19:27:05.236 INFO:tasks.workunit.client.0.vm07.stdout:4/32: unlink f2 0
2026-03-09T19:27:05.236 INFO:tasks.workunit.client.0.vm07.stdout:0/37: dread - d0/fa zero size
2026-03-09T19:27:05.237 INFO:tasks.workunit.client.0.vm07.stdout:6/36: rename d0/f3 to d0/d1/fa 0
2026-03-09T19:27:05.238 INFO:tasks.workunit.client.0.vm07.stdout:0/38: rename d0/d6 to d0/d6/db 22
2026-03-09T19:27:05.238 INFO:tasks.workunit.client.0.vm07.stdout:7/30: dread d0/f1 [0,4194304] 0
2026-03-09T19:27:05.238 INFO:tasks.workunit.client.0.vm07.stdout:5/40: creat d3/dd/ff x:0 0 0
2026-03-09T19:27:05.239 INFO:tasks.workunit.client.0.vm07.stdout:5/41: chown d3/c8 43650 1
2026-03-09T19:27:05.239 INFO:tasks.workunit.client.0.vm07.stdout:7/31: stat d0/d4 0
2026-03-09T19:27:05.239 INFO:tasks.workunit.client.0.vm07.stdout:5/42: dread - d3/dd/ff zero size
2026-03-09T19:27:05.239 INFO:tasks.workunit.client.1.vm08.stdout:1/452: mkdir d9/d11/d7a/d89 0
2026-03-09T19:27:05.240 INFO:tasks.workunit.client.1.vm08.stdout:0/350: dread dd/d22/d27/d2e/d37/f46 [0,4194304] 0
2026-03-09T19:27:05.240 INFO:tasks.workunit.client.1.vm08.stdout:7/379: rename d5/d12 to d5/d16/d3a/d42/d85 0
2026-03-09T19:27:05.240 INFO:tasks.workunit.client.1.vm08.stdout:5/314: rename d16 to d16/d45/d64 22
2026-03-09T19:27:05.242 INFO:tasks.workunit.client.0.vm07.stdout:3/43: unlink d1/fc 0
2026-03-09T19:27:05.242 INFO:tasks.workunit.client.1.vm08.stdout:2/315: mknod d3/d4/d23/d2c/c74 0
2026-03-09T19:27:05.253 INFO:tasks.workunit.client.0.vm07.stdout:3/44: write f0 [4256953,36643] 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.0.vm07.stdout:2/41: mknod d3/dd/c10 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.0.vm07.stdout:3/45: read f0 [5255296,21437] 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.0.vm07.stdout:3/46: write d1/f7 [227361,40075] 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.0.vm07.stdout:5/43: dwrite d3/d9/fa [0,4194304] 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.0.vm07.stdout:8/44: creat d7/f15 x:0 0 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.1.vm08.stdout:2/316: chown d3/d4/d23/d2c/d39/d5e/f68 14307 1
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.1.vm08.stdout:1/453: creat d9/da/d2c/f8a x:0 0 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.1.vm08.stdout:1/454: truncate d9/d11/f87 142217 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.1.vm08.stdout:0/351: creat dd/f6d x:0 0 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.1.vm08.stdout:6/338: rename d3/d15/f1c to d3/d34/d5c/f7f 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.1.vm08.stdout:7/380: stat d5/d14/d2b/d5d/l53 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.1.vm08.stdout:2/317: mknod d3/d4/d23/d2c/d41/c75 0
2026-03-09T19:27:05.254 INFO:tasks.workunit.client.1.vm08.stdout:0/352: mkdir dd/d22/d63/d6e 0
2026-03-09T19:27:05.256 INFO:tasks.workunit.client.0.vm07.stdout:2/42: dwrite d3/fa [0,4194304] 0
2026-03-09T19:27:05.257 INFO:tasks.workunit.client.1.vm08.stdout:5/315: rename d16/f20 to d16/d45/f65 0
2026-03-09T19:27:05.257 INFO:tasks.workunit.client.0.vm07.stdout:3/47: dread d1/f7 [0,4194304] 0
2026-03-09T19:27:05.258 INFO:tasks.workunit.client.1.vm08.stdout:6/339: mknod d3/d55/c80 0
2026-03-09T19:27:05.261 INFO:tasks.workunit.client.1.vm08.stdout:2/318: creat d3/d4/d23/d5c/f76 x:0 0 0
2026-03-09T19:27:05.266 INFO:tasks.workunit.client.1.vm08.stdout:1/455: rename d9/da/dc/c2b to d9/da/d53/d67/d6c/c8b 0
2026-03-09T19:27:05.267 INFO:tasks.workunit.client.1.vm08.stdout:5/316: symlink d16/d45/l66 0
2026-03-09T19:27:05.268 INFO:tasks.workunit.client.0.vm07.stdout:7/32: mkdir d0/d4/d5/dd 0
2026-03-09T19:27:05.270 INFO:tasks.workunit.client.1.vm08.stdout:6/340: mknod d3/db/d43/c81 0
2026-03-09T19:27:05.271 INFO:tasks.workunit.client.1.vm08.stdout:5/317: dwrite d16/d45/f5d [0,4194304] 0
2026-03-09T19:27:05.274 INFO:tasks.workunit.client.1.vm08.stdout:7/381: unlink d5/d14/d27/d54/l67 0
2026-03-09T19:27:05.276 INFO:tasks.workunit.client.1.vm08.stdout:2/319: symlink d3/d4/d23/d2c/d39/d5e/de/l77 0
2026-03-09T19:27:05.278 INFO:tasks.workunit.client.0.vm07.stdout:8/45: mkdir d7/d16 0
2026-03-09T19:27:05.283 INFO:tasks.workunit.client.1.vm08.stdout:1/456: symlink d9/da/d12/d39/l8c 0
2026-03-09T19:27:05.283 INFO:tasks.workunit.client.0.vm07.stdout:6/37: dwrite d0/f2 [8388608,4194304] 0
2026-03-09T19:27:05.285 INFO:tasks.workunit.client.1.vm08.stdout:7/382: mkdir d5/d14/d27/d54/d86 0
2026-03-09T19:27:05.286 INFO:tasks.workunit.client.0.vm07.stdout:2/43: dwrite f2 [0,4194304] 0
2026-03-09T19:27:05.286 INFO:tasks.workunit.client.0.vm07.stdout:5/44: dwrite d3/d9/fc [0,4194304] 0
2026-03-09T19:27:05.286 INFO:tasks.workunit.client.1.vm08.stdout:7/383: write d5/d16/d3a/d42/d6a/f61 [1225798,107238] 0
2026-03-09T19:27:05.286 INFO:tasks.workunit.client.0.vm07.stdout:6/38: truncate d0/d1/f6 450962 0
2026-03-09T19:27:05.292 INFO:tasks.workunit.client.0.vm07.stdout:6/39: read d0/f4 [10671851,61843] 0
2026-03-09T19:27:05.294 INFO:tasks.workunit.client.0.vm07.stdout:6/40: dread d0/d1/fa [0,4194304] 0
2026-03-09T19:27:05.295 INFO:tasks.workunit.client.1.vm08.stdout:2/320: rename d3/f19 to d3/d4/d23/d2c/d39/d5e/d14/f78 0
2026-03-09T19:27:05.298 INFO:tasks.workunit.client.1.vm08.stdout:6/341: symlink d3/l82 0
2026-03-09T19:27:05.307 INFO:tasks.workunit.client.1.vm08.stdout:7/384: creat d5/d16/d1c/f87 x:0 0 0
2026-03-09T19:27:05.307 INFO:tasks.workunit.client.1.vm08.stdout:7/385: dread - d5/d16/d3a/d42/d6a/f7d zero size
2026-03-09T19:27:05.315 INFO:tasks.workunit.client.1.vm08.stdout:5/318: dread d16/d1e/f5b [0,4194304] 0
2026-03-09T19:27:05.316 INFO:tasks.workunit.client.1.vm08.stdout:5/319: read - d16/d45/f55 zero size
2026-03-09T19:27:05.319 INFO:tasks.workunit.client.1.vm08.stdout:1/457: mkdir d9/d11/d7a/d89/d8d 0
2026-03-09T19:27:05.321 INFO:tasks.workunit.client.1.vm08.stdout:6/342: symlink d3/d34/d5c/l83 0
2026-03-09T19:27:05.321 INFO:tasks.workunit.client.1.vm08.stdout:6/343: fdatasync d3/d34/d3b/f67 0
2026-03-09T19:27:05.322 INFO:tasks.workunit.client.1.vm08.stdout:6/344: fsync d3/d34/d5c/f7c 0
2026-03-09T19:27:05.323 INFO:tasks.workunit.client.1.vm08.stdout:7/386: symlink d5/d14/d2b/d5d/l88 0
2026-03-09T19:27:05.326 INFO:tasks.workunit.client.1.vm08.stdout:5/320: unlink d16/d1e/d30/f33 0
2026-03-09T19:27:05.327 INFO:tasks.workunit.client.1.vm08.stdout:6/345: mknod d3/d34/d6f/d4b/c84 0
2026-03-09T19:27:05.334 INFO:tasks.workunit.client.1.vm08.stdout:6/346: mkdir d3/d34/d3b/d85 0
2026-03-09T19:27:05.335 INFO:tasks.workunit.client.1.vm08.stdout:5/321: mknod d16/c67 0
2026-03-09T19:27:05.335 INFO:tasks.workunit.client.1.vm08.stdout:5/322: read d16/d1e/f35 [1512511,94374] 0
2026-03-09T19:27:05.340 INFO:tasks.workunit.client.1.vm08.stdout:6/347: creat d3/d34/d6f/d4b/f86 x:0 0 0
2026-03-09T19:27:05.343 INFO:tasks.workunit.client.1.vm08.stdout:1/458: getdents d9/da/d12 0
2026-03-09T19:27:05.348 INFO:tasks.workunit.client.1.vm08.stdout:1/459: dread - d9/da/d2d/f50 zero size
2026-03-09T19:27:05.348 INFO:tasks.workunit.client.1.vm08.stdout:1/460: chown d9/l59 18786 1
2026-03-09T19:27:05.350 INFO:tasks.workunit.client.1.vm08.stdout:6/348: link d3/db/l27 d3/d55/l87 0
2026-03-09T19:27:05.356 INFO:tasks.workunit.client.0.vm07.stdout:3/48: mkdir d1/d6/dd 0
2026-03-09T19:27:05.356 INFO:tasks.workunit.client.0.vm07.stdout:7/33: mknod d0/d4/ce 0
2026-03-09T19:27:05.356 INFO:tasks.workunit.client.0.vm07.stdout:7/34: fdatasync d0/d4/fc 0
2026-03-09T19:27:05.357 INFO:tasks.workunit.client.0.vm07.stdout:8/46: mkdir d7/d17 0
2026-03-09T19:27:05.358 INFO:tasks.workunit.client.0.vm07.stdout:5/45: rename d3/f5 to d3/d9/f10 0
2026-03-09T19:27:05.363 INFO:tasks.workunit.client.0.vm07.stdout:8/47: rename d7 to d7/d16/d18 22
2026-03-09T19:27:05.363 INFO:tasks.workunit.client.0.vm07.stdout:6/41: mkdir d0/d1/db 0
2026-03-09T19:27:05.363 INFO:tasks.workunit.client.0.vm07.stdout:7/35: symlink d0/d4/lf 0
2026-03-09T19:27:05.363 INFO:tasks.workunit.client.0.vm07.stdout:6/42: rename d0 to d0/dc 22
2026-03-09T19:27:05.363 INFO:tasks.workunit.client.0.vm07.stdout:8/48: unlink d7/f8 0
2026-03-09T19:27:05.363 INFO:tasks.workunit.client.0.vm07.stdout:3/49: link d1/f7 d1/fe 0
2026-03-09T19:27:05.363 INFO:tasks.workunit.client.0.vm07.stdout:6/43: rename d0/f2 to d0/fd 0
2026-03-09T19:27:05.364 INFO:tasks.workunit.client.0.vm07.stdout:3/50: readlink d1/l3 0
2026-03-09T19:27:05.364 INFO:tasks.workunit.client.0.vm07.stdout:8/49: write d7/f15 [833032,125726] 0
2026-03-09T19:27:05.364 INFO:tasks.workunit.client.0.vm07.stdout:6/44: creat d0/fe x:0 0 0
2026-03-09T19:27:05.367 INFO:tasks.workunit.client.0.vm07.stdout:5/46: dwrite d3/dd/ff [0,4194304] 0
2026-03-09T19:27:05.367 INFO:tasks.workunit.client.0.vm07.stdout:8/50: write f1 [581867,98314] 0
2026-03-09T19:27:05.380 INFO:tasks.workunit.client.0.vm07.stdout:7/36: dwrite d0/d4/fc [0,4194304] 0
2026-03-09T19:27:05.380 INFO:tasks.workunit.client.0.vm07.stdout:8/51: dread f4 [0,4194304] 0
2026-03-09T19:27:05.380 INFO:tasks.workunit.client.0.vm07.stdout:3/51: symlink d1/d6/dd/lf 0
2026-03-09T19:27:05.380 INFO:tasks.workunit.client.0.vm07.stdout:6/45: creat d0/ff x:0 0 0
2026-03-09T19:27:05.380 INFO:tasks.workunit.client.0.vm07.stdout:6/46: chown d0/d1/db 367747 1
2026-03-09T19:27:05.381 INFO:tasks.workunit.client.0.vm07.stdout:5/47: symlink d3/d9/l11 0
2026-03-09T19:27:05.381 INFO:tasks.workunit.client.0.vm07.stdout:3/52: fsync d1/d6/f9 0
2026-03-09T19:27:05.382 INFO:tasks.workunit.client.0.vm07.stdout:6/47: creat d0/d1/f10 x:0 0 0
2026-03-09T19:27:05.388 INFO:tasks.workunit.client.0.vm07.stdout:5/48: unlink d3/dd/ff 0
2026-03-09T19:27:05.388 INFO:tasks.workunit.client.0.vm07.stdout:3/53: mkdir d1/d10 0
2026-03-09T19:27:05.388 INFO:tasks.workunit.client.0.vm07.stdout:6/48: symlink d0/d1/l11 0
2026-03-09T19:27:05.388 INFO:tasks.workunit.client.0.vm07.stdout:8/52: rmdir d7/d17 0
2026-03-09T19:27:05.394 INFO:tasks.workunit.client.0.vm07.stdout:8/53: read d7/f15 [860300,71298] 0
2026-03-09T19:27:05.394 INFO:tasks.workunit.client.0.vm07.stdout:5/49: creat d3/d9/f12 x:0 0 0
2026-03-09T19:27:05.396 INFO:tasks.workunit.client.0.vm07.stdout:3/54: dwrite d1/d6/fa [0,4194304] 0
2026-03-09T19:27:05.396 INFO:tasks.workunit.client.0.vm07.stdout:5/50: write d3/d9/fc [3289783,81355] 0
2026-03-09T19:27:05.398 INFO:tasks.workunit.client.0.vm07.stdout:6/49: dwrite d0/ff [0,4194304] 0
2026-03-09T19:27:05.398 INFO:tasks.workunit.client.0.vm07.stdout:6/50: read - d0/d1/f10 zero size
2026-03-09T19:27:05.398 INFO:tasks.workunit.client.0.vm07.stdout:6/51: read d0/d1/fa [12547865,24854] 0
2026-03-09T19:27:05.402 INFO:tasks.workunit.client.0.vm07.stdout:8/54: unlink d7/d9/f11 0
2026-03-09T19:27:05.403 INFO:tasks.workunit.client.0.vm07.stdout:3/55: creat d1/d6/dd/f11 x:0 0 0
2026-03-09T19:27:05.405 INFO:tasks.workunit.client.0.vm07.stdout:8/55: creat d7/f19 x:0 0 0
2026-03-09T19:27:05.405 INFO:tasks.workunit.client.0.vm07.stdout:6/52: mknod d0/d1/db/c12 0
2026-03-09T19:27:05.405 INFO:tasks.workunit.client.0.vm07.stdout:8/56: write d7/f19 [121315,90717] 0
2026-03-09T19:27:05.406 INFO:tasks.workunit.client.0.vm07.stdout:3/56: creat d1/d6/f12 x:0 0 0
2026-03-09T19:27:05.408 INFO:tasks.workunit.client.0.vm07.stdout:3/57: rename d1/d6/f12 to d1/d10/f13 0
2026-03-09T19:27:05.424 INFO:tasks.workunit.client.0.vm07.stdout:4/33: fsync d3/f5 0
2026-03-09T19:27:05.425 INFO:tasks.workunit.client.0.vm07.stdout:4/34: truncate d3/f5 4786734 0
2026-03-09T19:27:05.426 INFO:tasks.workunit.client.0.vm07.stdout:4/35: write d3/f5 [3588530,52420] 0
2026-03-09T19:27:05.426 INFO:tasks.workunit.client.0.vm07.stdout:4/36: write f1 [676947,36636] 0
2026-03-09T19:27:05.435 INFO:tasks.workunit.client.0.vm07.stdout:4/37: link d3/l4 d3/l6 0
2026-03-09T19:27:05.480 INFO:tasks.workunit.client.0.vm07.stdout:4/38: sync
2026-03-09T19:27:05.480 INFO:tasks.workunit.client.0.vm07.stdout:4/39: write f1 [1768999,118348] 0
2026-03-09T19:27:05.481 INFO:tasks.workunit.client.0.vm07.stdout:4/40: rename f1 to d3/f7 0
2026-03-09T19:27:05.484 INFO:tasks.workunit.client.0.vm07.stdout:4/41: dread d3/f7 [0,4194304] 0
2026-03-09T19:27:05.486 INFO:tasks.workunit.client.0.vm07.stdout:4/42: write d3/f7 [2244535,120213] 0
2026-03-09T19:27:05.519 INFO:tasks.workunit.client.0.vm07.stdout:8/57: dread f2 [0,4194304] 0
2026-03-09T19:27:05.520 INFO:tasks.workunit.client.0.vm07.stdout:8/58: fdatasync f4 0
2026-03-09T19:27:05.537 INFO:tasks.workunit.client.0.vm07.stdout:8/59: dread d7/f19 [0,4194304] 0
2026-03-09T19:27:05.538 INFO:tasks.workunit.client.0.vm07.stdout:8/60: symlink d7/d9/l1a 0
2026-03-09T19:27:05.541 INFO:tasks.workunit.client.0.vm07.stdout:8/61: link d7/f15 d7/d9/d10/f1b 0
2026-03-09T19:27:05.543 INFO:tasks.workunit.client.0.vm07.stdout:8/62: creat d7/f1c x:0 0 0
2026-03-09T19:27:05.555 INFO:tasks.workunit.client.1.vm08.stdout:8/358: write de/d1d/f27 [2145251,68750] 0
2026-03-09T19:27:05.557 INFO:tasks.workunit.client.1.vm08.stdout:8/359: rmdir de/d25/d31/d82/d6d 39
2026-03-09T19:27:05.559 INFO:tasks.workunit.client.1.vm08.stdout:8/360: read de/f20 [2305313,35069] 0
2026-03-09T19:27:05.559 INFO:tasks.workunit.client.1.vm08.stdout:8/361: fsync de/f20 0
2026-03-09T19:27:05.560 INFO:tasks.workunit.client.1.vm08.stdout:8/362: write de/d1d/d4f/f5e [854261,92677] 0
2026-03-09T19:27:05.563 INFO:tasks.workunit.client.1.vm08.stdout:8/363: creat de/d1d/d69/f84 x:0 0 0
2026-03-09T19:27:05.564 INFO:tasks.workunit.client.1.vm08.stdout:8/364: mkdir de/d47/d85 0 2026-03-09T19:27:05.567 INFO:tasks.workunit.client.1.vm08.stdout:8/365: link de/d25/d33/f55 de/d1d/d21/f86 0 2026-03-09T19:27:05.586 INFO:tasks.workunit.client.0.vm07.stdout:2/44: fdatasync d3/fa 0 2026-03-09T19:27:05.592 INFO:tasks.workunit.client.1.vm08.stdout:3/371: truncate d0/d8/f5b 225528 0 2026-03-09T19:27:05.593 INFO:tasks.workunit.client.1.vm08.stdout:3/372: fdatasync d0/d8/f4c 0 2026-03-09T19:27:05.594 INFO:tasks.workunit.client.1.vm08.stdout:3/373: chown d0/d6/de/d1b/c1e 19045945 1 2026-03-09T19:27:05.594 INFO:tasks.workunit.client.0.vm07.stdout:0/39: read d0/d7/f8 [203678,99921] 0 2026-03-09T19:27:05.595 INFO:tasks.workunit.client.1.vm08.stdout:3/374: dread - d0/d52/f6a zero size 2026-03-09T19:27:05.595 INFO:tasks.workunit.client.0.vm07.stdout:9/37: chown d0/f4 26 1 2026-03-09T19:27:05.597 INFO:tasks.workunit.client.0.vm07.stdout:0/40: write d0/d7/f8 [2818053,66113] 0 2026-03-09T19:27:05.598 INFO:tasks.workunit.client.0.vm07.stdout:9/38: chown d0/f3 70 1 2026-03-09T19:27:05.603 INFO:tasks.workunit.client.1.vm08.stdout:3/375: dwrite d0/d6/de/d1b/f6c [0,4194304] 0 2026-03-09T19:27:05.610 INFO:tasks.workunit.client.0.vm07.stdout:0/41: mknod d0/d7/cc 0 2026-03-09T19:27:05.611 INFO:tasks.workunit.client.0.vm07.stdout:9/39: creat d0/d6/fa x:0 0 0 2026-03-09T19:27:05.611 INFO:tasks.workunit.client.0.vm07.stdout:9/40: truncate d0/f3 216353 0 2026-03-09T19:27:05.612 INFO:tasks.workunit.client.1.vm08.stdout:9/319: truncate d0/d2/d14/f56 3825990 0 2026-03-09T19:27:05.613 INFO:tasks.workunit.client.0.vm07.stdout:9/41: dread d0/f4 [0,4194304] 0 2026-03-09T19:27:05.618 INFO:tasks.workunit.client.0.vm07.stdout:9/42: mkdir d0/db 0 2026-03-09T19:27:05.618 INFO:tasks.workunit.client.0.vm07.stdout:9/43: chown d0/f4 3 1 2026-03-09T19:27:05.619 INFO:tasks.workunit.client.0.vm07.stdout:9/44: chown d0 543262 1 2026-03-09T19:27:05.619 INFO:tasks.workunit.client.0.vm07.stdout:9/45: 
fdatasync d0/d6/fa 0 2026-03-09T19:27:05.619 INFO:tasks.workunit.client.1.vm08.stdout:4/329: truncate f2 3278708 0 2026-03-09T19:27:05.620 INFO:tasks.workunit.client.0.vm07.stdout:9/46: dread - d0/d6/fa zero size 2026-03-09T19:27:05.628 INFO:tasks.workunit.client.0.vm07.stdout:0/42: mknod d0/d6/cd 0 2026-03-09T19:27:05.630 INFO:tasks.workunit.client.1.vm08.stdout:9/320: dread - d0/d2/f45 zero size 2026-03-09T19:27:05.632 INFO:tasks.workunit.client.1.vm08.stdout:9/321: read d0/f44 [4554842,48377] 0 2026-03-09T19:27:05.645 INFO:tasks.workunit.client.0.vm07.stdout:9/47: rename d0/l7 to d0/d6/lc 0 2026-03-09T19:27:05.646 INFO:tasks.workunit.client.0.vm07.stdout:9/48: write d0/d6/fa [368302,57740] 0 2026-03-09T19:27:05.652 INFO:tasks.workunit.client.0.vm07.stdout:9/49: dwrite d0/f3 [0,4194304] 0 2026-03-09T19:27:05.679 INFO:tasks.workunit.client.0.vm07.stdout:0/43: mknod d0/d7/ce 0 2026-03-09T19:27:05.685 INFO:tasks.workunit.client.0.vm07.stdout:0/44: dread - d0/fa zero size 2026-03-09T19:27:05.685 INFO:tasks.workunit.client.0.vm07.stdout:9/50: truncate d0/f4 143457 0 2026-03-09T19:27:05.685 INFO:tasks.workunit.client.0.vm07.stdout:9/51: fdatasync d0/d6/f8 0 2026-03-09T19:27:05.691 INFO:tasks.workunit.client.0.vm07.stdout:0/45: mknod d0/cf 0 2026-03-09T19:27:05.691 INFO:tasks.workunit.client.0.vm07.stdout:0/46: read - d0/fa zero size 2026-03-09T19:27:05.693 INFO:tasks.workunit.client.1.vm08.stdout:4/330: getdents da/d10/d26/d27/d32 0 2026-03-09T19:27:05.696 INFO:tasks.workunit.client.0.vm07.stdout:0/47: sync 2026-03-09T19:27:05.696 INFO:tasks.workunit.client.0.vm07.stdout:0/48: chown d0 0 1 2026-03-09T19:27:05.697 INFO:tasks.workunit.client.0.vm07.stdout:0/49: read d0/d7/f8 [119561,54914] 0 2026-03-09T19:27:05.697 INFO:tasks.workunit.client.0.vm07.stdout:9/52: symlink d0/d6/ld 0 2026-03-09T19:27:05.708 INFO:tasks.workunit.client.1.vm08.stdout:4/331: mkdir da/d10/d16/d28/d2f/d4f/d64 0 2026-03-09T19:27:05.709 INFO:tasks.workunit.client.1.vm08.stdout:4/332: write 
da/d10/f3d [1829216,68514] 0 2026-03-09T19:27:05.710 INFO:tasks.workunit.client.1.vm08.stdout:4/333: write da/d10/f13 [1688153,99521] 0 2026-03-09T19:27:05.723 INFO:tasks.workunit.client.0.vm07.stdout:9/53: symlink d0/d6/le 0 2026-03-09T19:27:05.725 INFO:tasks.workunit.client.0.vm07.stdout:0/50: mknod d0/d6/c10 0 2026-03-09T19:27:05.725 INFO:tasks.workunit.client.0.vm07.stdout:0/51: chown d0 160106 1 2026-03-09T19:27:05.728 INFO:tasks.workunit.client.0.vm07.stdout:9/54: dwrite d0/d6/fa [0,4194304] 0 2026-03-09T19:27:05.729 INFO:tasks.workunit.client.0.vm07.stdout:9/55: dread - d0/d6/f8 zero size 2026-03-09T19:27:05.732 INFO:tasks.workunit.client.0.vm07.stdout:0/52: mkdir d0/d7/d11 0 2026-03-09T19:27:05.732 INFO:tasks.workunit.client.0.vm07.stdout:0/53: readlink - no filename 2026-03-09T19:27:05.735 INFO:tasks.workunit.client.1.vm08.stdout:2/321: rename d3/d4/d23/d2c/d41 to d3/d9/d79 0 2026-03-09T19:27:05.736 INFO:tasks.workunit.client.1.vm08.stdout:2/322: chown d3/d9/f5d 2475 1 2026-03-09T19:27:05.743 INFO:tasks.workunit.client.1.vm08.stdout:4/334: getdents da/d10/d1b 0 2026-03-09T19:27:05.743 INFO:tasks.workunit.client.1.vm08.stdout:4/335: dread - da/d10/f53 zero size 2026-03-09T19:27:05.746 INFO:tasks.workunit.client.1.vm08.stdout:7/387: rename d5/d14/d27/d78/f7f to d5/d16/d3a/d42/f89 0 2026-03-09T19:27:05.750 INFO:tasks.workunit.client.1.vm08.stdout:7/388: dwrite d5/d14/d2b/d5d/f63 [0,4194304] 0 2026-03-09T19:27:05.752 INFO:tasks.workunit.client.0.vm07.stdout:0/54: fsync d0/d7/f9 0 2026-03-09T19:27:05.753 INFO:tasks.workunit.client.0.vm07.stdout:0/55: write d0/d7/f8 [2521638,89008] 0 2026-03-09T19:27:05.754 INFO:tasks.workunit.client.1.vm08.stdout:7/389: write d5/d14/d2b/d4b/f7e [273016,3800] 0 2026-03-09T19:27:05.757 INFO:tasks.workunit.client.1.vm08.stdout:4/336: rmdir da/d10/d16/d28/d46 39 2026-03-09T19:27:05.763 INFO:tasks.workunit.client.0.vm07.stdout:0/56: rename d0/d7/f8 to d0/f12 0 2026-03-09T19:27:05.763 INFO:tasks.workunit.client.0.vm07.stdout:0/57: 
dread - d0/fa zero size 2026-03-09T19:27:05.763 INFO:tasks.workunit.client.0.vm07.stdout:3/58: truncate f0 5758122 0 2026-03-09T19:27:05.763 INFO:tasks.workunit.client.0.vm07.stdout:3/59: stat d1/l3 0 2026-03-09T19:27:05.763 INFO:tasks.workunit.client.0.vm07.stdout:3/60: chown d1/l3 2 1 2026-03-09T19:27:05.763 INFO:tasks.workunit.client.1.vm08.stdout:0/353: dwrite fb [0,4194304] 0 2026-03-09T19:27:05.768 INFO:tasks.workunit.client.1.vm08.stdout:0/354: dwrite dd/fe [0,4194304] 0 2026-03-09T19:27:05.780 INFO:tasks.workunit.client.0.vm07.stdout:3/61: mknod d1/c14 0 2026-03-09T19:27:05.781 INFO:tasks.workunit.client.1.vm08.stdout:7/390: unlink d5/d16/d3a/d42/d85/f34 0 2026-03-09T19:27:05.786 INFO:tasks.workunit.client.0.vm07.stdout:0/58: getdents d0/d7/d11 0 2026-03-09T19:27:05.787 INFO:tasks.workunit.client.0.vm07.stdout:3/62: creat d1/d6/dd/f15 x:0 0 0 2026-03-09T19:27:05.790 INFO:tasks.workunit.client.0.vm07.stdout:3/63: mkdir d1/d10/d16 0 2026-03-09T19:27:05.790 INFO:tasks.workunit.client.1.vm08.stdout:0/355: mkdir dd/d22/d27/d4f/d6f 0 2026-03-09T19:27:05.793 INFO:tasks.workunit.client.1.vm08.stdout:7/391: creat d5/d14/d27/f8a x:0 0 0 2026-03-09T19:27:05.794 INFO:tasks.workunit.client.0.vm07.stdout:3/64: unlink d1/d6/dd/lf 0 2026-03-09T19:27:05.799 INFO:tasks.workunit.client.1.vm08.stdout:7/392: symlink d5/d16/d3a/d42/d85/l8b 0 2026-03-09T19:27:05.800 INFO:tasks.workunit.client.1.vm08.stdout:7/393: truncate d5/d16/d3a/d42/f65 208276 0 2026-03-09T19:27:05.800 INFO:tasks.workunit.client.1.vm08.stdout:7/394: chown d5/d16/l1b 5441 1 2026-03-09T19:27:05.802 INFO:tasks.workunit.client.1.vm08.stdout:7/395: symlink d5/d14/d27/d54/l8c 0 2026-03-09T19:27:05.804 INFO:tasks.workunit.client.1.vm08.stdout:0/356: getdents dd/d22/d24 0 2026-03-09T19:27:05.805 INFO:tasks.workunit.client.1.vm08.stdout:7/396: creat d5/d14/d38/f8d x:0 0 0 2026-03-09T19:27:05.809 INFO:tasks.workunit.client.1.vm08.stdout:0/357: dwrite dd/f15 [0,4194304] 0 2026-03-09T19:27:05.811 
INFO:tasks.workunit.client.1.vm08.stdout:0/358: readlink dd/d22/d27/d2e/d37/l38 0 2026-03-09T19:27:05.829 INFO:tasks.workunit.client.1.vm08.stdout:0/359: symlink dd/d22/d24/d49/l70 0 2026-03-09T19:27:05.830 INFO:tasks.workunit.client.1.vm08.stdout:7/397: dwrite d5/d16/d3a/d42/d85/f19 [4194304,4194304] 0 2026-03-09T19:27:05.835 INFO:tasks.workunit.client.1.vm08.stdout:5/323: truncate d16/d1e/f2c 3169641 0 2026-03-09T19:27:05.836 INFO:tasks.workunit.client.1.vm08.stdout:1/461: write d9/da/f30 [2732857,97393] 0 2026-03-09T19:27:05.840 INFO:tasks.workunit.client.1.vm08.stdout:5/324: dwrite d16/d45/f4a [0,4194304] 0 2026-03-09T19:27:05.842 INFO:tasks.workunit.client.1.vm08.stdout:6/349: rmdir d3/d34/d6f/d4b 39 2026-03-09T19:27:05.843 INFO:tasks.workunit.client.1.vm08.stdout:0/360: creat dd/d22/d24/f71 x:0 0 0 2026-03-09T19:27:05.846 INFO:tasks.workunit.client.1.vm08.stdout:6/350: fdatasync d3/d34/d6f/d4b/f5e 0 2026-03-09T19:27:05.850 INFO:tasks.workunit.client.1.vm08.stdout:1/462: creat d9/da/f8e x:0 0 0 2026-03-09T19:27:05.851 INFO:tasks.workunit.client.1.vm08.stdout:0/361: mkdir dd/d22/d63/d6e/d72 0 2026-03-09T19:27:05.853 INFO:tasks.workunit.client.1.vm08.stdout:7/398: fsync d5/d14/d2b/d5d/f63 0 2026-03-09T19:27:05.858 INFO:tasks.workunit.client.1.vm08.stdout:6/351: mknod d3/db/d43/d69/c88 0 2026-03-09T19:27:05.862 INFO:tasks.workunit.client.1.vm08.stdout:6/352: dwrite d3/d15/f40 [0,4194304] 0 2026-03-09T19:27:05.863 INFO:tasks.workunit.client.1.vm08.stdout:1/463: symlink d9/da/d2c/l8f 0 2026-03-09T19:27:05.879 INFO:tasks.workunit.client.1.vm08.stdout:1/464: getdents d9/da/d17/d60 0 2026-03-09T19:27:05.894 INFO:tasks.workunit.client.0.vm07.stdout:3/65: fsync d1/f7 0 2026-03-09T19:27:05.894 INFO:tasks.workunit.client.0.vm07.stdout:6/53: fsync d0/d1/f10 0 2026-03-09T19:27:05.894 INFO:tasks.workunit.client.0.vm07.stdout:3/66: truncate d1/d10/f13 1023961 0 2026-03-09T19:27:05.894 INFO:tasks.workunit.client.0.vm07.stdout:3/67: chown f0 4 1 2026-03-09T19:27:05.894 
INFO:tasks.workunit.client.0.vm07.stdout:3/68: write d1/d6/dd/f11 [827789,91600] 0 2026-03-09T19:27:05.898 INFO:tasks.workunit.client.0.vm07.stdout:3/69: dwrite d1/d6/dd/f11 [0,4194304] 0 2026-03-09T19:27:05.900 INFO:tasks.workunit.client.0.vm07.stdout:3/70: dread - d1/d6/dd/f15 zero size 2026-03-09T19:27:05.900 INFO:tasks.workunit.client.0.vm07.stdout:3/71: dread - d1/d6/dd/f15 zero size 2026-03-09T19:27:05.912 INFO:tasks.workunit.client.1.vm08.stdout:2/323: sync 2026-03-09T19:27:05.917 INFO:tasks.workunit.client.0.vm07.stdout:7/37: truncate d0/d4/fc 3693917 0 2026-03-09T19:27:05.931 INFO:tasks.workunit.client.1.vm08.stdout:2/324: fsync d3/d4/d23/d5c/f70 0 2026-03-09T19:27:05.939 INFO:tasks.workunit.client.0.vm07.stdout:7/38: getdents d0/d4/d5/dd 0 2026-03-09T19:27:05.940 INFO:tasks.workunit.client.0.vm07.stdout:3/72: sync 2026-03-09T19:27:05.940 INFO:tasks.workunit.client.0.vm07.stdout:3/73: fdatasync d1/d6/dd/f15 0 2026-03-09T19:27:05.945 INFO:tasks.workunit.client.0.vm07.stdout:3/74: dwrite d1/f7 [0,4194304] 0 2026-03-09T19:27:05.945 INFO:tasks.workunit.client.0.vm07.stdout:7/39: creat d0/d4/d5/d8/f10 x:0 0 0 2026-03-09T19:27:05.945 INFO:tasks.workunit.client.0.vm07.stdout:7/40: dread - d0/d4/d5/d8/f10 zero size 2026-03-09T19:27:05.946 INFO:tasks.workunit.client.0.vm07.stdout:7/41: write d0/d4/d5/d8/fa [454787,33238] 0 2026-03-09T19:27:05.947 INFO:tasks.workunit.client.0.vm07.stdout:7/42: chown d0/d4 20293811 1 2026-03-09T19:27:05.948 INFO:tasks.workunit.client.0.vm07.stdout:6/54: rmdir d0 39 2026-03-09T19:27:05.948 INFO:tasks.workunit.client.0.vm07.stdout:5/51: truncate d3/fe 3577570 0 2026-03-09T19:27:05.949 INFO:tasks.workunit.client.0.vm07.stdout:3/75: rename d1/c14 to d1/d10/c17 0 2026-03-09T19:27:05.950 INFO:tasks.workunit.client.0.vm07.stdout:3/76: readlink d1/l3 0 2026-03-09T19:27:05.952 INFO:tasks.workunit.client.0.vm07.stdout:6/55: chown d0/d1/l11 458944 1 2026-03-09T19:27:05.959 INFO:tasks.workunit.client.0.vm07.stdout:3/77: symlink d1/d6/dd/l18 0 
2026-03-09T19:27:05.959 INFO:tasks.workunit.client.0.vm07.stdout:3/78: fsync d1/d6/fa 0 2026-03-09T19:27:05.959 INFO:tasks.workunit.client.0.vm07.stdout:6/56: mkdir d0/d13 0 2026-03-09T19:27:05.975 INFO:tasks.workunit.client.1.vm08.stdout:1/465: dread d9/da/d12/f72 [0,4194304] 0 2026-03-09T19:27:05.984 INFO:tasks.workunit.client.0.vm07.stdout:4/43: truncate d3/f7 2690288 0 2026-03-09T19:27:05.984 INFO:tasks.workunit.client.0.vm07.stdout:4/44: fdatasync d3/f5 0 2026-03-09T19:27:05.985 INFO:tasks.workunit.client.0.vm07.stdout:6/57: creat d0/d1/db/f14 x:0 0 0 2026-03-09T19:27:05.986 INFO:tasks.workunit.client.0.vm07.stdout:3/79: dwrite d1/d10/f13 [0,4194304] 0 2026-03-09T19:27:05.991 INFO:tasks.workunit.client.0.vm07.stdout:3/80: readlink d1/l3 0 2026-03-09T19:27:06.005 INFO:tasks.workunit.client.0.vm07.stdout:6/58: rename d0/d1/f10 to d0/d1/db/f15 0 2026-03-09T19:27:06.010 INFO:tasks.workunit.client.0.vm07.stdout:8/63: rmdir d7 39 2026-03-09T19:27:06.011 INFO:tasks.workunit.client.0.vm07.stdout:5/52: getdents d3/dd 0 2026-03-09T19:27:06.012 INFO:tasks.workunit.client.0.vm07.stdout:5/53: write d3/d9/f12 [469980,17813] 0 2026-03-09T19:27:06.013 INFO:tasks.workunit.client.0.vm07.stdout:5/54: dread d3/d9/fb [0,4194304] 0 2026-03-09T19:27:06.020 INFO:tasks.workunit.client.0.vm07.stdout:6/59: mknod d0/c16 0 2026-03-09T19:27:06.020 INFO:tasks.workunit.client.0.vm07.stdout:6/60: readlink d0/d1/l11 0 2026-03-09T19:27:06.020 INFO:tasks.workunit.client.1.vm08.stdout:8/366: write de/d1d/d4f/f51 [273418,75208] 0 2026-03-09T19:27:06.021 INFO:tasks.workunit.client.0.vm07.stdout:8/64: chown d7 656033813 1 2026-03-09T19:27:06.021 INFO:tasks.workunit.client.1.vm08.stdout:8/367: chown de/d1d/d21/d73 5 1 2026-03-09T19:27:06.023 INFO:tasks.workunit.client.0.vm07.stdout:5/55: symlink d3/dd/l13 0 2026-03-09T19:27:06.025 INFO:tasks.workunit.client.0.vm07.stdout:5/56: dread d3/d9/fa [0,4194304] 0 2026-03-09T19:27:06.028 INFO:tasks.workunit.client.0.vm07.stdout:3/81: link d1/d6/f9 d1/d6/f19 0 
2026-03-09T19:27:06.034 INFO:tasks.workunit.client.0.vm07.stdout:2/45: write d3/f7 [5218571,51536] 0
2026-03-09T19:27:06.041 INFO:tasks.workunit.client.0.vm07.stdout:8/65: mkdir d7/d1d 0
2026-03-09T19:27:06.046 INFO:tasks.workunit.client.0.vm07.stdout:1/41: truncate d1/d3/f4 1511215 0
2026-03-09T19:27:06.046 INFO:tasks.workunit.client.0.vm07.stdout:1/42: readlink d1/l7 0
2026-03-09T19:27:06.054 INFO:tasks.workunit.client.0.vm07.stdout:2/46: mkdir d3/d11 0
2026-03-09T19:27:06.064 INFO:tasks.workunit.client.0.vm07.stdout:8/66: mkdir d7/d16/d1e 0
2026-03-09T19:27:06.066 INFO:tasks.workunit.client.0.vm07.stdout:1/43: mknod d1/d3/c13 0
2026-03-09T19:27:06.069 INFO:tasks.workunit.client.1.vm08.stdout:3/376: rename d0/d6/de/d1a/d63 to d0/d52/d6d 0
2026-03-09T19:27:06.070 INFO:tasks.workunit.client.0.vm07.stdout:2/47: mknod d3/d11/c12 0
2026-03-09T19:27:06.071 INFO:tasks.workunit.client.0.vm07.stdout:8/67: mknod d7/d16/c1f 0
2026-03-09T19:27:06.072 INFO:tasks.workunit.client.1.vm08.stdout:5/325: rename d16/d45/f4a to d16/d1e/d3b/f68 0
2026-03-09T19:27:06.074 INFO:tasks.workunit.client.0.vm07.stdout:5/57: getdents d3/d9 0
2026-03-09T19:27:06.074 INFO:tasks.workunit.client.0.vm07.stdout:5/58: read d3/d9/f10 [2379149,114417] 0
2026-03-09T19:27:06.081 INFO:tasks.workunit.client.0.vm07.stdout:2/48: creat d3/d11/f13 x:0 0 0
2026-03-09T19:27:06.087 INFO:tasks.workunit.client.0.vm07.stdout:5/59: creat d3/dd/f14 x:0 0 0
2026-03-09T19:27:06.089 INFO:tasks.workunit.client.1.vm08.stdout:3/377: rename d0/d6/de/d1b/d16/d18 to d0/d6/de/d6e 0
2026-03-09T19:27:06.096 INFO:tasks.workunit.client.1.vm08.stdout:5/326: link d16/c60 d16/d1e/d3b/d61/c69 0
2026-03-09T19:27:06.103 INFO:tasks.workunit.client.1.vm08.stdout:3/378: mknod d0/d8/d24/c6f 0
2026-03-09T19:27:06.105 INFO:tasks.workunit.client.0.vm07.stdout:2/49: mknod d3/c14 0
2026-03-09T19:27:06.108 INFO:tasks.workunit.client.1.vm08.stdout:5/327: rename f15 to d16/d45/f6a 0
2026-03-09T19:27:06.109 INFO:tasks.workunit.client.0.vm07.stdout:9/56: write d0/f3 [4592792,116835] 0
2026-03-09T19:27:06.119 INFO:tasks.workunit.client.0.vm07.stdout:2/50: creat d3/f15 x:0 0 0
2026-03-09T19:27:06.120 INFO:tasks.workunit.client.1.vm08.stdout:5/328: creat d16/d45/f6b x:0 0 0
2026-03-09T19:27:06.128 INFO:tasks.workunit.client.1.vm08.stdout:9/322: dwrite d0/d2/d8/d7/f22 [0,4194304] 0
2026-03-09T19:27:06.132 INFO:tasks.workunit.client.1.vm08.stdout:3/379: creat d0/d6/de/d6e/d51/f70 x:0 0 0
2026-03-09T19:27:06.133 INFO:tasks.workunit.client.1.vm08.stdout:3/380: read d0/d6/de/d6e/f2c [2685380,49859] 0
2026-03-09T19:27:06.134 INFO:tasks.workunit.client.1.vm08.stdout:5/329: dread d16/f34 [0,4194304] 0
2026-03-09T19:27:06.136 INFO:tasks.workunit.client.0.vm07.stdout:9/57: creat d0/d6/ff x:0 0 0
2026-03-09T19:27:06.143 INFO:tasks.workunit.client.0.vm07.stdout:9/58: rename d0/f2 to d0/d6/f10 0
2026-03-09T19:27:06.145 INFO:tasks.workunit.client.1.vm08.stdout:9/323: mkdir d0/d2/d8/d7/d48/d6f 0
2026-03-09T19:27:06.148 INFO:tasks.workunit.client.1.vm08.stdout:3/381: unlink d0/d6/de/d1b/f6c 0
2026-03-09T19:27:06.149 INFO:tasks.workunit.client.1.vm08.stdout:3/382: chown d0/d6/de/d1b/l49 6795854 1
2026-03-09T19:27:06.160 INFO:tasks.workunit.client.1.vm08.stdout:5/330: mknod d16/d45/c6c 0
2026-03-09T19:27:06.165 INFO:tasks.workunit.client.0.vm07.stdout:9/59: mknod d0/c11 0
2026-03-09T19:27:06.165 INFO:tasks.workunit.client.0.vm07.stdout:9/60: chown d0/d6 48 1
2026-03-09T19:27:06.166 INFO:tasks.workunit.client.0.vm07.stdout:0/59: truncate d0/d7/f9 2217191 0
2026-03-09T19:27:06.167 INFO:tasks.workunit.client.0.vm07.stdout:5/60: getdents d3 0
2026-03-09T19:27:06.170 INFO:tasks.workunit.client.1.vm08.stdout:3/383: creat d0/d6/de/d1b/d16/d17/f71 x:0 0 0
2026-03-09T19:27:06.174 INFO:tasks.workunit.client.1.vm08.stdout:3/384: dwrite d0/d8/f4a [0,4194304] 0
2026-03-09T19:27:06.175 INFO:tasks.workunit.client.0.vm07.stdout:5/61: dread d3/d9/f12 [0,4194304] 0
2026-03-09T19:27:06.180 INFO:tasks.workunit.client.0.vm07.stdout:9/61: mknod d0/c12 0
2026-03-09T19:27:06.188 INFO:tasks.workunit.client.0.vm07.stdout:9/62: dwrite d0/d6/fa [0,4194304] 0
2026-03-09T19:27:06.189 INFO:tasks.workunit.client.0.vm07.stdout:0/60: mkdir d0/d6/d13 0
2026-03-09T19:27:06.190 INFO:tasks.workunit.client.0.vm07.stdout:0/61: write d0/fa [307985,67191] 0
2026-03-09T19:27:06.217 INFO:tasks.workunit.client.0.vm07.stdout:5/62: symlink d3/d9/l15 0
2026-03-09T19:27:06.225 INFO:tasks.workunit.client.1.vm08.stdout:4/337: dwrite da/f1d [4194304,4194304] 0
2026-03-09T19:27:06.227 INFO:tasks.workunit.client.0.vm07.stdout:9/63: creat d0/d6/f13 x:0 0 0
2026-03-09T19:27:06.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:05 vm07.local ceph-mon[48545]: pgmap v157: 65 pgs: 65 active+clean; 726 MiB data, 3.5 GiB used, 116 GiB / 120 GiB avail; 8.6 MiB/s rd, 81 MiB/s wr, 265 op/s
2026-03-09T19:27:06.228 INFO:tasks.workunit.client.1.vm08.stdout:4/338: truncate da/d10/d26/d38/f43 1821966 0
2026-03-09T19:27:06.232 INFO:tasks.workunit.client.1.vm08.stdout:5/331: chown d16/d1e/d3b/d61/c69 551 1
2026-03-09T19:27:06.233 INFO:tasks.workunit.client.0.vm07.stdout:9/64: sync
2026-03-09T19:27:06.243 INFO:tasks.workunit.client.1.vm08.stdout:1/466: getdents d9/da 0
2026-03-09T19:27:06.243 INFO:tasks.workunit.client.0.vm07.stdout:5/63: rmdir d3/d9 39
2026-03-09T19:27:06.245 INFO:tasks.workunit.client.1.vm08.stdout:3/385: rmdir d0/d6/de/d1b/d16/d17/d4e 0
2026-03-09T19:27:06.246 INFO:tasks.workunit.client.1.vm08.stdout:3/386: readlink d0/d4b/l3d 0
2026-03-09T19:27:06.248 INFO:tasks.workunit.client.1.vm08.stdout:7/399: dwrite d5/d14/d2b/f37 [0,4194304] 0
2026-03-09T19:27:06.249 INFO:tasks.workunit.client.0.vm07.stdout:9/65: chown d0/d6/lc 10008591 1
2026-03-09T19:27:06.256 INFO:tasks.workunit.client.0.vm07.stdout:7/43: write d0/d4/fc [1849142,51849] 0
2026-03-09T19:27:06.259 INFO:tasks.workunit.client.1.vm08.stdout:2/325: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/f2d [0,4194304] 0
2026-03-09T19:27:06.264 INFO:tasks.workunit.client.0.vm07.stdout:5/64: fsync d3/d9/fc 0
2026-03-09T19:27:06.265 INFO:tasks.workunit.client.1.vm08.stdout:3/387: symlink d0/d8/d19/l72 0
2026-03-09T19:27:06.265 INFO:tasks.workunit.client.1.vm08.stdout:3/388: write d0/d6/f39 [4504610,23304] 0
2026-03-09T19:27:06.268 INFO:tasks.workunit.client.0.vm07.stdout:9/66: rename d0/d6/lc to d0/l14 0
2026-03-09T19:27:06.269 INFO:tasks.workunit.client.1.vm08.stdout:7/400: symlink d5/d16/d3a/d42/d85/l8e 0
2026-03-09T19:27:06.271 INFO:tasks.workunit.client.0.vm07.stdout:7/44: unlink d0/d4/l9 0
2026-03-09T19:27:06.273 INFO:tasks.workunit.client.0.vm07.stdout:4/45: truncate d3/f5 1756784 0
2026-03-09T19:27:06.274 INFO:tasks.workunit.client.1.vm08.stdout:0/362: rename dd/d22/d24/c5e to dd/d22/d27/d2e/d37/c73 0
2026-03-09T19:27:06.275 INFO:tasks.workunit.client.1.vm08.stdout:0/363: write dd/d22/f28 [3609145,83339] 0
2026-03-09T19:27:06.276 INFO:tasks.workunit.client.0.vm07.stdout:6/61: fsync d0/d1/db/f15 0
2026-03-09T19:27:06.281 INFO:tasks.workunit.client.0.vm07.stdout:3/82: getdents d1/d6 0
2026-03-09T19:27:06.282 INFO:tasks.workunit.client.0.vm07.stdout:3/83: stat d1/d6/dd 0
2026-03-09T19:27:06.282 INFO:tasks.workunit.client.0.vm07.stdout:3/84: stat f0 0
2026-03-09T19:27:06.282 INFO:tasks.workunit.client.0.vm07.stdout:3/85: read d1/f8 [240090,36178] 0
2026-03-09T19:27:06.293 INFO:tasks.workunit.client.1.vm08.stdout:7/401: mkdir d5/d16/d3a/d42/d6a/d8f 0
2026-03-09T19:27:06.293 INFO:tasks.workunit.client.1.vm08.stdout:7/402: truncate d5/fb 3662643 0
2026-03-09T19:27:06.305 INFO:tasks.workunit.client.0.vm07.stdout:8/68: write d7/d9/d10/f1b [150142,47847] 0
2026-03-09T19:27:06.306 INFO:tasks.workunit.client.0.vm07.stdout:8/69: dread d7/f19 [0,4194304] 0
2026-03-09T19:27:06.307 INFO:tasks.workunit.client.0.vm07.stdout:8/70: stat d7/d9/cc 0
2026-03-09T19:27:06.308 INFO:tasks.workunit.client.1.vm08.stdout:7/403: mkdir d5/d16/d1c/d83/d90 0
2026-03-09T19:27:06.308 INFO:tasks.workunit.client.1.vm08.stdout:7/404: fsync d5/d14/d27/f8a 0
2026-03-09T19:27:06.308 INFO:tasks.workunit.client.1.vm08.stdout:7/405: stat d5/d14/d27/f35 0
2026-03-09T19:27:06.309 INFO:tasks.workunit.client.1.vm08.stdout:7/406: dread - d5/d16/d1c/f87 zero size
2026-03-09T19:27:06.309 INFO:tasks.workunit.client.0.vm07.stdout:1/44: truncate d1/f2 1054010 0
2026-03-09T19:27:06.309 INFO:tasks.workunit.client.1.vm08.stdout:7/407: chown d5/d16/d3a/d42/d85 89554851 1
2026-03-09T19:27:06.312 INFO:tasks.workunit.client.1.vm08.stdout:7/408: dwrite d5/d14/d2b/f30 [0,4194304] 0
2026-03-09T19:27:06.328 INFO:tasks.workunit.client.1.vm08.stdout:7/409: link d5/d16/d3a/d42/d85/c22 d5/d16/d3a/d42/d6a/d8f/c91 0
2026-03-09T19:27:06.334 INFO:tasks.workunit.client.1.vm08.stdout:7/410: symlink d5/d16/d3a/d42/l92 0
2026-03-09T19:27:06.334 INFO:tasks.workunit.client.0.vm07.stdout:2/51: truncate d3/ff 1769500 0
2026-03-09T19:27:06.334 INFO:tasks.workunit.client.0.vm07.stdout:2/52: read - d3/d11/f13 zero size
2026-03-09T19:27:06.334 INFO:tasks.workunit.client.0.vm07.stdout:2/53: dwrite d3/f15 [0,4194304] 0
2026-03-09T19:27:06.335 INFO:tasks.workunit.client.0.vm07.stdout:0/62: read d0/d7/f9 [1474214,67797] 0
2026-03-09T19:27:06.344 INFO:tasks.workunit.client.1.vm08.stdout:9/324: write d0/f13 [2222601,21843] 0
2026-03-09T19:27:06.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:05 vm08.local ceph-mon[57794]: pgmap v157: 65 pgs: 65 active+clean; 726 MiB data, 3.5 GiB used, 116 GiB / 120 GiB avail; 8.6 MiB/s rd, 81 MiB/s wr, 265 op/s
2026-03-09T19:27:06.345 INFO:tasks.workunit.client.1.vm08.stdout:9/325: read d0/d2/f2f [200876,62144] 0
2026-03-09T19:27:06.351 INFO:tasks.workunit.client.1.vm08.stdout:9/326: write d0/d2/d8/d7/f63 [1446722,60485] 0
2026-03-09T19:27:06.351 INFO:tasks.workunit.client.1.vm08.stdout:9/327: chown d0/d2/d8 486 1
2026-03-09T19:27:06.354 INFO:tasks.workunit.client.1.vm08.stdout:9/328: symlink d0/d2/d14/d5c/l70 0
2026-03-09T19:27:06.358 INFO:tasks.workunit.client.1.vm08.stdout:9/329: dwrite d0/d2/d8/d7/f35 [4194304,4194304] 0
2026-03-09T19:27:06.379 INFO:tasks.workunit.client.1.vm08.stdout:6/353: rmdir d3/d34/d6f 39
2026-03-09T19:27:06.379 INFO:tasks.workunit.client.1.vm08.stdout:6/354: chown d3/db/l6d 32760 1
2026-03-09T19:27:06.381 INFO:tasks.workunit.client.1.vm08.stdout:2/326: write d3/d4/d23/d2c/d39/d5e/de/f17 [1755216,115357] 0
2026-03-09T19:27:06.384 INFO:tasks.workunit.client.1.vm08.stdout:6/355: dread - d3/d34/d6f/d4b/f72 zero size
2026-03-09T19:27:06.385 INFO:tasks.workunit.client.1.vm08.stdout:6/356: write d3/d15/f7b [397761,18783] 0
2026-03-09T19:27:06.385 INFO:tasks.workunit.client.1.vm08.stdout:6/357: chown d3/d34/d6f/l2e 3956 1
2026-03-09T19:27:06.388 INFO:tasks.workunit.client.1.vm08.stdout:8/368: mkdir de/d25/d87 0
2026-03-09T19:27:06.388 INFO:tasks.workunit.client.1.vm08.stdout:8/369: truncate f1 729159 0
2026-03-09T19:27:06.389 INFO:tasks.workunit.client.1.vm08.stdout:8/370: chown de/d25/d31/d82/l70 519583 1
2026-03-09T19:27:06.389 INFO:tasks.workunit.client.1.vm08.stdout:2/327: rmdir d3/d4/d23 39
2026-03-09T19:27:06.392 INFO:tasks.workunit.client.1.vm08.stdout:0/364: mknod dd/d22/d24/d49/d50/c74 0
2026-03-09T19:27:06.393 INFO:tasks.workunit.client.1.vm08.stdout:3/389: dwrite d0/d6/de/d1b/d16/d17/f1d [0,4194304] 0
2026-03-09T19:27:06.397 INFO:tasks.workunit.client.1.vm08.stdout:3/390: dread d0/d6/d25/f56 [0,4194304] 0
2026-03-09T19:27:06.406 INFO:tasks.workunit.client.1.vm08.stdout:4/339: sync
2026-03-09T19:27:06.406 INFO:tasks.workunit.client.1.vm08.stdout:5/332: sync
2026-03-09T19:27:06.406 INFO:tasks.workunit.client.1.vm08.stdout:1/467: sync
2026-03-09T19:27:06.406 INFO:tasks.workunit.client.1.vm08.stdout:0/365: dread dd/f1e [0,4194304] 0
2026-03-09T19:27:06.407 INFO:tasks.workunit.client.1.vm08.stdout:0/366: fdatasync dd/d22/d27/f42 0
2026-03-09T19:27:06.407 INFO:tasks.workunit.client.1.vm08.stdout:1/468: truncate d9/da/d12/f5c 5198606 0
2026-03-09T19:27:06.409 INFO:tasks.workunit.client.1.vm08.stdout:0/367: chown dd/d22/d24/d49/f5f 4426 1
2026-03-09T19:27:06.410 INFO:tasks.workunit.client.1.vm08.stdout:3/391: dread d0/d8/f66 [0,4194304] 0
2026-03-09T19:27:06.410 INFO:tasks.workunit.client.1.vm08.stdout:3/392: chown d0/d6 88 1
2026-03-09T19:27:06.415 INFO:tasks.workunit.client.1.vm08.stdout:3/393: write d0/d6/de/d1b/d16/d17/f3f [4680551,114583] 0
2026-03-09T19:27:06.417 INFO:tasks.workunit.client.1.vm08.stdout:1/469: dread d9/da/dc/f31 [4194304,4194304] 0
2026-03-09T19:27:06.417 INFO:tasks.workunit.client.1.vm08.stdout:3/394: chown d0/d8/f4a 137 1
2026-03-09T19:27:06.417 INFO:tasks.workunit.client.1.vm08.stdout:8/371: chown de/c12 49 1
2026-03-09T19:27:06.420 INFO:tasks.workunit.client.1.vm08.stdout:1/470: read - d9/da/d53/d67/f77 zero size
2026-03-09T19:27:06.421 INFO:tasks.workunit.client.1.vm08.stdout:1/471: chown d9/da/d53/d67/d6c/l6d 886 1
2026-03-09T19:27:06.423 INFO:tasks.workunit.client.0.vm07.stdout:7/45: rename d0/d4/lf to d0/d4/d5/l11 0
2026-03-09T19:27:06.424 INFO:tasks.workunit.client.1.vm08.stdout:6/358: sync
2026-03-09T19:27:06.424 INFO:tasks.workunit.client.1.vm08.stdout:7/411: sync
2026-03-09T19:27:06.424 INFO:tasks.workunit.client.0.vm07.stdout:4/46: creat d3/f8 x:0 0 0
2026-03-09T19:27:06.425 INFO:tasks.workunit.client.1.vm08.stdout:6/359: stat d3/d34/d3b 0
2026-03-09T19:27:06.426 INFO:tasks.workunit.client.0.vm07.stdout:7/46: dread d0/d4/fc [0,4194304] 0
2026-03-09T19:27:06.427 INFO:tasks.workunit.client.0.vm07.stdout:7/47: chown d0/d4/d5/d8/fa 524199 1
2026-03-09T19:27:06.428 INFO:tasks.workunit.client.0.vm07.stdout:7/48: fdatasync d0/d4/fc 0
2026-03-09T19:27:06.431 INFO:tasks.workunit.client.1.vm08.stdout:6/360: dread d3/d15/f40 [0,4194304] 0
2026-03-09T19:27:06.433 INFO:tasks.workunit.client.0.vm07.stdout:6/62: mkdir d0/d1/db/d17 0
2026-03-09T19:27:06.434 INFO:tasks.workunit.client.1.vm08.stdout:4/340: creat da/d10/d16/d28/d2f/d4f/f65 x:0 0 0
2026-03-09T19:27:06.445 INFO:tasks.workunit.client.0.vm07.stdout:8/71: creat d7/d9/d10/f20 x:0 0 0
2026-03-09T19:27:06.446 INFO:tasks.workunit.client.0.vm07.stdout:8/72: write d7/ff [4402471,43616] 0
2026-03-09T19:27:06.446 INFO:tasks.workunit.client.0.vm07.stdout:8/73: stat d7/d9/l1a 0
2026-03-09T19:27:06.447 INFO:tasks.workunit.client.0.vm07.stdout:8/74: fsync d7/d9/fd 0
2026-03-09T19:27:06.450 INFO:tasks.workunit.client.1.vm08.stdout:3/395: symlink d0/d4b/l73 0
2026-03-09T19:27:06.451 INFO:tasks.workunit.client.0.vm07.stdout:1/45: dwrite d1/d3/f12 [0,4194304] 0
2026-03-09T19:27:06.452 INFO:tasks.workunit.client.1.vm08.stdout:8/372: rmdir de/d25/d33 39
2026-03-09T19:27:06.463 INFO:tasks.workunit.client.1.vm08.stdout:2/328: creat d3/d4/d23/d2c/d39/d5e/de/f7a x:0 0 0
2026-03-09T19:27:06.468 INFO:tasks.workunit.client.0.vm07.stdout:2/54: mkdir d3/dd/d16 0
2026-03-09T19:27:06.473 INFO:tasks.workunit.client.1.vm08.stdout:1/472: dwrite d9/da/d12/f5c [0,4194304] 0
2026-03-09T19:27:06.474 INFO:tasks.workunit.client.0.vm07.stdout:2/55: chown d3/dd/d16 63520953 1
2026-03-09T19:27:06.474 INFO:tasks.workunit.client.0.vm07.stdout:0/63: creat d0/d6/f14 x:0 0 0
2026-03-09T19:27:06.474 INFO:tasks.workunit.client.0.vm07.stdout:2/56: write d3/dd/fe [168572,98784] 0
2026-03-09T19:27:06.474 INFO:tasks.workunit.client.0.vm07.stdout:2/57: dread - d3/f5 zero size
2026-03-09T19:27:06.478 INFO:tasks.workunit.client.1.vm08.stdout:1/473: dread d9/d40/d49/f70 [0,4194304] 0
2026-03-09T19:27:06.478 INFO:tasks.workunit.client.1.vm08.stdout:1/474: stat l3 0
2026-03-09T19:27:06.490 INFO:tasks.workunit.client.1.vm08.stdout:7/412: stat d5/d16/d1c/c36 0
2026-03-09T19:27:06.491 INFO:tasks.workunit.client.1.vm08.stdout:7/413: truncate d5/d16/d3a/d42/f89 56916 0
2026-03-09T19:27:06.491 INFO:tasks.workunit.client.1.vm08.stdout:7/414: chown d5/f1a 52268221 1
2026-03-09T19:27:06.492 INFO:tasks.workunit.client.1.vm08.stdout:7/415: chown d5/d16/f49 20 1
2026-03-09T19:27:06.492 INFO:tasks.workunit.client.1.vm08.stdout:7/416: chown d5/d16/d3a/d42/f68 1 1
2026-03-09T19:27:06.496 INFO:tasks.workunit.client.1.vm08.stdout:5/333: mknod d16/c6d 0
2026-03-09T19:27:06.501 INFO:tasks.workunit.client.1.vm08.stdout:4/341: symlink da/d14/d2c/l66 0
2026-03-09T19:27:06.509 INFO:tasks.workunit.client.0.vm07.stdout:7/49: write d0/f1 [273531,5592] 0
2026-03-09T19:27:06.509 INFO:tasks.workunit.client.1.vm08.stdout:3/396: creat d0/d4b/f74 x:0 0 0
2026-03-09T19:27:06.509 INFO:tasks.workunit.client.0.vm07.stdout:7/50: stat d0/d4/d5/dd 0
2026-03-09T19:27:06.509 INFO:tasks.workunit.client.1.vm08.stdout:3/397: chown d0/d6/f39 123 1
2026-03-09T19:27:06.509 INFO:tasks.workunit.client.0.vm07.stdout:7/51: chown d0/d4/ce 646251662 1
2026-03-09T19:27:06.513 INFO:tasks.workunit.client.0.vm07.stdout:3/86: link d1/d10/f13 d1/d10/f1a 0
2026-03-09T19:27:06.513 INFO:tasks.workunit.client.1.vm08.stdout:8/373: truncate de/d1d/d21/f4b 375052 0
2026-03-09T19:27:06.517 INFO:tasks.workunit.client.1.vm08.stdout:2/329: creat d3/d4/d3e/f7b x:0 0 0
2026-03-09T19:27:06.522 INFO:tasks.workunit.client.1.vm08.stdout:1/475: symlink d9/da/dc/l90 0
2026-03-09T19:27:06.522 INFO:tasks.workunit.client.1.vm08.stdout:1/476: dread - d9/da/d53/d67/f79 zero size
2026-03-09T19:27:06.526 INFO:tasks.workunit.client.1.vm08.stdout:8/374: dread de/d1d/d2e/d5f/f4e [0,4194304] 0
2026-03-09T19:27:06.528 INFO:tasks.workunit.client.1.vm08.stdout:8/375: truncate de/d1d/d21/f86 1605565 0
2026-03-09T19:27:06.531 INFO:tasks.workunit.client.1.vm08.stdout:1/477: dwrite d9/da/d2c/d6a/f6b [0,4194304] 0
2026-03-09T19:27:06.533 INFO:tasks.workunit.client.1.vm08.stdout:1/478: readlink l8 0
2026-03-09T19:27:06.536 INFO:tasks.workunit.client.1.vm08.stdout:7/417: rename d5/d16/d3a/l82 to d5/d16/d1c/d83/d90/l93 0
2026-03-09T19:27:06.542 INFO:tasks.workunit.client.1.vm08.stdout:1/479: dwrite d9/d40/d49/f7c [0,4194304] 0
2026-03-09T19:27:06.544 INFO:tasks.workunit.client.0.vm07.stdout:1/46: rename d1/f10 to d1/db/f14 0
2026-03-09T19:27:06.544 INFO:tasks.workunit.client.1.vm08.stdout:1/480: readlink d9/da/dc/l83 0
2026-03-09T19:27:06.544 INFO:tasks.workunit.client.1.vm08.stdout:1/481: stat d9/da/d2d 0
2026-03-09T19:27:06.545 INFO:tasks.workunit.client.0.vm07.stdout:1/47: write d1/f6 [298650,15458] 0
2026-03-09T19:27:06.545 INFO:tasks.workunit.client.1.vm08.stdout:1/482: chown d9/da/d53/d67/l84 52673744 1
2026-03-09T19:27:06.545 INFO:tasks.workunit.client.0.vm07.stdout:1/48: chown d1/d11 46927 1
2026-03-09T19:27:06.546 INFO:tasks.workunit.client.1.vm08.stdout:9/330: truncate d0/fa 4076579 0
2026-03-09T19:27:06.547 INFO:tasks.workunit.client.1.vm08.stdout:9/331: chown d0/d2/d8/cc 3 1
2026-03-09T19:27:06.548 INFO:tasks.workunit.client.0.vm07.stdout:2/58: creat d3/d11/f17 x:0 0 0
2026-03-09T19:27:06.550 INFO:tasks.workunit.client.1.vm08.stdout:6/361: getdents d3/d68/d7e 0
2026-03-09T19:27:06.550 INFO:tasks.workunit.client.0.vm07.stdout:5/65: write d3/d9/fa [4932004,100500] 0
2026-03-09T19:27:06.561 INFO:tasks.workunit.client.0.vm07.stdout:7/52: creat d0/d4/f12 x:0 0 0
2026-03-09T19:27:06.562 INFO:tasks.workunit.client.1.vm08.stdout:4/342: mknod da/d10/d26/d27/d32/c67 0
2026-03-09T19:27:06.562 INFO:tasks.workunit.client.0.vm07.stdout:7/53: write d0/f1 [5913484,46585] 0
2026-03-09T19:27:06.562 INFO:tasks.workunit.client.0.vm07.stdout:7/54: dread - d0/d4/f12 zero size
2026-03-09T19:27:06.563 INFO:tasks.workunit.client.1.vm08.stdout:4/343: write da/d14/f5a [712061,39429] 0
2026-03-09T19:27:06.564 INFO:tasks.workunit.client.1.vm08.stdout:4/344: chown da/d10/d26/d50/c63 173 1
2026-03-09T19:27:06.564 INFO:tasks.workunit.client.1.vm08.stdout:1/483: dread d9/d11/f5f [0,4194304] 0
2026-03-09T19:27:06.564 INFO:tasks.workunit.client.1.vm08.stdout:4/345: chown da/d10/d26 0 1
2026-03-09T19:27:06.573 INFO:tasks.workunit.client.0.vm07.stdout:6/63: link d0/ff d0/d13/f18 0
2026-03-09T19:27:06.574 INFO:tasks.workunit.client.1.vm08.stdout:3/398: creat d0/d6/de/d1b/d16/d17/f75 x:0 0 0
2026-03-09T19:27:06.576 INFO:tasks.workunit.client.1.vm08.stdout:3/399: truncate d0/d6/de/d1b/d16/d17/f75 381901 0
2026-03-09T19:27:06.580 INFO:tasks.workunit.client.0.vm07.stdout:4/47: rmdir d3 39
2026-03-09T19:27:06.582 INFO:tasks.workunit.client.1.vm08.stdout:4/346: dread da/d10/f3d [0,4194304] 0
2026-03-09T19:27:06.587 INFO:tasks.workunit.client.1.vm08.stdout:3/400: dwrite d0/d52/f6a [0,4194304] 0
2026-03-09T19:27:06.588 INFO:tasks.workunit.client.1.vm08.stdout:3/401: fsync d0/d6/de/d1b/d16/d17/f1d 0
2026-03-09T19:27:06.591 INFO:tasks.workunit.client.0.vm07.stdout:8/75: dwrite f3 [0,4194304] 0
2026-03-09T19:27:06.596 INFO:tasks.workunit.client.0.vm07.stdout:0/64: rename d0/d6/f14 to d0/d7/d11/f15 0
2026-03-09T19:27:06.596 INFO:tasks.workunit.client.0.vm07.stdout:8/76: chown d7/d9/d10/c13 28 1
2026-03-09T19:27:06.609 INFO:tasks.workunit.client.1.vm08.stdout:4/347: dread f5 [0,4194304] 0
2026-03-09T19:27:06.609 INFO:tasks.workunit.client.1.vm08.stdout:4/348: fdatasync da/d10/d26/d27/f35 0
2026-03-09T19:27:06.611 INFO:tasks.workunit.client.0.vm07.stdout:5/66: creat d3/dd/f16 x:0 0 0
2026-03-09T19:27:06.612 INFO:tasks.workunit.client.0.vm07.stdout:5/67: write d3/dd/f14 [954392,73632] 0
2026-03-09T19:27:06.612 INFO:tasks.workunit.client.1.vm08.stdout:7/418: creat d5/d14/d2b/d5d/f94 x:0 0 0
2026-03-09T19:27:06.617 INFO:tasks.workunit.client.0.vm07.stdout:6/64: fdatasync d0/d13/f18 0
2026-03-09T19:27:06.618 INFO:tasks.workunit.client.1.vm08.stdout:5/334: mkdir d16/d1e/d6e 0
2026-03-09T19:27:06.620 INFO:tasks.workunit.client.1.vm08.stdout:5/335: dread d16/f4d [0,4194304] 0
2026-03-09T19:27:06.621 INFO:tasks.workunit.client.1.vm08.stdout:5/336: chown d16/d45 13 1
2026-03-09T19:27:06.623 INFO:tasks.workunit.client.0.vm07.stdout:9/67: truncate d0/d6/fa 1921673 0
2026-03-09T19:27:06.625 INFO:tasks.workunit.client.1.vm08.stdout:9/332: rename d0/d2/d8/d7/d48/c64 to d0/d2/d14/d5c/d32/c71 0
2026-03-09T19:27:06.629 INFO:tasks.workunit.client.1.vm08.stdout:6/362: dread d3/d34/d6f/f4f [0,4194304] 0
2026-03-09T19:27:06.630 INFO:tasks.workunit.client.1.vm08.stdout:0/368: getdents dd/d22/d27 0
2026-03-09T19:27:06.632 INFO:tasks.workunit.client.1.vm08.stdout:1/484: mkdir d9/da/d12/d91 0
2026-03-09T19:27:06.635 INFO:tasks.workunit.client.0.vm07.stdout:0/65: unlink d0/f12 0
2026-03-09T19:27:06.645 INFO:tasks.workunit.client.0.vm07.stdout:0/66: dread d0/fa [0,4194304] 0
2026-03-09T19:27:06.645 INFO:tasks.workunit.client.0.vm07.stdout:1/49: getdents d1/d11 0
2026-03-09T19:27:06.645 INFO:tasks.workunit.client.0.vm07.stdout:8/77: symlink d7/d16/l21 0
2026-03-09T19:27:06.645 INFO:tasks.workunit.client.0.vm07.stdout:8/78: truncate d7/f19 1225331 0
2026-03-09T19:27:06.645 INFO:tasks.workunit.client.0.vm07.stdout:5/68: fdatasync d3/d9/fb 0
2026-03-09T19:27:06.645 INFO:tasks.workunit.client.0.vm07.stdout:6/65: rename d0/f4 to d0/d1/f19 0
2026-03-09T19:27:06.646 INFO:tasks.workunit.client.0.vm07.stdout:9/68: mknod d0/d6/c15 0
2026-03-09T19:27:06.647 INFO:tasks.workunit.client.0.vm07.stdout:0/67: creat d0/d6/f16 x:0 0 0
2026-03-09T19:27:06.648 INFO:tasks.workunit.client.0.vm07.stdout:1/50: creat d1/d9/f15 x:0 0 0
2026-03-09T19:27:06.652 INFO:tasks.workunit.client.0.vm07.stdout:8/79: mkdir d7/d9/d10/d22 0
2026-03-09T19:27:06.652 INFO:tasks.workunit.client.1.vm08.stdout:7/419: sync
2026-03-09T19:27:06.656 INFO:tasks.workunit.client.0.vm07.stdout:7/55: creat d0/f13 x:0 0 0
2026-03-09T19:27:06.656 INFO:tasks.workunit.client.0.vm07.stdout:7/56: rename d0/d4/d5 to d0/d4/d5/d14 22
2026-03-09T19:27:06.657 INFO:tasks.workunit.client.0.vm07.stdout:7/57: fsync d0/d4/f12 0
2026-03-09T19:27:06.657 INFO:tasks.workunit.client.1.vm08.stdout:3/402: fdatasync d0/d6/d25/f56 0
2026-03-09T19:27:06.659 INFO:tasks.workunit.client.0.vm07.stdout:5/69: dread d3/dd/f14 [0,4194304] 0
2026-03-09T19:27:06.660 INFO:tasks.workunit.client.0.vm07.stdout:5/70: readlink d3/d9/l15 0
2026-03-09T19:27:06.666 INFO:tasks.workunit.client.1.vm08.stdout:5/337: mkdir d16/d1e/d30/d6f 0
2026-03-09T19:27:06.667 INFO:tasks.workunit.client.1.vm08.stdout:5/338: chown d16/d1e/d3b/f68 3 1
2026-03-09T19:27:06.667 INFO:tasks.workunit.client.0.vm07.stdout:1/51: chown d1/d3/cd 270 1
2026-03-09T19:27:06.673 INFO:tasks.workunit.client.1.vm08.stdout:9/333: read d0/f44 [4496631,64303] 0
2026-03-09T19:27:06.674 INFO:tasks.workunit.client.0.vm07.stdout:3/87: truncate d1/fe 3491682 0
2026-03-09T19:27:06.675 INFO:tasks.workunit.client.0.vm07.stdout:8/80: rename d7/d9/d10/c13 to d7/d9/d10/c23 0
2026-03-09T19:27:06.676 INFO:tasks.workunit.client.0.vm07.stdout:7/58: creat d0/d4/d5/d8/f15 x:0 0 0
2026-03-09T19:27:06.677 INFO:tasks.workunit.client.0.vm07.stdout:3/88: dread d1/d10/f1a [0,4194304] 0
2026-03-09T19:27:06.677 INFO:tasks.workunit.client.0.vm07.stdout:6/66: creat d0/d1/db/d17/f1a x:0 0 0
2026-03-09T19:27:06.677 INFO:tasks.workunit.client.1.vm08.stdout:6/363: creat d3/db/d43/d69/f89 x:0 0 0
2026-03-09T19:27:06.678 INFO:tasks.workunit.client.0.vm07.stdout:7/59: dread d0/f1 [4194304,4194304] 0
2026-03-09T19:27:06.680 INFO:tasks.workunit.client.0.vm07.stdout:7/60: dread d0/d4/d5/d8/fa [0,4194304] 0
2026-03-09T19:27:06.682 INFO:tasks.workunit.client.1.vm08.stdout:1/485: creat d9/d40/f92 x:0 0 0
2026-03-09T19:27:06.686 INFO:tasks.workunit.client.1.vm08.stdout:1/486: dwrite d9/da/d53/d67/f77 [0,4194304] 0
2026-03-09T19:27:06.696 INFO:tasks.workunit.client.0.vm07.stdout:0/68: mkdir d0/d6/d13/d17 0
2026-03-09T19:27:06.698 INFO:tasks.workunit.client.1.vm08.stdout:7/420: fsync d5/d16/d3a/f56 0
2026-03-09T19:27:06.699 INFO:tasks.workunit.client.0.vm07.stdout:1/52: readlink d1/d3/la 0
2026-03-09T19:27:06.702 INFO:tasks.workunit.client.1.vm08.stdout:3/403: rmdir d0/d6/de/d6e/d51 39
2026-03-09T19:27:06.704 INFO:tasks.workunit.client.0.vm07.stdout:2/59: stat d3/ff 0
2026-03-09T19:27:06.706 INFO:tasks.workunit.client.1.vm08.stdout:3/404: dwrite d0/d8/d24/f2d [4194304,4194304] 0
2026-03-09T19:27:06.707 INFO:tasks.workunit.client.1.vm08.stdout:3/405: readlink d0/d6/de/d1b/l49 0
2026-03-09T19:27:06.715 INFO:tasks.workunit.client.1.vm08.stdout:2/330: write d3/d4/d23/d2c/d39/d5e/d14/f44 [1619622,45218] 0
2026-03-09T19:27:06.718 INFO:tasks.workunit.client.1.vm08.stdout:2/331: readlink d3/d4/d23/d2c/d39/d5e/de/l77 0
2026-03-09T19:27:06.720 INFO:tasks.workunit.client.1.vm08.stdout:4/349: symlink da/d10/d16/d28/d46/l68 0
2026-03-09T19:27:06.720 INFO:tasks.workunit.client.1.vm08.stdout:4/350: dread - da/d10/f53 zero size
2026-03-09T19:27:06.721 INFO:tasks.workunit.client.1.vm08.stdout:4/351: dread - da/d10/d26/d38/f57 zero size
2026-03-09T19:27:06.722 INFO:tasks.workunit.client.1.vm08.stdout:4/352: chown da/d10/d26/d27/c59 1436337 1
2026-03-09T19:27:06.726 INFO:tasks.workunit.client.0.vm07.stdout:5/71: link d3/dd/f14 d3/d9/f17 0
2026-03-09T19:27:06.736 INFO:tasks.workunit.client.0.vm07.stdout:5/72: write d3/d9/fb [719885,86486] 0
2026-03-09T19:27:06.736 INFO:tasks.workunit.client.0.vm07.stdout:4/48: link d3/l4 d3/l9 0
2026-03-09T19:27:06.736 INFO:tasks.workunit.client.1.vm08.stdout:6/364: mkdir d3/d15/d8a 0
2026-03-09T19:27:06.736 INFO:tasks.workunit.client.1.vm08.stdout:0/369: mknod dd/d22/d27/d6c/c75 0
2026-03-09T19:27:06.736 INFO:tasks.workunit.client.1.vm08.stdout:1/487: mkdir d9/da/d53/d67/d6c/d93 0
2026-03-09T19:27:06.736 INFO:tasks.workunit.client.1.vm08.stdout:7/421: rename d5/d16/f55 to d5/d16/f95 0
2026-03-09T19:27:06.736 INFO:tasks.workunit.client.1.vm08.stdout:3/406: readlink d0/d6/de/d1b/d16/l61 0
2026-03-09T19:27:06.736 INFO:tasks.workunit.client.0.vm07.stdout:9/69: dwrite d0/d6/fa [0,4194304] 0
2026-03-09T19:27:06.737 INFO:tasks.workunit.client.0.vm07.stdout:9/70: stat d0/f3 0
2026-03-09T19:27:06.737 INFO:tasks.workunit.client.0.vm07.stdout:2/60: creat d3/d11/f18 x:0 0 0
2026-03-09T19:27:06.737 INFO:tasks.workunit.client.1.vm08.stdout:2/332: write d3/d4/d23/d2c/d39/d5e/de/d18/f50 [835705,14359] 0
2026-03-09T19:27:06.742 INFO:tasks.workunit.client.0.vm07.stdout:9/71: dwrite d0/d6/f8 [0,4194304] 0
2026-03-09T19:27:06.742 INFO:tasks.workunit.client.1.vm08.stdout:2/333: dwrite d3/d9/d26/f6a [0,4194304] 0
2026-03-09T19:27:06.742 INFO:tasks.workunit.client.0.vm07.stdout:9/72: write d0/f3 [3973741,38655] 0
2026-03-09T19:27:06.742 INFO:tasks.workunit.client.0.vm07.stdout:9/73: stat d0/l1 0
2026-03-09T19:27:06.759 INFO:tasks.workunit.client.1.vm08.stdout:4/353: mkdir da/d10/d26/d3a/d69 0
2026-03-09T19:27:06.761 INFO:tasks.workunit.client.0.vm07.stdout:8/81: symlink d7/d1d/l24 0
2026-03-09T19:27:06.761 INFO:tasks.workunit.client.0.vm07.stdout:7/61: fsync d0/d4/d5/d8/f15 0
2026-03-09T19:27:06.761 INFO:tasks.workunit.client.0.vm07.stdout:6/67: creat d0/d13/f1b x:0 0 0
2026-03-09T19:27:06.763 INFO:tasks.workunit.client.0.vm07.stdout:7/62: readlink d0/l7 0
2026-03-09T19:27:06.769 INFO:tasks.workunit.client.1.vm08.stdout:8/376: truncate de/d25/d33/f83 1385617 0
2026-03-09T19:27:06.769 INFO:tasks.workunit.client.1.vm08.stdout:8/377: write de/d1d/d69/f7e [768347,35492] 0
2026-03-09T19:27:06.770 INFO:tasks.workunit.client.0.vm07.stdout:6/68: dwrite d0/d1/db/f15 [0,4194304] 0
2026-03-09T19:27:06.771 INFO:tasks.workunit.client.1.vm08.stdout:6/365: mknod d3/db/d43/d69/c8b 0
2026-03-09T19:27:06.783 INFO:tasks.workunit.client.1.vm08.stdout:6/366: read d3/d34/d6f/f4f [583549,9842] 0
2026-03-09T19:27:06.783 INFO:tasks.workunit.client.1.vm08.stdout:9/334: dwrite d0/d2/d8/f2d [0,4194304] 0
2026-03-09T19:27:06.783 INFO:tasks.workunit.client.1.vm08.stdout:0/370: symlink dd/d31/l76 0
2026-03-09T19:27:06.783 INFO:tasks.workunit.client.1.vm08.stdout:1/488: symlink d9/da/d12/d39/l94 0
2026-03-09T19:27:06.783 INFO:tasks.workunit.client.1.vm08.stdout:7/422: rename d5/f9 to d5/d14/d2b/d4b/f96 0
2026-03-09T19:27:06.783 INFO:tasks.workunit.client.0.vm07.stdout:7/63: dwrite d0/d4/d5/d8/f10 [0,4194304] 0
2026-03-09T19:27:06.783 INFO:tasks.workunit.client.0.vm07.stdout:7/64: dread - d0/f13 zero size
2026-03-09T19:27:06.783 INFO:tasks.workunit.client.0.vm07.stdout:7/65: chown d0/d4/d5/d8/f10 2741785 1
2026-03-09T19:27:06.783 INFO:tasks.workunit.client.0.vm07.stdout:6/69: dread d0/d1/f8 [0,4194304] 0
2026-03-09T19:27:06.783 INFO:tasks.workunit.client.0.vm07.stdout:4/49: creat d3/fa x:0 0 0
2026-03-09T19:27:06.784 INFO:tasks.workunit.client.0.vm07.stdout:4/50: chown d3/fa 188 1
2026-03-09T19:27:06.784 INFO:tasks.workunit.client.1.vm08.stdout:7/423: dread - d5/d14/d38/f40 zero size
2026-03-09T19:27:06.786 INFO:tasks.workunit.client.0.vm07.stdout:1/53: unlink d1/f2 0
2026-03-09T19:27:06.786 INFO:tasks.workunit.client.0.vm07.stdout:2/61: creat d3/d11/f19 x:0 0 0
2026-03-09T19:27:06.787 INFO:tasks.workunit.client.1.vm08.stdout:3/407: mknod d0/d6/de/d15/c76 0
2026-03-09T19:27:06.790 INFO:tasks.workunit.client.0.vm07.stdout:8/82: sync
2026-03-09T19:27:06.795 INFO:tasks.workunit.client.0.vm07.stdout:8/83: dwrite f1 [0,4194304] 0
2026-03-09T19:27:06.796 INFO:tasks.workunit.client.1.vm08.stdout:4/354: symlink da/d10/d26/d38/l6a 0
2026-03-09T19:27:06.802 INFO:tasks.workunit.client.0.vm07.stdout:8/84: write d7/d9/fd [5060734,122021] 0
2026-03-09T19:27:06.802 INFO:tasks.workunit.client.0.vm07.stdout:9/74: chown d0/f4 45952 1
2026-03-09T19:27:06.803 INFO:tasks.workunit.client.1.vm08.stdout:6/367: mknod d3/d34/d5c/c8c 0
2026-03-09T19:27:06.803 INFO:tasks.workunit.client.0.vm07.stdout:9/75: chown d0/d6/le 6 1
2026-03-09T19:27:06.803 INFO:tasks.workunit.client.0.vm07.stdout:8/85: fsync d7/d9/d10/f1b 0
2026-03-09T19:27:06.803 INFO:tasks.workunit.client.0.vm07.stdout:8/86: write d7/f15 [1177445,22695] 0
2026-03-09T19:27:06.805 INFO:tasks.workunit.client.0.vm07.stdout:8/87: write d7/d9/d10/f20 [289199,3065] 0
2026-03-09T19:27:06.809 INFO:tasks.workunit.client.0.vm07.stdout:8/88: dwrite f1 [0,4194304] 0
2026-03-09T19:27:06.812 INFO:tasks.workunit.client.1.vm08.stdout:3/408: dread d0/d8/f4c [0,4194304] 0
2026-03-09T19:27:06.815 INFO:tasks.workunit.client.0.vm07.stdout:8/89: dread d7/f19 [0,4194304] 0
2026-03-09T19:27:06.822 INFO:tasks.workunit.client.1.vm08.stdout:0/371: unlink dd/d22/d27/d4f/f5d 0
2026-03-09T19:27:06.822 INFO:tasks.workunit.client.1.vm08.stdout:0/372: chown dd/d22/d27/d2e/d37 390890 1
2026-03-09T19:27:06.824 INFO:tasks.workunit.client.1.vm08.stdout:1/489: rmdir d9/d40 39
2026-03-09T19:27:06.824 INFO:tasks.workunit.client.1.vm08.stdout:1/490: write d9/da/dc/f10 [2098519,68649] 0
2026-03-09T19:27:06.827 INFO:tasks.workunit.client.0.vm07.stdout:1/54: write d1/db/f14 [1775934,59640] 0
2026-03-09T19:27:06.835 INFO:tasks.workunit.client.0.vm07.stdout:2/62: creat d3/f1a x:0 0 0
2026-03-09T19:27:06.835 INFO:tasks.workunit.client.0.vm07.stdout:1/55: chown d1/d3/f12 36564 1
2026-03-09T19:27:06.835 INFO:tasks.workunit.client.1.vm08.stdout:5/339: getdents d16/d1e/d3b/d61 0
2026-03-09T19:27:06.835 INFO:tasks.workunit.client.1.vm08.stdout:8/378: creat de/d25/d31/d82/d6d/f88 x:0 0 0
2026-03-09T19:27:06.835 INFO:tasks.workunit.client.1.vm08.stdout:8/379: dread de/d1d/d4f/f51 [0,4194304] 0
2026-03-09T19:27:06.835 INFO:tasks.workunit.client.1.vm08.stdout:4/355: rmdir da/d10/d16/d28/d46/d52 39
2026-03-09T19:27:06.835 INFO:tasks.workunit.client.1.vm08.stdout:6/368: creat d3/d34/d3b/f8d x:0 0 0
2026-03-09T19:27:06.835 INFO:tasks.workunit.client.1.vm08.stdout:6/369: readlink d3/db/lf 0
2026-03-09T19:27:06.836 INFO:tasks.workunit.client.0.vm07.stdout:1/56: dwrite d1/d3/f12 [0,4194304] 0
2026-03-09T19:27:06.837 INFO:tasks.workunit.client.0.vm07.stdout:9/76: mknod d0/c16 0
2026-03-09T19:27:06.838 INFO:tasks.workunit.client.1.vm08.stdout:6/370: dwrite d3/d34/d5c/f7c [0,4194304] 0
2026-03-09T19:27:06.848 INFO:tasks.workunit.client.1.vm08.stdout:0/373: dread dd/d22/d24/d49/f4c [0,4194304] 0
2026-03-09T19:27:06.850 INFO:tasks.workunit.client.0.vm07.stdout:7/66: creat d0/d4/d5/dd/f16 x:0 0 0
2026-03-09T19:27:06.863 INFO:tasks.workunit.client.1.vm08.stdout:1/491: mkdir d9/da/d95 0
2026-03-09T19:27:06.863 INFO:tasks.workunit.client.0.vm07.stdout:5/73: creat d3/f18 x:0 0 0
2026-03-09T19:27:06.863 INFO:tasks.workunit.client.0.vm07.stdout:8/90: mknod d7/d9/c25 0
2026-03-09T19:27:06.864 INFO:tasks.workunit.client.1.vm08.stdout:2/334: creat d3/f7c x:0 0 0
2026-03-09T19:27:06.865 INFO:tasks.workunit.client.1.vm08.stdout:5/340: dread - d16/f2b zero size
2026-03-09T19:27:06.865 INFO:tasks.workunit.client.0.vm07.stdout:4/51: rename d3/l9 to d3/lb 0
2026-03-09T19:27:06.866 INFO:tasks.workunit.client.1.vm08.stdout:5/341: write d16/d1e/d3b/f43 [1162260,30849] 0
2026-03-09T19:27:06.866 INFO:tasks.workunit.client.0.vm07.stdout:4/52: read - d3/fa zero size
2026-03-09T19:27:06.869 INFO:tasks.workunit.client.0.vm07.stdout:2/63: rename d3/c14 to d3/c1b 0
2026-03-09T19:27:06.869 INFO:tasks.workunit.client.1.vm08.stdout:4/356: creat da/d10/f6b x:0 0 0
2026-03-09T19:27:06.870 INFO:tasks.workunit.client.0.vm07.stdout:5/74: dwrite d3/f18 [0,4194304] 0
2026-03-09T19:27:06.871 INFO:tasks.workunit.client.0.vm07.stdout:7/67: sync
2026-03-09T19:27:06.873 INFO:tasks.workunit.client.0.vm07.stdout:1/57: creat d1/d9/f16 x:0 0 0
2026-03-09T19:27:06.873 INFO:tasks.workunit.client.0.vm07.stdout:5/75: readlink d3/dd/l13 0
2026-03-09T19:27:06.879 INFO:tasks.workunit.client.1.vm08.stdout:0/374: unlink dd/d31/l3b 0
2026-03-09T19:27:06.879 INFO:tasks.workunit.client.0.vm07.stdout:1/58: dwrite d1/d3/f12 [0,4194304] 0
2026-03-09T19:27:06.881 INFO:tasks.workunit.client.0.vm07.stdout:9/77: mkdir d0/d17 0
2026-03-09T19:27:06.882 INFO:tasks.workunit.client.0.vm07.stdout:0/69: creat d0/d7/f18 x:0 0 0
2026-03-09T19:27:06.884 INFO:tasks.workunit.client.0.vm07.stdout:8/91: symlink d7/d9/d10/l26 0
2026-03-09T19:27:06.885 INFO:tasks.workunit.client.0.vm07.stdout:1/59: dwrite d1/d9/f16 [0,4194304] 0
2026-03-09T19:27:06.888 INFO:tasks.workunit.client.0.vm07.stdout:8/92: dread d7/f19 [0,4194304] 0
2026-03-09T19:27:06.889 INFO:tasks.workunit.client.0.vm07.stdout:8/93: write d7/f1c [82954,73478] 0
2026-03-09T19:27:06.889 INFO:tasks.workunit.client.0.vm07.stdout:8/94: chown d7/d9/c25 469697 1
2026-03-09T19:27:06.892 INFO:tasks.workunit.client.1.vm08.stdout:1/492: readlink d9/da/l24 0
2026-03-09T19:27:06.894 INFO:tasks.workunit.client.0.vm07.stdout:2/64: rename d3/d11/f17 to d3/dd/d16/f1c 0
2026-03-09T19:27:06.895 INFO:tasks.workunit.client.0.vm07.stdout:4/53: dread d3/f5 [0,4194304] 0
2026-03-09T19:27:06.902 INFO:tasks.workunit.client.0.vm07.stdout:2/65: dwrite d3/d11/f18 [0,4194304] 0
2026-03-09T19:27:06.904 INFO:tasks.workunit.client.0.vm07.stdout:7/68: mkdir d0/d4/d17 0
2026-03-09T19:27:06.913 INFO:tasks.workunit.client.1.vm08.stdout:2/335: rename d3/d9/f67 to d3/d9/d79/f7d 0
2026-03-09T19:27:06.913 INFO:tasks.workunit.client.1.vm08.stdout:5/342: rename d16/d1e/d30/f3e to d16/d1e/d30/f70 0
2026-03-09T19:27:06.916 INFO:tasks.workunit.client.1.vm08.stdout:5/343: write d16/d1e/f5f [917404,23551] 0
2026-03-09T19:27:06.918 INFO:tasks.workunit.client.1.vm08.stdout:5/344: write d16/d1e/d3b/f50 [2556272,73753] 0
2026-03-09T19:27:06.920 INFO:tasks.workunit.client.1.vm08.stdout:8/380: chown de/d25/d33/f83 193028 1
2026-03-09T19:27:06.921 INFO:tasks.workunit.client.1.vm08.stdout:4/357: rmdir da/d10/d16/d28/d2f/d4f 39
2026-03-09T19:27:06.928 INFO:tasks.workunit.client.1.vm08.stdout:6/371: mknod d3/d34/d3b/d85/c8e 0
2026-03-09T19:27:06.931 INFO:tasks.workunit.client.1.vm08.stdout:6/372: dread d3/db/f14 [0,4194304] 0
2026-03-09T19:27:06.931 INFO:tasks.workunit.client.1.vm08.stdout:6/373: readlink d3/l5b 0
2026-03-09T19:27:06.935 INFO:tasks.workunit.client.0.vm07.stdout:2/66: creat d3/dd/f1d x:0 0 0
2026-03-09T19:27:06.935 INFO:tasks.workunit.client.0.vm07.stdout:2/67: fsync d3/f15 0
2026-03-09T19:27:06.935 INFO:tasks.workunit.client.1.vm08.stdout:8/381: symlink de/d25/d31/l89 0
2026-03-09T19:27:06.936 INFO:tasks.workunit.client.0.vm07.stdout:7/69: creat d0/d4/d5/dd/f18 x:0 0 0
2026-03-09T19:27:06.937 INFO:tasks.workunit.client.1.vm08.stdout:4/358: mkdir da/d14/d40/d6c 0
2026-03-09T19:27:06.938 INFO:tasks.workunit.client.1.vm08.stdout:3/409: getdents d0/d6/de/d1b/d16 0
2026-03-09T19:27:06.939 
INFO:tasks.workunit.client.1.vm08.stdout:3/410: write d0/d6/de/d1b/d16/d17/f1d [3366018,100826] 0 2026-03-09T19:27:06.940 INFO:tasks.workunit.client.0.vm07.stdout:1/60: truncate d1/d3/f4 2487520 0 2026-03-09T19:27:06.942 INFO:tasks.workunit.client.0.vm07.stdout:4/54: chown d3/l4 5752467 1 2026-03-09T19:27:06.942 INFO:tasks.workunit.client.1.vm08.stdout:1/493: rmdir d9/da/d17/d60 39 2026-03-09T19:27:06.942 INFO:tasks.workunit.client.0.vm07.stdout:1/61: dread d1/d9/f16 [0,4194304] 0 2026-03-09T19:27:06.943 INFO:tasks.workunit.client.0.vm07.stdout:7/70: symlink d0/d4/d5/dd/l19 0 2026-03-09T19:27:06.944 INFO:tasks.workunit.client.1.vm08.stdout:8/382: truncate de/d1d/d4f/f6b 1037423 0 2026-03-09T19:27:06.945 INFO:tasks.workunit.client.0.vm07.stdout:9/78: link d0/d6/le d0/d17/l18 0 2026-03-09T19:27:06.945 INFO:tasks.workunit.client.1.vm08.stdout:4/359: fsync da/d10/f2e 0 2026-03-09T19:27:06.946 INFO:tasks.workunit.client.0.vm07.stdout:9/79: read d0/d6/f10 [213241,122064] 0 2026-03-09T19:27:06.946 INFO:tasks.workunit.client.1.vm08.stdout:4/360: read da/d14/d2c/f4a [113817,26894] 0 2026-03-09T19:27:06.946 INFO:tasks.workunit.client.0.vm07.stdout:1/62: dwrite d1/f6 [0,4194304] 0 2026-03-09T19:27:06.948 INFO:tasks.workunit.client.1.vm08.stdout:0/375: creat dd/d22/d24/f77 x:0 0 0 2026-03-09T19:27:06.949 INFO:tasks.workunit.client.1.vm08.stdout:3/411: rename d0/d6/de/d1a/d33 to d0/d52/d6d/d77 0 2026-03-09T19:27:06.951 INFO:tasks.workunit.client.0.vm07.stdout:4/55: creat d3/fc x:0 0 0 2026-03-09T19:27:06.952 INFO:tasks.workunit.client.1.vm08.stdout:1/494: write d9/da/d2c/d6a/f6b [4143312,23659] 0 2026-03-09T19:27:06.953 INFO:tasks.workunit.client.0.vm07.stdout:7/71: rename d0/d4/d17 to d0/d4/d5/d8/d1a 0 2026-03-09T19:27:06.953 INFO:tasks.workunit.client.0.vm07.stdout:7/72: write d0/d4/d5/dd/f16 [82416,32442] 0 2026-03-09T19:27:06.953 INFO:tasks.workunit.client.1.vm08.stdout:8/383: symlink de/d1d/d69/l8a 0 2026-03-09T19:27:06.955 INFO:tasks.workunit.client.0.vm07.stdout:9/80: 
symlink d0/l19 0 2026-03-09T19:27:06.955 INFO:tasks.workunit.client.0.vm07.stdout:4/56: readlink d3/l4 0 2026-03-09T19:27:06.955 INFO:tasks.workunit.client.0.vm07.stdout:1/63: creat d1/f17 x:0 0 0 2026-03-09T19:27:06.955 INFO:tasks.workunit.client.1.vm08.stdout:4/361: mkdir da/d14/d6d 0 2026-03-09T19:27:06.955 INFO:tasks.workunit.client.0.vm07.stdout:9/81: dread - d0/d6/ff zero size 2026-03-09T19:27:06.955 INFO:tasks.workunit.client.1.vm08.stdout:4/362: stat da/d10/d26/d38/l6a 0 2026-03-09T19:27:06.956 INFO:tasks.workunit.client.0.vm07.stdout:1/64: rename d1 to d1/d9/d18 22 2026-03-09T19:27:06.956 INFO:tasks.workunit.client.0.vm07.stdout:7/73: creat d0/d4/d5/dd/f1b x:0 0 0 2026-03-09T19:27:06.957 INFO:tasks.workunit.client.1.vm08.stdout:0/376: write dd/d22/d24/f26 [3918739,106006] 0 2026-03-09T19:27:06.959 INFO:tasks.workunit.client.1.vm08.stdout:3/412: dread - d0/d6/de/d6e/d51/f70 zero size 2026-03-09T19:27:06.969 INFO:tasks.workunit.client.0.vm07.stdout:7/74: creat d0/d4/d5/d8/f1c x:0 0 0 2026-03-09T19:27:06.970 INFO:tasks.workunit.client.0.vm07.stdout:9/82: symlink d0/db/l1a 0 2026-03-09T19:27:06.970 INFO:tasks.workunit.client.0.vm07.stdout:7/75: creat d0/d4/d5/d8/d1a/f1d x:0 0 0 2026-03-09T19:27:06.970 INFO:tasks.workunit.client.0.vm07.stdout:7/76: fdatasync d0/d4/d5/d8/fa 0 2026-03-09T19:27:06.970 INFO:tasks.workunit.client.0.vm07.stdout:7/77: dread - d0/d4/d5/d8/d1a/f1d zero size 2026-03-09T19:27:06.970 INFO:tasks.workunit.client.0.vm07.stdout:7/78: symlink d0/d4/d5/d8/d1a/l1e 0 2026-03-09T19:27:06.970 INFO:tasks.workunit.client.0.vm07.stdout:7/79: creat d0/d4/d5/dd/f1f x:0 0 0 2026-03-09T19:27:06.970 INFO:tasks.workunit.client.1.vm08.stdout:8/384: symlink de/d7c/l8b 0 2026-03-09T19:27:06.976 INFO:tasks.workunit.client.0.vm07.stdout:7/80: creat d0/d4/d5/f20 x:0 0 0 2026-03-09T19:27:06.976 INFO:tasks.workunit.client.0.vm07.stdout:7/81: rename d0/l6 to d0/d4/d5/d8/d1a/l21 0 2026-03-09T19:27:06.976 INFO:tasks.workunit.client.1.vm08.stdout:8/385: write 
de/d1d/d21/f30 [3400583,16361] 0 2026-03-09T19:27:06.976 INFO:tasks.workunit.client.1.vm08.stdout:8/386: dread - de/d1d/d2e/d5f/f80 zero size 2026-03-09T19:27:06.976 INFO:tasks.workunit.client.1.vm08.stdout:8/387: chown de/d47/d85 10 1 2026-03-09T19:27:06.976 INFO:tasks.workunit.client.1.vm08.stdout:8/388: mknod de/d1d/d21/d73/c8c 0 2026-03-09T19:27:06.976 INFO:tasks.workunit.client.1.vm08.stdout:1/495: link d9/d40/l54 d9/da/d12/l96 0 2026-03-09T19:27:06.977 INFO:tasks.workunit.client.0.vm07.stdout:7/82: creat d0/f22 x:0 0 0 2026-03-09T19:27:06.977 INFO:tasks.workunit.client.1.vm08.stdout:1/496: symlink d9/da/d2c/d6a/l97 0 2026-03-09T19:27:06.979 INFO:tasks.workunit.client.1.vm08.stdout:1/497: fsync d9/da/d53/d67/f79 0 2026-03-09T19:27:06.979 INFO:tasks.workunit.client.0.vm07.stdout:7/83: link d0/l2 d0/d4/d5/d8/d1a/l23 0 2026-03-09T19:27:06.982 INFO:tasks.workunit.client.0.vm07.stdout:7/84: link d0/l2 d0/d4/l24 0 2026-03-09T19:27:06.982 INFO:tasks.workunit.client.0.vm07.stdout:7/85: read - d0/d4/d5/d8/f15 zero size 2026-03-09T19:27:06.984 INFO:tasks.workunit.client.0.vm07.stdout:7/86: creat d0/f25 x:0 0 0 2026-03-09T19:27:06.985 INFO:tasks.workunit.client.0.vm07.stdout:7/87: write d0/f1 [5869932,24525] 0 2026-03-09T19:27:06.992 INFO:tasks.workunit.client.0.vm07.stdout:7/88: dwrite d0/d4/d5/d8/f15 [0,4194304] 0 2026-03-09T19:27:07.002 INFO:tasks.workunit.client.0.vm07.stdout:7/89: truncate d0/f13 270121 0 2026-03-09T19:27:07.007 INFO:tasks.workunit.client.0.vm07.stdout:7/90: mkdir d0/d4/d5/d26 0 2026-03-09T19:27:07.010 INFO:tasks.workunit.client.0.vm07.stdout:9/83: sync 2026-03-09T19:27:07.010 INFO:tasks.workunit.client.0.vm07.stdout:2/68: sync 2026-03-09T19:27:07.010 INFO:tasks.workunit.client.0.vm07.stdout:7/91: dwrite d0/d4/f12 [0,4194304] 0 2026-03-09T19:27:07.010 INFO:tasks.workunit.client.0.vm07.stdout:1/65: sync 2026-03-09T19:27:07.012 INFO:tasks.workunit.client.0.vm07.stdout:9/84: symlink d0/db/l1b 0 2026-03-09T19:27:07.012 
INFO:tasks.workunit.client.0.vm07.stdout:2/69: creat d3/dd/f1e x:0 0 0 2026-03-09T19:27:07.014 INFO:tasks.workunit.client.0.vm07.stdout:1/66: symlink d1/d11/l19 0 2026-03-09T19:27:07.016 INFO:tasks.workunit.client.0.vm07.stdout:2/70: rename d3/d11/f13 to d3/d11/f1f 0 2026-03-09T19:27:07.018 INFO:tasks.workunit.client.0.vm07.stdout:9/85: dwrite d0/d6/ff [0,4194304] 0 2026-03-09T19:27:07.022 INFO:tasks.workunit.client.0.vm07.stdout:9/86: symlink d0/d17/l1c 0 2026-03-09T19:27:07.023 INFO:tasks.workunit.client.0.vm07.stdout:9/87: dread - d0/d6/f13 zero size 2026-03-09T19:27:07.023 INFO:tasks.workunit.client.0.vm07.stdout:1/67: dwrite d1/d9/f15 [0,4194304] 0 2026-03-09T19:27:07.025 INFO:tasks.workunit.client.0.vm07.stdout:9/88: dread d0/f4 [0,4194304] 0 2026-03-09T19:27:07.025 INFO:tasks.workunit.client.0.vm07.stdout:9/89: chown d0/d6/ff 3464 1 2026-03-09T19:27:07.026 INFO:tasks.workunit.client.0.vm07.stdout:9/90: chown d0/d6/c15 28592093 1 2026-03-09T19:27:07.030 INFO:tasks.workunit.client.1.vm08.stdout:2/336: sync 2026-03-09T19:27:07.030 INFO:tasks.workunit.client.1.vm08.stdout:8/389: sync 2026-03-09T19:27:07.031 INFO:tasks.workunit.client.1.vm08.stdout:8/390: readlink de/d25/l26 0 2026-03-09T19:27:07.033 INFO:tasks.workunit.client.1.vm08.stdout:2/337: rmdir d3/d4/d3e/d4e 39 2026-03-09T19:27:07.033 INFO:tasks.workunit.client.0.vm07.stdout:9/91: dwrite d0/d6/f8 [0,4194304] 0 2026-03-09T19:27:07.051 INFO:tasks.workunit.client.1.vm08.stdout:8/391: mknod de/d25/d31/d82/c8d 0 2026-03-09T19:27:07.051 INFO:tasks.workunit.client.1.vm08.stdout:8/392: stat de/f20 0 2026-03-09T19:27:07.054 INFO:tasks.workunit.client.1.vm08.stdout:0/377: dread dd/d22/d27/d2e/f39 [0,4194304] 0 2026-03-09T19:27:07.057 INFO:tasks.workunit.client.1.vm08.stdout:2/338: unlink d3/d4/d23/d2c/d39/d5e/c2f 0 2026-03-09T19:27:07.061 INFO:tasks.workunit.client.0.vm07.stdout:9/92: creat d0/db/f1d x:0 0 0 2026-03-09T19:27:07.063 INFO:tasks.workunit.client.0.vm07.stdout:9/93: mknod d0/db/c1e 0 
2026-03-09T19:27:07.063 INFO:tasks.workunit.client.1.vm08.stdout:0/378: mkdir dd/d22/d24/d49/d50/d78 0 2026-03-09T19:27:07.064 INFO:tasks.workunit.client.0.vm07.stdout:9/94: truncate d0/d6/f10 5153421 0 2026-03-09T19:27:07.064 INFO:tasks.workunit.client.0.vm07.stdout:9/95: read d0/f4 [87597,49152] 0 2026-03-09T19:27:07.065 INFO:tasks.workunit.client.0.vm07.stdout:9/96: dread d0/f4 [0,4194304] 0 2026-03-09T19:27:07.066 INFO:tasks.workunit.client.0.vm07.stdout:9/97: creat d0/d17/f1f x:0 0 0 2026-03-09T19:27:07.069 INFO:tasks.workunit.client.0.vm07.stdout:9/98: creat d0/d6/f20 x:0 0 0 2026-03-09T19:27:07.070 INFO:tasks.workunit.client.0.vm07.stdout:9/99: rename d0/d6/f13 to d0/db/f21 0 2026-03-09T19:27:07.071 INFO:tasks.workunit.client.0.vm07.stdout:9/100: chown d0/l1 33 1 2026-03-09T19:27:07.072 INFO:tasks.workunit.client.0.vm07.stdout:9/101: mknod d0/c22 0 2026-03-09T19:27:07.135 INFO:tasks.workunit.client.0.vm07.stdout:8/95: read d7/d9/d10/f1b [534622,97314] 0 2026-03-09T19:27:07.136 INFO:tasks.workunit.client.0.vm07.stdout:8/96: write f1 [2185755,16079] 0 2026-03-09T19:27:07.138 INFO:tasks.workunit.client.0.vm07.stdout:8/97: creat d7/d16/d1e/f27 x:0 0 0 2026-03-09T19:27:07.139 INFO:tasks.workunit.client.0.vm07.stdout:8/98: write d7/f1c [180132,125713] 0 2026-03-09T19:27:07.141 INFO:tasks.workunit.client.0.vm07.stdout:8/99: read d7/ff [231877,43614] 0 2026-03-09T19:27:07.145 INFO:tasks.workunit.client.0.vm07.stdout:8/100: dwrite d7/d9/fd [4194304,4194304] 0 2026-03-09T19:27:07.148 INFO:tasks.workunit.client.0.vm07.stdout:8/101: symlink d7/d16/l28 0 2026-03-09T19:27:07.153 INFO:tasks.workunit.client.0.vm07.stdout:8/102: write d7/d9/d10/f1b [965035,123395] 0 2026-03-09T19:27:07.160 INFO:tasks.workunit.client.0.vm07.stdout:3/89: write d1/f7 [4406194,27284] 0 2026-03-09T19:27:07.196 INFO:tasks.workunit.client.1.vm08.stdout:9/335: write d0/d2/d14/f4d [2261697,36580] 0 2026-03-09T19:27:07.197 INFO:tasks.workunit.client.1.vm08.stdout:9/336: mkdir d0/d2/d14/d5c/d32/d57/d72 
0 2026-03-09T19:27:07.198 INFO:tasks.workunit.client.1.vm08.stdout:9/337: symlink d0/d2/d14/d5c/d32/d57/l73 0 2026-03-09T19:27:07.202 INFO:tasks.workunit.client.1.vm08.stdout:9/338: dwrite d0/d2/d8/d7/d48/f53 [0,4194304] 0 2026-03-09T19:27:07.206 INFO:tasks.workunit.client.1.vm08.stdout:9/339: dread d0/d2/d14/d5c/d32/f40 [0,4194304] 0 2026-03-09T19:27:07.207 INFO:tasks.workunit.client.1.vm08.stdout:9/340: dread - d0/d2/f36 zero size 2026-03-09T19:27:07.207 INFO:tasks.workunit.client.1.vm08.stdout:9/341: readlink d0/d2/d8/l50 0 2026-03-09T19:27:07.208 INFO:tasks.workunit.client.1.vm08.stdout:9/342: mkdir d0/d2/d8/d7/d48/d5d/d74 0 2026-03-09T19:27:07.209 INFO:tasks.workunit.client.1.vm08.stdout:9/343: truncate d0/d2/f6c 538471 0 2026-03-09T19:27:07.210 INFO:tasks.workunit.client.1.vm08.stdout:9/344: truncate d0/d2/d8/d7/f58 507132 0 2026-03-09T19:27:07.213 INFO:tasks.workunit.client.1.vm08.stdout:9/345: unlink d0/d2/d14/d5c/d32/c4c 0 2026-03-09T19:27:07.217 INFO:tasks.workunit.client.1.vm08.stdout:9/346: read d0/f44 [380873,107618] 0 2026-03-09T19:27:07.219 INFO:tasks.workunit.client.1.vm08.stdout:9/347: mknod d0/d2/d14/d5c/d32/d57/d72/c75 0 2026-03-09T19:27:07.219 INFO:tasks.workunit.client.1.vm08.stdout:9/348: truncate d0/d2/f6c 1581124 0 2026-03-09T19:27:07.220 INFO:tasks.workunit.client.1.vm08.stdout:9/349: stat d0/d2/f2f 0 2026-03-09T19:27:07.220 INFO:tasks.workunit.client.1.vm08.stdout:9/350: fsync d0/d2/d8/d7/f63 0 2026-03-09T19:27:07.230 INFO:tasks.workunit.client.1.vm08.stdout:9/351: read d0/d2/f1d [432951,5578] 0 2026-03-09T19:27:07.231 INFO:tasks.workunit.client.1.vm08.stdout:7/424: dread d5/d14/d2b/f32 [0,4194304] 0 2026-03-09T19:27:07.231 INFO:tasks.workunit.client.1.vm08.stdout:9/352: write d0/d2/d8/d7/f58 [1446412,118485] 0 2026-03-09T19:27:07.235 INFO:tasks.workunit.client.1.vm08.stdout:7/425: mknod d5/d16/d3a/c97 0 2026-03-09T19:27:07.235 INFO:tasks.workunit.client.1.vm08.stdout:7/426: chown d5/d14/d38/c41 2 1 2026-03-09T19:27:07.236 
INFO:tasks.workunit.client.1.vm08.stdout:9/353: mknod d0/d2/d8/d7/d48/d6f/c76 0 2026-03-09T19:27:07.238 INFO:tasks.workunit.client.1.vm08.stdout:9/354: truncate d0/d2/f21 244413 0 2026-03-09T19:27:07.239 INFO:tasks.workunit.client.1.vm08.stdout:9/355: dread - d0/d1b/f65 zero size 2026-03-09T19:27:07.257 INFO:tasks.workunit.client.1.vm08.stdout:7/427: sync 2026-03-09T19:27:07.257 INFO:tasks.workunit.client.1.vm08.stdout:7/428: write d5/fb [4709053,105415] 0 2026-03-09T19:27:07.304 INFO:tasks.workunit.client.0.vm07.stdout:6/70: truncate d0/d1/f8 2113233 0 2026-03-09T19:27:07.316 INFO:tasks.workunit.client.0.vm07.stdout:0/70: write d0/fa [1286041,85020] 0 2026-03-09T19:27:07.318 INFO:tasks.workunit.client.0.vm07.stdout:0/71: read d0/fa [130364,35608] 0 2026-03-09T19:27:07.320 INFO:tasks.workunit.client.0.vm07.stdout:5/76: dwrite d3/d9/f12 [0,4194304] 0 2026-03-09T19:27:07.322 INFO:tasks.workunit.client.0.vm07.stdout:6/71: creat d0/d13/f1c x:0 0 0 2026-03-09T19:27:07.323 INFO:tasks.workunit.client.1.vm08.stdout:5/345: write ff [1409428,122533] 0 2026-03-09T19:27:07.335 INFO:tasks.workunit.client.1.vm08.stdout:5/346: dread d16/d1e/f44 [0,4194304] 0 2026-03-09T19:27:07.335 INFO:tasks.workunit.client.1.vm08.stdout:5/347: chown d16/d1e/f44 18 1 2026-03-09T19:27:07.336 INFO:tasks.workunit.client.0.vm07.stdout:1/68: truncate d1/d9/f16 446579 0 2026-03-09T19:27:07.344 INFO:tasks.workunit.client.1.vm08.stdout:5/348: dread d16/d1e/f27 [0,4194304] 0 2026-03-09T19:27:07.344 INFO:tasks.workunit.client.1.vm08.stdout:5/349: fsync d16/d1e/d30/f3a 0 2026-03-09T19:27:07.345 INFO:tasks.workunit.client.0.vm07.stdout:6/72: fsync d0/fd 0 2026-03-09T19:27:07.345 INFO:tasks.workunit.client.1.vm08.stdout:5/350: read d16/d1e/d3b/f68 [2554389,92854] 0 2026-03-09T19:27:07.346 INFO:tasks.workunit.client.0.vm07.stdout:0/72: mkdir d0/d6/d13/d17/d19 0 2026-03-09T19:27:07.347 INFO:tasks.workunit.client.0.vm07.stdout:4/57: link d3/lb d3/ld 0 2026-03-09T19:27:07.348 
INFO:tasks.workunit.client.0.vm07.stdout:4/58: dread - d3/f8 zero size 2026-03-09T19:27:07.348 INFO:tasks.workunit.client.0.vm07.stdout:4/59: fsync d3/fa 0 2026-03-09T19:27:07.349 INFO:tasks.workunit.client.0.vm07.stdout:4/60: dread - d3/fc zero size 2026-03-09T19:27:07.349 INFO:tasks.workunit.client.1.vm08.stdout:6/374: dwrite d3/f2a [0,4194304] 0 2026-03-09T19:27:07.349 INFO:tasks.workunit.client.0.vm07.stdout:4/61: chown d3/f8 54967152 1 2026-03-09T19:27:07.349 INFO:tasks.workunit.client.0.vm07.stdout:4/62: fdatasync d3/f8 0 2026-03-09T19:27:07.364 INFO:tasks.workunit.client.0.vm07.stdout:1/69: rename d1/d3/c13 to d1/c1a 0 2026-03-09T19:27:07.364 INFO:tasks.workunit.client.1.vm08.stdout:6/375: truncate d3/d34/d6f/f4f 1139474 0 2026-03-09T19:27:07.364 INFO:tasks.workunit.client.0.vm07.stdout:1/70: read d1/d3/f12 [2752510,100619] 0 2026-03-09T19:27:07.365 INFO:tasks.workunit.client.1.vm08.stdout:4/363: rename da/d14 to da/d10/d16/d28/d46/d52/d6e 0 2026-03-09T19:27:07.366 INFO:tasks.workunit.client.0.vm07.stdout:6/73: mkdir d0/d1/db/d1d 0 2026-03-09T19:27:07.367 INFO:tasks.workunit.client.1.vm08.stdout:5/351: symlink d16/l71 0 2026-03-09T19:27:07.371 INFO:tasks.workunit.client.0.vm07.stdout:4/63: mknod d3/ce 0 2026-03-09T19:27:07.373 INFO:tasks.workunit.client.1.vm08.stdout:1/498: rename f2 to d9/da/d12/f98 0 2026-03-09T19:27:07.377 INFO:tasks.workunit.client.1.vm08.stdout:1/499: dwrite d9/d40/d49/f7c [0,4194304] 0 2026-03-09T19:27:07.388 INFO:tasks.workunit.client.0.vm07.stdout:5/77: creat d3/f19 x:0 0 0 2026-03-09T19:27:07.388 INFO:tasks.workunit.client.0.vm07.stdout:5/78: readlink d3/d9/l15 0 2026-03-09T19:27:07.388 INFO:tasks.workunit.client.0.vm07.stdout:5/79: chown d3/d9 15293 1 2026-03-09T19:27:07.388 INFO:tasks.workunit.client.0.vm07.stdout:5/80: read f2 [1214939,91636] 0 2026-03-09T19:27:07.389 INFO:tasks.workunit.client.1.vm08.stdout:4/364: fdatasync da/d10/d16/d28/d2f/d4f/f65 0 2026-03-09T19:27:07.390 INFO:tasks.workunit.client.1.vm08.stdout:4/365: 
truncate da/d10/d16/d28/d2f/d4f/f65 929080 0 2026-03-09T19:27:07.395 INFO:tasks.workunit.client.0.vm07.stdout:1/71: creat d1/d11/f1b x:0 0 0 2026-03-09T19:27:07.395 INFO:tasks.workunit.client.1.vm08.stdout:5/352: chown d16/d1e/c22 51 1 2026-03-09T19:27:07.395 INFO:tasks.workunit.client.1.vm08.stdout:3/413: write d0/d6/d25/f56 [866411,41471] 0 2026-03-09T19:27:07.400 INFO:tasks.workunit.client.0.vm07.stdout:0/73: write d0/d7/f9 [198940,23165] 0 2026-03-09T19:27:07.402 INFO:tasks.workunit.client.1.vm08.stdout:5/353: dread d16/d1e/d3b/f43 [0,4194304] 0 2026-03-09T19:27:07.405 INFO:tasks.workunit.client.1.vm08.stdout:2/339: rename d3/d4/d3e/f7b to d3/d4/d23/d5c/f7e 0 2026-03-09T19:27:07.409 INFO:tasks.workunit.client.1.vm08.stdout:2/340: dwrite d3/d4/d23/d2c/d39/d5e/de/f7a [0,4194304] 0 2026-03-09T19:27:07.412 INFO:tasks.workunit.client.1.vm08.stdout:2/341: dread - d3/d4/d23/d2c/f64 zero size 2026-03-09T19:27:07.422 INFO:tasks.workunit.client.0.vm07.stdout:7/92: getdents d0/d4/d5 0 2026-03-09T19:27:07.425 INFO:tasks.workunit.client.0.vm07.stdout:2/71: fsync d3/dd/f1e 0 2026-03-09T19:27:07.425 INFO:tasks.workunit.client.0.vm07.stdout:2/72: dread d3/ff [0,4194304] 0 2026-03-09T19:27:07.425 INFO:tasks.workunit.client.0.vm07.stdout:2/73: chown d3/dd/d16 8405 1 2026-03-09T19:27:07.426 INFO:tasks.workunit.client.1.vm08.stdout:4/366: truncate da/d10/f1c 219446 0 2026-03-09T19:27:07.428 INFO:tasks.workunit.client.1.vm08.stdout:8/393: write de/d1d/d2e/d5f/f57 [1969961,73793] 0 2026-03-09T19:27:07.429 INFO:tasks.workunit.client.1.vm08.stdout:8/394: readlink de/d1d/d2e/d5f/l6c 0 2026-03-09T19:27:07.432 INFO:tasks.workunit.client.0.vm07.stdout:1/72: creat d1/d9/f1c x:0 0 0 2026-03-09T19:27:07.434 INFO:tasks.workunit.client.1.vm08.stdout:0/379: truncate dd/d22/f29 3885016 0 2026-03-09T19:27:07.437 INFO:tasks.workunit.client.0.vm07.stdout:0/74: rename d0/d6/cd to d0/c1a 0 2026-03-09T19:27:07.438 INFO:tasks.workunit.client.0.vm07.stdout:9/102: getdents d0/d6 0 2026-03-09T19:27:07.438 
INFO:tasks.workunit.client.0.vm07.stdout:9/103: chown d0 1457 1 2026-03-09T19:27:07.440 INFO:tasks.workunit.client.0.vm07.stdout:9/104: write d0/db/f1d [803479,114518] 0 2026-03-09T19:27:07.440 INFO:tasks.workunit.client.0.vm07.stdout:9/105: chown d0/db/l1a 51564939 1 2026-03-09T19:27:07.440 INFO:tasks.workunit.client.1.vm08.stdout:6/376: creat d3/db/f8f x:0 0 0 2026-03-09T19:27:07.441 INFO:tasks.workunit.client.0.vm07.stdout:8/103: truncate d7/ff 378330 0 2026-03-09T19:27:07.441 INFO:tasks.workunit.client.1.vm08.stdout:6/377: truncate d3/f9 4877510 0 2026-03-09T19:27:07.441 INFO:tasks.workunit.client.0.vm07.stdout:9/106: write d0/d6/f8 [2479271,34653] 0 2026-03-09T19:27:07.444 INFO:tasks.workunit.client.0.vm07.stdout:4/64: rename d3/lb to d3/lf 0 2026-03-09T19:27:07.444 INFO:tasks.workunit.client.0.vm07.stdout:3/90: dwrite d1/d10/f1a [0,4194304] 0 2026-03-09T19:27:07.457 INFO:tasks.workunit.client.0.vm07.stdout:2/74: mknod d3/d11/c20 0 2026-03-09T19:27:07.461 INFO:tasks.workunit.client.1.vm08.stdout:2/342: unlink d3/d4/d23/d2c/d39/d5e/f68 0 2026-03-09T19:27:07.467 INFO:tasks.workunit.client.0.vm07.stdout:1/73: rename d1/f17 to d1/f1d 0 2026-03-09T19:27:07.467 INFO:tasks.workunit.client.0.vm07.stdout:1/74: readlink d1/l7 0 2026-03-09T19:27:07.467 INFO:tasks.workunit.client.1.vm08.stdout:4/367: sync 2026-03-09T19:27:07.467 INFO:tasks.workunit.client.1.vm08.stdout:1/500: creat d9/da/d53/d67/d6c/d76/f99 x:0 0 0 2026-03-09T19:27:07.469 INFO:tasks.workunit.client.1.vm08.stdout:8/395: creat de/d25/d31/f8e x:0 0 0 2026-03-09T19:27:07.471 INFO:tasks.workunit.client.0.vm07.stdout:0/75: unlink d0/c3 0 2026-03-09T19:27:07.474 INFO:tasks.workunit.client.1.vm08.stdout:0/380: unlink dd/d22/d24/d49/c66 0 2026-03-09T19:27:07.474 INFO:tasks.workunit.client.1.vm08.stdout:0/381: fdatasync dd/d22/d24/f26 0 2026-03-09T19:27:07.479 INFO:tasks.workunit.client.0.vm07.stdout:8/104: symlink d7/d16/l29 0 2026-03-09T19:27:07.481 INFO:tasks.workunit.client.1.vm08.stdout:0/382: sync 
2026-03-09T19:27:07.482 INFO:tasks.workunit.client.1.vm08.stdout:0/383: readlink dd/l14 0 2026-03-09T19:27:07.482 INFO:tasks.workunit.client.1.vm08.stdout:0/384: fsync fb 0 2026-03-09T19:27:07.484 INFO:tasks.workunit.client.0.vm07.stdout:9/107: unlink d0/c11 0 2026-03-09T19:27:07.489 INFO:tasks.workunit.client.0.vm07.stdout:4/65: mkdir d3/d10 0 2026-03-09T19:27:07.490 INFO:tasks.workunit.client.1.vm08.stdout:9/356: write d0/d2/d14/f19 [644652,68723] 0 2026-03-09T19:27:07.491 INFO:tasks.workunit.client.1.vm08.stdout:9/357: write d0/d2/d8/d7/d48/f53 [4800607,119770] 0 2026-03-09T19:27:07.491 INFO:tasks.workunit.client.1.vm08.stdout:6/378: mknod d3/d34/d3b/d85/c90 0 2026-03-09T19:27:07.492 INFO:tasks.workunit.client.1.vm08.stdout:6/379: fsync d3/f6e 0 2026-03-09T19:27:07.492 INFO:tasks.workunit.client.1.vm08.stdout:6/380: chown d3/d15/c21 115087 1 2026-03-09T19:27:07.496 INFO:tasks.workunit.client.1.vm08.stdout:6/381: dread d3/f6 [0,4194304] 0 2026-03-09T19:27:07.499 INFO:tasks.workunit.client.0.vm07.stdout:3/91: fsync d1/d6/f9 0 2026-03-09T19:27:07.500 INFO:tasks.workunit.client.1.vm08.stdout:5/354: link d16/d1e/d30/f3f d16/d1e/d6e/f72 0 2026-03-09T19:27:07.503 INFO:tasks.workunit.client.1.vm08.stdout:5/355: dread d16/d1e/f27 [0,4194304] 0 2026-03-09T19:27:07.508 INFO:tasks.workunit.client.1.vm08.stdout:7/429: rename d5/d16/f4a to d5/d16/d3a/d42/d6a/d8f/f98 0 2026-03-09T19:27:07.509 INFO:tasks.workunit.client.1.vm08.stdout:7/430: chown d5/d14/d27/d54/d86 15644148 1 2026-03-09T19:27:07.511 INFO:tasks.workunit.client.0.vm07.stdout:5/81: rename d3/d9 to d3/d1a 0 2026-03-09T19:27:07.512 INFO:tasks.workunit.client.0.vm07.stdout:5/82: write d3/d1a/fc [1580245,5570] 0 2026-03-09T19:27:07.516 INFO:tasks.workunit.client.1.vm08.stdout:2/343: creat d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f7f x:0 0 0 2026-03-09T19:27:07.517 INFO:tasks.workunit.client.1.vm08.stdout:2/344: write d3/d9/d26/f6a [2120224,85521] 0 2026-03-09T19:27:07.521 INFO:tasks.workunit.client.1.vm08.stdout:1/501: creat 
d9/d11/d7a/f9a x:0 0 0 2026-03-09T19:27:07.525 INFO:tasks.workunit.client.1.vm08.stdout:1/502: dwrite d9/da/d12/d39/f52 [0,4194304] 0 2026-03-09T19:27:07.530 INFO:tasks.workunit.client.0.vm07.stdout:8/105: mknod d7/d9/c2a 0 2026-03-09T19:27:07.532 INFO:tasks.workunit.client.0.vm07.stdout:9/108: unlink d0/c16 0 2026-03-09T19:27:07.534 INFO:tasks.workunit.client.0.vm07.stdout:4/66: rename d3/d10 to d3/d11 0 2026-03-09T19:27:07.535 INFO:tasks.workunit.client.1.vm08.stdout:0/385: symlink dd/d22/d27/l79 0 2026-03-09T19:27:07.538 INFO:tasks.workunit.client.0.vm07.stdout:3/92: creat d1/d6/f1b x:0 0 0 2026-03-09T19:27:07.545 INFO:tasks.workunit.client.1.vm08.stdout:9/358: rename d0 to d0/d1b/d77 22 2026-03-09T19:27:07.550 INFO:tasks.workunit.client.0.vm07.stdout:8/106: creat d7/d9/d10/f2b x:0 0 0 2026-03-09T19:27:07.551 INFO:tasks.workunit.client.1.vm08.stdout:9/359: chown d0/d1b 0 1 2026-03-09T19:27:07.551 INFO:tasks.workunit.client.1.vm08.stdout:9/360: chown d0/d2/d8/d7 0 1 2026-03-09T19:27:07.554 INFO:tasks.workunit.client.0.vm07.stdout:9/109: symlink d0/db/l23 0 2026-03-09T19:27:07.557 INFO:tasks.workunit.client.0.vm07.stdout:4/67: creat d3/d11/f12 x:0 0 0 2026-03-09T19:27:07.558 INFO:tasks.workunit.client.1.vm08.stdout:2/345: creat d3/d4/d23/d2c/f80 x:0 0 0 2026-03-09T19:27:07.562 INFO:tasks.workunit.client.0.vm07.stdout:7/93: getdents d0/d4/d5/d8 0 2026-03-09T19:27:07.570 INFO:tasks.workunit.client.1.vm08.stdout:3/414: link d0/d6/de/d1b/d16/l67 d0/d6/de/d15/l78 0 2026-03-09T19:27:07.574 INFO:tasks.workunit.client.1.vm08.stdout:0/386: read - dd/d22/d24/f60 zero size 2026-03-09T19:27:07.576 INFO:tasks.workunit.client.0.vm07.stdout:8/107: rename d7/d9/c25 to d7/d1d/c2c 0 2026-03-09T19:27:07.577 INFO:tasks.workunit.client.0.vm07.stdout:9/110: symlink d0/d6/l24 0 2026-03-09T19:27:07.577 INFO:tasks.workunit.client.1.vm08.stdout:6/382: mknod d3/c91 0 2026-03-09T19:27:07.580 INFO:tasks.workunit.client.0.vm07.stdout:9/111: dread d0/f3 [0,4194304] 0 2026-03-09T19:27:07.581 
INFO:tasks.workunit.client.1.vm08.stdout:6/383: dwrite d3/d34/d5c/f7c [0,4194304] 0 2026-03-09T19:27:07.585 INFO:tasks.workunit.client.1.vm08.stdout:9/361: mknod d0/d2/d14/d5c/d32/d57/d72/c78 0 2026-03-09T19:27:07.586 INFO:tasks.workunit.client.0.vm07.stdout:4/68: creat d3/f13 x:0 0 0 2026-03-09T19:27:07.586 INFO:tasks.workunit.client.1.vm08.stdout:9/362: chown d0/d2/d8/d7/f58 831459929 1 2026-03-09T19:27:07.586 INFO:tasks.workunit.client.0.vm07.stdout:4/69: chown d3/ce 8 1 2026-03-09T19:27:07.587 INFO:tasks.workunit.client.0.vm07.stdout:7/94: chown d0/d4/d5/d8/d1a/l23 1582 1 2026-03-09T19:27:07.588 INFO:tasks.workunit.client.0.vm07.stdout:2/75: link d3/c1b d3/dd/d16/c21 0 2026-03-09T19:27:07.588 INFO:tasks.workunit.client.0.vm07.stdout:7/95: read - d0/d4/d5/d8/f1c zero size 2026-03-09T19:27:07.588 INFO:tasks.workunit.client.1.vm08.stdout:2/346: write d3/d4/d23/d2c/d39/d5e/d14/f78 [3656637,64607] 0 2026-03-09T19:27:07.588 INFO:tasks.workunit.client.0.vm07.stdout:2/76: write f2 [3611649,55495] 0 2026-03-09T19:27:07.590 INFO:tasks.workunit.client.0.vm07.stdout:7/96: fsync d0/d4/d5/dd/f1f 0 2026-03-09T19:27:07.594 INFO:tasks.workunit.client.1.vm08.stdout:2/347: dwrite d3/d9/f5d [0,4194304] 0 2026-03-09T19:27:07.605 INFO:tasks.workunit.client.0.vm07.stdout:0/76: getdents d0/d6/d13/d17 0 2026-03-09T19:27:07.606 INFO:tasks.workunit.client.0.vm07.stdout:9/112: symlink d0/d6/l25 0 2026-03-09T19:27:07.606 INFO:tasks.workunit.client.0.vm07.stdout:0/77: truncate d0/d6/f16 696225 0 2026-03-09T19:27:07.607 INFO:tasks.workunit.client.0.vm07.stdout:0/78: chown d0/cf 28669 1 2026-03-09T19:27:07.609 INFO:tasks.workunit.client.0.vm07.stdout:1/75: getdents d1/d3 0 2026-03-09T19:27:07.609 INFO:tasks.workunit.client.0.vm07.stdout:4/70: symlink d3/d11/l14 0 2026-03-09T19:27:07.611 INFO:tasks.workunit.client.0.vm07.stdout:6/74: getdents d0/d1/db 0 2026-03-09T19:27:07.612 INFO:tasks.workunit.client.0.vm07.stdout:6/75: chown d0/fd 475361 1 2026-03-09T19:27:07.614 
INFO:tasks.workunit.client.0.vm07.stdout:9/113: dread d0/d6/f10 [0,4194304] 0
2026-03-09T19:27:07.614 INFO:tasks.workunit.client.0.vm07.stdout:7/97: mknod d0/d4/d5/d8/c27 0
2026-03-09T19:27:07.615 INFO:tasks.workunit.client.0.vm07.stdout:9/114: dread d0/f4 [0,4194304] 0
2026-03-09T19:27:07.617 INFO:tasks.workunit.client.0.vm07.stdout:1/76: dwrite d1/db/f14 [0,4194304] 0
2026-03-09T19:27:07.617 INFO:tasks.workunit.client.0.vm07.stdout:9/115: dread d0/f4 [0,4194304] 0
2026-03-09T19:27:07.622 INFO:tasks.workunit.client.0.vm07.stdout:9/116: write d0/d6/f20 [337577,101830] 0
2026-03-09T19:27:07.622 INFO:tasks.workunit.client.0.vm07.stdout:9/117: fsync d0/d6/fa 0
2026-03-09T19:27:07.623 INFO:tasks.workunit.client.0.vm07.stdout:9/118: readlink d0/d6/l9 0
2026-03-09T19:27:07.631 INFO:tasks.workunit.client.0.vm07.stdout:0/79: rename d0/d7/f18 to d0/d6/d13/d17/d19/f1b 0
2026-03-09T19:27:07.632 INFO:tasks.workunit.client.0.vm07.stdout:4/71: unlink d3/d11/l14 0
2026-03-09T19:27:07.632 INFO:tasks.workunit.client.0.vm07.stdout:4/72: chown d3/d11 4 1
2026-03-09T19:27:07.641 INFO:tasks.workunit.client.0.vm07.stdout:6/76: mkdir d0/d13/d1e 0
2026-03-09T19:27:07.641 INFO:tasks.workunit.client.0.vm07.stdout:6/77: write d0/d1/db/f14 [134002,121209] 0
2026-03-09T19:27:07.641 INFO:tasks.workunit.client.0.vm07.stdout:5/83: truncate f2 4111781 0
2026-03-09T19:27:07.657 INFO:tasks.workunit.client.0.vm07.stdout:0/80: chown d0/d7/d11 4430115 1
2026-03-09T19:27:07.658 INFO:tasks.workunit.client.0.vm07.stdout:4/73: mknod d3/d11/c15 0
2026-03-09T19:27:07.659 INFO:tasks.workunit.client.1.vm08.stdout:3/415: creat d0/d6/de/d6e/d51/f79 x:0 0 0
2026-03-09T19:27:07.661 INFO:tasks.workunit.client.0.vm07.stdout:6/78: unlink d0/fd 0
2026-03-09T19:27:07.661 INFO:tasks.workunit.client.0.vm07.stdout:3/93: dwrite d1/d6/f19 [0,4194304] 0
2026-03-09T19:27:07.665 INFO:tasks.workunit.client.0.vm07.stdout:3/94: dread d1/d6/fa [0,4194304] 0
2026-03-09T19:27:07.665 INFO:tasks.workunit.client.0.vm07.stdout:3/95: chown d1/d6/dd 21567 1
2026-03-09T19:27:07.687 INFO:tasks.workunit.client.0.vm07.stdout:5/84: creat d3/d1a/f1b x:0 0 0
2026-03-09T19:27:07.690 INFO:tasks.workunit.client.0.vm07.stdout:2/77: link d3/ff d3/f22 0
2026-03-09T19:27:07.690 INFO:tasks.workunit.client.1.vm08.stdout:4/368: write f9 [5702194,41038] 0
2026-03-09T19:27:07.691 INFO:tasks.workunit.client.1.vm08.stdout:4/369: readlink da/d10/d16/d28/d46/d52/d6e/l62 0
2026-03-09T19:27:07.692 INFO:tasks.workunit.client.0.vm07.stdout:5/85: dwrite d3/f18 [0,4194304] 0
2026-03-09T19:27:07.694 INFO:tasks.workunit.client.1.vm08.stdout:4/370: dread da/d10/d16/d28/d2f/d4f/f65 [0,4194304] 0
2026-03-09T19:27:07.703 INFO:tasks.workunit.client.0.vm07.stdout:7/98: mknod d0/c28 0
2026-03-09T19:27:07.704 INFO:tasks.workunit.client.1.vm08.stdout:6/384: rename d3/l31 to d3/d15/l92 0
2026-03-09T19:27:07.714 INFO:tasks.workunit.client.1.vm08.stdout:5/356: dwrite d16/d1e/f44 [0,4194304] 0
2026-03-09T19:27:07.714 INFO:tasks.workunit.client.1.vm08.stdout:7/431: write d5/d14/f59 [4180840,98513] 0
2026-03-09T19:27:07.715 INFO:tasks.workunit.client.1.vm08.stdout:7/432: dread - d5/d16/d3a/d42/f68 zero size
2026-03-09T19:27:07.720 INFO:tasks.workunit.client.1.vm08.stdout:2/348: creat d3/d4/d23/d5c/f81 x:0 0 0
2026-03-09T19:27:07.720 INFO:tasks.workunit.client.1.vm08.stdout:2/349: dread - d3/d9/d79/f6b zero size
2026-03-09T19:27:07.722 INFO:tasks.workunit.client.0.vm07.stdout:9/119: rmdir d0/db 39
2026-03-09T19:27:07.726 INFO:tasks.workunit.client.1.vm08.stdout:1/503: write d9/d40/f57 [267985,112839] 0
2026-03-09T19:27:07.726 INFO:tasks.workunit.client.1.vm08.stdout:1/504: dread - d9/da/d2c/f8a zero size
2026-03-09T19:27:07.729 INFO:tasks.workunit.client.1.vm08.stdout:8/396: truncate de/f1f 1803619 0
2026-03-09T19:27:07.731 INFO:tasks.workunit.client.1.vm08.stdout:8/397: dread de/d1d/f1e [0,4194304] 0
2026-03-09T19:27:07.732 INFO:tasks.workunit.client.0.vm07.stdout:6/79: symlink d0/d1/db/l1f 0
2026-03-09T19:27:07.732 INFO:tasks.workunit.client.0.vm07.stdout:3/96: mknod d1/d6/dd/c1c 0
2026-03-09T19:27:07.733 INFO:tasks.workunit.client.0.vm07.stdout:6/80: chown d0/d1/db 12692 1
2026-03-09T19:27:07.733 INFO:tasks.workunit.client.0.vm07.stdout:3/97: write d1/d6/fa [896067,113164] 0
2026-03-09T19:27:07.737 INFO:tasks.workunit.client.0.vm07.stdout:2/78: write f2 [4184667,50536] 0
2026-03-09T19:27:07.737 INFO:tasks.workunit.client.0.vm07.stdout:5/86: creat d3/d1a/f1c x:0 0 0
2026-03-09T19:27:07.738 INFO:tasks.workunit.client.0.vm07.stdout:3/98: dwrite d1/d6/fa [4194304,4194304] 0
2026-03-09T19:27:07.739 INFO:tasks.workunit.client.0.vm07.stdout:2/79: write d3/fa [2295763,13641] 0
2026-03-09T19:27:07.747 INFO:tasks.workunit.client.0.vm07.stdout:1/77: link d1/d11/l19 d1/d9/l1e 0
2026-03-09T19:27:07.748 INFO:tasks.workunit.client.0.vm07.stdout:1/78: chown d1/d3/c5 4 1
2026-03-09T19:27:07.748 INFO:tasks.workunit.client.0.vm07.stdout:1/79: write d1/d9/f15 [1006625,125995] 0
2026-03-09T19:27:07.749 INFO:tasks.workunit.client.1.vm08.stdout:5/357: fdatasync d16/d1e/d3b/f3c 0
2026-03-09T19:27:07.750 INFO:tasks.workunit.client.0.vm07.stdout:8/108: truncate f3 4049879 0
2026-03-09T19:27:07.753 INFO:tasks.workunit.client.1.vm08.stdout:2/350: chown d3/d4/d23/d2c/d39/d5e/l65 0 1
2026-03-09T19:27:07.753 INFO:tasks.workunit.client.1.vm08.stdout:2/351: fdatasync d3/d4/d23/d5c/f76 0
2026-03-09T19:27:07.756 INFO:tasks.workunit.client.1.vm08.stdout:1/505: write d9/da/dc/f68 [444314,115015] 0
2026-03-09T19:27:07.767 INFO:tasks.workunit.client.0.vm07.stdout:6/81: mknod d0/c20 0
2026-03-09T19:27:07.768 INFO:tasks.workunit.client.1.vm08.stdout:8/398: stat de/f1c 0
2026-03-09T19:27:07.772 INFO:tasks.workunit.client.0.vm07.stdout:5/87: rename l0 to d3/dd/l1d 0
2026-03-09T19:27:07.777 INFO:tasks.workunit.client.1.vm08.stdout:4/371: creat da/d10/d16/d28/d2f/d4f/d64/f6f x:0 0 0
2026-03-09T19:27:07.777 INFO:tasks.workunit.client.1.vm08.stdout:4/372: readlink da/l47 0
2026-03-09T19:27:07.777 INFO:tasks.workunit.client.1.vm08.stdout:4/373: readlink da/d10/d16/d28/d46/d52/d6e/l61 0
2026-03-09T19:27:07.777 INFO:tasks.workunit.client.1.vm08.stdout:4/374: chown da/d10/f6b 2587 1
2026-03-09T19:27:07.777 INFO:tasks.workunit.client.0.vm07.stdout:2/80: creat d3/d11/f23 x:0 0 0
2026-03-09T19:27:07.778 INFO:tasks.workunit.client.0.vm07.stdout:2/81: dread - d3/f5 zero size
2026-03-09T19:27:07.780 INFO:tasks.workunit.client.0.vm07.stdout:7/99: symlink d0/l29 0
2026-03-09T19:27:07.782 INFO:tasks.workunit.client.0.vm07.stdout:7/100: write d0/d4/f12 [3791770,73662] 0
2026-03-09T19:27:07.788 INFO:tasks.workunit.client.0.vm07.stdout:8/109: mknod d7/d16/c2d 0
2026-03-09T19:27:07.792 INFO:tasks.workunit.client.1.vm08.stdout:2/352: mknod d3/d9/d4a/c82 0
2026-03-09T19:27:07.795 INFO:tasks.workunit.client.0.vm07.stdout:1/80: rename d1/d9/f1c to d1/db/f1f 0
2026-03-09T19:27:07.796 INFO:tasks.workunit.client.0.vm07.stdout:5/88: mknod d3/d1a/c1e 0
2026-03-09T19:27:07.797 INFO:tasks.workunit.client.1.vm08.stdout:3/416: creat d0/f7a x:0 0 0
2026-03-09T19:27:07.800 INFO:tasks.workunit.client.0.vm07.stdout:2/82: creat d3/dd/f24 x:0 0 0
2026-03-09T19:27:07.800 INFO:tasks.workunit.client.1.vm08.stdout:9/363: write d0/d2/d8/f29 [238319,99245] 0
2026-03-09T19:27:07.805 INFO:tasks.workunit.client.1.vm08.stdout:4/375: creat da/d10/d16/d28/d46/d52/d6e/d40/f70 x:0 0 0
2026-03-09T19:27:07.806 INFO:tasks.workunit.client.0.vm07.stdout:7/101: mkdir d0/d4/d5/d8/d1a/d2a 0
2026-03-09T19:27:07.809 INFO:tasks.workunit.client.0.vm07.stdout:0/81: write d0/fa [1662655,112679] 0
2026-03-09T19:27:07.810 INFO:tasks.workunit.client.0.vm07.stdout:8/110: creat d7/f2e x:0 0 0
2026-03-09T19:27:07.812 INFO:tasks.workunit.client.0.vm07.stdout:8/111: readlink d7/d9/d10/l26 0
2026-03-09T19:27:07.812 INFO:tasks.workunit.client.1.vm08.stdout:0/387: link dd/d22/d27/f3f dd/f7a 0
2026-03-09T19:27:07.813 INFO:tasks.workunit.client.0.vm07.stdout:4/74: getdents d3 0
2026-03-09T19:27:07.814 INFO:tasks.workunit.client.0.vm07.stdout:8/112: dread f2 [0,4194304] 0
2026-03-09T19:27:07.814 INFO:tasks.workunit.client.0.vm07.stdout:6/82: symlink d0/d1/db/d1d/l21 0
2026-03-09T19:27:07.814 INFO:tasks.workunit.client.0.vm07.stdout:6/83: dread - d0/f9 zero size
2026-03-09T19:27:07.815 INFO:tasks.workunit.client.1.vm08.stdout:7/433: rename d5/d16/d3a/d42/d85/c69 to d5/c99 0
2026-03-09T19:27:07.815 INFO:tasks.workunit.client.0.vm07.stdout:4/75: write d3/fc [816507,40815] 0
2026-03-09T19:27:07.815 INFO:tasks.workunit.client.1.vm08.stdout:7/434: fdatasync d5/d16/d3a/d42/d6a/f7d 0
2026-03-09T19:27:07.817 INFO:tasks.workunit.client.0.vm07.stdout:0/82: dwrite d0/d6/f16 [0,4194304] 0
2026-03-09T19:27:07.820 INFO:tasks.workunit.client.1.vm08.stdout:2/353: rmdir d3/d4/d23/d2c/d39 39
2026-03-09T19:27:07.827 INFO:tasks.workunit.client.0.vm07.stdout:4/76: dread d3/fc [0,4194304] 0
2026-03-09T19:27:07.831 INFO:tasks.workunit.client.1.vm08.stdout:3/417: creat d0/d6/de/d1b/d16/f7b x:0 0 0
2026-03-09T19:27:07.831 INFO:tasks.workunit.client.0.vm07.stdout:4/77: dread - d3/f13 zero size
2026-03-09T19:27:07.832 INFO:tasks.workunit.client.0.vm07.stdout:9/120: dwrite d0/f3 [0,4194304] 0
2026-03-09T19:27:07.833 INFO:tasks.workunit.client.0.vm07.stdout:4/78: dwrite d3/fa [0,4194304] 0
2026-03-09T19:27:07.833 INFO:tasks.workunit.client.1.vm08.stdout:9/364: creat d0/d2/d8/d7/d48/d6f/f79 x:0 0 0
2026-03-09T19:27:07.835 INFO:tasks.workunit.client.1.vm08.stdout:9/365: chown d0/d2/d14/f19 7414 1
2026-03-09T19:27:07.851 INFO:tasks.workunit.client.1.vm08.stdout:4/376: unlink da/c24 0
2026-03-09T19:27:07.855 INFO:tasks.workunit.client.0.vm07.stdout:7/102: symlink d0/d4/d5/d8/l2b 0
2026-03-09T19:27:07.855 INFO:tasks.workunit.client.1.vm08.stdout:6/385: getdents d3/db 0
2026-03-09T19:27:07.855 INFO:tasks.workunit.client.1.vm08.stdout:0/388: truncate dd/d22/d27/f3d 2710214 0
2026-03-09T19:27:07.856 INFO:tasks.workunit.client.1.vm08.stdout:8/399: rename de/f20 to de/d1d/d69/f8f 0
2026-03-09T19:27:07.858 INFO:tasks.workunit.client.0.vm07.stdout:3/99: link d1/d10/c17 d1/d10/d16/c1d 0
2026-03-09T19:27:07.863 INFO:tasks.workunit.client.1.vm08.stdout:3/418: mkdir d0/d52/d7c 0
2026-03-09T19:27:07.865 INFO:tasks.workunit.client.0.vm07.stdout:0/83: dwrite d0/d7/d11/f15 [0,4194304] 0
2026-03-09T19:27:07.866 INFO:tasks.workunit.client.0.vm07.stdout:3/100: dwrite d1/d6/f9 [0,4194304] 0
2026-03-09T19:27:07.868 INFO:tasks.workunit.client.0.vm07.stdout:3/101: read d1/fe [3023249,130050] 0
2026-03-09T19:27:07.869 INFO:tasks.workunit.client.0.vm07.stdout:1/81: truncate d1/d3/f4 2115933 0
2026-03-09T19:27:07.878 INFO:tasks.workunit.client.0.vm07.stdout:9/121: mknod d0/d17/c26 0
2026-03-09T19:27:07.879 INFO:tasks.workunit.client.1.vm08.stdout:4/377: dread f1 [0,4194304] 0
2026-03-09T19:27:07.885 INFO:tasks.workunit.client.0.vm07.stdout:7/103: unlink d0/d4/d5/d8/lb 0
2026-03-09T19:27:07.888 INFO:tasks.workunit.client.1.vm08.stdout:2/354: creat d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f83 x:0 0 0
2026-03-09T19:27:07.890 INFO:tasks.workunit.client.1.vm08.stdout:1/506: getdents d9/da/d17 0
2026-03-09T19:27:07.890 INFO:tasks.workunit.client.1.vm08.stdout:1/507: write d9/da/dc/f10 [6470312,67450] 0
2026-03-09T19:27:07.896 INFO:tasks.workunit.client.1.vm08.stdout:9/366: creat d0/d2/d14/d5c/d32/d57/d69/f7a x:0 0 0
2026-03-09T19:27:07.896 INFO:tasks.workunit.client.1.vm08.stdout:3/419: chown d0/d52/d6d/d77/c3b 0 1
2026-03-09T19:27:07.896 INFO:tasks.workunit.client.1.vm08.stdout:3/420: dread d0/d6/d25/f56 [0,4194304] 0
2026-03-09T19:27:07.896 INFO:tasks.workunit.client.1.vm08.stdout:3/421: read d0/d6/f39 [1311905,13271] 0
2026-03-09T19:27:07.900 INFO:tasks.workunit.client.1.vm08.stdout:0/389: mkdir dd/d22/d7b 0
2026-03-09T19:27:07.901 INFO:tasks.workunit.client.1.vm08.stdout:0/390: chown dd/d22/d63/d6e/d72 9205 1
2026-03-09T19:27:07.904 INFO:tasks.workunit.client.1.vm08.stdout:3/422: dread d0/d6/f57 [0,4194304] 0
2026-03-09T19:27:07.909 INFO:tasks.workunit.client.0.vm07.stdout:5/89: fdatasync f2 0
2026-03-09T19:27:07.913 INFO:tasks.workunit.client.1.vm08.stdout:1/508: chown d9/da/dc/c7f 45469 1
2026-03-09T19:27:07.915 INFO:tasks.workunit.client.0.vm07.stdout:9/122: symlink d0/d17/l27 0
2026-03-09T19:27:07.916 INFO:tasks.workunit.client.1.vm08.stdout:9/367: symlink d0/l7b 0
2026-03-09T19:27:07.925 INFO:tasks.workunit.client.1.vm08.stdout:5/358: truncate d16/d1e/f2e 2959460 0
2026-03-09T19:27:07.929 INFO:tasks.workunit.client.0.vm07.stdout:4/79: write d3/f5 [1004215,25453] 0
2026-03-09T19:27:07.930 INFO:tasks.workunit.client.1.vm08.stdout:6/386: write d3/d15/f40 [1506275,56632] 0
2026-03-09T19:27:07.933 INFO:tasks.workunit.client.0.vm07.stdout:4/80: dwrite d3/fa [0,4194304] 0
2026-03-09T19:27:07.938 INFO:tasks.workunit.client.0.vm07.stdout:4/81: dread d3/f7 [0,4194304] 0
2026-03-09T19:27:07.945 INFO:tasks.workunit.client.0.vm07.stdout:7/104: chown d0/d4/d5/d8/d1a/l21 205426097 1
2026-03-09T19:27:07.949 INFO:tasks.workunit.client.1.vm08.stdout:4/378: creat da/d10/d16/d28/d46/d52/d6e/d40/d6c/f71 x:0 0 0
2026-03-09T19:27:07.950 INFO:tasks.workunit.client.1.vm08.stdout:4/379: write da/d10/f6b [886884,753] 0
2026-03-09T19:27:07.952 INFO:tasks.workunit.client.0.vm07.stdout:8/113: link d7/d9/l1a d7/d1d/l2f 0
2026-03-09T19:27:07.953 INFO:tasks.workunit.client.1.vm08.stdout:0/391: stat dd/d22/d27/d2e/l5b 0
2026-03-09T19:27:07.956 INFO:tasks.workunit.client.0.vm07.stdout:8/114: dwrite d7/f2e [0,4194304] 0
2026-03-09T19:27:07.962 INFO:tasks.workunit.client.1.vm08.stdout:3/423: chown d0/d6/de/d15/l48 35436881 1
2026-03-09T19:27:07.975 INFO:tasks.workunit.client.0.vm07.stdout:3/102: link d1/f7 d1/d10/d16/f1e 0
2026-03-09T19:27:07.975 INFO:tasks.workunit.client.0.vm07.stdout:3/103: stat d1/d6/dd/f11 0
2026-03-09T19:27:07.975 INFO:tasks.workunit.client.1.vm08.stdout:7/435: getdents d5/d16/d1c 0
2026-03-09T19:27:07.977 INFO:tasks.workunit.client.1.vm08.stdout:7/436: write d5/d16/d3a/d42/f68 [17030,81624] 0
2026-03-09T19:27:07.981 INFO:tasks.workunit.client.1.vm08.stdout:1/509: creat d9/d11/f9b x:0 0 0
2026-03-09T19:27:07.981 INFO:tasks.workunit.client.0.vm07.stdout:3/104: dwrite d1/d6/fa [4194304,4194304] 0
2026-03-09T19:27:07.981 INFO:tasks.workunit.client.1.vm08.stdout:1/510: readlink d9/da/dc/l83 0
2026-03-09T19:27:07.989 INFO:tasks.workunit.client.0.vm07.stdout:5/90: creat d3/d1a/f1f x:0 0 0
2026-03-09T19:27:07.989 INFO:tasks.workunit.client.0.vm07.stdout:2/83: getdents d3/dd/d16 0
2026-03-09T19:27:07.990 INFO:tasks.workunit.client.0.vm07.stdout:2/84: chown d3 5806 1
2026-03-09T19:27:07.990 INFO:tasks.workunit.client.0.vm07.stdout:2/85: read - d3/dd/f1e zero size
2026-03-09T19:27:07.993 INFO:tasks.workunit.client.0.vm07.stdout:3/105: dwrite d1/d10/f1a [4194304,4194304] 0
2026-03-09T19:27:07.995 INFO:tasks.workunit.client.1.vm08.stdout:6/387: truncate d3/f25 1676446 0
2026-03-09T19:27:07.997 INFO:tasks.workunit.client.0.vm07.stdout:5/91: dwrite d3/dd/f16 [0,4194304] 0
2026-03-09T19:27:08.006 INFO:tasks.workunit.client.0.vm07.stdout:4/82: mkdir d3/d11/d16 0
2026-03-09T19:27:08.010 INFO:tasks.workunit.client.0.vm07.stdout:4/83: write d3/f13 [996995,10970] 0
2026-03-09T19:27:08.012 INFO:tasks.workunit.client.1.vm08.stdout:0/392: dread dd/d22/d24/d49/f4c [0,4194304] 0
2026-03-09T19:27:08.015 INFO:tasks.workunit.client.1.vm08.stdout:7/437: creat d5/d16/d3a/d42/f9a x:0 0 0
2026-03-09T19:27:08.030 INFO:tasks.workunit.client.0.vm07.stdout:8/115: dwrite d7/f19 [0,4194304] 0
2026-03-09T19:27:08.030 INFO:tasks.workunit.client.1.vm08.stdout:7/438: chown d5/d16/d3a/d42/d6a 912 1
2026-03-09T19:27:08.030 INFO:tasks.workunit.client.1.vm08.stdout:2/355: rename d3/d4/d23/f27 to d3/d9/f84 0
2026-03-09T19:27:08.030 INFO:tasks.workunit.client.1.vm08.stdout:1/511: truncate d9/da/f8e 125453 0
2026-03-09T19:27:08.030 INFO:tasks.workunit.client.1.vm08.stdout:1/512: dwrite d9/da/d53/d67/d6c/d76/f99 [0,4194304] 0
2026-03-09T19:27:08.033 INFO:tasks.workunit.client.0.vm07.stdout:2/86: dread d3/fc [0,4194304] 0
2026-03-09T19:27:08.040 INFO:tasks.workunit.client.1.vm08.stdout:7/439: dread d5/d16/f28 [0,4194304] 0
2026-03-09T19:27:08.041 INFO:tasks.workunit.client.1.vm08.stdout:5/359: symlink d16/d1e/l73 0
2026-03-09T19:27:08.043 INFO:tasks.workunit.client.0.vm07.stdout:9/123: rename d0/c12 to d0/db/c28 0
2026-03-09T19:27:08.046 INFO:tasks.workunit.client.0.vm07.stdout:6/84: creat d0/d1/db/d1d/f22 x:0 0 0
2026-03-09T19:27:08.057 INFO:tasks.workunit.client.0.vm07.stdout:5/92: mknod d3/d1a/c20 0
2026-03-09T19:27:08.057 INFO:tasks.workunit.client.0.vm07.stdout:7/105: mkdir d0/d2c 0
2026-03-09T19:27:08.057 INFO:tasks.workunit.client.0.vm07.stdout:7/106: dread d0/d4/d5/dd/f16 [0,4194304] 0
2026-03-09T19:27:08.058 INFO:tasks.workunit.client.0.vm07.stdout:7/107: fsync d0/d4/d5/dd/f1f 0
2026-03-09T19:27:08.058 INFO:tasks.workunit.client.0.vm07.stdout:4/84: symlink d3/d11/l17 0
2026-03-09T19:27:08.058 INFO:tasks.workunit.client.0.vm07.stdout:7/108: write d0/f25 [238570,56446] 0
2026-03-09T19:27:08.058 INFO:tasks.workunit.client.0.vm07.stdout:7/109: read d0/d4/d5/d8/fa [235749,18733] 0
2026-03-09T19:27:08.058 INFO:tasks.workunit.client.0.vm07.stdout:7/110: write d0/d4/d5/d8/d1a/f1d [301576,22271] 0
2026-03-09T19:27:08.059 INFO:tasks.workunit.client.1.vm08.stdout:2/356: symlink d3/d4/d23/d2c/d39/d5e/d14/l85 0
2026-03-09T19:27:08.062 INFO:tasks.workunit.client.1.vm08.stdout:2/357: dread d3/d4/d23/d2c/d39/d5e/de/d18/f2d [0,4194304] 0
2026-03-09T19:27:08.067 INFO:tasks.workunit.client.1.vm08.stdout:3/424: sync
2026-03-09T19:27:08.073 INFO:tasks.workunit.client.1.vm08.stdout:9/368: creat d0/d1b/f7c x:0 0 0
2026-03-09T19:27:08.074 INFO:tasks.workunit.client.0.vm07.stdout:8/116: rmdir d7/d9 39
2026-03-09T19:27:08.076 INFO:tasks.workunit.client.0.vm07.stdout:0/84: getdents d0 0
2026-03-09T19:27:08.078 INFO:tasks.workunit.client.1.vm08.stdout:1/513: rename d9/d11/f3c to d9/da/d2c/d6a/f9c 0
2026-03-09T19:27:08.078 INFO:tasks.workunit.client.1.vm08.stdout:0/393: rename dd/d22/d27 to dd/d22/d27/d65/d7c 22
2026-03-09T19:27:08.082 INFO:tasks.workunit.client.1.vm08.stdout:0/394: read dd/d22/d24/f26 [2570272,80579] 0
2026-03-09T19:27:08.086 INFO:tasks.workunit.client.0.vm07.stdout:9/124: mkdir d0/db/d29 0
2026-03-09T19:27:08.086 INFO:tasks.workunit.client.1.vm08.stdout:0/395: dwrite dd/d22/d24/f60 [0,4194304] 0
2026-03-09T19:27:08.087 INFO:tasks.workunit.client.0.vm07.stdout:6/85: mknod d0/d1/db/d1d/c23 0
2026-03-09T19:27:08.095 INFO:tasks.workunit.client.1.vm08.stdout:0/396: dwrite dd/d22/f28 [0,4194304] 0
2026-03-09T19:27:08.095 INFO:tasks.workunit.client.0.vm07.stdout:5/93: rename d3/d1a/f1f to d3/d1a/f21 0
2026-03-09T19:27:08.099 INFO:tasks.workunit.client.0.vm07.stdout:4/85: rename d3/f5 to d3/d11/f18 0
2026-03-09T19:27:08.101 INFO:tasks.workunit.client.1.vm08.stdout:0/397: dwrite dd/d22/d24/f77 [0,4194304] 0
2026-03-09T19:27:08.106 INFO:tasks.workunit.client.0.vm07.stdout:9/125: symlink d0/d6/l2a 0
2026-03-09T19:27:08.109 INFO:tasks.workunit.client.0.vm07.stdout:8/117: dread d7/d9/d10/f1b [0,4194304] 0
2026-03-09T19:27:08.109 INFO:tasks.workunit.client.0.vm07.stdout:8/118: stat d7/d9/lb 0
2026-03-09T19:27:08.113 INFO:tasks.workunit.client.0.vm07.stdout:6/86: mkdir d0/d1/db/d24 0
2026-03-09T19:27:08.117 INFO:tasks.workunit.client.1.vm08.stdout:1/514: symlink d9/da/d53/d67/d6c/d76/l9d 0
2026-03-09T19:27:08.121 INFO:tasks.workunit.client.1.vm08.stdout:1/515: chown d9/da/d53/d67/f79 306330 1
2026-03-09T19:27:08.122 INFO:tasks.workunit.client.1.vm08.stdout:1/516: dwrite d9/da/d12/d39/f47 [0,4194304] 0
2026-03-09T19:27:08.123 INFO:tasks.workunit.client.0.vm07.stdout:7/111: sync
2026-03-09T19:27:08.127 INFO:tasks.workunit.client.1.vm08.stdout:1/517: dwrite d9/da/f8e [0,4194304] 0
2026-03-09T19:27:08.135 INFO:tasks.workunit.client.1.vm08.stdout:1/518: dread - d9/d11/f73 zero size
2026-03-09T19:27:08.135 INFO:tasks.workunit.client.0.vm07.stdout:5/94: sync
2026-03-09T19:27:08.137 INFO:tasks.workunit.client.0.vm07.stdout:0/85: rename d0/d7 to d0/d6/d13/d1c 0
2026-03-09T19:27:08.140 INFO:tasks.workunit.client.0.vm07.stdout:4/86: mknod d3/d11/c19 0
2026-03-09T19:27:08.140 INFO:tasks.workunit.client.1.vm08.stdout:5/360: symlink d16/d1e/l74 0
2026-03-09T19:27:08.149 INFO:tasks.workunit.client.1.vm08.stdout:0/398: rmdir dd/d22/d63 39
2026-03-09T19:27:08.154 INFO:tasks.workunit.client.0.vm07.stdout:5/95: creat d3/dd/f22 x:0 0 0
2026-03-09T19:27:08.155 INFO:tasks.workunit.client.0.vm07.stdout:5/96: truncate d3/f19 691588 0
2026-03-09T19:27:08.161 INFO:tasks.workunit.client.0.vm07.stdout:2/87: getdents d3/dd 0
2026-03-09T19:27:08.162 INFO:tasks.workunit.client.1.vm08.stdout:8/400: link de/d1d/d21/c40 de/d47/d85/c90 0
2026-03-09T19:27:08.162 INFO:tasks.workunit.client.0.vm07.stdout:9/126: mknod d0/db/d29/c2b 0
2026-03-09T19:27:08.175 INFO:tasks.workunit.client.1.vm08.stdout:6/388: getdents d3/d34/d6f 0
2026-03-09T19:27:08.178 INFO:tasks.workunit.client.0.vm07.stdout:6/87: rename d0/d1/l11 to d0/d1/db/d24/l25 0
2026-03-09T19:27:08.188 INFO:tasks.workunit.client.1.vm08.stdout:6/389: dread d3/f3e [0,4194304] 0
2026-03-09T19:27:08.188 INFO:tasks.workunit.client.0.vm07.stdout:5/97: rename d3/dd/f14 to d3/dd/f23 0
2026-03-09T19:27:08.188 INFO:tasks.workunit.client.0.vm07.stdout:2/88: rmdir d3/dd/d16 39
2026-03-09T19:27:08.188 INFO:tasks.workunit.client.0.vm07.stdout:2/89: truncate d3/dd/fe 427627 0
2026-03-09T19:27:08.188 INFO:tasks.workunit.client.0.vm07.stdout:9/127: mkdir d0/db/d29/d2c 0
2026-03-09T19:27:08.188 INFO:tasks.workunit.client.0.vm07.stdout:7/112: rename d0/d4/d5/d8/d1a/l21 to d0/d4/d5/d26/l2d 0
2026-03-09T19:27:08.188 INFO:tasks.workunit.client.0.vm07.stdout:6/88: creat d0/d13/f26 x:0 0 0
2026-03-09T19:27:08.188 INFO:tasks.workunit.client.0.vm07.stdout:6/89: write d0/ff [4665571,129479] 0
2026-03-09T19:27:08.188 INFO:tasks.workunit.client.0.vm07.stdout:6/90: write d0/f9 [988493,107695] 0
2026-03-09T19:27:08.201 INFO:tasks.workunit.client.1.vm08.stdout:0/399: rename dd/d22/d27/l67 to dd/d22/d63/l7d 0
2026-03-09T19:27:08.202 INFO:tasks.workunit.client.1.vm08.stdout:1/519: getdents d9/da/d2c 0
2026-03-09T19:27:08.205 INFO:tasks.workunit.client.0.vm07.stdout:8/119: getdents d7/d16 0
2026-03-09T19:27:08.205 INFO:tasks.workunit.client.1.vm08.stdout:0/400: mkdir dd/d7e 0
2026-03-09T19:27:08.206 INFO:tasks.workunit.client.0.vm07.stdout:2/90: dwrite d3/d11/f1f [0,4194304] 0
2026-03-09T19:27:08.210 INFO:tasks.workunit.client.1.vm08.stdout:0/401: dwrite dd/d22/d24/d49/f5f [0,4194304] 0
2026-03-09T19:27:08.215 INFO:tasks.workunit.client.1.vm08.stdout:0/402: dread dd/d22/f3e [0,4194304] 0
2026-03-09T19:27:08.222 INFO:tasks.workunit.client.0.vm07.stdout:7/113: mkdir d0/d4/d5/d8/d1a/d2a/d2e 0
2026-03-09T19:27:08.224 INFO:tasks.workunit.client.1.vm08.stdout:0/403: rmdir dd/d22/d63 39
2026-03-09T19:27:08.225 INFO:tasks.workunit.client.1.vm08.stdout:6/390: sync
2026-03-09T19:27:08.225 INFO:tasks.workunit.client.1.vm08.stdout:0/404: write dd/d31/f54 [520825,9457] 0
2026-03-09T19:27:08.226 INFO:tasks.workunit.client.1.vm08.stdout:6/391: truncate d3/db/d43/f71 258039 0
2026-03-09T19:27:08.227 INFO:tasks.workunit.client.0.vm07.stdout:7/114: dwrite d0/f13 [0,4194304] 0
2026-03-09T19:27:08.241 INFO:tasks.workunit.client.0.vm07.stdout:7/115: write d0/d4/d5/d8/f15 [4604588,57427] 0
2026-03-09T19:27:08.242 INFO:tasks.workunit.client.0.vm07.stdout:9/128: rename d0/d6/le to d0/db/d29/l2d 0
2026-03-09T19:27:08.242 INFO:tasks.workunit.client.0.vm07.stdout:8/120: mkdir d7/d30 0
2026-03-09T19:27:08.242 INFO:tasks.workunit.client.0.vm07.stdout:2/91: creat d3/dd/d16/f25 x:0 0 0
2026-03-09T19:27:08.242 INFO:tasks.workunit.client.0.vm07.stdout:2/92: dread - d3/dd/d16/f1c zero size
2026-03-09T19:27:08.252 INFO:tasks.workunit.client.0.vm07.stdout:8/121: mknod d7/d9/c31 0
2026-03-09T19:27:08.253 INFO:tasks.workunit.client.0.vm07.stdout:2/93: symlink d3/l26 0
2026-03-09T19:27:08.254 INFO:tasks.workunit.client.0.vm07.stdout:2/94: truncate d3/dd/f1e 416555 0
2026-03-09T19:27:08.254 INFO:tasks.workunit.client.0.vm07.stdout:7/116: mknod d0/c2f 0
2026-03-09T19:27:08.255 INFO:tasks.workunit.client.0.vm07.stdout:7/117: read d0/d4/f12 [1896910,124470] 0
2026-03-09T19:27:08.256 INFO:tasks.workunit.client.0.vm07.stdout:9/129: symlink d0/l2e 0
2026-03-09T19:27:08.259 INFO:tasks.workunit.client.0.vm07.stdout:7/118: dwrite d0/d4/d5/d8/f1c [0,4194304] 0
2026-03-09T19:27:08.260 INFO:tasks.workunit.client.0.vm07.stdout:9/130: creat d0/db/d29/f2f x:0 0 0
2026-03-09T19:27:08.262 INFO:tasks.workunit.client.0.vm07.stdout:8/122: mkdir d7/d30/d32 0
2026-03-09T19:27:08.268 INFO:tasks.workunit.client.0.vm07.stdout:8/123: rename f2 to d7/d16/d1e/f33 0
2026-03-09T19:27:08.270 INFO:tasks.workunit.client.0.vm07.stdout:7/119: symlink d0/d2c/l30 0
2026-03-09T19:27:08.271 INFO:tasks.workunit.client.0.vm07.stdout:9/131: creat d0/db/d29/d2c/f30 x:0 0 0
2026-03-09T19:27:08.271 INFO:tasks.workunit.client.0.vm07.stdout:7/120: chown d0/d4/d5/d8/f1c 18060 1
2026-03-09T19:27:08.272 INFO:tasks.workunit.client.0.vm07.stdout:9/132: symlink d0/d6/l31 0
2026-03-09T19:27:08.272 INFO:tasks.workunit.client.0.vm07.stdout:8/124: mkdir d7/d9/d10/d22/d34 0
2026-03-09T19:27:08.272 INFO:tasks.workunit.client.0.vm07.stdout:8/125: chown d7 157918 1
2026-03-09T19:27:08.276 INFO:tasks.workunit.client.0.vm07.stdout:8/126: symlink d7/d9/l35 0
2026-03-09T19:27:08.278 INFO:tasks.workunit.client.0.vm07.stdout:8/127: unlink d7/d1d/l2f 0
2026-03-09T19:27:08.284 INFO:tasks.workunit.client.0.vm07.stdout:2/95: dread d3/f4 [0,4194304] 0
2026-03-09T19:27:08.284 INFO:tasks.workunit.client.0.vm07.stdout:2/96: stat d3/fa 0
2026-03-09T19:27:08.284 INFO:tasks.workunit.client.0.vm07.stdout:7/121: sync
2026-03-09T19:27:08.284 INFO:tasks.workunit.client.0.vm07.stdout:2/97: write d3/f4 [4221102,1746] 0
2026-03-09T19:27:08.287 INFO:tasks.workunit.client.0.vm07.stdout:8/128: dwrite d7/d9/d10/f20 [0,4194304] 0
2026-03-09T19:27:08.288 INFO:tasks.workunit.client.0.vm07.stdout:7/122: truncate d0/d4/d5/dd/f1f 482590 0
2026-03-09T19:27:08.288 INFO:tasks.workunit.client.0.vm07.stdout:8/129: chown d7/d9/c31 123261 1
2026-03-09T19:27:08.294 INFO:tasks.workunit.client.0.vm07.stdout:7/123: creat d0/d4/d5/d26/f31 x:0 0 0
2026-03-09T19:27:08.296 INFO:tasks.workunit.client.0.vm07.stdout:8/130: rename d7/d16/d1e/f27 to d7/d9/f36 0
2026-03-09T19:27:08.300 INFO:tasks.workunit.client.0.vm07.stdout:7/124: chown d0/d4/d5/d8/d1a/l1e 13 1
2026-03-09T19:27:08.301 INFO:tasks.workunit.client.0.vm07.stdout:2/98: dread d3/f4 [0,4194304] 0
2026-03-09T19:27:08.306 INFO:tasks.workunit.client.0.vm07.stdout:2/99: write d3/dd/f1d [895112,41977] 0
2026-03-09T19:27:08.311 INFO:tasks.workunit.client.0.vm07.stdout:2/100: dwrite d3/dd/f24 [0,4194304] 0
2026-03-09T19:27:08.317 INFO:tasks.workunit.client.0.vm07.stdout:2/101: creat d3/f27 x:0 0 0
2026-03-09T19:27:08.317 INFO:tasks.workunit.client.0.vm07.stdout:2/102: chown d3/dd/d16 16019880 1
2026-03-09T19:27:08.324 INFO:tasks.workunit.client.0.vm07.stdout:2/103: dwrite f2 [0,4194304] 0
2026-03-09T19:27:08.326 INFO:tasks.workunit.client.0.vm07.stdout:2/104: write d3/fc [1507633,20679] 0
2026-03-09T19:27:08.328 INFO:tasks.workunit.client.0.vm07.stdout:2/105: chown d3/dd/fe 2 1
2026-03-09T19:27:08.329 INFO:tasks.workunit.client.0.vm07.stdout:2/106: stat d3/fc 0
2026-03-09T19:27:08.331 INFO:tasks.workunit.client.0.vm07.stdout:2/107: symlink d3/dd/l28 0
2026-03-09T19:27:08.335 INFO:tasks.workunit.client.1.vm08.stdout:5/361: fdatasync d16/d1e/f2e 0
2026-03-09T19:27:08.335 INFO:tasks.workunit.client.0.vm07.stdout:3/106: getdents d1/d10/d16 0
2026-03-09T19:27:08.336 INFO:tasks.workunit.client.1.vm08.stdout:5/362: dread - d16/d1e/d30/f70 zero size
2026-03-09T19:27:08.339 INFO:tasks.workunit.client.0.vm07.stdout:1/82: truncate d1/f6 2770902 0
2026-03-09T19:27:08.342 INFO:tasks.workunit.client.0.vm07.stdout:2/108: dwrite d3/f1a [0,4194304] 0
2026-03-09T19:27:08.345 INFO:tasks.workunit.client.1.vm08.stdout:4/380: dread da/d10/f6b [0,4194304] 0
2026-03-09T19:27:08.351 INFO:tasks.workunit.client.1.vm08.stdout:5/363: dread d16/d1e/f35 [0,4194304] 0
2026-03-09T19:27:08.352 INFO:tasks.workunit.client.0.vm07.stdout:3/107: getdents d1/d6 0
2026-03-09T19:27:08.352 INFO:tasks.workunit.client.0.vm07.stdout:3/108: chown d1/d6/dd/c1c 8 1
2026-03-09T19:27:08.352 INFO:tasks.workunit.client.0.vm07.stdout:3/109: rename d1/d10 to d1/d1f 0
2026-03-09T19:27:08.352 INFO:tasks.workunit.client.0.vm07.stdout:2/109: mkdir d3/dd/d16/d29 0
2026-03-09T19:27:08.352 INFO:tasks.workunit.client.0.vm07.stdout:3/110: dread - d1/d6/f1b zero size
2026-03-09T19:27:08.352 INFO:tasks.workunit.client.0.vm07.stdout:2/110: dread - d3/dd/d16/f1c zero size
2026-03-09T19:27:08.352 INFO:tasks.workunit.client.1.vm08.stdout:4/381: symlink da/d10/d26/d3a/d49/l72 0
2026-03-09T19:27:08.357 INFO:tasks.workunit.client.1.vm08.stdout:4/382: dwrite da/d10/d26/d27/d32/f39 [0,4194304] 0
2026-03-09T19:27:08.358 INFO:tasks.workunit.client.0.vm07.stdout:3/111: creat d1/f20 x:0 0 0
2026-03-09T19:27:08.358 INFO:tasks.workunit.client.0.vm07.stdout:2/111: mknod d3/dd/d16/d29/c2a 0
2026-03-09T19:27:08.359 INFO:tasks.workunit.client.0.vm07.stdout:3/112: rmdir d1/d1f/d16 39
2026-03-09T19:27:08.360 INFO:tasks.workunit.client.0.vm07.stdout:2/112: write d3/dd/f1e [1017212,71215] 0
2026-03-09T19:27:08.360 INFO:tasks.workunit.client.0.vm07.stdout:2/113: fdatasync d3/f7 0
2026-03-09T19:27:08.361 INFO:tasks.workunit.client.0.vm07.stdout:2/114: chown d3/dd/f1e 70 1
2026-03-09T19:27:08.363 INFO:tasks.workunit.client.0.vm07.stdout:2/115: write d3/d11/f19 [291411,32979] 0
2026-03-09T19:27:08.367 INFO:tasks.workunit.client.0.vm07.stdout:4/87: fdatasync d3/f13 0
2026-03-09T19:27:08.369 INFO:tasks.workunit.client.0.vm07.stdout:3/113: creat d1/d6/f21 x:0 0 0
2026-03-09T19:27:08.369 INFO:tasks.workunit.client.0.vm07.stdout:3/114: chown d1/fe 97909481 1
2026-03-09T19:27:08.371 INFO:tasks.workunit.client.1.vm08.stdout:5/364: rename d16/d1e/l40 to d16/d1e/d30/l75 0
2026-03-09T19:27:08.374 INFO:tasks.workunit.client.0.vm07.stdout:2/116: dwrite d3/d11/f23 [0,4194304] 0
2026-03-09T19:27:08.382 INFO:tasks.workunit.client.1.vm08.stdout:5/365: dwrite d16/d45/f6b [0,4194304] 0
2026-03-09T19:27:08.383 INFO:tasks.workunit.client.0.vm07.stdout:2/117: dread d3/d11/f18 [0,4194304] 0
2026-03-09T19:27:08.384 INFO:tasks.workunit.client.0.vm07.stdout:4/88: sync
2026-03-09T19:27:08.384 INFO:tasks.workunit.client.0.vm07.stdout:2/118: fdatasync d3/fa 0
2026-03-09T19:27:08.387 INFO:tasks.workunit.client.0.vm07.stdout:3/115: creat d1/f22 x:0 0 0
2026-03-09T19:27:08.396 INFO:tasks.workunit.client.0.vm07.stdout:4/89: readlink d3/ld 0
2026-03-09T19:27:08.407 INFO:tasks.workunit.client.1.vm08.stdout:5/366: getdents d16 0
2026-03-09T19:27:08.407 INFO:tasks.workunit.client.1.vm08.stdout:5/367: truncate d16/d45/f46 1097516 0
2026-03-09T19:27:08.407 INFO:tasks.workunit.client.1.vm08.stdout:2/358: write d3/f7 [3515939,101995] 0
2026-03-09T19:27:08.408 INFO:tasks.workunit.client.0.vm07.stdout:4/90: dread - d3/d11/f12 zero size
2026-03-09T19:27:08.408 INFO:tasks.workunit.client.0.vm07.stdout:4/91: chown d3/ce 80136777 1
2026-03-09T19:27:08.408 INFO:tasks.workunit.client.0.vm07.stdout:4/92: stat d3/d11/f12 0
2026-03-09T19:27:08.408 INFO:tasks.workunit.client.0.vm07.stdout:4/93: fsync d3/f13 0
2026-03-09T19:27:08.408 INFO:tasks.workunit.client.0.vm07.stdout:4/94: stat d3/ld 0
2026-03-09T19:27:08.408 INFO:tasks.workunit.client.0.vm07.stdout:9/133: unlink d0/db/c28 0
2026-03-09T19:27:08.408 INFO:tasks.workunit.client.0.vm07.stdout:2/119: mkdir d3/dd/d2b 0
2026-03-09T19:27:08.408 INFO:tasks.workunit.client.0.vm07.stdout:2/120: read d3/f7 [3008533,29481] 0
2026-03-09T19:27:08.409 INFO:tasks.workunit.client.1.vm08.stdout:9/369: dwrite d0/f44 [0,4194304] 0
2026-03-09T19:27:08.410 INFO:tasks.workunit.client.0.vm07.stdout:2/121: sync
2026-03-09T19:27:08.414 INFO:tasks.workunit.client.0.vm07.stdout:2/122: dread d3/fc [0,4194304] 0
2026-03-09T19:27:08.414 INFO:tasks.workunit.client.0.vm07.stdout:2/123: readlink d3/l26 0
2026-03-09T19:27:08.415 INFO:tasks.workunit.client.1.vm08.stdout:2/359: dwrite d3/d9/d4a/f59 [0,4194304] 0
2026-03-09T19:27:08.417 INFO:tasks.workunit.client.1.vm08.stdout:7/440: truncate d5/d14/d2b/f30 1067536 0
2026-03-09T19:27:08.423 INFO:tasks.workunit.client.1.vm08.stdout:2/360: chown d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f3a 239947 1
2026-03-09T19:27:08.424 INFO:tasks.workunit.client.1.vm08.stdout:2/361: dread - d3/d4/d23/d2c/f80 zero size
2026-03-09T19:27:08.429 INFO:tasks.workunit.client.1.vm08.stdout:5/368: dread d16/d1e/d3b/f43 [0,4194304] 0
2026-03-09T19:27:08.430 INFO:tasks.workunit.client.1.vm08.stdout:5/369: chown d16/d1e/d30/l52 1314 1
2026-03-09T19:27:08.430 INFO:tasks.workunit.client.1.vm08.stdout:3/425: dwrite d0/f28 [0,4194304] 0
2026-03-09T19:27:08.431 INFO:tasks.workunit.client.0.vm07.stdout:0/86: rmdir d0/d6/d13/d1c 39
2026-03-09T19:27:08.433 INFO:tasks.workunit.client.1.vm08.stdout:5/370: chown d16/d1e/d3b/c58 1471 1
2026-03-09T19:27:08.433 INFO:tasks.workunit.client.1.vm08.stdout:3/426: chown d0/d52/d6d 190281004 1
2026-03-09T19:27:08.437 INFO:tasks.workunit.client.1.vm08.stdout:3/427: truncate d0/f7a 735862 0
2026-03-09T19:27:08.453 INFO:tasks.workunit.client.0.vm07.stdout:5/98: fsync d3/dd/f23 0
2026-03-09T19:27:08.455 INFO:tasks.workunit.client.0.vm07.stdout:9/134: mkdir d0/db/d29/d32 0
2026-03-09T19:27:08.455 INFO:tasks.workunit.client.0.vm07.stdout:9/135: fdatasync d0/d6/ff 0
2026-03-09T19:27:08.456 INFO:tasks.workunit.client.0.vm07.stdout:5/99: dread d3/f18 [0,4194304] 0
2026-03-09T19:27:08.457 INFO:tasks.workunit.client.0.vm07.stdout:5/100: fdatasync d3/dd/f16 0
2026-03-09T19:27:08.459 INFO:tasks.workunit.client.0.vm07.stdout:5/101: dread d3/d1a/f12 [0,4194304] 0
2026-03-09T19:27:08.464 INFO:tasks.workunit.client.1.vm08.stdout:9/370: dread d0/d2/d8/d7/f34 [0,4194304] 0
2026-03-09T19:27:08.469 INFO:tasks.workunit.client.1.vm08.stdout:7/441: dread d5/d16/f28 [0,4194304] 0
2026-03-09T19:27:08.472 INFO:tasks.workunit.client.1.vm08.stdout:2/362: fdatasync d3/d9/f84 0
2026-03-09T19:27:08.477 INFO:tasks.workunit.client.1.vm08.stdout:2/363: dwrite d3/f7 [0,4194304] 0
2026-03-09T19:27:08.477 INFO:tasks.workunit.client.1.vm08.stdout:5/371: rename d16/f2a to d16/d1e/d30/f76 0
2026-03-09T19:27:08.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:08 vm07.local ceph-mon[48545]: pgmap v158: 65 pgs: 65 active+clean; 908 MiB data, 3.9 GiB used, 116 GiB / 120 GiB avail; 13 MiB/s rd, 95 MiB/s wr, 289 op/s
2026-03-09T19:27:08.483 INFO:tasks.workunit.client.1.vm08.stdout:8/401: fsync de/d25/d33/f55 0
2026-03-09T19:27:08.491 INFO:tasks.workunit.client.0.vm07.stdout:6/91: truncate d0/d1/db/d17/f1a 482574 0
2026-03-09T19:27:08.492 INFO:tasks.workunit.client.1.vm08.stdout:2/364: dread d3/d4/d23/d2c/d39/d5e/de/d18/f50 [0,4194304] 0
2026-03-09T19:27:08.493 INFO:tasks.workunit.client.1.vm08.stdout:3/428: creat d0/d6/de/d1b/f7d x:0 0 0
2026-03-09T19:27:08.498 INFO:tasks.workunit.client.1.vm08.stdout:1/520: truncate d9/da/d12/d39/f52 1175960 0
2026-03-09T19:27:08.502 INFO:tasks.workunit.client.1.vm08.stdout:1/521: dwrite d9/da/dc/f31 [4194304,4194304] 0
2026-03-09T19:27:08.515 INFO:tasks.workunit.client.1.vm08.stdout:7/442: creat d5/d16/d1c/d83/f9b x:0 0 0
2026-03-09T19:27:08.516 INFO:tasks.workunit.client.1.vm08.stdout:7/443: write d5/d16/d3a/d42/d85/f19 [2016993,30764] 0
2026-03-09T19:27:08.520 INFO:tasks.workunit.client.1.vm08.stdout:7/444: dwrite d5/d16/d3a/d42/f65 [0,4194304] 0
2026-03-09T19:27:08.527 INFO:tasks.workunit.client.0.vm07.stdout:9/136: dwrite d0/db/f21 [0,4194304] 0
2026-03-09T19:27:08.529 INFO:tasks.workunit.client.0.vm07.stdout:9/137: write d0/d6/f8 [2703600,99658] 0
2026-03-09T19:27:08.536 INFO:tasks.workunit.client.1.vm08.stdout:6/392: write d3/d34/d6f/f4f [1960162,71557] 0
2026-03-09T19:27:08.539 INFO:tasks.workunit.client.1.vm08.stdout:7/445: dread d5/d16/f1f [4194304,4194304] 0
2026-03-09T19:27:08.544 INFO:tasks.workunit.client.1.vm08.stdout:7/446: dread d5/d16/f1f [4194304,4194304] 0
2026-03-09T19:27:08.551 INFO:tasks.workunit.client.0.vm07.stdout:7/125: dwrite d0/f13 [4194304,4194304] 0
2026-03-09T19:27:08.557 INFO:tasks.workunit.client.0.vm07.stdout:8/131: rename d7/d9/d10/d22 to d7/d9/d37 0
2026-03-09T19:27:08.563 INFO:tasks.workunit.client.0.vm07.stdout:5/102: rmdir d3 39
2026-03-09T19:27:08.565 INFO:tasks.workunit.client.1.vm08.stdout:2/365: creat d3/d9/d79/f86 x:0 0 0
2026-03-09T19:27:08.569 INFO:tasks.workunit.client.0.vm07.stdout:3/116: getdents d1/d6/dd 0
2026-03-09T19:27:08.570 INFO:tasks.workunit.client.0.vm07.stdout:3/117: fdatasync d1/d6/f9 0
2026-03-09T19:27:08.571 INFO:tasks.workunit.client.1.vm08.stdout:1/522: dread - d9/da/f6f zero size
2026-03-09T19:27:08.577 INFO:tasks.workunit.client.1.vm08.stdout:9/371: truncate d0/d2/d8/f61 668498 0
2026-03-09T19:27:08.577 INFO:tasks.workunit.client.1.vm08.stdout:9/372: dread d0/d2/f6c [0,4194304] 0
2026-03-09T19:27:08.586 INFO:tasks.workunit.client.0.vm07.stdout:6/92: creat d0/d1/db/d1d/f27 x:0 0 0
2026-03-09T19:27:08.596 INFO:tasks.workunit.client.1.vm08.stdout:0/405: creat dd/d22/d27/d6c/f7f x:0 0 0
2026-03-09T19:27:08.596 INFO:tasks.workunit.client.1.vm08.stdout:6/393: fsync d3/d15/f19 0
2026-03-09T19:27:08.596 INFO:tasks.workunit.client.1.vm08.stdout:7/447: unlink d5/d14/d2b/d5d/f63 0
2026-03-09T19:27:08.596 INFO:tasks.workunit.client.0.vm07.stdout:1/83: truncate d1/db/f14 2921998 0
2026-03-09T19:27:08.596 INFO:tasks.workunit.client.0.vm07.stdout:1/84: dread - d1/d11/f1b zero size
2026-03-09T19:27:08.596 INFO:tasks.workunit.client.0.vm07.stdout:9/138: rmdir d0/d6 39
2026-03-09T19:27:08.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:08 vm08.local ceph-mon[57794]: pgmap v158: 65 pgs: 65 active+clean; 908 MiB data, 3.9 GiB used, 116 GiB / 120 GiB avail; 13 MiB/s rd, 95 MiB/s wr, 289 op/s
2026-03-09T19:27:08.602 INFO:tasks.workunit.client.1.vm08.stdout:4/383: dwrite f2 [0,4194304] 0
2026-03-09T19:27:08.606 INFO:tasks.workunit.client.0.vm07.stdout:6/93: sync
2026-03-09T19:27:08.607 INFO:tasks.workunit.client.0.vm07.stdout:6/94: write d0/d13/f18 [3343554,23673] 0
2026-03-09T19:27:08.608 INFO:tasks.workunit.client.0.vm07.stdout:6/95: readlink d0/d1/db/l1f 0
2026-03-09T19:27:08.615 INFO:tasks.workunit.client.0.vm07.stdout:5/103: fdatasync d3/d1a/fc 0
2026-03-09T19:27:08.617 INFO:tasks.workunit.client.0.vm07.stdout:9/139: creat d0/d17/f33 x:0 0 0
2026-03-09T19:27:08.619 INFO:tasks.workunit.client.0.vm07.stdout:9/140: sync
2026-03-09T19:27:08.634 INFO:tasks.workunit.client.0.vm07.stdout:8/132: creat d7/d9/d37/d34/f38 x:0 0 0
2026-03-09T19:27:08.635 INFO:tasks.workunit.client.0.vm07.stdout:3/118: mknod d1/d1f/c23 0
2026-03-09T19:27:08.639 INFO:tasks.workunit.client.0.vm07.stdout:9/141: creat d0/db/d29/d2c/f34 x:0 0 0
2026-03-09T19:27:08.640 INFO:tasks.workunit.client.0.vm07.stdout:2/124: rename d3/dd/d16/f1c to d3/dd/f2c 0
2026-03-09T19:27:08.640 INFO:tasks.workunit.client.0.vm07.stdout:6/96: mkdir d0/d1/d28 0
2026-03-09T19:27:08.650 INFO:tasks.workunit.client.0.vm07.stdout:8/133: unlink d7/d9/d10/f2b 0
2026-03-09T19:27:08.650 INFO:tasks.workunit.client.0.vm07.stdout:3/119: rmdir d1/d6 39
2026-03-09T19:27:08.663 INFO:tasks.workunit.client.0.vm07.stdout:7/126: getdents d0 0
2026-03-09T19:27:08.664 INFO:tasks.workunit.client.1.vm08.stdout:7/448: dread d5/d16/d3a/d42/f5f [0,4194304] 0
2026-03-09T19:27:08.665 INFO:tasks.workunit.client.0.vm07.stdout:7/127: chown d0/d4/d5/d26/f31 13338 1
2026-03-09T19:27:08.666 INFO:tasks.workunit.client.0.vm07.stdout:7/128: write d0/d4/d5/f20 [38153,23912] 0
2026-03-09T19:27:08.668 INFO:tasks.workunit.client.1.vm08.stdout:7/449: dread d5/d14/d2b/f32 [0,4194304] 0
2026-03-09T19:27:08.672 INFO:tasks.workunit.client.1.vm08.stdout:7/450: dwrite d5/d14/d27/f35 [0,4194304] 0
2026-03-09T19:27:08.673 INFO:tasks.workunit.client.1.vm08.stdout:7/451: write d5/d14/d27/d54/f5e [1623839,96252] 0
2026-03-09T19:27:08.681 INFO:tasks.workunit.client.0.vm07.stdout:4/95: truncate d3/fa 2473806 0
2026-03-09T19:27:08.681 INFO:tasks.workunit.client.0.vm07.stdout:0/87: truncate d0/fa 1302010 0
2026-03-09T19:27:08.682 INFO:tasks.workunit.client.0.vm07.stdout:4/96: read d3/f7 [532393,42541] 0
2026-03-09T19:27:08.687 INFO:tasks.workunit.client.0.vm07.stdout:5/104: rename d3/d1a/f21 to d3/dd/f24 0
2026-03-09T19:27:08.691 INFO:tasks.workunit.client.1.vm08.stdout:2/366: mkdir d3/d4/d23/d2c/d39/d5e/d87 0
2026-03-09T19:27:08.693 INFO:tasks.workunit.client.1.vm08.stdout:3/429: truncate d0/d8/d24/f2d 66247 0
2026-03-09T19:27:08.694 INFO:tasks.workunit.client.0.vm07.stdout:6/97: truncate d0/d1/db/f14 1243078 0
2026-03-09T19:27:08.694 INFO:tasks.workunit.client.1.vm08.stdout:3/430: chown d0/d6/de/d1a 6631 1
2026-03-09T19:27:08.694 INFO:tasks.workunit.client.0.vm07.stdout:6/98: fsync d0/ff 0
2026-03-09T19:27:08.696 INFO:tasks.workunit.client.0.vm07.stdout:1/85: getdents d1 0
2026-03-09T19:27:08.696 INFO:tasks.workunit.client.0.vm07.stdout:1/86: read - d1/db/f1f zero size
2026-03-09T19:27:08.698 INFO:tasks.workunit.client.1.vm08.stdout:1/523: dwrite d9/d11/f29 [4194304,4194304] 0
2026-03-09T19:27:08.708 INFO:tasks.workunit.client.1.vm08.stdout:4/384: mkdir da/d10/d16/d28/d46/d52/d6e/d73 0
2026-03-09T19:27:08.712 INFO:tasks.workunit.client.0.vm07.stdout:9/142: symlink d0/db/d29/d32/l35 0
2026-03-09T19:27:08.713 INFO:tasks.workunit.client.0.vm07.stdout:9/143: write d0/db/d29/d2c/f30 [384483,69625] 0
2026-03-09T19:27:08.713 INFO:tasks.workunit.client.1.vm08.stdout:9/373: creat d0/d1b/d4e/f7d x:0 0 0
2026-03-09T19:27:08.714 INFO:tasks.workunit.client.1.vm08.stdout:9/374: chown d0/d1b/l52 1348196 1
2026-03-09T19:27:08.720 INFO:tasks.workunit.client.1.vm08.stdout:0/406: creat dd/d22/d63/d6e/f80 x:0 0 0
2026-03-09T19:27:08.729 INFO:tasks.workunit.client.1.vm08.stdout:8/402: getdents de/d25 0
2026-03-09T19:27:08.729 INFO:tasks.workunit.client.0.vm07.stdout:7/129: dread d0/d4/fc [0,4194304] 0
2026-03-09T19:27:08.731 INFO:tasks.workunit.client.1.vm08.stdout:5/372: stat d16/d1e 0 2026-03-09T19:27:08.752 INFO:tasks.workunit.client.0.vm07.stdout:1/87: rmdir d1/d3 39 2026-03-09T19:27:08.764 INFO:tasks.workunit.client.1.vm08.stdout:3/431: rmdir d0/d6/de/d1b 39 2026-03-09T19:27:08.768 INFO:tasks.workunit.client.0.vm07.stdout:9/144: mkdir d0/db/d29/d2c/d36 0 2026-03-09T19:27:08.769 INFO:tasks.workunit.client.1.vm08.stdout:1/524: mkdir d9/d40/d49/d9e 0 2026-03-09T19:27:08.770 INFO:tasks.workunit.client.0.vm07.stdout:2/125: rmdir d3/dd/d2b 0 2026-03-09T19:27:08.774 INFO:tasks.workunit.client.0.vm07.stdout:7/130: mkdir d0/d4/d5/d26/d32 0 2026-03-09T19:27:08.774 INFO:tasks.workunit.client.1.vm08.stdout:6/394: truncate d3/f2a 2411237 0 2026-03-09T19:27:08.775 INFO:tasks.workunit.client.1.vm08.stdout:6/395: write d3/d15/f2b [1017753,88862] 0 2026-03-09T19:27:08.775 INFO:tasks.workunit.client.0.vm07.stdout:7/131: write d0/d4/d5/dd/f1f [715476,21174] 0 2026-03-09T19:27:08.775 INFO:tasks.workunit.client.1.vm08.stdout:6/396: chown d3/d34/d3b/d85 982048500 1 2026-03-09T19:27:08.780 INFO:tasks.workunit.client.1.vm08.stdout:9/375: truncate d0/d2/f36 500164 0 2026-03-09T19:27:08.785 INFO:tasks.workunit.client.1.vm08.stdout:0/407: symlink dd/l81 0 2026-03-09T19:27:08.791 INFO:tasks.workunit.client.1.vm08.stdout:7/452: dwrite d5/d14/d2b/d5d/f84 [0,4194304] 0 2026-03-09T19:27:08.791 INFO:tasks.workunit.client.1.vm08.stdout:7/453: chown d5/d16/c48 9836 1 2026-03-09T19:27:08.792 INFO:tasks.workunit.client.1.vm08.stdout:7/454: write d5/d16/d3a/d42/f65 [3849010,58544] 0 2026-03-09T19:27:08.796 INFO:tasks.workunit.client.0.vm07.stdout:8/134: link d7/d16/c2d d7/d30/d32/c39 0 2026-03-09T19:27:08.801 INFO:tasks.workunit.client.1.vm08.stdout:2/367: mkdir d3/d4/d3e/d4e/d88 0 2026-03-09T19:27:08.801 INFO:tasks.workunit.client.1.vm08.stdout:2/368: write d3/d9/f5d [3487502,101397] 0 2026-03-09T19:27:08.805 INFO:tasks.workunit.client.0.vm07.stdout:9/145: mknod d0/db/d29/d32/c37 0 
2026-03-09T19:27:08.806 INFO:tasks.workunit.client.0.vm07.stdout:9/146: dread - d0/db/d29/d2c/f34 zero size 2026-03-09T19:27:08.809 INFO:tasks.workunit.client.1.vm08.stdout:4/385: dwrite da/d10/d26/d27/d32/f45 [4194304,4194304] 0 2026-03-09T19:27:08.815 INFO:tasks.workunit.client.0.vm07.stdout:2/126: mkdir d3/dd/d16/d29/d2d 0 2026-03-09T19:27:08.818 INFO:tasks.workunit.client.1.vm08.stdout:1/525: mknod d9/d11/d7a/c9f 0 2026-03-09T19:27:08.818 INFO:tasks.workunit.client.1.vm08.stdout:1/526: write d9/d11/f73 [33888,49678] 0 2026-03-09T19:27:08.819 INFO:tasks.workunit.client.0.vm07.stdout:0/88: mknod d0/d6/d13/d1c/c1d 0 2026-03-09T19:27:08.824 INFO:tasks.workunit.client.0.vm07.stdout:4/97: dwrite d3/fa [0,4194304] 0 2026-03-09T19:27:08.842 INFO:tasks.workunit.client.0.vm07.stdout:5/105: rename d3/dd/f16 to d3/f25 0 2026-03-09T19:27:08.843 INFO:tasks.workunit.client.1.vm08.stdout:0/408: dread - dd/d22/f2b zero size 2026-03-09T19:27:08.847 INFO:tasks.workunit.client.1.vm08.stdout:0/409: dwrite fc [0,4194304] 0 2026-03-09T19:27:08.859 INFO:tasks.workunit.client.0.vm07.stdout:8/135: dread d7/f15 [0,4194304] 0 2026-03-09T19:27:08.859 INFO:tasks.workunit.client.0.vm07.stdout:3/120: link d1/c2 d1/d1f/c24 0 2026-03-09T19:27:08.860 INFO:tasks.workunit.client.0.vm07.stdout:8/136: fsync d7/f1c 0 2026-03-09T19:27:08.860 INFO:tasks.workunit.client.1.vm08.stdout:7/455: unlink d5/ce 0 2026-03-09T19:27:08.860 INFO:tasks.workunit.client.1.vm08.stdout:7/456: fdatasync d5/d14/d38/f40 0 2026-03-09T19:27:08.863 INFO:tasks.workunit.client.0.vm07.stdout:9/147: mknod d0/d17/c38 0 2026-03-09T19:27:08.864 INFO:tasks.workunit.client.1.vm08.stdout:7/457: dwrite d5/d14/d2b/d5d/f84 [0,4194304] 0 2026-03-09T19:27:08.864 INFO:tasks.workunit.client.0.vm07.stdout:7/132: unlink d0/d4/d5/d26/l2d 0 2026-03-09T19:27:08.865 INFO:tasks.workunit.client.0.vm07.stdout:0/89: chown d0/d6/d13/d1c/ce 281901275 1 2026-03-09T19:27:08.866 INFO:tasks.workunit.client.1.vm08.stdout:2/369: creat d3/d9/d4a/f89 x:0 0 0 
2026-03-09T19:27:08.884 INFO:tasks.workunit.client.1.vm08.stdout:2/370: chown d3/d4/d23/d2c/d39/d5e/de/f7a 63 1 2026-03-09T19:27:08.884 INFO:tasks.workunit.client.1.vm08.stdout:4/386: creat da/d10/d26/f74 x:0 0 0 2026-03-09T19:27:08.885 INFO:tasks.workunit.client.0.vm07.stdout:6/99: rename d0/c16 to d0/d1/db/d1d/c29 0 2026-03-09T19:27:08.885 INFO:tasks.workunit.client.0.vm07.stdout:5/106: mkdir d3/dd/d26 0 2026-03-09T19:27:08.885 INFO:tasks.workunit.client.0.vm07.stdout:6/100: dwrite d0/d13/f18 [0,4194304] 0 2026-03-09T19:27:08.885 INFO:tasks.workunit.client.0.vm07.stdout:6/101: readlink d0/d1/db/l1f 0 2026-03-09T19:27:08.885 INFO:tasks.workunit.client.0.vm07.stdout:6/102: stat d0/d13/f1b 0 2026-03-09T19:27:08.885 INFO:tasks.workunit.client.0.vm07.stdout:9/148: creat d0/db/f39 x:0 0 0 2026-03-09T19:27:08.887 INFO:tasks.workunit.client.0.vm07.stdout:0/90: rmdir d0/d6/d13/d1c 39 2026-03-09T19:27:08.888 INFO:tasks.workunit.client.0.vm07.stdout:2/127: rename d3/dd/f2c to d3/d11/f2e 0 2026-03-09T19:27:08.890 INFO:tasks.workunit.client.0.vm07.stdout:6/103: dwrite d0/d1/db/d17/f1a [0,4194304] 0 2026-03-09T19:27:08.891 INFO:tasks.workunit.client.1.vm08.stdout:7/458: dread d5/d16/f45 [0,4194304] 0 2026-03-09T19:27:08.891 INFO:tasks.workunit.client.0.vm07.stdout:6/104: fsync d0/f9 0 2026-03-09T19:27:08.892 INFO:tasks.workunit.client.0.vm07.stdout:6/105: dread - d0/d13/f1b zero size 2026-03-09T19:27:08.896 INFO:tasks.workunit.client.0.vm07.stdout:5/107: truncate d3/d1a/f17 69567 0 2026-03-09T19:27:08.896 INFO:tasks.workunit.client.0.vm07.stdout:5/108: readlink d3/d1a/l15 0 2026-03-09T19:27:08.898 INFO:tasks.workunit.client.0.vm07.stdout:3/121: unlink d1/d6/dd/f11 0 2026-03-09T19:27:08.899 INFO:tasks.workunit.client.0.vm07.stdout:6/106: dread d0/d1/db/f14 [0,4194304] 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.1.vm08.stdout:8/403: getdents de/d47 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.1.vm08.stdout:2/371: symlink d3/d4/d3e/d4e/d88/l8a 0 
2026-03-09T19:27:08.925 INFO:tasks.workunit.client.1.vm08.stdout:8/404: mkdir de/d91 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.1.vm08.stdout:8/405: fdatasync de/d25/f64 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.1.vm08.stdout:8/406: symlink de/d25/d87/l92 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.1.vm08.stdout:8/407: mknod de/d25/d31/d82/d6d/c93 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:8/137: truncate f3 3483623 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:8/138: readlink d7/d9/l35 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:3/122: dwrite d1/d6/fb [0,4194304] 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:7/133: rename d0/d4/d5/dd/f1b to d0/d4/f33 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:2/128: mkdir d3/dd/d16/d2f 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:6/107: mknod d0/d1/db/c2a 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:2/129: dwrite d3/f27 [0,4194304] 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:8/139: symlink d7/d9/d10/l3a 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:8/140: truncate d7/d16/d1e/f33 1046017 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:9/149: mkdir d0/d6/d3a 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:4/98: getdents d3 0 2026-03-09T19:27:08.925 INFO:tasks.workunit.client.0.vm07.stdout:4/99: write d3/f13 [1270844,41210] 0 2026-03-09T19:27:08.926 INFO:tasks.workunit.client.0.vm07.stdout:9/150: write d0/db/d29/d2c/f34 [171144,17500] 0 2026-03-09T19:27:08.927 INFO:tasks.workunit.client.0.vm07.stdout:3/123: symlink d1/d1f/l25 0 2026-03-09T19:27:08.928 INFO:tasks.workunit.client.0.vm07.stdout:3/124: dread - d1/d6/dd/f15 zero size 2026-03-09T19:27:08.928 INFO:tasks.workunit.client.0.vm07.stdout:2/130: mkdir d3/dd/d16/d30 0 2026-03-09T19:27:08.929 
INFO:tasks.workunit.client.0.vm07.stdout:8/141: creat d7/d9/d37/f3b x:0 0 0 2026-03-09T19:27:08.930 INFO:tasks.workunit.client.0.vm07.stdout:0/91: link d0/d6/d13/d1c/f9 d0/f1e 0 2026-03-09T19:27:08.932 INFO:tasks.workunit.client.0.vm07.stdout:4/100: creat d3/f1a x:0 0 0 2026-03-09T19:27:08.935 INFO:tasks.workunit.client.0.vm07.stdout:5/109: link d3/d1a/l15 d3/d1a/l27 0 2026-03-09T19:27:08.936 INFO:tasks.workunit.client.0.vm07.stdout:9/151: creat d0/db/f3b x:0 0 0 2026-03-09T19:27:08.936 INFO:tasks.workunit.client.0.vm07.stdout:8/142: symlink d7/d9/l3c 0 2026-03-09T19:27:08.936 INFO:tasks.workunit.client.0.vm07.stdout:8/143: chown d7/d1d/l24 13 1 2026-03-09T19:27:08.936 INFO:tasks.workunit.client.0.vm07.stdout:2/131: creat d3/d11/f31 x:0 0 0 2026-03-09T19:27:08.936 INFO:tasks.workunit.client.0.vm07.stdout:4/101: rmdir d3/d11 39 2026-03-09T19:27:08.936 INFO:tasks.workunit.client.0.vm07.stdout:5/110: mkdir d3/d1a/d28 0 2026-03-09T19:27:08.940 INFO:tasks.workunit.client.0.vm07.stdout:3/125: getdents d1/d1f/d16 0 2026-03-09T19:27:08.940 INFO:tasks.workunit.client.0.vm07.stdout:8/144: creat d7/d1d/f3d x:0 0 0 2026-03-09T19:27:08.940 INFO:tasks.workunit.client.0.vm07.stdout:8/145: fsync d7/f2e 0 2026-03-09T19:27:08.941 INFO:tasks.workunit.client.1.vm08.stdout:2/372: dread d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f3a [0,4194304] 0 2026-03-09T19:27:08.941 INFO:tasks.workunit.client.0.vm07.stdout:0/92: getdents d0/d6/d13/d17 0 2026-03-09T19:27:08.942 INFO:tasks.workunit.client.0.vm07.stdout:3/126: write d1/d6/f21 [995896,66982] 0 2026-03-09T19:27:08.943 INFO:tasks.workunit.client.0.vm07.stdout:9/152: dwrite d0/db/f3b [0,4194304] 0 2026-03-09T19:27:08.943 INFO:tasks.workunit.client.0.vm07.stdout:0/93: read d0/d6/d13/d1c/f9 [1532703,112704] 0 2026-03-09T19:27:08.947 INFO:tasks.workunit.client.0.vm07.stdout:8/146: unlink f1 0 2026-03-09T19:27:08.960 INFO:tasks.workunit.client.1.vm08.stdout:2/373: rename d3/d4/d23/d5c to d3/d4/d23/d2c/d39/d5e/de/d8b 0 2026-03-09T19:27:08.960 
INFO:tasks.workunit.client.1.vm08.stdout:1/527: sync 2026-03-09T19:27:08.960 INFO:tasks.workunit.client.1.vm08.stdout:2/374: mkdir d3/d9/d79/d46/d8c 0 2026-03-09T19:27:08.960 INFO:tasks.workunit.client.1.vm08.stdout:2/375: chown d3/d4/d23/d2c/d39/d5e/d14/f58 5 1 2026-03-09T19:27:08.960 INFO:tasks.workunit.client.1.vm08.stdout:1/528: dwrite d9/d40/d49/f7c [0,4194304] 0 2026-03-09T19:27:08.960 INFO:tasks.workunit.client.0.vm07.stdout:8/147: read - d7/d9/f36 zero size 2026-03-09T19:27:08.960 INFO:tasks.workunit.client.0.vm07.stdout:2/132: rename d3/cb to d3/dd/c32 0 2026-03-09T19:27:08.960 INFO:tasks.workunit.client.0.vm07.stdout:2/133: readlink d3/l26 0 2026-03-09T19:27:08.960 INFO:tasks.workunit.client.0.vm07.stdout:0/94: chown d0/d6/d13/d1c/cc 12 1 2026-03-09T19:27:08.960 INFO:tasks.workunit.client.0.vm07.stdout:3/127: unlink d1/d1f/c24 0 2026-03-09T19:27:08.960 INFO:tasks.workunit.client.0.vm07.stdout:5/111: rename d3/c8 to d3/d1a/c29 0 2026-03-09T19:27:08.964 INFO:tasks.workunit.client.1.vm08.stdout:1/529: mknod d9/d11/d7a/ca0 0 2026-03-09T19:27:08.969 INFO:tasks.workunit.client.1.vm08.stdout:1/530: write d9/da/d53/d67/f79 [870009,34444] 0 2026-03-09T19:27:08.969 INFO:tasks.workunit.client.1.vm08.stdout:1/531: dread - d9/d11/f9b zero size 2026-03-09T19:27:08.970 INFO:tasks.workunit.client.0.vm07.stdout:7/134: sync 2026-03-09T19:27:08.970 INFO:tasks.workunit.client.0.vm07.stdout:5/112: readlink d3/d1a/l11 0 2026-03-09T19:27:08.970 INFO:tasks.workunit.client.0.vm07.stdout:9/153: link d0/d6/f20 d0/db/d29/d2c/d36/f3c 0 2026-03-09T19:27:08.970 INFO:tasks.workunit.client.0.vm07.stdout:0/95: creat d0/d6/d13/d17/d19/f1f x:0 0 0 2026-03-09T19:27:08.970 INFO:tasks.workunit.client.0.vm07.stdout:0/96: dread - d0/d6/d13/d17/d19/f1b zero size 2026-03-09T19:27:08.973 INFO:tasks.workunit.client.0.vm07.stdout:5/113: dwrite d3/d1a/f1c [0,4194304] 0 2026-03-09T19:27:08.978 INFO:tasks.workunit.client.0.vm07.stdout:2/134: rename d3/d11/c20 to d3/dd/d16/d2f/c33 0 
2026-03-09T19:27:08.989 INFO:tasks.workunit.client.1.vm08.stdout:1/532: link d9/d40/d49/f70 d9/d11/d7a/d89/d8d/fa1 0 2026-03-09T19:27:08.989 INFO:tasks.workunit.client.1.vm08.stdout:1/533: write d9/da/dc/f31 [8459819,15978] 0 2026-03-09T19:27:08.990 INFO:tasks.workunit.client.0.vm07.stdout:3/128: mkdir d1/d26 0 2026-03-09T19:27:08.990 INFO:tasks.workunit.client.0.vm07.stdout:7/135: rmdir d0/d4/d5/d8/d1a/d2a 39 2026-03-09T19:27:08.990 INFO:tasks.workunit.client.0.vm07.stdout:9/154: symlink d0/d6/l3d 0 2026-03-09T19:27:08.990 INFO:tasks.workunit.client.0.vm07.stdout:7/136: dwrite d0/d4/d5/d8/f1c [0,4194304] 0 2026-03-09T19:27:08.990 INFO:tasks.workunit.client.0.vm07.stdout:5/114: symlink d3/dd/l2a 0 2026-03-09T19:27:08.992 INFO:tasks.workunit.client.0.vm07.stdout:3/129: fdatasync d1/fe 0 2026-03-09T19:27:08.992 INFO:tasks.workunit.client.0.vm07.stdout:7/137: write d0/d4/d5/d8/f1c [284763,126514] 0 2026-03-09T19:27:08.992 INFO:tasks.workunit.client.0.vm07.stdout:0/97: dwrite d0/d6/d13/d17/d19/f1b [0,4194304] 0 2026-03-09T19:27:08.993 INFO:tasks.workunit.client.0.vm07.stdout:9/155: write d0/db/d29/d2c/d36/f3c [114918,21295] 0 2026-03-09T19:27:08.993 INFO:tasks.workunit.client.0.vm07.stdout:3/130: chown d1/d26 34287037 1 2026-03-09T19:27:08.993 INFO:tasks.workunit.client.0.vm07.stdout:9/156: readlink d0/l1 0 2026-03-09T19:27:08.994 INFO:tasks.workunit.client.0.vm07.stdout:0/98: dread - d0/d6/d13/d17/d19/f1f zero size 2026-03-09T19:27:08.994 INFO:tasks.workunit.client.0.vm07.stdout:9/157: dread - d0/d17/f1f zero size 2026-03-09T19:27:08.994 INFO:tasks.workunit.client.0.vm07.stdout:0/99: readlink - no filename 2026-03-09T19:27:08.994 INFO:tasks.workunit.client.0.vm07.stdout:0/100: readlink - no filename 2026-03-09T19:27:09.002 INFO:tasks.workunit.client.0.vm07.stdout:7/138: unlink d0/l2 0 2026-03-09T19:27:09.003 INFO:tasks.workunit.client.0.vm07.stdout:9/158: write d0/d6/fa [1161771,101433] 0 2026-03-09T19:27:09.004 INFO:tasks.workunit.client.0.vm07.stdout:3/131: unlink 
d1/d6/fa 0 2026-03-09T19:27:09.006 INFO:tasks.workunit.client.0.vm07.stdout:5/115: mknod d3/dd/d26/c2b 0 2026-03-09T19:27:09.007 INFO:tasks.workunit.client.0.vm07.stdout:5/116: mkdir d3/dd/d26/d2c 0 2026-03-09T19:27:09.008 INFO:tasks.workunit.client.0.vm07.stdout:9/159: chown d0/db/f21 193 1 2026-03-09T19:27:09.008 INFO:tasks.workunit.client.0.vm07.stdout:3/132: chown d1/c2 28592 1 2026-03-09T19:27:09.010 INFO:tasks.workunit.client.0.vm07.stdout:0/101: dwrite d0/d6/d13/d17/d19/f1f [0,4194304] 0 2026-03-09T19:27:09.013 INFO:tasks.workunit.client.0.vm07.stdout:9/160: readlink d0/d17/l18 0 2026-03-09T19:27:09.013 INFO:tasks.workunit.client.0.vm07.stdout:0/102: rmdir d0/d6/d13/d1c 39 2026-03-09T19:27:09.014 INFO:tasks.workunit.client.0.vm07.stdout:3/133: mknod d1/d26/c27 0 2026-03-09T19:27:09.017 INFO:tasks.workunit.client.0.vm07.stdout:3/134: mkdir d1/d1f/d16/d28 0 2026-03-09T19:27:09.019 INFO:tasks.workunit.client.0.vm07.stdout:3/135: unlink d1/d6/dd/c1c 0 2026-03-09T19:27:09.023 INFO:tasks.workunit.client.0.vm07.stdout:3/136: dread d1/d1f/f13 [4194304,4194304] 0 2026-03-09T19:27:09.027 INFO:tasks.workunit.client.0.vm07.stdout:0/103: dread d0/d6/d13/d1c/f9 [0,4194304] 0 2026-03-09T19:27:09.027 INFO:tasks.workunit.client.0.vm07.stdout:3/137: symlink d1/d1f/d16/d28/l29 0 2026-03-09T19:27:09.027 INFO:tasks.workunit.client.0.vm07.stdout:3/138: chown d1/d6/dd/l18 79 1 2026-03-09T19:27:09.029 INFO:tasks.workunit.client.0.vm07.stdout:3/139: chown d1/d1f/d16/d28/l29 3225792 1 2026-03-09T19:27:09.029 INFO:tasks.workunit.client.0.vm07.stdout:0/104: dread d0/d6/f16 [0,4194304] 0 2026-03-09T19:27:09.031 INFO:tasks.workunit.client.0.vm07.stdout:0/105: chown d0/d6/d13/d1c/d11 22609987 1 2026-03-09T19:27:09.033 INFO:tasks.workunit.client.1.vm08.stdout:1/534: sync 2026-03-09T19:27:09.035 INFO:tasks.workunit.client.0.vm07.stdout:0/106: creat d0/d6/d13/d17/f20 x:0 0 0 2026-03-09T19:27:09.043 INFO:tasks.workunit.client.1.vm08.stdout:1/535: link d9/da/d53/d67/d6c/c74 d9/da/d2d/d4e/ca2 0 
2026-03-09T19:27:09.063 INFO:tasks.workunit.client.0.vm07.stdout:0/107: sync 2026-03-09T19:27:09.070 INFO:tasks.workunit.client.0.vm07.stdout:0/108: mknod d0/d6/d13/d17/c21 0 2026-03-09T19:27:09.070 INFO:tasks.workunit.client.0.vm07.stdout:0/109: write d0/d6/d13/d17/d19/f1b [2000810,31602] 0 2026-03-09T19:27:09.086 INFO:tasks.workunit.client.0.vm07.stdout:0/110: dwrite d0/d6/f16 [0,4194304] 0 2026-03-09T19:27:09.091 INFO:tasks.workunit.client.0.vm07.stdout:0/111: mknod d0/d6/d13/d1c/c22 0 2026-03-09T19:27:09.103 INFO:tasks.workunit.client.0.vm07.stdout:0/112: write d0/d6/d13/d17/d19/f1b [337937,111547] 0 2026-03-09T19:27:09.103 INFO:tasks.workunit.client.0.vm07.stdout:0/113: rmdir d0 39 2026-03-09T19:27:09.103 INFO:tasks.workunit.client.0.vm07.stdout:0/114: creat d0/f23 x:0 0 0 2026-03-09T19:27:09.103 INFO:tasks.workunit.client.0.vm07.stdout:0/115: symlink d0/d6/l24 0 2026-03-09T19:27:09.103 INFO:tasks.workunit.client.0.vm07.stdout:0/116: symlink d0/d6/l25 0 2026-03-09T19:27:09.103 INFO:tasks.workunit.client.0.vm07.stdout:0/117: dread d0/d6/f16 [0,4194304] 0 2026-03-09T19:27:09.103 INFO:tasks.workunit.client.0.vm07.stdout:0/118: mknod d0/d6/d13/d17/c26 0 2026-03-09T19:27:09.105 INFO:tasks.workunit.client.0.vm07.stdout:0/119: link d0/d6/d13/d1c/d11/f15 d0/d6/d13/d1c/f27 0 2026-03-09T19:27:09.107 INFO:tasks.workunit.client.0.vm07.stdout:0/120: symlink d0/d6/d13/d1c/d11/l28 0 2026-03-09T19:27:09.111 INFO:tasks.workunit.client.0.vm07.stdout:3/140: dread d1/d1f/d16/f1e [0,4194304] 0 2026-03-09T19:27:09.123 INFO:tasks.workunit.client.0.vm07.stdout:9/161: getdents d0/d17 0 2026-03-09T19:27:09.125 INFO:tasks.workunit.client.1.vm08.stdout:6/397: write d3/db/d43/f56 [948550,92542] 0 2026-03-09T19:27:09.127 INFO:tasks.workunit.client.0.vm07.stdout:1/88: dwrite d1/d3/f12 [0,4194304] 0 2026-03-09T19:27:09.129 INFO:tasks.workunit.client.1.vm08.stdout:9/376: dwrite d0/d2/d14/d5c/d32/d57/f6a [0,4194304] 0 2026-03-09T19:27:09.130 INFO:tasks.workunit.client.1.vm08.stdout:5/373: 
dwrite d16/d1e/f5c [0,4194304] 0 2026-03-09T19:27:09.131 INFO:tasks.workunit.client.0.vm07.stdout:9/162: dwrite d0/d17/f1f [0,4194304] 0 2026-03-09T19:27:09.137 INFO:tasks.workunit.client.1.vm08.stdout:9/377: readlink d0/d1b/l52 0 2026-03-09T19:27:09.156 INFO:tasks.workunit.client.0.vm07.stdout:9/163: mkdir d0/db/d3e 0 2026-03-09T19:27:09.156 INFO:tasks.workunit.client.1.vm08.stdout:9/378: rmdir d0/d2 39 2026-03-09T19:27:09.159 INFO:tasks.workunit.client.0.vm07.stdout:1/89: truncate d1/d3/f4 2076697 0 2026-03-09T19:27:09.161 INFO:tasks.workunit.client.0.vm07.stdout:3/141: rename f0 to d1/f2a 0 2026-03-09T19:27:09.162 INFO:tasks.workunit.client.0.vm07.stdout:3/142: write d1/d6/f9 [2148880,99833] 0 2026-03-09T19:27:09.174 INFO:tasks.workunit.client.1.vm08.stdout:9/379: mkdir d0/d2/d8/d7/d48/d5d/d7e 0 2026-03-09T19:27:09.174 INFO:tasks.workunit.client.0.vm07.stdout:1/90: mkdir d1/d20 0 2026-03-09T19:27:09.174 INFO:tasks.workunit.client.0.vm07.stdout:1/91: chown d1/f1d 1547 1 2026-03-09T19:27:09.174 INFO:tasks.workunit.client.0.vm07.stdout:9/164: rename d0/d17/l1c to d0/db/d29/l3f 0 2026-03-09T19:27:09.179 INFO:tasks.workunit.client.0.vm07.stdout:3/143: fsync d1/d1f/f13 0 2026-03-09T19:27:09.180 INFO:tasks.workunit.client.0.vm07.stdout:1/92: mkdir d1/d3/d21 0 2026-03-09T19:27:09.180 INFO:tasks.workunit.client.0.vm07.stdout:1/93: rename d1/d3/d21 to d1/d3/d21/d22 22 2026-03-09T19:27:09.182 INFO:tasks.workunit.client.1.vm08.stdout:9/380: mkdir d0/d1b/d68/d7f 0 2026-03-09T19:27:09.184 INFO:tasks.workunit.client.0.vm07.stdout:9/165: chown d0/l14 260908 1 2026-03-09T19:27:09.188 INFO:tasks.workunit.client.1.vm08.stdout:9/381: rmdir d0/d2/d8/d7/d48/d5d/d7e 0 2026-03-09T19:27:09.188 INFO:tasks.workunit.client.1.vm08.stdout:9/382: read d0/d2/d8/d7/d48/f53 [2354839,19926] 0 2026-03-09T19:27:09.188 INFO:tasks.workunit.client.0.vm07.stdout:3/144: dread d1/d6/fb [0,4194304] 0 2026-03-09T19:27:09.188 INFO:tasks.workunit.client.0.vm07.stdout:1/94: unlink d1/l7 0 
2026-03-09T19:27:09.188 INFO:tasks.workunit.client.0.vm07.stdout:3/145: readlink d1/d6/dd/l18 0 2026-03-09T19:27:09.188 INFO:tasks.workunit.client.0.vm07.stdout:9/166: symlink d0/db/d29/d2c/l40 0 2026-03-09T19:27:09.188 INFO:tasks.workunit.client.0.vm07.stdout:9/167: readlink d0/db/l1b 0 2026-03-09T19:27:09.188 INFO:tasks.workunit.client.0.vm07.stdout:3/146: chown d1/d6/f19 132287 1 2026-03-09T19:27:09.189 INFO:tasks.workunit.client.0.vm07.stdout:1/95: rmdir d1/d3 39 2026-03-09T19:27:09.192 INFO:tasks.workunit.client.0.vm07.stdout:9/168: dread d0/d17/f1f [0,4194304] 0 2026-03-09T19:27:09.198 INFO:tasks.workunit.client.0.vm07.stdout:3/147: rename d1/f8 to d1/d6/dd/f2b 0 2026-03-09T19:27:09.210 INFO:tasks.workunit.client.0.vm07.stdout:3/148: stat d1/d1f/d16/f1e 0 2026-03-09T19:27:09.211 INFO:tasks.workunit.client.0.vm07.stdout:3/149: dread - d1/f20 zero size 2026-03-09T19:27:09.211 INFO:tasks.workunit.client.0.vm07.stdout:9/169: rmdir d0/db/d3e 0 2026-03-09T19:27:09.211 INFO:tasks.workunit.client.0.vm07.stdout:3/150: truncate d1/d1f/d16/f1e 4892709 0 2026-03-09T19:27:09.211 INFO:tasks.workunit.client.0.vm07.stdout:9/170: creat d0/db/f41 x:0 0 0 2026-03-09T19:27:09.211 INFO:tasks.workunit.client.0.vm07.stdout:3/151: fsync d1/d6/f21 0 2026-03-09T19:27:09.211 INFO:tasks.workunit.client.0.vm07.stdout:9/171: creat d0/d17/f42 x:0 0 0 2026-03-09T19:27:09.212 INFO:tasks.workunit.client.0.vm07.stdout:9/172: creat d0/db/d29/d2c/f43 x:0 0 0 2026-03-09T19:27:09.212 INFO:tasks.workunit.client.0.vm07.stdout:9/173: dread - d0/d17/f42 zero size 2026-03-09T19:27:09.212 INFO:tasks.workunit.client.0.vm07.stdout:3/152: dread d1/f7 [0,4194304] 0 2026-03-09T19:27:09.213 INFO:tasks.workunit.client.0.vm07.stdout:9/174: dread d0/f4 [0,4194304] 0 2026-03-09T19:27:09.214 INFO:tasks.workunit.client.0.vm07.stdout:3/153: read d1/d1f/f1a [748061,89575] 0 2026-03-09T19:27:09.217 INFO:tasks.workunit.client.1.vm08.stdout:9/383: sync 2026-03-09T19:27:09.219 
INFO:tasks.workunit.client.0.vm07.stdout:3/154: dread d1/f7 [0,4194304] 0 2026-03-09T19:27:09.219 INFO:tasks.workunit.client.0.vm07.stdout:3/155: readlink d1/d1f/l25 0 2026-03-09T19:27:09.219 INFO:tasks.workunit.client.1.vm08.stdout:9/384: fsync d0/d2/d14/f31 0 2026-03-09T19:27:09.220 INFO:tasks.workunit.client.1.vm08.stdout:9/385: fdatasync d0/d2/d8/d7/f58 0 2026-03-09T19:27:09.224 INFO:tasks.workunit.client.0.vm07.stdout:3/156: symlink d1/l2c 0 2026-03-09T19:27:09.224 INFO:tasks.workunit.client.0.vm07.stdout:9/175: getdents d0 0 2026-03-09T19:27:09.225 INFO:tasks.workunit.client.1.vm08.stdout:9/386: rename d0/d2/d14/d5c/d32/d57 to d0/d2/d80 0 2026-03-09T19:27:09.230 INFO:tasks.workunit.client.0.vm07.stdout:3/157: creat d1/d1f/f2d x:0 0 0 2026-03-09T19:27:09.232 INFO:tasks.workunit.client.0.vm07.stdout:9/176: dwrite d0/d17/f1f [0,4194304] 0 2026-03-09T19:27:09.235 INFO:tasks.workunit.client.0.vm07.stdout:3/158: dread d1/fe [0,4194304] 0 2026-03-09T19:27:09.237 INFO:tasks.workunit.client.0.vm07.stdout:9/177: symlink d0/d17/l44 0 2026-03-09T19:27:09.244 INFO:tasks.workunit.client.0.vm07.stdout:3/159: mknod d1/d6/c2e 0 2026-03-09T19:27:09.245 INFO:tasks.workunit.client.0.vm07.stdout:3/160: dread - d1/d6/dd/f15 zero size 2026-03-09T19:27:09.246 INFO:tasks.workunit.client.0.vm07.stdout:9/178: creat d0/d6/d3a/f45 x:0 0 0 2026-03-09T19:27:09.246 INFO:tasks.workunit.client.0.vm07.stdout:3/161: symlink d1/d1f/d16/l2f 0 2026-03-09T19:27:09.249 INFO:tasks.workunit.client.0.vm07.stdout:3/162: creat d1/d1f/d16/f30 x:0 0 0 2026-03-09T19:27:09.250 INFO:tasks.workunit.client.0.vm07.stdout:3/163: write d1/d6/dd/f2b [783590,25173] 0 2026-03-09T19:27:09.259 INFO:tasks.workunit.client.1.vm08.stdout:3/432: truncate d0/d52/d6d/d77/f68 344697 0 2026-03-09T19:27:09.259 INFO:tasks.workunit.client.0.vm07.stdout:3/164: write d1/f7 [5220314,68039] 0 2026-03-09T19:27:09.260 INFO:tasks.workunit.client.0.vm07.stdout:3/165: dwrite d1/d6/dd/f2b [0,4194304] 0 2026-03-09T19:27:09.260 
INFO:tasks.workunit.client.0.vm07.stdout:3/166: readlink d1/l2c 0 2026-03-09T19:27:09.281 INFO:tasks.workunit.client.0.vm07.stdout:3/167: creat d1/d26/f31 x:0 0 0 2026-03-09T19:27:09.282 INFO:tasks.workunit.client.0.vm07.stdout:3/168: write d1/d6/f19 [4967345,115806] 0 2026-03-09T19:27:09.282 INFO:tasks.workunit.client.1.vm08.stdout:3/433: mkdir d0/d52/d7c/d7e 0 2026-03-09T19:27:09.288 INFO:tasks.workunit.client.1.vm08.stdout:3/434: mkdir d0/d6/de/d6e/d51/d7f 0 2026-03-09T19:27:09.288 INFO:tasks.workunit.client.0.vm07.stdout:3/169: truncate d1/d6/f9 5173415 0 2026-03-09T19:27:09.288 INFO:tasks.workunit.client.0.vm07.stdout:3/170: chown d1/d6/dd/f15 6652471 1 2026-03-09T19:27:09.288 INFO:tasks.workunit.client.1.vm08.stdout:3/435: mknod d0/d52/d7c/c80 0 2026-03-09T19:27:09.293 INFO:tasks.workunit.client.1.vm08.stdout:4/387: dwrite da/d10/d16/d28/d46/d52/d6e/d2c/f4a [0,4194304] 0 2026-03-09T19:27:09.309 INFO:tasks.workunit.client.1.vm08.stdout:0/410: dwrite dd/d22/f29 [0,4194304] 0 2026-03-09T19:27:09.311 INFO:tasks.workunit.client.0.vm07.stdout:4/102: dread d3/f13 [0,4194304] 0 2026-03-09T19:27:09.311 INFO:tasks.workunit.client.1.vm08.stdout:7/459: write d5/d14/d27/d54/f58 [1028733,37471] 0 2026-03-09T19:27:09.312 INFO:tasks.workunit.client.1.vm08.stdout:7/460: write d5/d14/d38/f40 [207696,62360] 0 2026-03-09T19:27:09.316 INFO:tasks.workunit.client.0.vm07.stdout:3/171: link d1/d1f/d16/c1d d1/d6/dd/c32 0 2026-03-09T19:27:09.317 INFO:tasks.workunit.client.0.vm07.stdout:3/172: write d1/d6/dd/f15 [572621,58134] 0 2026-03-09T19:27:09.318 INFO:tasks.workunit.client.0.vm07.stdout:3/173: chown d1/d6/c2e 0 1 2026-03-09T19:27:09.319 INFO:tasks.workunit.client.0.vm07.stdout:3/174: write d1/d6/f19 [2021653,46222] 0 2026-03-09T19:27:09.320 INFO:tasks.workunit.client.0.vm07.stdout:3/175: read d1/d6/fb [3715576,26669] 0 2026-03-09T19:27:09.321 INFO:tasks.workunit.client.1.vm08.stdout:3/436: creat d0/d6/de/d6e/f81 x:0 0 0 2026-03-09T19:27:09.322 
INFO:tasks.workunit.client.1.vm08.stdout:3/437: write d0/f28 [3855802,3237] 0
2026-03-09T19:27:09.322 INFO:tasks.workunit.client.1.vm08.stdout:3/438: write d0/d4b/f74 [528684,70866] 0
2026-03-09T19:27:09.330 INFO:tasks.workunit.client.1.vm08.stdout:4/388: sync
2026-03-09T19:27:09.335 INFO:tasks.workunit.client.0.vm07.stdout:4/103: mkdir d3/d11/d1b 0
2026-03-09T19:27:09.341 INFO:tasks.workunit.client.1.vm08.stdout:0/411: mkdir dd/d22/d7b/d82 0
2026-03-09T19:27:09.341 INFO:tasks.workunit.client.0.vm07.stdout:8/148: getdents d7/d9/d10 0
2026-03-09T19:27:09.341 INFO:tasks.workunit.client.0.vm07.stdout:8/149: dread - d7/d1d/f3d zero size
2026-03-09T19:27:09.342 INFO:tasks.workunit.client.1.vm08.stdout:3/439: symlink d0/d6/de/l82 0
2026-03-09T19:27:09.346 INFO:tasks.workunit.client.1.vm08.stdout:0/412: creat dd/d22/d7b/f83 x:0 0 0
2026-03-09T19:27:09.347 INFO:tasks.workunit.client.1.vm08.stdout:4/389: mkdir da/d10/d26/d3a/d69/d75 0
2026-03-09T19:27:09.351 INFO:tasks.workunit.client.1.vm08.stdout:8/408: write de/f11 [1088779,73845] 0
2026-03-09T19:27:09.352 INFO:tasks.workunit.client.1.vm08.stdout:8/409: write de/d25/d31/f8e [545255,11394] 0
2026-03-09T19:27:09.358 INFO:tasks.workunit.client.1.vm08.stdout:8/410: symlink de/d25/d31/d82/d6d/l94 0
2026-03-09T19:27:09.360 INFO:tasks.workunit.client.1.vm08.stdout:0/413: link dd/d31/l55 dd/d22/d27/d2e/d37/l84 0
2026-03-09T19:27:09.362 INFO:tasks.workunit.client.1.vm08.stdout:4/390: link da/d10/d16/d28/c3f da/c76 0
2026-03-09T19:27:09.364 INFO:tasks.workunit.client.1.vm08.stdout:0/414: creat dd/d22/d27/d6c/f85 x:0 0 0
2026-03-09T19:27:09.368 INFO:tasks.workunit.client.1.vm08.stdout:0/415: dwrite dd/d22/d7b/f83 [0,4194304] 0
2026-03-09T19:27:09.371 INFO:tasks.workunit.client.0.vm07.stdout:6/108: dwrite d0/d1/fa [4194304,4194304] 0
2026-03-09T19:27:09.380 INFO:tasks.workunit.client.1.vm08.stdout:4/391: link f9 da/d10/f77 0
2026-03-09T19:27:09.388 INFO:tasks.workunit.client.0.vm07.stdout:6/109: link d0/d13/f18 d0/d1/d28/f2b 0
2026-03-09T19:27:09.401 INFO:tasks.workunit.client.0.vm07.stdout:6/110: write d0/d1/db/d1d/f22 [838779,50501] 0
2026-03-09T19:27:09.401 INFO:tasks.workunit.client.0.vm07.stdout:2/135: chown d3/dd/d16/d2f/c33 47611 1
2026-03-09T19:27:09.401 INFO:tasks.workunit.client.1.vm08.stdout:4/392: truncate da/f21 236799 0
2026-03-09T19:27:09.401 INFO:tasks.workunit.client.1.vm08.stdout:0/416: mkdir dd/d22/d24/d49/d50/d78/d86 0
2026-03-09T19:27:09.401 INFO:tasks.workunit.client.1.vm08.stdout:0/417: dwrite dd/d22/f28 [0,4194304] 0
2026-03-09T19:27:09.401 INFO:tasks.workunit.client.1.vm08.stdout:0/418: chown dd/d22/d24/d49/d50 1105605594 1
2026-03-09T19:27:09.401 INFO:tasks.workunit.client.1.vm08.stdout:4/393: mknod da/d10/d26/d3a/d49/c78 0
2026-03-09T19:27:09.401 INFO:tasks.workunit.client.1.vm08.stdout:3/440: sync
2026-03-09T19:27:09.402 INFO:tasks.workunit.client.1.vm08.stdout:2/376: dwrite d3/d4/f49 [0,4194304] 0
2026-03-09T19:27:09.404 INFO:tasks.workunit.client.0.vm07.stdout:6/111: mknod d0/d1/db/c2c 0
2026-03-09T19:27:09.404 INFO:tasks.workunit.client.0.vm07.stdout:6/112: stat d0/ff 0
2026-03-09T19:27:09.407 INFO:tasks.workunit.client.1.vm08.stdout:2/377: dwrite d3/d9/f20 [0,4194304] 0
2026-03-09T19:27:09.412 INFO:tasks.workunit.client.1.vm08.stdout:4/394: creat da/d10/d1b/f79 x:0 0 0
2026-03-09T19:27:09.431 INFO:tasks.workunit.client.0.vm07.stdout:5/117: write d3/f18 [3242383,16996] 0
2026-03-09T19:27:09.431 INFO:tasks.workunit.client.0.vm07.stdout:2/136: creat d3/dd/f34 x:0 0 0
2026-03-09T19:27:09.431 INFO:tasks.workunit.client.1.vm08.stdout:2/378: truncate d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f83 1047911 0
2026-03-09T19:27:09.431 INFO:tasks.workunit.client.1.vm08.stdout:2/379: stat d3/d4/d23/d2c/d39/d5e/l65 0
2026-03-09T19:27:09.431 INFO:tasks.workunit.client.1.vm08.stdout:2/380: dwrite d3/d4/d23/d2c/f31 [0,4194304] 0
2026-03-09T19:27:09.431 INFO:tasks.workunit.client.1.vm08.stdout:2/381: fdatasync d3/d4/d23/d2c/d39/d5e/de/d8b/f81 0
2026-03-09T19:27:09.431 INFO:tasks.workunit.client.1.vm08.stdout:2/382: read - d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f7f zero size
2026-03-09T19:27:09.431 INFO:tasks.workunit.client.1.vm08.stdout:2/383: write d3/d4/d23/d2c/f80 [229833,76439] 0
2026-03-09T19:27:09.431 INFO:tasks.workunit.client.1.vm08.stdout:2/384: dread d3/d9/f5d [0,4194304] 0
2026-03-09T19:27:09.431 INFO:tasks.workunit.client.0.vm07.stdout:7/139: creat d0/d4/d5/d8/d1a/d2a/f34 x:0 0 0
2026-03-09T19:27:09.432 INFO:tasks.workunit.client.0.vm07.stdout:5/118: mkdir d3/dd/d26/d2d 0
2026-03-09T19:27:09.432 INFO:tasks.workunit.client.1.vm08.stdout:0/419: rename fb to dd/d22/d24/f87 0
2026-03-09T19:27:09.433 INFO:tasks.workunit.client.1.vm08.stdout:0/420: write dd/d22/d27/d6c/f7f [515343,125635] 0
2026-03-09T19:27:09.434 INFO:tasks.workunit.client.1.vm08.stdout:4/395: creat da/d10/d26/d3a/d69/d75/f7a x:0 0 0
2026-03-09T19:27:09.435 INFO:tasks.workunit.client.1.vm08.stdout:3/441: creat d0/d6/de/d6e/f83 x:0 0 0
2026-03-09T19:27:09.438 INFO:tasks.workunit.client.0.vm07.stdout:2/137: link d3/l26 d3/dd/d16/d29/d2d/l35 0
2026-03-09T19:27:09.442 INFO:tasks.workunit.client.0.vm07.stdout:2/138: creat d3/dd/d16/f36 x:0 0 0
2026-03-09T19:27:09.443 INFO:tasks.workunit.client.1.vm08.stdout:2/385: link d3/d4/d23/d2c/d39/d5e/de/d18/f2d d3/d4/d3e/d4e/f8d 0
2026-03-09T19:27:09.444 INFO:tasks.workunit.client.0.vm07.stdout:2/139: mkdir d3/d37 0
2026-03-09T19:27:09.445 INFO:tasks.workunit.client.0.vm07.stdout:2/140: mkdir d3/d11/d38 0
2026-03-09T19:27:09.445 INFO:tasks.workunit.client.1.vm08.stdout:3/442: link d0/d6/de/c60 d0/d6/de/d1b/d16/d17/c84 0
2026-03-09T19:27:09.446 INFO:tasks.workunit.client.1.vm08.stdout:2/386: mknod d3/d4/d23/d2c/d39/d5e/de/d8b/c8e 0
2026-03-09T19:27:09.447 INFO:tasks.workunit.client.1.vm08.stdout:4/396: dread da/d10/f3d [0,4194304] 0
2026-03-09T19:27:09.449 INFO:tasks.workunit.client.0.vm07.stdout:2/141: dwrite d3/dd/f1e [0,4194304] 0
2026-03-09T19:27:09.449 INFO:tasks.workunit.client.1.vm08.stdout:4/397: read - da/d10/f53 zero size
2026-03-09T19:27:09.449 INFO:tasks.workunit.client.1.vm08.stdout:3/443: unlink d0/d6/de/d1b/d16/d17/f75 0
2026-03-09T19:27:09.449 INFO:tasks.workunit.client.0.vm07.stdout:2/142: chown d3/dd/f1e 60929876 1
2026-03-09T19:27:09.452 INFO:tasks.workunit.client.0.vm07.stdout:2/143: dwrite f2 [0,4194304] 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.0.vm07.stdout:2/144: unlink d3/dd/d16/f36 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.0.vm07.stdout:2/145: write d3/d11/f19 [193294,33811] 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.0.vm07.stdout:2/146: write d3/fc [3308990,71988] 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.0.vm07.stdout:2/147: creat d3/d11/f39 x:0 0 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.1.vm08.stdout:3/444: mknod d0/d6/de/d15/c85 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.1.vm08.stdout:3/445: chown d0/d4b/l73 47 1
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.1.vm08.stdout:2/387: rename d3/d9/d79/c6e to d3/c8f 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.1.vm08.stdout:2/388: write d3/d9/d4a/f59 [1186477,68880] 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.1.vm08.stdout:4/398: link da/d10/d26/d27/d32/c67 da/d10/d26/c7b 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.1.vm08.stdout:4/399: creat da/d10/d16/d28/d46/d52/d6e/d2c/f7c x:0 0 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.1.vm08.stdout:3/446: creat d0/d6/de/f86 x:0 0 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.1.vm08.stdout:3/447: readlink d0/d6/de/d15/l78 0
2026-03-09T19:27:09.475 INFO:tasks.workunit.client.1.vm08.stdout:4/400: chown da/d10/d26/d27/d32/c67 605702 1
2026-03-09T19:27:09.476 INFO:tasks.workunit.client.1.vm08.stdout:4/401: read f1 [786373,127240] 0
2026-03-09T19:27:09.478 INFO:tasks.workunit.client.1.vm08.stdout:3/448: getdents d0/d6/d25 0
2026-03-09T19:27:09.479 INFO:tasks.workunit.client.1.vm08.stdout:3/449: creat d0/d6/d25/f87 x:0 0 0
2026-03-09T19:27:09.479 INFO:tasks.workunit.client.1.vm08.stdout:3/450: stat d0/d6/de/l4d 0
2026-03-09T19:27:09.481 INFO:tasks.workunit.client.1.vm08.stdout:3/451: mkdir d0/d52/d6d/d77/d88 0
2026-03-09T19:27:09.485 INFO:tasks.workunit.client.0.vm07.stdout:1/96: fsync d1/d3/f12 0
2026-03-09T19:27:09.498 INFO:tasks.workunit.client.1.vm08.stdout:3/452: link d0/d4b/l73 d0/d6/de/l89 0
2026-03-09T19:27:09.498 INFO:tasks.workunit.client.1.vm08.stdout:3/453: chown d0/d8/f66 6185851 1
2026-03-09T19:27:09.498 INFO:tasks.workunit.client.0.vm07.stdout:1/97: write d1/d9/f15 [4800609,111696] 0
2026-03-09T19:27:09.498 INFO:tasks.workunit.client.0.vm07.stdout:1/98: dwrite d1/db/f1f [0,4194304] 0
2026-03-09T19:27:09.498 INFO:tasks.workunit.client.1.vm08.stdout:3/454: creat d0/d52/f8a x:0 0 0
2026-03-09T19:27:09.533 INFO:tasks.workunit.client.0.vm07.stdout:4/104: sync
2026-03-09T19:27:09.533 INFO:tasks.workunit.client.0.vm07.stdout:8/150: sync
2026-03-09T19:27:09.535 INFO:tasks.workunit.client.0.vm07.stdout:4/105: write d3/f8 [607430,63876] 0
2026-03-09T19:27:09.536 INFO:tasks.workunit.client.0.vm07.stdout:4/106: stat d3 0
2026-03-09T19:27:09.536 INFO:tasks.workunit.client.0.vm07.stdout:6/113: sync
2026-03-09T19:27:09.542 INFO:tasks.workunit.client.0.vm07.stdout:4/107: dwrite d3/fa [0,4194304] 0
2026-03-09T19:27:09.569 INFO:tasks.workunit.client.0.vm07.stdout:0/121: getdents d0/d6/d13/d1c 0
2026-03-09T19:27:09.570 INFO:tasks.workunit.client.0.vm07.stdout:0/122: dread d0/d6/f16 [0,4194304] 0
2026-03-09T19:27:09.570 INFO:tasks.workunit.client.1.vm08.stdout:1/536: dwrite d9/d11/f5f [0,4194304] 0
2026-03-09T19:27:09.570 INFO:tasks.workunit.client.1.vm08.stdout:5/374: dwrite f1 [0,4194304] 0
2026-03-09T19:27:09.570 INFO:tasks.workunit.client.1.vm08.stdout:5/375: write f1 [1025096,125383] 0
2026-03-09T19:27:09.570 INFO:tasks.workunit.client.1.vm08.stdout:6/398: truncate d3/d34/d5c/f7f 1042049 0
2026-03-09T19:27:09.570 INFO:tasks.workunit.client.1.vm08.stdout:6/399: dwrite d3/db/f42 [0,4194304] 0
2026-03-09T19:27:09.577 INFO:tasks.workunit.client.0.vm07.stdout:8/151: creat d7/d30/f3e x:0 0 0
2026-03-09T19:27:09.604 INFO:tasks.workunit.client.1.vm08.stdout:6/400: dwrite d3/db/f8f [0,4194304] 0
2026-03-09T19:27:09.604 INFO:tasks.workunit.client.1.vm08.stdout:1/537: chown d9/da/d53/l85 65 1
2026-03-09T19:27:09.605 INFO:tasks.workunit.client.1.vm08.stdout:9/387: dwrite d0/d2/f2f [0,4194304] 0
2026-03-09T19:27:09.605 INFO:tasks.workunit.client.0.vm07.stdout:4/108: fdatasync d3/fc 0
2026-03-09T19:27:09.605 INFO:tasks.workunit.client.0.vm07.stdout:9/179: fsync d0/db/f41 0
2026-03-09T19:27:09.605 INFO:tasks.workunit.client.0.vm07.stdout:4/109: write d3/d11/f18 [2045914,21361] 0
2026-03-09T19:27:09.605 INFO:tasks.workunit.client.0.vm07.stdout:9/180: creat d0/d17/f46 x:0 0 0
2026-03-09T19:27:09.605 INFO:tasks.workunit.client.0.vm07.stdout:9/181: symlink d0/db/d29/d2c/d36/l47 0
2026-03-09T19:27:09.606 INFO:tasks.workunit.client.1.vm08.stdout:5/376: symlink d16/d1e/d6e/l77 0
2026-03-09T19:27:09.607 INFO:tasks.workunit.client.1.vm08.stdout:7/461: mkdir d5/d16/d1c/d83/d9c 0
2026-03-09T19:27:09.608 INFO:tasks.workunit.client.1.vm08.stdout:6/401: mknod d3/d34/c93 0
2026-03-09T19:27:09.609 INFO:tasks.workunit.client.1.vm08.stdout:1/538: fsync d9/da/dc/f78 0
2026-03-09T19:27:09.609 INFO:tasks.workunit.client.1.vm08.stdout:5/377: symlink d16/d1e/d3b/d61/l78 0
2026-03-09T19:27:09.615 INFO:tasks.workunit.client.1.vm08.stdout:5/378: write d16/d1e/f27 [2845434,95456] 0
2026-03-09T19:27:09.626 INFO:tasks.workunit.client.1.vm08.stdout:6/402: truncate d3/f2a 2485220 0
2026-03-09T19:27:09.627 INFO:tasks.workunit.client.1.vm08.stdout:6/403: write d3/db/d43/f56 [561882,8638] 0
2026-03-09T19:27:09.627 INFO:tasks.workunit.client.1.vm08.stdout:7/462: link d5/f4d d5/d16/d1c/d83/d9c/f9d 0
2026-03-09T19:27:09.627 INFO:tasks.workunit.client.1.vm08.stdout:1/539: getdents d9 0
2026-03-09T19:27:09.632 INFO:tasks.workunit.client.1.vm08.stdout:4/402: dread da/d10/d16/d28/f34 [0,4194304] 0
2026-03-09T19:27:09.635 INFO:tasks.workunit.client.1.vm08.stdout:4/403: mknod da/d10/d16/d28/d46/d52/d6e/d6d/c7d 0
2026-03-09T19:27:09.635 INFO:tasks.workunit.client.1.vm08.stdout:4/404: stat da/d10/d16/d28/d46/d52/d6e/d40/f41 0
2026-03-09T19:27:09.639 INFO:tasks.workunit.client.1.vm08.stdout:4/405: getdents da/d10/d16/d28/d46/d52/d6e 0
2026-03-09T19:27:09.645 INFO:tasks.workunit.client.1.vm08.stdout:4/406: dwrite da/d10/d26/d38/f57 [0,4194304] 0
2026-03-09T19:27:09.648 INFO:tasks.workunit.client.1.vm08.stdout:7/463: dread d5/fc [0,4194304] 0
2026-03-09T19:27:09.653 INFO:tasks.workunit.client.1.vm08.stdout:4/407: unlink da/d10/d16/d28/d46/d52/d6e/f5a 0
2026-03-09T19:27:09.654 INFO:tasks.workunit.client.1.vm08.stdout:7/464: link d5/d14/d38/l77 d5/d16/d3a/d42/d6a/l9e 0
2026-03-09T19:27:09.655 INFO:tasks.workunit.client.1.vm08.stdout:7/465: fdatasync d5/d14/d2b/f37 0
2026-03-09T19:27:09.664 INFO:tasks.workunit.client.1.vm08.stdout:7/466: truncate d5/d16/d1c/f29 1237510 0
2026-03-09T19:27:09.679 INFO:tasks.workunit.client.0.vm07.stdout:9/182: sync
2026-03-09T19:27:09.679 INFO:tasks.workunit.client.0.vm07.stdout:4/110: sync
2026-03-09T19:27:09.679 INFO:tasks.workunit.client.1.vm08.stdout:7/467: creat d5/d14/d2b/f9f x:0 0 0
2026-03-09T19:27:09.679 INFO:tasks.workunit.client.1.vm08.stdout:7/468: dread d5/fa [4194304,4194304] 0
2026-03-09T19:27:09.679 INFO:tasks.workunit.client.0.vm07.stdout:4/111: symlink d3/d11/d16/l1c 0
2026-03-09T19:27:09.680 INFO:tasks.workunit.client.0.vm07.stdout:4/112: chown d3/f1a 846489873 1
2026-03-09T19:27:09.682 INFO:tasks.workunit.client.0.vm07.stdout:4/113: mknod d3/d11/d1b/c1d 0
2026-03-09T19:27:09.682 INFO:tasks.workunit.client.0.vm07.stdout:4/114: write d3/f1a [945632,10694] 0
2026-03-09T19:27:09.690 INFO:tasks.workunit.client.0.vm07.stdout:4/115: creat d3/d11/f1e x:0 0 0
2026-03-09T19:27:09.691 INFO:tasks.workunit.client.0.vm07.stdout:4/116: unlink d3/fa 0
2026-03-09T19:27:09.699 INFO:tasks.workunit.client.1.vm08.stdout:6/404: sync
2026-03-09T19:27:09.710 INFO:tasks.workunit.client.0.vm07.stdout:9/183: sync
2026-03-09T19:27:09.712 INFO:tasks.workunit.client.0.vm07.stdout:9/184: unlink d0/db/f3b 0
2026-03-09T19:27:09.712 INFO:tasks.workunit.client.0.vm07.stdout:9/185: creat d0/d6/f48 x:0 0 0
2026-03-09T19:27:09.714 INFO:tasks.workunit.client.0.vm07.stdout:9/186: rename d0/db/l1a to d0/d6/l49 0
2026-03-09T19:27:09.715 INFO:tasks.workunit.client.0.vm07.stdout:9/187: write d0/d6/d3a/f45 [180178,95174] 0
2026-03-09T19:27:09.716 INFO:tasks.workunit.client.0.vm07.stdout:9/188: dread d0/f4 [0,4194304] 0
2026-03-09T19:27:09.716 INFO:tasks.workunit.client.0.vm07.stdout:9/189: dread - d0/d17/f46 zero size
2026-03-09T19:27:09.721 INFO:tasks.workunit.client.0.vm07.stdout:9/190: link d0/db/d29/d2c/f34 d0/db/d29/d2c/f4a 0
2026-03-09T19:27:09.723 INFO:tasks.workunit.client.0.vm07.stdout:9/191: mknod d0/db/d29/d2c/d36/c4b 0
2026-03-09T19:27:09.728 INFO:tasks.workunit.client.0.vm07.stdout:9/192: dwrite d0/db/d29/d2c/f4a [0,4194304] 0
2026-03-09T19:27:09.742 INFO:tasks.workunit.client.0.vm07.stdout:9/193: read - d0/d6/f48 zero size
2026-03-09T19:27:09.777 INFO:tasks.workunit.client.1.vm08.stdout:9/388: dread d0/d2/d14/d5c/d32/f38 [0,4194304] 0
2026-03-09T19:27:09.781 INFO:tasks.workunit.client.1.vm08.stdout:9/389: fsync d0/d2/f36 0
2026-03-09T19:27:09.798 INFO:tasks.workunit.client.0.vm07.stdout:3/176: getdents d1/d6/dd 0
2026-03-09T19:27:09.798 INFO:tasks.workunit.client.0.vm07.stdout:3/177: readlink d1/l3 0
2026-03-09T19:27:09.798 INFO:tasks.workunit.client.1.vm08.stdout:9/390: write d0/d2/f21 [3262243,64301] 0
2026-03-09T19:27:09.798 INFO:tasks.workunit.client.1.vm08.stdout:9/391: readlink d0/d1b/l52 0
2026-03-09T19:27:09.798 INFO:tasks.workunit.client.1.vm08.stdout:9/392: dwrite d0/d2/f1a [0,4194304] 0
2026-03-09T19:27:09.798 INFO:tasks.workunit.client.1.vm08.stdout:9/393: symlink d0/d1b/d68/l81 0
2026-03-09T19:27:09.798 INFO:tasks.workunit.client.1.vm08.stdout:9/394: dwrite d0/d2/f2f [0,4194304] 0
2026-03-09T19:27:09.801 INFO:tasks.workunit.client.1.vm08.stdout:7/469: dread d5/d14/d2b/f30 [0,4194304] 0
2026-03-09T19:27:09.807 INFO:tasks.workunit.client.1.vm08.stdout:0/421: rmdir dd/d22/d27/d6c 39
2026-03-09T19:27:09.807 INFO:tasks.workunit.client.1.vm08.stdout:9/395: truncate d0/d2/d14/d5c/d32/f38 211564 0
2026-03-09T19:27:09.807 INFO:tasks.workunit.client.1.vm08.stdout:8/411: write de/d1d/d21/f72 [400428,22381] 0
2026-03-09T19:27:09.808 INFO:tasks.workunit.client.1.vm08.stdout:0/422: dread - dd/f6d zero size
2026-03-09T19:27:09.809 INFO:tasks.workunit.client.1.vm08.stdout:0/423: chown dd/d31/c35 137152758 1
2026-03-09T19:27:09.809 INFO:tasks.workunit.client.1.vm08.stdout:8/412: chown de/d25/d87/l92 2 1
2026-03-09T19:27:09.811 INFO:tasks.workunit.client.0.vm07.stdout:3/178: truncate d1/d6/f21 364982 0
2026-03-09T19:27:09.821 INFO:tasks.workunit.client.0.vm07.stdout:3/179: getdents d1/d1f/d16 0
2026-03-09T19:27:09.822 INFO:tasks.workunit.client.0.vm07.stdout:3/180: stat d1/d6/dd/f2b 0
2026-03-09T19:27:09.826 INFO:tasks.workunit.client.1.vm08.stdout:0/424: mknod dd/d22/d63/d6e/d72/c88 0
2026-03-09T19:27:09.829 INFO:tasks.workunit.client.1.vm08.stdout:8/413: creat de/d7c/f95 x:0 0 0
2026-03-09T19:27:09.830 INFO:tasks.workunit.client.0.vm07.stdout:3/181: creat d1/d6/dd/f33 x:0 0 0
2026-03-09T19:27:09.831 INFO:tasks.workunit.client.0.vm07.stdout:3/182: chown d1/d6/f1b 14 1
2026-03-09T19:27:09.831 INFO:tasks.workunit.client.1.vm08.stdout:9/396: rename d0/d2/f62 to d0/d1b/f82 0
2026-03-09T19:27:09.833 INFO:tasks.workunit.client.0.vm07.stdout:3/183: creat d1/d1f/d16/d28/f34 x:0 0 0
2026-03-09T19:27:09.833 INFO:tasks.workunit.client.0.vm07.stdout:3/184: readlink d1/d1f/l25 0
2026-03-09T19:27:09.833 INFO:tasks.workunit.client.1.vm08.stdout:8/414: dread - de/f54 zero size
2026-03-09T19:27:09.835 INFO:tasks.workunit.client.1.vm08.stdout:7/470: getdents d5/d16 0
2026-03-09T19:27:09.842 INFO:tasks.workunit.client.0.vm07.stdout:3/185: mknod d1/d6/c35 0
2026-03-09T19:27:09.842 INFO:tasks.workunit.client.1.vm08.stdout:8/415: stat de/d1d/d21/f4b 0
2026-03-09T19:27:09.842 INFO:tasks.workunit.client.1.vm08.stdout:8/416: chown de/d1d/c2a 24586 1
2026-03-09T19:27:09.842 INFO:tasks.workunit.client.1.vm08.stdout:7/471: mkdir d5/d16/d3a/d42/d85/da0 0
2026-03-09T19:27:09.842 INFO:tasks.workunit.client.1.vm08.stdout:7/472: truncate d5/d14/d38/f40 739108 0
2026-03-09T19:27:09.842 INFO:tasks.workunit.client.1.vm08.stdout:7/473: chown d5/d16/d1c/d83 50410 1
2026-03-09T19:27:09.842 INFO:tasks.workunit.client.1.vm08.stdout:7/474: dread - d5/d14/d2b/d5d/f94 zero size
2026-03-09T19:27:09.843 INFO:tasks.workunit.client.1.vm08.stdout:7/475: fdatasync d5/d14/d2b/f37 0
2026-03-09T19:27:09.856 INFO:tasks.workunit.client.1.vm08.stdout:4/408: dread da/d10/d26/d3a/f51 [0,4194304] 0
2026-03-09T19:27:09.858 INFO:tasks.workunit.client.1.vm08.stdout:0/425: readlink dd/d22/d27/d2e/l52 0
2026-03-09T19:27:09.861 INFO:tasks.workunit.client.1.vm08.stdout:9/397: creat d0/f83 x:0 0 0
2026-03-09T19:27:09.863 INFO:tasks.workunit.client.1.vm08.stdout:0/426: unlink dd/d22/d24/d49/f62 0
2026-03-09T19:27:09.864 INFO:tasks.workunit.client.1.vm08.stdout:4/409: unlink da/d10/f1c 0
2026-03-09T19:27:09.865 INFO:tasks.workunit.client.1.vm08.stdout:4/410: chown da/d10/d1b/f29 1517422454 1
2026-03-09T19:27:09.865 INFO:tasks.workunit.client.1.vm08.stdout:4/411: read da/d10/f2e [347736,31375] 0
2026-03-09T19:27:09.866 INFO:tasks.workunit.client.1.vm08.stdout:9/398: creat d0/d2/d8/d7/d48/d6f/f84 x:0 0 0
2026-03-09T19:27:09.867 INFO:tasks.workunit.client.1.vm08.stdout:8/417: creat de/d25/d31/d82/f96 x:0 0 0
2026-03-09T19:27:09.875 INFO:tasks.workunit.client.1.vm08.stdout:0/427: creat dd/d22/d27/d6c/f89 x:0 0 0
2026-03-09T19:27:09.876 INFO:tasks.workunit.client.1.vm08.stdout:0/428: stat dd/l14 0
2026-03-09T19:27:09.878 INFO:tasks.workunit.client.1.vm08.stdout:8/418: link de/d1d/f27 de/d1d/f97 0
2026-03-09T19:27:09.879 INFO:tasks.workunit.client.1.vm08.stdout:4/412: creat da/d10/d16/d28/d2f/f7e x:0 0 0
2026-03-09T19:27:09.880 INFO:tasks.workunit.client.1.vm08.stdout:4/413: chown da/d10/d16/d28/d2f/d4f/c58 4 1
2026-03-09T19:27:09.883 INFO:tasks.workunit.client.1.vm08.stdout:0/429: rename dd/d22/d27/d2e/f53 to dd/d22/d63/d6e/f8a 0
2026-03-09T19:27:09.885 INFO:tasks.workunit.client.1.vm08.stdout:4/414: mknod da/d10/d26/d3a/d49/c7f 0
2026-03-09T19:27:09.887 INFO:tasks.workunit.client.1.vm08.stdout:8/419: rename de/d25/d33/c3e to de/d1d/d2e/c98 0
2026-03-09T19:27:09.897 INFO:tasks.workunit.client.1.vm08.stdout:4/415: link da/d10/d16/d28/d2f/d4f/d64/f6f da/d10/d16/d28/d2f/f80 0
2026-03-09T19:27:09.899 INFO:tasks.workunit.client.1.vm08.stdout:8/420: dread de/d1d/d2e/f61 [0,4194304] 0
2026-03-09T19:27:09.901 INFO:tasks.workunit.client.1.vm08.stdout:4/416: mkdir da/d10/d16/d28/d2f/d4f/d64/d81 0
2026-03-09T19:27:09.926 INFO:tasks.workunit.client.1.vm08.stdout:0/430: sync
2026-03-09T19:27:09.936 INFO:tasks.workunit.client.1.vm08.stdout:0/431: rmdir dd/d22/d63 39
2026-03-09T19:27:09.937 INFO:tasks.workunit.client.1.vm08.stdout:0/432: readlink dd/d22/d24/d49/d50/l69 0
2026-03-09T19:27:09.939 INFO:tasks.workunit.client.1.vm08.stdout:0/433: symlink dd/d22/d27/d4f/d6f/l8b 0
2026-03-09T19:27:09.940 INFO:tasks.workunit.client.1.vm08.stdout:0/434: mknod dd/d22/c8c 0
2026-03-09T19:27:09.940 INFO:tasks.workunit.client.1.vm08.stdout:0/435: rmdir dd 39
2026-03-09T19:27:10.117 INFO:tasks.workunit.client.0.vm07.stdout:6/114: truncate d0/ff 1338144 0
2026-03-09T19:27:10.120 INFO:tasks.workunit.client.0.vm07.stdout:7/140: write d0/d4/d5/d26/f31 [988700,9165] 0
2026-03-09T19:27:10.120 INFO:tasks.workunit.client.0.vm07.stdout:6/115: mkdir d0/d2d 0
2026-03-09T19:27:10.124 INFO:tasks.workunit.client.0.vm07.stdout:7/141: truncate d0/f22 16219 0
2026-03-09T19:27:10.127 INFO:tasks.workunit.client.0.vm07.stdout:7/142: dread d0/d4/d5/dd/f1f [0,4194304] 0
2026-03-09T19:27:10.127 INFO:tasks.workunit.client.0.vm07.stdout:6/116: rename d0/d1/f6 to d0/d1/db/d1d/f2e 0
2026-03-09T19:27:10.134 INFO:tasks.workunit.client.0.vm07.stdout:7/143: creat d0/d4/d5/d8/f35 x:0 0 0
2026-03-09T19:27:10.159 INFO:tasks.workunit.client.1.vm08.stdout:2/389: truncate d3/d4/d23/d2c/d39/d5e/de/f7a 945820 0
2026-03-09T19:27:10.159 INFO:tasks.workunit.client.1.vm08.stdout:6/405: truncate d3/d34/d5c/f7f 1152275 0
2026-03-09T19:27:10.159 INFO:tasks.workunit.client.0.vm07.stdout:6/117: creat d0/d1/db/d17/f2f x:0 0 0
2026-03-09T19:27:10.159 INFO:tasks.workunit.client.0.vm07.stdout:6/118: mkdir d0/d13/d1e/d30 0
2026-03-09T19:27:10.159 INFO:tasks.workunit.client.0.vm07.stdout:2/148: write f0 [1156674,14705] 0
2026-03-09T19:27:10.159 INFO:tasks.workunit.client.0.vm07.stdout:2/149: write d3/d11/f31 [753322,49067] 0
2026-03-09T19:27:10.159 INFO:tasks.workunit.client.0.vm07.stdout:6/119: mkdir d0/d13/d1e/d30/d31 0
2026-03-09T19:27:10.159 INFO:tasks.workunit.client.0.vm07.stdout:6/120: dread - d0/fe zero size
2026-03-09T19:27:10.159 INFO:tasks.workunit.client.0.vm07.stdout:1/99: dwrite d1/d3/f4 [0,4194304] 0
2026-03-09T19:27:10.159 INFO:tasks.workunit.client.0.vm07.stdout:8/152: write d7/d9/d10/f1b [1434880,13827] 0
2026-03-09T19:27:10.159 INFO:tasks.workunit.client.0.vm07.stdout:0/123: write d0/d6/d13/d1c/f27 [4896285,114362] 0
2026-03-09T19:27:10.160 INFO:tasks.workunit.client.0.vm07.stdout:1/100: dwrite d1/d3/f4 [0,4194304] 0
2026-03-09T19:27:10.161 INFO:tasks.workunit.client.0.vm07.stdout:2/150: truncate d3/f15 3441261 0
2026-03-09T19:27:10.164 INFO:tasks.workunit.client.1.vm08.stdout:2/390: link d3/d4/f55 d3/d9/d79/d46/d8c/f90 0
2026-03-09T19:27:10.164 INFO:tasks.workunit.client.1.vm08.stdout:2/391: chown d3/f7c 4267639 1
2026-03-09T19:27:10.165 INFO:tasks.workunit.client.1.vm08.stdout:2/392: fdatasync d3/d4/f49 0
2026-03-09T19:27:10.166 INFO:tasks.workunit.client.1.vm08.stdout:2/393: write d3/d9/d26/f35 [4863542,13110] 0
2026-03-09T19:27:10.167 INFO:tasks.workunit.client.1.vm08.stdout:6/406: rmdir d3/d68/d79 0
2026-03-09T19:27:10.171 INFO:tasks.workunit.client.0.vm07.stdout:8/153: creat d7/d1d/f3f x:0 0 0
2026-03-09T19:27:10.178 INFO:tasks.workunit.client.0.vm07.stdout:0/124: rename d0/f23 to d0/d6/d13/d1c/d11/f29 0
2026-03-09T19:27:10.179 INFO:tasks.workunit.client.1.vm08.stdout:5/379: truncate d16/d1e/f5c 222331 0
2026-03-09T19:27:10.179 INFO:tasks.workunit.client.0.vm07.stdout:8/154: chown d7/ff 12444628 1
2026-03-09T19:27:10.179 INFO:tasks.workunit.client.0.vm07.stdout:2/151: creat d3/dd/d16/d30/f3a x:0 0 0
2026-03-09T19:27:10.179 INFO:tasks.workunit.client.1.vm08.stdout:6/407: mkdir d3/d94 0
2026-03-09T19:27:10.179 INFO:tasks.workunit.client.0.vm07.stdout:2/152: write d3/d11/f19 [993778,100692] 0
2026-03-09T19:27:10.180 INFO:tasks.workunit.client.0.vm07.stdout:2/153: chown d3/l9 101 1
2026-03-09T19:27:10.180 INFO:tasks.workunit.client.0.vm07.stdout:2/154: chown d3/l9 6 1
2026-03-09T19:27:10.180 INFO:tasks.workunit.client.1.vm08.stdout:1/540: dwrite d9/d11/d7a/d89/d8d/fa1 [0,4194304] 0
2026-03-09T19:27:10.185 INFO:tasks.workunit.client.1.vm08.stdout:5/380: symlink d16/l79 0
2026-03-09T19:27:10.196 INFO:tasks.workunit.client.1.vm08.stdout:2/394: creat d3/d4/f91 x:0 0 0
2026-03-09T19:27:10.196 INFO:tasks.workunit.client.1.vm08.stdout:2/395: chown d3/d4/d23/d2c/d39/d5e/d14/c1d 0 1
2026-03-09T19:27:10.198 INFO:tasks.workunit.client.0.vm07.stdout:2/155: mkdir d3/d37/d3b 0
2026-03-09T19:27:10.206 INFO:tasks.workunit.client.1.vm08.stdout:6/408: symlink d3/l95 0
2026-03-09T19:27:10.207 INFO:tasks.workunit.client.1.vm08.stdout:1/541: mkdir d9/d11/d7a/d89/d8d/da3 0
2026-03-09T19:27:10.207 INFO:tasks.workunit.client.0.vm07.stdout:0/125: getdents d0/d6/d13/d17 0
2026-03-09T19:27:10.207 INFO:tasks.workunit.client.0.vm07.stdout:2/156: unlink d3/dd/d16/d2f/c33 0
2026-03-09T19:27:10.207 INFO:tasks.workunit.client.0.vm07.stdout:0/126: symlink d0/d6/d13/d17/l2a 0
2026-03-09T19:27:10.208 INFO:tasks.workunit.client.1.vm08.stdout:5/381: chown d16/d1e/f2c 387807 1
2026-03-09T19:27:10.208 INFO:tasks.workunit.client.0.vm07.stdout:6/121: sync
2026-03-09T19:27:10.209 INFO:tasks.workunit.client.0.vm07.stdout:8/155: read d7/f1c [100528,75399] 0
2026-03-09T19:27:10.213 INFO:tasks.workunit.client.0.vm07.stdout:8/156: dwrite d7/d9/d37/d34/f38 [0,4194304] 0
2026-03-09T19:27:10.215 INFO:tasks.workunit.client.0.vm07.stdout:2/157: mkdir d3/dd/d16/d29/d3c 0
2026-03-09T19:27:10.216 INFO:tasks.workunit.client.0.vm07.stdout:0/127: creat d0/d6/d13/d17/f2b x:0 0 0
2026-03-09T19:27:10.217 INFO:tasks.workunit.client.1.vm08.stdout:1/542: read - d9/d11/d7a/f80 zero size
2026-03-09T19:27:10.217 INFO:tasks.workunit.client.0.vm07.stdout:6/122: mknod d0/d13/c32 0
2026-03-09T19:27:10.217 INFO:tasks.workunit.client.1.vm08.stdout:1/543: fsync d9/da/d12/d39/f47 0
2026-03-09T19:27:10.218 INFO:tasks.workunit.client.0.vm07.stdout:2/158: dwrite d3/d11/f1f [4194304,4194304] 0
2026-03-09T19:27:10.230 INFO:tasks.workunit.client.1.vm08.stdout:2/396: mkdir d3/d9/d79/d46/d8c/d92 0
2026-03-09T19:27:10.230 INFO:tasks.workunit.client.1.vm08.stdout:2/397: chown d3/d9/d79/f6b 2305 1
2026-03-09T19:27:10.230 INFO:tasks.workunit.client.1.vm08.stdout:2/398: dread - d3/d4/d23/d2c/f64 zero size
2026-03-09T19:27:10.230 INFO:tasks.workunit.client.1.vm08.stdout:5/382: truncate d16/d1e/f35 2610770 0
2026-03-09T19:27:10.232 INFO:tasks.workunit.client.0.vm07.stdout:0/128: symlink d0/d6/d13/d1c/d11/l2c 0
2026-03-09T19:27:10.233 INFO:tasks.workunit.client.0.vm07.stdout:0/129: write d0/d6/d13/d1c/d11/f15 [3181636,68714] 0
2026-03-09T19:27:10.233 INFO:tasks.workunit.client.0.vm07.stdout:0/130: stat d0/d6/d13/d17 0
2026-03-09T19:27:10.242 INFO:tasks.workunit.client.0.vm07.stdout:6/123: mknod d0/d13/c33 0
2026-03-09T19:27:10.242 INFO:tasks.workunit.client.1.vm08.stdout:6/409: truncate d3/f2a 2034135 0
2026-03-09T19:27:10.242 INFO:tasks.workunit.client.1.vm08.stdout:1/544: rename l3 to d9/da/d17/d60/la4 0
2026-03-09T19:27:10.243 INFO:tasks.workunit.client.1.vm08.stdout:1/545: fdatasync d9/d40/f57 0
2026-03-09T19:27:10.246 INFO:tasks.workunit.client.0.vm07.stdout:2/159: unlink d3/f7 0
2026-03-09T19:27:10.247 INFO:tasks.workunit.client.0.vm07.stdout:2/160: truncate d3/dd/f34 211515 0
2026-03-09T19:27:10.250 INFO:tasks.workunit.client.1.vm08.stdout:2/399: creat d3/d4/d23/d2c/d39/d5e/de/d18/f93 x:0 0 0
2026-03-09T19:27:10.259 INFO:tasks.workunit.client.1.vm08.stdout:7/476: rmdir d5/d14/d2b 39
2026-03-09T19:27:10.271 INFO:tasks.workunit.client.1.vm08.stdout:3/455: dwrite d0/d52/d6d/d77/f68 [0,4194304] 0
2026-03-09T19:27:10.271 INFO:tasks.workunit.client.1.vm08.stdout:3/456: write d0/d8/f4c [54244,87043] 0
2026-03-09T19:27:10.271 INFO:tasks.workunit.client.0.vm07.stdout:4/117: getdents d3/d11 0
2026-03-09T19:27:10.271 INFO:tasks.workunit.client.0.vm07.stdout:4/118: chown d3/f8 1004 1
2026-03-09T19:27:10.271 INFO:tasks.workunit.client.0.vm07.stdout:4/119: dread d3/f8 [0,4194304] 0
2026-03-09T19:27:10.271 INFO:tasks.workunit.client.0.vm07.stdout:9/194: truncate d0/db/d29/d2c/f30 306619 0
2026-03-09T19:27:10.277 INFO:tasks.workunit.client.0.vm07.stdout:6/124: creat d0/d13/d1e/f34 x:0 0 0
2026-03-09T19:27:10.279 INFO:tasks.workunit.client.0.vm07.stdout:6/125: write d0/d1/db/d17/f2f [213450,27678] 0
2026-03-09T19:27:10.300 INFO:tasks.workunit.client.1.vm08.stdout:6/410: rename d3/d34/c54 to d3/d55/c96 0
2026-03-09T19:27:10.301 INFO:tasks.workunit.client.1.vm08.stdout:6/411: write d3/db/d43/f71 [394046,43609] 0
2026-03-09T19:27:10.305 INFO:tasks.workunit.client.1.vm08.stdout:1/546: creat d9/da/dc/fa5 x:0 0 0
2026-03-09T19:27:10.308 INFO:tasks.workunit.client.0.vm07.stdout:8/157: rename d7/d9/d37/d34/f38 to d7/f40 0
2026-03-09T19:27:10.309 INFO:tasks.workunit.client.1.vm08.stdout:5/383: fdatasync d16/f4d 0
2026-03-09T19:27:10.310 INFO:tasks.workunit.client.1.vm08.stdout:9/399: truncate d0/d2/d14/f4d 1233421 0
2026-03-09T19:27:10.314 INFO:tasks.workunit.client.1.vm08.stdout:8/421: getdents de/d1d 0
2026-03-09T19:27:10.318 INFO:tasks.workunit.client.1.vm08.stdout:2/400: read d3/d4/f6 [5063362,47957] 0
2026-03-09T19:27:10.318 INFO:tasks.workunit.client.1.vm08.stdout:8/422: dread de/d1d/f59 [0,4194304] 0
2026-03-09T19:27:10.321 INFO:tasks.workunit.client.1.vm08.stdout:8/423: fdatasync de/d25/d31/f8e 0
2026-03-09T19:27:10.322 INFO:tasks.workunit.client.1.vm08.stdout:2/401: fdatasync d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f7f 0
2026-03-09T19:27:10.322 INFO:tasks.workunit.client.1.vm08.stdout:5/384: dwrite d16/d1e/f27 [4194304,4194304] 0
2026-03-09T19:27:10.328 INFO:tasks.workunit.client.0.vm07.stdout:4/120: symlink d3/d11/d16/l1f 0
2026-03-09T19:27:10.328 INFO:tasks.workunit.client.1.vm08.stdout:4/417: write da/d10/d16/d28/d46/d52/f5b [1341259,101589] 0
2026-03-09T19:27:10.337 INFO:tasks.workunit.client.0.vm07.stdout:6/126: creat d0/d13/d1e/d30/f35 x:0 0 0
2026-03-09T19:27:10.339 INFO:tasks.workunit.client.1.vm08.stdout:6/412: creat d3/d34/d3b/d85/f97 x:0 0 0
2026-03-09T19:27:10.339 INFO:tasks.workunit.client.1.vm08.stdout:0/436: truncate dd/d22/d24/f26 1110622 0
2026-03-09T19:27:10.344 INFO:tasks.workunit.client.0.vm07.stdout:9/195: rename d0/d17/f46 to d0/d6/f4c 0
2026-03-09T19:27:10.349 INFO:tasks.workunit.client.0.vm07.stdout:9/196: readlink d0/d17/l18 0
2026-03-09T19:27:10.349 INFO:tasks.workunit.client.0.vm07.stdout:7/144: fsync d0/d4/d5/d8/f35 0
2026-03-09T19:27:10.349 INFO:tasks.workunit.client.0.vm07.stdout:8/158: creat d7/d9/d10/f41 x:0 0 0
2026-03-09T19:27:10.349 INFO:tasks.workunit.client.0.vm07.stdout:9/197: dwrite d0/db/f1d [0,4194304] 0
2026-03-09T19:27:10.349 INFO:tasks.workunit.client.0.vm07.stdout:9/198: stat d0/db/d29/d2c/f43 0
2026-03-09T19:27:10.354 INFO:tasks.workunit.client.1.vm08.stdout:9/400: creat d0/d2/d14/d5c/d32/f85 x:0 0 0
2026-03-09T19:27:10.356 INFO:tasks.workunit.client.0.vm07.stdout:8/159: dwrite d7/f1c [0,4194304] 0
2026-03-09T19:27:10.377 INFO:tasks.workunit.client.1.vm08.stdout:8/424: mkdir de/d25/d31/d82/d6d/d99 0
2026-03-09T19:27:10.379 INFO:tasks.workunit.client.0.vm07.stdout:4/121: mkdir d3/d11/d1b/d20 0
2026-03-09T19:27:10.379 INFO:tasks.workunit.client.1.vm08.stdout:8/425: chown de/d1d/d4f/f5e 1680640 1
2026-03-09T19:27:10.380 INFO:tasks.workunit.client.1.vm08.stdout:7/477: creat d5/d14/d2b/fa1 x:0 0 0
2026-03-09T19:27:10.382 INFO:tasks.workunit.client.0.vm07.stdout:6/127: mknod d0/d1/db/d24/c36 0
2026-03-09T19:27:10.393 INFO:tasks.workunit.client.1.vm08.stdout:5/385: creat d16/d1e/d3b/d61/f7a x:0 0 0
2026-03-09T19:27:10.400 INFO:tasks.workunit.client.1.vm08.stdout:3/457: link d0/d6/de/d1b/d16/d17/f71 d0/d52/d6d/f8b 0
2026-03-09T19:27:10.401 INFO:tasks.workunit.client.1.vm08.stdout:3/458: chown d0/d8/d24 2000639 1
2026-03-09T19:27:10.407 INFO:tasks.workunit.client.1.vm08.stdout:4/418: unlink da/d10/d26/d50/l55 0
2026-03-09T19:27:10.412 INFO:tasks.workunit.client.0.vm07.stdout:7/145: creat d0/d4/d5/f36 x:0 0 0
2026-03-09T19:27:10.417 INFO:tasks.workunit.client.0.vm07.stdout:9/199: mkdir d0/db/d29/d4d 0
2026-03-09T19:27:10.417 INFO:tasks.workunit.client.0.vm07.stdout:9/200: chown d0/db/d29/d2c/d36 214945331 1
2026-03-09T19:27:10.418 INFO:tasks.workunit.client.0.vm07.stdout:1/101: truncate d1/db/f1f 1850517 0
2026-03-09T19:27:10.418 INFO:tasks.workunit.client.1.vm08.stdout:6/413: creat d3/d15/f98 x:0 0 0
2026-03-09T19:27:10.422 INFO:tasks.workunit.client.0.vm07.stdout:8/160: rmdir d7/d16 39
2026-03-09T19:27:10.424 INFO:tasks.workunit.client.0.vm07.stdout:4/122: symlink d3/d11/d16/l21 0
2026-03-09T19:27:10.428 INFO:tasks.workunit.client.0.vm07.stdout:6/128: symlink d0/d1/db/l37 0
2026-03-09T19:27:10.440 INFO:tasks.workunit.client.0.vm07.stdout:2/161: rmdir d3/dd 39
2026-03-09T19:27:10.441 INFO:tasks.workunit.client.0.vm07.stdout:1/102: creat d1/d3/f23 x:0 0 0
2026-03-09T19:27:10.444 INFO:tasks.workunit.client.0.vm07.stdout:1/103: dread d1/d3/f4 [0,4194304] 0
2026-03-09T19:27:10.465 INFO:tasks.workunit.client.1.vm08.stdout:4/419: write f5 [818323,97445] 0
2026-03-09T19:27:10.465 INFO:tasks.workunit.client.1.vm08.stdout:4/420: write da/d10/d1b/f79 [785186,44672] 0
2026-03-09T19:27:10.465 INFO:tasks.workunit.client.1.vm08.stdout:9/401: truncate d0/d1b/f82 1003419 0
2026-03-09T19:27:10.465 INFO:tasks.workunit.client.0.vm07.stdout:1/104: truncate d1/d3/f23 687249 0
2026-03-09T19:27:10.465 INFO:tasks.workunit.client.0.vm07.stdout:8/161: mknod d7/d9/d37/d34/c42 0
2026-03-09T19:27:10.465 INFO:tasks.workunit.client.0.vm07.stdout:4/123: write d3/f7 [524157,3561] 0
2026-03-09T19:27:10.465 INFO:tasks.workunit.client.0.vm07.stdout:4/124: write d3/d11/f12 [486158,13660] 0
2026-03-09T19:27:10.465 INFO:tasks.workunit.client.0.vm07.stdout:6/129: creat d0/d1/db/d17/f38 x:0 0 0
2026-03-09T19:27:10.470 INFO:tasks.workunit.client.1.vm08.stdout:8/426: rename de/d1d/d4f/f6b to de/d1d/d69/f9a 0
2026-03-09T19:27:10.471 INFO:tasks.workunit.client.0.vm07.stdout:5/119: write d3/d1a/f17 [611357,22061] 0
2026-03-09T19:27:10.477 INFO:tasks.workunit.client.0.vm07.stdout:3/186: dwrite d1/d6/f21 [0,4194304] 0
2026-03-09T19:27:10.484 INFO:tasks.workunit.client.1.vm08.stdout:4/421: dread da/f1d [4194304,4194304] 0
2026-03-09T19:27:10.484 INFO:tasks.workunit.client.0.vm07.stdout:1/105: creat d1/d11/f24 x:0 0 0
2026-03-09T19:27:10.484 INFO:tasks.workunit.client.0.vm07.stdout:3/187: write d1/f22 [448465,16083] 0
2026-03-09T19:27:10.484 INFO:tasks.workunit.client.0.vm07.stdout:2/162: dwrite d3/dd/fe [0,4194304] 0
2026-03-09T19:27:10.484 INFO:tasks.workunit.client.0.vm07.stdout:8/162: symlink d7/d9/l43 0
2026-03-09T19:27:10.484 INFO:tasks.workunit.client.0.vm07.stdout:0/131: write d0/fa [1330243,42531] 0
2026-03-09T19:27:10.486 INFO:tasks.workunit.client.0.vm07.stdout:0/132: truncate d0/d6/d13/d17/f2b 29054 0
2026-03-09T19:27:10.504 INFO:tasks.workunit.client.0.vm07.stdout:1/106: symlink d1/d3/l25 0
2026-03-09T19:27:10.508 INFO:tasks.workunit.client.0.vm07.stdout:8/163: mkdir d7/d9/d10/d44 0
2026-03-09T19:27:10.509 INFO:tasks.workunit.client.0.vm07.stdout:4/125: mkdir d3/d11/d1b/d20/d22 0
2026-03-09T19:27:10.513 INFO:tasks.workunit.client.1.vm08.stdout:3/459: sync
2026-03-09T19:27:10.515 INFO:tasks.workunit.client.1.vm08.stdout:2/402: creat d3/d4/d23/d2c/f94 x:0 0 0
2026-03-09T19:27:10.515 INFO:tasks.workunit.client.1.vm08.stdout:0/437: creat dd/d22/f8d x:0 0 0
2026-03-09T19:27:10.516 INFO:tasks.workunit.client.1.vm08.stdout:1/547: getdents d9/d40/d49 0
2026-03-09T19:27:10.516 INFO:tasks.workunit.client.0.vm07.stdout:6/130: symlink d0/d1/d28/l39 0
2026-03-09T19:27:10.517 INFO:tasks.workunit.client.0.vm07.stdout:0/133: mknod d0/c2d 0
2026-03-09T19:27:10.517 INFO:tasks.workunit.client.0.vm07.stdout:6/131: dread d0/d1/db/d1d/f2e [0,4194304] 0
2026-03-09T19:27:10.518 INFO:tasks.workunit.client.1.vm08.stdout:9/402: mknod d0/d2/d8/d7/d48/d5e/c86 0
2026-03-09T19:27:10.529 INFO:tasks.workunit.client.0.vm07.stdout:5/120: creat d3/d1a/d28/f2e x:0 0 0
2026-03-09T19:27:10.529 INFO:tasks.workunit.client.0.vm07.stdout:4/126: dread d3/f1a [0,4194304] 0
2026-03-09T19:27:10.529 INFO:tasks.workunit.client.0.vm07.stdout:4/127: write d3/f7 [2722123,50524] 0
2026-03-09T19:27:10.529 INFO:tasks.workunit.client.1.vm08.stdout:8/427: symlink de/d1d/d2e/d5f/l9b 0
2026-03-09T19:27:10.529 INFO:tasks.workunit.client.1.vm08.stdout:4/422: mknod da/d10/d16/d28/d2f/d4f/d64/c82 0
2026-03-09T19:27:10.529 INFO:tasks.workunit.client.1.vm08.stdout:3/460: stat d0/d4b/l73 0
2026-03-09T19:27:10.529 INFO:tasks.workunit.client.0.vm07.stdout:2/163: mknod d3/dd/d16/d29/d2d/c3d 0
2026-03-09T19:27:10.530 INFO:tasks.workunit.client.0.vm07.stdout:2/164: chown d3/dd/d16/d29 49721493 1
2026-03-09T19:27:10.534 INFO:tasks.workunit.client.0.vm07.stdout:8/164: mkdir d7/d9/d37/d45 0
2026-03-09T19:27:10.542 INFO:tasks.workunit.client.1.vm08.stdout:0/438: dread dd/f18 [0,4194304] 0
2026-03-09T19:27:10.548 INFO:tasks.workunit.client.1.vm08.stdout:8/428: write de/d1d/d2e/d5f/f80 [994551,55263] 0
2026-03-09T19:27:10.551 INFO:tasks.workunit.client.1.vm08.stdout:9/403: rename d0/f44 to d0/d2/d80/f87 0
2026-03-09T19:27:10.552 INFO:tasks.workunit.client.1.vm08.stdout:9/404: chown d0/d2/d8/d7/f63 731949884 1
2026-03-09T19:27:10.555 INFO:tasks.workunit.client.0.vm07.stdout:1/107: truncate d1/f6 490436 0
2026-03-09T19:27:10.557 INFO:tasks.workunit.client.1.vm08.stdout:9/405: dwrite d0/d2/d8/f29 [0,4194304] 0
2026-03-09T19:27:10.557 INFO:tasks.workunit.client.0.vm07.stdout:3/188: creat d1/d1f/f36 x:0 0 0
2026-03-09T19:27:10.560 INFO:tasks.workunit.client.1.vm08.stdout:2/403: sync
2026-03-09T19:27:10.564 INFO:tasks.workunit.client.1.vm08.stdout:9/406: chown d0/d2/d8/d7/d48/d5d/c55 100469 1
2026-03-09T19:27:10.564 INFO:tasks.workunit.client.1.vm08.stdout:9/407: rename d0 to d0/d2/d8/d7/d48/d88 22
2026-03-09T19:27:10.570 INFO:tasks.workunit.client.0.vm07.stdout:1/108: sync
2026-03-09T19:27:10.570 INFO:tasks.workunit.client.0.vm07.stdout:4/128: fsync d3/f7 0
2026-03-09T19:27:10.572 INFO:tasks.workunit.client.1.vm08.stdout:1/548: dread d9/d11/f56 [0,4194304] 0
2026-03-09T19:27:10.572 INFO:tasks.workunit.client.0.vm07.stdout:9/201: dread d0/db/d29/d2c/f30 [0,4194304] 0
2026-03-09T19:27:10.573 INFO:tasks.workunit.client.0.vm07.stdout:4/129: dread d3/f13 [0,4194304] 0
2026-03-09T19:27:10.573 INFO:tasks.workunit.client.1.vm08.stdout:1/549: write d9/da/f8e [459497,5267] 0
2026-03-09T19:27:10.573 INFO:tasks.workunit.client.0.vm07.stdout:4/130: chown d3/ce 1583 1
2026-03-09T19:27:10.573 INFO:tasks.workunit.client.0.vm07.stdout:9/202: write d0/f3 [2914772,18950] 0
2026-03-09T19:27:10.575 INFO:tasks.workunit.client.0.vm07.stdout:4/131: dread d3/fc [0,4194304] 0
2026-03-09T19:27:10.575 INFO:tasks.workunit.client.1.vm08.stdout:9/408: dread d0/d2/f21 [0,4194304] 0
2026-03-09T19:27:10.579 INFO:tasks.workunit.client.1.vm08.stdout:2/404: dread d3/d9/d26/f69 [0,4194304] 0
2026-03-09T19:27:10.579 INFO:tasks.workunit.client.1.vm08.stdout:9/409: fsync d0/d2/d8/d7/f35 0
2026-03-09T19:27:10.586 INFO:tasks.workunit.client.0.vm07.stdout:2/165: mknod d3/d37/c3e 0
2026-03-09T19:27:10.588 INFO:tasks.workunit.client.1.vm08.stdout:6/414: write d3/d34/f76 [279476,21752] 0
2026-03-09T19:27:10.589 INFO:tasks.workunit.client.0.vm07.stdout:7/146: dwrite d0/d4/fc [0,4194304] 0
2026-03-09T19:27:10.590 INFO:tasks.workunit.client.1.vm08.stdout:6/415: chown d3/d15/c22 3409036 1
2026-03-09T19:27:10.590 INFO:tasks.workunit.client.1.vm08.stdout:6/416: fsync d3/d34/d5c/f7c 0
2026-03-09T19:27:10.591 INFO:tasks.workunit.client.0.vm07.stdout:2/166: dwrite d3/dd/fe [0,4194304] 0
2026-03-09T19:27:10.591 INFO:tasks.workunit.client.0.vm07.stdout:7/147: chown d0/d4/d5/dd/l19 578440764 1
2026-03-09T19:27:10.593 INFO:tasks.workunit.client.0.vm07.stdout:7/148: read - d0/d4/d5/f36 zero size
2026-03-09T19:27:10.593 INFO:tasks.workunit.client.0.vm07.stdout:2/167: stat d3/d37 0
2026-03-09T19:27:10.593 INFO:tasks.workunit.client.1.vm08.stdout:6/417: fdatasync d3/d15/f40 0
2026-03-09T19:27:10.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:10 vm08.local ceph-mon[57794]: pgmap v159: 65 pgs: 65 active+clean; 1.1 GiB data, 4.8 GiB used, 115 GiB / 120 GiB avail; 21 MiB/s rd, 117 MiB/s wr, 302 op/s
2026-03-09T19:27:10.598 INFO:tasks.workunit.client.0.vm07.stdout:2/168: dwrite f0 [0,4194304] 0
2026-03-09T19:27:10.599 INFO:tasks.workunit.client.0.vm07.stdout:3/189: creat d1/d6/f37 x:0 0 0
2026-03-09T19:27:10.603 INFO:tasks.workunit.client.0.vm07.stdout:3/190: dread d1/d1f/f13 [4194304,4194304] 0
2026-03-09T19:27:10.603 INFO:tasks.workunit.client.0.vm07.stdout:3/191: write d1/d1f/d16/f30 [611359,87923] 0
2026-03-09T19:27:10.611 INFO:tasks.workunit.client.0.vm07.stdout:1/109: creat d1/d9/f26 x:0 0 0
2026-03-09T19:27:10.636 INFO:tasks.workunit.client.0.vm07.stdout:9/203: write d0/d6/f4c [554317,109995] 0
2026-03-09T19:27:10.644 INFO:tasks.workunit.client.1.vm08.stdout:7/478: dwrite d5/d16/d3a/f64 [0,4194304] 0
2026-03-09T19:27:10.650 INFO:tasks.workunit.client.1.vm08.stdout:7/479: chown d5/d14/d27/d78 2377334 1
2026-03-09T19:27:10.702
INFO:tasks.workunit.client.0.vm07.stdout:8/165: mknod d7/d16/c46 0 2026-03-09T19:27:10.703 INFO:tasks.workunit.client.0.vm07.stdout:8/166: write d7/d1d/f3d [609712,18275] 0 2026-03-09T19:27:10.716 INFO:tasks.workunit.client.0.vm07.stdout:2/169: mknod d3/dd/d16/d2f/c3f 0 2026-03-09T19:27:10.727 INFO:tasks.workunit.client.0.vm07.stdout:0/134: link d0/d6/d13/d1c/f9 d0/d6/d13/d1c/d11/f2e 0 2026-03-09T19:27:10.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:10 vm07.local ceph-mon[48545]: pgmap v159: 65 pgs: 65 active+clean; 1.1 GiB data, 4.8 GiB used, 115 GiB / 120 GiB avail; 21 MiB/s rd, 117 MiB/s wr, 302 op/s 2026-03-09T19:27:10.729 INFO:tasks.workunit.client.0.vm07.stdout:0/135: write d0/d6/d13/d1c/d11/f15 [3136064,3701] 0 2026-03-09T19:27:10.729 INFO:tasks.workunit.client.0.vm07.stdout:0/136: write d0/fa [163684,13793] 0 2026-03-09T19:27:10.730 INFO:tasks.workunit.client.0.vm07.stdout:0/137: stat d0/d6/d13/d1c 0 2026-03-09T19:27:10.733 INFO:tasks.workunit.client.0.vm07.stdout:0/138: dwrite d0/d6/d13/d17/d19/f1b [4194304,4194304] 0 2026-03-09T19:27:10.739 INFO:tasks.workunit.client.0.vm07.stdout:0/139: dwrite d0/fa [0,4194304] 0 2026-03-09T19:27:10.743 INFO:tasks.workunit.client.0.vm07.stdout:1/110: dwrite d1/d9/f16 [0,4194304] 0 2026-03-09T19:27:10.748 INFO:tasks.workunit.client.0.vm07.stdout:9/204: rmdir d0/db/d29/d2c/d36 39 2026-03-09T19:27:10.765 INFO:tasks.workunit.client.0.vm07.stdout:8/167: rmdir d7/d30/d32 39 2026-03-09T19:27:10.767 INFO:tasks.workunit.client.0.vm07.stdout:2/170: mkdir d3/dd/d16/d30/d40 0 2026-03-09T19:27:10.767 INFO:tasks.workunit.client.0.vm07.stdout:2/171: chown c1 568650428 1 2026-03-09T19:27:10.769 INFO:tasks.workunit.client.0.vm07.stdout:2/172: dread d3/dd/f34 [0,4194304] 0 2026-03-09T19:27:10.769 INFO:tasks.workunit.client.0.vm07.stdout:2/173: dread - d3/d11/f39 zero size 2026-03-09T19:27:10.769 INFO:tasks.workunit.client.1.vm08.stdout:8/429: write de/d1d/f97 [1565299,15939] 0 2026-03-09T19:27:10.770 
INFO:tasks.workunit.client.0.vm07.stdout:2/174: write d3/f27 [1084611,101832] 0 2026-03-09T19:27:10.783 INFO:tasks.workunit.client.1.vm08.stdout:4/423: truncate da/d10/d1b/f29 2583866 0 2026-03-09T19:27:10.784 INFO:tasks.workunit.client.0.vm07.stdout:0/140: symlink d0/d6/d13/d1c/l2f 0 2026-03-09T19:27:10.784 INFO:tasks.workunit.client.1.vm08.stdout:3/461: truncate d0/d8/f4a 3177455 0 2026-03-09T19:27:10.785 INFO:tasks.workunit.client.1.vm08.stdout:3/462: readlink d0/d6/de/d15/l48 0 2026-03-09T19:27:10.787 INFO:tasks.workunit.client.1.vm08.stdout:0/439: truncate dd/f19 1163309 0 2026-03-09T19:27:10.788 INFO:tasks.workunit.client.1.vm08.stdout:3/463: readlink d0/d6/de/d1a/l5e 0 2026-03-09T19:27:10.798 INFO:tasks.workunit.client.0.vm07.stdout:4/132: link d3/ld d3/d11/l23 0 2026-03-09T19:27:10.806 INFO:tasks.workunit.client.1.vm08.stdout:2/405: dwrite d3/d9/d79/f7d [0,4194304] 0 2026-03-09T19:27:10.807 INFO:tasks.workunit.client.1.vm08.stdout:2/406: stat d3/d4/f6 0 2026-03-09T19:27:10.809 INFO:tasks.workunit.client.1.vm08.stdout:2/407: dread d3/d9/f20 [0,4194304] 0 2026-03-09T19:27:10.811 INFO:tasks.workunit.client.1.vm08.stdout:2/408: dread - d3/d4/d23/d2c/f94 zero size 2026-03-09T19:27:10.816 INFO:tasks.workunit.client.1.vm08.stdout:2/409: dread d3/d4/d23/d2c/d39/d5e/d14/f44 [0,4194304] 0 2026-03-09T19:27:10.817 INFO:tasks.workunit.client.1.vm08.stdout:2/410: chown d3/l4c 159637680 1 2026-03-09T19:27:10.818 INFO:tasks.workunit.client.0.vm07.stdout:8/168: symlink d7/d9/d37/l47 0 2026-03-09T19:27:10.822 INFO:tasks.workunit.client.1.vm08.stdout:9/410: truncate d0/d2/d8/d7/f34 1026761 0 2026-03-09T19:27:10.837 INFO:tasks.workunit.client.0.vm07.stdout:9/205: mknod d0/db/d29/d4d/c4e 0 2026-03-09T19:27:10.840 INFO:tasks.workunit.client.1.vm08.stdout:6/418: rename d3/d68/f77 to d3/d34/d3b/f99 0 2026-03-09T19:27:10.840 INFO:tasks.workunit.client.0.vm07.stdout:4/133: write d3/f13 [1177244,84993] 0 2026-03-09T19:27:10.841 INFO:tasks.workunit.client.1.vm08.stdout:6/419: chown 
d3/d34/d6f 113401 1 2026-03-09T19:27:10.841 INFO:tasks.workunit.client.1.vm08.stdout:5/386: dread d16/d1e/f35 [0,4194304] 0 2026-03-09T19:27:10.846 INFO:tasks.workunit.client.1.vm08.stdout:5/387: read d16/d1e/f2c [839328,38066] 0 2026-03-09T19:27:10.854 INFO:tasks.workunit.client.0.vm07.stdout:3/192: getdents d1/d1f 0 2026-03-09T19:27:10.855 INFO:tasks.workunit.client.0.vm07.stdout:3/193: write d1/d1f/d16/d28/f34 [439152,77571] 0 2026-03-09T19:27:10.855 INFO:tasks.workunit.client.0.vm07.stdout:0/141: mknod d0/d6/d13/d1c/d11/c30 0 2026-03-09T19:27:10.858 INFO:tasks.workunit.client.0.vm07.stdout:0/142: dread d0/d6/d13/d17/d19/f1f [0,4194304] 0 2026-03-09T19:27:10.859 INFO:tasks.workunit.client.1.vm08.stdout:7/480: mknod d5/d14/d2b/d4b/ca2 0 2026-03-09T19:27:10.864 INFO:tasks.workunit.client.1.vm08.stdout:4/424: chown da/c76 285480 1 2026-03-09T19:27:10.865 INFO:tasks.workunit.client.0.vm07.stdout:8/169: creat d7/d9/d10/d44/f48 x:0 0 0 2026-03-09T19:27:10.865 INFO:tasks.workunit.client.0.vm07.stdout:8/170: truncate d7/d9/d10/f1b 1986013 0 2026-03-09T19:27:10.870 INFO:tasks.workunit.client.0.vm07.stdout:4/134: sync 2026-03-09T19:27:10.871 INFO:tasks.workunit.client.0.vm07.stdout:4/135: dread d3/fc [0,4194304] 0 2026-03-09T19:27:10.872 INFO:tasks.workunit.client.0.vm07.stdout:4/136: write d3/f13 [1834914,102583] 0 2026-03-09T19:27:10.880 INFO:tasks.workunit.client.1.vm08.stdout:3/464: creat d0/d6/de/d1b/d16/d17/f8c x:0 0 0 2026-03-09T19:27:10.897 INFO:tasks.workunit.client.1.vm08.stdout:8/430: rename de/d1d/d21/f30 to de/d1d/d4f/f9c 0 2026-03-09T19:27:10.901 INFO:tasks.workunit.client.0.vm07.stdout:2/175: link d3/dd/d16/d29/d2d/l35 d3/dd/d16/d29/d3c/l41 0 2026-03-09T19:27:10.901 INFO:tasks.workunit.client.1.vm08.stdout:6/420: symlink d3/db/l9a 0 2026-03-09T19:27:10.904 INFO:tasks.workunit.client.0.vm07.stdout:7/149: rmdir d0/d4/d5/d8 39 2026-03-09T19:27:10.908 INFO:tasks.workunit.client.1.vm08.stdout:7/481: dread - d5/d14/d38/f4c zero size 2026-03-09T19:27:10.914 
INFO:tasks.workunit.client.1.vm08.stdout:9/411: sync 2026-03-09T19:27:10.914 INFO:tasks.workunit.client.1.vm08.stdout:9/412: chown d0/d2/d80/d69 337527 1 2026-03-09T19:27:10.916 INFO:tasks.workunit.client.0.vm07.stdout:3/194: link d1/d6/f21 d1/d1f/f38 0 2026-03-09T19:27:10.917 INFO:tasks.workunit.client.0.vm07.stdout:3/195: write d1/d6/dd/f2b [879762,59545] 0 2026-03-09T19:27:10.924 INFO:tasks.workunit.client.0.vm07.stdout:6/132: truncate d0/d1/f19 8481036 0 2026-03-09T19:27:10.929 INFO:tasks.workunit.client.0.vm07.stdout:5/121: creat d3/f2f x:0 0 0 2026-03-09T19:27:10.940 INFO:tasks.workunit.client.0.vm07.stdout:5/122: dread f2 [0,4194304] 0 2026-03-09T19:27:10.940 INFO:tasks.workunit.client.0.vm07.stdout:8/171: rename d7/d1d/c2c to d7/d9/d10/d44/c49 0 2026-03-09T19:27:10.945 INFO:tasks.workunit.client.0.vm07.stdout:2/176: creat d3/dd/d16/d29/f42 x:0 0 0 2026-03-09T19:27:10.950 INFO:tasks.workunit.client.1.vm08.stdout:2/411: rename d3/d4/d3e/d4e/l63 to d3/d9/d26/l95 0 2026-03-09T19:27:10.953 INFO:tasks.workunit.client.1.vm08.stdout:2/412: read - d3/d4/d23/d2c/d39/d5e/de/d8b/f7e zero size 2026-03-09T19:27:10.960 INFO:tasks.workunit.client.1.vm08.stdout:4/425: truncate da/f21 519478 0 2026-03-09T19:27:10.962 INFO:tasks.workunit.client.0.vm07.stdout:6/133: write d0/d1/db/d1d/f2e [530518,14395] 0 2026-03-09T19:27:10.962 INFO:tasks.workunit.client.1.vm08.stdout:4/426: fdatasync da/d10/d26/d38/f57 0 2026-03-09T19:27:10.964 INFO:tasks.workunit.client.0.vm07.stdout:0/143: creat d0/d6/d13/f31 x:0 0 0 2026-03-09T19:27:10.964 INFO:tasks.workunit.client.1.vm08.stdout:9/413: mknod d0/d2/d80/d69/c89 0 2026-03-09T19:27:10.964 INFO:tasks.workunit.client.1.vm08.stdout:5/388: rename d16/d1e to d16/d1e/d7b 22 2026-03-09T19:27:10.966 INFO:tasks.workunit.client.0.vm07.stdout:8/172: creat d7/d9/d10/d44/f4a x:0 0 0 2026-03-09T19:27:10.967 INFO:tasks.workunit.client.0.vm07.stdout:8/173: fdatasync d7/d9/d37/f3b 0 2026-03-09T19:27:10.972 INFO:tasks.workunit.client.1.vm08.stdout:6/421: 
dread d3/f32 [0,4194304] 0 2026-03-09T19:27:10.974 INFO:tasks.workunit.client.0.vm07.stdout:5/123: mknod d3/d1a/d28/c30 0 2026-03-09T19:27:10.976 INFO:tasks.workunit.client.0.vm07.stdout:2/177: mknod d3/dd/d16/d2f/c43 0 2026-03-09T19:27:10.977 INFO:tasks.workunit.client.1.vm08.stdout:9/414: dwrite d0/d2/d8/d7/d48/d6f/f84 [0,4194304] 0 2026-03-09T19:27:10.986 INFO:tasks.workunit.client.0.vm07.stdout:1/111: dread d1/f6 [0,4194304] 0 2026-03-09T19:27:10.999 INFO:tasks.workunit.client.0.vm07.stdout:4/137: link d3/fc d3/d11/d1b/d20/d22/f24 0 2026-03-09T19:27:11.006 INFO:tasks.workunit.client.1.vm08.stdout:5/389: mknod d16/d1e/d3b/d61/c7c 0 2026-03-09T19:27:11.007 INFO:tasks.workunit.client.1.vm08.stdout:6/422: symlink d3/d68/l9b 0 2026-03-09T19:27:11.007 INFO:tasks.workunit.client.1.vm08.stdout:5/390: chown d16/d1e 368 1 2026-03-09T19:27:11.007 INFO:tasks.workunit.client.1.vm08.stdout:6/423: stat d3/d34 0 2026-03-09T19:27:11.008 INFO:tasks.workunit.client.0.vm07.stdout:8/174: mknod d7/d30/d32/c4b 0 2026-03-09T19:27:11.008 INFO:tasks.workunit.client.1.vm08.stdout:6/424: write d3/d15/f7b [1454436,101497] 0 2026-03-09T19:27:11.009 INFO:tasks.workunit.client.0.vm07.stdout:1/112: readlink d1/d9/l1e 0 2026-03-09T19:27:11.014 INFO:tasks.workunit.client.0.vm07.stdout:6/134: creat d0/d13/d1e/d30/d31/f3a x:0 0 0 2026-03-09T19:27:11.020 INFO:tasks.workunit.client.1.vm08.stdout:0/440: dread dd/f19 [0,4194304] 0 2026-03-09T19:27:11.026 INFO:tasks.workunit.client.1.vm08.stdout:3/465: mknod d0/d6/c8d 0 2026-03-09T19:27:11.027 INFO:tasks.workunit.client.1.vm08.stdout:3/466: fsync d0/d52/f8a 0 2026-03-09T19:27:11.029 INFO:tasks.workunit.client.1.vm08.stdout:3/467: write d0/d6/de/d1b/d16/f7b [43629,106102] 0 2026-03-09T19:27:11.039 INFO:tasks.workunit.client.0.vm07.stdout:2/178: mkdir d3/d37/d3b/d44 0 2026-03-09T19:27:11.043 INFO:tasks.workunit.client.1.vm08.stdout:2/413: mknod d3/d4/d23/d2c/c96 0 2026-03-09T19:27:11.044 INFO:tasks.workunit.client.1.vm08.stdout:2/414: write d3/d9/d79/f6b 
[360816,51831] 0 2026-03-09T19:27:11.056 INFO:tasks.workunit.client.0.vm07.stdout:7/150: rmdir d0/d4/d5/d8/d1a/d2a/d2e 0 2026-03-09T19:27:11.064 INFO:tasks.workunit.client.1.vm08.stdout:7/482: rename d5/d16/d1c/c36 to d5/d16/d1c/d73/ca3 0 2026-03-09T19:27:11.067 INFO:tasks.workunit.client.0.vm07.stdout:9/206: write d0/db/d29/d2c/f4a [4942785,67904] 0 2026-03-09T19:27:11.068 INFO:tasks.workunit.client.0.vm07.stdout:2/179: chown d3/dd/d16/c21 119778544 1 2026-03-09T19:27:11.068 INFO:tasks.workunit.client.0.vm07.stdout:2/180: chown d3/d11/f2e 25986 1 2026-03-09T19:27:11.070 INFO:tasks.workunit.client.1.vm08.stdout:1/550: dwrite d9/da/d2c/d6a/f9c [0,4194304] 0 2026-03-09T19:27:11.077 INFO:tasks.workunit.client.1.vm08.stdout:1/551: write d9/d40/f92 [108561,29680] 0 2026-03-09T19:27:11.078 INFO:tasks.workunit.client.1.vm08.stdout:7/483: dwrite d5/d16/d3a/d42/d6a/f7d [0,4194304] 0 2026-03-09T19:27:11.087 INFO:tasks.workunit.client.0.vm07.stdout:8/175: getdents d7/d9/d37/d45 0 2026-03-09T19:27:11.087 INFO:tasks.workunit.client.0.vm07.stdout:8/176: dread - d7/d9/f36 zero size 2026-03-09T19:27:11.087 INFO:tasks.workunit.client.1.vm08.stdout:1/552: dwrite d9/da/d2c/d6a/f9c [0,4194304] 0 2026-03-09T19:27:11.097 INFO:tasks.workunit.client.0.vm07.stdout:9/207: creat d0/d17/f4f x:0 0 0 2026-03-09T19:27:11.100 INFO:tasks.workunit.client.0.vm07.stdout:6/135: creat d0/f3b x:0 0 0 2026-03-09T19:27:11.110 INFO:tasks.workunit.client.1.vm08.stdout:8/431: getdents de/d1d/d21/d73 0 2026-03-09T19:27:11.113 INFO:tasks.workunit.client.0.vm07.stdout:8/177: sync 2026-03-09T19:27:11.115 INFO:tasks.workunit.client.0.vm07.stdout:8/178: write d7/d30/f3e [376209,106370] 0 2026-03-09T19:27:11.133 INFO:tasks.workunit.client.1.vm08.stdout:2/415: mkdir d3/d4/d23/d2c/d39/d5e/de/d18/d1f/d97 0 2026-03-09T19:27:11.138 INFO:tasks.workunit.client.0.vm07.stdout:0/144: rename d0/d6/d13/d1c/l2f to d0/l32 0 2026-03-09T19:27:11.146 INFO:tasks.workunit.client.0.vm07.stdout:6/136: link d0/d1/db/f15 
d0/d13/d1e/d30/d31/f3c 0 2026-03-09T19:27:11.151 INFO:tasks.workunit.client.0.vm07.stdout:3/196: creat d1/d1f/d16/f39 x:0 0 0 2026-03-09T19:27:11.153 INFO:tasks.workunit.client.0.vm07.stdout:3/197: dread d1/f7 [0,4194304] 0 2026-03-09T19:27:11.157 INFO:tasks.workunit.client.0.vm07.stdout:8/179: creat d7/d9/f4c x:0 0 0 2026-03-09T19:27:11.167 INFO:tasks.workunit.client.0.vm07.stdout:6/137: symlink d0/d13/d1e/d30/d31/l3d 0 2026-03-09T19:27:11.175 INFO:tasks.workunit.client.0.vm07.stdout:3/198: dwrite d1/d6/fb [4194304,4194304] 0 2026-03-09T19:27:11.175 INFO:tasks.workunit.client.1.vm08.stdout:1/553: symlink d9/da/d17/d60/la6 0 2026-03-09T19:27:11.176 INFO:tasks.workunit.client.1.vm08.stdout:4/427: write da/d10/f1f [833883,55333] 0 2026-03-09T19:27:11.177 INFO:tasks.workunit.client.0.vm07.stdout:8/180: write d7/d16/d1e/f33 [1773862,33949] 0 2026-03-09T19:27:11.181 INFO:tasks.workunit.client.0.vm07.stdout:6/138: sync 2026-03-09T19:27:11.184 INFO:tasks.workunit.client.1.vm08.stdout:4/428: dread da/d10/d26/d27/d32/f45 [4194304,4194304] 0 2026-03-09T19:27:11.184 INFO:tasks.workunit.client.1.vm08.stdout:0/441: write dd/f19 [1598594,94854] 0 2026-03-09T19:27:11.184 INFO:tasks.workunit.client.1.vm08.stdout:9/415: creat d0/d1b/f8a x:0 0 0 2026-03-09T19:27:11.192 INFO:tasks.workunit.client.0.vm07.stdout:0/145: mkdir d0/d6/d13/d33 0 2026-03-09T19:27:11.201 INFO:tasks.workunit.client.0.vm07.stdout:3/199: truncate d1/d1f/f36 1009726 0 2026-03-09T19:27:11.211 INFO:tasks.workunit.client.1.vm08.stdout:2/416: creat d3/d9/d79/f98 x:0 0 0 2026-03-09T19:27:11.211 INFO:tasks.workunit.client.1.vm08.stdout:7/484: creat d5/d16/d3a/d42/d85/da0/fa4 x:0 0 0 2026-03-09T19:27:11.212 INFO:tasks.workunit.client.0.vm07.stdout:4/138: dwrite d3/f8 [0,4194304] 0 2026-03-09T19:27:11.212 INFO:tasks.workunit.client.0.vm07.stdout:5/124: truncate d3/f25 3104472 0 2026-03-09T19:27:11.212 INFO:tasks.workunit.client.0.vm07.stdout:5/125: truncate d3/d1a/d28/f2e 679707 0 2026-03-09T19:27:11.212 
INFO:tasks.workunit.client.0.vm07.stdout:5/126: readlink d3/d1a/l11 0 2026-03-09T19:27:11.212 INFO:tasks.workunit.client.0.vm07.stdout:5/127: chown d3/dd/f22 551034362 1 2026-03-09T19:27:11.212 INFO:tasks.workunit.client.0.vm07.stdout:8/181: rmdir d7/d9/d37/d34 39 2026-03-09T19:27:11.214 INFO:tasks.workunit.client.0.vm07.stdout:6/139: creat d0/d1/db/d1d/f3e x:0 0 0 2026-03-09T19:27:11.214 INFO:tasks.workunit.client.1.vm08.stdout:5/391: write d16/d1e/d3b/f5e [994704,93452] 0 2026-03-09T19:27:11.216 INFO:tasks.workunit.client.0.vm07.stdout:7/151: write d0/d4/d5/dd/f16 [86676,103504] 0 2026-03-09T19:27:11.220 INFO:tasks.workunit.client.0.vm07.stdout:1/113: rename d1/d3/la to d1/l27 0 2026-03-09T19:27:11.222 INFO:tasks.workunit.client.1.vm08.stdout:6/425: link d3/d34/d5c/l74 d3/d34/d3b/l9c 0 2026-03-09T19:27:11.222 INFO:tasks.workunit.client.0.vm07.stdout:0/146: read d0/d6/d13/d1c/f9 [300025,47933] 0 2026-03-09T19:27:11.223 INFO:tasks.workunit.client.0.vm07.stdout:0/147: fdatasync d0/d6/d13/d17/d19/f1b 0 2026-03-09T19:27:11.223 INFO:tasks.workunit.client.1.vm08.stdout:6/426: write d3/d34/f37 [873938,6895] 0 2026-03-09T19:27:11.225 INFO:tasks.workunit.client.0.vm07.stdout:9/208: write d0/db/f1d [4817522,16948] 0 2026-03-09T19:27:11.229 INFO:tasks.workunit.client.1.vm08.stdout:1/554: creat d9/da/d12/d39/fa7 x:0 0 0 2026-03-09T19:27:11.241 INFO:tasks.workunit.client.0.vm07.stdout:4/139: creat d3/d11/d1b/f25 x:0 0 0 2026-03-09T19:27:11.243 INFO:tasks.workunit.client.1.vm08.stdout:9/416: rename d0/d2/d14/d5c/l70 to d0/d2/d8/d7/d48/l8b 0 2026-03-09T19:27:11.249 INFO:tasks.workunit.client.0.vm07.stdout:4/140: dread d3/d11/f18 [0,4194304] 0 2026-03-09T19:27:11.256 INFO:tasks.workunit.client.1.vm08.stdout:3/468: write d0/d8/d24/f2d [549290,39183] 0 2026-03-09T19:27:11.265 INFO:tasks.workunit.client.0.vm07.stdout:5/128: mknod d3/dd/c31 0 2026-03-09T19:27:11.268 INFO:tasks.workunit.client.0.vm07.stdout:8/182: creat d7/d1d/f4d x:0 0 0 2026-03-09T19:27:11.271 
INFO:tasks.workunit.client.0.vm07.stdout:6/140: chown d0/d1/db/d1d/c29 44 1 2026-03-09T19:27:11.273 INFO:tasks.workunit.client.0.vm07.stdout:7/152: creat d0/d4/d5/d8/f37 x:0 0 0 2026-03-09T19:27:11.273 INFO:tasks.workunit.client.1.vm08.stdout:4/429: dwrite da/d10/d16/d28/d46/d52/d6e/d2c/f4a [0,4194304] 0 2026-03-09T19:27:11.273 INFO:tasks.workunit.client.0.vm07.stdout:7/153: fsync d0/d4/d5/d8/d1a/f1d 0 2026-03-09T19:27:11.275 INFO:tasks.workunit.client.1.vm08.stdout:8/432: creat de/d91/f9d x:0 0 0 2026-03-09T19:27:11.276 INFO:tasks.workunit.client.0.vm07.stdout:2/181: rename d3/d37 to d3/dd/d16/d29/d2d/d45 0 2026-03-09T19:27:11.277 INFO:tasks.workunit.client.0.vm07.stdout:7/154: dread d0/d4/d5/d8/d1a/f1d [0,4194304] 0 2026-03-09T19:27:11.278 INFO:tasks.workunit.client.0.vm07.stdout:7/155: truncate d0/d4/d5/dd/f16 752900 0 2026-03-09T19:27:11.280 INFO:tasks.workunit.client.0.vm07.stdout:7/156: write d0/d4/d5/d8/d1a/d2a/f34 [654412,118535] 0 2026-03-09T19:27:11.290 INFO:tasks.workunit.client.0.vm07.stdout:0/148: creat d0/d6/d13/d17/d19/f34 x:0 0 0 2026-03-09T19:27:11.294 INFO:tasks.workunit.client.0.vm07.stdout:9/209: symlink d0/d17/l50 0 2026-03-09T19:27:11.295 INFO:tasks.workunit.client.1.vm08.stdout:1/555: rmdir d9/da/d17 39 2026-03-09T19:27:11.303 INFO:tasks.workunit.client.0.vm07.stdout:3/200: link d1/d6/f19 d1/d1f/d16/f3a 0 2026-03-09T19:27:11.309 INFO:tasks.workunit.client.1.vm08.stdout:1/556: dread d9/da/d53/d67/f77 [0,4194304] 0 2026-03-09T19:27:11.310 INFO:tasks.workunit.client.1.vm08.stdout:6/427: write d3/f12 [3175367,98761] 0 2026-03-09T19:27:11.313 INFO:tasks.workunit.client.1.vm08.stdout:7/485: rename d5/d14/d38/f3b to d5/d14/d27/d54/fa5 0 2026-03-09T19:27:11.316 INFO:tasks.workunit.client.1.vm08.stdout:7/486: dread d5/d14/d2b/f30 [0,4194304] 0 2026-03-09T19:27:11.319 INFO:tasks.workunit.client.1.vm08.stdout:7/487: write d5/d16/d3a/d42/f65 [4380988,29022] 0 2026-03-09T19:27:11.319 INFO:tasks.workunit.client.1.vm08.stdout:1/557: dwrite d9/d11/f5f 
[0,4194304] 0 2026-03-09T19:27:11.330 INFO:tasks.workunit.client.1.vm08.stdout:9/417: dread d0/d2/f21 [0,4194304] 0 2026-03-09T19:27:11.330 INFO:tasks.workunit.client.1.vm08.stdout:9/418: chown d0/d2/d8/f61 76 1 2026-03-09T19:27:11.331 INFO:tasks.workunit.client.1.vm08.stdout:9/419: dread - d0/d2/d80/d69/f7a zero size 2026-03-09T19:27:11.332 INFO:tasks.workunit.client.0.vm07.stdout:4/141: creat d3/d11/d1b/d20/f26 x:0 0 0 2026-03-09T19:27:11.347 INFO:tasks.workunit.client.1.vm08.stdout:2/417: truncate d3/d9/d79/d46/d8c/f90 994956 0 2026-03-09T19:27:11.350 INFO:tasks.workunit.client.0.vm07.stdout:6/141: rmdir d0/d1 39 2026-03-09T19:27:11.351 INFO:tasks.workunit.client.0.vm07.stdout:6/142: write d0/fe [1031769,20838] 0 2026-03-09T19:27:11.352 INFO:tasks.workunit.client.0.vm07.stdout:6/143: write d0/d13/d1e/f34 [453721,69777] 0 2026-03-09T19:27:11.354 INFO:tasks.workunit.client.1.vm08.stdout:3/469: write d0/d8/f66 [1414740,5862] 0 2026-03-09T19:27:11.369 INFO:tasks.workunit.client.0.vm07.stdout:7/157: mkdir d0/d4/d5/d26/d38 0 2026-03-09T19:27:11.371 INFO:tasks.workunit.client.1.vm08.stdout:8/433: rename de/d25/f42 to de/d25/d33/d46/f9e 0 2026-03-09T19:27:11.375 INFO:tasks.workunit.client.1.vm08.stdout:7/488: mknod d5/d14/d27/ca6 0 2026-03-09T19:27:11.376 INFO:tasks.workunit.client.1.vm08.stdout:7/489: chown d5/d16/d3a/d42/d6a 52 1 2026-03-09T19:27:11.378 INFO:tasks.workunit.client.1.vm08.stdout:1/558: truncate d9/da/d2d/f3d 5096844 0 2026-03-09T19:27:11.379 INFO:tasks.workunit.client.0.vm07.stdout:3/201: dread d1/fe [0,4194304] 0 2026-03-09T19:27:11.381 INFO:tasks.workunit.client.0.vm07.stdout:4/142: mknod d3/d11/c27 0 2026-03-09T19:27:11.384 INFO:tasks.workunit.client.0.vm07.stdout:5/129: link d3/fe d3/dd/d26/d2d/f32 0 2026-03-09T19:27:11.385 INFO:tasks.workunit.client.0.vm07.stdout:5/130: write d3/dd/f22 [209387,15710] 0 2026-03-09T19:27:11.394 INFO:tasks.workunit.client.1.vm08.stdout:2/418: rmdir d3/d4/d23/d2c 39 2026-03-09T19:27:11.395 
INFO:tasks.workunit.client.0.vm07.stdout:5/131: dread d3/d1a/fa [0,4194304] 0 2026-03-09T19:27:11.397 INFO:tasks.workunit.client.0.vm07.stdout:5/132: read d3/dd/d26/d2d/f32 [2742137,31131] 0 2026-03-09T19:27:11.397 INFO:tasks.workunit.client.1.vm08.stdout:7/490: dread d5/d14/f59 [0,4194304] 0 2026-03-09T19:27:11.402 INFO:tasks.workunit.client.1.vm08.stdout:5/392: creat d16/d1e/f7d x:0 0 0 2026-03-09T19:27:11.403 INFO:tasks.workunit.client.1.vm08.stdout:5/393: chown d16/d1e/f44 2295128 1 2026-03-09T19:27:11.405 INFO:tasks.workunit.client.0.vm07.stdout:8/183: truncate d7/d16/d1e/f33 1729301 0 2026-03-09T19:27:11.407 INFO:tasks.workunit.client.0.vm07.stdout:6/144: fsync d0/d1/db/d17/f2f 0 2026-03-09T19:27:11.407 INFO:tasks.workunit.client.1.vm08.stdout:3/470: symlink d0/d6/de/d1b/l8e 0 2026-03-09T19:27:11.407 INFO:tasks.workunit.client.1.vm08.stdout:4/430: dwrite da/d10/d1b/f29 [0,4194304] 0 2026-03-09T19:27:11.408 INFO:tasks.workunit.client.0.vm07.stdout:2/182: truncate d3/ff 1243980 0 2026-03-09T19:27:11.412 INFO:tasks.workunit.client.0.vm07.stdout:7/158: fdatasync d0/d4/f33 0 2026-03-09T19:27:11.413 INFO:tasks.workunit.client.0.vm07.stdout:6/145: dread d0/d13/d1e/f34 [0,4194304] 0 2026-03-09T19:27:11.421 INFO:tasks.workunit.client.0.vm07.stdout:6/146: sync 2026-03-09T19:27:11.422 INFO:tasks.workunit.client.0.vm07.stdout:7/159: sync 2026-03-09T19:27:11.425 INFO:tasks.workunit.client.0.vm07.stdout:9/210: mknod d0/db/d29/d2c/c51 0 2026-03-09T19:27:11.425 INFO:tasks.workunit.client.0.vm07.stdout:0/149: creat d0/d6/d13/d33/f35 x:0 0 0 2026-03-09T19:27:11.426 INFO:tasks.workunit.client.0.vm07.stdout:0/150: chown d0/fa 10 1 2026-03-09T19:27:11.426 INFO:tasks.workunit.client.0.vm07.stdout:9/211: write d0/d6/fa [4325534,16554] 0 2026-03-09T19:27:11.427 INFO:tasks.workunit.client.0.vm07.stdout:9/212: rename d0/d6/d3a to d0/d6/d3a/d52 22 2026-03-09T19:27:11.429 INFO:tasks.workunit.client.1.vm08.stdout:9/420: rename d0/d2/d80/d72 to d0/d1b/d68/d7f/d8c 0 2026-03-09T19:27:11.429 
INFO:tasks.workunit.client.1.vm08.stdout:9/421: chown d0/d2/d14/d5c/d32/f85 480871246 1 2026-03-09T19:27:11.431 INFO:tasks.workunit.client.1.vm08.stdout:6/428: dwrite d3/db/f42 [4194304,4194304] 0 2026-03-09T19:27:11.433 INFO:tasks.workunit.client.0.vm07.stdout:3/202: creat d1/d6/dd/f3b x:0 0 0 2026-03-09T19:27:11.460 INFO:tasks.workunit.client.1.vm08.stdout:8/434: dread de/d1d/f1e [0,4194304] 0 2026-03-09T19:27:11.465 INFO:tasks.workunit.client.1.vm08.stdout:1/559: rmdir d9/da/dc 39 2026-03-09T19:27:11.469 INFO:tasks.workunit.client.1.vm08.stdout:0/442: getdents dd/d22/d24 0 2026-03-09T19:27:11.469 INFO:tasks.workunit.client.1.vm08.stdout:0/443: chown dd/d22/d24/d49/l61 96649 1 2026-03-09T19:27:11.469 INFO:tasks.workunit.client.1.vm08.stdout:0/444: write dd/d22/d27/d6c/f7f [434201,64837] 0 2026-03-09T19:27:11.480 INFO:tasks.workunit.client.1.vm08.stdout:7/491: unlink d5/d14/d2b/d5d/f94 0 2026-03-09T19:27:11.482 INFO:tasks.workunit.client.1.vm08.stdout:5/394: mknod d16/d1e/d3b/d61/c7e 0 2026-03-09T19:27:11.485 INFO:tasks.workunit.client.1.vm08.stdout:4/431: creat da/d10/d16/d28/d2f/d4f/f83 x:0 0 0 2026-03-09T19:27:11.486 INFO:tasks.workunit.client.0.vm07.stdout:8/184: dwrite d7/d9/f36 [0,4194304] 0 2026-03-09T19:27:11.492 INFO:tasks.workunit.client.0.vm07.stdout:2/183: symlink d3/dd/d16/d29/d2d/d45/l46 0 2026-03-09T19:27:11.498 INFO:tasks.workunit.client.0.vm07.stdout:1/114: getdents d1/d11 0 2026-03-09T19:27:11.499 INFO:tasks.workunit.client.0.vm07.stdout:6/147: creat d0/d13/f3f x:0 0 0 2026-03-09T19:27:11.500 INFO:tasks.workunit.client.0.vm07.stdout:6/148: chown d0/d1/db/d1d/f22 2177407 1 2026-03-09T19:27:11.501 INFO:tasks.workunit.client.0.vm07.stdout:7/160: chown d0/c2f 201 1 2026-03-09T19:27:11.502 INFO:tasks.workunit.client.0.vm07.stdout:2/184: dwrite d3/dd/d16/d29/f42 [0,4194304] 0 2026-03-09T19:27:11.509 INFO:tasks.workunit.client.1.vm08.stdout:6/429: rename d3/d15/c21 to d3/d34/d5c/c9d 0 2026-03-09T19:27:11.515 
INFO:tasks.workunit.client.1.vm08.stdout:3/471: dwrite d0/d6/de/d15/f53 [0,4194304] 0 2026-03-09T19:27:11.518 INFO:tasks.workunit.client.0.vm07.stdout:7/161: sync 2026-03-09T19:27:11.524 INFO:tasks.workunit.client.0.vm07.stdout:7/162: dwrite d0/d4/d5/d8/d1a/f1d [0,4194304] 0 2026-03-09T19:27:11.526 INFO:tasks.workunit.client.1.vm08.stdout:8/435: read - de/d1d/d2e/f56 zero size 2026-03-09T19:27:11.546 INFO:tasks.workunit.client.1.vm08.stdout:1/560: rmdir d9/da/d2d 39 2026-03-09T19:27:11.547 INFO:tasks.workunit.client.0.vm07.stdout:3/203: creat d1/d1f/d16/d28/f3c x:0 0 0 2026-03-09T19:27:11.548 INFO:tasks.workunit.client.0.vm07.stdout:3/204: stat d1/d1f/d16/d28/f3c 0 2026-03-09T19:27:11.552 INFO:tasks.workunit.client.0.vm07.stdout:4/143: link d3/f7 d3/d11/d1b/f28 0 2026-03-09T19:27:11.562 INFO:tasks.workunit.client.1.vm08.stdout:0/445: dwrite dd/d22/d63/d6e/f80 [0,4194304] 0 2026-03-09T19:27:11.563 INFO:tasks.workunit.client.1.vm08.stdout:2/419: mkdir d3/d4/d23/d2c/d39/d5e/de/d18/d99 0 2026-03-09T19:27:11.563 INFO:tasks.workunit.client.1.vm08.stdout:0/446: dread - dd/d22/d27/d6c/f85 zero size 2026-03-09T19:27:11.572 INFO:tasks.workunit.client.0.vm07.stdout:9/213: write d0/f4 [368135,89579] 0 2026-03-09T19:27:11.573 INFO:tasks.workunit.client.0.vm07.stdout:9/214: chown d0/d17/c38 883 1 2026-03-09T19:27:11.575 INFO:tasks.workunit.client.0.vm07.stdout:0/151: dwrite d0/d6/d13/d1c/f9 [0,4194304] 0 2026-03-09T19:27:11.576 INFO:tasks.workunit.client.1.vm08.stdout:7/492: symlink d5/d16/d3a/d42/d6a/la7 0 2026-03-09T19:27:11.576 INFO:tasks.workunit.client.0.vm07.stdout:5/133: mknod d3/dd/d26/d2d/c33 0 2026-03-09T19:27:11.582 INFO:tasks.workunit.client.1.vm08.stdout:5/395: unlink lb 0 2026-03-09T19:27:11.588 INFO:tasks.workunit.client.1.vm08.stdout:4/432: mkdir da/d10/d16/d28/d2f/d4f/d64/d84 0 2026-03-09T19:27:11.595 INFO:tasks.workunit.client.1.vm08.stdout:3/472: creat d0/d52/d7c/f8f x:0 0 0 2026-03-09T19:27:11.601 INFO:tasks.workunit.client.1.vm08.stdout:3/473: dwrite 
d0/d6/de/d1b/d16/d17/f3f [4194304,4194304] 0
2026-03-09T19:27:11.603 INFO:tasks.workunit.client.0.vm07.stdout:2/185: rename d3/dd/d16/d29/c2a to d3/dd/d16/d29/d3c/c47 0
2026-03-09T19:27:11.603 INFO:tasks.workunit.client.0.vm07.stdout:9/215: rename d0 to d0/db/d29/d53 22
2026-03-09T19:27:11.604 INFO:tasks.workunit.client.1.vm08.stdout:3/474: write d0/d6/de/d1b/d16/d17/f1d [5824304,122814] 0
2026-03-09T19:27:11.607 INFO:tasks.workunit.client.1.vm08.stdout:3/475: dread - d0/d52/d6d/f8b zero size
2026-03-09T19:27:11.608 INFO:tasks.workunit.client.1.vm08.stdout:3/476: chown d0/d8/d19/c34 34 1
2026-03-09T19:27:11.622 INFO:tasks.workunit.client.1.vm08.stdout:7/493: unlink d5/d16/f28 0
2026-03-09T19:27:11.647 INFO:tasks.workunit.client.0.vm07.stdout:7/163: dwrite d0/f25 [0,4194304] 0
2026-03-09T19:27:11.647 INFO:tasks.workunit.client.0.vm07.stdout:4/144: mkdir d3/d11/d29 0
2026-03-09T19:27:11.647 INFO:tasks.workunit.client.0.vm07.stdout:7/164: dwrite d0/d4/d5/d8/d1a/d2a/f34 [0,4194304] 0
2026-03-09T19:27:11.647 INFO:tasks.workunit.client.0.vm07.stdout:7/165: read - d0/d4/d5/f36 zero size
2026-03-09T19:27:11.647 INFO:tasks.workunit.client.0.vm07.stdout:7/166: chown d0/d4/d5/d8/c27 11053078 1
2026-03-09T19:27:11.647 INFO:tasks.workunit.client.0.vm07.stdout:1/115: mknod d1/c28 0
2026-03-09T19:27:11.647 INFO:tasks.workunit.client.0.vm07.stdout:8/185: creat d7/d9/d37/d45/f4e x:0 0 0
2026-03-09T19:27:11.647 INFO:tasks.workunit.client.0.vm07.stdout:3/205: mkdir d1/d3d 0
2026-03-09T19:27:11.647 INFO:tasks.workunit.client.0.vm07.stdout:4/145: fsync d3/d11/f18 0
2026-03-09T19:27:11.654 INFO:tasks.workunit.client.0.vm07.stdout:6/149: link d0/d1/db/c2c d0/d13/d1e/d30/c40 0
2026-03-09T19:27:11.659 INFO:tasks.workunit.client.1.vm08.stdout:5/396: stat d16/d45/f55 0
2026-03-09T19:27:11.660 INFO:tasks.workunit.client.1.vm08.stdout:5/397: fsync d16/d1e/d3b/f50 0
2026-03-09T19:27:11.666 INFO:tasks.workunit.client.0.vm07.stdout:3/206: rmdir d1/d6 39
2026-03-09T19:27:11.668 INFO:tasks.workunit.client.1.vm08.stdout:9/422: creat d0/d1b/f8d x:0 0 0
2026-03-09T19:27:11.668 INFO:tasks.workunit.client.0.vm07.stdout:3/207: fdatasync d1/f22 0
2026-03-09T19:27:11.669 INFO:tasks.workunit.client.1.vm08.stdout:6/430: fsync d3/f2a 0
2026-03-09T19:27:11.671 INFO:tasks.workunit.client.0.vm07.stdout:4/146: rename d3/d11/l17 to d3/d11/l2a 0
2026-03-09T19:27:11.676 INFO:tasks.workunit.client.0.vm07.stdout:0/152: creat d0/d6/d13/d1c/f36 x:0 0 0
2026-03-09T19:27:11.683 INFO:tasks.workunit.client.0.vm07.stdout:6/150: mkdir d0/d1/db/d24/d41 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.1.vm08.stdout:0/447: creat dd/d7e/f8e x:0 0 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.1.vm08.stdout:4/433: link da/d10/d16/d28/d46/d52/d6e/d2c/f4a da/d10/d1b/f85 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.1.vm08.stdout:4/434: chown da/d10/d1b/f79 11882813 1
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.1.vm08.stdout:6/431: symlink d3/d68/l9e 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.1.vm08.stdout:1/561: link d9/da/d12/d39/l6e d9/da/d12/d39/la8 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.1.vm08.stdout:7/494: mknod d5/ca8 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.1.vm08.stdout:5/398: unlink d16/d1e/d30/f76 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.0.vm07.stdout:4/147: mkdir d3/d11/d2b 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.0.vm07.stdout:0/153: mknod d0/d6/d13/d17/c37 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.0.vm07.stdout:1/116: link d1/d3/cd d1/d3/d21/c29 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.0.vm07.stdout:1/117: read d1/d3/f12 [571109,32160] 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.0.vm07.stdout:6/151: write d0/d13/d1e/f34 [1303471,27338] 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.0.vm07.stdout:1/118: write d1/d9/f15 [4543995,29858] 0
2026-03-09T19:27:11.714 INFO:tasks.workunit.client.0.vm07.stdout:1/119: read d1/d3/f12 [250375,35486] 0
2026-03-09T19:27:11.717 INFO:tasks.workunit.client.1.vm08.stdout:9/423: unlink d0/fa 0
2026-03-09T19:27:11.720 INFO:tasks.workunit.client.0.vm07.stdout:1/120: symlink d1/d3/d21/l2a 0
2026-03-09T19:27:11.728 INFO:tasks.workunit.client.0.vm07.stdout:6/152: link d0/d1/db/d1d/l21 d0/d2d/l42 0
2026-03-09T19:27:11.728 INFO:tasks.workunit.client.0.vm07.stdout:1/121: symlink d1/d3/d21/l2b 0
2026-03-09T19:27:11.728 INFO:tasks.workunit.client.0.vm07.stdout:0/154: getdents d0/d6/d13/d17/d19 0
2026-03-09T19:27:11.730 INFO:tasks.workunit.client.0.vm07.stdout:5/134: sync
2026-03-09T19:27:11.743 INFO:tasks.workunit.client.1.vm08.stdout:9/424: unlink d0/d2/f6c 0
2026-03-09T19:27:11.743 INFO:tasks.workunit.client.1.vm08.stdout:4/435: creat da/d10/d16/d28/d2f/d4f/d64/d81/f86 x:0 0 0
2026-03-09T19:27:11.743 INFO:tasks.workunit.client.0.vm07.stdout:6/153: creat d0/d1/db/f43 x:0 0 0
2026-03-09T19:27:11.743 INFO:tasks.workunit.client.0.vm07.stdout:0/155: creat d0/d6/f38 x:0 0 0
2026-03-09T19:27:11.743 INFO:tasks.workunit.client.0.vm07.stdout:7/167: sync
2026-03-09T19:27:11.744 INFO:tasks.workunit.client.0.vm07.stdout:5/135: mknod d3/d1a/d28/c34 0
2026-03-09T19:27:11.744 INFO:tasks.workunit.client.0.vm07.stdout:5/136: stat d3/d1a/d28/c30 0
2026-03-09T19:27:11.751 INFO:tasks.workunit.client.1.vm08.stdout:4/436: dwrite da/d10/d26/f74 [0,4194304] 0
2026-03-09T19:27:11.758 INFO:tasks.workunit.client.0.vm07.stdout:1/122: creat d1/d20/f2c x:0 0 0
2026-03-09T19:27:11.769 INFO:tasks.workunit.client.1.vm08.stdout:1/562: creat d9/da/d17/fa9 x:0 0 0
2026-03-09T19:27:11.769 INFO:tasks.workunit.client.1.vm08.stdout:4/437: dwrite da/d10/d16/d28/d2f/d4f/f83 [0,4194304] 0
2026-03-09T19:27:11.769 INFO:tasks.workunit.client.1.vm08.stdout:3/477: link d0/d52/c62 d0/d52/d7c/d7e/c90 0
2026-03-09T19:27:11.769 INFO:tasks.workunit.client.0.vm07.stdout:6/154: chown d0/d1/db/d24/l25 2931385 1
2026-03-09T19:27:11.769 INFO:tasks.workunit.client.0.vm07.stdout:6/155: chown d0/d1/db/d1d/f3e 147 1
2026-03-09T19:27:11.769 INFO:tasks.workunit.client.0.vm07.stdout:0/156: creat d0/d6/d13/d33/f39 x:0 0 0
2026-03-09T19:27:11.769 INFO:tasks.workunit.client.0.vm07.stdout:1/123: dwrite d1/d3/f23 [0,4194304] 0
2026-03-09T19:27:11.772 INFO:tasks.workunit.client.0.vm07.stdout:1/124: write d1/d11/f24 [1004745,105096] 0
2026-03-09T19:27:11.772 INFO:tasks.workunit.client.0.vm07.stdout:1/125: read d1/d3/f12 [2640008,13877] 0
2026-03-09T19:27:11.774 INFO:tasks.workunit.client.1.vm08.stdout:7/495: truncate d5/d16/d1c/d83/f9b 1019831 0
2026-03-09T19:27:11.775 INFO:tasks.workunit.client.1.vm08.stdout:5/399: mknod d16/d1e/d30/d6f/c7f 0
2026-03-09T19:27:11.775 INFO:tasks.workunit.client.1.vm08.stdout:7/496: chown d5/d14/d2b/d5d/l6e 1038772 1
2026-03-09T19:27:11.778 INFO:tasks.workunit.client.0.vm07.stdout:5/137: mknod d3/d1a/d28/c35 0
2026-03-09T19:27:11.799 INFO:tasks.workunit.client.0.vm07.stdout:4/148: dread d3/f7 [0,4194304] 0
2026-03-09T19:27:11.799 INFO:tasks.workunit.client.0.vm07.stdout:4/149: stat d3/d11/d16/l1f 0
2026-03-09T19:27:11.799 INFO:tasks.workunit.client.0.vm07.stdout:1/126: symlink d1/d9/l2d 0
2026-03-09T19:27:11.799 INFO:tasks.workunit.client.0.vm07.stdout:7/168: rename d0/d2c to d0/d4/d5/d26/d38/d39 0
2026-03-09T19:27:11.800 INFO:tasks.workunit.client.1.vm08.stdout:7/497: unlink d5/l24 0
2026-03-09T19:27:11.802 INFO:tasks.workunit.client.1.vm08.stdout:5/400: mknod d16/d1e/d6e/c80 0
2026-03-09T19:27:11.805 INFO:tasks.workunit.client.0.vm07.stdout:6/156: mkdir d0/d44 0
2026-03-09T19:27:11.805 INFO:tasks.workunit.client.0.vm07.stdout:4/150: dread d3/fc [0,4194304] 0
2026-03-09T19:27:11.806 INFO:tasks.workunit.client.0.vm07.stdout:4/151: dread d3/d11/d1b/d20/d22/f24 [0,4194304] 0
2026-03-09T19:27:11.823 INFO:tasks.workunit.client.1.vm08.stdout:6/432: getdents d3/db 0
2026-03-09T19:27:11.834 INFO:tasks.workunit.client.0.vm07.stdout:5/138: read d3/f19 [349457,122204] 0
2026-03-09T19:27:11.834 INFO:tasks.workunit.client.0.vm07.stdout:1/127: creat d1/d3/d21/f2e x:0 0 0
2026-03-09T19:27:11.834 INFO:tasks.workunit.client.1.vm08.stdout:7/498: creat d5/d16/d3a/d42/fa9 x:0 0 0
2026-03-09T19:27:11.840 INFO:tasks.workunit.client.1.vm08.stdout:8/436: sync
2026-03-09T19:27:11.840 INFO:tasks.workunit.client.1.vm08.stdout:0/448: sync
2026-03-09T19:27:11.840 INFO:tasks.workunit.client.1.vm08.stdout:9/425: creat d0/d2/d8/f8e x:0 0 0
2026-03-09T19:27:11.840 INFO:tasks.workunit.client.1.vm08.stdout:6/433: dread - d3/d15/f6a zero size
2026-03-09T19:27:11.842 INFO:tasks.workunit.client.1.vm08.stdout:6/434: dread - d3/db/f30 zero size
2026-03-09T19:27:11.842 INFO:tasks.workunit.client.1.vm08.stdout:7/499: mkdir d5/d14/d2b/daa 0
2026-03-09T19:27:11.850 INFO:tasks.workunit.client.1.vm08.stdout:0/449: creat dd/d22/d63/d6e/d72/f8f x:0 0 0
2026-03-09T19:27:11.850 INFO:tasks.workunit.client.1.vm08.stdout:5/401: truncate d16/f17 2835897 0
2026-03-09T19:27:11.853 INFO:tasks.workunit.client.0.vm07.stdout:5/139: mkdir d3/d1a/d28/d36 0
2026-03-09T19:27:11.857 INFO:tasks.workunit.client.1.vm08.stdout:9/426: symlink d0/d2/d8/d7/d48/d5e/l8f 0
2026-03-09T19:27:11.858 INFO:tasks.workunit.client.0.vm07.stdout:4/152: creat d3/d11/d2b/f2c x:0 0 0
2026-03-09T19:27:11.859 INFO:tasks.workunit.client.0.vm07.stdout:5/140: fsync d3/f19 0
2026-03-09T19:27:11.860 INFO:tasks.workunit.client.0.vm07.stdout:5/141: read d3/d1a/fc [3253532,109258] 0
2026-03-09T19:27:11.862 INFO:tasks.workunit.client.1.vm08.stdout:5/402: mkdir d16/d45/d81 0
2026-03-09T19:27:11.865 INFO:tasks.workunit.client.0.vm07.stdout:4/153: symlink d3/d11/d1b/l2d 0
2026-03-09T19:27:11.865 INFO:tasks.workunit.client.0.vm07.stdout:4/154: chown d3/d11/c19 0 1
2026-03-09T19:27:11.867 INFO:tasks.workunit.client.0.vm07.stdout:1/128: creat d1/f2f x:0 0 0
2026-03-09T19:27:11.879 INFO:tasks.workunit.client.0.vm07.stdout:5/142: dread d3/dd/f23 [0,4194304] 0
2026-03-09T19:27:11.886 INFO:tasks.workunit.client.1.vm08.stdout:2/420: write d3/d4/d23/d2c/d39/d5e/d14/f58 [1284218,94205] 0
2026-03-09T19:27:11.889 INFO:tasks.workunit.client.1.vm08.stdout:2/421: dread - d3/d9/d79/f86 zero size
2026-03-09T19:27:11.890 INFO:tasks.workunit.client.1.vm08.stdout:7/500: creat d5/d16/d1c/fab x:0 0 0
2026-03-09T19:27:11.892 INFO:tasks.workunit.client.0.vm07.stdout:2/186: truncate d3/f27 1699749 0
2026-03-09T19:27:11.894 INFO:tasks.workunit.client.0.vm07.stdout:2/187: chown d3/dd/d16/d29/d2d/d45/l46 48201 1
2026-03-09T19:27:11.895 INFO:tasks.workunit.client.0.vm07.stdout:1/129: dwrite d1/db/f1f [0,4194304] 0
2026-03-09T19:27:11.895 INFO:tasks.workunit.client.0.vm07.stdout:2/188: write d3/d11/f31 [1039773,120366] 0
2026-03-09T19:27:11.901 INFO:tasks.workunit.client.1.vm08.stdout:0/450: fsync dd/d22/d27/f42 0
2026-03-09T19:27:11.901 INFO:tasks.workunit.client.0.vm07.stdout:8/186: dwrite f3 [4194304,4194304] 0
2026-03-09T19:27:11.901 INFO:tasks.workunit.client.1.vm08.stdout:2/422: mkdir d3/d9/d4a/d9a 0
2026-03-09T19:27:11.907 INFO:tasks.workunit.client.0.vm07.stdout:2/189: readlink d3/l26 0
2026-03-09T19:27:11.911 INFO:tasks.workunit.client.1.vm08.stdout:7/501: creat d5/d16/d1c/d73/fac x:0 0 0
2026-03-09T19:27:11.911 INFO:tasks.workunit.client.1.vm08.stdout:9/427: sync
2026-03-09T19:27:11.915 INFO:tasks.workunit.client.0.vm07.stdout:3/208: dwrite d1/d1f/f38 [0,4194304] 0
2026-03-09T19:27:11.915 INFO:tasks.workunit.client.1.vm08.stdout:2/423: rmdir d3/d9/d79/d46/d8c 39
2026-03-09T19:27:11.915 INFO:tasks.workunit.client.1.vm08.stdout:0/451: rename dd/d22/d24/d49/l56 to dd/l90 0
2026-03-09T19:27:11.916 INFO:tasks.workunit.client.0.vm07.stdout:3/209: fsync d1/d1f/d16/f30 0
2026-03-09T19:27:11.923 INFO:tasks.workunit.client.0.vm07.stdout:8/187: mkdir d7/d9/d37/d45/d4f 0
2026-03-09T19:27:11.933 INFO:tasks.workunit.client.0.vm07.stdout:1/130: getdents d1/d11 0
2026-03-09T19:27:11.933 INFO:tasks.workunit.client.1.vm08.stdout:9/428: creat d0/d1b/d68/d7f/f90 x:0 0 0
2026-03-09T19:27:11.933 INFO:tasks.workunit.client.1.vm08.stdout:2/424: fdatasync d3/d4/d23/d2c/d39/d5e/de/d8b/f7e 0
2026-03-09T19:27:11.939 INFO:tasks.workunit.client.1.vm08.stdout:9/429: mknod d0/d2/d8/d7/d48/c91 0
2026-03-09T19:27:11.951 INFO:tasks.workunit.client.0.vm07.stdout:1/131: mkdir d1/d9/d30 0
2026-03-09T19:27:11.951 INFO:tasks.workunit.client.0.vm07.stdout:1/132: write d1/d9/f26 [705246,47561] 0
2026-03-09T19:27:11.952 INFO:tasks.workunit.client.0.vm07.stdout:1/133: write d1/db/f1f [2773730,95325] 0
2026-03-09T19:27:11.953 INFO:tasks.workunit.client.1.vm08.stdout:2/425: fdatasync d3/d4/d3e/d4e/f8d 0
2026-03-09T19:27:11.955 INFO:tasks.workunit.client.1.vm08.stdout:9/430: creat d0/d2/d8/d7/d48/d5d/f92 x:0 0 0
2026-03-09T19:27:11.956 INFO:tasks.workunit.client.0.vm07.stdout:8/188: mkdir d7/d50 0
2026-03-09T19:27:11.957 INFO:tasks.workunit.client.0.vm07.stdout:8/189: write d7/d9/d10/f41 [691614,68367] 0
2026-03-09T19:27:11.958 INFO:tasks.workunit.client.1.vm08.stdout:0/452: rename dd/d22/d63/d6e/f80 to dd/d22/d27/f91 0
2026-03-09T19:27:11.960 INFO:tasks.workunit.client.0.vm07.stdout:8/190: dwrite d7/f1c [0,4194304] 0
2026-03-09T19:27:11.960 INFO:tasks.workunit.client.1.vm08.stdout:0/453: chown dd/d22/d27/f42 421227098 1
2026-03-09T19:27:11.966 INFO:tasks.workunit.client.0.vm07.stdout:8/191: dwrite d7/d9/d37/f3b [0,4194304] 0
2026-03-09T19:27:11.977 INFO:tasks.workunit.client.1.vm08.stdout:2/426: unlink d3/d4/d3e/d4e/f8d 0
2026-03-09T19:27:11.978 INFO:tasks.workunit.client.0.vm07.stdout:9/216: dread d0/f4 [0,4194304] 0
2026-03-09T19:27:11.982 INFO:tasks.workunit.client.0.vm07.stdout:9/217: dread d0/db/d29/d2c/f34 [0,4194304] 0
2026-03-09T19:27:11.991 INFO:tasks.workunit.client.1.vm08.stdout:3/478: write d0/d8/d19/f44 [7328588,121596] 0
2026-03-09T19:27:12.002 INFO:tasks.workunit.client.1.vm08.stdout:4/438: dwrite da/d10/d26/d27/d32/f45 [4194304,4194304] 0
2026-03-09T19:27:12.002 INFO:tasks.workunit.client.0.vm07.stdout:0/157: truncate d0/f1e 1495027 0
2026-03-09T19:27:12.002 INFO:tasks.workunit.client.0.vm07.stdout:6/157: dwrite d0/d1/db/f15 [0,4194304] 0
2026-03-09T19:27:12.002 INFO:tasks.workunit.client.0.vm07.stdout:6/158: write d0/d13/f1b [864012,128939] 0
2026-03-09T19:27:12.006 INFO:tasks.workunit.client.0.vm07.stdout:6/159: dwrite d0/f3b [0,4194304] 0
2026-03-09T19:27:12.013 INFO:tasks.workunit.client.1.vm08.stdout:9/431: rename d0/d1b/d68/d7f/f90 to d0/d2/d80/d69/f93 0
2026-03-09T19:27:12.014 INFO:tasks.workunit.client.1.vm08.stdout:8/437: write de/d1d/d4f/f5e [751703,50599] 0
2026-03-09T19:27:12.015 INFO:tasks.workunit.client.1.vm08.stdout:1/563: dwrite d9/da/d2d/f41 [4194304,4194304] 0
2026-03-09T19:27:12.023 INFO:tasks.workunit.client.1.vm08.stdout:6/435: dwrite d3/d15/f45 [0,4194304] 0
2026-03-09T19:27:12.026 INFO:tasks.workunit.client.1.vm08.stdout:6/436: write d3/d34/d3b/f8d [221807,51508] 0
2026-03-09T19:27:12.033 INFO:tasks.workunit.client.1.vm08.stdout:2/427: creat d3/d4/d23/d2c/d39/f9b x:0 0 0
2026-03-09T19:27:12.037 INFO:tasks.workunit.client.1.vm08.stdout:3/479: creat d0/d6/f91 x:0 0 0
2026-03-09T19:27:12.041 INFO:tasks.workunit.client.1.vm08.stdout:5/403: dwrite d16/d45/f55 [0,4194304] 0
2026-03-09T19:27:12.041 INFO:tasks.workunit.client.1.vm08.stdout:5/404: fsync d16/d1e/d30/f3a 0
2026-03-09T19:27:12.044 INFO:tasks.workunit.client.0.vm07.stdout:7/169: mknod d0/d4/c3a 0
2026-03-09T19:27:12.044 INFO:tasks.workunit.client.0.vm07.stdout:7/170: readlink d0/d4/d5/l11 0
2026-03-09T19:27:12.045 INFO:tasks.workunit.client.1.vm08.stdout:4/439: creat da/d10/d26/f87 x:0 0 0
2026-03-09T19:27:12.045 INFO:tasks.workunit.client.1.vm08.stdout:3/480: dread d0/d6/de/d1b/d16/f7b [0,4194304] 0
2026-03-09T19:27:12.045 INFO:tasks.workunit.client.1.vm08.stdout:4/440: fsync f5 0
2026-03-09T19:27:12.052 INFO:tasks.workunit.client.0.vm07.stdout:4/155: truncate d3/f13 1606328 0
2026-03-09T19:27:12.052 INFO:tasks.workunit.client.0.vm07.stdout:5/143: truncate d3/dd/d26/d2d/f32 1056462 0
2026-03-09T19:27:12.057 INFO:tasks.workunit.client.1.vm08.stdout:3/481: fsync d0/d8/f4c 0
2026-03-09T19:27:12.061 INFO:tasks.workunit.client.0.vm07.stdout:3/210: write d1/d1f/d16/f1e [4683298,117163] 0
2026-03-09T19:27:12.062 INFO:tasks.workunit.client.0.vm07.stdout:2/190: truncate d3/dd/fe 4063740 0
2026-03-09T19:27:12.062 INFO:tasks.workunit.client.1.vm08.stdout:3/482: fsync d0/f28 0
2026-03-09T19:27:12.062 INFO:tasks.workunit.client.0.vm07.stdout:2/191: write d3/d11/f1f [1018618,101938] 0
2026-03-09T19:27:12.063 INFO:tasks.workunit.client.0.vm07.stdout:2/192: chown d3/d11/f23 0 1
2026-03-09T19:27:12.063 INFO:tasks.workunit.client.0.vm07.stdout:2/193: fdatasync d3/f5 0
2026-03-09T19:27:12.064 INFO:tasks.workunit.client.1.vm08.stdout:7/502: dwrite d5/d14/d27/d54/f75 [0,4194304] 0
2026-03-09T19:27:12.065 INFO:tasks.workunit.client.1.vm08.stdout:1/564: mkdir d9/d11/d7a/d89/d8d/daa 0
2026-03-09T19:27:12.067 INFO:tasks.workunit.client.1.vm08.stdout:8/438: rename de/d1d/d69/f7e to de/d1d/d2e/f9f 0
2026-03-09T19:27:12.067 INFO:tasks.workunit.client.1.vm08.stdout:1/565: truncate d9/d40/d49/f7c 4508834 0
2026-03-09T19:27:12.070 INFO:tasks.workunit.client.1.vm08.stdout:6/437: truncate d3/d34/d3b/f67 348084 0
2026-03-09T19:27:12.079 INFO:tasks.workunit.client.1.vm08.stdout:5/405: fsync d16/d1e/f2c 0
2026-03-09T19:27:12.080 INFO:tasks.workunit.client.0.vm07.stdout:4/156: creat d3/d11/d1b/f2e x:0 0 0
2026-03-09T19:27:12.080 INFO:tasks.workunit.client.0.vm07.stdout:4/157: write d3/d11/d1b/f2e [935784,35501] 0
2026-03-09T19:27:12.080 INFO:tasks.workunit.client.0.vm07.stdout:4/158: stat d3/d11/d1b/d20/f26 0
2026-03-09T19:27:12.080 INFO:tasks.workunit.client.0.vm07.stdout:4/159: write d3/f8 [3941493,31330] 0
2026-03-09T19:27:12.080 INFO:tasks.workunit.client.0.vm07.stdout:5/144: symlink d3/dd/d26/d2d/l37 0
2026-03-09T19:27:12.084 INFO:tasks.workunit.client.1.vm08.stdout:6/438: dwrite d3/f6e [0,4194304] 0
2026-03-09T19:27:12.088 INFO:tasks.workunit.client.0.vm07.stdout:1/134: getdents d1/d3 0
2026-03-09T19:27:12.088 INFO:tasks.workunit.client.0.vm07.stdout:1/135: chown d1/d9/f16 2004165 1
2026-03-09T19:27:12.090 INFO:tasks.workunit.client.1.vm08.stdout:4/441: creat da/d10/d26/d3a/f88 x:0 0 0
2026-03-09T19:27:12.092 INFO:tasks.workunit.client.0.vm07.stdout:6/160: symlink d0/d44/l45 0
2026-03-09T19:27:12.094 INFO:tasks.workunit.client.0.vm07.stdout:3/211: dread - d1/d6/dd/f3b zero size
2026-03-09T19:27:12.095 INFO:tasks.workunit.client.1.vm08.stdout:4/442: readlink da/d10/d26/d3a/l4e 0
2026-03-09T19:27:12.097 INFO:tasks.workunit.client.0.vm07.stdout:6/161: dwrite d0/d1/db/d1d/f22 [0,4194304] 0
2026-03-09T19:27:12.100 INFO:tasks.workunit.client.0.vm07.stdout:6/162: stat d0/d1/db/d1d/f2e 0
2026-03-09T19:27:12.103 INFO:tasks.workunit.client.1.vm08.stdout:9/432: mknod d0/d2/d8/c94 0
2026-03-09T19:27:12.123 INFO:tasks.workunit.client.1.vm08.stdout:0/454: dwrite dd/d22/d27/d2e/d37/f40 [0,4194304] 0
2026-03-09T19:27:12.127 INFO:tasks.workunit.client.0.vm07.stdout:2/194: unlink d3/d11/f23 0
2026-03-09T19:27:12.131 INFO:tasks.workunit.client.1.vm08.stdout:3/483: mkdir d0/d6/de/d6e/d51/d92 0
2026-03-09T19:27:12.132 INFO:tasks.workunit.client.1.vm08.stdout:3/484: write d0/d6/d25/f87 [587121,10410] 0
2026-03-09T19:27:12.135 INFO:tasks.workunit.client.0.vm07.stdout:9/218: rename d0/d6/d3a/f45 to d0/db/d29/d2c/f54 0
2026-03-09T19:27:12.136 INFO:tasks.workunit.client.0.vm07.stdout:9/219: chown d0/d6/f20 49232920 1
2026-03-09T19:27:12.140 INFO:tasks.workunit.client.1.vm08.stdout:7/503: readlink d5/d16/d3a/l5b 0
2026-03-09T19:27:12.145 INFO:tasks.workunit.client.0.vm07.stdout:5/145: unlink d3/d1a/f1b 0
2026-03-09T19:27:12.145 INFO:tasks.workunit.client.0.vm07.stdout:5/146: write d3/d1a/f1c [45347,110086] 0
2026-03-09T19:27:12.145 INFO:tasks.workunit.client.1.vm08.stdout:7/504: readlink d5/d14/d2b/d4b/l5c 0
2026-03-09T19:27:12.145 INFO:tasks.workunit.client.1.vm08.stdout:7/505: read d5/d14/d38/f40 [335127,3264] 0
2026-03-09T19:27:12.152 INFO:tasks.workunit.client.1.vm08.stdout:6/439: mknod d3/d34/d5c/c9f 0
2026-03-09T19:27:12.154 INFO:tasks.workunit.client.1.vm08.stdout:5/406: creat d16/d1e/d3b/f82 x:0 0 0
2026-03-09T19:27:12.156 INFO:tasks.workunit.client.1.vm08.stdout:4/443: rmdir da/d10/d16/d28/d2f/d4f/d64/d81 39
2026-03-09T19:27:12.159 INFO:tasks.workunit.client.0.vm07.stdout:2/195: creat d3/dd/d16/f48 x:0 0 0
2026-03-09T19:27:12.159 INFO:tasks.workunit.client.0.vm07.stdout:2/196: chown d3/dd/d16/d29/d2d/d45/l46 609199823 1
2026-03-09T19:27:12.165 INFO:tasks.workunit.client.0.vm07.stdout:9/220: mknod d0/db/c55 0
2026-03-09T19:27:12.165 INFO:tasks.workunit.client.1.vm08.stdout:0/455: mkdir dd/d22/d24/d49/d92 0
2026-03-09T19:27:12.166 INFO:tasks.workunit.client.1.vm08.stdout:3/485: mkdir d0/d6/d93 0
2026-03-09T19:27:12.172 INFO:tasks.workunit.client.1.vm08.stdout:1/566: creat d9/d11/d7a/d89/d8d/da3/fab x:0 0 0
2026-03-09T19:27:12.175 INFO:tasks.workunit.client.1.vm08.stdout:7/506: unlink d5/d16/d3a/l5b 0
2026-03-09T19:27:12.180 INFO:tasks.workunit.client.0.vm07.stdout:8/192: write d7/d16/d1e/f33 [1629753,41138] 0
2026-03-09T19:27:12.183 INFO:tasks.workunit.client.0.vm07.stdout:0/158: truncate d0/d6/d13/d1c/d11/f15 1952838 0
2026-03-09T19:27:12.186 INFO:tasks.workunit.client.1.vm08.stdout:6/440: mkdir d3/db/d43/d69/da0 0
2026-03-09T19:27:12.188 INFO:tasks.workunit.client.1.vm08.stdout:5/407: rename d16/l79 to d16/d1e/d3b/d61/l83 0
2026-03-09T19:27:12.189 INFO:tasks.workunit.client.1.vm08.stdout:6/441: write d3/f9 [4612792,47897] 0
2026-03-09T19:27:12.194 INFO:tasks.workunit.client.0.vm07.stdout:7/171: rename d0/c2f to d0/d4/d5/d26/d38/c3b 0
2026-03-09T19:27:12.196 INFO:tasks.workunit.client.0.vm07.stdout:9/221: creat d0/f56 x:0 0 0
2026-03-09T19:27:12.202 INFO:tasks.workunit.client.1.vm08.stdout:3/486: creat d0/d6/de/d1b/d16/d17/f94 x:0 0 0
2026-03-09T19:27:12.208 INFO:tasks.workunit.client.1.vm08.stdout:7/507: mkdir d5/d14/d38/dad 0
2026-03-09T19:27:12.209 INFO:tasks.workunit.client.0.vm07.stdout:2/197: getdents d3/dd/d16/d30/d40 0
2026-03-09T19:27:12.216 INFO:tasks.workunit.client.0.vm07.stdout:0/159: rmdir d0/d6/d13/d1c 39
2026-03-09T19:27:12.216 INFO:tasks.workunit.client.0.vm07.stdout:3/212: truncate d1/f7 1822192 0
2026-03-09T19:27:12.217 INFO:tasks.workunit.client.0.vm07.stdout:3/213: readlink d1/l2c 0
2026-03-09T19:27:12.217 INFO:tasks.workunit.client.0.vm07.stdout:3/214: chown d1/d1f/d16/d28 41490730 1
2026-03-09T19:27:12.220 INFO:tasks.workunit.client.0.vm07.stdout:6/163: dwrite d0/d1/f8 [0,4194304] 0
2026-03-09T19:27:12.225 INFO:tasks.workunit.client.1.vm08.stdout:6/442: fsync d3/f2a 0
2026-03-09T19:27:12.226 INFO:tasks.workunit.client.1.vm08.stdout:6/443: read d3/d34/f35 [1071092,122931] 0
2026-03-09T19:27:12.226 INFO:tasks.workunit.client.0.vm07.stdout:4/160: getdents d3/d11/d1b/d20/d22 0
2026-03-09T19:27:12.227 INFO:tasks.workunit.client.0.vm07.stdout:6/164: dwrite d0/d13/d1e/d30/f35 [0,4194304] 0
2026-03-09T19:27:12.228 INFO:tasks.workunit.client.0.vm07.stdout:5/147: dwrite d3/dd/d26/d2d/f32 [0,4194304] 0
2026-03-09T19:27:12.231 INFO:tasks.workunit.client.1.vm08.stdout:5/408: mkdir d16/d1e/d6e/d84 0
2026-03-09T19:27:12.236 INFO:tasks.workunit.client.1.vm08.stdout:8/439: mkdir de/d25/d31/d82/d6d/d99/da0 0
2026-03-09T19:27:12.237 INFO:tasks.workunit.client.1.vm08.stdout:6/444: dwrite d3/d34/d6f/f4f [0,4194304] 0
2026-03-09T19:27:12.237 INFO:tasks.workunit.client.1.vm08.stdout:9/433: truncate d0/d2/d80/f6a 2164687 0
2026-03-09T19:27:12.247 INFO:tasks.workunit.client.1.vm08.stdout:6/445: chown d3/d34/d3b/f99 0 1
2026-03-09T19:27:12.250 INFO:tasks.workunit.client.1.vm08.stdout:0/456: dwrite dd/f18 [0,4194304] 0
2026-03-09T19:27:12.267 INFO:tasks.workunit.client.1.vm08.stdout:0/457: dwrite dd/d22/d24/d49/f5f [4194304,4194304] 0
2026-03-09T19:27:12.271 INFO:tasks.workunit.client.1.vm08.stdout:3/487: dread d0/d6/d25/f56 [0,4194304] 0
2026-03-09T19:27:12.276 INFO:tasks.workunit.client.1.vm08.stdout:1/567: unlink d9/da/d12/c32 0
2026-03-09T19:27:12.276 INFO:tasks.workunit.client.1.vm08.stdout:1/568: chown d9/da/d53/c55 6 1
2026-03-09T19:27:12.284 INFO:tasks.workunit.client.0.vm07.stdout:0/160: fsync d0/d6/d13/d17/d19/f1f 0
2026-03-09T19:27:12.304 INFO:tasks.workunit.client.0.vm07.stdout:6/165: mknod d0/d1/db/d1d/c46 0
2026-03-09T19:27:12.314 INFO:tasks.workunit.client.0.vm07.stdout:4/161: rename d3/d11/d1b/d20 to d3/d11/d16/d2f 0
2026-03-09T19:27:12.325 INFO:tasks.workunit.client.1.vm08.stdout:6/446: dwrite d3/d34/d3b/f99 [0,4194304] 0
2026-03-09T19:27:12.325 INFO:tasks.workunit.client.1.vm08.stdout:6/447: fdatasync d3/d34/d5c/f7c 0
2026-03-09T19:27:12.326 INFO:tasks.workunit.client.1.vm08.stdout:6/448: dread - d3/d15/f98 zero size
2026-03-09T19:27:12.326 INFO:tasks.workunit.client.1.vm08.stdout:4/444: dwrite da/d10/d16/d28/d46/d52/d6e/d40/f41 [0,4194304] 0
2026-03-09T19:27:12.327 INFO:tasks.workunit.client.1.vm08.stdout:4/445: read da/d10/d26/d38/f57 [891084,101689] 0
2026-03-09T19:27:12.338 INFO:tasks.workunit.client.0.vm07.stdout:8/193: truncate d7/d9/f36 324909 0
2026-03-09T19:27:12.339 INFO:tasks.workunit.client.0.vm07.stdout:8/194: chown d7/d9/l3c 7745881 1
2026-03-09T19:27:12.339 INFO:tasks.workunit.client.0.vm07.stdout:8/195: stat d7/d9/l3c 0
2026-03-09T19:27:12.351 INFO:tasks.workunit.client.1.vm08.stdout:0/458: mkdir dd/d22/d63/d93 0
2026-03-09T19:27:12.364 INFO:tasks.workunit.client.0.vm07.stdout:2/198: truncate d3/fa 3742121 0
2026-03-09T19:27:12.365 INFO:tasks.workunit.client.0.vm07.stdout:7/172: chown d0/l29 31 1
2026-03-09T19:27:12.368 INFO:tasks.workunit.client.0.vm07.stdout:9/222: dwrite d0/db/f21 [4194304,4194304] 0
2026-03-09T19:27:12.388 INFO:tasks.workunit.client.1.vm08.stdout:2/428: dread d3/d9/f1e [0,4194304] 0
2026-03-09T19:27:12.401 INFO:tasks.workunit.client.1.vm08.stdout:0/459: creat dd/d22/d63/f94 x:0 0 0
2026-03-09T19:27:12.407 INFO:tasks.workunit.client.0.vm07.stdout:1/136: dread d1/db/f14 [0,4194304] 0
2026-03-09T19:27:12.407 INFO:tasks.workunit.client.0.vm07.stdout:1/137: read d1/d9/f16 [3585093,27089] 0
2026-03-09T19:27:12.410 INFO:tasks.workunit.client.1.vm08.stdout:7/508: rename d5/d16 to d5/d14/dae 0
2026-03-09T19:27:12.430 INFO:tasks.workunit.client.1.vm08.stdout:8/440: write de/f5d [787444,105382] 0
2026-03-09T19:27:12.434 INFO:tasks.workunit.client.1.vm08.stdout:4/446: dwrite da/d10/d16/d28/d2f/d4f/f65 [0,4194304] 0
2026-03-09T19:27:12.438 INFO:tasks.workunit.client.0.vm07.stdout:0/161: creat d0/f3a x:0 0 0
2026-03-09T19:27:12.439 INFO:tasks.workunit.client.0.vm07.stdout:0/162: chown d0 637666017 1
2026-03-09T19:27:12.439 INFO:tasks.workunit.client.1.vm08.stdout:3/488: symlink d0/d8/l95 0
2026-03-09T19:27:12.441 INFO:tasks.workunit.client.1.vm08.stdout:4/447: dwrite da/d10/d16/d28/d46/d52/d6e/d40/d6c/f71 [0,4194304] 0
2026-03-09T19:27:12.446 INFO:tasks.workunit.client.1.vm08.stdout:4/448: fdatasync da/d10/d16/d28/d2f/d4f/f83 0
2026-03-09T19:27:12.446 INFO:tasks.workunit.client.1.vm08.stdout:4/449: stat da/d10/d16/d28/c3f 0
2026-03-09T19:27:12.449 INFO:tasks.workunit.client.0.vm07.stdout:6/166: read d0/d1/fa [1131384,46570] 0
2026-03-09T19:27:12.451 INFO:tasks.workunit.client.0.vm07.stdout:6/167: dread d0/d1/db/d17/f2f [0,4194304] 0
2026-03-09T19:27:12.455 INFO:tasks.workunit.client.1.vm08.stdout:1/569: truncate d9/da/d12/d39/f47 2926497 0
2026-03-09T19:27:12.455 INFO:tasks.workunit.client.0.vm07.stdout:4/162: creat d3/d11/d1b/f30 x:0 0 0
2026-03-09T19:27:12.455 INFO:tasks.workunit.client.0.vm07.stdout:4/163: dread d3/d11/d1b/f28 [0,4194304] 0
2026-03-09T19:27:12.464 INFO:tasks.workunit.client.1.vm08.stdout:2/429: mknod d3/d4/d23/d2c/d39/c9c 0
2026-03-09T19:27:12.482 INFO:tasks.workunit.client.1.vm08.stdout:0/460: creat dd/d22/d24/d49/d50/f95 x:0 0 0
2026-03-09T19:27:12.483 INFO:tasks.workunit.client.1.vm08.stdout:0/461: truncate dd/d31/f54 957548 0
2026-03-09T19:27:12.485 INFO:tasks.workunit.client.0.vm07.stdout:7/173: rename d0/d4/d5/d26/d38 to d0/d4/d5/d26/d3c 0
2026-03-09T19:27:12.486 INFO:tasks.workunit.client.0.vm07.stdout:3/215: truncate d1/d1f/d16/f1e 437863 0
2026-03-09T19:27:12.489 INFO:tasks.workunit.client.0.vm07.stdout:3/216: chown d1/d6/dd/f2b 1690 1
2026-03-09T19:27:12.490 INFO:tasks.workunit.client.1.vm08.stdout:0/462: read dd/d22/d7b/f83 [2106801,55737] 0
2026-03-09T19:27:12.493 INFO:tasks.workunit.client.1.vm08.stdout:7/509: mkdir d5/d14/dae/d1c/d83/d90/daf 0
2026-03-09T19:27:12.494 INFO:tasks.workunit.client.1.vm08.stdout:7/510: write d5/d14/dae/d3a/d42/d85/f19 [6963120,58936] 0
2026-03-09T19:27:12.514 INFO:tasks.workunit.client.0.vm07.stdout:9/223: dwrite d0/db/d29/d2c/f4a [4194304,4194304] 0
2026-03-09T19:27:12.518 INFO:tasks.workunit.client.0.vm07.stdout:9/224: dread d0/db/d29/d2c/f34 [0,4194304] 0
2026-03-09T19:27:12.518 INFO:tasks.workunit.client.0.vm07.stdout:1/138: mkdir d1/db/d31 0
2026-03-09T19:27:12.521 INFO:tasks.workunit.client.1.vm08.stdout:3/489: mkdir d0/d6/de/d15/d96 0
2026-03-09T19:27:12.524 INFO:tasks.workunit.client.1.vm08.stdout:8/441: dwrite de/d1d/d2e/d5f/f4e [0,4194304] 0
2026-03-09T19:27:12.545 INFO:tasks.workunit.client.0.vm07.stdout:6/168: rmdir d0/d2d 39
2026-03-09T19:27:12.553 INFO:tasks.workunit.client.0.vm07.stdout:4/164: symlink d3/d11/d2b/l31 0
2026-03-09T19:27:12.558 INFO:tasks.workunit.client.1.vm08.stdout:1/570: chown d9/da/dc/c25 2 1
2026-03-09T19:27:12.562 INFO:tasks.workunit.client.1.vm08.stdout:1/571: dread d9/da/dc/f68 [0,4194304] 0
2026-03-09T19:27:12.563 INFO:tasks.workunit.client.1.vm08.stdout:1/572: write d9/da/dc/f68 [590953,3378] 0
2026-03-09T19:27:12.566 INFO:tasks.workunit.client.0.vm07.stdout:2/199: mkdir d3/d49 0
2026-03-09T19:27:12.569 INFO:tasks.workunit.client.1.vm08.stdout:6/449: link d3/d34/d6f/f39 d3/d15/d8a/fa1 0
2026-03-09T19:27:12.569 INFO:tasks.workunit.client.1.vm08.stdout:5/409: rename d16/d1e/d30/l75 to d16/d45/l85 0
2026-03-09T19:27:12.571 INFO:tasks.workunit.client.0.vm07.stdout:3/217: dwrite d1/f2a [4194304,4194304] 0
2026-03-09T19:27:12.573 INFO:tasks.workunit.client.0.vm07.stdout:3/218: truncate d1/d1f/d16/f39 1030562 0
2026-03-09T19:27:12.574 INFO:tasks.workunit.client.1.vm08.stdout:0/463: unlink dd/d31/l76 0
2026-03-09T19:27:12.579 INFO:tasks.workunit.client.0.vm07.stdout:3/219: dread d1/d6/dd/f15 [0,4194304] 0
2026-03-09T19:27:12.582 INFO:tasks.workunit.client.0.vm07.stdout:1/139: mknod d1/d20/c32 0
2026-03-09T19:27:12.585 INFO:tasks.workunit.client.0.vm07.stdout:0/163: rmdir d0/d6/d13/d1c/d11 39
2026-03-09T19:27:12.588 INFO:tasks.workunit.client.1.vm08.stdout:5/410: dread d16/d45/f6a [0,4194304] 0
2026-03-09T19:27:12.590 INFO:tasks.workunit.client.0.vm07.stdout:0/164: dwrite d0/d6/d13/d17/f20 [0,4194304] 0
2026-03-09T19:27:12.590 INFO:tasks.workunit.client.0.vm07.stdout:6/169: creat d0/d1/db/d1d/f47 x:0 0 0
2026-03-09T19:27:12.591 INFO:tasks.workunit.client.1.vm08.stdout:1/573: creat d9/da/d12/fac x:0 0 0
2026-03-09T19:27:12.592 INFO:tasks.workunit.client.1.vm08.stdout:2/430: mkdir d3/d4/d3e/d9d 0
2026-03-09T19:27:12.592 INFO:tasks.workunit.client.1.vm08.stdout:1/574: read - d9/d11/d7a/f80 zero size
2026-03-09T19:27:12.592 INFO:tasks.workunit.client.0.vm07.stdout:5/148: getdents d3/d1a 0
2026-03-09T19:27:12.593 INFO:tasks.workunit.client.0.vm07.stdout:4/165: symlink d3/d11/d1b/l32 0
2026-03-09T19:27:12.593 INFO:tasks.workunit.client.1.vm08.stdout:2/431: write d3/d4/d23/d2c/d39/d5e/d14/f2b [5051982,94503] 0
2026-03-09T19:27:12.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:12 vm08.local ceph-mon[57794]: pgmap v160: 65 pgs: 65 active+clean; 1.2 GiB data, 5.0 GiB used, 115 GiB / 120 GiB avail; 23 MiB/s rd, 111 MiB/s wr, 237 op/s
2026-03-09T19:27:12.596 INFO:tasks.workunit.client.1.vm08.stdout:2/432: readlink d3/d4/d23/d2c/d39/d5e/l65 0
2026-03-09T19:27:12.597 INFO:tasks.workunit.client.1.vm08.stdout:0/464: dread dd/d22/d27/d6c/f7f [0,4194304] 0
2026-03-09T19:27:12.601 INFO:tasks.workunit.client.1.vm08.stdout:0/465: stat dd/d7e/f8e 0
2026-03-09T19:27:12.611 INFO:tasks.workunit.client.1.vm08.stdout:6/450: chown d3/f25 1 1
2026-03-09T19:27:12.611 INFO:tasks.workunit.client.0.vm07.stdout:2/200: symlink d3/d11/l4a 0
2026-03-09T19:27:12.615 INFO:tasks.workunit.client.1.vm08.stdout:4/450: dwrite da/d10/f25 [0,4194304] 0
2026-03-09T19:27:12.618 INFO:tasks.workunit.client.0.vm07.stdout:1/140: symlink d1/d3/l33 0
2026-03-09T19:27:12.619 INFO:tasks.workunit.client.0.vm07.stdout:9/225: mkdir d0/d6/d57 0
2026-03-09T19:27:12.621 INFO:tasks.workunit.client.0.vm07.stdout:0/165: rmdir d0 39
2026-03-09T19:27:12.622 INFO:tasks.workunit.client.0.vm07.stdout:6/170: unlink d0/d1/db/l37 0
2026-03-09T19:27:12.623 INFO:tasks.workunit.client.0.vm07.stdout:6/171: write d0/d1/db/f15 [325763,79915] 0
2026-03-09T19:27:12.625 INFO:tasks.workunit.client.1.vm08.stdout:4/451: read da/d10/f2e [1230414,53423] 0
2026-03-09T19:27:12.628 INFO:tasks.workunit.client.0.vm07.stdout:4/166: symlink d3/d11/d2b/l33 0
2026-03-09T19:27:12.630 INFO:tasks.workunit.client.0.vm07.stdout:8/196: write d7/d9/f36 [323094,79046] 0
2026-03-09T19:27:12.631 INFO:tasks.workunit.client.0.vm07.stdout:8/197: fsync d7/d16/d1e/f33 0
2026-03-09T19:27:12.636 INFO:tasks.workunit.client.0.vm07.stdout:1/141: symlink d1/d3/l34 0
2026-03-09T19:27:12.637 INFO:tasks.workunit.client.0.vm07.stdout:1/142: fsync d1/d11/f1b 0
2026-03-09T19:27:12.641 INFO:tasks.workunit.client.0.vm07.stdout:6/172: creat d0/d13/d1e/d30/f48 x:0 0 0
2026-03-09T19:27:12.643 INFO:tasks.workunit.client.0.vm07.stdout:9/226: dread d0/db/d29/d2c/d36/f3c [0,4194304] 0
2026-03-09T19:27:12.644 INFO:tasks.workunit.client.0.vm07.stdout:9/227: write d0/f56 [867669,55357] 0
2026-03-09T19:27:12.648 INFO:tasks.workunit.client.0.vm07.stdout:2/201: mknod d3/dd/d16/d30/d40/c4b 0
2026-03-09T19:27:12.653 INFO:tasks.workunit.client.1.vm08.stdout:9/434: rename d0/d2/d8/d7/d48/l8b to d0/d1b/d4e/l95 0
2026-03-09T19:27:12.653 INFO:tasks.workunit.client.0.vm07.stdout:2/202: write d3/dd/f1e [3027716,8337] 0
2026-03-09T19:27:12.653 INFO:tasks.workunit.client.0.vm07.stdout:2/203: dwrite d3/f5 [0,4194304] 0
2026-03-09T19:27:12.654 INFO:tasks.workunit.client.1.vm08.stdout:1/575: truncate d9/da/d12/f5c 1165285 0
2026-03-09T19:27:12.656 INFO:tasks.workunit.client.1.vm08.stdout:0/466: rmdir dd/d22/d63 39
2026-03-09T19:27:12.657 INFO:tasks.workunit.client.0.vm07.stdout:1/143: mknod d1/d9/c35 0
2026-03-09T19:27:12.659 INFO:tasks.workunit.client.0.vm07.stdout:0/166: symlink d0/d6/l3b 0
2026-03-09T19:27:12.661 INFO:tasks.workunit.client.1.vm08.stdout:6/451: fdatasync d3/f25 0
2026-03-09T19:27:12.662 INFO:tasks.workunit.client.0.vm07.stdout:9/228: chown d0/d17/l44 18 1
2026-03-09T19:27:12.662 INFO:tasks.workunit.client.0.vm07.stdout:9/229: write d0/d6/f4c [774568,42509] 0
2026-03-09T19:27:12.666 INFO:tasks.workunit.client.0.vm07.stdout:2/204: mkdir d3/dd/d16/d29/d3c/d4c 0
2026-03-09T19:27:12.674 INFO:tasks.workunit.client.0.vm07.stdout:7/174: truncate d0/d4/fc 4109998 0
2026-03-09T19:27:12.674 INFO:tasks.workunit.client.0.vm07.stdout:0/167: creat d0/f3c x:0 0 0
2026-03-09T19:27:12.674 INFO:tasks.workunit.client.0.vm07.stdout:2/205: write d3/d11/f2e [952858,78445] 0
2026-03-09T19:27:12.674 INFO:tasks.workunit.client.1.vm08.stdout:7/511: rename d5/d14/dae/d3a/d42/f68 to d5/d14/d2b/fb0 0
2026-03-09T19:27:12.674 INFO:tasks.workunit.client.1.vm08.stdout:7/512: stat d5/d14/d2b/d5d/l6e 0
2026-03-09T19:27:12.674 INFO:tasks.workunit.client.1.vm08.stdout:1/576: stat d9/d11/c34 0
2026-03-09T19:27:12.674 INFO:tasks.workunit.client.1.vm08.stdout:1/577: stat d9/da/dc/c25 0
2026-03-09T19:27:12.675 INFO:tasks.workunit.client.0.vm07.stdout:7/175: symlink d0/d4/d5/d8/l3d 0
2026-03-09T19:27:12.675 INFO:tasks.workunit.client.0.vm07.stdout:2/206: write d3/d11/f39 [970751,106669] 0
2026-03-09T19:27:12.676 INFO:tasks.workunit.client.1.vm08.stdout:8/442: getdents de/d1d/d69 0
2026-03-09T19:27:12.678 INFO:tasks.workunit.client.0.vm07.stdout:1/144: rename d1/d3/d21/c29 to d1/d11/c36 0
2026-03-09T19:27:12.680 INFO:tasks.workunit.client.0.vm07.stdout:4/167: sync
2026-03-09T19:27:12.681 INFO:tasks.workunit.client.0.vm07.stdout:2/207: sync
2026-03-09T19:27:12.683 INFO:tasks.workunit.client.0.vm07.stdout:8/198: fdatasync d7/d9/f36 0
2026-03-09T19:27:12.684 INFO:tasks.workunit.client.0.vm07.stdout:0/168: creat d0/f3d x:0 0 0
2026-03-09T19:27:12.686 INFO:tasks.workunit.client.0.vm07.stdout:6/173: rmdir d0/d1/db/d24/d41 0
2026-03-09T19:27:12.686 INFO:tasks.workunit.client.1.vm08.stdout:3/490: rename d0/d6/de/d15/c6b to d0/d52/d6d/d77/d88/c97 0
2026-03-09T19:27:12.687 INFO:tasks.workunit.client.0.vm07.stdout:5/149: dwrite d3/d1a/fa [4194304,4194304] 0
2026-03-09T19:27:12.688 INFO:tasks.workunit.client.0.vm07.stdout:9/230: creat d0/d6/d57/f58 x:0 0 0
2026-03-09T19:27:12.689 INFO:tasks.workunit.client.1.vm08.stdout:4/452: fsync da/f21 0
2026-03-09T19:27:12.690 INFO:tasks.workunit.client.0.vm07.stdout:5/150: write d3/dd/f22 [1061075,123223] 0
2026-03-09T19:27:12.691 INFO:tasks.workunit.client.0.vm07.stdout:5/151: dread - d3/dd/f24 zero size
2026-03-09T19:27:12.700 INFO:tasks.workunit.client.0.vm07.stdout:3/220: write d1/d1f/d16/f1e [1361572,110364] 0
2026-03-09T19:27:12.703 INFO:tasks.workunit.client.0.vm07.stdout:3/221: dwrite d1/d6/fb [0,4194304] 0
2026-03-09T19:27:12.707 INFO:tasks.workunit.client.0.vm07.stdout:3/222: dread - d1/d6/dd/f33 zero size
2026-03-09T19:27:12.710 INFO:tasks.workunit.client.0.vm07.stdout:7/176: mknod d0/d4/d5/d26/c3e 0
2026-03-09T19:27:12.712 INFO:tasks.workunit.client.1.vm08.stdout:0/467: mknod dd/d22/d24/d49/d50/d78/d86/c96 0
2026-03-09T19:27:12.717 INFO:tasks.workunit.client.0.vm07.stdout:1/145: rename d1/d20 to d1/d11/d37 0
2026-03-09T19:27:12.721 INFO:tasks.workunit.client.1.vm08.stdout:1/578: symlink d9/da/d53/d67/d6c/lad 0
2026-03-09T19:27:12.722 INFO:tasks.workunit.client.0.vm07.stdout:9/231: dread d0/d6/f8 [0,4194304] 0
2026-03-09T19:27:12.725 INFO:tasks.workunit.client.0.vm07.stdout:8/199: mknod d7/d16/c51 0
2026-03-09T19:27:12.725 INFO:tasks.workunit.client.0.vm07.stdout:0/169: rmdir d0/d6/d13/d17/d19 39
2026-03-09T19:27:12.725 INFO:tasks.workunit.client.0.vm07.stdout:8/200: chown d7/d9/fd 110300 1
2026-03-09T19:27:12.726 INFO:tasks.workunit.client.0.vm07.stdout:0/170: chown d0/d6/l25 5689192 1
2026-03-09T19:27:12.726 INFO:tasks.workunit.client.1.vm08.stdout:5/411: link d16/c67 d16/d1e/d3b/d61/c86 0
2026-03-09T19:27:12.726 INFO:tasks.workunit.client.1.vm08.stdout:8/443: dread - de/d25/d31/d82/d6d/f88 zero size
2026-03-09T19:27:12.727 INFO:tasks.workunit.client.1.vm08.stdout:5/412: fdatasync d16/d1e/d30/f3a 0
2026-03-09T19:27:12.727 INFO:tasks.workunit.client.1.vm08.stdout:5/413: stat d16/d45/l48 0
2026-03-09T19:27:12.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:12 vm07.local ceph-mon[48545]: pgmap v160: 65 pgs: 65 active+clean; 1.2 GiB data, 5.0 GiB used, 115 GiB / 120 GiB avail; 23 MiB/s rd, 111 MiB/s wr, 237 op/s
2026-03-09T19:27:12.730 INFO:tasks.workunit.client.0.vm07.stdout:7/177: sync
2026-03-09T19:27:12.730 INFO:tasks.workunit.client.1.vm08.stdout:2/433: rename d3/d4/d23/d2c/d39/d5e/de/d18/d1f/c21 to d3/d4/d23/d2c/d39/d5e/d87/c9e 0
2026-03-09T19:27:12.736 INFO:tasks.workunit.client.1.vm08.stdout:7/513: symlink d5/d14/d2b/daa/lb1 0
2026-03-09T19:27:12.736 INFO:tasks.workunit.client.1.vm08.stdout:0/468: read dd/d22/f5c [71583,119245] 0
2026-03-09T19:27:12.736 INFO:tasks.workunit.client.1.vm08.stdout:0/469: stat dd/d6a 0
2026-03-09T19:27:12.736 INFO:tasks.workunit.client.0.vm07.stdout:7/178: fdatasync d0/d4/d5/f20 0
2026-03-09T19:27:12.736 INFO:tasks.workunit.client.0.vm07.stdout:3/223: fdatasync d1/d6/f19 0
2026-03-09T19:27:12.736 INFO:tasks.workunit.client.0.vm07.stdout:7/179: read d0/d4/d5/d8/d1a/f1d [4013263,31235] 0
2026-03-09T19:27:12.736 INFO:tasks.workunit.client.0.vm07.stdout:3/224: stat d1/d1f/d16/f3a 0
2026-03-09T19:27:12.736 INFO:tasks.workunit.client.0.vm07.stdout:3/225: chown d1/d6/dd 763149 1
2026-03-09T19:27:12.736 INFO:tasks.workunit.client.0.vm07.stdout:4/168: mkdir d3/d11/d29/d34 0
2026-03-09T19:27:12.737 INFO:tasks.workunit.client.0.vm07.stdout:9/232: fdatasync d0/db/d29/d2c/f4a 0
2026-03-09T19:27:12.737 INFO:tasks.workunit.client.0.vm07.stdout:4/169: chown d3/f8 0 1
2026-03-09T19:27:12.737 INFO:tasks.workunit.client.1.vm08.stdout:1/579: unlink d9/da/d12/d39/l94 0
2026-03-09T19:27:12.738 INFO:tasks.workunit.client.1.vm08.stdout:1/580: write d9/d11/f5f [3236064,77581] 0
2026-03-09T19:27:12.738 INFO:tasks.workunit.client.0.vm07.stdout:9/233: write d0/f56 [549478,111523] 0
2026-03-09T19:27:12.747 INFO:tasks.workunit.client.1.vm08.stdout:3/491: symlink d0/d6/de/d6e/l98 0
2026-03-09T19:27:12.757 INFO:tasks.workunit.client.1.vm08.stdout:6/452: write d3/d34/d3b/f67 [1061807,29433] 0
2026-03-09T19:27:12.758 INFO:tasks.workunit.client.0.vm07.stdout:8/201: symlink d7/d9/d10/l52 0
2026-03-09T19:27:12.760 INFO:tasks.workunit.client.1.vm08.stdout:5/414: mkdir d16/d45/d87 0
2026-03-09T19:27:12.760 INFO:tasks.workunit.client.1.vm08.stdout:4/453: write f9 [26469,89391] 0
2026-03-09T19:27:12.763 INFO:tasks.workunit.client.0.vm07.stdout:5/152: symlink d3/d1a/d28/d36/l38 0
2026-03-09T19:27:12.763 INFO:tasks.workunit.client.1.vm08.stdout:4/454: dread da/d10/f25 [0,4194304] 0
2026-03-09T19:27:12.766 INFO:tasks.workunit.client.1.vm08.stdout:9/435: link d0/d1b/f82 d0/d2/d8/d7/d48/d5e/f96 0
2026-03-09T19:27:12.766 INFO:tasks.workunit.client.1.vm08.stdout:2/434: unlink d3/d4/d23/d2c/d39/d5e/d14/f44 0
2026-03-09T19:27:12.768 INFO:tasks.workunit.client.1.vm08.stdout:7/514: creat d5/d14/d2b/d5d/fb2 x:0 0 0
2026-03-09T19:27:12.770 INFO:tasks.workunit.client.1.vm08.stdout:7/515: read d5/d14/dae/d3a/d42/d6a/d8f/f98 [15519,33555] 0
2026-03-09T19:27:12.772 INFO:tasks.workunit.client.0.vm07.stdout:5/153: dread d3/d1a/fc [0,4194304] 0
2026-03-09T19:27:12.773 INFO:tasks.workunit.client.0.vm07.stdout:5/154: write d3/d1a/d28/f2e [1411723,94130] 0
2026-03-09T19:27:12.777 INFO:tasks.workunit.client.0.vm07.stdout:3/226: dread d1/d1f/f1a [4194304,4194304] 0
2026-03-09T19:27:12.778 INFO:tasks.workunit.client.1.vm08.stdout:8/444: dread de/d25/d33/d46/f50 [0,4194304] 0
2026-03-09T19:27:12.780
INFO:tasks.workunit.client.0.vm07.stdout:3/227: sync 2026-03-09T19:27:12.781 INFO:tasks.workunit.client.1.vm08.stdout:3/492: creat d0/d52/d7c/f99 x:0 0 0 2026-03-09T19:27:12.781 INFO:tasks.workunit.client.0.vm07.stdout:3/228: read d1/d1f/f38 [2813995,102753] 0 2026-03-09T19:27:12.783 INFO:tasks.workunit.client.1.vm08.stdout:1/581: dread d9/da/d12/d39/f52 [0,4194304] 0 2026-03-09T19:27:12.784 INFO:tasks.workunit.client.0.vm07.stdout:6/174: creat d0/f49 x:0 0 0 2026-03-09T19:27:12.785 INFO:tasks.workunit.client.1.vm08.stdout:1/582: dread - d9/f36 zero size 2026-03-09T19:27:12.788 INFO:tasks.workunit.client.1.vm08.stdout:1/583: write d9/da/f6f [882433,38181] 0 2026-03-09T19:27:12.791 INFO:tasks.workunit.client.0.vm07.stdout:9/234: creat d0/d6/d57/f59 x:0 0 0 2026-03-09T19:27:12.792 INFO:tasks.workunit.client.0.vm07.stdout:9/235: write d0/db/f39 [382320,54489] 0 2026-03-09T19:27:12.793 INFO:tasks.workunit.client.1.vm08.stdout:6/453: write d3/f6e [3976143,33433] 0 2026-03-09T19:27:12.795 INFO:tasks.workunit.client.0.vm07.stdout:4/170: dwrite d3/d11/f18 [0,4194304] 0 2026-03-09T19:27:12.795 INFO:tasks.workunit.client.0.vm07.stdout:4/171: chown d3/ce 72 1 2026-03-09T19:27:12.797 INFO:tasks.workunit.client.1.vm08.stdout:3/493: dwrite d0/d6/de/d15/f53 [0,4194304] 0 2026-03-09T19:27:12.799 INFO:tasks.workunit.client.1.vm08.stdout:3/494: chown d0/d6/de/d6e 99 1 2026-03-09T19:27:12.803 INFO:tasks.workunit.client.0.vm07.stdout:0/171: fdatasync d0/d6/d13/d17/d19/f1f 0 2026-03-09T19:27:12.804 INFO:tasks.workunit.client.0.vm07.stdout:0/172: fsync d0/d6/d13/d17/d19/f1b 0 2026-03-09T19:27:12.806 INFO:tasks.workunit.client.1.vm08.stdout:4/455: mknod da/d10/d1b/c89 0 2026-03-09T19:27:12.807 INFO:tasks.workunit.client.1.vm08.stdout:4/456: fdatasync da/d10/d1b/f29 0 2026-03-09T19:27:12.807 INFO:tasks.workunit.client.1.vm08.stdout:9/436: write d0/d2/d80/d69/f93 [2159,37660] 0 2026-03-09T19:27:12.808 INFO:tasks.workunit.client.1.vm08.stdout:2/435: mkdir d3/d9/d79/d9f 0 
2026-03-09T19:27:12.815 INFO:tasks.workunit.client.1.vm08.stdout:0/470: chown dd/d22/d63/d6e/f8a 23 1 2026-03-09T19:27:12.825 INFO:tasks.workunit.client.0.vm07.stdout:2/208: rename d3/d11/l4a to d3/l4d 0 2026-03-09T19:27:12.825 INFO:tasks.workunit.client.0.vm07.stdout:2/209: chown d3/l9 842196 1 2026-03-09T19:27:12.826 INFO:tasks.workunit.client.0.vm07.stdout:7/180: symlink d0/d4/d5/d26/d32/l3f 0 2026-03-09T19:27:12.830 INFO:tasks.workunit.client.1.vm08.stdout:1/584: creat d9/da/d53/fae x:0 0 0 2026-03-09T19:27:12.830 INFO:tasks.workunit.client.1.vm08.stdout:5/415: getdents d16/d45/d87 0 2026-03-09T19:27:12.830 INFO:tasks.workunit.client.0.vm07.stdout:7/181: fsync d0/d4/d5/d8/fa 0 2026-03-09T19:27:12.830 INFO:tasks.workunit.client.0.vm07.stdout:5/155: creat d3/d1a/d28/f39 x:0 0 0 2026-03-09T19:27:12.830 INFO:tasks.workunit.client.0.vm07.stdout:5/156: chown d3/d1a/d28/d36/l38 1207200 1 2026-03-09T19:27:12.830 INFO:tasks.workunit.client.0.vm07.stdout:2/210: dread d3/dd/d16/d29/f42 [0,4194304] 0 2026-03-09T19:27:12.830 INFO:tasks.workunit.client.0.vm07.stdout:5/157: write d3/d1a/fa [4657728,54162] 0 2026-03-09T19:27:12.830 INFO:tasks.workunit.client.0.vm07.stdout:1/146: creat d1/f38 x:0 0 0 2026-03-09T19:27:12.831 INFO:tasks.workunit.client.0.vm07.stdout:2/211: stat d3/dd/d16/d30 0 2026-03-09T19:27:12.831 INFO:tasks.workunit.client.1.vm08.stdout:1/585: read d9/da/dc/f31 [4821000,17468] 0 2026-03-09T19:27:12.834 INFO:tasks.workunit.client.1.vm08.stdout:1/586: dread - d9/d11/f9b zero size 2026-03-09T19:27:12.835 INFO:tasks.workunit.client.1.vm08.stdout:6/454: rename d3/d34/d6f/d4b to d3/d34/d5c/da2 0 2026-03-09T19:27:12.836 INFO:tasks.workunit.client.1.vm08.stdout:3/495: symlink d0/d52/d7c/l9a 0 2026-03-09T19:27:12.836 INFO:tasks.workunit.client.1.vm08.stdout:0/471: dwrite dd/d22/f8d [0,4194304] 0 2026-03-09T19:27:12.837 INFO:tasks.workunit.client.0.vm07.stdout:9/236: mkdir d0/db/d29/d2c/d36/d5a 0 2026-03-09T19:27:12.839 INFO:tasks.workunit.client.1.vm08.stdout:4/457: 
chown da/f18 0 1 2026-03-09T19:27:12.842 INFO:tasks.workunit.client.0.vm07.stdout:8/202: mknod d7/d50/c53 0 2026-03-09T19:27:12.842 INFO:tasks.workunit.client.0.vm07.stdout:8/203: chown d7/d16/c1f 29 1 2026-03-09T19:27:12.843 INFO:tasks.workunit.client.0.vm07.stdout:8/204: write d7/d30/f3e [1276897,95128] 0 2026-03-09T19:27:12.847 INFO:tasks.workunit.client.0.vm07.stdout:9/237: sync 2026-03-09T19:27:12.847 INFO:tasks.workunit.client.0.vm07.stdout:9/238: chown d0/d17 133077421 1 2026-03-09T19:27:12.848 INFO:tasks.workunit.client.1.vm08.stdout:2/436: symlink d3/d4/d3e/d4e/d88/la0 0 2026-03-09T19:27:12.850 INFO:tasks.workunit.client.0.vm07.stdout:9/239: dread d0/db/f21 [4194304,4194304] 0 2026-03-09T19:27:12.858 INFO:tasks.workunit.client.1.vm08.stdout:3/496: dread d0/d4b/f74 [0,4194304] 0 2026-03-09T19:27:12.862 INFO:tasks.workunit.client.0.vm07.stdout:6/175: rename d0/d1/db/d17/f2f to d0/d2d/f4a 0 2026-03-09T19:27:12.866 INFO:tasks.workunit.client.1.vm08.stdout:7/516: write d5/d14/dae/f6b [2655640,126662] 0 2026-03-09T19:27:12.875 INFO:tasks.workunit.client.1.vm08.stdout:3/497: dwrite d0/f7a [0,4194304] 0 2026-03-09T19:27:12.884 INFO:tasks.workunit.client.0.vm07.stdout:7/182: unlink d0/d4/d5/dd/f16 0 2026-03-09T19:27:12.888 INFO:tasks.workunit.client.1.vm08.stdout:1/587: creat d9/d11/faf x:0 0 0 2026-03-09T19:27:12.895 INFO:tasks.workunit.client.1.vm08.stdout:0/472: creat dd/d22/d27/d4f/f97 x:0 0 0 2026-03-09T19:27:12.902 INFO:tasks.workunit.client.1.vm08.stdout:4/458: chown da/d10/d16/d28/d46/d52/d6e/l61 30882 1 2026-03-09T19:27:12.904 INFO:tasks.workunit.client.1.vm08.stdout:1/588: dwrite d9/da/dc/fa5 [0,4194304] 0 2026-03-09T19:27:12.904 INFO:tasks.workunit.client.1.vm08.stdout:8/445: creat de/d25/d31/d82/fa1 x:0 0 0 2026-03-09T19:27:12.905 INFO:tasks.workunit.client.1.vm08.stdout:4/459: dread - da/d10/d26/f87 zero size 2026-03-09T19:27:12.908 INFO:tasks.workunit.client.1.vm08.stdout:0/473: dwrite dd/d22/d24/d49/d50/f95 [0,4194304] 0 2026-03-09T19:27:12.917 
INFO:tasks.workunit.client.1.vm08.stdout:1/589: write d9/da/d2c/d6a/f9c [407575,114633] 0 2026-03-09T19:27:12.917 INFO:tasks.workunit.client.1.vm08.stdout:0/474: truncate dd/d22/d63/f94 677097 0 2026-03-09T19:27:12.917 INFO:tasks.workunit.client.0.vm07.stdout:3/229: mkdir d1/d1f/d3e 0 2026-03-09T19:27:12.917 INFO:tasks.workunit.client.0.vm07.stdout:3/230: truncate d1/f22 872355 0 2026-03-09T19:27:12.918 INFO:tasks.workunit.client.0.vm07.stdout:8/205: mknod d7/d9/d37/d45/c54 0 2026-03-09T19:27:12.918 INFO:tasks.workunit.client.0.vm07.stdout:8/206: fsync d7/f15 0 2026-03-09T19:27:12.918 INFO:tasks.workunit.client.0.vm07.stdout:8/207: fdatasync d7/d9/d10/f20 0 2026-03-09T19:27:12.918 INFO:tasks.workunit.client.0.vm07.stdout:6/176: creat d0/d1/db/f4b x:0 0 0 2026-03-09T19:27:12.925 INFO:tasks.workunit.client.1.vm08.stdout:7/517: rmdir d5/d14/dae/d1c 39 2026-03-09T19:27:12.926 INFO:tasks.workunit.client.1.vm08.stdout:3/498: dread d0/d6/de/d1b/d16/f7b [0,4194304] 0 2026-03-09T19:27:12.926 INFO:tasks.workunit.client.1.vm08.stdout:7/518: truncate d5/d14/d2b/f37 4951562 0 2026-03-09T19:27:12.928 INFO:tasks.workunit.client.1.vm08.stdout:7/519: dread d5/d14/dae/d3a/d42/d6a/d8f/f98 [0,4194304] 0 2026-03-09T19:27:12.933 INFO:tasks.workunit.client.0.vm07.stdout:5/158: symlink d3/dd/d26/d2c/l3a 0 2026-03-09T19:27:12.933 INFO:tasks.workunit.client.1.vm08.stdout:1/590: symlink d9/d11/d7a/d89/lb0 0 2026-03-09T19:27:12.935 INFO:tasks.workunit.client.0.vm07.stdout:0/173: creat d0/d6/d13/d1c/f3e x:0 0 0 2026-03-09T19:27:12.948 INFO:tasks.workunit.client.0.vm07.stdout:5/159: dwrite d3/fe [0,4194304] 0 2026-03-09T19:27:12.948 INFO:tasks.workunit.client.0.vm07.stdout:9/240: mknod d0/c5b 0 2026-03-09T19:27:12.948 INFO:tasks.workunit.client.1.vm08.stdout:9/437: rename d0/d2/d8/d7 to d0/d1b/d97 0 2026-03-09T19:27:12.948 INFO:tasks.workunit.client.1.vm08.stdout:3/499: mknod d0/d6/de/d6e/d51/c9b 0 2026-03-09T19:27:12.948 INFO:tasks.workunit.client.1.vm08.stdout:7/520: unlink d5/d14/dae/f95 0 
2026-03-09T19:27:12.948 INFO:tasks.workunit.client.1.vm08.stdout:7/521: write d5/d14/dae/d3a/d42/f9a [97632,48843] 0 2026-03-09T19:27:12.948 INFO:tasks.workunit.client.1.vm08.stdout:9/438: dread d0/d1b/d97/f58 [0,4194304] 0 2026-03-09T19:27:12.950 INFO:tasks.workunit.client.1.vm08.stdout:9/439: fdatasync d0/f83 0 2026-03-09T19:27:12.953 INFO:tasks.workunit.client.1.vm08.stdout:4/460: mkdir da/d10/d16/d28/d2f/d4f/d64/d84/d8a 0 2026-03-09T19:27:12.968 INFO:tasks.workunit.client.1.vm08.stdout:3/500: fsync d0/d4b/f74 0 2026-03-09T19:27:12.968 INFO:tasks.workunit.client.1.vm08.stdout:7/522: unlink d5/d14/d27/l33 0 2026-03-09T19:27:12.968 INFO:tasks.workunit.client.1.vm08.stdout:9/440: fdatasync d0/d2/d14/d5c/fd 0 2026-03-09T19:27:12.968 INFO:tasks.workunit.client.0.vm07.stdout:6/177: mkdir d0/d1/db/d17/d4c 0 2026-03-09T19:27:12.968 INFO:tasks.workunit.client.0.vm07.stdout:1/147: link d1/d3/d21/l2a d1/db/l39 0 2026-03-09T19:27:12.968 INFO:tasks.workunit.client.0.vm07.stdout:3/231: symlink d1/d3d/l3f 0 2026-03-09T19:27:12.968 INFO:tasks.workunit.client.0.vm07.stdout:5/160: fdatasync d3/d1a/f12 0 2026-03-09T19:27:12.970 INFO:tasks.workunit.client.0.vm07.stdout:8/208: creat d7/d9/d37/d34/f55 x:0 0 0 2026-03-09T19:27:12.971 INFO:tasks.workunit.client.0.vm07.stdout:8/209: fdatasync d7/f2e 0 2026-03-09T19:27:12.977 INFO:tasks.workunit.client.0.vm07.stdout:6/178: creat d0/d1/db/d24/f4d x:0 0 0 2026-03-09T19:27:12.979 INFO:tasks.workunit.client.1.vm08.stdout:3/501: symlink d0/d6/de/d1b/d16/l9c 0 2026-03-09T19:27:12.981 INFO:tasks.workunit.client.0.vm07.stdout:7/183: creat d0/d4/d5/f40 x:0 0 0 2026-03-09T19:27:12.982 INFO:tasks.workunit.client.1.vm08.stdout:4/461: mknod da/d10/d16/d28/d2f/d4f/d64/d84/d8a/c8b 0 2026-03-09T19:27:12.986 INFO:tasks.workunit.client.0.vm07.stdout:7/184: dwrite d0/d4/d5/f20 [0,4194304] 0 2026-03-09T19:27:12.989 INFO:tasks.workunit.client.0.vm07.stdout:2/212: getdents d3/dd/d16/d29 0 2026-03-09T19:27:12.989 INFO:tasks.workunit.client.0.vm07.stdout:1/148: 
dread d1/d9/f16 [0,4194304] 0 2026-03-09T19:27:12.989 INFO:tasks.workunit.client.0.vm07.stdout:2/213: write d3/d11/f39 [1258358,116425] 0 2026-03-09T19:27:13.005 INFO:tasks.workunit.client.1.vm08.stdout:8/446: link de/d25/l5a de/d1d/d2e/la2 0 2026-03-09T19:27:13.006 INFO:tasks.workunit.client.1.vm08.stdout:9/441: dread d0/d1b/f49 [4194304,4194304] 0 2026-03-09T19:27:13.007 INFO:tasks.workunit.client.1.vm08.stdout:9/442: truncate d0/d1b/d4e/f7d 836339 0 2026-03-09T19:27:13.007 INFO:tasks.workunit.client.1.vm08.stdout:9/443: stat d0/d1b/d97/d48/d5d/c55 0 2026-03-09T19:27:13.011 INFO:tasks.workunit.client.1.vm08.stdout:7/523: creat d5/d14/dae/d1c/d73/fb3 x:0 0 0 2026-03-09T19:27:13.012 INFO:tasks.workunit.client.1.vm08.stdout:7/524: chown d5/d14/dae/d1c/d83/f9b 424 1 2026-03-09T19:27:13.016 INFO:tasks.workunit.client.0.vm07.stdout:5/161: symlink d3/d1a/d28/d36/l3b 0 2026-03-09T19:27:13.017 INFO:tasks.workunit.client.1.vm08.stdout:3/502: dread d0/d8/f5b [0,4194304] 0 2026-03-09T19:27:13.018 INFO:tasks.workunit.client.1.vm08.stdout:3/503: stat d0/d8/d19/c3a 0 2026-03-09T19:27:13.019 INFO:tasks.workunit.client.0.vm07.stdout:9/241: mkdir d0/db/d29/d32/d5c 0 2026-03-09T19:27:13.023 INFO:tasks.workunit.client.1.vm08.stdout:1/591: rename d9/d11/f5f to d9/da/d17/fb1 0 2026-03-09T19:27:13.024 INFO:tasks.workunit.client.0.vm07.stdout:8/210: mkdir d7/d9/d37/d45/d56 0 2026-03-09T19:27:13.026 INFO:tasks.workunit.client.0.vm07.stdout:6/179: chown d0/d1/l7 11 1 2026-03-09T19:27:13.026 INFO:tasks.workunit.client.1.vm08.stdout:7/525: dread - d5/d14/dae/d1c/f87 zero size 2026-03-09T19:27:13.030 INFO:tasks.workunit.client.0.vm07.stdout:7/185: mkdir d0/d4/d5/d8/d41 0 2026-03-09T19:27:13.032 INFO:tasks.workunit.client.0.vm07.stdout:1/149: symlink d1/d11/d37/l3a 0 2026-03-09T19:27:13.032 INFO:tasks.workunit.client.0.vm07.stdout:1/150: chown d1/d3/d21/l2b 451663 1 2026-03-09T19:27:13.033 INFO:tasks.workunit.client.0.vm07.stdout:1/151: readlink d1/d9/l1e 0 2026-03-09T19:27:13.035 
INFO:tasks.workunit.client.0.vm07.stdout:2/214: mknod d3/dd/d16/d29/d2d/d45/c4e 0 2026-03-09T19:27:13.055 INFO:tasks.workunit.client.1.vm08.stdout:9/444: rename d0/d2/d14/d5c/d32 to d0/d2/d14/d98 0 2026-03-09T19:27:13.055 INFO:tasks.workunit.client.1.vm08.stdout:1/592: mkdir d9/da/d53/d67/d6c/d76/db2 0 2026-03-09T19:27:13.055 INFO:tasks.workunit.client.0.vm07.stdout:1/152: dwrite d1/f2f [0,4194304] 0 2026-03-09T19:27:13.055 INFO:tasks.workunit.client.0.vm07.stdout:1/153: fsync d1/d11/d37/f2c 0 2026-03-09T19:27:13.055 INFO:tasks.workunit.client.0.vm07.stdout:5/162: creat d3/d1a/d28/f3c x:0 0 0 2026-03-09T19:27:13.055 INFO:tasks.workunit.client.1.vm08.stdout:4/462: creat da/d10/d16/d28/d2f/f8c x:0 0 0 2026-03-09T19:27:13.059 INFO:tasks.workunit.client.0.vm07.stdout:5/163: dread d3/d1a/f1c [0,4194304] 0 2026-03-09T19:27:13.064 INFO:tasks.workunit.client.0.vm07.stdout:2/215: creat d3/dd/d16/d30/d40/f4f x:0 0 0 2026-03-09T19:27:13.064 INFO:tasks.workunit.client.1.vm08.stdout:1/593: dread d9/da/d2d/f41 [0,4194304] 0 2026-03-09T19:27:13.066 INFO:tasks.workunit.client.0.vm07.stdout:1/154: truncate d1/db/f14 690411 0 2026-03-09T19:27:13.067 INFO:tasks.workunit.client.0.vm07.stdout:1/155: read - d1/d3/d21/f2e zero size 2026-03-09T19:27:13.068 INFO:tasks.workunit.client.1.vm08.stdout:7/526: mknod d5/d14/dae/d1c/d83/d9c/cb4 0 2026-03-09T19:27:13.073 INFO:tasks.workunit.client.1.vm08.stdout:3/504: creat d0/d6/de/f9d x:0 0 0 2026-03-09T19:27:13.073 INFO:tasks.workunit.client.1.vm08.stdout:3/505: fdatasync d0/d8/f4c 0 2026-03-09T19:27:13.074 INFO:tasks.workunit.client.1.vm08.stdout:3/506: dread - d0/d6/de/d6e/f83 zero size 2026-03-09T19:27:13.090 INFO:tasks.workunit.client.1.vm08.stdout:5/416: truncate d16/d1e/f44 4142714 0 2026-03-09T19:27:13.090 INFO:tasks.workunit.client.1.vm08.stdout:6/455: write d3/fc [9110196,70772] 0 2026-03-09T19:27:13.090 INFO:tasks.workunit.client.1.vm08.stdout:5/417: write d16/d45/f6b [3883759,52972] 0 2026-03-09T19:27:13.090 
INFO:tasks.workunit.client.0.vm07.stdout:1/156: dread d1/d9/f26 [0,4194304] 0 2026-03-09T19:27:13.090 INFO:tasks.workunit.client.0.vm07.stdout:9/242: mkdir d0/d6/d57/d5d 0 2026-03-09T19:27:13.090 INFO:tasks.workunit.client.0.vm07.stdout:4/172: write d3/f13 [2292028,115420] 0 2026-03-09T19:27:13.090 INFO:tasks.workunit.client.0.vm07.stdout:4/173: dread - d3/d11/d16/d2f/f26 zero size 2026-03-09T19:27:13.090 INFO:tasks.workunit.client.0.vm07.stdout:8/211: write f5 [2441067,64079] 0 2026-03-09T19:27:13.090 INFO:tasks.workunit.client.0.vm07.stdout:6/180: mkdir d0/d4e 0 2026-03-09T19:27:13.090 INFO:tasks.workunit.client.0.vm07.stdout:4/174: dwrite d3/d11/f18 [0,4194304] 0 2026-03-09T19:27:13.090 INFO:tasks.workunit.client.0.vm07.stdout:8/212: truncate d7/d1d/f3f 617070 0 2026-03-09T19:27:13.090 INFO:tasks.workunit.client.1.vm08.stdout:1/594: dread d9/da/d53/d67/f79 [0,4194304] 0 2026-03-09T19:27:13.094 INFO:tasks.workunit.client.0.vm07.stdout:7/186: unlink d0/d4/d5/d26/d3c/c3b 0 2026-03-09T19:27:13.094 INFO:tasks.workunit.client.0.vm07.stdout:5/164: rmdir d3/dd/d26 39 2026-03-09T19:27:13.094 INFO:tasks.workunit.client.0.vm07.stdout:2/216: symlink d3/dd/d16/d29/d3c/l50 0 2026-03-09T19:27:13.096 INFO:tasks.workunit.client.0.vm07.stdout:2/217: readlink d3/l9 0 2026-03-09T19:27:13.099 INFO:tasks.workunit.client.1.vm08.stdout:6/456: creat d3/db/d43/d69/fa3 x:0 0 0 2026-03-09T19:27:13.101 INFO:tasks.workunit.client.1.vm08.stdout:5/418: symlink d16/d1e/d30/d6f/l88 0 2026-03-09T19:27:13.102 INFO:tasks.workunit.client.1.vm08.stdout:5/419: fdatasync d16/d1e/f27 0 2026-03-09T19:27:13.102 INFO:tasks.workunit.client.1.vm08.stdout:5/420: chown d16/d1e/d3b/d61/l78 3 1 2026-03-09T19:27:13.113 INFO:tasks.workunit.client.0.vm07.stdout:1/157: creat d1/d9/f3b x:0 0 0 2026-03-09T19:27:13.113 INFO:tasks.workunit.client.0.vm07.stdout:8/213: mkdir d7/d9/d57 0 2026-03-09T19:27:13.114 INFO:tasks.workunit.client.1.vm08.stdout:7/527: mkdir d5/d14/dae/d1c/db5 0 2026-03-09T19:27:13.114 
INFO:tasks.workunit.client.1.vm08.stdout:7/528: write d5/d14/d2b/d5d/fb2 [374353,104886] 0 2026-03-09T19:27:13.114 INFO:tasks.workunit.client.1.vm08.stdout:7/529: fdatasync d5/d14/dae/d1c/d73/fb3 0 2026-03-09T19:27:13.114 INFO:tasks.workunit.client.1.vm08.stdout:1/595: readlink d9/da/l21 0 2026-03-09T19:27:13.114 INFO:tasks.workunit.client.1.vm08.stdout:1/596: chown d9/da/d12/fac 265398272 1 2026-03-09T19:27:13.114 INFO:tasks.workunit.client.1.vm08.stdout:1/597: read d9/d11/d7a/d89/d8d/fa1 [2404057,44834] 0 2026-03-09T19:27:13.114 INFO:tasks.workunit.client.1.vm08.stdout:1/598: fdatasync d9/d11/faf 0 2026-03-09T19:27:13.116 INFO:tasks.workunit.client.1.vm08.stdout:3/507: symlink d0/d6/de/d54/l9e 0 2026-03-09T19:27:13.117 INFO:tasks.workunit.client.0.vm07.stdout:4/175: creat d3/d11/f35 x:0 0 0 2026-03-09T19:27:13.118 INFO:tasks.workunit.client.0.vm07.stdout:4/176: write d3/d11/d1b/f2e [8890,32685] 0 2026-03-09T19:27:13.119 INFO:tasks.workunit.client.1.vm08.stdout:6/457: mkdir d3/d34/d3b/d85/da4 0 2026-03-09T19:27:13.125 INFO:tasks.workunit.client.1.vm08.stdout:6/458: dwrite d3/d34/d3b/f8d [0,4194304] 0 2026-03-09T19:27:13.140 INFO:tasks.workunit.client.0.vm07.stdout:9/243: sync 2026-03-09T19:27:13.142 INFO:tasks.workunit.client.0.vm07.stdout:1/158: creat d1/db/f3c x:0 0 0 2026-03-09T19:27:13.142 INFO:tasks.workunit.client.0.vm07.stdout:1/159: fdatasync d1/d11/f24 0 2026-03-09T19:27:13.143 INFO:tasks.workunit.client.0.vm07.stdout:1/160: write d1/f38 [220508,54855] 0 2026-03-09T19:27:13.143 INFO:tasks.workunit.client.1.vm08.stdout:3/508: mknod d0/d52/d7c/c9f 0 2026-03-09T19:27:13.147 INFO:tasks.workunit.client.1.vm08.stdout:3/509: dread - d0/d6/de/d1b/f7d zero size 2026-03-09T19:27:13.159 INFO:tasks.workunit.client.1.vm08.stdout:6/459: unlink d3/d15/f45 0 2026-03-09T19:27:13.161 INFO:tasks.workunit.client.1.vm08.stdout:6/460: write d3/d34/d3b/f67 [1494489,77664] 0 2026-03-09T19:27:13.173 INFO:tasks.workunit.client.1.vm08.stdout:7/530: dread d5/d14/dae/d1c/f5a 
[0,4194304] 0 2026-03-09T19:27:13.176 INFO:tasks.workunit.client.0.vm07.stdout:0/174: truncate d0/d6/d13/d17/f20 2993738 0 2026-03-09T19:27:13.177 INFO:tasks.workunit.client.0.vm07.stdout:0/175: stat d0/d6/d13 0 2026-03-09T19:27:13.177 INFO:tasks.workunit.client.0.vm07.stdout:0/176: stat d0/d6/d13/d17/c37 0 2026-03-09T19:27:13.179 INFO:tasks.workunit.client.1.vm08.stdout:5/421: mknod d16/d1e/d6e/d84/c89 0 2026-03-09T19:27:13.180 INFO:tasks.workunit.client.0.vm07.stdout:0/177: dwrite d0/fa [0,4194304] 0 2026-03-09T19:27:13.181 INFO:tasks.workunit.client.0.vm07.stdout:4/177: mknod d3/d11/d16/c36 0 2026-03-09T19:27:13.184 INFO:tasks.workunit.client.1.vm08.stdout:0/475: dwrite dd/d22/f5c [0,4194304] 0 2026-03-09T19:27:13.184 INFO:tasks.workunit.client.0.vm07.stdout:0/178: stat d0/d6/d13/d17/d19/f34 0 2026-03-09T19:27:13.186 INFO:tasks.workunit.client.0.vm07.stdout:5/165: mknod d3/dd/d26/d2d/c3d 0 2026-03-09T19:27:13.187 INFO:tasks.workunit.client.0.vm07.stdout:5/166: rename d3 to d3/d1a/d28/d36/d3e 22 2026-03-09T19:27:13.194 INFO:tasks.workunit.client.0.vm07.stdout:4/178: dwrite d3/d11/d1b/f2e [0,4194304] 0 2026-03-09T19:27:13.194 INFO:tasks.workunit.client.0.vm07.stdout:0/179: dread d0/d6/d13/d17/f2b [0,4194304] 0 2026-03-09T19:27:13.195 INFO:tasks.workunit.client.1.vm08.stdout:3/510: rename d0/d6/f57 to d0/d6/de/d15/d96/fa0 0 2026-03-09T19:27:13.197 INFO:tasks.workunit.client.1.vm08.stdout:8/447: dwrite de/f1f [0,4194304] 0 2026-03-09T19:27:13.198 INFO:tasks.workunit.client.1.vm08.stdout:3/511: write d0/d6/de/d6e/f81 [468589,60911] 0 2026-03-09T19:27:13.205 INFO:tasks.workunit.client.1.vm08.stdout:7/531: fdatasync d5/d14/d2b/d4b/f96 0 2026-03-09T19:27:13.205 INFO:tasks.workunit.client.1.vm08.stdout:3/512: dread d0/d6/d25/f56 [0,4194304] 0 2026-03-09T19:27:13.209 INFO:tasks.workunit.client.1.vm08.stdout:3/513: write d0/d4b/f74 [122545,4853] 0 2026-03-09T19:27:13.212 INFO:tasks.workunit.client.0.vm07.stdout:9/244: truncate d0/d6/f8 2143241 0 2026-03-09T19:27:13.213 
INFO:tasks.workunit.client.0.vm07.stdout:9/245: write d0/d6/d57/f59 [529899,61090] 0 2026-03-09T19:27:13.216 INFO:tasks.workunit.client.0.vm07.stdout:3/232: write d1/d1f/d16/f3a [3158955,100961] 0 2026-03-09T19:27:13.234 INFO:tasks.workunit.client.1.vm08.stdout:9/445: dwrite d0/d2/d80/f6a [0,4194304] 0 2026-03-09T19:27:13.234 INFO:tasks.workunit.client.1.vm08.stdout:6/461: mknod d3/d34/d3b/d85/da4/ca5 0 2026-03-09T19:27:13.234 INFO:tasks.workunit.client.1.vm08.stdout:8/448: mknod de/d25/d31/ca3 0 2026-03-09T19:27:13.235 INFO:tasks.workunit.client.0.vm07.stdout:3/233: write d1/d6/f1b [917597,66433] 0 2026-03-09T19:27:13.235 INFO:tasks.workunit.client.0.vm07.stdout:3/234: fsync d1/d6/fb 0 2026-03-09T19:27:13.235 INFO:tasks.workunit.client.0.vm07.stdout:3/235: fsync d1/d6/f37 0 2026-03-09T19:27:13.235 INFO:tasks.workunit.client.0.vm07.stdout:3/236: dread - d1/d26/f31 zero size 2026-03-09T19:27:13.235 INFO:tasks.workunit.client.0.vm07.stdout:6/181: creat d0/f4f x:0 0 0 2026-03-09T19:27:13.235 INFO:tasks.workunit.client.0.vm07.stdout:6/182: dwrite d0/d13/f26 [0,4194304] 0 2026-03-09T19:27:13.238 INFO:tasks.workunit.client.0.vm07.stdout:5/167: mkdir d3/dd/d26/d3f 0 2026-03-09T19:27:13.241 INFO:tasks.workunit.client.0.vm07.stdout:4/179: rename d3/d11/d1b to d3/d11/d2b/d37 0 2026-03-09T19:27:13.241 INFO:tasks.workunit.client.0.vm07.stdout:4/180: dread - d3/d11/d2b/f2c zero size 2026-03-09T19:27:13.256 INFO:tasks.workunit.client.0.vm07.stdout:9/246: rmdir d0/db/d29 39 2026-03-09T19:27:13.259 INFO:tasks.workunit.client.1.vm08.stdout:9/446: mkdir d0/d2/d14/d98/d99 0 2026-03-09T19:27:13.267 INFO:tasks.workunit.client.0.vm07.stdout:9/247: dwrite d0/d6/f48 [0,4194304] 0 2026-03-09T19:27:13.268 INFO:tasks.workunit.client.0.vm07.stdout:3/237: symlink d1/d1f/d16/d28/l40 0 2026-03-09T19:27:13.268 INFO:tasks.workunit.client.0.vm07.stdout:3/238: readlink d1/d3d/l3f 0 2026-03-09T19:27:13.270 INFO:tasks.workunit.client.1.vm08.stdout:8/449: chown de/f1c 1263 1 2026-03-09T19:27:13.271 
INFO:tasks.workunit.client.1.vm08.stdout:9/447: dread d0/d1b/d97/d48/d6f/f84 [0,4194304] 0 2026-03-09T19:27:13.274 INFO:tasks.workunit.client.1.vm08.stdout:3/514: dread d0/d6/de/d1b/d16/d17/f3f [0,4194304] 0 2026-03-09T19:27:13.274 INFO:tasks.workunit.client.1.vm08.stdout:5/422: dread d16/d1e/f57 [0,4194304] 0 2026-03-09T19:27:13.276 INFO:tasks.workunit.client.1.vm08.stdout:6/462: symlink d3/d68/d7e/la6 0 2026-03-09T19:27:13.278 INFO:tasks.workunit.client.0.vm07.stdout:5/168: mkdir d3/d1a/d28/d40 0 2026-03-09T19:27:13.282 INFO:tasks.workunit.client.0.vm07.stdout:0/180: creat d0/d6/d13/d1c/d11/f3f x:0 0 0 2026-03-09T19:27:13.287 INFO:tasks.workunit.client.1.vm08.stdout:5/423: mkdir d16/d1e/d30/d8a 0 2026-03-09T19:27:13.288 INFO:tasks.workunit.client.0.vm07.stdout:8/214: getdents d7/d9/d37/d45 0 2026-03-09T19:27:13.295 INFO:tasks.workunit.client.1.vm08.stdout:6/463: creat d3/db/d43/d69/da0/fa7 x:0 0 0 2026-03-09T19:27:13.301 INFO:tasks.workunit.client.1.vm08.stdout:3/515: creat d0/d6/d93/fa1 x:0 0 0 2026-03-09T19:27:13.301 INFO:tasks.workunit.client.1.vm08.stdout:6/464: read d3/f9 [4323752,101380] 0 2026-03-09T19:27:13.301 INFO:tasks.workunit.client.1.vm08.stdout:5/424: rmdir d16 39 2026-03-09T19:27:13.301 INFO:tasks.workunit.client.1.vm08.stdout:9/448: rename d0/d1b/d97/d48/d5d/c55 to d0/d1b/c9a 0 2026-03-09T19:27:13.304 INFO:tasks.workunit.client.1.vm08.stdout:9/449: chown d0/d2/d8/fe 171409921 1 2026-03-09T19:27:13.305 INFO:tasks.workunit.client.1.vm08.stdout:3/516: unlink d0/d6/de/d6e/f2c 0 2026-03-09T19:27:13.308 INFO:tasks.workunit.client.1.vm08.stdout:3/517: write d0/d6/de/d1b/f7d [156894,98430] 0 2026-03-09T19:27:13.313 INFO:tasks.workunit.client.0.vm07.stdout:5/169: write d3/d1a/f1c [3359947,77026] 0 2026-03-09T19:27:13.314 INFO:tasks.workunit.client.0.vm07.stdout:0/181: rename d0/d6/d13/d17/d19/f1b to d0/d6/d13/d1c/d11/f40 0 2026-03-09T19:27:13.315 INFO:tasks.workunit.client.0.vm07.stdout:0/182: chown d0/d6/d13/d17/c26 286281 1 2026-03-09T19:27:13.319 
INFO:tasks.workunit.client.0.vm07.stdout:0/183: dwrite d0/d6/d13/d33/f35 [0,4194304] 0 2026-03-09T19:27:13.321 INFO:tasks.workunit.client.1.vm08.stdout:5/425: write d16/d1e/f35 [2019568,48143] 0 2026-03-09T19:27:13.321 INFO:tasks.workunit.client.0.vm07.stdout:9/248: creat d0/d17/f5e x:0 0 0 2026-03-09T19:27:13.330 INFO:tasks.workunit.client.0.vm07.stdout:3/239: symlink d1/d1f/d3e/l41 0 2026-03-09T19:27:13.334 INFO:tasks.workunit.client.1.vm08.stdout:3/518: mknod d0/d6/de/d15/ca2 0 2026-03-09T19:27:13.334 INFO:tasks.workunit.client.1.vm08.stdout:9/450: getdents d0/d2/d14/d98/d99 0 2026-03-09T19:27:13.341 INFO:tasks.workunit.client.1.vm08.stdout:9/451: creat d0/d1b/d97/d48/d5d/f9b x:0 0 0 2026-03-09T19:27:13.347 INFO:tasks.workunit.client.0.vm07.stdout:3/240: read d1/d1f/d16/f3a [304648,105890] 0 2026-03-09T19:27:13.349 INFO:tasks.workunit.client.1.vm08.stdout:5/426: fsync d16/f56 0 2026-03-09T19:27:13.351 INFO:tasks.workunit.client.1.vm08.stdout:5/427: fdatasync d16/d45/f6b 0 2026-03-09T19:27:13.353 INFO:tasks.workunit.client.1.vm08.stdout:3/519: dwrite d0/d6/de/d1b/f7d [0,4194304] 0 2026-03-09T19:27:13.356 INFO:tasks.workunit.client.0.vm07.stdout:1/161: fdatasync d1/f2f 0 2026-03-09T19:27:13.374 INFO:tasks.workunit.client.0.vm07.stdout:8/215: mknod d7/d9/d37/d45/d4f/c58 0 2026-03-09T19:27:13.374 INFO:tasks.workunit.client.0.vm07.stdout:7/187: write d0/d4/d5/dd/f18 [788900,87475] 0 2026-03-09T19:27:13.374 INFO:tasks.workunit.client.0.vm07.stdout:8/216: readlink d7/d16/l21 0 2026-03-09T19:27:13.374 INFO:tasks.workunit.client.0.vm07.stdout:7/188: dread - d0/d4/d5/d8/f37 zero size 2026-03-09T19:27:13.374 INFO:tasks.workunit.client.0.vm07.stdout:2/218: truncate d3/dd/f24 2439044 0 2026-03-09T19:27:13.377 INFO:tasks.workunit.client.1.vm08.stdout:5/428: dread d16/d1e/d3b/f43 [0,4194304] 0 2026-03-09T19:27:13.378 INFO:tasks.workunit.client.1.vm08.stdout:3/520: getdents d0/d8 0 2026-03-09T19:27:13.382 INFO:tasks.workunit.client.0.vm07.stdout:3/241: symlink d1/d1f/d16/l42 0 
2026-03-09T19:27:13.382 INFO:tasks.workunit.client.0.vm07.stdout:7/189: creat d0/d4/d5/d26/f42 x:0 0 0
2026-03-09T19:27:13.382 INFO:tasks.workunit.client.0.vm07.stdout:2/219: chown d3/dd/c32 135513 1
2026-03-09T19:27:13.385 INFO:tasks.workunit.client.1.vm08.stdout:3/521: creat d0/d6/de/d15/fa3 x:0 0 0
2026-03-09T19:27:13.386 INFO:tasks.workunit.client.0.vm07.stdout:3/242: symlink d1/d26/l43 0
2026-03-09T19:27:13.388 INFO:tasks.workunit.client.1.vm08.stdout:5/429: link d16/d1e/f37 d16/d1e/d6e/d84/f8b 0
2026-03-09T19:27:13.388 INFO:tasks.workunit.client.0.vm07.stdout:7/190: rmdir d0/d4/d5/d26/d32 39
2026-03-09T19:27:13.389 INFO:tasks.workunit.client.0.vm07.stdout:2/220: mkdir d3/dd/d16/d29/d2d/d51 0
2026-03-09T19:27:13.389 INFO:tasks.workunit.client.0.vm07.stdout:7/191: stat d0/d4/d5/d26 0
2026-03-09T19:27:13.390 INFO:tasks.workunit.client.1.vm08.stdout:3/522: creat d0/d6/de/d15/fa4 x:0 0 0
2026-03-09T19:27:13.390 INFO:tasks.workunit.client.0.vm07.stdout:3/243: dread d1/f22 [0,4194304] 0
2026-03-09T19:27:13.391 INFO:tasks.workunit.client.0.vm07.stdout:1/162: link d1/d11/d37/l3a d1/d3/l3d 0
2026-03-09T19:27:13.404 INFO:tasks.workunit.client.1.vm08.stdout:5/430: mkdir d16/d1e/d8c 0
2026-03-09T19:27:13.404 INFO:tasks.workunit.client.0.vm07.stdout:2/221: symlink d3/dd/d16/d29/d2d/d45/l52 0
2026-03-09T19:27:13.404 INFO:tasks.workunit.client.0.vm07.stdout:7/192: dwrite d0/d4/d5/d8/d1a/f1d [4194304,4194304] 0
2026-03-09T19:27:13.413 INFO:tasks.workunit.client.0.vm07.stdout:3/244: creat d1/d6/dd/f44 x:0 0 0
2026-03-09T19:27:13.414 INFO:tasks.workunit.client.0.vm07.stdout:8/217: getdents d7/d9/d10/d44 0
2026-03-09T19:27:13.414 INFO:tasks.workunit.client.1.vm08.stdout:5/431: rmdir d16/d45 39
2026-03-09T19:27:13.414 INFO:tasks.workunit.client.0.vm07.stdout:8/218: write f5 [294689,20711] 0
2026-03-09T19:27:13.416 INFO:tasks.workunit.client.1.vm08.stdout:1/599: sync
2026-03-09T19:27:13.416 INFO:tasks.workunit.client.1.vm08.stdout:2/437: sync
2026-03-09T19:27:13.418 INFO:tasks.workunit.client.1.vm08.stdout:4/463: sync
2026-03-09T19:27:13.419 INFO:tasks.workunit.client.1.vm08.stdout:4/464: readlink da/le 0
2026-03-09T19:27:13.421 INFO:tasks.workunit.client.0.vm07.stdout:2/222: mkdir d3/dd/d16/d29/d2d/d45/d3b/d53 0
2026-03-09T19:27:13.424 INFO:tasks.workunit.client.0.vm07.stdout:2/223: dwrite f2 [0,4194304] 0
2026-03-09T19:27:13.425 INFO:tasks.workunit.client.0.vm07.stdout:1/163: mkdir d1/d3e 0
2026-03-09T19:27:13.435 INFO:tasks.workunit.client.1.vm08.stdout:5/432: getdents d16/d1e/d30/d8a 0
2026-03-09T19:27:13.441 INFO:tasks.workunit.client.0.vm07.stdout:3/245: fsync d1/d6/f9 0
2026-03-09T19:27:13.446 INFO:tasks.workunit.client.1.vm08.stdout:1/600: truncate d9/da/dc/f1d 2070180 0
2026-03-09T19:27:13.447 INFO:tasks.workunit.client.0.vm07.stdout:3/246: dwrite d1/f2a [0,4194304] 0
2026-03-09T19:27:13.447 INFO:tasks.workunit.client.0.vm07.stdout:3/247: stat d1/d1f/f38 0
2026-03-09T19:27:13.447 INFO:tasks.workunit.client.0.vm07.stdout:7/193: mkdir d0/d4/d5/d26/d43 0
2026-03-09T19:27:13.448 INFO:tasks.workunit.client.0.vm07.stdout:1/164: mkdir d1/d11/d37/d3f 0
2026-03-09T19:27:13.449 INFO:tasks.workunit.client.0.vm07.stdout:2/224: symlink d3/d11/l54 0
2026-03-09T19:27:13.450 INFO:tasks.workunit.client.1.vm08.stdout:2/438: mknod d3/d9/d79/d46/d8c/d92/ca1 0
2026-03-09T19:27:13.451 INFO:tasks.workunit.client.0.vm07.stdout:3/248: dwrite d1/d6/f19 [0,4194304] 0
2026-03-09T19:27:13.460 INFO:tasks.workunit.client.1.vm08.stdout:4/465: creat da/d10/d16/d28/f8d x:0 0 0
2026-03-09T19:27:13.465 INFO:tasks.workunit.client.1.vm08.stdout:1/601: rmdir d9/da/d53 39
2026-03-09T19:27:13.473 INFO:tasks.workunit.client.0.vm07.stdout:1/165: creat d1/d11/d37/f40 x:0 0 0
2026-03-09T19:27:13.473 INFO:tasks.workunit.client.0.vm07.stdout:1/166: write d1/db/f1f [2483529,4585] 0
2026-03-09T19:27:13.473 INFO:tasks.workunit.client.0.vm07.stdout:2/225: dread d3/dd/f34 [0,4194304] 0
2026-03-09T19:27:13.473 INFO:tasks.workunit.client.0.vm07.stdout:2/226: chown d3/dd/d16/d29 24 1
2026-03-09T19:27:13.473 INFO:tasks.workunit.client.0.vm07.stdout:2/227: readlink d3/dd/d16/d29/d2d/d45/l52 0
2026-03-09T19:27:13.474 INFO:tasks.workunit.client.0.vm07.stdout:2/228: fdatasync d3/f1a 0
2026-03-09T19:27:13.478 INFO:tasks.workunit.client.0.vm07.stdout:3/249: mkdir d1/d6/d45 0
2026-03-09T19:27:13.478 INFO:tasks.workunit.client.0.vm07.stdout:3/250: write d1/d1f/d16/d28/f34 [1194551,66240] 0
2026-03-09T19:27:13.479 INFO:tasks.workunit.client.1.vm08.stdout:1/602: rmdir d9/da/d53/d67/d6c 39
2026-03-09T19:27:13.479 INFO:tasks.workunit.client.0.vm07.stdout:3/251: fsync d1/d1f/d16/f1e 0
2026-03-09T19:27:13.481 INFO:tasks.workunit.client.1.vm08.stdout:1/603: stat d9/da/d53/d67/f77 0
2026-03-09T19:27:13.486 INFO:tasks.workunit.client.0.vm07.stdout:2/229: creat d3/dd/d16/d29/d2d/d45/f55 x:0 0 0
2026-03-09T19:27:13.502 INFO:tasks.workunit.client.0.vm07.stdout:8/219: dread d7/f19 [0,4194304] 0
2026-03-09T19:27:13.503 INFO:tasks.workunit.client.1.vm08.stdout:1/604: mkdir d9/da/d53/db3 0
2026-03-09T19:27:13.503 INFO:tasks.workunit.client.1.vm08.stdout:1/605: chown d9/da/d12/d39/la8 68121364 1
2026-03-09T19:27:13.503 INFO:tasks.workunit.client.1.vm08.stdout:1/606: creat d9/da/d12/d91/fb4 x:0 0 0
2026-03-09T19:27:13.503 INFO:tasks.workunit.client.0.vm07.stdout:7/194: rmdir d0/d4/d5/d26/d43 0
2026-03-09T19:27:13.503 INFO:tasks.workunit.client.0.vm07.stdout:8/220: symlink d7/d16/d1e/l59 0
2026-03-09T19:27:13.503 INFO:tasks.workunit.client.0.vm07.stdout:8/221: dwrite d7/d16/d1e/f33 [0,4194304] 0
2026-03-09T19:27:13.504 INFO:tasks.workunit.client.0.vm07.stdout:8/222: write d7/d9/d37/d34/f55 [999958,89077] 0
2026-03-09T19:27:13.509 INFO:tasks.workunit.client.1.vm08.stdout:1/607: mknod d9/da/d95/cb5 0
2026-03-09T19:27:13.512 INFO:tasks.workunit.client.0.vm07.stdout:8/223: dwrite d7/d9/d10/d44/f4a [0,4194304] 0
2026-03-09T19:27:13.514 INFO:tasks.workunit.client.0.vm07.stdout:8/224: read d7/d9/d10/f20 [183221,86066] 0
2026-03-09T19:27:13.514 INFO:tasks.workunit.client.0.vm07.stdout:2/230: getdents d3/dd 0
2026-03-09T19:27:13.521 INFO:tasks.workunit.client.0.vm07.stdout:7/195: rename d0/d4/d5/d26/c3e to d0/d4/d5/d26/c44 0
2026-03-09T19:27:13.521 INFO:tasks.workunit.client.1.vm08.stdout:1/608: fsync d9/da/f30 0
2026-03-09T19:27:13.522 INFO:tasks.workunit.client.1.vm08.stdout:4/466: sync
2026-03-09T19:27:13.528 INFO:tasks.workunit.client.0.vm07.stdout:8/225: creat d7/d9/d37/d34/f5a x:0 0 0
2026-03-09T19:27:13.529 INFO:tasks.workunit.client.0.vm07.stdout:8/226: write d7/d1d/f3f [1655881,89770] 0
2026-03-09T19:27:13.530 INFO:tasks.workunit.client.0.vm07.stdout:2/231: creat d3/dd/d16/d29/d2d/f56 x:0 0 0
2026-03-09T19:27:13.535 INFO:tasks.workunit.client.1.vm08.stdout:1/609: chown d9/da/d12/d39/c69 7723026 1
2026-03-09T19:27:13.536 INFO:tasks.workunit.client.1.vm08.stdout:1/610: write d9/da/f8e [3640239,38843] 0
2026-03-09T19:27:13.537 INFO:tasks.workunit.client.0.vm07.stdout:7/196: sync
2026-03-09T19:27:13.537 INFO:tasks.workunit.client.0.vm07.stdout:8/227: sync
2026-03-09T19:27:13.538 INFO:tasks.workunit.client.0.vm07.stdout:8/228: chown d7/d9/f4c 1 1
2026-03-09T19:27:13.540 INFO:tasks.workunit.client.0.vm07.stdout:2/232: mknod d3/dd/d16/d29/d2d/c57 0
2026-03-09T19:27:13.540 INFO:tasks.workunit.client.0.vm07.stdout:2/233: fsync d3/fc 0
2026-03-09T19:27:13.541 INFO:tasks.workunit.client.0.vm07.stdout:7/197: dwrite d0/d4/d5/f20 [0,4194304] 0
2026-03-09T19:27:13.550 INFO:tasks.workunit.client.1.vm08.stdout:4/467: symlink da/d10/d16/d28/d2f/d4f/d64/l8e 0
2026-03-09T19:27:13.558 INFO:tasks.workunit.client.0.vm07.stdout:2/234: creat d3/dd/d16/d29/f58 x:0 0 0
2026-03-09T19:27:13.562 INFO:tasks.workunit.client.1.vm08.stdout:4/468: dwrite da/d10/d16/d28/d2f/f8c [0,4194304] 0
2026-03-09T19:27:13.566 INFO:tasks.workunit.client.0.vm07.stdout:8/229: mknod d7/d9/d57/c5b 0
2026-03-09T19:27:13.572 INFO:tasks.workunit.client.0.vm07.stdout:8/230: unlink d7/d9/lb 0
2026-03-09T19:27:13.574 INFO:tasks.workunit.client.0.vm07.stdout:8/231: sync
2026-03-09T19:27:13.579 INFO:tasks.workunit.client.1.vm08.stdout:4/469: link da/d10/d1b/f79 da/d10/d16/d28/d46/f8f 0
2026-03-09T19:27:13.581 INFO:tasks.workunit.client.0.vm07.stdout:2/235: rmdir d3/dd/d16/d29/d2d/d51 0
2026-03-09T19:27:13.582 INFO:tasks.workunit.client.0.vm07.stdout:8/232: mknod d7/d50/c5c 0
2026-03-09T19:27:13.582 INFO:tasks.workunit.client.0.vm07.stdout:2/236: chown d3/dd/d16/d29/d2d/c57 135930702 1
2026-03-09T19:27:13.582 INFO:tasks.workunit.client.0.vm07.stdout:7/198: dread d0/f13 [0,4194304] 0
2026-03-09T19:27:13.584 INFO:tasks.workunit.client.0.vm07.stdout:7/199: read d0/d4/d5/d8/fa [422216,5947] 0
2026-03-09T19:27:13.585 INFO:tasks.workunit.client.0.vm07.stdout:7/200: write d0/d4/d5/d8/d1a/d2a/f34 [2993832,75322] 0
2026-03-09T19:27:13.587 INFO:tasks.workunit.client.0.vm07.stdout:7/201: sync
2026-03-09T19:27:13.599 INFO:tasks.workunit.client.0.vm07.stdout:7/202: write d0/d4/d5/d8/f15 [1175082,103923] 0
2026-03-09T19:27:13.615 INFO:tasks.workunit.client.0.vm07.stdout:2/237: getdents d3/dd/d16/d30 0
2026-03-09T19:27:13.626 INFO:tasks.workunit.client.1.vm08.stdout:1/611: dread d9/da/d12/f72 [0,4194304] 0
2026-03-09T19:27:13.626 INFO:tasks.workunit.client.1.vm08.stdout:1/612: chown d9/da/d12/d39/l3a 7 1
2026-03-09T19:27:13.627 INFO:tasks.workunit.client.1.vm08.stdout:1/613: readlink d9/da/d12/d39/l8c 0
2026-03-09T19:27:13.627 INFO:tasks.workunit.client.0.vm07.stdout:2/238: mknod d3/dd/d16/d29/c59 0
2026-03-09T19:27:13.627 INFO:tasks.workunit.client.0.vm07.stdout:2/239: fsync d3/dd/f1d 0
2026-03-09T19:27:13.627 INFO:tasks.workunit.client.0.vm07.stdout:7/203: link d0/d4/f33 d0/d4/d5/d26/d32/f45 0
2026-03-09T19:27:13.627 INFO:tasks.workunit.client.0.vm07.stdout:7/204: symlink d0/d4/d5/d8/d1a/d2a/l46 0
2026-03-09T19:27:13.628 INFO:tasks.workunit.client.0.vm07.stdout:7/205: write d0/d4/d5/d26/f31 [240284,118282] 0
2026-03-09T19:27:13.629 INFO:tasks.workunit.client.0.vm07.stdout:7/206: fsync d0/d4/d5/d8/d1a/f1d 0
2026-03-09T19:27:13.636 INFO:tasks.workunit.client.0.vm07.stdout:7/207: creat d0/d4/d5/dd/f47 x:0 0 0
2026-03-09T19:27:13.638 INFO:tasks.workunit.client.1.vm08.stdout:4/470: mkdir da/d10/d16/d28/d2f/d4f/d56/d90 0
2026-03-09T19:27:13.638 INFO:tasks.workunit.client.1.vm08.stdout:4/471: fdatasync da/d10/f77 0
2026-03-09T19:27:13.644 INFO:tasks.workunit.client.0.vm07.stdout:7/208: stat d0/d4/fc 0
2026-03-09T19:27:13.644 INFO:tasks.workunit.client.0.vm07.stdout:2/240: sync
2026-03-09T19:27:13.645 INFO:tasks.workunit.client.0.vm07.stdout:2/241: dread - d3/dd/d16/d30/d40/f4f zero size
2026-03-09T19:27:13.647 INFO:tasks.workunit.client.0.vm07.stdout:2/242: readlink d3/dd/d16/d29/d3c/l41 0
2026-03-09T19:27:13.652 INFO:tasks.workunit.client.0.vm07.stdout:2/243: mkdir d3/dd/d16/d29/d3c/d5a 0
2026-03-09T19:27:13.652 INFO:tasks.workunit.client.1.vm08.stdout:4/472: truncate da/d10/d26/d38/f57 1150777 0
2026-03-09T19:27:13.655 INFO:tasks.workunit.client.0.vm07.stdout:2/244: dwrite d3/d11/f2e [0,4194304] 0
2026-03-09T19:27:13.656 INFO:tasks.workunit.client.0.vm07.stdout:2/245: fdatasync d3/dd/d16/d30/f3a 0
2026-03-09T19:27:13.656 INFO:tasks.workunit.client.0.vm07.stdout:2/246: chown d3/dd/d16/d2f/c43 17503 1
2026-03-09T19:27:13.656 INFO:tasks.workunit.client.0.vm07.stdout:2/247: readlink d3/l26 0
2026-03-09T19:27:13.658 INFO:tasks.workunit.client.0.vm07.stdout:7/209: link d0/d4/d5/d8/d1a/l23 d0/d4/l48 0
2026-03-09T19:27:13.661 INFO:tasks.workunit.client.0.vm07.stdout:2/248: mknod d3/d11/c5b 0
2026-03-09T19:27:13.661 INFO:tasks.workunit.client.0.vm07.stdout:2/249: fsync d3/f1a 0
2026-03-09T19:27:13.663 INFO:tasks.workunit.client.0.vm07.stdout:7/210: dread - d0/d4/d5/dd/f47 zero size
2026-03-09T19:27:13.665 INFO:tasks.workunit.client.0.vm07.stdout:2/250: dread d3/dd/f1d [0,4194304] 0
2026-03-09T19:27:13.670 INFO:tasks.workunit.client.0.vm07.stdout:2/251: dwrite d3/dd/d16/d30/f3a [0,4194304] 0
2026-03-09T19:27:13.685 INFO:tasks.workunit.client.0.vm07.stdout:7/211: dread d0/d4/f12 [0,4194304] 0
2026-03-09T19:27:13.685 INFO:tasks.workunit.client.0.vm07.stdout:7/212: write d0/d4/d5/d26/f42 [179411,115492] 0
2026-03-09T19:27:13.687 INFO:tasks.workunit.client.0.vm07.stdout:7/213: truncate d0/d4/d5/f40 990458 0
2026-03-09T19:27:13.689 INFO:tasks.workunit.client.0.vm07.stdout:7/214: mknod d0/d4/d5/d26/d3c/d39/c49 0
2026-03-09T19:27:13.693 INFO:tasks.workunit.client.0.vm07.stdout:2/252: dread f0 [0,4194304] 0
2026-03-09T19:27:13.695 INFO:tasks.workunit.client.0.vm07.stdout:7/215: creat d0/d4/d5/d26/f4a x:0 0 0
2026-03-09T19:27:13.699 INFO:tasks.workunit.client.0.vm07.stdout:7/216: symlink d0/d4/l4b 0
2026-03-09T19:27:13.703 INFO:tasks.workunit.client.0.vm07.stdout:7/217: rename d0/l7 to d0/d4/d5/d26/d32/l4c 0
2026-03-09T19:27:13.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:13 vm07.local ceph-mon[48545]: pgmap v161: 65 pgs: 65 active+clean; 1.4 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 28 MiB/s rd, 131 MiB/s wr, 304 op/s
2026-03-09T19:27:13.732 INFO:tasks.workunit.client.1.vm08.stdout:0/476: write dd/d22/d27/d2e/f39 [302642,78823] 0
2026-03-09T19:27:13.744 INFO:tasks.workunit.client.1.vm08.stdout:0/477: rmdir dd/d22/d24/d49/d50/d78/d86 39
2026-03-09T19:27:13.747 INFO:tasks.workunit.client.1.vm08.stdout:0/478: getdents dd/d22/d27/d6c 0
2026-03-09T19:27:13.749 INFO:tasks.workunit.client.1.vm08.stdout:5/433: dread ff [0,4194304] 0
2026-03-09T19:27:13.750 INFO:tasks.workunit.client.1.vm08.stdout:0/479: getdents dd/d22 0
2026-03-09T19:27:13.753 INFO:tasks.workunit.client.1.vm08.stdout:5/434: truncate d16/d1e/f5f 572310 0
2026-03-09T19:27:13.754 INFO:tasks.workunit.client.1.vm08.stdout:5/435: stat d16/f4d 0
2026-03-09T19:27:13.756 INFO:tasks.workunit.client.1.vm08.stdout:5/436: fdatasync d16/f4e 0
2026-03-09T19:27:13.764 INFO:tasks.workunit.client.1.vm08.stdout:0/480: dwrite dd/d22/d24/d49/d50/f95 [4194304,4194304] 0
2026-03-09T19:27:13.769 INFO:tasks.workunit.client.1.vm08.stdout:7/532: write d5/d14/dae/d1c/f5a [4806535,70106] 0
2026-03-09T19:27:13.769 INFO:tasks.workunit.client.1.vm08.stdout:8/450: write de/d1d/d21/f45 [751889,25534] 0
2026-03-09T19:27:13.784 INFO:tasks.workunit.client.1.vm08.stdout:5/437: dread f1 [0,4194304] 0
2026-03-09T19:27:13.785 INFO:tasks.workunit.client.1.vm08.stdout:7/533: symlink d5/d14/d27/d78/lb6 0
2026-03-09T19:27:13.789 INFO:tasks.workunit.client.1.vm08.stdout:7/534: chown d5/d14/dae/f1f 357016 1
2026-03-09T19:27:13.796 INFO:tasks.workunit.client.1.vm08.stdout:0/481: mkdir dd/d22/d24/d49/d98 0
2026-03-09T19:27:13.797 INFO:tasks.workunit.client.1.vm08.stdout:5/438: dread d16/d1e/f57 [0,4194304] 0
2026-03-09T19:27:13.799 INFO:tasks.workunit.client.1.vm08.stdout:7/535: truncate d5/d14/dae/d1c/f87 998393 0
2026-03-09T19:27:13.800 INFO:tasks.workunit.client.1.vm08.stdout:0/482: write dd/d22/d24/f87 [577667,46117] 0
2026-03-09T19:27:13.805 INFO:tasks.workunit.client.1.vm08.stdout:5/439: chown d16/d1e/c24 3024 1
2026-03-09T19:27:13.805 INFO:tasks.workunit.client.1.vm08.stdout:0/483: creat dd/d22/d24/d49/f99 x:0 0 0
2026-03-09T19:27:13.811 INFO:tasks.workunit.client.1.vm08.stdout:7/536: dwrite d5/d14/d2b/f9f [0,4194304] 0
2026-03-09T19:27:13.819 INFO:tasks.workunit.client.1.vm08.stdout:5/440: rmdir d16/d1e/d30/d6f 39
2026-03-09T19:27:13.834 INFO:tasks.workunit.client.1.vm08.stdout:0/484: fsync dd/d22/d27/d2e/f51 0
2026-03-09T19:27:13.835 INFO:tasks.workunit.client.1.vm08.stdout:0/485: chown dd/d22/d27/d2e/f51 6729354 1
2026-03-09T19:27:13.839 INFO:tasks.workunit.client.1.vm08.stdout:7/537: creat d5/d14/dae/d3a/d42/fb7 x:0 0 0
2026-03-09T19:27:13.844 INFO:tasks.workunit.client.0.vm07.stdout:6/183: write d0/d13/f18 [254808,130538] 0
2026-03-09T19:27:13.844 INFO:tasks.workunit.client.1.vm08.stdout:7/538: chown d5/d14/dae/d1c/d73 117538 1
2026-03-09T19:27:13.844 INFO:tasks.workunit.client.0.vm07.stdout:6/184: mknod d0/d13/d1e/d30/d31/c50 0
2026-03-09T19:27:13.844 INFO:tasks.workunit.client.0.vm07.stdout:6/185: dread - d0/f49 zero size
2026-03-09T19:27:13.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:13 vm08.local ceph-mon[57794]: pgmap v161: 65 pgs: 65 active+clean; 1.4 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 28 MiB/s rd, 131 MiB/s wr, 304 op/s
2026-03-09T19:27:13.848 INFO:tasks.workunit.client.0.vm07.stdout:4/181: truncate d3/d11/f12 370015 0
2026-03-09T19:27:13.849 INFO:tasks.workunit.client.0.vm07.stdout:6/186: rename d0/d1/db/c12 to d0/d13/d1e/c51 0
2026-03-09T19:27:13.851 INFO:tasks.workunit.client.0.vm07.stdout:4/182: mkdir d3/d11/d2b/d38 0
2026-03-09T19:27:13.851 INFO:tasks.workunit.client.0.vm07.stdout:4/183: stat d3/f1a 0
2026-03-09T19:27:13.852 INFO:tasks.workunit.client.1.vm08.stdout:0/486: link dd/d31/c36 dd/d22/d24/d49/d92/c9a 0
2026-03-09T19:27:13.857 INFO:tasks.workunit.client.0.vm07.stdout:6/187: mkdir d0/d1/db/d52 0
2026-03-09T19:27:13.858 INFO:tasks.workunit.client.0.vm07.stdout:4/184: unlink d3/d11/d16/d2f/f26 0
2026-03-09T19:27:13.859 INFO:tasks.workunit.client.1.vm08.stdout:6/465: truncate d3/db/f42 7050645 0
2026-03-09T19:27:13.862 INFO:tasks.workunit.client.1.vm08.stdout:7/539: creat d5/d14/dae/fb8 x:0 0 0
2026-03-09T19:27:13.868 INFO:tasks.workunit.client.1.vm08.stdout:6/466: read - d3/d34/d5c/da2/f72 zero size
2026-03-09T19:27:13.870 INFO:tasks.workunit.client.1.vm08.stdout:0/487: dwrite dd/d22/d27/d4f/f97 [0,4194304] 0
2026-03-09T19:27:13.872 INFO:tasks.workunit.client.0.vm07.stdout:9/249: dwrite d0/db/f21 [0,4194304] 0
2026-03-09T19:27:13.873 INFO:tasks.workunit.client.1.vm08.stdout:7/540: read d5/d14/d27/d54/fa5 [871265,63746] 0
2026-03-09T19:27:13.874 INFO:tasks.workunit.client.0.vm07.stdout:0/184: dwrite d0/d6/d13/d17/d19/f1f [0,4194304] 0
2026-03-09T19:27:13.883 INFO:tasks.workunit.client.1.vm08.stdout:9/452: dwrite d0/d1b/d97/f3f [0,4194304] 0
2026-03-09T19:27:13.888 INFO:tasks.workunit.client.1.vm08.stdout:6/467: creat d3/db/d43/d69/da0/fa8 x:0 0 0
2026-03-09T19:27:13.911 INFO:tasks.workunit.client.0.vm07.stdout:5/170: truncate d3/dd/f22 718701 0
2026-03-09T19:27:13.933 INFO:tasks.workunit.client.1.vm08.stdout:9/453: unlink d0/d2/f45 0
2026-03-09T19:27:13.936 INFO:tasks.workunit.client.0.vm07.stdout:5/171: dread d3/d1a/d28/f2e [0,4194304] 0
2026-03-09T19:27:13.937 INFO:tasks.workunit.client.1.vm08.stdout:9/454: dread - d0/d2/d80/d69/f7a zero size
2026-03-09T19:27:13.937 INFO:tasks.workunit.client.1.vm08.stdout:9/455: stat d0/d1b/d97/d48/c91 0
2026-03-09T19:27:13.941 INFO:tasks.workunit.client.1.vm08.stdout:6/468: chown d3/d34/d5c/c9d 240268 1
2026-03-09T19:27:13.952 INFO:tasks.workunit.client.0.vm07.stdout:9/250: fsync d0/db/d29/d2c/f30 0
2026-03-09T19:27:13.953 INFO:tasks.workunit.client.1.vm08.stdout:7/541: mknod d5/d14/dae/d3a/cb9 0
2026-03-09T19:27:13.953 INFO:tasks.workunit.client.1.vm08.stdout:3/523: dwrite d0/d6/f39 [0,4194304] 0
2026-03-09T19:27:13.953 INFO:tasks.workunit.client.1.vm08.stdout:3/524: truncate d0/d6/de/d15/fa4 470005 0
2026-03-09T19:27:13.967 INFO:tasks.workunit.client.1.vm08.stdout:9/456: rename d0/d2/d14/d5c/c2c to d0/d1b/d68/c9c 0
2026-03-09T19:27:13.969 INFO:tasks.workunit.client.0.vm07.stdout:5/172: symlink d3/dd/d26/d2d/l41 0
2026-03-09T19:27:13.973 INFO:tasks.workunit.client.0.vm07.stdout:5/173: dwrite d3/d1a/f1c [0,4194304] 0
2026-03-09T19:27:13.974 INFO:tasks.workunit.client.0.vm07.stdout:5/174: chown d3/d1a/f12 315 1
2026-03-09T19:27:13.986 INFO:tasks.workunit.client.0.vm07.stdout:0/185: fdatasync d0/d6/d13/d1c/f27 0
2026-03-09T19:27:13.987 INFO:tasks.workunit.client.0.vm07.stdout:9/251: mknod d0/db/d29/d2c/d36/c5f 0
2026-03-09T19:27:13.988 INFO:tasks.workunit.client.0.vm07.stdout:0/186: chown d0/d6/d13/d17/d19 124 1
2026-03-09T19:27:13.989 INFO:tasks.workunit.client.1.vm08.stdout:3/525: write d0/d6/de/d1b/d16/d17/f1d [5156228,10329] 0
2026-03-09T19:27:13.989 INFO:tasks.workunit.client.0.vm07.stdout:0/187: dread - d0/d6/d13/d1c/d11/f29 zero size
2026-03-09T19:27:13.991 INFO:tasks.workunit.client.0.vm07.stdout:3/252: dwrite d1/d6/dd/f15 [0,4194304] 0
2026-03-09T19:27:13.998 INFO:tasks.workunit.client.1.vm08.stdout:2/439: dwrite d3/d4/f8 [4194304,4194304] 0
2026-03-09T19:27:13.998 INFO:tasks.workunit.client.1.vm08.stdout:9/457: dwrite d0/d1b/d97/f3f [4194304,4194304] 0
2026-03-09T19:27:13.998 INFO:tasks.workunit.client.0.vm07.stdout:9/252: dread d0/d6/f48 [0,4194304] 0
2026-03-09T19:27:13.998 INFO:tasks.workunit.client.0.vm07.stdout:3/253: dread - d1/f20 zero size
2026-03-09T19:27:14.001 INFO:tasks.workunit.client.1.vm08.stdout:3/526: dread d0/d6/de/d1b/f7d [0,4194304] 0
2026-03-09T19:27:14.013 INFO:tasks.workunit.client.0.vm07.stdout:0/188: creat d0/f41 x:0 0 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.0.vm07.stdout:9/253: symlink d0/d17/l60 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.0.vm07.stdout:8/233: truncate f3 1093463 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.0.vm07.stdout:3/254: read d1/d1f/f1a [7060601,11382] 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.0.vm07.stdout:3/255: dread d1/f2a [4194304,4194304] 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.0.vm07.stdout:3/256: stat d1/d26 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.0.vm07.stdout:0/189: stat d0/l32 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.0.vm07.stdout:8/234: mknod d7/d9/d10/c5d 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.1.vm08.stdout:2/440: dread - d3/f7c zero size
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.1.vm08.stdout:9/458: creat d0/d2/d14/d98/f9d x:0 0 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.1.vm08.stdout:3/527: symlink d0/d6/la5 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.1.vm08.stdout:3/528: chown d0/d6/de/d15/l48 0 1
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.1.vm08.stdout:6/469: rename d3/d34/d3b/d85 to d3/d34/da9 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.1.vm08.stdout:2/441: symlink d3/d4/d3e/d4e/la2 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.1.vm08.stdout:9/459: creat d0/d2/d14/d98/f9e x:0 0 0
2026-03-09T19:27:14.030 INFO:tasks.workunit.client.1.vm08.stdout:9/460: chown d0/d2/d8/cc 2740940 1
2026-03-09T19:27:14.031 INFO:tasks.workunit.client.1.vm08.stdout:3/529: dread - d0/d6/de/d6e/d51/f70 zero size
2026-03-09T19:27:14.033 INFO:tasks.workunit.client.1.vm08.stdout:3/530: chown d0/d6/de/d15/l48 5559 1
2026-03-09T19:27:14.033 INFO:tasks.workunit.client.1.vm08.stdout:6/470: chown d3/d34/d3b/l9c 1 1
2026-03-09T19:27:14.033 INFO:tasks.workunit.client.1.vm08.stdout:3/531: fsync d0/d6/de/d6e/f81 0
2026-03-09T19:27:14.038 INFO:tasks.workunit.client.1.vm08.stdout:2/442: rmdir d3/d4/d23/d2c 39
2026-03-09T19:27:14.040 INFO:tasks.workunit.client.0.vm07.stdout:0/190: fsync d0/f1e 0
2026-03-09T19:27:14.055 INFO:tasks.workunit.client.1.vm08.stdout:1/614: dwrite d9/da/f2f [0,4194304] 0
2026-03-09T19:27:14.055 INFO:tasks.workunit.client.1.vm08.stdout:4/473: write da/d10/d16/d28/d46/d52/d6e/d2c/f4a [74563,115959] 0
2026-03-09T19:27:14.055 INFO:tasks.workunit.client.0.vm07.stdout:3/257: symlink d1/l46 0
2026-03-09T19:27:14.055 INFO:tasks.workunit.client.0.vm07.stdout:3/258: write d1/d1f/f1a [9250824,93563] 0
2026-03-09T19:27:14.058 INFO:tasks.workunit.client.0.vm07.stdout:5/175: read d3/d1a/f12 [2476610,3611] 0
2026-03-09T19:27:14.079 INFO:tasks.workunit.client.0.vm07.stdout:2/253: write d3/f22 [1724811,97214] 0
2026-03-09T19:27:14.082 INFO:tasks.workunit.client.1.vm08.stdout:2/443: chown d3/d4/d23/d2c/d39/d5e/d14/f58 206 1
2026-03-09T19:27:14.082 INFO:tasks.workunit.client.0.vm07.stdout:3/259: mkdir d1/d3d/d47 0
2026-03-09T19:27:14.083 INFO:tasks.workunit.client.1.vm08.stdout:1/615: mkdir d9/d11/db6 0
2026-03-09T19:27:14.084 INFO:tasks.workunit.client.1.vm08.stdout:1/616: stat d9/da/d12/d39/f52 0
2026-03-09T19:27:14.087 INFO:tasks.workunit.client.0.vm07.stdout:9/254: getdents d0/d6/d57 0
2026-03-09T19:27:14.088 INFO:tasks.workunit.client.0.vm07.stdout:9/255: chown d0/db/c1e 0 1
2026-03-09T19:27:14.090 INFO:tasks.workunit.client.1.vm08.stdout:4/474: rmdir da/d10/d26/d3a/d69 39
2026-03-09T19:27:14.092 INFO:tasks.workunit.client.1.vm08.stdout:4/475: dread - da/d10/d16/d28/d2f/f80 zero size
2026-03-09T19:27:14.094 INFO:tasks.workunit.client.1.vm08.stdout:8/451: dwrite de/d1d/d21/f23 [0,4194304] 0
2026-03-09T19:27:14.101 INFO:tasks.workunit.client.0.vm07.stdout:9/256: dread d0/d6/fa [0,4194304] 0
2026-03-09T19:27:14.108 INFO:tasks.workunit.client.1.vm08.stdout:1/617: creat d9/d11/d7a/d89/fb7 x:0 0 0
2026-03-09T19:27:14.111 INFO:tasks.workunit.client.0.vm07.stdout:2/254: mkdir d3/d11/d38/d5c 0
2026-03-09T19:27:14.116 INFO:tasks.workunit.client.0.vm07.stdout:7/218: creat d0/d4/d5/d8/d1a/f4d x:0 0 0
2026-03-09T19:27:14.118 INFO:tasks.workunit.client.1.vm08.stdout:8/452: unlink de/d25/d33/f83 0
2026-03-09T19:27:14.120 INFO:tasks.workunit.client.0.vm07.stdout:7/219: dwrite d0/d4/d5/d26/f31 [0,4194304] 0
2026-03-09T19:27:14.120 INFO:tasks.workunit.client.1.vm08.stdout:9/461: link d0/d1b/d97/d48/d5e/f96 d0/d1b/f9f 0
2026-03-09T19:27:14.139 INFO:tasks.workunit.client.1.vm08.stdout:5/441: sync
2026-03-09T19:27:14.142 INFO:tasks.workunit.client.1.vm08.stdout:6/471: getdents d3/d34/da9/da4 0
2026-03-09T19:27:14.147 INFO:tasks.workunit.client.1.vm08.stdout:3/532: sync
2026-03-09T19:27:14.150 INFO:tasks.workunit.client.1.vm08.stdout:3/533: dread - d0/d6/de/d1b/d16/d17/f8c zero size
2026-03-09T19:27:14.151 INFO:tasks.workunit.client.1.vm08.stdout:4/476: mkdir da/d10/d26/d3a/d91 0
2026-03-09T19:27:14.151 INFO:tasks.workunit.client.1.vm08.stdout:4/477: fsync f9 0
2026-03-09T19:27:14.151 INFO:tasks.workunit.client.1.vm08.stdout:3/534: read - d0/d52/d6d/f8b zero size
2026-03-09T19:27:14.168 INFO:tasks.workunit.client.0.vm07.stdout:7/220: dread d0/f13 [4194304,4194304] 0
2026-03-09T19:27:14.186 INFO:tasks.workunit.client.1.vm08.stdout:1/618: symlink d9/da/lb8 0
2026-03-09T19:27:14.186 INFO:tasks.workunit.client.0.vm07.stdout:7/221: truncate d0/f22 598405 0
2026-03-09T19:27:14.187 INFO:tasks.workunit.client.1.vm08.stdout:1/619: chown d9/da/d53 117559 1
2026-03-09T19:27:14.188 INFO:tasks.workunit.client.1.vm08.stdout:4/478: creat da/d10/d16/d28/d46/d52/d6e/d40/d6c/f92 x:0 0 0
2026-03-09T19:27:14.200 INFO:tasks.workunit.client.1.vm08.stdout:4/479: dwrite da/d10/d16/d28/d46/d52/d6e/d40/d6c/f71 [4194304,4194304] 0
2026-03-09T19:27:14.213 INFO:tasks.workunit.client.0.vm07.stdout:7/222: mknod d0/d4/d5/d8/c4e 0
2026-03-09T19:27:14.217 INFO:tasks.workunit.client.1.vm08.stdout:3/535: mknod d0/d6/de/ca6 0
2026-03-09T19:27:14.220 INFO:tasks.workunit.client.0.vm07.stdout:7/223: mkdir d0/d4/d4f 0
2026-03-09T19:27:14.224 INFO:tasks.workunit.client.1.vm08.stdout:1/620: fsync d9/da/dc/f68 0
2026-03-09T19:27:14.229 INFO:tasks.workunit.client.0.vm07.stdout:6/188: rename d0/d13/d1e/d30 to d0/d1/db/d24/d53 0
2026-03-09T19:27:14.245 INFO:tasks.workunit.client.1.vm08.stdout:6/472: read d3/d34/d6f/f39 [12303,49408] 0
2026-03-09T19:27:14.252 INFO:tasks.workunit.client.0.vm07.stdout:7/224: unlink d0/d4/d5/f40 0
2026-03-09T19:27:14.258 INFO:tasks.workunit.client.1.vm08.stdout:4/480: creat da/d10/d26/d38/f93 x:0 0 0
2026-03-09T19:27:14.258 INFO:tasks.workunit.client.1.vm08.stdout:9/462: link d0/d1b/f4b d0/d1b/d68/d7f/fa0 0
2026-03-09T19:27:14.258 INFO:tasks.workunit.client.1.vm08.stdout:3/536: mknod d0/d6/de/d1b/d16/d17/ca7 0
2026-03-09T19:27:14.258 INFO:tasks.workunit.client.0.vm07.stdout:5/176: rename d3/dd/d26/d2d/f32 to d3/dd/d26/d3f/f42 0
2026-03-09T19:27:14.258 INFO:tasks.workunit.client.0.vm07.stdout:6/189: symlink d0/d44/l54 0
2026-03-09T19:27:14.258 INFO:tasks.workunit.client.0.vm07.stdout:5/177: dread d3/d1a/f1c [0,4194304] 0
2026-03-09T19:27:14.258 INFO:tasks.workunit.client.0.vm07.stdout:5/178: chown d3/d1a/c29 5678 1
2026-03-09T19:27:14.258 INFO:tasks.workunit.client.0.vm07.stdout:5/179: chown d3/d1a/c20 1844412 1
2026-03-09T19:27:14.259 INFO:tasks.workunit.client.0.vm07.stdout:5/180: chown d3/dd/d26/d3f/f42 14124 1
2026-03-09T19:27:14.268 INFO:tasks.workunit.client.1.vm08.stdout:7/542: write d5/d14/dae/d1c/d83/d9c/f9d [949882,131048] 0
2026-03-09T19:27:14.269 INFO:tasks.workunit.client.1.vm08.stdout:3/537: dwrite d0/d6/de/f86 [0,4194304] 0
2026-03-09T19:27:14.271 INFO:tasks.workunit.client.0.vm07.stdout:0/191: getdents d0 0
2026-03-09T19:27:14.271 INFO:tasks.workunit.client.0.vm07.stdout:0/192: stat d0/d6/d13/d1c/f36 0
2026-03-09T19:27:14.275 INFO:tasks.workunit.client.1.vm08.stdout:7/543: read d5/d14/dae/d3a/d42/d6a/d8f/f98 [379348,128060] 0
2026-03-09T19:27:14.302 INFO:tasks.workunit.client.1.vm08.stdout:0/488: write dd/d22/d27/f3d [481016,29674] 0
2026-03-09T19:27:14.303 INFO:tasks.workunit.client.1.vm08.stdout:0/489: write dd/d22/f29 [4010684,97030] 0
2026-03-09T19:27:14.304 INFO:tasks.workunit.client.0.vm07.stdout:8/235: dwrite d7/d9/fd [0,4194304] 0
2026-03-09T19:27:14.305 INFO:tasks.workunit.client.0.vm07.stdout:1/167: write d1/db/f14 [888992,57217] 0
2026-03-09T19:27:14.305 INFO:tasks.workunit.client.0.vm07.stdout:1/168: write d1/d11/d37/f40 [392420,27044] 0
2026-03-09T19:27:14.306 INFO:tasks.workunit.client.0.vm07.stdout:1/169: dread - d1/d3/d21/f2e zero size
2026-03-09T19:27:14.306 INFO:tasks.workunit.client.1.vm08.stdout:1/621: creat d9/da/dc/fb9 x:0 0 0
2026-03-09T19:27:14.325 INFO:tasks.workunit.client.0.vm07.stdout:9/257: truncate d0/d17/f1f 3298299 0
2026-03-09T19:27:14.328 INFO:tasks.workunit.client.1.vm08.stdout:2/444: dwrite d3/d9/d79/d46/f72 [0,4194304] 0
2026-03-09T19:27:14.332 INFO:tasks.workunit.client.0.vm07.stdout:6/190: chown d0/d1/c5 465670 1
2026-03-09T19:27:14.332 INFO:tasks.workunit.client.0.vm07.stdout:2/255: truncate f2 2710512 0
2026-03-09T19:27:14.334 INFO:tasks.workunit.client.0.vm07.stdout:2/256: write d3/d11/f39 [2094503,119220] 0
2026-03-09T19:27:14.334 INFO:tasks.workunit.client.0.vm07.stdout:6/191: truncate d0/d1/db/d17/f38 287817 0
2026-03-09T19:27:14.334 INFO:tasks.workunit.client.1.vm08.stdout:1/622: dread d9/da/d2c/d6a/f9c [0,4194304] 0
2026-03-09T19:27:14.334 INFO:tasks.workunit.client.0.vm07.stdout:5/181: mknod d3/d1a/d28/c43 0
2026-03-09T19:27:14.334 INFO:tasks.workunit.client.1.vm08.stdout:3/538: creat d0/d52/fa8 x:0 0 0
2026-03-09T19:27:14.335 INFO:tasks.workunit.client.1.vm08.stdout:3/539: write d0/d8/f66 [737870,79810] 0
2026-03-09T19:27:14.335 INFO:tasks.workunit.client.0.vm07.stdout:0/193: mknod d0/d6/c42 0
2026-03-09T19:27:14.341 INFO:tasks.workunit.client.1.vm08.stdout:0/490: creat dd/d22/d24/d49/d50/f9b x:0 0 0
2026-03-09T19:27:14.343 INFO:tasks.workunit.client.1.vm08.stdout:8/453: dwrite de/d1d/d21/f86 [0,4194304] 0
2026-03-09T19:27:14.344 INFO:tasks.workunit.client.0.vm07.stdout:4/185: dwrite d3/d11/f12 [0,4194304] 0
2026-03-09T19:27:14.349 INFO:tasks.workunit.client.1.vm08.stdout:7/544: sync
2026-03-09T19:27:14.356 INFO:tasks.workunit.client.1.vm08.stdout:5/442: dwrite d16/d1e/f5c [0,4194304] 0
2026-03-09T19:27:14.357 INFO:tasks.workunit.client.1.vm08.stdout:3/540: dwrite d0/d6/de/d15/f53 [0,4194304] 0
2026-03-09T19:27:14.388 INFO:tasks.workunit.client.1.vm08.stdout:2/445: truncate f2 215756 0
2026-03-09T19:27:14.403 INFO:tasks.workunit.client.1.vm08.stdout:4/481: rename da/d10/d26/d27/d32/c67 to da/d10/d16/d28/d4d/c94 0
2026-03-09T19:27:14.422 INFO:tasks.workunit.client.1.vm08.stdout:1/623: dread d9/da/d2d/f3d [0,4194304] 0
2026-03-09T19:27:14.425 INFO:tasks.workunit.client.0.vm07.stdout:8/236: mknod d7/d9/c5e 0
2026-03-09T19:27:14.426 INFO:tasks.workunit.client.0.vm07.stdout:8/237: chown d7/d9/l35 110330 1
2026-03-09T19:27:14.434 INFO:tasks.workunit.client.0.vm07.stdout:1/170: symlink d1/d3/l41 0
2026-03-09T19:27:14.440 INFO:tasks.workunit.client.1.vm08.stdout:0/491: stat dd/d22/d24/d49/d50/d78/d86/c96 0
2026-03-09T19:27:14.452 INFO:tasks.workunit.client.1.vm08.stdout:0/492: dread dd/d22/f28 [0,4194304] 0
2026-03-09T19:27:14.458 INFO:tasks.workunit.client.0.vm07.stdout:3/260: rename d1/d1f/f2d to d1/d6/dd/f48 0
2026-03-09T19:27:14.459 INFO:tasks.workunit.client.0.vm07.stdout:3/261: chown d1/d3d 0 1
2026-03-09T19:27:14.474 INFO:tasks.workunit.client.1.vm08.stdout:6/473: dwrite d3/d34/d5c/da2/f72 [0,4194304] 0
2026-03-09T19:27:14.515 INFO:tasks.workunit.client.0.vm07.stdout:6/192: rmdir d0/d44 39
2026-03-09T19:27:14.516 INFO:tasks.workunit.client.0.vm07.stdout:5/182: mknod d3/d1a/d28/c44 0
2026-03-09T19:27:14.516 INFO:tasks.workunit.client.0.vm07.stdout:5/183: chown d3/d1a 56048 1
2026-03-09T19:27:14.517 INFO:tasks.workunit.client.1.vm08.stdout:2/446: mkdir d3/d4/d23/d2c/d39/da3 0
2026-03-09T19:27:14.518 INFO:tasks.workunit.client.0.vm07.stdout:6/193: dwrite d0/d13/f26 [0,4194304] 0
2026-03-09T19:27:14.520 INFO:tasks.workunit.client.1.vm08.stdout:9/463: rename d0/d2/f36 to d0/d1b/d97/d48/d5e/fa1 0
2026-03-09T19:27:14.534 INFO:tasks.workunit.client.0.vm07.stdout:7/225: creat d0/d4/d5/f50 x:0 0 0
2026-03-09T19:27:14.535 INFO:tasks.workunit.client.0.vm07.stdout:1/171: creat d1/d11/f42 x:0 0 0
2026-03-09T19:27:14.536 INFO:tasks.workunit.client.1.vm08.stdout:6/474: truncate d3/d15/f64 418026 0
2026-03-09T19:27:14.543 INFO:tasks.workunit.client.1.vm08.stdout:2/447: creat d3/d9/d4a/fa4 x:0 0 0
2026-03-09T19:27:14.546 INFO:tasks.workunit.client.1.vm08.stdout:9/464: dread - d0/d1b/d97/d48/d6f/f79 zero size
2026-03-09T19:27:14.546 INFO:tasks.workunit.client.1.vm08.stdout:4/482: symlink da/d10/d26/d3a/l95 0
2026-03-09T19:27:14.549 INFO:tasks.workunit.client.0.vm07.stdout:4/186: symlink d3/d11/d29/d34/l39 0
2026-03-09T19:27:14.551 INFO:tasks.workunit.client.0.vm07.stdout:4/187: dread d3/d11/d2b/d37/f28 [0,4194304] 0
2026-03-09T19:27:14.552 INFO:tasks.workunit.client.1.vm08.stdout:0/493: creat dd/d22/d24/d49/d98/f9c x:0 0 0
2026-03-09T19:27:14.552 INFO:tasks.workunit.client.0.vm07.stdout:4/188: write d3/d11/d2b/d37/f25 [956183,10126] 0
2026-03-09T19:27:14.557 INFO:tasks.workunit.client.1.vm08.stdout:6/475: mknod d3/d55/caa 0
2026-03-09T19:27:14.557 INFO:tasks.workunit.client.1.vm08.stdout:6/476: chown d3/d34/d3b/f99 121 1
2026-03-09T19:27:14.559 INFO:tasks.workunit.client.1.vm08.stdout:7/545: link d5/d14/dae/d3a/d42/d6a/f62 d5/d14/dae/d3a/d42/d85/da0/fba 0
2026-03-09T19:27:14.561 INFO:tasks.workunit.client.0.vm07.stdout:1/172: symlink d1/d9/l43 0
2026-03-09T19:27:14.562 INFO:tasks.workunit.client.0.vm07.stdout:1/173: write d1/d11/f24 [716470,109626] 0
2026-03-09T19:27:14.564 INFO:tasks.workunit.client.0.vm07.stdout:9/258: creat d0/db/d29/d2c/f61 x:0 0 0
2026-03-09T19:27:14.565 INFO:tasks.workunit.client.1.vm08.stdout:8/454: mkdir de/da4 0
2026-03-09T19:27:14.569 INFO:tasks.workunit.client.0.vm07.stdout:7/226: symlink d0/d4/d5/d8/d41/l51 0
2026-03-09T19:27:14.570 INFO:tasks.workunit.client.0.vm07.stdout:7/227: write d0/d4/d5/d8/d1a/f1d [2439212,91265] 0
2026-03-09T19:27:14.571 INFO:tasks.workunit.client.0.vm07.stdout:7/228: dread - d0/d4/d5/d8/f37 zero size
2026-03-09T19:27:14.571 INFO:tasks.workunit.client.0.vm07.stdout:7/229: chown d0/d4/d5/d8/f37 6628 1
2026-03-09T19:27:14.572 INFO:tasks.workunit.client.0.vm07.stdout:7/230: chown d0/d4/d5/d26/d32 0 1
2026-03-09T19:27:14.580 INFO:tasks.workunit.client.1.vm08.stdout:5/443: mknod d16/d45/d81/c8d 0
2026-03-09T19:27:14.581 INFO:tasks.workunit.client.1.vm08.stdout:3/541: link d0/d6/de/d1b/l2f d0/d6/de/d1b/la9 0
2026-03-09T19:27:14.582 INFO:tasks.workunit.client.1.vm08.stdout:9/465: mkdir d0/d1b/d68/d7f/d8c/da2 0
2026-03-09T19:27:14.583 INFO:tasks.workunit.client.1.vm08.stdout:4/483: creat da/d10/d26/d27/f96 x:0 0 0
2026-03-09T19:27:14.583 INFO:tasks.workunit.client.1.vm08.stdout:0/494: truncate dd/d22/d27/d6c/f7f 1091034 0
2026-03-09T19:27:14.585 INFO:tasks.workunit.client.0.vm07.stdout:9/259: unlink d0/db/f21 0
2026-03-09T19:27:14.585 INFO:tasks.workunit.client.0.vm07.stdout:2/257: write f2 [1249749,109284] 0
2026-03-09T19:27:14.588 INFO:tasks.workunit.client.0.vm07.stdout:9/260: dwrite d0/db/f41 [0,4194304] 0
2026-03-09T19:27:14.596 INFO:tasks.workunit.client.1.vm08.stdout:7/546: chown d5/d14/dae/d1c/d83/d90/l93 271826483 1
2026-03-09T19:27:14.597 INFO:tasks.workunit.client.1.vm08.stdout:1/624: dwrite d9/d11/f87 [0,4194304] 0
2026-03-09T19:27:14.604 INFO:tasks.workunit.client.1.vm08.stdout:6/477: dwrite d3/f6e [4194304,4194304] 0
2026-03-09T19:27:14.613 INFO:tasks.workunit.client.1.vm08.stdout:7/547: chown d5/d14/dae/d1c/d83/f9b 420142 1
2026-03-09T19:27:14.613 INFO:tasks.workunit.client.0.vm07.stdout:0/194: getdents d0/d6 0
2026-03-09T19:27:14.613 INFO:tasks.workunit.client.0.vm07.stdout:0/195: write d0/fa [2563685,100081] 0
2026-03-09T19:27:14.613 INFO:tasks.workunit.client.1.vm08.stdout:1/625: write d9/da/dc/fa5 [2884335,50400] 0
2026-03-09T19:27:14.613 INFO:tasks.workunit.client.1.vm08.stdout:3/542: creat d0/d6/faa x:0 0 0
2026-03-09T19:27:14.613 INFO:tasks.workunit.client.1.vm08.stdout:7/548: readlink d5/d14/dae/d3a/d42/d85/l8b 0
2026-03-09T19:27:14.613 INFO:tasks.workunit.client.1.vm08.stdout:7/549: dread d5/d14/d2b/f9f [0,4194304] 0
2026-03-09T19:27:14.620 INFO:tasks.workunit.client.1.vm08.stdout:9/466: fdatasync d0/d1b/d97/d48/d5e/f6e 0
2026-03-09T19:27:14.622 INFO:tasks.workunit.client.0.vm07.stdout:8/238: getdents d7/d30/d32 0
2026-03-09T19:27:14.622 INFO:tasks.workunit.client.1.vm08.stdout:9/467: truncate d0/d1b/d97/d48/d5d/f92 957879 0
2026-03-09T19:27:14.632 INFO:tasks.workunit.client.0.vm07.stdout:3/262: getdents d1/d1f 0
2026-03-09T19:27:14.640 INFO:tasks.workunit.client.0.vm07.stdout:3/263: fdatasync d1/d6/dd/f2b 0
2026-03-09T19:27:14.640 INFO:tasks.workunit.client.0.vm07.stdout:1/174: mknod d1/d9/d30/c44 0
2026-03-09T19:27:14.644 INFO:tasks.workunit.client.0.vm07.stdout:9/261: write d0/d6/fa [4246192,78212] 0
2026-03-09T19:27:14.651 INFO:tasks.workunit.client.0.vm07.stdout:5/184: rename d3/d1a/l15 to d3/dd/l45 0
2026-03-09T19:27:14.651 INFO:tasks.workunit.client.0.vm07.stdout:5/185: chown d3/dd/d26/d2d/l41 7309 1
2026-03-09T19:27:14.653 INFO:tasks.workunit.client.0.vm07.stdout:0/196: fdatasync d0/d6/d13/d17/f2b 0
2026-03-09T19:27:14.655 INFO:tasks.workunit.client.0.vm07.stdout:6/194: link d0/d1/db/d1d/c29 d0/c55 0
2026-03-09T19:27:14.655 INFO:tasks.workunit.client.0.vm07.stdout:6/195: fdatasync d0/fe 0
2026-03-09T19:27:14.657 INFO:tasks.workunit.client.0.vm07.stdout:0/197: dread d0/d6/d13/d17/d19/f1f [0,4194304] 0
2026-03-09T19:27:14.660 INFO:tasks.workunit.client.0.vm07.stdout:7/231: mkdir d0/d52 0
2026-03-09T19:27:14.667 INFO:tasks.workunit.client.0.vm07.stdout:2/258: creat d3/dd/d16/d29/d2d/d45/d3b/d44/f5d x:0 0 0
2026-03-09T19:27:14.669 INFO:tasks.workunit.client.0.vm07.stdout:9/262: creat d0/db/d29/d2c/d36/f62 x:0 0 0
2026-03-09T19:27:14.670 INFO:tasks.workunit.client.0.vm07.stdout:9/263: write d0/d17/f4f [915172,75177] 0
2026-03-09T19:27:14.682 INFO:tasks.workunit.client.0.vm07.stdout:2/259: dwrite d3/dd/f1e [4194304,4194304] 0
2026-03-09T19:27:14.685 INFO:tasks.workunit.client.1.vm08.stdout:5/444: sync
2026-03-09T19:27:14.687 INFO:tasks.workunit.client.1.vm08.stdout:5/445: read d16/d1e/d3b/f50 [3721842,939] 0
2026-03-09T19:27:14.692 INFO:tasks.workunit.client.0.vm07.stdout:5/186: dread d3/d1a/f1c [0,4194304] 0
2026-03-09T19:27:14.696 INFO:tasks.workunit.client.1.vm08.stdout:0/495: mkdir dd/d9d 0
2026-03-09T19:27:14.699 INFO:tasks.workunit.client.1.vm08.stdout:2/448: link d3/d4/d23/d2c/d39/d5e/de/d8b/f70 d3/d9/d79/d46/d8c/fa5 0
2026-03-09T19:27:14.707 INFO:tasks.workunit.client.0.vm07.stdout:8/239: creat d7/d9/d37/d45/d56/f5f x:0 0 0
2026-03-09T19:27:14.707 INFO:tasks.workunit.client.1.vm08.stdout:8/455: mkdir de/d25/d31/d82/d6d/d99/da5 0
2026-03-09T19:27:14.707 INFO:tasks.workunit.client.0.vm07.stdout:8/240: fdatasync d7/d9/d10/f41 0
2026-03-09T19:27:14.711 INFO:tasks.workunit.client.0.vm07.stdout:8/241: dwrite d7/f2e [0,4194304] 0
2026-03-09T19:27:14.718 INFO:tasks.workunit.client.0.vm07.stdout:0/198: rename d0/d6/f16 to d0/d6/f43 0
2026-03-09T19:27:14.720 INFO:tasks.workunit.client.0.vm07.stdout:8/242: dread d7/d9/d10/f41 [0,4194304] 0
2026-03-09T19:27:14.720 INFO:tasks.workunit.client.1.vm08.stdout:3/543: symlink d0/d52/d6d/d77/lab 0
2026-03-09T19:27:14.721 INFO:tasks.workunit.client.1.vm08.stdout:3/544: chown d0/d6/de/d1b/d16 100801 1
2026-03-09T19:27:14.723 INFO:tasks.workunit.client.0.vm07.stdout:8/243: dwrite d7/d9/d10/f1b [0,4194304] 0
2026-03-09T19:27:14.734 INFO:tasks.workunit.client.0.vm07.stdout:1/175: rename d1/d9 to d1/d11/d37/d3f/d45 0
2026-03-09T19:27:14.734 INFO:tasks.workunit.client.0.vm07.stdout:1/176: dread - d1/f1d zero size
2026-03-09T19:27:14.735 INFO:tasks.workunit.client.0.vm07.stdout:1/177: chown d1/d3/f4 196 1
2026-03-09T19:27:14.739 INFO:tasks.workunit.client.0.vm07.stdout:4/189: getdents d3/d11/d29/d34 0
2026-03-09T19:27:14.741 INFO:tasks.workunit.client.0.vm07.stdout:7/232: symlink d0/d4/d5/d8/d1a/d2a/l53 0
2026-03-09T19:27:14.745 INFO:tasks.workunit.client.1.vm08.stdout:7/550: creat d5/d14/d38/fbb x:0 0 0
2026-03-09T19:27:14.745 INFO:tasks.workunit.client.0.vm07.stdout:4/190: dwrite d3/d11/f18 [0,4194304] 0
2026-03-09T19:27:14.749 INFO:tasks.workunit.client.1.vm08.stdout:9/468: unlink d0/d2/d8/l3c 0
2026-03-09T19:27:14.754 INFO:tasks.workunit.client.0.vm07.stdout:7/233: dwrite d0/d4/d5/d8/f35 [0,4194304] 0
2026-03-09T19:27:14.755 INFO:tasks.workunit.client.1.vm08.stdout:4/484: unlink da/d10/d1b/l54 0
2026-03-09T19:27:14.757 INFO:tasks.workunit.client.0.vm07.stdout:9/264: symlink d0/db/d29/d32/l63 0
2026-03-09T19:27:14.758 INFO:tasks.workunit.client.0.vm07.stdout:2/260: mknod d3/dd/d16/d30/c5e 0
2026-03-09T19:27:14.759 INFO:tasks.workunit.client.1.vm08.stdout:5/446: truncate ff 1009990 0
2026-03-09T19:27:14.760 INFO:tasks.workunit.client.1.vm08.stdout:6/478: write d3/db/f14 [3019342,3267] 0
2026-03-09T19:27:14.763 INFO:tasks.workunit.client.0.vm07.stdout:5/187: unlink d3/dd/d26/d3f/f42 0
2026-03-09T19:27:14.772 INFO:tasks.workunit.client.1.vm08.stdout:3/545: chown d0/d6/de/d1b/d16/l67 31158 1
2026-03-09T19:27:14.773 INFO:tasks.workunit.client.0.vm07.stdout:6/196: symlink d0/d4e/l56 0
2026-03-09T19:27:14.773 INFO:tasks.workunit.client.0.vm07.stdout:7/234: dwrite d0/d4/d5/f20 [0,4194304] 0 2026-03-09T19:27:14.773 INFO:tasks.workunit.client.0.vm07.stdout:9/265: dread d0/db/f39 [0,4194304] 0 2026-03-09T19:27:14.773 INFO:tasks.workunit.client.0.vm07.stdout:0/199: unlink d0/d6/d13/d1c/d11/f40 0 2026-03-09T19:27:14.773 INFO:tasks.workunit.client.0.vm07.stdout:0/200: chown d0/d6/d13/d1c/f36 44 1 2026-03-09T19:27:14.773 INFO:tasks.workunit.client.0.vm07.stdout:0/201: dwrite d0/f3d [0,4194304] 0 2026-03-09T19:27:14.777 INFO:tasks.workunit.client.0.vm07.stdout:0/202: dread d0/fa [0,4194304] 0 2026-03-09T19:27:14.786 INFO:tasks.workunit.client.0.vm07.stdout:6/197: dread d0/d1/f19 [0,4194304] 0 2026-03-09T19:27:14.786 INFO:tasks.workunit.client.0.vm07.stdout:6/198: stat d0/d1/db/d1d/f3e 0 2026-03-09T19:27:14.787 INFO:tasks.workunit.client.0.vm07.stdout:6/199: write d0/d1/db/d1d/f3e [367286,32766] 0 2026-03-09T19:27:14.795 INFO:tasks.workunit.client.0.vm07.stdout:8/244: dread f3 [0,4194304] 0 2026-03-09T19:27:14.797 INFO:tasks.workunit.client.0.vm07.stdout:1/178: dread d1/d3/f12 [0,4194304] 0 2026-03-09T19:27:14.850 INFO:tasks.workunit.client.1.vm08.stdout:9/469: symlink d0/d2/d14/la3 0 2026-03-09T19:27:14.851 INFO:tasks.workunit.client.1.vm08.stdout:4/485: dread da/d10/f1f [0,4194304] 0 2026-03-09T19:27:14.855 INFO:tasks.workunit.client.0.vm07.stdout:2/261: creat d3/dd/d16/f5f x:0 0 0 2026-03-09T19:27:14.855 INFO:tasks.workunit.client.0.vm07.stdout:2/262: write d3/dd/f1e [7278464,90676] 0 2026-03-09T19:27:14.856 INFO:tasks.workunit.client.0.vm07.stdout:2/263: chown d3/d49 0 1 2026-03-09T19:27:14.861 INFO:tasks.workunit.client.0.vm07.stdout:2/264: dread d3/dd/d16/d29/f42 [0,4194304] 0 2026-03-09T19:27:14.865 INFO:tasks.workunit.client.0.vm07.stdout:3/264: truncate d1/d6/f9 1156262 0 2026-03-09T19:27:14.870 INFO:tasks.workunit.client.1.vm08.stdout:2/449: write d3/d4/d23/d2c/d39/d5e/de/f17 [1684370,1778] 0 2026-03-09T19:27:14.871 
INFO:tasks.workunit.client.0.vm07.stdout:9/266: mknod d0/db/d29/d32/c64 0 2026-03-09T19:27:14.872 INFO:tasks.workunit.client.1.vm08.stdout:8/456: fsync de/f16 0 2026-03-09T19:27:14.879 INFO:tasks.workunit.client.1.vm08.stdout:1/626: link d9/da/dc/c25 d9/da/dc/cba 0 2026-03-09T19:27:14.880 INFO:tasks.workunit.client.1.vm08.stdout:1/627: dread - d9/da/d2d/f50 zero size 2026-03-09T19:27:14.884 INFO:tasks.workunit.client.0.vm07.stdout:5/188: write d3/d1a/fb [802965,121180] 0 2026-03-09T19:27:14.884 INFO:tasks.workunit.client.1.vm08.stdout:7/551: write d5/d14/d2b/d5d/f84 [3558694,99441] 0 2026-03-09T19:27:14.885 INFO:tasks.workunit.client.0.vm07.stdout:5/189: chown d3/d1a/d28/d36/l3b 7 1 2026-03-09T19:27:14.885 INFO:tasks.workunit.client.0.vm07.stdout:5/190: dread - d3/f2f zero size 2026-03-09T19:27:14.886 INFO:tasks.workunit.client.0.vm07.stdout:5/191: truncate d3/d1a/d28/f3c 820119 0 2026-03-09T19:27:14.890 INFO:tasks.workunit.client.1.vm08.stdout:1/628: dwrite d9/d11/d7a/d89/d8d/da3/fab [0,4194304] 0 2026-03-09T19:27:14.891 INFO:tasks.workunit.client.1.vm08.stdout:6/479: dwrite d3/d15/f19 [4194304,4194304] 0 2026-03-09T19:27:14.916 INFO:tasks.workunit.client.0.vm07.stdout:7/235: fdatasync d0/d4/d5/f20 0 2026-03-09T19:27:14.918 INFO:tasks.workunit.client.1.vm08.stdout:3/546: mkdir d0/d6/de/d1b/d16/d17/dac 0 2026-03-09T19:27:14.919 INFO:tasks.workunit.client.0.vm07.stdout:0/203: mknod d0/d6/d13/d33/c44 0 2026-03-09T19:27:14.921 INFO:tasks.workunit.client.0.vm07.stdout:8/245: creat d7/d9/f60 x:0 0 0 2026-03-09T19:27:14.921 INFO:tasks.workunit.client.0.vm07.stdout:6/200: creat d0/d13/f57 x:0 0 0 2026-03-09T19:27:14.922 INFO:tasks.workunit.client.0.vm07.stdout:7/236: dwrite d0/d4/d5/d8/d1a/f1d [8388608,4194304] 0 2026-03-09T19:27:14.923 INFO:tasks.workunit.client.0.vm07.stdout:0/204: dread d0/f3d [0,4194304] 0 2026-03-09T19:27:14.924 INFO:tasks.workunit.client.0.vm07.stdout:6/201: write d0/d1/db/d24/f4d [137426,67693] 0 2026-03-09T19:27:14.924 
INFO:tasks.workunit.client.0.vm07.stdout:6/202: truncate d0/d1/db/f4b 83973 0 2026-03-09T19:27:14.924 INFO:tasks.workunit.client.0.vm07.stdout:8/246: read d7/d9/f36 [79301,110343] 0 2026-03-09T19:27:14.925 INFO:tasks.workunit.client.0.vm07.stdout:6/203: dread - d0/d1/db/d24/d53/f48 zero size 2026-03-09T19:27:14.928 INFO:tasks.workunit.client.0.vm07.stdout:4/191: rename d3/d11/l23 to d3/d11/d16/d2f/l3a 0 2026-03-09T19:27:14.929 INFO:tasks.workunit.client.1.vm08.stdout:0/496: creat dd/d22/d27/f9e x:0 0 0 2026-03-09T19:27:14.932 INFO:tasks.workunit.client.0.vm07.stdout:4/192: dread d3/d11/f18 [0,4194304] 0 2026-03-09T19:27:14.935 INFO:tasks.workunit.client.0.vm07.stdout:4/193: dread d3/d11/d2b/d37/f2e [0,4194304] 0 2026-03-09T19:27:14.957 INFO:tasks.workunit.client.0.vm07.stdout:6/204: read d0/d1/f8 [3377359,124165] 0 2026-03-09T19:27:14.957 INFO:tasks.workunit.client.0.vm07.stdout:3/265: creat d1/d1f/d3e/f49 x:0 0 0 2026-03-09T19:27:14.957 INFO:tasks.workunit.client.1.vm08.stdout:5/447: fsync ff 0 2026-03-09T19:27:14.958 INFO:tasks.workunit.client.1.vm08.stdout:5/448: dread d16/f4d [0,4194304] 0 2026-03-09T19:27:14.958 INFO:tasks.workunit.client.1.vm08.stdout:7/552: mknod d5/d14/d27/cbc 0 2026-03-09T19:27:14.971 INFO:tasks.workunit.client.1.vm08.stdout:3/547: mkdir d0/d6/dad 0 2026-03-09T19:27:14.972 INFO:tasks.workunit.client.0.vm07.stdout:0/205: sync 2026-03-09T19:27:14.972 INFO:tasks.workunit.client.1.vm08.stdout:3/548: fdatasync d0/f7a 0 2026-03-09T19:27:14.973 INFO:tasks.workunit.client.0.vm07.stdout:0/206: chown d0/d6/d13/d17/l2a 0 1 2026-03-09T19:27:14.974 INFO:tasks.workunit.client.1.vm08.stdout:0/497: dread dd/d22/d24/f87 [0,4194304] 0 2026-03-09T19:27:14.977 INFO:tasks.workunit.client.1.vm08.stdout:9/470: symlink d0/d1b/d97/d48/d5d/d74/la4 0 2026-03-09T19:27:14.980 INFO:tasks.workunit.client.0.vm07.stdout:9/267: dread d0/d17/f1f [0,4194304] 0 2026-03-09T19:27:14.981 INFO:tasks.workunit.client.1.vm08.stdout:4/486: write da/d10/d26/d38/f43 [1237027,114131] 0 
2026-03-09T19:27:14.985 INFO:tasks.workunit.client.1.vm08.stdout:6/480: write d3/f32 [1016792,15339] 0 2026-03-09T19:27:14.985 INFO:tasks.workunit.client.1.vm08.stdout:2/450: write d3/d4/f61 [988158,50348] 0 2026-03-09T19:27:14.995 INFO:tasks.workunit.client.0.vm07.stdout:1/179: mknod d1/db/d31/c46 0 2026-03-09T19:27:14.995 INFO:tasks.workunit.client.1.vm08.stdout:1/629: dwrite d9/f48 [0,4194304] 0 2026-03-09T19:27:14.997 INFO:tasks.workunit.client.1.vm08.stdout:6/481: dwrite d3/d34/d5c/da2/f72 [4194304,4194304] 0 2026-03-09T19:27:15.014 INFO:tasks.workunit.client.0.vm07.stdout:6/205: symlink d0/d1/db/d1d/l58 0 2026-03-09T19:27:15.014 INFO:tasks.workunit.client.0.vm07.stdout:2/265: truncate d3/f1a 1351388 0 2026-03-09T19:27:15.014 INFO:tasks.workunit.client.0.vm07.stdout:3/266: symlink d1/d1f/l4a 0 2026-03-09T19:27:15.014 INFO:tasks.workunit.client.0.vm07.stdout:5/192: creat d3/d1a/d28/d40/f46 x:0 0 0 2026-03-09T19:27:15.014 INFO:tasks.workunit.client.0.vm07.stdout:3/267: write d1/f20 [694195,84028] 0 2026-03-09T19:27:15.015 INFO:tasks.workunit.client.0.vm07.stdout:2/266: stat d3/dd/d16/d29/d2d/d45/l52 0 2026-03-09T19:27:15.017 INFO:tasks.workunit.client.1.vm08.stdout:8/457: creat de/d1d/d21/d73/fa6 x:0 0 0 2026-03-09T19:27:15.018 INFO:tasks.workunit.client.0.vm07.stdout:5/193: dwrite d3/f19 [0,4194304] 0 2026-03-09T19:27:15.044 INFO:tasks.workunit.client.0.vm07.stdout:0/207: unlink d0/d6/d13/d1c/f9 0 2026-03-09T19:27:15.044 INFO:tasks.workunit.client.0.vm07.stdout:0/208: dread - d0/f41 zero size 2026-03-09T19:27:15.050 INFO:tasks.workunit.client.1.vm08.stdout:3/549: symlink d0/d4b/lae 0 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.0.vm07.stdout:7/237: fdatasync d0/f13 0 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.0.vm07.stdout:7/238: chown d0/d4/d5/dd/f47 1032300 1 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.0.vm07.stdout:7/239: fdatasync d0/d4/d5/d8/f37 0 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.0.vm07.stdout:9/268: dread d0/d6/f10 
[4194304,4194304] 0 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.0.vm07.stdout:1/180: write d1/f1d [472779,23614] 0 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.0.vm07.stdout:1/181: dread - d1/d11/f42 zero size 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.0.vm07.stdout:1/182: dwrite d1/db/f1f [0,4194304] 0 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.1.vm08.stdout:0/498: fsync dd/d22/d24/f71 0 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.1.vm08.stdout:0/499: truncate dd/d22/d27/f9e 950551 0 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.1.vm08.stdout:9/471: symlink d0/d1b/d97/d48/d6f/la5 0 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.1.vm08.stdout:9/472: chown d0/d1b/d68/d7f/d8c 2001923 1 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.1.vm08.stdout:9/473: chown d0/d2/d14/d5c 416935 1 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.1.vm08.stdout:4/487: unlink f5 0 2026-03-09T19:27:15.080 INFO:tasks.workunit.client.1.vm08.stdout:9/474: dwrite d0/d1b/f8a [0,4194304] 0 2026-03-09T19:27:15.111 INFO:tasks.workunit.client.0.vm07.stdout:7/240: dread d0/d4/d5/d8/d1a/f1d [0,4194304] 0 2026-03-09T19:27:15.112 INFO:tasks.workunit.client.1.vm08.stdout:5/449: mkdir d16/d8e 0 2026-03-09T19:27:15.112 INFO:tasks.workunit.client.1.vm08.stdout:7/553: rename d5/d14/dae/d1c/l74 to d5/d14/d2b/d5d/lbd 0 2026-03-09T19:27:15.113 INFO:tasks.workunit.client.1.vm08.stdout:5/450: dread - d16/d1e/d30/f70 zero size 2026-03-09T19:27:15.113 INFO:tasks.workunit.client.1.vm08.stdout:7/554: chown d5/d14/dae/f49 117806 1 2026-03-09T19:27:15.114 INFO:tasks.workunit.client.1.vm08.stdout:7/555: truncate d5/d14/dae/d1c/d73/fb3 110710 0 2026-03-09T19:27:15.115 INFO:tasks.workunit.client.0.vm07.stdout:7/241: dwrite d0/d4/d5/d8/d1a/d2a/f34 [0,4194304] 0 2026-03-09T19:27:15.117 INFO:tasks.workunit.client.0.vm07.stdout:6/206: stat d0/d44 0 2026-03-09T19:27:15.118 INFO:tasks.workunit.client.0.vm07.stdout:7/242: chown d0/d4/d5/d8/d1a 5252110 1 
2026-03-09T19:27:15.124 INFO:tasks.workunit.client.0.vm07.stdout:7/243: read d0/d4/d5/d8/fa [60536,33896] 0 2026-03-09T19:27:15.125 INFO:tasks.workunit.client.0.vm07.stdout:6/207: dwrite d0/d13/f26 [0,4194304] 0 2026-03-09T19:27:15.126 INFO:tasks.workunit.client.0.vm07.stdout:6/208: chown d0/f4f 7077813 1 2026-03-09T19:27:15.127 INFO:tasks.workunit.client.0.vm07.stdout:6/209: chown d0/d1/db/d52 383 1 2026-03-09T19:27:15.127 INFO:tasks.workunit.client.0.vm07.stdout:7/244: write d0/d4/d5/f36 [832092,87745] 0 2026-03-09T19:27:15.128 INFO:tasks.workunit.client.1.vm08.stdout:8/458: creat de/d1d/d21/d73/fa7 x:0 0 0 2026-03-09T19:27:15.142 INFO:tasks.workunit.client.0.vm07.stdout:1/183: dread d1/d3/f23 [0,4194304] 0 2026-03-09T19:27:15.147 INFO:tasks.workunit.client.1.vm08.stdout:3/550: creat d0/d52/d6d/d77/d88/faf x:0 0 0 2026-03-09T19:27:15.156 INFO:tasks.workunit.client.0.vm07.stdout:2/267: dread d3/f15 [0,4194304] 0 2026-03-09T19:27:15.163 INFO:tasks.workunit.client.0.vm07.stdout:5/194: mkdir d3/dd/d26/d3f/d47 0 2026-03-09T19:27:15.163 INFO:tasks.workunit.client.0.vm07.stdout:2/268: read d3/f4 [1276311,96106] 0 2026-03-09T19:27:15.163 INFO:tasks.workunit.client.0.vm07.stdout:2/269: read - d3/dd/d16/f48 zero size 2026-03-09T19:27:15.163 INFO:tasks.workunit.client.0.vm07.stdout:2/270: write d3/dd/d16/d30/d40/f4f [240741,93936] 0 2026-03-09T19:27:15.165 INFO:tasks.workunit.client.0.vm07.stdout:0/209: fsync d0/fa 0 2026-03-09T19:27:15.172 INFO:tasks.workunit.client.1.vm08.stdout:6/482: sync 2026-03-09T19:27:15.175 INFO:tasks.workunit.client.1.vm08.stdout:9/475: sync 2026-03-09T19:27:15.175 INFO:tasks.workunit.client.1.vm08.stdout:4/488: sync 2026-03-09T19:27:15.175 INFO:tasks.workunit.client.1.vm08.stdout:6/483: stat d3/d15/c22 0 2026-03-09T19:27:15.178 INFO:tasks.workunit.client.1.vm08.stdout:7/556: chown d5/c15 421 1 2026-03-09T19:27:15.183 INFO:tasks.workunit.client.0.vm07.stdout:9/269: dread d0/db/f39 [0,4194304] 0 2026-03-09T19:27:15.185 
INFO:tasks.workunit.client.0.vm07.stdout:8/247: creat d7/d30/f61 x:0 0 0 2026-03-09T19:27:15.186 INFO:tasks.workunit.client.0.vm07.stdout:8/248: chown d7/d1d/f4d 12 1 2026-03-09T19:27:15.187 INFO:tasks.workunit.client.0.vm07.stdout:8/249: fsync d7/d9/d10/f41 0 2026-03-09T19:27:15.198 INFO:tasks.workunit.client.1.vm08.stdout:5/451: creat d16/d1e/d30/d6f/f8f x:0 0 0 2026-03-09T19:27:15.201 INFO:tasks.workunit.client.0.vm07.stdout:7/245: rename d0/d4/d4f to d0/d52/d54 0 2026-03-09T19:27:15.201 INFO:tasks.workunit.client.0.vm07.stdout:7/246: chown d0/d4 207 1 2026-03-09T19:27:15.207 INFO:tasks.workunit.client.1.vm08.stdout:9/476: sync 2026-03-09T19:27:15.208 INFO:tasks.workunit.client.1.vm08.stdout:4/489: unlink da/d10/d16/d28/d46/d52/d6e/d2c/f60 0 2026-03-09T19:27:15.220 INFO:tasks.workunit.client.1.vm08.stdout:5/452: sync 2026-03-09T19:27:15.220 INFO:tasks.workunit.client.0.vm07.stdout:4/194: write d3/f1a [1148029,40080] 0 2026-03-09T19:27:15.220 INFO:tasks.workunit.client.0.vm07.stdout:4/195: chown d3/d11/d29 131832617 1 2026-03-09T19:27:15.223 INFO:tasks.workunit.client.0.vm07.stdout:5/195: mkdir d3/d1a/d28/d48 0 2026-03-09T19:27:15.224 INFO:tasks.workunit.client.1.vm08.stdout:6/484: creat d3/db/d43/fab x:0 0 0 2026-03-09T19:27:15.231 INFO:tasks.workunit.client.1.vm08.stdout:7/557: creat d5/d14/dae/d1c/d73/fbe x:0 0 0 2026-03-09T19:27:15.231 INFO:tasks.workunit.client.1.vm08.stdout:6/485: sync 2026-03-09T19:27:15.232 INFO:tasks.workunit.client.1.vm08.stdout:6/486: fdatasync d3/d34/f37 0 2026-03-09T19:27:15.234 INFO:tasks.workunit.client.1.vm08.stdout:7/558: dread d5/d14/dae/d1c/d73/fb3 [0,4194304] 0 2026-03-09T19:27:15.239 INFO:tasks.workunit.client.1.vm08.stdout:6/487: dread d3/d15/f19 [4194304,4194304] 0 2026-03-09T19:27:15.247 INFO:tasks.workunit.client.1.vm08.stdout:0/500: creat dd/d22/d27/f9f x:0 0 0 2026-03-09T19:27:15.249 INFO:tasks.workunit.client.1.vm08.stdout:0/501: write dd/d22/d27/f3d [3537033,63777] 0 2026-03-09T19:27:15.255 
INFO:tasks.workunit.client.1.vm08.stdout:9/477: dread d0/d1b/d97/d48/d6f/f84 [0,4194304] 0 2026-03-09T19:27:15.256 INFO:tasks.workunit.client.1.vm08.stdout:9/478: stat d0/d2/d14/f28 0 2026-03-09T19:27:15.258 INFO:tasks.workunit.client.0.vm07.stdout:9/270: creat d0/db/d29/d4d/f65 x:0 0 0 2026-03-09T19:27:15.273 INFO:tasks.workunit.client.0.vm07.stdout:8/250: mkdir d7/d9/d37/d45/d56/d62 0 2026-03-09T19:27:15.275 INFO:tasks.workunit.client.1.vm08.stdout:2/451: write d3/d4/d23/d2c/d39/d5e/d14/f78 [2882570,95219] 0 2026-03-09T19:27:15.296 INFO:tasks.workunit.client.1.vm08.stdout:6/488: rename d3/d34/d5c/f7c to d3/d34/d5c/fac 0 2026-03-09T19:27:15.297 INFO:tasks.workunit.client.0.vm07.stdout:1/184: link d1/d11/d37/d3f/d45/f3b d1/d3/d21/f47 0 2026-03-09T19:27:15.298 INFO:tasks.workunit.client.0.vm07.stdout:1/185: dread - d1/d11/d37/f2c zero size 2026-03-09T19:27:15.299 INFO:tasks.workunit.client.0.vm07.stdout:1/186: chown d1/d11/d37/d3f/d45/f26 8010 1 2026-03-09T19:27:15.300 INFO:tasks.workunit.client.1.vm08.stdout:8/459: write de/d1d/d21/f62 [1657924,72321] 0 2026-03-09T19:27:15.302 INFO:tasks.workunit.client.0.vm07.stdout:6/210: dwrite d0/f9 [0,4194304] 0 2026-03-09T19:27:15.306 INFO:tasks.workunit.client.1.vm08.stdout:1/630: dwrite d9/da/d12/d39/f47 [0,4194304] 0 2026-03-09T19:27:15.307 INFO:tasks.workunit.client.0.vm07.stdout:4/196: rmdir d3/d11/d2b 39 2026-03-09T19:27:15.314 INFO:tasks.workunit.client.0.vm07.stdout:5/196: rmdir d3 39 2026-03-09T19:27:15.317 INFO:tasks.workunit.client.0.vm07.stdout:7/247: write d0/d4/fc [327320,34615] 0 2026-03-09T19:27:15.317 INFO:tasks.workunit.client.0.vm07.stdout:2/271: mkdir d3/d49/d60 0 2026-03-09T19:27:15.327 INFO:tasks.workunit.client.1.vm08.stdout:8/460: dread f1 [0,4194304] 0 2026-03-09T19:27:15.334 INFO:tasks.workunit.client.1.vm08.stdout:0/502: mknod dd/d22/d24/d49/d98/ca0 0 2026-03-09T19:27:15.337 INFO:tasks.workunit.client.1.vm08.stdout:0/503: fdatasync dd/d22/d24/d49/d50/f95 0 2026-03-09T19:27:15.340 
INFO:tasks.workunit.client.0.vm07.stdout:9/271: truncate d0/db/d29/f2f 232301 0 2026-03-09T19:27:15.342 INFO:tasks.workunit.client.1.vm08.stdout:5/453: dwrite d16/d1e/d30/f3f [0,4194304] 0 2026-03-09T19:27:15.342 INFO:tasks.workunit.client.0.vm07.stdout:9/272: write d0/db/f41 [2563700,39338] 0 2026-03-09T19:27:15.342 INFO:tasks.workunit.client.0.vm07.stdout:9/273: fsync d0/d6/d57/f59 0 2026-03-09T19:27:15.343 INFO:tasks.workunit.client.1.vm08.stdout:4/490: mknod da/d10/d16/d28/d2f/d4f/d64/d81/c97 0 2026-03-09T19:27:15.348 INFO:tasks.workunit.client.1.vm08.stdout:4/491: chown da/d10/d16/d28/f34 7013 1 2026-03-09T19:27:15.364 INFO:tasks.workunit.client.0.vm07.stdout:0/210: rename d0/c2d to d0/d6/d13/d1c/d11/c45 0 2026-03-09T19:27:15.374 INFO:tasks.workunit.client.1.vm08.stdout:7/559: symlink d5/lbf 0 2026-03-09T19:27:15.377 INFO:tasks.workunit.client.1.vm08.stdout:3/551: getdents d0/d6/de/d1b/d16 0 2026-03-09T19:27:15.386 INFO:tasks.workunit.client.0.vm07.stdout:8/251: creat d7/d9/d10/d44/f63 x:0 0 0 2026-03-09T19:27:15.387 INFO:tasks.workunit.client.0.vm07.stdout:8/252: truncate d7/d9/f4c 515913 0 2026-03-09T19:27:15.388 INFO:tasks.workunit.client.0.vm07.stdout:8/253: truncate d7/d1d/f3f 2517410 0 2026-03-09T19:27:15.394 INFO:tasks.workunit.client.0.vm07.stdout:3/268: dwrite d1/d6/f9 [0,4194304] 0 2026-03-09T19:27:15.395 INFO:tasks.workunit.client.1.vm08.stdout:1/631: unlink d9/da/d12/d39/c69 0 2026-03-09T19:27:15.402 INFO:tasks.workunit.client.0.vm07.stdout:8/254: dread d7/d9/d10/f20 [0,4194304] 0 2026-03-09T19:27:15.402 INFO:tasks.workunit.client.1.vm08.stdout:9/479: dwrite d0/d2/d8/fe [0,4194304] 0 2026-03-09T19:27:15.403 INFO:tasks.workunit.client.0.vm07.stdout:8/255: read d7/d16/d1e/f33 [2840094,44192] 0 2026-03-09T19:27:15.419 INFO:tasks.workunit.client.0.vm07.stdout:5/197: write d3/f19 [2535485,17572] 0 2026-03-09T19:27:15.425 INFO:tasks.workunit.client.0.vm07.stdout:7/248: mkdir d0/d52/d54/d55 0 2026-03-09T19:27:15.426 
INFO:tasks.workunit.client.0.vm07.stdout:2/272: truncate d3/dd/d16/d29/f42 3210015 0 2026-03-09T19:27:15.432 INFO:tasks.workunit.client.1.vm08.stdout:2/452: unlink d3/d4/d23/d2c/d39/d5e/d87/c9e 0 2026-03-09T19:27:15.432 INFO:tasks.workunit.client.1.vm08.stdout:4/492: fsync f1 0 2026-03-09T19:27:15.434 INFO:tasks.workunit.client.1.vm08.stdout:3/552: creat d0/d4b/fb0 x:0 0 0 2026-03-09T19:27:15.436 INFO:tasks.workunit.client.1.vm08.stdout:6/489: mknod d3/d34/d6f/cad 0 2026-03-09T19:27:15.439 INFO:tasks.workunit.client.0.vm07.stdout:6/211: rename d0/d44/l45 to d0/d13/l59 0 2026-03-09T19:27:15.442 INFO:tasks.workunit.client.1.vm08.stdout:2/453: dwrite d3/d4/f8 [4194304,4194304] 0 2026-03-09T19:27:15.454 INFO:tasks.workunit.client.1.vm08.stdout:2/454: dread d3/d9/f1e [0,4194304] 0 2026-03-09T19:27:15.458 INFO:tasks.workunit.client.0.vm07.stdout:4/197: write d3/d11/f18 [1876449,116108] 0 2026-03-09T19:27:15.464 INFO:tasks.workunit.client.1.vm08.stdout:9/480: mknod d0/d2/d14/ca6 0 2026-03-09T19:27:15.470 INFO:tasks.workunit.client.1.vm08.stdout:5/454: getdents d16/d8e 0 2026-03-09T19:27:15.471 INFO:tasks.workunit.client.1.vm08.stdout:9/481: chown d0/d1b/d97/d48/d6f/c76 26845507 1 2026-03-09T19:27:15.471 INFO:tasks.workunit.client.1.vm08.stdout:4/493: creat da/d10/d1b/d23/f98 x:0 0 0 2026-03-09T19:27:15.471 INFO:tasks.workunit.client.0.vm07.stdout:8/256: rmdir d7/d16 39 2026-03-09T19:27:15.471 INFO:tasks.workunit.client.0.vm07.stdout:8/257: truncate d7/d9/d10/d44/f48 740563 0 2026-03-09T19:27:15.471 INFO:tasks.workunit.client.0.vm07.stdout:1/187: link d1/d3/d21/f2e d1/d11/f48 0 2026-03-09T19:27:15.471 INFO:tasks.workunit.client.0.vm07.stdout:1/188: stat d1/d3/l25 0 2026-03-09T19:27:15.471 INFO:tasks.workunit.client.1.vm08.stdout:9/482: truncate d0/d2/d8/f8e 960086 0 2026-03-09T19:27:15.484 INFO:tasks.workunit.client.0.vm07.stdout:9/274: write d0/d6/f20 [1398734,107927] 0 2026-03-09T19:27:15.491 INFO:tasks.workunit.client.0.vm07.stdout:5/198: chown d3/dd/f22 62 1 
2026-03-09T19:27:15.491 INFO:tasks.workunit.client.0.vm07.stdout:9/275: fdatasync d0/db/d29/d2c/d36/f3c 0 2026-03-09T19:27:15.492 INFO:tasks.workunit.client.1.vm08.stdout:6/490: creat d3/d34/da9/fae x:0 0 0 2026-03-09T19:27:15.493 INFO:tasks.workunit.client.1.vm08.stdout:1/632: dwrite d9/da/f30 [0,4194304] 0 2026-03-09T19:27:15.493 INFO:tasks.workunit.client.0.vm07.stdout:7/249: rename d0/d4/d5/d8/fa to d0/d4/d5/d26/d32/f56 0 2026-03-09T19:27:15.496 INFO:tasks.workunit.client.1.vm08.stdout:8/461: dwrite de/d1d/d69/f8f [0,4194304] 0 2026-03-09T19:27:15.504 INFO:tasks.workunit.client.0.vm07.stdout:9/276: sync 2026-03-09T19:27:15.504 INFO:tasks.workunit.client.0.vm07.stdout:5/199: sync 2026-03-09T19:27:15.512 INFO:tasks.workunit.client.0.vm07.stdout:4/198: creat d3/d11/d16/d2f/d22/f3b x:0 0 0 2026-03-09T19:27:15.515 INFO:tasks.workunit.client.1.vm08.stdout:0/504: truncate dd/d22/d27/f3f 700923 0 2026-03-09T19:27:15.520 INFO:tasks.workunit.client.1.vm08.stdout:2/455: dread d3/d4/d23/d2c/f80 [0,4194304] 0 2026-03-09T19:27:15.523 INFO:tasks.workunit.client.1.vm08.stdout:5/455: symlink d16/d1e/d3b/l90 0 2026-03-09T19:27:15.525 INFO:tasks.workunit.client.1.vm08.stdout:4/494: unlink da/d10/d26/d50/c63 0 2026-03-09T19:27:15.529 INFO:tasks.workunit.client.0.vm07.stdout:2/273: rename d3/dd/d16/d29/d2d/d45/c4e to d3/dd/d16/d2f/c61 0 2026-03-09T19:27:15.531 INFO:tasks.workunit.client.0.vm07.stdout:2/274: dread d3/dd/f1d [0,4194304] 0 2026-03-09T19:27:15.531 INFO:tasks.workunit.client.1.vm08.stdout:1/633: write d9/d11/f44 [3013854,55419] 0 2026-03-09T19:27:15.536 INFO:tasks.workunit.client.0.vm07.stdout:7/250: fsync d0/d4/d5/d26/d32/f45 0 2026-03-09T19:27:15.540 INFO:tasks.workunit.client.1.vm08.stdout:8/462: chown cb 33087 1 2026-03-09T19:27:15.540 INFO:tasks.workunit.client.0.vm07.stdout:7/251: dread - d0/d4/d5/d26/f4a zero size 2026-03-09T19:27:15.540 INFO:tasks.workunit.client.0.vm07.stdout:7/252: dread - d0/d4/d5/dd/f47 zero size 2026-03-09T19:27:15.540 
INFO:tasks.workunit.client.0.vm07.stdout:7/253: read d0/f13 [7401142,43909] 0 2026-03-09T19:27:15.544 INFO:tasks.workunit.client.1.vm08.stdout:8/463: truncate de/d1d/d2e/f9f 1196166 0 2026-03-09T19:27:15.549 INFO:tasks.workunit.client.1.vm08.stdout:6/491: rename d3/d15/f98 to d3/db/d43/d69/da0/faf 0 2026-03-09T19:27:15.549 INFO:tasks.workunit.client.0.vm07.stdout:0/211: dread d0/f1e [0,4194304] 0 2026-03-09T19:27:15.552 INFO:tasks.workunit.client.0.vm07.stdout:9/277: mknod d0/db/d29/d4d/c66 0 2026-03-09T19:27:15.553 INFO:tasks.workunit.client.1.vm08.stdout:0/505: mknod dd/d22/d27/d2e/d37/ca1 0 2026-03-09T19:27:15.555 INFO:tasks.workunit.client.0.vm07.stdout:6/212: write d0/d1/db/f14 [603925,47442] 0 2026-03-09T19:27:15.568 INFO:tasks.workunit.client.0.vm07.stdout:4/199: creat d3/d11/d29/f3c x:0 0 0 2026-03-09T19:27:15.569 INFO:tasks.workunit.client.1.vm08.stdout:3/553: dwrite d0/d6/de/d15/d96/fa0 [0,4194304] 0 2026-03-09T19:27:15.569 INFO:tasks.workunit.client.0.vm07.stdout:5/200: dwrite d3/d1a/f1c [4194304,4194304] 0 2026-03-09T19:27:15.572 INFO:tasks.workunit.client.0.vm07.stdout:8/258: creat d7/d9/d37/d45/d56/d62/f64 x:0 0 0 2026-03-09T19:27:15.572 INFO:tasks.workunit.client.0.vm07.stdout:5/201: readlink d3/dd/d26/d2d/l41 0 2026-03-09T19:27:15.576 INFO:tasks.workunit.client.0.vm07.stdout:1/189: creat d1/d3e/f49 x:0 0 0 2026-03-09T19:27:15.576 INFO:tasks.workunit.client.0.vm07.stdout:1/190: readlink d1/d3/d21/l2b 0 2026-03-09T19:27:15.576 INFO:tasks.workunit.client.0.vm07.stdout:8/259: truncate d7/d9/d37/d45/d56/d62/f64 426599 0 2026-03-09T19:27:15.578 INFO:tasks.workunit.client.0.vm07.stdout:1/191: dwrite d1/d11/d37/d3f/d45/f15 [0,4194304] 0 2026-03-09T19:27:15.590 INFO:tasks.workunit.client.1.vm08.stdout:7/560: getdents d5/d14/dae/d3a/d42 0 2026-03-09T19:27:15.600 INFO:tasks.workunit.client.0.vm07.stdout:7/254: unlink d0/d4/ce 0 2026-03-09T19:27:15.602 INFO:tasks.workunit.client.1.vm08.stdout:1/634: creat d9/da/d12/d39/fbb x:0 0 0 2026-03-09T19:27:15.606 
INFO:tasks.workunit.client.0.vm07.stdout:0/212: dwrite d0/d6/d13/d1c/d11/f3f [0,4194304] 0 2026-03-09T19:27:15.614 INFO:tasks.workunit.client.0.vm07.stdout:9/278: dread d0/d6/f8 [0,4194304] 0 2026-03-09T19:27:15.616 INFO:tasks.workunit.client.1.vm08.stdout:8/464: symlink de/d25/d31/la8 0 2026-03-09T19:27:15.635 INFO:tasks.workunit.client.1.vm08.stdout:0/506: mknod dd/d22/d27/d4f/d6f/ca2 0 2026-03-09T19:27:15.645 INFO:tasks.workunit.client.0.vm07.stdout:5/202: creat d3/d1a/d28/d40/f49 x:0 0 0 2026-03-09T19:27:15.645 INFO:tasks.workunit.client.0.vm07.stdout:5/203: chown d3/c4 270923 1 2026-03-09T19:27:15.646 INFO:tasks.workunit.client.1.vm08.stdout:6/492: dread d3/f7 [0,4194304] 0 2026-03-09T19:27:15.656 INFO:tasks.workunit.client.1.vm08.stdout:3/554: rmdir d0/d6/d25 39 2026-03-09T19:27:15.670 INFO:tasks.workunit.client.0.vm07.stdout:5/204: dread d3/d1a/f12 [0,4194304] 0 2026-03-09T19:27:15.678 INFO:tasks.workunit.client.0.vm07.stdout:5/205: read d3/dd/f22 [159803,36945] 0 2026-03-09T19:27:15.678 INFO:tasks.workunit.client.0.vm07.stdout:5/206: stat d3/dd/l2a 0 2026-03-09T19:27:15.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:15 vm07.local ceph-mon[48545]: pgmap v162: 65 pgs: 65 active+clean; 1.4 GiB data, 5.7 GiB used, 114 GiB / 120 GiB avail; 29 MiB/s rd, 122 MiB/s wr, 250 op/s 2026-03-09T19:27:15.736 INFO:tasks.workunit.client.1.vm08.stdout:2/456: write d3/d9/f20 [1411426,121314] 0 2026-03-09T19:27:15.746 INFO:tasks.workunit.client.1.vm08.stdout:5/456: dwrite d16/d1e/d3b/f43 [0,4194304] 0 2026-03-09T19:27:15.746 INFO:tasks.workunit.client.1.vm08.stdout:4/495: dwrite da/d10/d16/d28/d46/d52/d6e/d2c/f36 [0,4194304] 0 2026-03-09T19:27:15.752 INFO:tasks.workunit.client.0.vm07.stdout:8/260: creat d7/d9/f65 x:0 0 0 2026-03-09T19:27:15.759 INFO:tasks.workunit.client.1.vm08.stdout:5/457: dwrite d16/d1e/d3b/f43 [4194304,4194304] 0 2026-03-09T19:27:15.771 INFO:tasks.workunit.client.0.vm07.stdout:1/192: creat d1/d11/d37/d3f/f4a x:0 0 0 2026-03-09T19:27:15.772 
INFO:tasks.workunit.client.0.vm07.stdout:1/193: write d1/f1d [776408,35063] 0 2026-03-09T19:27:15.775 INFO:tasks.workunit.client.0.vm07.stdout:1/194: dread d1/d3/f12 [0,4194304] 0 2026-03-09T19:27:15.789 INFO:tasks.workunit.client.0.vm07.stdout:2/275: link d3/d11/f1f d3/dd/d16/d29/d2d/d45/f62 0 2026-03-09T19:27:15.796 INFO:tasks.workunit.client.0.vm07.stdout:0/213: dread - d0/f3a zero size 2026-03-09T19:27:15.799 INFO:tasks.workunit.client.0.vm07.stdout:9/279: truncate d0/d6/ff 5188380 0 2026-03-09T19:27:15.799 INFO:tasks.workunit.client.0.vm07.stdout:9/280: dwrite d0/d6/d57/f58 [0,4194304] 0 2026-03-09T19:27:15.801 INFO:tasks.workunit.client.0.vm07.stdout:9/281: dread - d0/d17/f5e zero size 2026-03-09T19:27:15.803 INFO:tasks.workunit.client.0.vm07.stdout:8/261: creat d7/d9/d37/d45/d4f/f66 x:0 0 0 2026-03-09T19:27:15.821 INFO:tasks.workunit.client.0.vm07.stdout:1/195: symlink d1/d3/d21/l4b 0 2026-03-09T19:27:15.829 INFO:tasks.workunit.client.0.vm07.stdout:6/213: creat d0/d1/f5a x:0 0 0 2026-03-09T19:27:15.831 INFO:tasks.workunit.client.0.vm07.stdout:4/200: mknod d3/d11/d2b/d37/c3d 0 2026-03-09T19:27:15.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:15 vm08.local ceph-mon[57794]: pgmap v162: 65 pgs: 65 active+clean; 1.4 GiB data, 5.7 GiB used, 114 GiB / 120 GiB avail; 29 MiB/s rd, 122 MiB/s wr, 250 op/s 2026-03-09T19:27:15.849 INFO:tasks.workunit.client.0.vm07.stdout:8/262: mkdir d7/d9/d37/d45/d56/d67 0 2026-03-09T19:27:15.849 INFO:tasks.workunit.client.0.vm07.stdout:8/263: stat d7/d9/cc 0 2026-03-09T19:27:15.855 INFO:tasks.workunit.client.0.vm07.stdout:2/276: truncate d3/f1a 291171 0 2026-03-09T19:27:15.867 INFO:tasks.workunit.client.0.vm07.stdout:6/214: creat d0/d2d/f5b x:0 0 0 2026-03-09T19:27:15.867 INFO:tasks.workunit.client.0.vm07.stdout:4/201: symlink d3/d11/d29/l3e 0 2026-03-09T19:27:15.868 INFO:tasks.workunit.client.0.vm07.stdout:4/202: write d3/d11/f35 [8980,50705] 0 2026-03-09T19:27:15.890 INFO:tasks.workunit.client.1.vm08.stdout:8/465: dwrite 
de/d1d/d2e/f9f [0,4194304] 0 2026-03-09T19:27:15.890 INFO:tasks.workunit.client.0.vm07.stdout:6/215: unlink d0/d1/db/l1f 0 2026-03-09T19:27:15.895 INFO:tasks.workunit.client.0.vm07.stdout:8/264: symlink d7/d9/d37/d45/d56/d67/l68 0 2026-03-09T19:27:15.895 INFO:tasks.workunit.client.1.vm08.stdout:6/493: chown d3/d34/d6f/f41 30296692 1 2026-03-09T19:27:15.902 INFO:tasks.workunit.client.0.vm07.stdout:1/196: creat d1/f4c x:0 0 0 2026-03-09T19:27:15.903 INFO:tasks.workunit.client.1.vm08.stdout:3/555: creat d0/d8/d24/fb1 x:0 0 0 2026-03-09T19:27:15.914 INFO:tasks.workunit.client.0.vm07.stdout:1/197: truncate d1/d3/f23 1694664 0 2026-03-09T19:27:15.914 INFO:tasks.workunit.client.0.vm07.stdout:2/277: creat d3/f63 x:0 0 0 2026-03-09T19:27:15.914 INFO:tasks.workunit.client.0.vm07.stdout:1/198: dwrite d1/f38 [0,4194304] 0 2026-03-09T19:27:15.915 INFO:tasks.workunit.client.0.vm07.stdout:1/199: chown d1/d11 16093 1 2026-03-09T19:27:15.919 INFO:tasks.workunit.client.1.vm08.stdout:9/483: getdents d0 0 2026-03-09T19:27:15.934 INFO:tasks.workunit.client.1.vm08.stdout:2/457: dread d3/d4/f49 [0,4194304] 0 2026-03-09T19:27:15.934 INFO:tasks.workunit.client.0.vm07.stdout:2/278: mkdir d3/dd/d16/d30/d64 0 2026-03-09T19:27:15.934 INFO:tasks.workunit.client.1.vm08.stdout:2/458: chown d3/d4/l34 216896316 1 2026-03-09T19:27:15.935 INFO:tasks.workunit.client.0.vm07.stdout:2/279: dread - d3/dd/d16/f48 zero size 2026-03-09T19:27:15.951 INFO:tasks.workunit.client.1.vm08.stdout:6/494: rmdir d3/d68/d7e 39 2026-03-09T19:27:15.951 INFO:tasks.workunit.client.0.vm07.stdout:2/280: dread d3/dd/f24 [0,4194304] 0 2026-03-09T19:27:15.952 INFO:tasks.workunit.client.0.vm07.stdout:2/281: write d3/dd/d16/d30/d40/f4f [307019,43805] 0 2026-03-09T19:27:15.953 INFO:tasks.workunit.client.0.vm07.stdout:2/282: write d3/dd/f1d [554609,3135] 0 2026-03-09T19:27:15.954 INFO:tasks.workunit.client.0.vm07.stdout:2/283: stat d3/dd/d16/d30/f3a 0 2026-03-09T19:27:15.954 INFO:tasks.workunit.client.1.vm08.stdout:6/495: chown 
d3/d34/d6f/cad 20049106 1 2026-03-09T19:27:15.957 INFO:tasks.workunit.client.0.vm07.stdout:2/284: dread f0 [0,4194304] 0 2026-03-09T19:27:15.957 INFO:tasks.workunit.client.1.vm08.stdout:3/556: truncate d0/d52/f5c 3647909 0 2026-03-09T19:27:15.958 INFO:tasks.workunit.client.1.vm08.stdout:4/496: symlink da/l99 0 2026-03-09T19:27:15.961 INFO:tasks.workunit.client.1.vm08.stdout:4/497: stat da/d10/d16/l48 0 2026-03-09T19:27:15.974 INFO:tasks.workunit.client.1.vm08.stdout:7/561: rename d5/d14/dae/d3a/cb9 to d5/d14/dae/d3a/cc0 0 2026-03-09T19:27:15.982 INFO:tasks.workunit.client.0.vm07.stdout:5/207: dwrite d3/d1a/f12 [0,4194304] 0 2026-03-09T19:27:15.983 INFO:tasks.workunit.client.0.vm07.stdout:3/269: rename d1/d6/c35 to d1/d1f/c4b 0 2026-03-09T19:27:15.984 INFO:tasks.workunit.client.0.vm07.stdout:4/203: rename d3 to d3/d11/d16/d2f/d3f 22 2026-03-09T19:27:15.985 INFO:tasks.workunit.client.0.vm07.stdout:3/270: write d1/f7 [1608677,92550] 0 2026-03-09T19:27:15.989 INFO:tasks.workunit.client.0.vm07.stdout:4/204: dread d3/d11/f18 [0,4194304] 0 2026-03-09T19:27:15.991 INFO:tasks.workunit.client.0.vm07.stdout:5/208: dread d3/d1a/fc [0,4194304] 0 2026-03-09T19:27:16.006 INFO:tasks.workunit.client.0.vm07.stdout:2/285: mknod d3/dd/d16/d30/d64/c65 0 2026-03-09T19:27:16.008 INFO:tasks.workunit.client.0.vm07.stdout:2/286: chown d3/dd/d16/d2f/c43 3 1 2026-03-09T19:27:16.009 INFO:tasks.workunit.client.0.vm07.stdout:2/287: stat d3/dd/d16/d30/d40/c4b 0 2026-03-09T19:27:16.011 INFO:tasks.workunit.client.1.vm08.stdout:2/459: chown d3/d9/d79/d46/d8c/fa5 236923652 1 2026-03-09T19:27:16.013 INFO:tasks.workunit.client.0.vm07.stdout:1/200: rename d1/d11/f24 to d1/d11/d37/d3f/d45/d30/f4d 0 2026-03-09T19:27:16.014 INFO:tasks.workunit.client.0.vm07.stdout:2/288: rename d3 to d3/dd/d66 22 2026-03-09T19:27:16.017 INFO:tasks.workunit.client.1.vm08.stdout:6/496: creat d3/db/fb0 x:0 0 0 2026-03-09T19:27:16.026 INFO:tasks.workunit.client.0.vm07.stdout:1/201: dwrite d1/f4c [0,4194304] 0 
2026-03-09T19:27:16.027 INFO:tasks.workunit.client.0.vm07.stdout:5/209: creat d3/d1a/d28/d40/f4a x:0 0 0 2026-03-09T19:27:16.029 INFO:tasks.workunit.client.0.vm07.stdout:5/210: dwrite d3/d1a/fc [0,4194304] 0 2026-03-09T19:27:16.030 INFO:tasks.workunit.client.0.vm07.stdout:7/255: unlink d0/d4/d5/d8/f10 0 2026-03-09T19:27:16.031 INFO:tasks.workunit.client.1.vm08.stdout:7/562: dread d5/d14/d2b/d4b/f66 [0,4194304] 0 2026-03-09T19:27:16.035 INFO:tasks.workunit.client.0.vm07.stdout:7/256: read d0/d4/d5/d26/f42 [135792,19065] 0 2026-03-09T19:27:16.036 INFO:tasks.workunit.client.0.vm07.stdout:7/257: write d0/d4/d5/d8/d1a/f4d [894712,5033] 0 2026-03-09T19:27:16.044 INFO:tasks.workunit.client.0.vm07.stdout:4/205: dread d3/f13 [0,4194304] 0 2026-03-09T19:27:16.045 INFO:tasks.workunit.client.0.vm07.stdout:4/206: truncate d3/d11/f35 107166 0 2026-03-09T19:27:16.045 INFO:tasks.workunit.client.0.vm07.stdout:4/207: chown d3/d11/d29 15 1 2026-03-09T19:27:16.046 INFO:tasks.workunit.client.0.vm07.stdout:0/214: link d0/d6/l25 d0/d6/d13/l46 0 2026-03-09T19:27:16.047 INFO:tasks.workunit.client.0.vm07.stdout:0/215: truncate d0/d6/d13/d33/f39 837937 0 2026-03-09T19:27:16.049 INFO:tasks.workunit.client.0.vm07.stdout:2/289: write f0 [4652903,98663] 0 2026-03-09T19:27:16.056 INFO:tasks.workunit.client.1.vm08.stdout:1/635: dwrite d9/da/dc/f31 [4194304,4194304] 0 2026-03-09T19:27:16.064 INFO:tasks.workunit.client.0.vm07.stdout:6/216: symlink d0/d13/l5c 0 2026-03-09T19:27:16.071 INFO:tasks.workunit.client.0.vm07.stdout:5/211: rename d3/d1a/d28/d40/f4a to d3/dd/d26/d3f/f4b 0 2026-03-09T19:27:16.071 INFO:tasks.workunit.client.0.vm07.stdout:1/202: dwrite d1/d11/d37/d3f/d45/d30/f4d [0,4194304] 0 2026-03-09T19:27:16.075 INFO:tasks.workunit.client.1.vm08.stdout:2/460: symlink d3/d4/d23/d2c/d39/d5e/de/d18/d1f/d97/la6 0 2026-03-09T19:27:16.079 INFO:tasks.workunit.client.0.vm07.stdout:5/212: chown d3/dd/d26/d2d/l41 976 1 2026-03-09T19:27:16.084 INFO:tasks.workunit.client.1.vm08.stdout:6/497: getdents 
d3/d94 0 2026-03-09T19:27:16.090 INFO:tasks.workunit.client.0.vm07.stdout:2/290: dread d3/dd/d16/d29/d2d/d45/f62 [0,4194304] 0 2026-03-09T19:27:16.090 INFO:tasks.workunit.client.0.vm07.stdout:2/291: readlink d3/d11/l54 0 2026-03-09T19:27:16.090 INFO:tasks.workunit.client.0.vm07.stdout:5/213: dwrite d3/f19 [0,4194304] 0 2026-03-09T19:27:16.090 INFO:tasks.workunit.client.0.vm07.stdout:5/214: dwrite d3/d1a/f12 [4194304,4194304] 0 2026-03-09T19:27:16.090 INFO:tasks.workunit.client.1.vm08.stdout:6/498: dwrite d3/d34/d3b/f8d [0,4194304] 0 2026-03-09T19:27:16.090 INFO:tasks.workunit.client.1.vm08.stdout:6/499: chown d3/d34/f37 823 1 2026-03-09T19:27:16.110 INFO:tasks.workunit.client.1.vm08.stdout:7/563: creat d5/d14/d38/dad/fc1 x:0 0 0 2026-03-09T19:27:16.119 INFO:tasks.workunit.client.0.vm07.stdout:8/265: truncate d7/d1d/f3d 429679 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.1.vm08.stdout:3/557: getdents d0/d52/d6d 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.1.vm08.stdout:8/466: write de/d25/d33/d46/f50 [404898,70365] 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.1.vm08.stdout:0/507: dwrite dd/f7a [0,4194304] 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.1.vm08.stdout:0/508: dwrite dd/d22/d63/d6e/d72/f8f [0,4194304] 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.1.vm08.stdout:8/467: dwrite de/d1d/d21/d73/fa6 [0,4194304] 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.1.vm08.stdout:1/636: creat d9/da/d53/d67/d6c/fbc x:0 0 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.1.vm08.stdout:7/564: symlink d5/d14/dae/d3a/d42/d6a/lc2 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.1.vm08.stdout:8/468: write de/d1d/d21/d73/fa6 [3847777,52775] 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.0.vm07.stdout:8/266: read - d7/d9/f65 zero size 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.0.vm07.stdout:6/217: truncate d0/d13/f1c 184181 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.0.vm07.stdout:1/203: creat d1/d3/f4e x:0 
0 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.0.vm07.stdout:1/204: chown d1/d11/d37/f40 1621 1 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.0.vm07.stdout:9/282: write d0/d6/ff [1096750,126083] 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.0.vm07.stdout:7/258: mkdir d0/d52/d54/d55/d57 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.0.vm07.stdout:4/208: symlink d3/d11/d2b/d38/l40 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.0.vm07.stdout:4/209: chown d3/d11/d2b/d37 1739814 1 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.0.vm07.stdout:4/210: stat d3/d11/d16 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.0.vm07.stdout:6/218: truncate d0/d1/db/d17/f1a 3265877 0 2026-03-09T19:27:16.147 INFO:tasks.workunit.client.0.vm07.stdout:6/219: truncate d0/d2d/f4a 459749 0 2026-03-09T19:27:16.148 INFO:tasks.workunit.client.1.vm08.stdout:0/509: mknod dd/d22/d24/d49/d50/ca3 0 2026-03-09T19:27:16.153 INFO:tasks.workunit.client.1.vm08.stdout:5/458: dwrite d16/f34 [0,4194304] 0 2026-03-09T19:27:16.160 INFO:tasks.workunit.client.0.vm07.stdout:1/205: mkdir d1/db/d31/d4f 0 2026-03-09T19:27:16.167 INFO:tasks.workunit.client.0.vm07.stdout:1/206: truncate d1/d11/f42 769644 0 2026-03-09T19:27:16.167 INFO:tasks.workunit.client.0.vm07.stdout:9/283: creat d0/db/d29/f67 x:0 0 0 2026-03-09T19:27:16.167 INFO:tasks.workunit.client.0.vm07.stdout:5/215: symlink d3/dd/l4c 0 2026-03-09T19:27:16.168 INFO:tasks.workunit.client.1.vm08.stdout:7/565: truncate d5/d14/d2b/f9f 370277 0 2026-03-09T19:27:16.169 INFO:tasks.workunit.client.0.vm07.stdout:7/259: mkdir d0/d4/d5/d26/d3c/d58 0 2026-03-09T19:27:16.171 INFO:tasks.workunit.client.1.vm08.stdout:0/510: creat dd/d22/d24/d49/d98/fa4 x:0 0 0 2026-03-09T19:27:16.172 INFO:tasks.workunit.client.1.vm08.stdout:6/500: getdents d3/d34/da9 0 2026-03-09T19:27:16.173 INFO:tasks.workunit.client.1.vm08.stdout:6/501: chown d3/d55/c78 2 1 2026-03-09T19:27:16.177 INFO:tasks.workunit.client.0.vm07.stdout:4/211: mkdir d3/d11/d29/d41 0 
2026-03-09T19:27:16.185 INFO:tasks.workunit.client.0.vm07.stdout:1/207: mkdir d1/d11/d37/d3f/d45/d30/d50 0 2026-03-09T19:27:16.186 INFO:tasks.workunit.client.1.vm08.stdout:6/502: truncate d3/d34/d6f/f4f 4668305 0 2026-03-09T19:27:16.203 INFO:tasks.workunit.client.1.vm08.stdout:7/566: link d5/d14/f1e d5/d14/dae/d1c/d73/fc3 0 2026-03-09T19:27:16.203 INFO:tasks.workunit.client.1.vm08.stdout:0/511: unlink dd/d22/d27/d2e/d37/l84 0 2026-03-09T19:27:16.203 INFO:tasks.workunit.client.1.vm08.stdout:6/503: readlink d3/d68/d7e/la6 0 2026-03-09T19:27:16.203 INFO:tasks.workunit.client.0.vm07.stdout:9/284: mkdir d0/db/d29/d68 0 2026-03-09T19:27:16.203 INFO:tasks.workunit.client.0.vm07.stdout:1/208: dwrite d1/d3/f4e [0,4194304] 0 2026-03-09T19:27:16.203 INFO:tasks.workunit.client.0.vm07.stdout:1/209: chown d1/d3/d21/f47 399242797 1 2026-03-09T19:27:16.203 INFO:tasks.workunit.client.0.vm07.stdout:1/210: write d1/d3e/f49 [801491,4248] 0 2026-03-09T19:27:16.203 INFO:tasks.workunit.client.0.vm07.stdout:9/285: fdatasync d0/db/d29/d4d/f65 0 2026-03-09T19:27:16.203 INFO:tasks.workunit.client.0.vm07.stdout:9/286: readlink d0/d17/l50 0 2026-03-09T19:27:16.203 INFO:tasks.workunit.client.0.vm07.stdout:1/211: truncate d1/d11/d37/d3f/f4a 527475 0 2026-03-09T19:27:16.203 INFO:tasks.workunit.client.0.vm07.stdout:1/212: dwrite d1/f2f [0,4194304] 0 2026-03-09T19:27:16.204 INFO:tasks.workunit.client.0.vm07.stdout:4/212: creat d3/d11/d29/f42 x:0 0 0 2026-03-09T19:27:16.205 INFO:tasks.workunit.client.1.vm08.stdout:7/567: mkdir d5/dc4 0 2026-03-09T19:27:16.205 INFO:tasks.workunit.client.0.vm07.stdout:2/292: sync 2026-03-09T19:27:16.205 INFO:tasks.workunit.client.0.vm07.stdout:8/267: sync 2026-03-09T19:27:16.207 INFO:tasks.workunit.client.1.vm08.stdout:0/512: chown dd/d22/d27/d2e/d37/f44 540 1 2026-03-09T19:27:16.207 INFO:tasks.workunit.client.1.vm08.stdout:0/513: fsync dd/d22/d27/f9f 0 2026-03-09T19:27:16.208 INFO:tasks.workunit.client.1.vm08.stdout:0/514: chown dd/d22 1256018898 1 
2026-03-09T19:27:16.209 INFO:tasks.workunit.client.0.vm07.stdout:9/287: truncate d0/d17/f1f 3986212 0 2026-03-09T19:27:16.213 INFO:tasks.workunit.client.0.vm07.stdout:4/213: dwrite d3/d11/f12 [0,4194304] 0 2026-03-09T19:27:16.223 INFO:tasks.workunit.client.0.vm07.stdout:3/271: read d1/d1f/f1a [1028245,60844] 0 2026-03-09T19:27:16.226 INFO:tasks.workunit.client.0.vm07.stdout:3/272: dread d1/d1f/f1a [4194304,4194304] 0 2026-03-09T19:27:16.232 INFO:tasks.workunit.client.1.vm08.stdout:0/515: creat dd/d22/d27/d4f/d6f/fa5 x:0 0 0 2026-03-09T19:27:16.238 INFO:tasks.workunit.client.0.vm07.stdout:7/260: dread d0/d4/d5/f36 [0,4194304] 0 2026-03-09T19:27:16.246 INFO:tasks.workunit.client.0.vm07.stdout:2/293: chown d3/dd/d16/d2f/c61 14604572 1 2026-03-09T19:27:16.254 INFO:tasks.workunit.client.1.vm08.stdout:0/516: mknod dd/d22/ca6 0 2026-03-09T19:27:16.254 INFO:tasks.workunit.client.0.vm07.stdout:2/294: write d3/f5 [3854814,43474] 0 2026-03-09T19:27:16.267 INFO:tasks.workunit.client.0.vm07.stdout:4/214: symlink d3/d11/d29/d34/l43 0 2026-03-09T19:27:16.271 INFO:tasks.workunit.client.0.vm07.stdout:3/273: mkdir d1/d6/d4c 0 2026-03-09T19:27:16.279 INFO:tasks.workunit.client.0.vm07.stdout:7/261: rename d0/d4/d5/d8/l3d to d0/d4/l59 0 2026-03-09T19:27:16.286 INFO:tasks.workunit.client.0.vm07.stdout:8/268: getdents d7/d30/d32 0 2026-03-09T19:27:16.289 INFO:tasks.workunit.client.1.vm08.stdout:4/498: sync 2026-03-09T19:27:16.289 INFO:tasks.workunit.client.0.vm07.stdout:7/262: dread d0/d4/d5/d8/d1a/d2a/f34 [0,4194304] 0 2026-03-09T19:27:16.295 INFO:tasks.workunit.client.0.vm07.stdout:1/213: creat d1/f51 x:0 0 0 2026-03-09T19:27:16.297 INFO:tasks.workunit.client.0.vm07.stdout:0/216: getdents d0/d6/d13 0 2026-03-09T19:27:16.298 INFO:tasks.workunit.client.0.vm07.stdout:3/274: creat d1/d6/dd/f4d x:0 0 0 2026-03-09T19:27:16.301 INFO:tasks.workunit.client.0.vm07.stdout:3/275: dwrite d1/d6/fb [4194304,4194304] 0 2026-03-09T19:27:16.303 INFO:tasks.workunit.client.1.vm08.stdout:3/558: sync 
2026-03-09T19:27:16.303 INFO:tasks.workunit.client.1.vm08.stdout:7/568: sync 2026-03-09T19:27:16.310 INFO:tasks.workunit.client.1.vm08.stdout:7/569: truncate d5/d14/dae/d3a/d42/d85/da0/fa4 288164 0 2026-03-09T19:27:16.310 INFO:tasks.workunit.client.0.vm07.stdout:4/215: truncate d3/d11/d2b/d37/f28 1243940 0 2026-03-09T19:27:16.326 INFO:tasks.workunit.client.1.vm08.stdout:6/504: dread - d3/db/fb0 zero size 2026-03-09T19:27:16.341 INFO:tasks.workunit.client.1.vm08.stdout:4/499: truncate da/d10/d26/d38/f57 467958 0 2026-03-09T19:27:16.351 INFO:tasks.workunit.client.0.vm07.stdout:5/216: dread d3/f25 [0,4194304] 0 2026-03-09T19:27:16.351 INFO:tasks.workunit.client.0.vm07.stdout:5/217: dread - d3/d1a/d28/d40/f49 zero size 2026-03-09T19:27:16.352 INFO:tasks.workunit.client.1.vm08.stdout:3/559: truncate d0/d6/de/d1b/d16/d17/f71 957768 0 2026-03-09T19:27:16.361 INFO:tasks.workunit.client.1.vm08.stdout:7/570: fdatasync d5/d14/dae/d1c/d73/fb3 0 2026-03-09T19:27:16.369 INFO:tasks.workunit.client.1.vm08.stdout:9/484: truncate d0/d2/d8/fe 1069149 0 2026-03-09T19:27:16.389 INFO:tasks.workunit.client.0.vm07.stdout:2/295: link d3/dd/d16/d29/d2d/d45/f55 d3/dd/d16/d30/f67 0 2026-03-09T19:27:16.402 INFO:tasks.workunit.client.1.vm08.stdout:2/461: write d3/d4/d23/d2c/d39/d5e/de/d18/f1a [617481,78188] 0 2026-03-09T19:27:16.411 INFO:tasks.workunit.client.1.vm08.stdout:9/485: mkdir d0/d1b/d4e/da7 0 2026-03-09T19:27:16.419 INFO:tasks.workunit.client.0.vm07.stdout:7/263: mkdir d0/d52/d54/d5a 0 2026-03-09T19:27:16.429 INFO:tasks.workunit.client.1.vm08.stdout:6/505: unlink d3/f1b 0 2026-03-09T19:27:16.431 INFO:tasks.workunit.client.0.vm07.stdout:1/214: mkdir d1/d3/d52 0 2026-03-09T19:27:16.447 INFO:tasks.workunit.client.0.vm07.stdout:3/276: readlink d1/d1f/d16/l42 0 2026-03-09T19:27:16.448 INFO:tasks.workunit.client.0.vm07.stdout:3/277: dread - d1/d6/dd/f4d zero size 2026-03-09T19:27:16.451 INFO:tasks.workunit.client.0.vm07.stdout:4/216: creat d3/d11/d16/d2f/f44 x:0 0 0 2026-03-09T19:27:16.471 
INFO:tasks.workunit.client.0.vm07.stdout:5/218: dread d3/d1a/d28/f2e [0,4194304] 0 2026-03-09T19:27:16.473 INFO:tasks.workunit.client.1.vm08.stdout:1/637: write d9/da/d2d/f41 [1260215,77391] 0 2026-03-09T19:27:16.477 INFO:tasks.workunit.client.0.vm07.stdout:5/219: read d3/dd/f22 [254651,28170] 0 2026-03-09T19:27:16.487 INFO:tasks.workunit.client.0.vm07.stdout:8/269: creat d7/d16/f69 x:0 0 0 2026-03-09T19:27:16.487 INFO:tasks.workunit.client.1.vm08.stdout:8/469: dwrite f1 [0,4194304] 0 2026-03-09T19:27:16.491 INFO:tasks.workunit.client.1.vm08.stdout:3/560: link d0/d6/de/d15/ca2 d0/d52/d6d/d77/d88/cb2 0 2026-03-09T19:27:16.491 INFO:tasks.workunit.client.1.vm08.stdout:6/506: chown d3/d34/d5c/da2/l66 1810889 1 2026-03-09T19:27:16.506 INFO:tasks.workunit.client.1.vm08.stdout:5/459: write d16/d1e/f57 [1219489,53999] 0 2026-03-09T19:27:16.506 INFO:tasks.workunit.client.0.vm07.stdout:7/264: symlink d0/d52/d54/l5b 0 2026-03-09T19:27:16.507 INFO:tasks.workunit.client.0.vm07.stdout:6/220: write d0/d1/f19 [2443419,9462] 0 2026-03-09T19:27:16.508 INFO:tasks.workunit.client.0.vm07.stdout:6/221: chown d0/d1 3756 1 2026-03-09T19:27:16.511 INFO:tasks.workunit.client.1.vm08.stdout:9/486: fsync d0/d1b/f4b 0 2026-03-09T19:27:16.514 INFO:tasks.workunit.client.0.vm07.stdout:1/215: fsync d1/d11/d37/d3f/d45/f3b 0 2026-03-09T19:27:16.519 INFO:tasks.workunit.client.1.vm08.stdout:9/487: chown d0/d2/c3d 44 1 2026-03-09T19:27:16.519 INFO:tasks.workunit.client.1.vm08.stdout:9/488: truncate d0/d2/d80/d69/f7a 761700 0 2026-03-09T19:27:16.525 INFO:tasks.workunit.client.1.vm08.stdout:1/638: dread d9/d40/f57 [0,4194304] 0 2026-03-09T19:27:16.526 INFO:tasks.workunit.client.1.vm08.stdout:1/639: write d9/d11/f44 [2168457,105110] 0 2026-03-09T19:27:16.529 INFO:tasks.workunit.client.0.vm07.stdout:3/278: mknod d1/d1f/d16/c4e 0 2026-03-09T19:27:16.531 INFO:tasks.workunit.client.1.vm08.stdout:1/640: stat d9/da/d53/c55 0 2026-03-09T19:27:16.531 INFO:tasks.workunit.client.1.vm08.stdout:1/641: fsync 
d9/da/d12/d91/fb4 0 2026-03-09T19:27:16.535 INFO:tasks.workunit.client.0.vm07.stdout:4/217: symlink d3/d11/d16/l45 0 2026-03-09T19:27:16.543 INFO:tasks.workunit.client.1.vm08.stdout:2/462: creat d3/d4/fa7 x:0 0 0 2026-03-09T19:27:16.543 INFO:tasks.workunit.client.1.vm08.stdout:1/642: dwrite d9/d11/d7a/d89/d8d/da3/fab [0,4194304] 0 2026-03-09T19:27:16.558 INFO:tasks.workunit.client.1.vm08.stdout:7/571: getdents d5/d14/d38/dad 0 2026-03-09T19:27:16.559 INFO:tasks.workunit.client.0.vm07.stdout:9/288: write d0/d6/f10 [2045849,4947] 0 2026-03-09T19:27:16.567 INFO:tasks.workunit.client.0.vm07.stdout:8/270: symlink d7/d9/d57/l6a 0 2026-03-09T19:27:16.572 INFO:tasks.workunit.client.0.vm07.stdout:8/271: dwrite d7/d9/d10/d44/f48 [0,4194304] 0 2026-03-09T19:27:16.580 INFO:tasks.workunit.client.0.vm07.stdout:6/222: mknod d0/d2d/c5d 0 2026-03-09T19:27:16.592 INFO:tasks.workunit.client.1.vm08.stdout:6/507: creat d3/db/d43/d69/fb1 x:0 0 0 2026-03-09T19:27:16.595 INFO:tasks.workunit.client.1.vm08.stdout:3/561: mknod d0/d6/dad/cb3 0 2026-03-09T19:27:16.595 INFO:tasks.workunit.client.0.vm07.stdout:1/216: dread d1/d11/d37/d3f/d45/f26 [0,4194304] 0 2026-03-09T19:27:16.595 INFO:tasks.workunit.client.0.vm07.stdout:1/217: write d1/d11/d37/d3f/d45/f15 [3465727,69061] 0 2026-03-09T19:27:16.602 INFO:tasks.workunit.client.0.vm07.stdout:6/223: sync 2026-03-09T19:27:16.603 INFO:tasks.workunit.client.0.vm07.stdout:6/224: write d0/d1/db/f15 [2400026,102257] 0 2026-03-09T19:27:16.604 INFO:tasks.workunit.client.0.vm07.stdout:6/225: read - d0/d1/f5a zero size 2026-03-09T19:27:16.609 INFO:tasks.workunit.client.0.vm07.stdout:4/218: mknod d3/d11/d29/d34/c46 0 2026-03-09T19:27:16.613 INFO:tasks.workunit.client.1.vm08.stdout:4/500: dwrite da/d10/d1b/f79 [0,4194304] 0 2026-03-09T19:27:16.629 INFO:tasks.workunit.client.1.vm08.stdout:1/643: rmdir d9/d11 39 2026-03-09T19:27:16.637 INFO:tasks.workunit.client.1.vm08.stdout:9/489: mkdir d0/d1b/d68/d7f/d8c/da2/da8 0 2026-03-09T19:27:16.641 
INFO:tasks.workunit.client.0.vm07.stdout:2/296: write d3/fa [3132000,34648] 0 2026-03-09T19:27:16.660 INFO:tasks.workunit.client.1.vm08.stdout:6/508: unlink d3/db/l9a 0 2026-03-09T19:27:16.660 INFO:tasks.workunit.client.1.vm08.stdout:3/562: mknod d0/d6/de/d1a/cb4 0 2026-03-09T19:27:16.660 INFO:tasks.workunit.client.1.vm08.stdout:5/460: dwrite d16/d1e/d3b/f68 [0,4194304] 0 2026-03-09T19:27:16.660 INFO:tasks.workunit.client.0.vm07.stdout:7/265: symlink d0/d4/d5/d26/d3c/d58/l5c 0 2026-03-09T19:27:16.660 INFO:tasks.workunit.client.0.vm07.stdout:8/272: creat d7/d30/d32/f6b x:0 0 0 2026-03-09T19:27:16.660 INFO:tasks.workunit.client.0.vm07.stdout:8/273: write d7/d9/fd [825442,83660] 0 2026-03-09T19:27:16.660 INFO:tasks.workunit.client.0.vm07.stdout:0/217: rename d0/c1a to d0/d6/d13/d1c/c47 0 2026-03-09T19:27:16.660 INFO:tasks.workunit.client.0.vm07.stdout:6/226: mknod d0/d1/db/d24/c5e 0 2026-03-09T19:27:16.660 INFO:tasks.workunit.client.0.vm07.stdout:7/266: dread d0/d4/d5/d26/d32/f56 [0,4194304] 0 2026-03-09T19:27:16.661 INFO:tasks.workunit.client.0.vm07.stdout:7/267: read d0/d4/fc [1982284,113922] 0 2026-03-09T19:27:16.662 INFO:tasks.workunit.client.0.vm07.stdout:7/268: chown d0/d4/d5/d8/d1a/f4d 130 1 2026-03-09T19:27:16.662 INFO:tasks.workunit.client.0.vm07.stdout:7/269: chown d0/d4/d5/dd/l19 43656 1 2026-03-09T19:27:16.671 INFO:tasks.workunit.client.0.vm07.stdout:9/289: mkdir d0/db/d29/d32/d5c/d69 0 2026-03-09T19:27:16.691 INFO:tasks.workunit.client.0.vm07.stdout:5/220: creat d3/f4d x:0 0 0 2026-03-09T19:27:16.691 INFO:tasks.workunit.client.0.vm07.stdout:2/297: mknod d3/dd/d16/d29/d2d/d45/d3b/c68 0 2026-03-09T19:27:16.691 INFO:tasks.workunit.client.0.vm07.stdout:8/274: dread - d7/d1d/f4d zero size 2026-03-09T19:27:16.691 INFO:tasks.workunit.client.0.vm07.stdout:1/218: mknod d1/d11/d37/d3f/d45/d30/d50/c53 0 2026-03-09T19:27:16.693 INFO:tasks.workunit.client.0.vm07.stdout:6/227: creat d0/d13/f5f x:0 0 0 2026-03-09T19:27:16.696 
INFO:tasks.workunit.client.0.vm07.stdout:4/219: getdents d3/d11/d29/d41 0 2026-03-09T19:27:16.698 INFO:tasks.workunit.client.0.vm07.stdout:7/270: creat d0/d4/d5/d26/d3c/d58/f5d x:0 0 0 2026-03-09T19:27:16.699 INFO:tasks.workunit.client.1.vm08.stdout:8/470: creat de/d25/d31/d82/fa9 x:0 0 0 2026-03-09T19:27:16.703 INFO:tasks.workunit.client.1.vm08.stdout:2/463: symlink d3/d9/d79/d46/la8 0 2026-03-09T19:27:16.716 INFO:tasks.workunit.client.0.vm07.stdout:2/298: mknod d3/dd/d16/d29/d2d/d45/c69 0 2026-03-09T19:27:16.717 INFO:tasks.workunit.client.0.vm07.stdout:8/275: unlink d7/d9/d10/c23 0 2026-03-09T19:27:16.717 INFO:tasks.workunit.client.0.vm07.stdout:8/276: write d7/d16/f69 [210137,36952] 0 2026-03-09T19:27:16.717 INFO:tasks.workunit.client.0.vm07.stdout:6/228: unlink d0/f4f 0 2026-03-09T19:27:16.719 INFO:tasks.workunit.client.1.vm08.stdout:6/509: unlink d3/d34/f76 0 2026-03-09T19:27:16.728 INFO:tasks.workunit.client.0.vm07.stdout:4/220: mknod d3/d11/d16/d2f/c47 0 2026-03-09T19:27:16.728 INFO:tasks.workunit.client.0.vm07.stdout:7/271: creat d0/d52/d54/f5e x:0 0 0 2026-03-09T19:27:16.728 INFO:tasks.workunit.client.1.vm08.stdout:3/563: creat d0/d6/de/d6e/d51/fb5 x:0 0 0 2026-03-09T19:27:16.728 INFO:tasks.workunit.client.1.vm08.stdout:8/471: unlink de/d25/d33/l7d 0 2026-03-09T19:27:16.729 INFO:tasks.workunit.client.0.vm07.stdout:2/299: rmdir d3/d11/d38 39 2026-03-09T19:27:16.729 INFO:tasks.workunit.client.1.vm08.stdout:3/564: read d0/d6/de/d1b/d16/f7b [105660,123997] 0 2026-03-09T19:27:16.732 INFO:tasks.workunit.client.1.vm08.stdout:3/565: dread - d0/d6/de/d6e/f83 zero size 2026-03-09T19:27:16.738 INFO:tasks.workunit.client.1.vm08.stdout:2/464: rename d3/d4/d23/d2c/d39/d5e/de/d18/d1f/d97 to d3/d4/d23/d2c/d39/d5e/de/d18/da9 0 2026-03-09T19:27:16.742 INFO:tasks.workunit.client.1.vm08.stdout:3/566: dread d0/d6/de/d15/f53 [0,4194304] 0 2026-03-09T19:27:16.743 INFO:tasks.workunit.client.0.vm07.stdout:8/277: creat d7/d9/d10/d44/f6c x:0 0 0 2026-03-09T19:27:16.748 
INFO:tasks.workunit.client.0.vm07.stdout:3/279: dwrite d1/d1f/f36 [0,4194304] 0 2026-03-09T19:27:16.748 INFO:tasks.workunit.client.0.vm07.stdout:0/218: write d0/f1e [301893,127249] 0 2026-03-09T19:27:16.748 INFO:tasks.workunit.client.0.vm07.stdout:1/219: rmdir d1/d11 39 2026-03-09T19:27:16.758 INFO:tasks.workunit.client.0.vm07.stdout:3/280: write d1/d1f/d16/f39 [1136338,5057] 0 2026-03-09T19:27:16.764 INFO:tasks.workunit.client.1.vm08.stdout:9/490: write d0/f13 [2310951,55294] 0 2026-03-09T19:27:16.775 INFO:tasks.workunit.client.1.vm08.stdout:5/461: unlink d16/f2b 0 2026-03-09T19:27:16.775 INFO:tasks.workunit.client.1.vm08.stdout:9/491: chown d0/d1b/d68/d7f/d8c 207069345 1 2026-03-09T19:27:16.775 INFO:tasks.workunit.client.1.vm08.stdout:5/462: chown d16/d1e/c22 195 1 2026-03-09T19:27:16.776 INFO:tasks.workunit.client.0.vm07.stdout:4/221: mknod d3/d11/d2b/d38/c48 0 2026-03-09T19:27:16.776 INFO:tasks.workunit.client.0.vm07.stdout:7/272: rmdir d0 39 2026-03-09T19:27:16.776 INFO:tasks.workunit.client.0.vm07.stdout:9/290: rename d0/db/d29/d32/c64 to d0/db/d29/d2c/c6a 0 2026-03-09T19:27:16.776 INFO:tasks.workunit.client.0.vm07.stdout:5/221: creat d3/f4e x:0 0 0 2026-03-09T19:27:16.776 INFO:tasks.workunit.client.0.vm07.stdout:2/300: mknod d3/dd/d16/d29/c6a 0 2026-03-09T19:27:16.776 INFO:tasks.workunit.client.0.vm07.stdout:8/278: creat d7/d50/f6d x:0 0 0 2026-03-09T19:27:16.776 INFO:tasks.workunit.client.0.vm07.stdout:8/279: write d7/d9/d10/f1b [4117205,42936] 0 2026-03-09T19:27:16.776 INFO:tasks.workunit.client.0.vm07.stdout:5/222: dread d3/d1a/fc [0,4194304] 0 2026-03-09T19:27:16.776 INFO:tasks.workunit.client.0.vm07.stdout:5/223: stat d3/d1a/f10 0 2026-03-09T19:27:16.776 INFO:tasks.workunit.client.0.vm07.stdout:0/219: symlink d0/d6/d13/d33/l48 0 2026-03-09T19:27:16.776 INFO:tasks.workunit.client.0.vm07.stdout:1/220: read d1/f6 [113491,93444] 0 2026-03-09T19:27:16.777 INFO:tasks.workunit.client.0.vm07.stdout:3/281: mkdir d1/d26/d4f 0 2026-03-09T19:27:16.777 
INFO:tasks.workunit.client.0.vm07.stdout:3/282: chown d1/d6/d4c 217877 1 2026-03-09T19:27:16.780 INFO:tasks.workunit.client.0.vm07.stdout:6/229: creat d0/d1/db/d17/d4c/f60 x:0 0 0 2026-03-09T19:27:16.780 INFO:tasks.workunit.client.1.vm08.stdout:2/465: chown d3/d4/d23/d2c/d39/c66 15461 1 2026-03-09T19:27:16.781 INFO:tasks.workunit.client.1.vm08.stdout:2/466: chown d3/d9 8764065 1 2026-03-09T19:27:16.785 INFO:tasks.workunit.client.1.vm08.stdout:4/501: rmdir da/d10/d26/d3a/d69/d75 39 2026-03-09T19:27:16.785 INFO:tasks.workunit.client.0.vm07.stdout:9/291: sync 2026-03-09T19:27:16.785 INFO:tasks.workunit.client.0.vm07.stdout:3/283: sync 2026-03-09T19:27:16.789 INFO:tasks.workunit.client.1.vm08.stdout:3/567: creat d0/d6/dad/fb6 x:0 0 0 2026-03-09T19:27:16.794 INFO:tasks.workunit.client.0.vm07.stdout:4/222: creat d3/d11/d2b/f49 x:0 0 0 2026-03-09T19:27:16.795 INFO:tasks.workunit.client.0.vm07.stdout:4/223: chown d3/d11/d16/l45 1932 1 2026-03-09T19:27:16.796 INFO:tasks.workunit.client.1.vm08.stdout:7/572: write d5/d14/d2b/f9f [602473,84472] 0 2026-03-09T19:27:16.797 INFO:tasks.workunit.client.1.vm08.stdout:1/644: symlink d9/lbd 0 2026-03-09T19:27:16.803 INFO:tasks.workunit.client.1.vm08.stdout:7/573: dread d5/d14/d2b/f9f [0,4194304] 0 2026-03-09T19:27:16.804 INFO:tasks.workunit.client.1.vm08.stdout:7/574: chown d5/d14/d2b/d4b/f66 22133 1 2026-03-09T19:27:16.808 INFO:tasks.workunit.client.0.vm07.stdout:2/301: rename d3/dd/d16/d29/d2d/c3d to d3/dd/d16/d29/c6b 0 2026-03-09T19:27:16.808 INFO:tasks.workunit.client.1.vm08.stdout:7/575: read d5/d14/dae/f6b [3025664,9664] 0 2026-03-09T19:27:16.809 INFO:tasks.workunit.client.0.vm07.stdout:2/302: sync 2026-03-09T19:27:16.809 INFO:tasks.workunit.client.1.vm08.stdout:7/576: chown d5/d14/d2b/f37 20671 1 2026-03-09T19:27:16.812 INFO:tasks.workunit.client.0.vm07.stdout:8/280: creat d7/d16/d1e/f6e x:0 0 0 2026-03-09T19:27:16.812 INFO:tasks.workunit.client.1.vm08.stdout:6/510: creat d3/d94/fb2 x:0 0 0 2026-03-09T19:27:16.813 
INFO:tasks.workunit.client.1.vm08.stdout:9/492: chown d0/d1b/f9f 24 1 2026-03-09T19:27:16.815 INFO:tasks.workunit.client.1.vm08.stdout:9/493: chown d0/d1b/d68/d7f/d8c/da2 7533174 1 2026-03-09T19:27:16.816 INFO:tasks.workunit.client.0.vm07.stdout:8/281: dwrite d7/d9/d37/d34/f55 [0,4194304] 0 2026-03-09T19:27:16.818 INFO:tasks.workunit.client.1.vm08.stdout:8/472: creat de/d47/faa x:0 0 0 2026-03-09T19:27:16.820 INFO:tasks.workunit.client.0.vm07.stdout:0/220: fdatasync d0/d6/d13/d17/d19/f1f 0 2026-03-09T19:27:16.822 INFO:tasks.workunit.client.1.vm08.stdout:5/463: rmdir d16/d1e/d6e 39 2026-03-09T19:27:16.825 INFO:tasks.workunit.client.1.vm08.stdout:5/464: dread d16/f34 [0,4194304] 0 2026-03-09T19:27:16.829 INFO:tasks.workunit.client.0.vm07.stdout:6/230: symlink d0/d2d/l61 0 2026-03-09T19:27:16.832 INFO:tasks.workunit.client.1.vm08.stdout:4/502: creat da/d10/d16/d28/d2f/d4f/d56/f9a x:0 0 0 2026-03-09T19:27:16.832 INFO:tasks.workunit.client.0.vm07.stdout:1/221: dwrite d1/d11/d37/f2c [0,4194304] 0 2026-03-09T19:27:16.838 INFO:tasks.workunit.client.1.vm08.stdout:3/568: unlink d0/d6/de/d15/c76 0 2026-03-09T19:27:16.840 INFO:tasks.workunit.client.0.vm07.stdout:7/273: mknod d0/d4/d5/d8/d1a/d2a/c5f 0 2026-03-09T19:27:16.841 INFO:tasks.workunit.client.0.vm07.stdout:7/274: chown d0/d4/d5/dd/f47 7 1 2026-03-09T19:27:16.844 INFO:tasks.workunit.client.0.vm07.stdout:7/275: dwrite d0/f13 [0,4194304] 0 2026-03-09T19:27:16.844 INFO:tasks.workunit.client.1.vm08.stdout:5/465: dwrite d16/d45/f46 [0,4194304] 0 2026-03-09T19:27:16.846 INFO:tasks.workunit.client.1.vm08.stdout:5/466: stat d16/d1e/d3b/f5e 0 2026-03-09T19:27:16.847 INFO:tasks.workunit.client.0.vm07.stdout:7/276: read - d0/d4/d5/d26/f4a zero size 2026-03-09T19:27:16.849 INFO:tasks.workunit.client.1.vm08.stdout:0/517: link dd/d22/d27/d2e/d37/f46 dd/d22/d24/d49/d92/fa7 0 2026-03-09T19:27:16.850 INFO:tasks.workunit.client.0.vm07.stdout:4/224: rename d3/d11/f18 to d3/d11/d2b/d38/f4a 0 2026-03-09T19:27:16.850 
INFO:tasks.workunit.client.1.vm08.stdout:0/518: readlink dd/l30 0 2026-03-09T19:27:16.850 INFO:tasks.workunit.client.0.vm07.stdout:4/225: chown d3/d11/d2b/f49 410982 1 2026-03-09T19:27:16.854 INFO:tasks.workunit.client.0.vm07.stdout:4/226: dread - d3/d11/d2b/d37/f30 zero size 2026-03-09T19:27:16.864 INFO:tasks.workunit.client.0.vm07.stdout:2/303: mknod d3/dd/d16/d29/c6c 0 2026-03-09T19:27:16.865 INFO:tasks.workunit.client.1.vm08.stdout:6/511: mknod d3/d94/cb3 0 2026-03-09T19:27:16.866 INFO:tasks.workunit.client.0.vm07.stdout:5/224: creat d3/d1a/d28/d48/f4f x:0 0 0 2026-03-09T19:27:16.869 INFO:tasks.workunit.client.1.vm08.stdout:8/473: rename de/d25 to de/d25/d87/dab 22 2026-03-09T19:27:16.883 INFO:tasks.workunit.client.0.vm07.stdout:8/282: rmdir d7/d9/d37/d45/d4f 39 2026-03-09T19:27:16.884 INFO:tasks.workunit.client.1.vm08.stdout:2/467: truncate d3/d4/d23/d2c/d39/d5e/de/d18/f2d 4404080 0 2026-03-09T19:27:16.885 INFO:tasks.workunit.client.0.vm07.stdout:8/283: dread f4 [0,4194304] 0 2026-03-09T19:27:16.889 INFO:tasks.workunit.client.0.vm07.stdout:8/284: dread d7/d9/d10/f20 [0,4194304] 0 2026-03-09T19:27:16.890 INFO:tasks.workunit.client.0.vm07.stdout:8/285: write d7/d9/d10/f41 [1763939,58329] 0 2026-03-09T19:27:16.892 INFO:tasks.workunit.client.0.vm07.stdout:9/292: creat d0/db/d29/d68/f6b x:0 0 0 2026-03-09T19:27:16.893 INFO:tasks.workunit.client.0.vm07.stdout:8/286: dread d7/d9/d37/d45/d56/d62/f64 [0,4194304] 0 2026-03-09T19:27:16.894 INFO:tasks.workunit.client.0.vm07.stdout:9/293: chown d0/d17/l60 7243153 1 2026-03-09T19:27:16.895 INFO:tasks.workunit.client.1.vm08.stdout:1/645: write d9/d40/d49/f7c [2861210,108329] 0 2026-03-09T19:27:16.898 INFO:tasks.workunit.client.0.vm07.stdout:1/222: rmdir d1/d3 39 2026-03-09T19:27:16.898 INFO:tasks.workunit.client.0.vm07.stdout:3/284: symlink d1/d3d/d47/l50 0 2026-03-09T19:27:16.898 INFO:tasks.workunit.client.0.vm07.stdout:3/285: chown d1 61 1 2026-03-09T19:27:16.905 INFO:tasks.workunit.client.0.vm07.stdout:7/277: creat 
d0/d4/d5/d26/d3c/f60 x:0 0 0 2026-03-09T19:27:16.908 INFO:tasks.workunit.client.1.vm08.stdout:9/494: write d0/d2/d8/f61 [3315604,109477] 0 2026-03-09T19:27:16.913 INFO:tasks.workunit.client.1.vm08.stdout:3/569: creat d0/d8/d19/fb7 x:0 0 0 2026-03-09T19:27:16.914 INFO:tasks.workunit.client.0.vm07.stdout:4/227: mkdir d3/d11/d2b/d37/d4b 0 2026-03-09T19:27:16.915 INFO:tasks.workunit.client.1.vm08.stdout:5/467: mknod d16/d45/d81/c91 0 2026-03-09T19:27:16.915 INFO:tasks.workunit.client.0.vm07.stdout:4/228: chown d3/d11/d2b/l31 23164772 1 2026-03-09T19:27:16.915 INFO:tasks.workunit.client.1.vm08.stdout:3/570: write d0/d6/f39 [563171,118903] 0 2026-03-09T19:27:16.918 INFO:tasks.workunit.client.1.vm08.stdout:3/571: fsync d0/d6/de/d6e/f81 0 2026-03-09T19:27:16.918 INFO:tasks.workunit.client.0.vm07.stdout:4/229: dwrite d3/f8 [4194304,4194304] 0 2026-03-09T19:27:16.920 INFO:tasks.workunit.client.0.vm07.stdout:2/304: unlink d3/d11/l54 0 2026-03-09T19:27:16.922 INFO:tasks.workunit.client.0.vm07.stdout:5/225: creat d3/d1a/d28/d48/f50 x:0 0 0 2026-03-09T19:27:16.922 INFO:tasks.workunit.client.1.vm08.stdout:7/577: mknod d5/d14/d27/d54/d86/cc5 0 2026-03-09T19:27:16.927 INFO:tasks.workunit.client.0.vm07.stdout:4/230: dread d3/f8 [4194304,4194304] 0 2026-03-09T19:27:16.930 INFO:tasks.workunit.client.1.vm08.stdout:7/578: dread - d5/d14/dae/d3a/d42/fa9 zero size 2026-03-09T19:27:16.932 INFO:tasks.workunit.client.0.vm07.stdout:0/221: symlink d0/d6/d13/d17/l49 0 2026-03-09T19:27:16.933 INFO:tasks.workunit.client.1.vm08.stdout:8/474: truncate de/d1d/f59 1729484 0 2026-03-09T19:27:16.934 INFO:tasks.workunit.client.0.vm07.stdout:0/222: dread - d0/d6/d13/f31 zero size 2026-03-09T19:27:16.935 INFO:tasks.workunit.client.0.vm07.stdout:8/287: rename d7/d9/d10/d44/f63 to d7/d50/f6f 0 2026-03-09T19:27:16.944 INFO:tasks.workunit.client.0.vm07.stdout:2/305: dread d3/fc [0,4194304] 0 2026-03-09T19:27:16.944 INFO:tasks.workunit.client.0.vm07.stdout:1/223: symlink d1/d3e/l54 0 2026-03-09T19:27:16.944 
INFO:tasks.workunit.client.0.vm07.stdout:5/226: creat d3/d1a/f51 x:0 0 0 2026-03-09T19:27:16.944 INFO:tasks.workunit.client.1.vm08.stdout:2/468: mknod d3/d9/d79/d46/d8c/caa 0 2026-03-09T19:27:16.944 INFO:tasks.workunit.client.1.vm08.stdout:2/469: chown d3/d4/d23/d2c/d39/d5e/d14/f73 2011922 1 2026-03-09T19:27:16.944 INFO:tasks.workunit.client.1.vm08.stdout:2/470: chown d3/d9/d4a 41747 1 2026-03-09T19:27:16.945 INFO:tasks.workunit.client.1.vm08.stdout:9/495: read d0/d2/d8/f2d [3347117,108980] 0 2026-03-09T19:27:16.945 INFO:tasks.workunit.client.1.vm08.stdout:9/496: write d0/f13 [2151217,25377] 0 2026-03-09T19:27:16.945 INFO:tasks.workunit.client.1.vm08.stdout:9/497: chown d0/d2/d80/f87 3251 1 2026-03-09T19:27:16.948 INFO:tasks.workunit.client.1.vm08.stdout:5/468: unlink d16/d45/d81/c8d 0 2026-03-09T19:27:16.949 INFO:tasks.workunit.client.1.vm08.stdout:5/469: write d16/d45/f6b [3920200,63056] 0 2026-03-09T19:27:16.951 INFO:tasks.workunit.client.0.vm07.stdout:0/223: symlink d0/d6/d13/l4a 0 2026-03-09T19:27:16.954 INFO:tasks.workunit.client.1.vm08.stdout:3/572: truncate d0/d6/d25/f87 918613 0 2026-03-09T19:27:16.958 INFO:tasks.workunit.client.1.vm08.stdout:0/519: symlink dd/d22/la8 0 2026-03-09T19:27:16.958 INFO:tasks.workunit.client.1.vm08.stdout:6/512: creat d3/db/d43/d69/da0/fb4 x:0 0 0 2026-03-09T19:27:16.960 INFO:tasks.workunit.client.0.vm07.stdout:8/288: creat d7/d9/d37/d45/d56/d67/f70 x:0 0 0 2026-03-09T19:27:16.960 INFO:tasks.workunit.client.1.vm08.stdout:7/579: truncate d5/d14/d27/f35 4788996 0 2026-03-09T19:27:16.961 INFO:tasks.workunit.client.0.vm07.stdout:7/278: dwrite d0/d4/d5/d26/f42 [0,4194304] 0 2026-03-09T19:27:16.962 INFO:tasks.workunit.client.0.vm07.stdout:7/279: fsync d0/d52/d54/f5e 0 2026-03-09T19:27:16.963 INFO:tasks.workunit.client.0.vm07.stdout:2/306: chown d3/d11/d38 460333 1 2026-03-09T19:27:16.965 INFO:tasks.workunit.client.1.vm08.stdout:5/470: symlink d16/d45/l92 0 2026-03-09T19:27:16.967 INFO:tasks.workunit.client.1.vm08.stdout:3/573: rmdir 
d0/d52/d7c/d7e 39 2026-03-09T19:27:16.968 INFO:tasks.workunit.client.0.vm07.stdout:7/280: chown d0/d4/d5/d8/f1c 80883356 1 2026-03-09T19:27:16.969 INFO:tasks.workunit.client.1.vm08.stdout:8/475: dread de/f16 [0,4194304] 0 2026-03-09T19:27:16.972 INFO:tasks.workunit.client.0.vm07.stdout:5/227: fsync d3/dd/f24 0 2026-03-09T19:27:16.974 INFO:tasks.workunit.client.0.vm07.stdout:4/231: fdatasync d3/f7 0 2026-03-09T19:27:16.978 INFO:tasks.workunit.client.0.vm07.stdout:6/231: getdents d0/d1/db/d24/d53/d31 0 2026-03-09T19:27:16.980 INFO:tasks.workunit.client.1.vm08.stdout:0/520: fdatasync dd/f15 0 2026-03-09T19:27:16.981 INFO:tasks.workunit.client.1.vm08.stdout:9/498: write d0/d1b/f49 [9046863,129105] 0 2026-03-09T19:27:16.982 INFO:tasks.workunit.client.1.vm08.stdout:6/513: creat d3/d94/fb5 x:0 0 0 2026-03-09T19:27:16.983 INFO:tasks.workunit.client.1.vm08.stdout:4/503: getdents da/d10/d16/d28/d46/d52 0 2026-03-09T19:27:16.984 INFO:tasks.workunit.client.1.vm08.stdout:2/471: mkdir d3/d4/dab 0 2026-03-09T19:27:16.985 INFO:tasks.workunit.client.0.vm07.stdout:3/286: rename d1/d1f/d3e to d1/d6/dd/d51 0 2026-03-09T19:27:16.985 INFO:tasks.workunit.client.1.vm08.stdout:2/472: chown d3/d4/d23/d2c 20 1 2026-03-09T19:27:16.986 INFO:tasks.workunit.client.1.vm08.stdout:1/646: creat d9/d11/db6/fbe x:0 0 0 2026-03-09T19:27:16.987 INFO:tasks.workunit.client.1.vm08.stdout:2/473: write d3/d9/d4a/fa4 [207551,59478] 0 2026-03-09T19:27:16.991 INFO:tasks.workunit.client.1.vm08.stdout:7/580: rename d5/d14/l17 to d5/d14/d27/d54/d86/lc6 0 2026-03-09T19:27:16.992 INFO:tasks.workunit.client.0.vm07.stdout:3/287: sync 2026-03-09T19:27:16.993 INFO:tasks.workunit.client.0.vm07.stdout:2/307: creat d3/dd/d16/d29/d2d/f6d x:0 0 0 2026-03-09T19:27:16.993 INFO:tasks.workunit.client.0.vm07.stdout:2/308: readlink d3/dd/d16/d29/d3c/l41 0 2026-03-09T19:27:16.995 INFO:tasks.workunit.client.0.vm07.stdout:1/224: creat d1/d3/d21/f55 x:0 0 0 2026-03-09T19:27:16.997 INFO:tasks.workunit.client.0.vm07.stdout:4/232: write 
d3/f13 [301953,131061] 0 2026-03-09T19:27:16.997 INFO:tasks.workunit.client.0.vm07.stdout:6/232: mknod d0/d1/db/d1d/c62 0 2026-03-09T19:27:17.000 INFO:tasks.workunit.client.0.vm07.stdout:5/228: dread d3/d1a/fc [0,4194304] 0 2026-03-09T19:27:17.000 INFO:tasks.workunit.client.0.vm07.stdout:0/224: mknod d0/c4b 0 2026-03-09T19:27:17.001 INFO:tasks.workunit.client.0.vm07.stdout:5/229: dread - d3/d1a/d28/d48/f50 zero size 2026-03-09T19:27:17.001 INFO:tasks.workunit.client.1.vm08.stdout:6/514: fsync d3/d34/da9/f97 0 2026-03-09T19:27:17.003 INFO:tasks.workunit.client.1.vm08.stdout:6/515: chown d3/d34/d5c/c63 10 1 2026-03-09T19:27:17.011 INFO:tasks.workunit.client.0.vm07.stdout:6/233: dread d0/ff [0,4194304] 0 2026-03-09T19:27:17.013 INFO:tasks.workunit.client.0.vm07.stdout:8/289: link d7/d16/f69 d7/d16/f71 0 2026-03-09T19:27:17.020 INFO:tasks.workunit.client.0.vm07.stdout:9/294: link d0/db/d29/d2c/f43 d0/db/f6c 0 2026-03-09T19:27:17.024 INFO:tasks.workunit.client.1.vm08.stdout:1/647: fdatasync d9/da/d12/d39/f47 0 2026-03-09T19:27:17.024 INFO:tasks.workunit.client.0.vm07.stdout:7/281: mknod d0/c61 0 2026-03-09T19:27:17.025 INFO:tasks.workunit.client.0.vm07.stdout:8/290: dread d7/d9/fd [0,4194304] 0 2026-03-09T19:27:17.025 INFO:tasks.workunit.client.1.vm08.stdout:1/648: chown d9/da/d2c/d6a 393611 1 2026-03-09T19:27:17.035 INFO:tasks.workunit.client.1.vm08.stdout:1/649: write d9/da/d17/fa9 [510220,66763] 0 2026-03-09T19:27:17.044 INFO:tasks.workunit.client.0.vm07.stdout:3/288: rename d1/f22 to d1/d26/f52 0 2026-03-09T19:27:17.044 INFO:tasks.workunit.client.0.vm07.stdout:3/289: dread - d1/d6/dd/f4d zero size 2026-03-09T19:27:17.046 INFO:tasks.workunit.client.0.vm07.stdout:3/290: write d1/d6/dd/d51/f49 [961943,39621] 0 2026-03-09T19:27:17.047 INFO:tasks.workunit.client.1.vm08.stdout:4/504: write da/d10/f53 [708969,98283] 0 2026-03-09T19:27:17.052 INFO:tasks.workunit.client.0.vm07.stdout:2/309: fsync d3/f15 0 2026-03-09T19:27:17.053 
INFO:tasks.workunit.client.0.vm07.stdout:1/225: mkdir d1/db/d31/d56 0 2026-03-09T19:27:17.053 INFO:tasks.workunit.client.0.vm07.stdout:1/226: readlink d1/d3/l34 0 2026-03-09T19:27:17.058 INFO:tasks.workunit.client.0.vm07.stdout:1/227: dwrite d1/d11/f1b [0,4194304] 0 2026-03-09T19:27:17.062 INFO:tasks.workunit.client.1.vm08.stdout:8/476: symlink de/d25/d31/d82/lac 0 2026-03-09T19:27:17.064 INFO:tasks.workunit.client.1.vm08.stdout:7/581: dread d5/d14/dae/d1c/f87 [0,4194304] 0 2026-03-09T19:27:17.066 INFO:tasks.workunit.client.1.vm08.stdout:9/499: symlink d0/d1b/d97/la9 0 2026-03-09T19:27:17.068 INFO:tasks.workunit.client.0.vm07.stdout:4/233: mknod d3/d11/d16/d2f/d22/c4c 0 2026-03-09T19:27:17.069 INFO:tasks.workunit.client.0.vm07.stdout:1/228: truncate d1/d11/f1b 5018806 0 2026-03-09T19:27:17.070 INFO:tasks.workunit.client.0.vm07.stdout:4/234: chown d3/d11/d29/f3c 15 1 2026-03-09T19:27:17.070 INFO:tasks.workunit.client.0.vm07.stdout:6/234: symlink d0/d1/db/d1d/l63 0 2026-03-09T19:27:17.071 INFO:tasks.workunit.client.0.vm07.stdout:4/235: read d3/d11/d2b/d37/f2e [3584960,112351] 0 2026-03-09T19:27:17.073 INFO:tasks.workunit.client.0.vm07.stdout:1/229: dread d1/d11/d37/d3f/f4a [0,4194304] 0 2026-03-09T19:27:17.074 INFO:tasks.workunit.client.0.vm07.stdout:1/230: truncate d1/d3e/f49 864837 0 2026-03-09T19:27:17.083 INFO:tasks.workunit.client.0.vm07.stdout:9/295: rmdir d0/db/d29/d32/d5c 39 2026-03-09T19:27:17.095 INFO:tasks.workunit.client.0.vm07.stdout:9/296: readlink d0/d6/ld 0 2026-03-09T19:27:17.097 INFO:tasks.workunit.client.1.vm08.stdout:1/650: dread d9/da/d2c/d6a/f9c [0,4194304] 0 2026-03-09T19:27:17.100 INFO:tasks.workunit.client.1.vm08.stdout:5/471: rename d16/d1e/d3b/d61/c69 to d16/d1e/d3b/c93 0 2026-03-09T19:27:17.120 INFO:tasks.workunit.client.0.vm07.stdout:3/291: write d1/d6/dd/f48 [306638,54891] 0 2026-03-09T19:27:17.120 INFO:tasks.workunit.client.0.vm07.stdout:3/292: write d1/d6/dd/f33 [112150,2407] 0 2026-03-09T19:27:17.120 
INFO:tasks.workunit.client.0.vm07.stdout:3/293: dread d1/d6/f9 [0,4194304] 0 2026-03-09T19:27:17.120 INFO:tasks.workunit.client.0.vm07.stdout:3/294: write d1/d6/fb [1258610,110184] 0 2026-03-09T19:27:17.120 INFO:tasks.workunit.client.1.vm08.stdout:4/505: fdatasync da/d10/f13 0 2026-03-09T19:27:17.121 INFO:tasks.workunit.client.1.vm08.stdout:4/506: chown da/d10/d16/d28/f44 35097218 1 2026-03-09T19:27:17.121 INFO:tasks.workunit.client.1.vm08.stdout:3/574: creat d0/d6/fb8 x:0 0 0 2026-03-09T19:27:17.121 INFO:tasks.workunit.client.1.vm08.stdout:3/575: readlink d0/d4b/l73 0 2026-03-09T19:27:17.121 INFO:tasks.workunit.client.1.vm08.stdout:3/576: write d0/d6/de/d15/d96/fa0 [1110866,64727] 0 2026-03-09T19:27:17.121 INFO:tasks.workunit.client.1.vm08.stdout:7/582: rmdir d5/d14/d2b/daa 39 2026-03-09T19:27:17.121 INFO:tasks.workunit.client.1.vm08.stdout:7/583: chown d5/fa 64803 1 2026-03-09T19:27:17.125 INFO:tasks.workunit.client.1.vm08.stdout:0/521: symlink dd/d22/d63/d93/la9 0 2026-03-09T19:27:17.139 INFO:tasks.workunit.client.0.vm07.stdout:6/235: creat d0/d1/d28/f64 x:0 0 0 2026-03-09T19:27:17.148 INFO:tasks.workunit.client.0.vm07.stdout:4/236: unlink d3/d11/f35 0 2026-03-09T19:27:17.157 INFO:tasks.workunit.client.0.vm07.stdout:6/236: dwrite d0/d1/db/d24/d53/f35 [0,4194304] 0 2026-03-09T19:27:17.168 INFO:tasks.workunit.client.0.vm07.stdout:1/231: mknod d1/d3e/c57 0 2026-03-09T19:27:17.169 INFO:tasks.workunit.client.0.vm07.stdout:9/297: creat d0/d6/d3a/f6d x:0 0 0 2026-03-09T19:27:17.170 INFO:tasks.workunit.client.0.vm07.stdout:1/232: chown d1/d3/l41 203 1 2026-03-09T19:27:17.170 INFO:tasks.workunit.client.1.vm08.stdout:4/507: mkdir da/d10/d26/d27/d9b 0 2026-03-09T19:27:17.170 INFO:tasks.workunit.client.1.vm08.stdout:3/577: chown d0/d52/d7c/d7e/c90 23 1 2026-03-09T19:27:17.171 INFO:tasks.workunit.client.1.vm08.stdout:7/584: unlink d5/c18 0 2026-03-09T19:27:17.171 INFO:tasks.workunit.client.1.vm08.stdout:9/500: mkdir d0/d1b/daa 0 2026-03-09T19:27:17.179 
INFO:tasks.workunit.client.1.vm08.stdout:0/522: mknod dd/d22/d63/d93/caa 0 2026-03-09T19:27:17.217 INFO:tasks.workunit.client.0.vm07.stdout:3/295: rename d1/d6/dd/f15 to d1/d6/dd/f53 0 2026-03-09T19:27:17.217 INFO:tasks.workunit.client.0.vm07.stdout:0/225: creat d0/d6/d13/f4c x:0 0 0 2026-03-09T19:27:17.217 INFO:tasks.workunit.client.0.vm07.stdout:0/226: dread - d0/f3a zero size 2026-03-09T19:27:17.217 INFO:tasks.workunit.client.0.vm07.stdout:5/230: creat d3/dd/f52 x:0 0 0 2026-03-09T19:27:17.217 INFO:tasks.workunit.client.1.vm08.stdout:3/578: dread d0/f7a [0,4194304] 0 2026-03-09T19:27:17.218 INFO:tasks.workunit.client.1.vm08.stdout:8/477: rename de/d1d/d21/l28 to de/d25/d31/d82/d6d/d99/da5/lad 0 2026-03-09T19:27:17.218 INFO:tasks.workunit.client.1.vm08.stdout:8/478: read de/d1d/d21/f4b [172126,109681] 0 2026-03-09T19:27:17.218 INFO:tasks.workunit.client.1.vm08.stdout:8/479: read - de/d25/d31/d82/fa9 zero size 2026-03-09T19:27:17.218 INFO:tasks.workunit.client.1.vm08.stdout:8/480: read - de/d1d/d2e/f56 zero size 2026-03-09T19:27:17.218 INFO:tasks.workunit.client.1.vm08.stdout:4/508: unlink da/d10/d16/d28/d2f/f8c 0 2026-03-09T19:27:17.218 INFO:tasks.workunit.client.1.vm08.stdout:7/585: mkdir d5/d14/d27/d78/dc7 0 2026-03-09T19:27:17.218 INFO:tasks.workunit.client.1.vm08.stdout:3/579: mknod d0/d8/cb9 0 2026-03-09T19:27:17.218 INFO:tasks.workunit.client.0.vm07.stdout:5/231: dread d3/d1a/fc [0,4194304] 0 2026-03-09T19:27:17.223 INFO:tasks.workunit.client.1.vm08.stdout:9/501: rename d0/d2/d8/l50 to d0/d1b/d97/d48/d5e/lab 0 2026-03-09T19:27:17.233 INFO:tasks.workunit.client.1.vm08.stdout:4/509: truncate f2 888469 0 2026-03-09T19:27:17.237 INFO:tasks.workunit.client.0.vm07.stdout:1/233: fsync d1/d3/d21/f2e 0 2026-03-09T19:27:17.256 INFO:tasks.workunit.client.1.vm08.stdout:9/502: mkdir d0/d2/d14/d98/dac 0 2026-03-09T19:27:17.256 INFO:tasks.workunit.client.0.vm07.stdout:5/232: rename d3/c4 to d3/d1a/c53 0 2026-03-09T19:27:17.256 
INFO:tasks.workunit.client.0.vm07.stdout:4/237: getdents d3/d11/d29/d41 0 2026-03-09T19:27:17.256 INFO:tasks.workunit.client.0.vm07.stdout:4/238: fsync d3/d11/d16/d2f/f44 0 2026-03-09T19:27:17.256 INFO:tasks.workunit.client.0.vm07.stdout:6/237: truncate d0/d13/f1c 517948 0 2026-03-09T19:27:17.256 INFO:tasks.workunit.client.0.vm07.stdout:6/238: stat d0/d2d 0 2026-03-09T19:27:17.256 INFO:tasks.workunit.client.0.vm07.stdout:9/298: symlink d0/d6/d57/d5d/l6e 0 2026-03-09T19:27:17.256 INFO:tasks.workunit.client.0.vm07.stdout:9/299: chown d0/d6/fa 0 1 2026-03-09T19:27:17.256 INFO:tasks.workunit.client.0.vm07.stdout:5/233: dwrite d3/d1a/d28/d48/f4f [0,4194304] 0 2026-03-09T19:27:17.259 INFO:tasks.workunit.client.0.vm07.stdout:1/234: fdatasync d1/d11/d37/d3f/f4a 0 2026-03-09T19:27:17.259 INFO:tasks.workunit.client.0.vm07.stdout:5/234: dread - d3/d1a/d28/d40/f46 zero size 2026-03-09T19:27:17.259 INFO:tasks.workunit.client.0.vm07.stdout:5/235: fsync d3/f4d 0 2026-03-09T19:27:17.260 INFO:tasks.workunit.client.0.vm07.stdout:1/235: write d1/d11/d37/d3f/d45/f15 [2395555,62473] 0 2026-03-09T19:27:17.262 INFO:tasks.workunit.client.0.vm07.stdout:3/296: mkdir d1/d6/d45/d54 0 2026-03-09T19:27:17.262 INFO:tasks.workunit.client.0.vm07.stdout:5/236: chown d3/f2f 384245300 1 2026-03-09T19:27:17.262 INFO:tasks.workunit.client.0.vm07.stdout:0/227: creat d0/d6/d13/d17/d19/f4d x:0 0 0 2026-03-09T19:27:17.262 INFO:tasks.workunit.client.0.vm07.stdout:1/236: stat d1/db/f1f 0 2026-03-09T19:27:17.281 INFO:tasks.workunit.client.1.vm08.stdout:4/510: symlink da/d10/d16/d28/d46/d52/d6e/d73/l9c 0 2026-03-09T19:27:17.292 INFO:tasks.workunit.client.1.vm08.stdout:0/523: getdents dd/d22/d27/d4f 0 2026-03-09T19:27:17.301 INFO:tasks.workunit.client.0.vm07.stdout:5/237: dread f2 [0,4194304] 0 2026-03-09T19:27:17.309 INFO:tasks.workunit.client.0.vm07.stdout:5/238: truncate d3/dd/f52 433011 0 2026-03-09T19:27:17.309 INFO:tasks.workunit.client.1.vm08.stdout:4/511: link da/c76 da/c9d 0 2026-03-09T19:27:17.309 
INFO:tasks.workunit.client.1.vm08.stdout:4/512: write da/d10/d16/d28/f8d [127580,93608] 0 2026-03-09T19:27:17.309 INFO:tasks.workunit.client.1.vm08.stdout:4/513: creat da/d10/d26/d27/d32/f9e x:0 0 0 2026-03-09T19:27:17.314 INFO:tasks.workunit.client.0.vm07.stdout:6/239: mknod d0/d1/c65 0 2026-03-09T19:27:17.315 INFO:tasks.workunit.client.1.vm08.stdout:4/514: creat da/d10/d16/f9f x:0 0 0 2026-03-09T19:27:17.315 INFO:tasks.workunit.client.0.vm07.stdout:3/297: rmdir d1/d1f 39 2026-03-09T19:27:17.319 INFO:tasks.workunit.client.1.vm08.stdout:6/516: sync 2026-03-09T19:27:17.319 INFO:tasks.workunit.client.0.vm07.stdout:0/228: mknod d0/d6/d13/d17/c4e 0 2026-03-09T19:27:17.319 INFO:tasks.workunit.client.0.vm07.stdout:6/240: write d0/d1/d28/f64 [780364,76699] 0 2026-03-09T19:27:17.323 INFO:tasks.workunit.client.0.vm07.stdout:5/239: creat d3/dd/d26/d2d/f54 x:0 0 0 2026-03-09T19:27:17.323 INFO:tasks.workunit.client.1.vm08.stdout:6/517: chown d3/f12 7 1 2026-03-09T19:27:17.324 INFO:tasks.workunit.client.0.vm07.stdout:9/300: dwrite d0/f4 [0,4194304] 0 2026-03-09T19:27:17.324 INFO:tasks.workunit.client.0.vm07.stdout:5/240: write d3/dd/d26/d2d/f54 [279984,77209] 0 2026-03-09T19:27:17.326 INFO:tasks.workunit.client.1.vm08.stdout:4/515: mkdir da/d10/d26/da0 0 2026-03-09T19:27:17.345 INFO:tasks.workunit.client.1.vm08.stdout:2/474: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/f2d [0,4194304] 0 2026-03-09T19:27:17.345 INFO:tasks.workunit.client.0.vm07.stdout:8/291: dwrite f3 [0,4194304] 0 2026-03-09T19:27:17.347 INFO:tasks.workunit.client.0.vm07.stdout:2/310: write d3/dd/d16/d29/d2d/d45/f55 [531206,43009] 0 2026-03-09T19:27:17.349 INFO:tasks.workunit.client.1.vm08.stdout:1/651: write d9/da/d12/d39/fa7 [416910,102755] 0 2026-03-09T19:27:17.352 INFO:tasks.workunit.client.0.vm07.stdout:8/292: fdatasync f5 0 2026-03-09T19:27:17.353 INFO:tasks.workunit.client.0.vm07.stdout:8/293: truncate d7/d16/f71 636791 0 2026-03-09T19:27:17.355 INFO:tasks.workunit.client.0.vm07.stdout:7/282: truncate d0/f13 
6708343 0 2026-03-09T19:27:17.363 INFO:tasks.workunit.client.1.vm08.stdout:4/516: mknod da/d10/d16/d28/d46/d52/d6e/d40/ca1 0 2026-03-09T19:27:17.372 INFO:tasks.workunit.client.1.vm08.stdout:5/472: dwrite d16/d45/f54 [0,4194304] 0 2026-03-09T19:27:17.375 INFO:tasks.workunit.client.1.vm08.stdout:4/517: read da/d10/d16/d28/d46/d52/d6e/d40/d6c/f71 [89704,34096] 0 2026-03-09T19:27:17.375 INFO:tasks.workunit.client.0.vm07.stdout:1/237: link d1/d3e/l54 d1/db/d31/d4f/l58 0 2026-03-09T19:27:17.381 INFO:tasks.workunit.client.0.vm07.stdout:1/238: dwrite d1/f2f [4194304,4194304] 0 2026-03-09T19:27:17.389 INFO:tasks.workunit.client.1.vm08.stdout:5/473: dread d16/d1e/d30/f3f [0,4194304] 0 2026-03-09T19:27:17.399 INFO:tasks.workunit.client.1.vm08.stdout:1/652: mknod d9/da/d12/cbf 0 2026-03-09T19:27:17.399 INFO:tasks.workunit.client.1.vm08.stdout:7/586: write d5/d14/dae/d3a/d42/d6a/f61 [378989,27585] 0 2026-03-09T19:27:17.402 INFO:tasks.workunit.client.0.vm07.stdout:0/229: creat d0/d6/f4f x:0 0 0 2026-03-09T19:27:17.405 INFO:tasks.workunit.client.1.vm08.stdout:3/580: write d0/d52/d7c/f8f [577400,24873] 0 2026-03-09T19:27:17.408 INFO:tasks.workunit.client.1.vm08.stdout:3/581: write d0/d6/de/d6e/f83 [693527,38875] 0 2026-03-09T19:27:17.411 INFO:tasks.workunit.client.0.vm07.stdout:3/298: stat d1/d1f/d16/f30 0 2026-03-09T19:27:17.411 INFO:tasks.workunit.client.1.vm08.stdout:8/481: truncate de/d1d/f1e 4055200 0 2026-03-09T19:27:17.412 INFO:tasks.workunit.client.1.vm08.stdout:8/482: write de/d1d/d21/d73/fa6 [425656,82737] 0 2026-03-09T19:27:17.420 INFO:tasks.workunit.client.1.vm08.stdout:9/503: dwrite d0/d2/d14/d5c/fd [0,4194304] 0 2026-03-09T19:27:17.431 INFO:tasks.workunit.client.1.vm08.stdout:4/518: unlink da/d10/f2e 0 2026-03-09T19:27:17.433 INFO:tasks.workunit.client.0.vm07.stdout:6/241: mknod d0/d1/db/d52/c66 0 2026-03-09T19:27:17.433 INFO:tasks.workunit.client.0.vm07.stdout:4/239: write d3/fc [1744334,25112] 0 2026-03-09T19:27:17.445 
INFO:tasks.workunit.client.1.vm08.stdout:1/653: creat d9/da/d95/fc0 x:0 0 0 2026-03-09T19:27:17.448 INFO:tasks.workunit.client.1.vm08.stdout:0/524: dwrite dd/d22/d24/d49/d92/fa7 [0,4194304] 0 2026-03-09T19:27:17.450 INFO:tasks.workunit.client.0.vm07.stdout:7/283: creat d0/d52/f62 x:0 0 0 2026-03-09T19:27:17.450 INFO:tasks.workunit.client.0.vm07.stdout:8/294: dwrite d7/d9/d37/d45/d4f/f66 [0,4194304] 0 2026-03-09T19:27:17.452 INFO:tasks.workunit.client.0.vm07.stdout:7/284: dread - d0/d4/d5/d8/f37 zero size 2026-03-09T19:27:17.469 INFO:tasks.workunit.client.0.vm07.stdout:1/239: rmdir d1/d11/d37/d3f/d45/d30 39 2026-03-09T19:27:17.469 INFO:tasks.workunit.client.0.vm07.stdout:1/240: write d1/d11/f1b [2957591,56477] 0 2026-03-09T19:27:17.471 INFO:tasks.workunit.client.0.vm07.stdout:9/301: mkdir d0/d6f 0 2026-03-09T19:27:17.475 INFO:tasks.workunit.client.0.vm07.stdout:9/302: dwrite d0/db/f41 [4194304,4194304] 0 2026-03-09T19:27:17.480 INFO:tasks.workunit.client.1.vm08.stdout:2/475: dwrite d3/d9/f1e [0,4194304] 0 2026-03-09T19:27:17.498 INFO:tasks.workunit.client.0.vm07.stdout:9/303: sync 2026-03-09T19:27:17.502 INFO:tasks.workunit.client.1.vm08.stdout:8/483: creat de/d1d/d4f/fae x:0 0 0 2026-03-09T19:27:17.502 INFO:tasks.workunit.client.1.vm08.stdout:6/518: link d3/db/c46 d3/d94/cb6 0 2026-03-09T19:27:17.506 INFO:tasks.workunit.client.1.vm08.stdout:9/504: chown d0/d1b/d4e/l95 41095 1 2026-03-09T19:27:17.515 INFO:tasks.workunit.client.1.vm08.stdout:9/505: stat d0/d1b/d97/d48/d5d/f92 0 2026-03-09T19:27:17.515 INFO:tasks.workunit.client.1.vm08.stdout:4/519: mkdir da/d10/d16/d28/d2f/d4f/d64/d84/d8a/da2 0 2026-03-09T19:27:17.515 INFO:tasks.workunit.client.1.vm08.stdout:7/587: write d5/d14/d38/f4c [559204,103209] 0 2026-03-09T19:27:17.517 INFO:tasks.workunit.client.0.vm07.stdout:6/242: rmdir d0/d1/db/d1d 39 2026-03-09T19:27:17.517 INFO:tasks.workunit.client.1.vm08.stdout:5/474: mknod d16/d1e/d6e/c94 0 2026-03-09T19:27:17.522 INFO:tasks.workunit.client.0.vm07.stdout:4/240: rename 
d3/d11/d2b/d37/f2e to d3/d11/d2b/d37/f4d 0 2026-03-09T19:27:17.528 INFO:tasks.workunit.client.0.vm07.stdout:7/285: creat d0/d4/d5/d26/d3c/f63 x:0 0 0 2026-03-09T19:27:17.530 INFO:tasks.workunit.client.0.vm07.stdout:7/286: truncate d0/d4/d5/d8/f35 4582142 0 2026-03-09T19:27:17.530 INFO:tasks.workunit.client.0.vm07.stdout:8/295: unlink d7/ff 0 2026-03-09T19:27:17.545 INFO:tasks.workunit.client.1.vm08.stdout:2/476: truncate d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f7f 863976 0 2026-03-09T19:27:17.547 INFO:tasks.workunit.client.1.vm08.stdout:2/477: stat d3/d4/d23 0 2026-03-09T19:27:17.555 INFO:tasks.workunit.client.0.vm07.stdout:5/241: link d3/dd/d26/d2d/l41 d3/d1a/d28/d40/l55 0 2026-03-09T19:27:17.556 INFO:tasks.workunit.client.0.vm07.stdout:5/242: dread - d3/dd/f24 zero size 2026-03-09T19:27:17.562 INFO:tasks.workunit.client.0.vm07.stdout:2/311: dwrite d3/f27 [0,4194304] 0 2026-03-09T19:27:17.575 INFO:tasks.workunit.client.0.vm07.stdout:9/304: symlink d0/d6/d57/d5d/l70 0 2026-03-09T19:27:17.576 INFO:tasks.workunit.client.0.vm07.stdout:3/299: getdents d1/d6/d4c 0 2026-03-09T19:27:17.577 INFO:tasks.workunit.client.0.vm07.stdout:3/300: chown d1/d1f/d16 2132 1 2026-03-09T19:27:17.586 INFO:tasks.workunit.client.0.vm07.stdout:6/243: rmdir d0/d1/d28 39 2026-03-09T19:27:17.594 INFO:tasks.workunit.client.0.vm07.stdout:6/244: stat d0/d44/l54 0 2026-03-09T19:27:17.599 INFO:tasks.workunit.client.1.vm08.stdout:7/588: stat d5/d14/dae/d1c/l6f 0 2026-03-09T19:27:17.607 INFO:tasks.workunit.client.0.vm07.stdout:7/287: rmdir d0/d4 39 2026-03-09T19:27:17.616 INFO:tasks.workunit.client.0.vm07.stdout:8/296: symlink d7/d9/d37/d45/d4f/l72 0 2026-03-09T19:27:17.617 INFO:tasks.workunit.client.1.vm08.stdout:8/484: write de/d1d/f97 [1235014,57925] 0 2026-03-09T19:27:17.620 INFO:tasks.workunit.client.1.vm08.stdout:0/525: mkdir dd/d9d/dab 0 2026-03-09T19:27:17.625 INFO:tasks.workunit.client.0.vm07.stdout:1/241: creat d1/d11/d37/d3f/d45/d30/f59 x:0 0 0 2026-03-09T19:27:17.625 
INFO:tasks.workunit.client.1.vm08.stdout:4/520: write da/f1d [4526540,24061] 0 2026-03-09T19:27:17.629 INFO:tasks.workunit.client.1.vm08.stdout:3/582: rmdir d0/d8/d24 39 2026-03-09T19:27:17.631 INFO:tasks.workunit.client.1.vm08.stdout:3/583: write d0/d52/d7c/f8f [1230841,129306] 0 2026-03-09T19:27:17.634 INFO:tasks.workunit.client.1.vm08.stdout:2/478: unlink d3/d4/c5a 0 2026-03-09T19:27:17.638 INFO:tasks.workunit.client.1.vm08.stdout:4/521: dwrite da/d10/d26/f87 [0,4194304] 0 2026-03-09T19:27:17.657 INFO:tasks.workunit.client.1.vm08.stdout:6/519: link d3/d34/d6f/f4f d3/db/d43/d69/da0/fb7 0 2026-03-09T19:27:17.661 INFO:tasks.workunit.client.0.vm07.stdout:2/312: fsync d3/dd/fe 0 2026-03-09T19:27:17.663 INFO:tasks.workunit.client.0.vm07.stdout:0/230: mkdir d0/d6/d13/d1c/d50 0 2026-03-09T19:27:17.666 INFO:tasks.workunit.client.0.vm07.stdout:9/305: write d0/db/d29/d2c/f54 [433021,115355] 0 2026-03-09T19:27:17.666 INFO:tasks.workunit.client.0.vm07.stdout:3/301: creat d1/d26/f55 x:0 0 0 2026-03-09T19:27:17.667 INFO:tasks.workunit.client.0.vm07.stdout:3/302: stat d1/fe 0 2026-03-09T19:27:17.667 INFO:tasks.workunit.client.0.vm07.stdout:3/303: fdatasync d1/d6/f1b 0 2026-03-09T19:27:17.681 INFO:tasks.workunit.client.1.vm08.stdout:9/506: write d0/d1b/d97/f34 [329362,7236] 0 2026-03-09T19:27:17.684 INFO:tasks.workunit.client.0.vm07.stdout:4/241: symlink d3/d11/d29/d41/l4e 0 2026-03-09T19:27:17.684 INFO:tasks.workunit.client.0.vm07.stdout:7/288: chown d0/d4/c3a 19 1 2026-03-09T19:27:17.688 INFO:tasks.workunit.client.0.vm07.stdout:1/242: readlink d1/d3e/l54 0 2026-03-09T19:27:17.690 INFO:tasks.workunit.client.0.vm07.stdout:5/243: mkdir d3/dd/d26/d3f/d47/d56 0 2026-03-09T19:27:17.691 INFO:tasks.workunit.client.0.vm07.stdout:5/244: chown d3/f19 6767612 1 2026-03-09T19:27:17.692 INFO:tasks.workunit.client.0.vm07.stdout:1/243: dread d1/d11/d37/f40 [0,4194304] 0 2026-03-09T19:27:17.694 INFO:tasks.workunit.client.0.vm07.stdout:5/245: sync 2026-03-09T19:27:17.694 
INFO:tasks.workunit.client.0.vm07.stdout:8/297: write d7/d1d/f4d [834990,82762] 0 2026-03-09T19:27:17.698 INFO:tasks.workunit.client.0.vm07.stdout:2/313: rmdir d3/dd/d16/d2f 39 2026-03-09T19:27:17.698 INFO:tasks.workunit.client.0.vm07.stdout:0/231: rmdir d0/d6/d13/d1c/d11 39 2026-03-09T19:27:17.703 INFO:tasks.workunit.client.0.vm07.stdout:3/304: creat d1/d6/d45/f56 x:0 0 0 2026-03-09T19:27:17.704 INFO:tasks.workunit.client.0.vm07.stdout:5/246: dread d3/d1a/fb [0,4194304] 0 2026-03-09T19:27:17.709 INFO:tasks.workunit.client.0.vm07.stdout:3/305: dwrite d1/d6/d45/f56 [0,4194304] 0 2026-03-09T19:27:17.710 INFO:tasks.workunit.client.0.vm07.stdout:5/247: dread d3/d1a/fc [0,4194304] 0 2026-03-09T19:27:17.711 INFO:tasks.workunit.client.0.vm07.stdout:7/289: chown d0/d4/d5/d8/d1a/l23 0 1 2026-03-09T19:27:17.722 INFO:tasks.workunit.client.0.vm07.stdout:6/245: write d0/d1/db/d17/f1a [1935953,3104] 0 2026-03-09T19:27:17.722 INFO:tasks.workunit.client.0.vm07.stdout:1/244: rmdir d1/d3 39 2026-03-09T19:27:17.726 INFO:tasks.workunit.client.0.vm07.stdout:1/245: dwrite d1/d11/d37/d3f/d45/f15 [0,4194304] 0 2026-03-09T19:27:17.739 INFO:tasks.workunit.client.0.vm07.stdout:9/306: unlink d0/db/d29/f2f 0 2026-03-09T19:27:17.748 INFO:tasks.workunit.client.0.vm07.stdout:3/306: creat d1/d6/dd/f57 x:0 0 0 2026-03-09T19:27:17.751 INFO:tasks.workunit.client.0.vm07.stdout:7/290: mkdir d0/d4/d5/d8/d41/d64 0 2026-03-09T19:27:17.759 INFO:tasks.workunit.client.0.vm07.stdout:1/246: mkdir d1/d11/d37/d5a 0 2026-03-09T19:27:17.760 INFO:tasks.workunit.client.0.vm07.stdout:1/247: dread d1/d11/d37/d3f/d45/f26 [0,4194304] 0 2026-03-09T19:27:17.761 INFO:tasks.workunit.client.0.vm07.stdout:2/314: creat d3/dd/d16/d2f/f6e x:0 0 0 2026-03-09T19:27:17.762 INFO:tasks.workunit.client.0.vm07.stdout:2/315: write f2 [1436948,11638] 0 2026-03-09T19:27:17.764 INFO:tasks.workunit.client.0.vm07.stdout:9/307: creat d0/db/d29/d2c/d36/f71 x:0 0 0 2026-03-09T19:27:17.767 INFO:tasks.workunit.client.1.vm08.stdout:7/589: read 
d5/d14/f59 [1690175,92156] 0 2026-03-09T19:27:17.767 INFO:tasks.workunit.client.0.vm07.stdout:4/242: rename d3/d11/d29/d41 to d3/d4f 0 2026-03-09T19:27:17.767 INFO:tasks.workunit.client.0.vm07.stdout:0/232: rename d0 to d0/d6/d51 22 2026-03-09T19:27:17.768 INFO:tasks.workunit.client.1.vm08.stdout:1/654: link d9/d11/l7e d9/da/d53/db3/lc1 0 2026-03-09T19:27:17.769 INFO:tasks.workunit.client.0.vm07.stdout:3/307: rmdir d1/d1f/d16/d28 39 2026-03-09T19:27:17.770 INFO:tasks.workunit.client.1.vm08.stdout:8/485: chown de/c14 31336548 1 2026-03-09T19:27:17.777 INFO:tasks.workunit.client.0.vm07.stdout:6/246: symlink d0/l67 0 2026-03-09T19:27:17.777 INFO:tasks.workunit.client.1.vm08.stdout:3/584: readlink d0/l29 0 2026-03-09T19:27:17.777 INFO:tasks.workunit.client.1.vm08.stdout:3/585: dread - d0/d6/de/d15/fa3 zero size 2026-03-09T19:27:17.782 INFO:tasks.workunit.client.0.vm07.stdout:6/247: dread d0/d1/fa [0,4194304] 0 2026-03-09T19:27:17.785 INFO:tasks.workunit.client.0.vm07.stdout:9/308: rename d0/db/c1e to d0/d6/d57/d5d/c72 0 2026-03-09T19:27:17.786 INFO:tasks.workunit.client.1.vm08.stdout:5/475: link d16/d1e/c24 d16/d1e/d6e/d84/c95 0 2026-03-09T19:27:17.788 INFO:tasks.workunit.client.0.vm07.stdout:0/233: fsync d0/d6/f43 0 2026-03-09T19:27:17.791 INFO:tasks.workunit.client.1.vm08.stdout:1/655: rename d9/d11/f87 to d9/d11/d7a/d89/d8d/da3/fc2 0 2026-03-09T19:27:17.794 INFO:tasks.workunit.client.0.vm07.stdout:5/248: link d3/d1a/c1e d3/d1a/d28/d48/c57 0 2026-03-09T19:27:17.794 INFO:tasks.workunit.client.1.vm08.stdout:8/486: unlink de/d1d/d2e/d5f/f4e 0 2026-03-09T19:27:17.796 INFO:tasks.workunit.client.1.vm08.stdout:8/487: chown de/d1d/d69/f8f 562140391 1 2026-03-09T19:27:17.804 INFO:tasks.workunit.client.0.vm07.stdout:2/316: dread f2 [0,4194304] 0 2026-03-09T19:27:17.805 INFO:tasks.workunit.client.0.vm07.stdout:2/317: chown d3/fc 0 1 2026-03-09T19:27:17.805 INFO:tasks.workunit.client.0.vm07.stdout:2/318: stat d3/dd/d16/d2f/c61 0 2026-03-09T19:27:17.808 
INFO:tasks.workunit.client.0.vm07.stdout:2/319: dread d3/dd/d16/d29/d2d/d45/f62 [4194304,4194304] 0 2026-03-09T19:27:17.815 INFO:tasks.workunit.client.0.vm07.stdout:8/298: dwrite d7/d1d/f3d [0,4194304] 0 2026-03-09T19:27:17.828 INFO:tasks.workunit.client.0.vm07.stdout:2/320: dread d3/dd/f1e [0,4194304] 0 2026-03-09T19:27:17.839 INFO:tasks.workunit.client.1.vm08.stdout:0/526: dwrite dd/d22/d27/d6c/f85 [0,4194304] 0 2026-03-09T19:27:17.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:17 vm08.local ceph-mon[57794]: pgmap v163: 65 pgs: 65 active+clean; 1.5 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 29 MiB/s rd, 123 MiB/s wr, 242 op/s 2026-03-09T19:27:17.848 INFO:tasks.workunit.client.0.vm07.stdout:1/248: symlink d1/db/d31/d4f/l5b 0 2026-03-09T19:27:17.848 INFO:tasks.workunit.client.1.vm08.stdout:3/586: mknod d0/d6/de/d1b/cba 0 2026-03-09T19:27:17.849 INFO:tasks.workunit.client.0.vm07.stdout:4/243: write d3/d11/f1e [206730,126165] 0 2026-03-09T19:27:17.849 INFO:tasks.workunit.client.1.vm08.stdout:2/479: write d3/d9/d26/f6a [443890,117017] 0 2026-03-09T19:27:17.855 INFO:tasks.workunit.client.1.vm08.stdout:0/527: dwrite dd/d22/d27/f9e [0,4194304] 0 2026-03-09T19:27:17.864 INFO:tasks.workunit.client.1.vm08.stdout:5/476: symlink d16/d1e/d3b/l96 0 2026-03-09T19:27:17.870 INFO:tasks.workunit.client.0.vm07.stdout:0/234: read - d0/d6/f38 zero size 2026-03-09T19:27:17.874 INFO:tasks.workunit.client.1.vm08.stdout:1/656: symlink d9/d11/d7a/d89/d8d/lc3 0 2026-03-09T19:27:17.875 INFO:tasks.workunit.client.0.vm07.stdout:3/308: mknod d1/c58 0 2026-03-09T19:27:17.892 INFO:tasks.workunit.client.1.vm08.stdout:8/488: dread de/d1d/d4f/f5e [0,4194304] 0 2026-03-09T19:27:17.896 INFO:tasks.workunit.client.1.vm08.stdout:2/480: fsync d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f83 0 2026-03-09T19:27:17.904 INFO:tasks.workunit.client.0.vm07.stdout:1/249: fsync d1/d11/d37/d3f/d45/f26 0 2026-03-09T19:27:17.915 INFO:tasks.workunit.client.1.vm08.stdout:0/528: read dd/d22/d27/d2e/d37/f44 
[875579,79087] 0 2026-03-09T19:27:17.923 INFO:tasks.workunit.client.1.vm08.stdout:6/520: truncate d3/d34/d5c/fac 3298087 0 2026-03-09T19:27:17.923 INFO:tasks.workunit.client.1.vm08.stdout:5/477: rename d16/d1e/d6e/d84/c89 to d16/d1e/d30/d6f/c97 0 2026-03-09T19:27:17.932 INFO:tasks.workunit.client.0.vm07.stdout:9/309: truncate d0/f3 4743783 0 2026-03-09T19:27:17.933 INFO:tasks.workunit.client.0.vm07.stdout:9/310: write d0/d6/f20 [892461,27709] 0 2026-03-09T19:27:17.934 INFO:tasks.workunit.client.0.vm07.stdout:9/311: write d0/d6/fa [4090870,97084] 0 2026-03-09T19:27:17.939 INFO:tasks.workunit.client.1.vm08.stdout:6/521: dread d3/f6e [4194304,4194304] 0 2026-03-09T19:27:17.941 INFO:tasks.workunit.client.0.vm07.stdout:0/235: rmdir d0 39 2026-03-09T19:27:17.942 INFO:tasks.workunit.client.1.vm08.stdout:6/522: write d3/d94/fb5 [646025,94933] 0 2026-03-09T19:27:17.945 INFO:tasks.workunit.client.0.vm07.stdout:9/312: dread d0/db/d29/d2c/f54 [0,4194304] 0 2026-03-09T19:27:17.945 INFO:tasks.workunit.client.0.vm07.stdout:9/313: chown d0/d17/f4f 132607502 1 2026-03-09T19:27:17.946 INFO:tasks.workunit.client.0.vm07.stdout:9/314: write d0/d17/f5e [267736,120205] 0 2026-03-09T19:27:17.947 INFO:tasks.workunit.client.0.vm07.stdout:9/315: truncate d0/db/d29/d2c/d36/f71 540876 0 2026-03-09T19:27:17.955 INFO:tasks.workunit.client.0.vm07.stdout:3/309: dread d1/d1f/f13 [4194304,4194304] 0 2026-03-09T19:27:17.966 INFO:tasks.workunit.client.0.vm07.stdout:7/291: link d0/c28 d0/d4/d5/d8/d41/c65 0 2026-03-09T19:27:17.967 INFO:tasks.workunit.client.0.vm07.stdout:7/292: chown d0/d4/d5/d8/l2b 41238246 1 2026-03-09T19:27:17.971 INFO:tasks.workunit.client.0.vm07.stdout:2/321: mknod d3/dd/d16/d29/d3c/d4c/c6f 0 2026-03-09T19:27:17.974 INFO:tasks.workunit.client.1.vm08.stdout:3/587: truncate d0/d6/de/d1a/f5a 2495394 0 2026-03-09T19:27:17.975 INFO:tasks.workunit.client.1.vm08.stdout:9/507: getdents d0/d1b/d97/d48/d5d 0 2026-03-09T19:27:17.975 INFO:tasks.workunit.client.1.vm08.stdout:0/529: stat 
dd/d22/f3e 0
2026-03-09T19:27:17.976 INFO:tasks.workunit.client.1.vm08.stdout:3/588: readlink d0/d6/de/d1b/l8e 0
2026-03-09T19:27:17.977 INFO:tasks.workunit.client.1.vm08.stdout:3/589: fsync d0/d6/de/d15/fa4 0
2026-03-09T19:27:17.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:17 vm07.local ceph-mon[48545]: pgmap v163: 65 pgs: 65 active+clean; 1.5 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 29 MiB/s rd, 123 MiB/s wr, 242 op/s
2026-03-09T19:27:17.978 INFO:tasks.workunit.client.0.vm07.stdout:6/248: link d0/d1/db/d24/d53/d31/l3d d0/d13/l68 0
2026-03-09T19:27:17.982 INFO:tasks.workunit.client.1.vm08.stdout:7/590: getdents d5/d14/d27 0
2026-03-09T19:27:17.990 INFO:tasks.workunit.client.0.vm07.stdout:9/316: mkdir d0/d6/d73 0
2026-03-09T19:27:17.994 INFO:tasks.workunit.client.1.vm08.stdout:4/522: truncate da/d10/d16/d28/d46/d52/d6e/d2c/f4a 1051494 0
2026-03-09T19:27:17.995 INFO:tasks.workunit.client.0.vm07.stdout:5/249: creat d3/dd/f58 x:0 0 0
2026-03-09T19:27:17.996 INFO:tasks.workunit.client.0.vm07.stdout:5/250: write d3/d1a/d28/d48/f4f [3551987,128804] 0
2026-03-09T19:27:17.999 INFO:tasks.workunit.client.1.vm08.stdout:2/481: symlink d3/d4/lac 0
2026-03-09T19:27:18.016 INFO:tasks.workunit.client.0.vm07.stdout:7/293: mknod d0/d4/d5/d26/d32/c66 0
2026-03-09T19:27:18.018 INFO:tasks.workunit.client.0.vm07.stdout:7/294: truncate d0/d4/d5/f50 90112 0
2026-03-09T19:27:18.018 INFO:tasks.workunit.client.1.vm08.stdout:0/530: dread fc [0,4194304] 0
2026-03-09T19:27:18.018 INFO:tasks.workunit.client.0.vm07.stdout:7/295: chown d0/d4/d5/d8/l2b 19426 1
2026-03-09T19:27:18.025 INFO:tasks.workunit.client.1.vm08.stdout:0/531: dread dd/d22/d27/f3f [0,4194304] 0
2026-03-09T19:27:18.036 INFO:tasks.workunit.client.1.vm08.stdout:5/478: creat d16/d1e/d30/d8a/f98 x:0 0 0
2026-03-09T19:27:18.037 INFO:tasks.workunit.client.1.vm08.stdout:0/532: dwrite dd/d22/d24/d49/d92/fa7 [4194304,4194304] 0
2026-03-09T19:27:18.037 INFO:tasks.workunit.client.0.vm07.stdout:4/244: rmdir d3/d11/d2b/d37/d4b 0
2026-03-09T19:27:18.041 INFO:tasks.workunit.client.1.vm08.stdout:1/657: creat d9/da/d53/d67/fc4 x:0 0 0
2026-03-09T19:27:18.041 INFO:tasks.workunit.client.1.vm08.stdout:6/523: fdatasync d3/d34/d6f/f50 0
2026-03-09T19:27:18.042 INFO:tasks.workunit.client.1.vm08.stdout:0/533: stat dd/d22/d24/d49/d50/f95 0
2026-03-09T19:27:18.048 INFO:tasks.workunit.client.1.vm08.stdout:1/658: read d9/da/d12/f72 [1864954,19374] 0
2026-03-09T19:27:18.051 INFO:tasks.workunit.client.0.vm07.stdout:6/249: chown d0/d1/d28/f2b 21 1
2026-03-09T19:27:18.055 INFO:tasks.workunit.client.1.vm08.stdout:9/508: dwrite d0/d1b/d97/d48/d5e/f6e [0,4194304] 0
2026-03-09T19:27:18.055 INFO:tasks.workunit.client.0.vm07.stdout:6/250: chown d0/d2d/l61 95 1
2026-03-09T19:27:18.055 INFO:tasks.workunit.client.0.vm07.stdout:6/251: dwrite d0/d13/f26 [4194304,4194304] 0
2026-03-09T19:27:18.056 INFO:tasks.workunit.client.1.vm08.stdout:9/509: chown d0/d1b/d97/d48/c4f 162552 1
2026-03-09T19:27:18.057 INFO:tasks.workunit.client.1.vm08.stdout:1/659: dwrite d9/d11/db6/fbe [0,4194304] 0
2026-03-09T19:27:18.059 INFO:tasks.workunit.client.0.vm07.stdout:6/252: dwrite d0/d1/f5a [0,4194304] 0
2026-03-09T19:27:18.061 INFO:tasks.workunit.client.1.vm08.stdout:9/510: read d0/d1b/d97/d48/d5e/f6e [3698379,106934] 0
2026-03-09T19:27:18.063 INFO:tasks.workunit.client.1.vm08.stdout:1/660: dread - d9/d11/d7a/d89/fb7 zero size
2026-03-09T19:27:18.072 INFO:tasks.workunit.client.1.vm08.stdout:7/591: mknod d5/d14/d27/d54/cc8 0
2026-03-09T19:27:18.076 INFO:tasks.workunit.client.1.vm08.stdout:8/489: rmdir de/da4 0
2026-03-09T19:27:18.087 INFO:tasks.workunit.client.0.vm07.stdout:3/310: mknod d1/d1f/d16/c59 0
2026-03-09T19:27:18.091 INFO:tasks.workunit.client.0.vm07.stdout:3/311: dwrite d1/d6/dd/f57 [0,4194304] 0
2026-03-09T19:27:18.091 INFO:tasks.workunit.client.1.vm08.stdout:4/523: fdatasync da/d10/d16/d28/d46/d52/d6e/d40/f41 0
2026-03-09T19:27:18.096 INFO:tasks.workunit.client.0.vm07.stdout:3/312: dwrite d1/d1f/d16/f1e [0,4194304] 0
2026-03-09T19:27:18.114 INFO:tasks.workunit.client.0.vm07.stdout:8/299: link d7/d9/d37/d45/d56/d62/f64 d7/d9/d37/d45/f73 0
2026-03-09T19:27:18.117 INFO:tasks.workunit.client.0.vm07.stdout:4/245: mkdir d3/d11/d29/d34/d50 0
2026-03-09T19:27:18.117 INFO:tasks.workunit.client.1.vm08.stdout:0/534: creat dd/d31/fac x:0 0 0
2026-03-09T19:27:18.120 INFO:tasks.workunit.client.0.vm07.stdout:4/246: read d3/d11/d2b/d37/f25 [337844,36120] 0
2026-03-09T19:27:18.154 INFO:tasks.workunit.client.0.vm07.stdout:9/317: mknod d0/d6/d73/c74 0
2026-03-09T19:27:18.155 INFO:tasks.workunit.client.0.vm07.stdout:9/318: truncate d0/db/d29/d2c/d36/f62 625893 0
2026-03-09T19:27:18.161 INFO:tasks.workunit.client.1.vm08.stdout:9/511: symlink d0/d2/d14/d98/lad 0
2026-03-09T19:27:18.162 INFO:tasks.workunit.client.1.vm08.stdout:7/592: creat d5/d14/d27/fc9 x:0 0 0
2026-03-09T19:27:18.163 INFO:tasks.workunit.client.0.vm07.stdout:5/251: creat d3/dd/d26/d3f/d47/d56/f59 x:0 0 0
2026-03-09T19:27:18.164 INFO:tasks.workunit.client.0.vm07.stdout:5/252: write d3/d1a/d28/d48/f4f [2368866,130492] 0
2026-03-09T19:27:18.164 INFO:tasks.workunit.client.0.vm07.stdout:5/253: chown d3/d1a/f12 13121955 1
2026-03-09T19:27:18.165 INFO:tasks.workunit.client.0.vm07.stdout:5/254: readlink d3/d1a/l11 0
2026-03-09T19:27:18.169 INFO:tasks.workunit.client.0.vm07.stdout:1/250: getdents d1/d11/d37/d3f 0
2026-03-09T19:27:18.180 INFO:tasks.workunit.client.1.vm08.stdout:4/524: rmdir da/d10/d26/d27/d32 39
2026-03-09T19:27:18.180 INFO:tasks.workunit.client.0.vm07.stdout:8/300: rename d7/d9/f60 to d7/d30/d32/f74 0
2026-03-09T19:27:18.180 INFO:tasks.workunit.client.1.vm08.stdout:3/590: fdatasync d0/d6/de/d1b/d16/d17/f1d 0
2026-03-09T19:27:18.180 INFO:tasks.workunit.client.0.vm07.stdout:8/301: write d7/d9/d10/f1b [242548,10765] 0
2026-03-09T19:27:18.189 INFO:tasks.workunit.client.1.vm08.stdout:6/524: write d3/d34/d6f/f2f [5377467,118479] 0
2026-03-09T19:27:18.192 INFO:tasks.workunit.client.1.vm08.stdout:0/535: rmdir dd/d22/d24/d49/d50 39
2026-03-09T19:27:18.207 INFO:tasks.workunit.client.0.vm07.stdout:5/255: rmdir d3/d1a/d28/d48 39
2026-03-09T19:27:18.207 INFO:tasks.workunit.client.0.vm07.stdout:2/322: link d3/dd/d16/d29/d2d/d45/l52 d3/l70 0
2026-03-09T19:27:18.207 INFO:tasks.workunit.client.1.vm08.stdout:7/593: fsync d5/d14/d2b/fb0 0
2026-03-09T19:27:18.208 INFO:tasks.workunit.client.0.vm07.stdout:1/251: mkdir d1/d3e/d5c 0
2026-03-09T19:27:18.208 INFO:tasks.workunit.client.0.vm07.stdout:5/256: read d3/dd/d26/d2d/f54 [322700,83920] 0
2026-03-09T19:27:18.210 INFO:tasks.workunit.client.0.vm07.stdout:5/257: chown d3/d1a/d28 36 1
2026-03-09T19:27:18.210 INFO:tasks.workunit.client.1.vm08.stdout:4/525: chown da/d10/d16/d28/d2f/d4f/d64/d84/d8a/da2 16507721 1
2026-03-09T19:27:18.211 INFO:tasks.workunit.client.1.vm08.stdout:3/591: rename d0/d6/de/d15/f53 to d0/d6/de/d6e/d51/fbb 0
2026-03-09T19:27:18.211 INFO:tasks.workunit.client.1.vm08.stdout:3/592: readlink d0/d4b/l73 0
2026-03-09T19:27:18.212 INFO:tasks.workunit.client.1.vm08.stdout:6/525: symlink d3/d68/d7e/lb8 0
2026-03-09T19:27:18.213 INFO:tasks.workunit.client.0.vm07.stdout:2/323: dwrite d3/f22 [0,4194304] 0
2026-03-09T19:27:18.215 INFO:tasks.workunit.client.0.vm07.stdout:4/247: mkdir d3/d11/d51 0
2026-03-09T19:27:18.217 INFO:tasks.workunit.client.0.vm07.stdout:5/258: dwrite d3/d1a/f12 [0,4194304] 0
2026-03-09T19:27:18.218 INFO:tasks.workunit.client.1.vm08.stdout:4/526: truncate da/d10/d16/d28/f8d 895435 0
2026-03-09T19:27:18.219 INFO:tasks.workunit.client.1.vm08.stdout:7/594: dread d5/d14/d2b/f9f [0,4194304] 0
2026-03-09T19:27:18.232 INFO:tasks.workunit.client.1.vm08.stdout:3/593: read - d0/d52/f8a zero size
2026-03-09T19:27:18.233 INFO:tasks.workunit.client.1.vm08.stdout:3/594: chown d0/d6/de/d15/c85 99 1
2026-03-09T19:27:18.235 INFO:tasks.workunit.client.1.vm08.stdout:4/527: dwrite da/d10/f25 [4194304,4194304] 0
2026-03-09T19:27:18.239 INFO:tasks.workunit.client.1.vm08.stdout:3/595: read d0/d6/de/d1a/f5a [1582573,58979] 0
2026-03-09T19:27:18.247 INFO:tasks.workunit.client.0.vm07.stdout:1/252: rename d1/d11/d37/d3f/d45/d30 to d1/d11/d37/d5d 0
2026-03-09T19:27:18.259 INFO:tasks.workunit.client.1.vm08.stdout:9/512: creat d0/d2/fae x:0 0 0
2026-03-09T19:27:18.260 INFO:tasks.workunit.client.0.vm07.stdout:2/324: symlink d3/dd/d16/d29/d2d/l71 0
2026-03-09T19:27:18.261 INFO:tasks.workunit.client.0.vm07.stdout:4/248: creat d3/d11/d29/f52 x:0 0 0
2026-03-09T19:27:18.261 INFO:tasks.workunit.client.0.vm07.stdout:3/313: link d1/d1f/c17 d1/d6/dd/d51/c5a 0
2026-03-09T19:27:18.264 INFO:tasks.workunit.client.1.vm08.stdout:3/596: creat d0/d6/de/d1b/d16/d17/fbc x:0 0 0
2026-03-09T19:27:18.274 INFO:tasks.workunit.client.0.vm07.stdout:4/249: dwrite d3/f1a [0,4194304] 0
2026-03-09T19:27:18.274 INFO:tasks.workunit.client.1.vm08.stdout:6/526: rename d3/d34/d3b/f99 to d3/d68/fb9 0
2026-03-09T19:27:18.274 INFO:tasks.workunit.client.1.vm08.stdout:6/527: write d3/d34/d6f/f2f [4834304,17539] 0
2026-03-09T19:27:18.274 INFO:tasks.workunit.client.1.vm08.stdout:7/595: creat d5/d14/dae/d3a/fca x:0 0 0
2026-03-09T19:27:18.276 INFO:tasks.workunit.client.1.vm08.stdout:9/513: dread d0/d2/d14/d98/f40 [0,4194304] 0
2026-03-09T19:27:18.280 INFO:tasks.workunit.client.0.vm07.stdout:2/325: rename d3/dd/d16/d29/d3c/l50 to d3/dd/d16/d29/d2d/d45/d3b/d44/l72 0
2026-03-09T19:27:18.280 INFO:tasks.workunit.client.0.vm07.stdout:1/253: mknod d1/db/d31/d56/c5e 0
2026-03-09T19:27:18.280 INFO:tasks.workunit.client.1.vm08.stdout:3/597: rmdir d0/d52/d6d/d77/d88 39
2026-03-09T19:27:18.281 INFO:tasks.workunit.client.1.vm08.stdout:3/598: chown d0/d52/c62 8054 1
2026-03-09T19:27:18.281 INFO:tasks.workunit.client.1.vm08.stdout:3/599: fsync d0/d4b/fb0 0
2026-03-09T19:27:18.288 INFO:tasks.workunit.client.1.vm08.stdout:3/600: dread d0/d6/de/d15/d96/fa0 [0,4194304] 0
2026-03-09T19:27:18.290 INFO:tasks.workunit.client.1.vm08.stdout:6/528: mknod d3/db/d43/cba 0
2026-03-09T19:27:18.290 INFO:tasks.workunit.client.0.vm07.stdout:3/314: mknod d1/d6/d45/d54/c5b 0
2026-03-09T19:27:18.297 INFO:tasks.workunit.client.1.vm08.stdout:3/601: fdatasync d0/d8/f66 0
2026-03-09T19:27:18.297 INFO:tasks.workunit.client.0.vm07.stdout:3/315: stat d1/f2a 0
2026-03-09T19:27:18.303 INFO:tasks.workunit.client.1.vm08.stdout:6/529: creat d3/d68/d7e/fbb x:0 0 0
2026-03-09T19:27:18.328 INFO:tasks.workunit.client.1.vm08.stdout:3/602: write d0/d52/d6d/d77/d88/faf [32949,22477] 0
2026-03-09T19:27:18.330 INFO:tasks.workunit.client.0.vm07.stdout:9/319: dread d0/f56 [0,4194304] 0
2026-03-09T19:27:18.330 INFO:tasks.workunit.client.1.vm08.stdout:3/603: dread - d0/d6/de/d6e/d51/fb5 zero size
2026-03-09T19:27:18.334 INFO:tasks.workunit.client.0.vm07.stdout:5/259: getdents d3/d1a 0
2026-03-09T19:27:18.339 INFO:tasks.workunit.client.1.vm08.stdout:6/530: mkdir d3/dbc 0
2026-03-09T19:27:18.353 INFO:tasks.workunit.client.1.vm08.stdout:6/531: dwrite d3/d15/f2b [0,4194304] 0
2026-03-09T19:27:18.354 INFO:tasks.workunit.client.0.vm07.stdout:3/316: read - d1/d6/dd/f3b zero size
2026-03-09T19:27:18.354 INFO:tasks.workunit.client.0.vm07.stdout:1/254: rename d1/db/f3c to d1/d3/d21/f5f 0
2026-03-09T19:27:18.354 INFO:tasks.workunit.client.0.vm07.stdout:2/326: link d3/fc d3/dd/f73 0
2026-03-09T19:27:18.354 INFO:tasks.workunit.client.0.vm07.stdout:5/260: unlink d3/dd/l1d 0
2026-03-09T19:27:18.354 INFO:tasks.workunit.client.0.vm07.stdout:4/250: getdents d3/d11/d2b/d38 0
2026-03-09T19:27:18.354 INFO:tasks.workunit.client.0.vm07.stdout:1/255: dwrite d1/f4c [0,4194304] 0
2026-03-09T19:27:18.354 INFO:tasks.workunit.client.0.vm07.stdout:3/317: mkdir d1/d1f/d5c 0
2026-03-09T19:27:18.356 INFO:tasks.workunit.client.0.vm07.stdout:1/256: rename d1 to d1/db/d60 22
2026-03-09T19:27:18.357 INFO:tasks.workunit.client.1.vm08.stdout:6/532: symlink d3/db/lbd 0
2026-03-09T19:27:18.357 INFO:tasks.workunit.client.0.vm07.stdout:1/257: stat d1/d11/d37 0
2026-03-09T19:27:18.358 INFO:tasks.workunit.client.0.vm07.stdout:1/258: fdatasync d1/d3/f12 0
2026-03-09T19:27:18.358 INFO:tasks.workunit.client.1.vm08.stdout:6/533: fdatasync d3/d34/d3b/f8d 0
2026-03-09T19:27:18.358 INFO:tasks.workunit.client.0.vm07.stdout:4/251: dread d3/f13 [0,4194304] 0
2026-03-09T19:27:18.362 INFO:tasks.workunit.client.0.vm07.stdout:4/252: readlink d3/d11/d16/l21 0
2026-03-09T19:27:18.366 INFO:tasks.workunit.client.0.vm07.stdout:3/318: dread d1/d1f/d16/f39 [0,4194304] 0
2026-03-09T19:27:18.370 INFO:tasks.workunit.client.0.vm07.stdout:1/259: dwrite d1/f1d [0,4194304] 0
2026-03-09T19:27:18.370 INFO:tasks.workunit.client.0.vm07.stdout:3/319: write d1/d6/dd/f2b [1920051,31789] 0
2026-03-09T19:27:18.378 INFO:tasks.workunit.client.1.vm08.stdout:6/534: symlink d3/d34/lbe 0
2026-03-09T19:27:18.380 INFO:tasks.workunit.client.1.vm08.stdout:6/535: chown d3/d34/d3b/f8d 35520515 1
2026-03-09T19:27:18.383 INFO:tasks.workunit.client.0.vm07.stdout:4/253: dwrite d3/f8 [4194304,4194304] 0
2026-03-09T19:27:18.384 INFO:tasks.workunit.client.0.vm07.stdout:4/254: chown d3/d11/f12 12635 1
2026-03-09T19:27:18.396 INFO:tasks.workunit.client.0.vm07.stdout:1/260: rename d1/d11/d37/c32 to d1/d11/c61 0
2026-03-09T19:27:18.396 INFO:tasks.workunit.client.0.vm07.stdout:2/327: dread d3/d11/f2e [0,4194304] 0
2026-03-09T19:27:18.397 INFO:tasks.workunit.client.0.vm07.stdout:1/261: chown d1/d3/d21/l2b 8119327 1
2026-03-09T19:27:18.399 INFO:tasks.workunit.client.0.vm07.stdout:1/262: dread d1/f2f [4194304,4194304] 0
2026-03-09T19:27:18.409 INFO:tasks.workunit.client.0.vm07.stdout:4/255: mknod d3/d11/d2b/d38/c53 0
2026-03-09T19:27:18.417 INFO:tasks.workunit.client.0.vm07.stdout:4/256: chown d3/d11/d29 646547 1
2026-03-09T19:27:18.419 INFO:tasks.workunit.client.0.vm07.stdout:2/328: rename d3/d11/d38/d5c to d3/dd/d16/d2f/d74 0
2026-03-09T19:27:18.420 INFO:tasks.workunit.client.0.vm07.stdout:2/329: stat d3/dd/d16/d30/d40/c4b 0
2026-03-09T19:27:18.421 INFO:tasks.workunit.client.1.vm08.stdout:6/536: mknod d3/d15/d8a/cbf 0
2026-03-09T19:27:18.422 INFO:tasks.workunit.client.1.vm08.stdout:6/537: chown d3/d34/d5c/da2/f72 370983 1
2026-03-09T19:27:18.422 INFO:tasks.workunit.client.1.vm08.stdout:6/538: stat d3/db 0
2026-03-09T19:27:18.425 INFO:tasks.workunit.client.0.vm07.stdout:7/296: dread d0/f25 [0,4194304] 0
2026-03-09T19:27:18.431 INFO:tasks.workunit.client.0.vm07.stdout:3/320: dread d1/d6/f21 [0,4194304] 0
2026-03-09T19:27:18.433 INFO:tasks.workunit.client.1.vm08.stdout:2/482: write d3/d9/d4a/f59 [4333148,71010] 0
2026-03-09T19:27:18.434 INFO:tasks.workunit.client.0.vm07.stdout:0/236: write d0/d6/d13/d17/f2b [347091,47158] 0
2026-03-09T19:27:18.434 INFO:tasks.workunit.client.0.vm07.stdout:9/320: getdents d0/d6/d73 0
2026-03-09T19:27:18.435 INFO:tasks.workunit.client.0.vm07.stdout:9/321: dread - d0/db/d29/d4d/f65 zero size
2026-03-09T19:27:18.438 INFO:tasks.workunit.client.1.vm08.stdout:2/483: write d3/d4/d23/d2c/d39/d5e/d14/f78 [1014825,2166] 0
2026-03-09T19:27:18.438 INFO:tasks.workunit.client.1.vm08.stdout:1/661: dwrite d9/da/dc/f78 [0,4194304] 0
2026-03-09T19:27:18.439 INFO:tasks.workunit.client.1.vm08.stdout:1/662: write d9/da/dc/fa5 [3787430,99890] 0
2026-03-09T19:27:18.460 INFO:tasks.workunit.client.0.vm07.stdout:6/253: truncate d0/f9 2265459 0
2026-03-09T19:27:18.462 INFO:tasks.workunit.client.0.vm07.stdout:1/263: creat d1/d11/d37/d5d/d50/f62 x:0 0 0
2026-03-09T19:27:18.462 INFO:tasks.workunit.client.1.vm08.stdout:5/479: write d16/f18 [3842415,83655] 0
2026-03-09T19:27:18.471 INFO:tasks.workunit.client.1.vm08.stdout:1/663: mkdir d9/da/d12/d91/dc5 0
2026-03-09T19:27:18.471 INFO:tasks.workunit.client.0.vm07.stdout:7/297: rename d0/f25 to d0/d52/d54/d55/f67 0
2026-03-09T19:27:18.482 INFO:tasks.workunit.client.1.vm08.stdout:6/539: rename d3/l1f to d3/d34/da9/lc0 0
2026-03-09T19:27:18.484 INFO:tasks.workunit.client.1.vm08.stdout:1/664: fdatasync d9/da/dc/f10 0
2026-03-09T19:27:18.484 INFO:tasks.workunit.client.0.vm07.stdout:6/254: mknod d0/d4e/c69 0
2026-03-09T19:27:18.485 INFO:tasks.workunit.client.0.vm07.stdout:1/264: creat d1/d11/d37/d5d/d50/f63 x:0 0 0
2026-03-09T19:27:18.486 INFO:tasks.workunit.client.1.vm08.stdout:2/484: rmdir d3/d9/d79/d9f 0
2026-03-09T19:27:18.487 INFO:tasks.workunit.client.0.vm07.stdout:4/257: dread d3/d11/d2b/d37/f28 [0,4194304] 0
2026-03-09T19:27:18.488 INFO:tasks.workunit.client.0.vm07.stdout:7/298: rename d0/d4/d5/d8/d1a/d2a/l46 to d0/d4/d5/d8/d41/d64/l68 0
2026-03-09T19:27:18.489 INFO:tasks.workunit.client.0.vm07.stdout:3/321: link d1/f7 d1/d6/d45/f5d 0
2026-03-09T19:27:18.490 INFO:tasks.workunit.client.0.vm07.stdout:0/237: mkdir d0/d6/d13/d1c/d52 0
2026-03-09T19:27:18.490 INFO:tasks.workunit.client.1.vm08.stdout:1/665: mknod d9/da/dc/cc6 0
2026-03-09T19:27:18.491 INFO:tasks.workunit.client.0.vm07.stdout:9/322: mknod d0/d6f/c75 0
2026-03-09T19:27:18.491 INFO:tasks.workunit.client.1.vm08.stdout:5/480: rmdir d16/d45/d87 0
2026-03-09T19:27:18.496 INFO:tasks.workunit.client.0.vm07.stdout:6/255: mkdir d0/d1/db/d52/d6a 0
2026-03-09T19:27:18.529 INFO:tasks.workunit.client.0.vm07.stdout:2/330: link d3/dd/f24 d3/dd/d16/d2f/f75 0
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:2/485: rename d3/d4/fd to d3/d4/d23/d2c/d39/d5e/de/d18/fad 0
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:1/666: mknod d9/d11/d7a/d89/d8d/daa/cc7 0
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:5/481: mkdir d16/d1e/d8c/d99 0
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:5/482: dread - d16/f56 zero size
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:6/540: link d3/d55/caa d3/cc1 0
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:1/667: symlink d9/d11/lc8 0
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:5/483: chown d16/d1e/d30/l4b 3840 1
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:6/541: mkdir d3/d15/dc2 0
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:5/484: mkdir d16/d1e/d6e/d84/d9a 0
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:5/485: dwrite d16/f18 [4194304,4194304] 0
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:1/668: getdents d9/da/d53/d67/d6c/d76/db2 0
2026-03-09T19:27:18.530 INFO:tasks.workunit.client.1.vm08.stdout:6/542: symlink d3/d34/lc3 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:4/258: write d3/d11/d2b/d37/f4d [1354139,65288] 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:1/265: rename d1/d11/d37/d5d/f4d to d1/db/d31/f64 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:4/259: truncate d3/d11/d2b/d37/f30 1038451 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:0/238: creat d0/d6/d13/d17/d19/f53 x:0 0 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:9/323: unlink d0/d6/d73/c74 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:1/266: dwrite d1/d11/d37/d3f/f4a [0,4194304] 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:9/324: chown d0/db/d29/d2c/d36/d5a 0 1
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:3/322: rename d1/f2a to d1/d3d/f5e 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:0/239: rename d0/d6/d13/d1c to d0/d6/d13/d1c/d52/d54 22
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:7/299: symlink d0/d52/d54/d5a/l69 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:7/300: write d0/d4/d5/d8/d1a/f4d [1851235,69027] 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:4/260: mknod d3/d11/d29/c54 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:2/331: mknod d3/dd/d16/d29/d2d/d45/d3b/d53/c76 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:1/267: fdatasync d1/f6 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:1/268: chown d1/d3/d52 6 1
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:1/269: chown d1/d11/d37/d3f/d45/l1e 23670132 1
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:1/270: write d1/d3e/f49 [1481374,97750] 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:1/271: write d1/d11/d37/f2c [968877,25429] 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.0.vm07.stdout:3/323: rename d1/l3 to d1/d1f/d5c/l5f 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.1.vm08.stdout:1/669: symlink d9/da/d53/lc9 0
2026-03-09T19:27:18.531 INFO:tasks.workunit.client.1.vm08.stdout:1/670: truncate d9/da/d95/fc0 535194 0
2026-03-09T19:27:18.532 INFO:tasks.workunit.client.1.vm08.stdout:1/671: chown d9/da/d53/d67/f77 184567 1
2026-03-09T19:27:18.533 INFO:tasks.workunit.client.0.vm07.stdout:2/332: truncate d3/d11/f31 2142613 0
2026-03-09T19:27:18.533 INFO:tasks.workunit.client.1.vm08.stdout:6/543: mkdir d3/d15/d8a/dc4 0
2026-03-09T19:27:18.535 INFO:tasks.workunit.client.1.vm08.stdout:6/544: truncate d3/db/d43/d69/da0/fa8 909123 0
2026-03-09T19:27:18.536 INFO:tasks.workunit.client.1.vm08.stdout:6/545: chown d3/db/d43/d69/fa3 2 1
2026-03-09T19:27:18.539 INFO:tasks.workunit.client.1.vm08.stdout:5/486: mkdir d16/d1e/d9b 0
2026-03-09T19:27:18.546 INFO:tasks.workunit.client.1.vm08.stdout:5/487: unlink d16/d1e/d3b/f82 0
2026-03-09T19:27:18.554 INFO:tasks.workunit.client.0.vm07.stdout:7/301: rename d0/f22 to d0/d52/d54/f6a 0
2026-03-09T19:27:18.569 INFO:tasks.workunit.client.1.vm08.stdout:6/546: link d3/d34/d6f/l38 d3/d94/lc5 0
2026-03-09T19:27:18.573 INFO:tasks.workunit.client.1.vm08.stdout:6/547: readlink d3/l5b 0
2026-03-09T19:27:18.573 INFO:tasks.workunit.client.1.vm08.stdout:8/490: write de/d1d/f59 [120505,47683] 0
2026-03-09T19:27:18.575 INFO:tasks.workunit.client.1.vm08.stdout:0/536: sync
2026-03-09T19:27:18.575 INFO:tasks.workunit.client.0.vm07.stdout:2/333: sync
2026-03-09T19:27:18.590 INFO:tasks.workunit.client.1.vm08.stdout:6/548: dread d3/f2a [0,4194304] 0
2026-03-09T19:27:18.603 INFO:tasks.workunit.client.0.vm07.stdout:8/302: truncate d7/d16/d1e/f33 3543475 0
2026-03-09T19:27:18.604 INFO:tasks.workunit.client.0.vm07.stdout:2/334: symlink d3/dd/d16/d29/d2d/d45/d3b/l77 0
2026-03-09T19:27:18.604 INFO:tasks.workunit.client.1.vm08.stdout:0/537: truncate dd/d22/d24/f60 114890 0
2026-03-09T19:27:18.604 INFO:tasks.workunit.client.1.vm08.stdout:7/596: write d5/d14/d27/d54/f75 [642479,106495] 0
2026-03-09T19:27:18.613 INFO:tasks.workunit.client.0.vm07.stdout:4/261: rename d3/l4 to d3/d11/d29/d34/d50/l55 0
2026-03-09T19:27:18.614 INFO:tasks.workunit.client.1.vm08.stdout:4/528: dwrite da/d10/d16/d28/d2f/d4f/d64/f6f [0,4194304] 0
2026-03-09T19:27:18.615 INFO:tasks.workunit.client.1.vm08.stdout:9/514: dwrite d0/d2/d14/f28 [0,4194304] 0
2026-03-09T19:27:18.619 INFO:tasks.workunit.client.0.vm07.stdout:4/262: dwrite d3/d11/d2b/f49 [0,4194304] 0
2026-03-09T19:27:18.631 INFO:tasks.workunit.client.1.vm08.stdout:4/529: dwrite da/d10/d26/d38/f93 [0,4194304] 0
2026-03-09T19:27:18.639 INFO:tasks.workunit.client.0.vm07.stdout:9/325: dread d0/d17/f5e [0,4194304] 0
2026-03-09T19:27:18.640 INFO:tasks.workunit.client.0.vm07.stdout:9/326: write d0/d6/ff [2472228,57254] 0
2026-03-09T19:27:18.640 INFO:tasks.workunit.client.1.vm08.stdout:4/530: chown da/d10/d16/d28/d46/d52/d6e/d2c/c4c 19413 1
2026-03-09T19:27:18.641 INFO:tasks.workunit.client.0.vm07.stdout:9/327: chown d0/c5b 93 1
2026-03-09T19:27:18.641 INFO:tasks.workunit.client.0.vm07.stdout:9/328: readlink d0/db/d29/l2d 0
2026-03-09T19:27:18.646 INFO:tasks.workunit.client.0.vm07.stdout:3/324: getdents d1/d6/dd/d51 0
2026-03-09T19:27:18.648 INFO:tasks.workunit.client.0.vm07.stdout:7/302: dread d0/d4/d5/d8/d1a/f4d [0,4194304] 0
2026-03-09T19:27:18.680 INFO:tasks.workunit.client.1.vm08.stdout:3/604: write d0/d6/de/d1b/d16/d17/f94 [249653,64134] 0
2026-03-09T19:27:18.698 INFO:tasks.workunit.client.0.vm07.stdout:9/329: mknod d0/db/d29/d68/c76 0
2026-03-09T19:27:18.699 INFO:tasks.workunit.client.0.vm07.stdout:9/330: chown d0/db/l1b 31 1
2026-03-09T19:27:18.702 INFO:tasks.workunit.client.0.vm07.stdout:9/331: dwrite d0/db/f41 [0,4194304] 0
2026-03-09T19:27:18.715 INFO:tasks.workunit.client.0.vm07.stdout:3/325: creat d1/d6/f60 x:0 0 0
2026-03-09T19:27:18.717 INFO:tasks.workunit.client.0.vm07.stdout:5/261: dwrite d3/f18 [0,4194304] 0
2026-03-09T19:27:18.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:18 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:27:18.746 INFO:tasks.workunit.client.0.vm07.stdout:7/303: mknod d0/d4/c6b 0
2026-03-09T19:27:18.749 INFO:tasks.workunit.client.0.vm07.stdout:7/304: dwrite d0/d4/d5/d26/f42 [0,4194304] 0
2026-03-09T19:27:18.755 INFO:tasks.workunit.client.0.vm07.stdout:7/305: sync
2026-03-09T19:27:18.756 INFO:tasks.workunit.client.0.vm07.stdout:7/306: dread - d0/d4/d5/d26/d3c/f63 zero size
2026-03-09T19:27:18.756 INFO:tasks.workunit.client.0.vm07.stdout:2/335: mknod d3/c78 0
2026-03-09T19:27:18.757 INFO:tasks.workunit.client.0.vm07.stdout:2/336: chown d3/c78 1711541 1
2026-03-09T19:27:18.757 INFO:tasks.workunit.client.0.vm07.stdout:2/337: stat d3/dd/d16/d30/c5e 0
2026-03-09T19:27:18.779 INFO:tasks.workunit.client.0.vm07.stdout:9/332: rmdir d0/db 39
2026-03-09T19:27:18.785 INFO:tasks.workunit.client.0.vm07.stdout:6/256: dread d0/f9 [0,4194304] 0
2026-03-09T19:27:18.787 INFO:tasks.workunit.client.0.vm07.stdout:6/257: sync
2026-03-09T19:27:18.787 INFO:tasks.workunit.client.0.vm07.stdout:5/262: mkdir d3/d1a/d5a 0
2026-03-09T19:27:18.788 INFO:tasks.workunit.client.0.vm07.stdout:5/263: truncate d3/dd/f58 615263 0
2026-03-09T19:27:18.809 INFO:tasks.workunit.client.0.vm07.stdout:3/326: creat d1/d6/d4c/f61 x:0 0 0
2026-03-09T19:27:18.809 INFO:tasks.workunit.client.0.vm07.stdout:3/327: chown d1/d1f/d16 17 1
2026-03-09T19:27:18.813 INFO:tasks.workunit.client.0.vm07.stdout:3/328: dread d1/d6/f21 [0,4194304] 0
2026-03-09T19:27:18.816 INFO:tasks.workunit.client.0.vm07.stdout:5/264: rename d3/d1a/c20 to d3/dd/d26/d3f/d47/c5b 0
2026-03-09T19:27:18.818 INFO:tasks.workunit.client.0.vm07.stdout:3/329: dwrite d1/d6/dd/f57 [0,4194304] 0
2026-03-09T19:27:18.819 INFO:tasks.workunit.client.0.vm07.stdout:3/330: chown d1/d6/dd/l18 8275 1
2026-03-09T19:27:18.819 INFO:tasks.workunit.client.1.vm08.stdout:2/486: write d3/d4/f55 [1016286,39191] 0
2026-03-09T19:27:18.819 INFO:tasks.workunit.client.0.vm07.stdout:3/331: chown d1/d26/c27 79140 1
2026-03-09T19:27:18.830 INFO:tasks.workunit.client.0.vm07.stdout:8/303: getdents d7/d9/d37/d34 0
2026-03-09T19:27:18.833 INFO:tasks.workunit.client.0.vm07.stdout:8/304: truncate d7/d9/d10/d44/f6c 484550 0
2026-03-09T19:27:18.834 INFO:tasks.workunit.client.1.vm08.stdout:6/549: symlink d3/d68/lc6 0
2026-03-09T19:27:18.836 INFO:tasks.workunit.client.0.vm07.stdout:1/272: truncate d1/f6 1390519 0
2026-03-09T19:27:18.838 INFO:tasks.workunit.client.1.vm08.stdout:0/538: rmdir dd/d22/d27/d6c 39
2026-03-09T19:27:18.843 INFO:tasks.workunit.client.1.vm08.stdout:6/550: write d3/db/d43/d69/fb1 [198972,100572] 0
2026-03-09T19:27:18.843 INFO:tasks.workunit.client.1.vm08.stdout:0/539: read dd/d22/d24/d49/d92/fa7 [8041249,14925] 0
2026-03-09T19:27:18.844 INFO:tasks.workunit.client.0.vm07.stdout:8/305: sync
2026-03-09T19:27:18.846 INFO:tasks.workunit.client.0.vm07.stdout:0/240: truncate d0/f1e 286542 0
2026-03-09T19:27:18.846 INFO:tasks.workunit.client.1.vm08.stdout:5/488: write d16/d1e/d30/f70 [107850,48973] 0
2026-03-09T19:27:18.847 INFO:tasks.workunit.client.0.vm07.stdout:8/306: readlink d7/d9/d37/l47 0
2026-03-09T19:27:18.847 INFO:tasks.workunit.client.1.vm08.stdout:1/672: dwrite d9/da/d12/f5c [0,4194304] 0
2026-03-09T19:27:18.848 INFO:tasks.workunit.client.0.vm07.stdout:8/307: dread - d7/d9/d37/d45/d56/f5f zero size
2026-03-09T19:27:18.850 INFO:tasks.workunit.client.0.vm07.stdout:3/332: creat d1/d3d/d47/f62 x:0 0 0
2026-03-09T19:27:18.853 INFO:tasks.workunit.client.1.vm08.stdout:1/673: fdatasync d9/da/dc/f78 0
2026-03-09T19:27:18.853 INFO:tasks.workunit.client.0.vm07.stdout:3/333: stat d1/d1f/d16/c59 0
2026-03-09T19:27:18.853 INFO:tasks.workunit.client.0.vm07.stdout:5/265: dwrite d3/dd/d26/d3f/f4b [0,4194304] 0
2026-03-09T19:27:18.855 INFO:tasks.workunit.client.1.vm08.stdout:7/597: mkdir d5/d14/dae/d1c/d83/d9c/dcb 0
2026-03-09T19:27:18.855 INFO:tasks.workunit.client.1.vm08.stdout:9/515: creat d0/d2/d14/d98/faf x:0 0 0
2026-03-09T19:27:18.855 INFO:tasks.workunit.client.0.vm07.stdout:7/307: creat d0/f6c x:0 0 0
2026-03-09T19:27:18.859 INFO:tasks.workunit.client.0.vm07.stdout:2/338: dread d3/dd/d16/d2f/f75 [0,4194304] 0
2026-03-09T19:27:18.874 INFO:tasks.workunit.client.1.vm08.stdout:5/489: dwrite d16/d1e/d30/d6f/f8f [0,4194304] 0
2026-03-09T19:27:18.876 INFO:tasks.workunit.client.0.vm07.stdout:1/273: mknod d1/d11/c65 0
2026-03-09T19:27:18.879 INFO:tasks.workunit.client.1.vm08.stdout:5/490: write d16/d1e/d30/d6f/f8f [1418683,91857] 0
2026-03-09T19:27:18.880 INFO:tasks.workunit.client.1.vm08.stdout:2/487: unlink d3/d4/d23/d2c/d39/l4b 0
2026-03-09T19:27:18.886 INFO:tasks.workunit.client.0.vm07.stdout:4/263: dwrite d3/d11/d2b/f2c [0,4194304] 0
2026-03-09T19:27:18.898 INFO:tasks.workunit.client.1.vm08.stdout:6/551: creat d3/d34/da9/fc7 x:0 0 0
2026-03-09T19:27:18.909 INFO:tasks.workunit.client.0.vm07.stdout:5/266: mknod d3/d1a/d28/d36/c5c 0
2026-03-09T19:27:18.917 INFO:tasks.workunit.client.0.vm07.stdout:7/308: rename d0/d4/f33 to d0/d52/d54/d55/f6d 0
2026-03-09T19:27:18.946 INFO:tasks.workunit.client.0.vm07.stdout:1/274: mknod d1/d3/d21/c66 0
2026-03-09T19:27:18.955 INFO:tasks.workunit.client.1.vm08.stdout:9/516: creat d0/d2/d14/d5c/fb0 x:0 0 0
2026-03-09T19:27:18.963 INFO:tasks.workunit.client.0.vm07.stdout:8/308: mkdir d7/d30/d75 0
2026-03-09T19:27:18.964 INFO:tasks.workunit.client.0.vm07.stdout:8/309: fdatasync d7/d9/d37/d34/f55 0
2026-03-09T19:27:18.969 INFO:tasks.workunit.client.0.vm07.stdout:4/264: mkdir d3/d4f/d56 0
2026-03-09T19:27:18.971 INFO:tasks.workunit.client.0.vm07.stdout:3/334: symlink d1/d1f/d16/d28/l63 0
2026-03-09T19:27:18.972 INFO:tasks.workunit.client.0.vm07.stdout:3/335: write d1/d6/d45/f56 [4216644,12241] 0
2026-03-09T19:27:18.978 INFO:tasks.workunit.client.0.vm07.stdout:9/333: dwrite d0/db/d29/d2c/f34 [0,4194304] 0
2026-03-09T19:27:18.979 INFO:tasks.workunit.client.1.vm08.stdout:8/491: write de/d1d/f1e [1468257,11719] 0
2026-03-09T19:27:18.980 INFO:tasks.workunit.client.0.vm07.stdout:3/336: dread d1/d6/d45/f56 [0,4194304] 0
2026-03-09T19:27:18.994 INFO:tasks.workunit.client.0.vm07.stdout:6/258: write d0/d1/db/d1d/f2e [1301683,13972] 0
2026-03-09T19:27:19.006 INFO:tasks.workunit.client.1.vm08.stdout:0/540: write dd/d22/d27/d2e/f51 [528119,22225] 0
2026-03-09T19:27:19.009 INFO:tasks.workunit.client.1.vm08.stdout:6/552: read d3/d15/f40 [2923025,116554] 0
2026-03-09T19:27:19.013 INFO:tasks.workunit.client.1.vm08.stdout:7/598: write d5/d14/f6c [1408642,34967] 0
2026-03-09T19:27:19.015 INFO:tasks.workunit.client.0.vm07.stdout:8/310: unlink d7/d50/c5c 0
2026-03-09T19:27:19.015 INFO:tasks.workunit.client.1.vm08.stdout:1/674: mknod d9/da/d17/d60/cca 0
2026-03-09T19:27:19.016 INFO:tasks.workunit.client.1.vm08.stdout:1/675: write d9/da/d12/d91/fb4 [237611,116821] 0
2026-03-09T19:27:19.020 INFO:tasks.workunit.client.0.vm07.stdout:4/265: creat d3/d11/d16/f57 x:0 0 0
2026-03-09T19:27:19.029 INFO:tasks.workunit.client.0.vm07.stdout:0/241: dwrite d0/fa [0,4194304] 0
2026-03-09T19:27:19.029 INFO:tasks.workunit.client.1.vm08.stdout:3/605: creat d0/d6/de/d6e/fbd x:0 0 0
2026-03-09T19:27:19.029 INFO:tasks.workunit.client.1.vm08.stdout:2/488: symlink d3/d4/d23/d2c/d39/d5e/de/d18/d99/lae 0
2026-03-09T19:27:19.029 INFO:tasks.workunit.client.1.vm08.stdout:4/531: dwrite da/d10/d26/d27/d32/f39 [0,4194304] 0
2026-03-09T19:27:19.039 INFO:tasks.workunit.client.0.vm07.stdout:3/337: creat d1/d1f/d16/d28/f64 x:0 0 0
2026-03-09T19:27:19.055 INFO:tasks.workunit.client.1.vm08.stdout:5/491: rename c4 to d16/d1e/d6e/d84/d9a/c9c 0
2026-03-09T19:27:19.056 INFO:tasks.workunit.client.0.vm07.stdout:3/338: dread d1/d1f/f13 [0,4194304] 0
2026-03-09T19:27:19.066 INFO:tasks.workunit.client.1.vm08.stdout:7/599: unlink d5/d14/dae/d1c/d73/fb3 0
2026-03-09T19:27:19.069 INFO:tasks.workunit.client.0.vm07.stdout:7/309: symlink d0/d4/d5/l6e 0
2026-03-09T19:27:19.070 INFO:tasks.workunit.client.0.vm07.stdout:2/339: link d3/dd/d16/d29/d2d/l35 d3/dd/d16/d29/d2d/d45/d3b/l79 0
2026-03-09T19:27:19.073 INFO:tasks.workunit.client.1.vm08.stdout:3/606: rmdir d0/d6/de/d54 39
2026-03-09T19:27:19.077 INFO:tasks.workunit.client.0.vm07.stdout:4/266: creat d3/d11/d16/d2f/d22/f58 x:0 0 0
2026-03-09T19:27:19.079 INFO:tasks.workunit.client.0.vm07.stdout:0/242: mknod d0/d6/c55 0
2026-03-09T19:27:19.080 INFO:tasks.workunit.client.0.vm07.stdout:4/267: dwrite d3/f8 [0,4194304] 0
2026-03-09T19:27:19.081 INFO:tasks.workunit.client.0.vm07.stdout:9/334: fsync d0/f3 0
2026-03-09T19:27:19.082 INFO:tasks.workunit.client.1.vm08.stdout:7/600: symlink d5/d14/d27/d54/lcc 0
2026-03-09T19:27:19.086 INFO:tasks.workunit.client.0.vm07.stdout:6/259: rename d0/d2d/l42 to d0/d13/l6b 0
2026-03-09T19:27:19.091 INFO:tasks.workunit.client.1.vm08.stdout:9/517: getdents d0/d2/d8 0
2026-03-09T19:27:19.092 INFO:tasks.workunit.client.1.vm08.stdout:3/607: truncate d0/d8/f5b 1037081 0
2026-03-09T19:27:19.093 INFO:tasks.workunit.client.1.vm08.stdout:3/608: write d0/d6/de/f86 [2044013,38582] 0
2026-03-09T19:27:19.094 INFO:tasks.workunit.client.1.vm08.stdout:7/601: creat d5/d14/dae/d1c/d83/d9c/dcb/fcd x:0 0 0
2026-03-09T19:27:19.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:18 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:27:19.094 INFO:tasks.workunit.client.0.vm07.stdout:4/268: dwrite d3/d11/f1e [0,4194304] 0
2026-03-09T19:27:19.095 INFO:tasks.workunit.client.0.vm07.stdout:4/269: write d3/d11/d29/f3c [578085,59784] 0
2026-03-09T19:27:19.095 INFO:tasks.workunit.client.0.vm07.stdout:9/335: dwrite d0/d6/d57/f58 [0,4194304] 0
2026-03-09T19:27:19.095 INFO:tasks.workunit.client.0.vm07.stdout:5/267: getdents d3/dd/d26/d3f 0
2026-03-09T19:27:19.100 INFO:tasks.workunit.client.1.vm08.stdout:6/553: read d3/d15/f19 [17454,80536] 0
2026-03-09T19:27:19.112 INFO:tasks.workunit.client.0.vm07.stdout:3/339: symlink d1/d1f/l65 0
2026-03-09T19:27:19.113 INFO:tasks.workunit.client.1.vm08.stdout:3/609: symlink d0/d6/dad/lbe 0
2026-03-09T19:27:19.120 INFO:tasks.workunit.client.0.vm07.stdout:7/310: unlink d0/d4/d5/d26/c44 0
2026-03-09T19:27:19.121 INFO:tasks.workunit.client.0.vm07.stdout:7/311: chown d0/d4/d5/d26/d3c/d39/l30 51 1
2026-03-09T19:27:19.121 INFO:tasks.workunit.client.1.vm08.stdout:6/554: symlink d3/d34/d3b/lc8 0
2026-03-09T19:27:19.121 INFO:tasks.workunit.client.0.vm07.stdout:7/312: readlink d0/d4/l48 0
2026-03-09T19:27:19.121 INFO:tasks.workunit.client.1.vm08.stdout:6/555: stat d3/d34/lc3 0
2026-03-09T19:27:19.121 INFO:tasks.workunit.client.0.vm07.stdout:7/313: readlink d0/d4/d5/d26/d3c/d58/l5c 0
2026-03-09T19:27:19.122 INFO:tasks.workunit.client.0.vm07.stdout:7/314: fdatasync d0/f6c 0
2026-03-09T19:27:19.122 INFO:tasks.workunit.client.1.vm08.stdout:6/556: stat d3/d15/d8a 0
2026-03-09T19:27:19.122 INFO:tasks.workunit.client.0.vm07.stdout:7/315: fsync d0/d4/d5/d8/f35 0
2026-03-09T19:27:19.123 INFO:tasks.workunit.client.1.vm08.stdout:9/518: mknod d0/d2/cb1 0
2026-03-09T19:27:19.124 INFO:tasks.workunit.client.0.vm07.stdout:9/336: readlink d0/l1 0
2026-03-09T19:27:19.128 INFO:tasks.workunit.client.0.vm07.stdout:4/270: mknod d3/d11/d16/c59 0
2026-03-09T19:27:19.129 INFO:tasks.workunit.client.0.vm07.stdout:5/268: unlink d3/dd/f22 0
2026-03-09T19:27:19.129 INFO:tasks.workunit.client.1.vm08.stdout:6/557: chown d3/db/f30 28892257 1
2026-03-09T19:27:19.130 INFO:tasks.workunit.client.0.vm07.stdout:0/243: mkdir d0/d6/d13/d1c/d11/d56 0
2026-03-09T19:27:19.131 INFO:tasks.workunit.client.1.vm08.stdout:9/519: dread - d0/d1b/d97/d48/d5d/f9b zero size
2026-03-09T19:27:19.131 INFO:tasks.workunit.client.1.vm08.stdout:9/520: readlink d0/d1b/d97/d48/d6f/la5 0
2026-03-09T19:27:19.133 INFO:tasks.workunit.client.0.vm07.stdout:6/260: dread d0/d13/f1b [0,4194304] 0
2026-03-09T19:27:19.144 INFO:tasks.workunit.client.0.vm07.stdout:7/316: write d0/d4/d5/f36 [1826535,75521] 0
2026-03-09T19:27:19.144 INFO:tasks.workunit.client.0.vm07.stdout:7/317: fdatasync d0/d4/d5/d8/f35 0
2026-03-09T19:27:19.148 INFO:tasks.workunit.client.0.vm07.stdout:0/244: write d0/d6/d13/d33/f39 [168379,102662] 0
2026-03-09T19:27:19.149 INFO:tasks.workunit.client.0.vm07.stdout:6/261: fdatasync d0/d1/f8 0
2026-03-09T19:27:19.150 INFO:tasks.workunit.client.0.vm07.stdout:3/340: mknod d1/c66 0
2026-03-09T19:27:19.150 INFO:tasks.workunit.client.0.vm07.stdout:7/318: rmdir d0/d4/d5/d8 39
2026-03-09T19:27:19.151 INFO:tasks.workunit.client.0.vm07.stdout:7/319: chown d0/d4/d5/d26/d3c/f63 61871 1
2026-03-09T19:27:19.154 INFO:tasks.workunit.client.0.vm07.stdout:6/262: unlink d0/f49 0
2026-03-09T19:27:19.154 INFO:tasks.workunit.client.0.vm07.stdout:6/263: chown d0/d1/d28 3687803 1
2026-03-09T19:27:19.155 INFO:tasks.workunit.client.0.vm07.stdout:3/341: creat d1/d6/dd/f67 x:0 0 0
2026-03-09T19:27:19.155 INFO:tasks.workunit.client.0.vm07.stdout:5/269: dread d3/d1a/f1c [4194304,4194304] 0
2026-03-09T19:27:19.156 INFO:tasks.workunit.client.0.vm07.stdout:9/337: getdents d0/db/d29/d4d 0
2026-03-09T19:27:19.156 INFO:tasks.workunit.client.0.vm07.stdout:3/342: chown d1/d26/l43 388579345 1
2026-03-09T19:27:19.156 INFO:tasks.workunit.client.0.vm07.stdout:4/271: getdents d3/d11/d16/d2f 0
2026-03-09T19:27:19.157 INFO:tasks.workunit.client.0.vm07.stdout:5/270: write d3/dd/d26/d3f/d47/d56/f59 [902691,9685] 0
2026-03-09T19:27:19.158 INFO:tasks.workunit.client.0.vm07.stdout:9/338: truncate d0/d17/f42 728117 0
2026-03-09T19:27:19.160 INFO:tasks.workunit.client.0.vm07.stdout:7/320: getdents d0/d52 0
2026-03-09T19:27:19.162 INFO:tasks.workunit.client.0.vm07.stdout:5/271: unlink d3/d1a/c1e 0
2026-03-09T19:27:19.162 INFO:tasks.workunit.client.0.vm07.stdout:9/339: truncate d0/d17/f1f 1708461 0
2026-03-09T19:27:19.162 INFO:tasks.workunit.client.0.vm07.stdout:3/343: dwrite d1/d1f/f36 [0,4194304] 0
2026-03-09T19:27:19.163 INFO:tasks.workunit.client.0.vm07.stdout:5/272: chown d3/d1a/f1c 50871 1
2026-03-09T19:27:19.163 INFO:tasks.workunit.client.0.vm07.stdout:5/273: readlink d3/d1a/d28/d36/l38 0
2026-03-09T19:27:19.168 INFO:tasks.workunit.client.0.vm07.stdout:5/274: unlink d3/dd/d26/d3f/f4b 0
2026-03-09T19:27:19.169 INFO:tasks.workunit.client.0.vm07.stdout:5/275: mkdir d3/d1a/d5d 0
2026-03-09T19:27:19.170 INFO:tasks.workunit.client.0.vm07.stdout:5/276: creat d3/dd/d26/d2c/f5e x:0 0 0
2026-03-09T19:27:19.199 INFO:tasks.workunit.client.1.vm08.stdout:8/492: sync
2026-03-09T19:27:19.202 INFO:tasks.workunit.client.1.vm08.stdout:8/493: rename de/d25/d31/la8 to de/d47/d85/laf 0
2026-03-09T19:27:19.203 INFO:tasks.workunit.client.1.vm08.stdout:8/494: chown de/d25/d31/d82/d6d/l78 3 1
2026-03-09T19:27:19.232 INFO:tasks.workunit.client.0.vm07.stdout:4/272: fdatasync d3/d11/d29/f3c 0
2026-03-09T19:27:19.240 INFO:tasks.workunit.client.1.vm08.stdout:0/541: write dd/f7a [2474910,75294] 0
2026-03-09T19:27:19.241 INFO:tasks.workunit.client.1.vm08.stdout:2/489: write d3/d9/d79/f6b [730249,32237] 0
2026-03-09T19:27:19.243 INFO:tasks.workunit.client.0.vm07.stdout:1/275: dwrite d1/d3/f23 [0,4194304] 0
2026-03-09T19:27:19.248 INFO:tasks.workunit.client.0.vm07.stdout:1/276: dread d1/d11/d37/d3f/f4a [0,4194304] 0
2026-03-09T19:27:19.249 INFO:tasks.workunit.client.1.vm08.stdout:1/676: truncate d9/da/d12/f5c 908821 0
2026-03-09T19:27:19.250 INFO:tasks.workunit.client.0.vm07.stdout:1/277: chown d1/db/d31/d56/c5e 5 1
2026-03-09T19:27:19.256 INFO:tasks.workunit.client.1.vm08.stdout:4/532: dwrite da/d10/d16/d28/d46/d52/d6e/d2c/f7c [0,4194304] 0
2026-03-09T19:27:19.262 INFO:tasks.workunit.client.1.vm08.stdout:5/492: dwrite
d16/d1e/f2c [0,4194304] 0 2026-03-09T19:27:19.263 INFO:tasks.workunit.client.0.vm07.stdout:5/277: dread d3/d1a/d28/f2e [0,4194304] 0 2026-03-09T19:27:19.266 INFO:tasks.workunit.client.0.vm07.stdout:5/278: dread d3/d1a/d28/f3c [0,4194304] 0 2026-03-09T19:27:19.271 INFO:tasks.workunit.client.0.vm07.stdout:6/264: rmdir d0/d13 39 2026-03-09T19:27:19.271 INFO:tasks.workunit.client.0.vm07.stdout:6/265: write d0/d1/db/d17/d4c/f60 [320534,81310] 0 2026-03-09T19:27:19.271 INFO:tasks.workunit.client.0.vm07.stdout:5/279: dread d3/dd/f58 [0,4194304] 0 2026-03-09T19:27:19.271 INFO:tasks.workunit.client.1.vm08.stdout:2/490: chown f2 38159647 1 2026-03-09T19:27:19.272 INFO:tasks.workunit.client.0.vm07.stdout:2/340: dwrite d3/dd/f34 [0,4194304] 0 2026-03-09T19:27:19.274 INFO:tasks.workunit.client.1.vm08.stdout:2/491: chown d3/d4/d23/d2c/d39/d5e/de/d18/f93 1987906 1 2026-03-09T19:27:19.279 INFO:tasks.workunit.client.0.vm07.stdout:5/280: dwrite d3/dd/f24 [0,4194304] 0 2026-03-09T19:27:19.282 INFO:tasks.workunit.client.0.vm07.stdout:5/281: chown d3/dd 725207941 1 2026-03-09T19:27:19.283 INFO:tasks.workunit.client.0.vm07.stdout:5/282: chown d3/dd/d26/d3f 1 1 2026-03-09T19:27:19.291 INFO:tasks.workunit.client.0.vm07.stdout:8/311: truncate d7/d9/d10/d44/f4a 270628 0 2026-03-09T19:27:19.292 INFO:tasks.workunit.client.0.vm07.stdout:8/312: dread - d7/d30/f61 zero size 2026-03-09T19:27:19.292 INFO:tasks.workunit.client.0.vm07.stdout:8/313: write d7/d16/d1e/f6e [787666,31571] 0 2026-03-09T19:27:19.293 INFO:tasks.workunit.client.1.vm08.stdout:2/492: dwrite d3/d4/fa7 [0,4194304] 0 2026-03-09T19:27:19.307 INFO:tasks.workunit.client.1.vm08.stdout:1/677: mknod d9/d11/d7a/ccb 0 2026-03-09T19:27:19.323 INFO:tasks.workunit.client.1.vm08.stdout:4/533: rmdir da/d10/d16/d28/d2f/d4f 39 2026-03-09T19:27:19.324 INFO:tasks.workunit.client.1.vm08.stdout:4/534: truncate da/d10/d16/f9f 668239 0 2026-03-09T19:27:19.324 INFO:tasks.workunit.client.1.vm08.stdout:4/535: chown da/l99 3 1 2026-03-09T19:27:19.326 
INFO:tasks.workunit.client.0.vm07.stdout:1/278: symlink d1/db/l67 0 2026-03-09T19:27:19.336 INFO:tasks.workunit.client.1.vm08.stdout:7/602: dwrite d5/d14/d2b/d5d/fb2 [0,4194304] 0 2026-03-09T19:27:19.359 INFO:tasks.workunit.client.1.vm08.stdout:3/610: write d0/d52/f8a [723916,130899] 0 2026-03-09T19:27:19.359 INFO:tasks.workunit.client.1.vm08.stdout:2/493: truncate d3/d4/d23/d2c/d39/f9b 539705 0 2026-03-09T19:27:19.366 INFO:tasks.workunit.client.1.vm08.stdout:6/558: write d3/db/d43/f51 [778378,126494] 0 2026-03-09T19:27:19.375 INFO:tasks.workunit.client.1.vm08.stdout:9/521: write d0/d2/d80/d69/f93 [358124,62334] 0 2026-03-09T19:27:19.389 INFO:tasks.workunit.client.1.vm08.stdout:5/493: symlink d16/l9d 0 2026-03-09T19:27:19.389 INFO:tasks.workunit.client.1.vm08.stdout:1/678: dread d9/d40/d49/f70 [0,4194304] 0 2026-03-09T19:27:19.393 INFO:tasks.workunit.client.0.vm07.stdout:2/341: rename d3/dd/d16/d2f to d3/dd/d16/d29/d3c/d5a/d7a 0 2026-03-09T19:27:19.395 INFO:tasks.workunit.client.1.vm08.stdout:7/603: chown d5/d14/dae/d3a/cc0 45741338 1 2026-03-09T19:27:19.396 INFO:tasks.workunit.client.1.vm08.stdout:7/604: readlink d5/d14/dae/d3a/d42/d6a/la7 0 2026-03-09T19:27:19.397 INFO:tasks.workunit.client.1.vm08.stdout:7/605: fdatasync d5/d14/dae/d1c/d83/d9c/f9d 0 2026-03-09T19:27:19.398 INFO:tasks.workunit.client.0.vm07.stdout:2/342: dwrite d3/dd/d16/d29/d2d/d45/f55 [0,4194304] 0 2026-03-09T19:27:19.409 INFO:tasks.workunit.client.0.vm07.stdout:7/321: write d0/d4/fc [3799789,75829] 0 2026-03-09T19:27:19.410 INFO:tasks.workunit.client.1.vm08.stdout:2/494: symlink d3/d4/d23/d2c/d39/laf 0 2026-03-09T19:27:19.410 INFO:tasks.workunit.client.1.vm08.stdout:2/495: chown d3/d9/d4a/fa4 0 1 2026-03-09T19:27:19.413 INFO:tasks.workunit.client.0.vm07.stdout:9/340: dwrite d0/d6/f4c [0,4194304] 0 2026-03-09T19:27:19.418 INFO:tasks.workunit.client.1.vm08.stdout:4/536: fdatasync da/d10/d16/d28/d46/d52/f5b 0 2026-03-09T19:27:19.418 INFO:tasks.workunit.client.0.vm07.stdout:3/344: dwrite 
d1/d6/dd/f3b [0,4194304] 0 2026-03-09T19:27:19.429 INFO:tasks.workunit.client.1.vm08.stdout:3/611: unlink d0/d8/d19/f44 0 2026-03-09T19:27:19.447 INFO:tasks.workunit.client.1.vm08.stdout:3/612: chown d0/d52/fa8 11396 1 2026-03-09T19:27:19.447 INFO:tasks.workunit.client.1.vm08.stdout:2/496: unlink d3/d4/d23/d2c/d39/d5e/de/d18/f3f 0 2026-03-09T19:27:19.447 INFO:tasks.workunit.client.1.vm08.stdout:9/522: creat d0/d1b/daa/fb2 x:0 0 0 2026-03-09T19:27:19.447 INFO:tasks.workunit.client.1.vm08.stdout:4/537: dwrite da/d10/d16/d28/d2f/d4f/d56/f9a [0,4194304] 0 2026-03-09T19:27:19.447 INFO:tasks.workunit.client.0.vm07.stdout:8/314: unlink d7/d9/c31 0 2026-03-09T19:27:19.447 INFO:tasks.workunit.client.0.vm07.stdout:8/315: dread - d7/d50/f6f zero size 2026-03-09T19:27:19.447 INFO:tasks.workunit.client.0.vm07.stdout:8/316: truncate d7/d9/d10/f41 2121210 0 2026-03-09T19:27:19.447 INFO:tasks.workunit.client.0.vm07.stdout:8/317: write d7/d9/f36 [214436,68025] 0 2026-03-09T19:27:19.447 INFO:tasks.workunit.client.0.vm07.stdout:8/318: chown d7/d9/d10/d44/c49 386285 1 2026-03-09T19:27:19.447 INFO:tasks.workunit.client.0.vm07.stdout:8/319: read d7/d9/d10/f1b [984953,70207] 0 2026-03-09T19:27:19.448 INFO:tasks.workunit.client.0.vm07.stdout:8/320: chown d7/d9/d10/f41 16247 1 2026-03-09T19:27:19.458 INFO:tasks.workunit.client.1.vm08.stdout:6/559: sync 2026-03-09T19:27:19.478 INFO:tasks.workunit.client.1.vm08.stdout:0/542: getdents dd/d22/d24/d49/d50/d78/d86 0 2026-03-09T19:27:19.478 INFO:tasks.workunit.client.1.vm08.stdout:2/497: mkdir d3/d4/d3e/d4e/d88/db0 0 2026-03-09T19:27:19.479 INFO:tasks.workunit.client.1.vm08.stdout:2/498: stat d3/d9/d79/d46/d8c/caa 0 2026-03-09T19:27:19.493 INFO:tasks.workunit.client.1.vm08.stdout:9/523: dread - d0/d1b/f8d zero size 2026-03-09T19:27:19.493 INFO:tasks.workunit.client.1.vm08.stdout:9/524: readlink d0/d1b/d4e/l95 0 2026-03-09T19:27:19.496 INFO:tasks.workunit.client.1.vm08.stdout:6/560: fsync d3/db/f14 0 2026-03-09T19:27:19.497 
INFO:tasks.workunit.client.1.vm08.stdout:3/613: mknod d0/cbf 0 2026-03-09T19:27:19.519 INFO:tasks.workunit.client.1.vm08.stdout:2/499: symlink d3/d9/d79/d46/d8c/d92/lb1 0 2026-03-09T19:27:19.532 INFO:tasks.workunit.client.1.vm08.stdout:2/500: dread d3/d9/d4a/fa4 [0,4194304] 0 2026-03-09T19:27:19.547 INFO:tasks.workunit.client.1.vm08.stdout:8/495: dwrite de/d25/d31/d82/fa1 [0,4194304] 0 2026-03-09T19:27:19.570 INFO:tasks.workunit.client.1.vm08.stdout:5/494: dwrite d16/d1e/f2e [0,4194304] 0 2026-03-09T19:27:19.572 INFO:tasks.workunit.client.1.vm08.stdout:1/679: write d9/da/d2d/f50 [289772,107209] 0 2026-03-09T19:27:19.572 INFO:tasks.workunit.client.1.vm08.stdout:5/495: chown d16/d1e/d30/d6f/l88 8343 1 2026-03-09T19:27:19.574 INFO:tasks.workunit.client.1.vm08.stdout:6/561: rename d3/d34/d5c/da2/f72 to d3/d15/fc9 0 2026-03-09T19:27:19.575 INFO:tasks.workunit.client.1.vm08.stdout:6/562: chown d3/d34/d5c/l83 435116 1 2026-03-09T19:27:19.589 INFO:tasks.workunit.client.0.vm07.stdout:5/283: getdents d3/d1a/d5d 0 2026-03-09T19:27:19.590 INFO:tasks.workunit.client.0.vm07.stdout:5/284: dread - d3/d1a/d28/d40/f49 zero size 2026-03-09T19:27:19.592 INFO:tasks.workunit.client.1.vm08.stdout:3/614: creat d0/d6/de/d1b/fc0 x:0 0 0 2026-03-09T19:27:19.611 INFO:tasks.workunit.client.0.vm07.stdout:6/266: truncate d0/d1/db/d17/f1a 3137411 0 2026-03-09T19:27:19.613 INFO:tasks.workunit.client.0.vm07.stdout:4/273: link d3/d11/l2a d3/d11/d2b/d37/l5a 0 2026-03-09T19:27:19.613 INFO:tasks.workunit.client.1.vm08.stdout:2/501: truncate d3/d9/d79/d46/d8c/fa5 965205 0 2026-03-09T19:27:19.614 INFO:tasks.workunit.client.0.vm07.stdout:2/343: write d3/fc [3052409,120696] 0 2026-03-09T19:27:19.615 INFO:tasks.workunit.client.0.vm07.stdout:2/344: fsync d3/dd/d16/d30/d40/f4f 0 2026-03-09T19:27:19.618 INFO:tasks.workunit.client.1.vm08.stdout:7/606: dwrite d5/d14/dae/d1c/f29 [0,4194304] 0 2026-03-09T19:27:19.618 INFO:tasks.workunit.client.1.vm08.stdout:2/502: write d3/d9/d79/f6b [1697538,128868] 0 
2026-03-09T19:27:19.620 INFO:tasks.workunit.client.0.vm07.stdout:2/345: dwrite d3/fc [0,4194304] 0 2026-03-09T19:27:19.636 INFO:tasks.workunit.client.0.vm07.stdout:6/267: sync 2026-03-09T19:27:19.636 INFO:tasks.workunit.client.0.vm07.stdout:6/268: readlink d0/d44/l54 0 2026-03-09T19:27:19.641 INFO:tasks.workunit.client.0.vm07.stdout:2/346: sync 2026-03-09T19:27:19.648 INFO:tasks.workunit.client.1.vm08.stdout:4/538: creat da/d10/d16/d28/fa3 x:0 0 0 2026-03-09T19:27:19.651 INFO:tasks.workunit.client.0.vm07.stdout:0/245: write d0/d6/d13/d1c/d11/f2e [182452,79754] 0 2026-03-09T19:27:19.651 INFO:tasks.workunit.client.0.vm07.stdout:9/341: getdents d0/db/d29/d4d 0 2026-03-09T19:27:19.668 INFO:tasks.workunit.client.1.vm08.stdout:0/543: dwrite dd/d22/f41 [0,4194304] 0 2026-03-09T19:27:19.670 INFO:tasks.workunit.client.1.vm08.stdout:0/544: readlink dd/d22/d63/l5a 0 2026-03-09T19:27:19.672 INFO:tasks.workunit.client.0.vm07.stdout:1/279: link d1/d11/d37/d5d/d50/c53 d1/d11/d37/d3f/c68 0 2026-03-09T19:27:19.677 INFO:tasks.workunit.client.0.vm07.stdout:1/280: dwrite d1/f51 [0,4194304] 0 2026-03-09T19:27:19.689 INFO:tasks.workunit.client.0.vm07.stdout:2/347: dread f0 [0,4194304] 0 2026-03-09T19:27:19.704 INFO:tasks.workunit.client.0.vm07.stdout:7/322: write d0/d4/d5/d26/d32/f45 [587330,107089] 0 2026-03-09T19:27:19.705 INFO:tasks.workunit.client.1.vm08.stdout:8/496: dwrite de/d1d/d69/f9a [0,4194304] 0 2026-03-09T19:27:19.714 INFO:tasks.workunit.client.0.vm07.stdout:4/274: creat d3/d4f/f5b x:0 0 0 2026-03-09T19:27:19.718 INFO:tasks.workunit.client.0.vm07.stdout:6/269: creat d0/d1/db/d24/d53/d31/f6c x:0 0 0 2026-03-09T19:27:19.718 INFO:tasks.workunit.client.0.vm07.stdout:6/270: chown d0/d1/db/d17 5923061 1 2026-03-09T19:27:19.723 INFO:tasks.workunit.client.0.vm07.stdout:6/271: dwrite d0/d1/db/f15 [4194304,4194304] 0 2026-03-09T19:27:19.727 INFO:tasks.workunit.client.0.vm07.stdout:0/246: mkdir d0/d6/d13/d17/d19/d57 0 2026-03-09T19:27:19.741 
INFO:tasks.workunit.client.0.vm07.stdout:9/342: symlink d0/d6/d73/l77 0 2026-03-09T19:27:19.741 INFO:tasks.workunit.client.0.vm07.stdout:3/345: link d1/d26/f55 d1/f68 0 2026-03-09T19:27:19.741 INFO:tasks.workunit.client.0.vm07.stdout:3/346: readlink d1/l46 0 2026-03-09T19:27:19.754 INFO:tasks.workunit.client.0.vm07.stdout:1/281: dread d1/f2f [0,4194304] 0 2026-03-09T19:27:19.754 INFO:tasks.workunit.client.1.vm08.stdout:0/545: sync 2026-03-09T19:27:19.755 INFO:tasks.workunit.client.1.vm08.stdout:0/546: chown dd/d31 1379774 1 2026-03-09T19:27:19.755 INFO:tasks.workunit.client.0.vm07.stdout:1/282: chown d1/c1a 260 1 2026-03-09T19:27:19.756 INFO:tasks.workunit.client.1.vm08.stdout:0/547: read dd/f1e [2297723,90418] 0 2026-03-09T19:27:19.760 INFO:tasks.workunit.client.1.vm08.stdout:9/525: truncate d0/d2/d8/f8e 804955 0 2026-03-09T19:27:19.776 INFO:tasks.workunit.client.1.vm08.stdout:0/548: dread dd/d22/f3e [0,4194304] 0 2026-03-09T19:27:19.776 INFO:tasks.workunit.client.0.vm07.stdout:7/323: write d0/d4/d5/d8/f37 [980579,34458] 0 2026-03-09T19:27:19.776 INFO:tasks.workunit.client.0.vm07.stdout:4/275: creat d3/d11/d29/d34/f5c x:0 0 0 2026-03-09T19:27:19.776 INFO:tasks.workunit.client.0.vm07.stdout:0/247: mkdir d0/d6/d13/d17/d19/d58 0 2026-03-09T19:27:19.777 INFO:tasks.workunit.client.1.vm08.stdout:0/549: readlink dd/d22/d24/d49/d50/l69 0 2026-03-09T19:27:19.780 INFO:tasks.workunit.client.0.vm07.stdout:9/343: rmdir d0/db/d29/d2c/d36 39 2026-03-09T19:27:19.782 INFO:tasks.workunit.client.0.vm07.stdout:5/285: link d3/d1a/d28/f2e d3/d1a/d5d/f5f 0 2026-03-09T19:27:19.786 INFO:tasks.workunit.client.1.vm08.stdout:1/680: dwrite d9/d11/d7a/f9a [0,4194304] 0 2026-03-09T19:27:19.787 INFO:tasks.workunit.client.0.vm07.stdout:8/321: link d7/d9/d37/f3b d7/d9/d37/d45/f76 0 2026-03-09T19:27:19.788 INFO:tasks.workunit.client.0.vm07.stdout:8/322: write d7/d9/d37/d34/f55 [277161,67190] 0 2026-03-09T19:27:19.798 INFO:tasks.workunit.client.0.vm07.stdout:7/324: creat d0/d4/f6f x:0 0 0 
2026-03-09T19:27:19.798 INFO:tasks.workunit.client.0.vm07.stdout:6/272: symlink d0/d1/db/l6d 0 2026-03-09T19:27:19.823 INFO:tasks.workunit.client.1.vm08.stdout:6/563: write d3/d15/d8a/fa1 [1101852,25259] 0 2026-03-09T19:27:19.825 INFO:tasks.workunit.client.0.vm07.stdout:3/347: creat d1/d26/d4f/f69 x:0 0 0 2026-03-09T19:27:19.830 INFO:tasks.workunit.client.0.vm07.stdout:5/286: mkdir d3/dd/d26/d2d/d60 0 2026-03-09T19:27:19.834 INFO:tasks.workunit.client.1.vm08.stdout:4/539: fdatasync da/d10/f6b 0 2026-03-09T19:27:19.840 INFO:tasks.workunit.client.0.vm07.stdout:1/283: mknod d1/d11/d37/d3f/d45/c69 0 2026-03-09T19:27:19.854 INFO:tasks.workunit.client.0.vm07.stdout:2/348: rmdir d3/d49/d60 0 2026-03-09T19:27:19.857 INFO:tasks.workunit.client.0.vm07.stdout:7/325: creat d0/d4/d5/d26/d3c/d58/f70 x:0 0 0 2026-03-09T19:27:19.857 INFO:tasks.workunit.client.1.vm08.stdout:7/607: write d5/d14/d38/f3c [711025,34460] 0 2026-03-09T19:27:19.857 INFO:tasks.workunit.client.0.vm07.stdout:4/276: symlink d3/d11/d51/l5d 0 2026-03-09T19:27:19.860 INFO:tasks.workunit.client.1.vm08.stdout:7/608: readlink d5/d14/dae/d1c/l6f 0 2026-03-09T19:27:19.862 INFO:tasks.workunit.client.1.vm08.stdout:7/609: fdatasync d5/d14/dae/d1c/d73/fbe 0 2026-03-09T19:27:19.864 INFO:tasks.workunit.client.1.vm08.stdout:2/503: dwrite d3/d4/d23/d2c/d39/d5e/d14/f58 [0,4194304] 0 2026-03-09T19:27:19.880 INFO:tasks.workunit.client.0.vm07.stdout:9/344: creat d0/db/d29/d32/d5c/f78 x:0 0 0 2026-03-09T19:27:19.886 INFO:tasks.workunit.client.1.vm08.stdout:9/526: unlink d0/d2/d8/f29 0 2026-03-09T19:27:19.888 INFO:tasks.workunit.client.1.vm08.stdout:0/550: truncate dd/d31/f54 429726 0 2026-03-09T19:27:19.889 INFO:tasks.workunit.client.0.vm07.stdout:5/287: creat d3/d1a/d28/d36/f61 x:0 0 0 2026-03-09T19:27:19.898 INFO:tasks.workunit.client.0.vm07.stdout:7/326: unlink d0/d4/d5/d8/f1c 0 2026-03-09T19:27:19.900 INFO:tasks.workunit.client.0.vm07.stdout:2/349: dread d3/f15 [0,4194304] 0 2026-03-09T19:27:19.900 
INFO:tasks.workunit.client.0.vm07.stdout:2/350: chown d3/dd/d16 59 1 2026-03-09T19:27:19.901 INFO:tasks.workunit.client.0.vm07.stdout:4/277: creat d3/d4f/f5e x:0 0 0 2026-03-09T19:27:19.903 INFO:tasks.workunit.client.0.vm07.stdout:2/351: dread d3/dd/d16/d29/d3c/d5a/d7a/f75 [0,4194304] 0 2026-03-09T19:27:19.903 INFO:tasks.workunit.client.0.vm07.stdout:2/352: readlink d3/l9 0 2026-03-09T19:27:19.904 INFO:tasks.workunit.client.0.vm07.stdout:6/273: symlink d0/l6e 0 2026-03-09T19:27:19.905 INFO:tasks.workunit.client.0.vm07.stdout:2/353: write d3/d11/f39 [2581867,126378] 0 2026-03-09T19:27:19.906 INFO:tasks.workunit.client.1.vm08.stdout:8/497: write de/d25/d33/d46/f2d [137947,40044] 0 2026-03-09T19:27:19.910 INFO:tasks.workunit.client.0.vm07.stdout:0/248: dwrite d0/d6/d13/d17/f20 [0,4194304] 0 2026-03-09T19:27:19.912 INFO:tasks.workunit.client.0.vm07.stdout:2/354: sync 2026-03-09T19:27:19.922 INFO:tasks.workunit.client.0.vm07.stdout:9/345: mkdir d0/db/d29/d79 0 2026-03-09T19:27:19.926 INFO:tasks.workunit.client.1.vm08.stdout:4/540: rename da/d10/d16/l48 to da/d10/d26/d50/la4 0 2026-03-09T19:27:19.936 INFO:tasks.workunit.client.1.vm08.stdout:4/541: dread da/d10/d16/d28/d2f/d4f/d56/f9a [0,4194304] 0 2026-03-09T19:27:19.941 INFO:tasks.workunit.client.0.vm07.stdout:5/288: unlink c1 0 2026-03-09T19:27:19.942 INFO:tasks.workunit.client.0.vm07.stdout:5/289: write d3/f4d [892180,7535] 0 2026-03-09T19:27:19.942 INFO:tasks.workunit.client.0.vm07.stdout:5/290: fsync d3/f18 0 2026-03-09T19:27:19.945 INFO:tasks.workunit.client.0.vm07.stdout:8/323: link d7/d9/d37/l47 d7/d9/d37/d45/d56/d67/l77 0 2026-03-09T19:27:19.949 INFO:tasks.workunit.client.0.vm07.stdout:7/327: unlink d0/d4/d5/d26/d32/f56 0 2026-03-09T19:27:19.949 INFO:tasks.workunit.client.1.vm08.stdout:3/615: creat d0/d52/d7c/fc1 x:0 0 0 2026-03-09T19:27:19.949 INFO:tasks.workunit.client.1.vm08.stdout:7/610: write d5/d14/dae/d3a/d42/f65 [1217958,114452] 0 2026-03-09T19:27:19.952 INFO:tasks.workunit.client.1.vm08.stdout:9/527: 
unlink d0/d1b/d97/d48/d6f/c76 0 2026-03-09T19:27:19.953 INFO:tasks.workunit.client.1.vm08.stdout:0/551: truncate dd/d7e/f8e 420550 0 2026-03-09T19:27:19.954 INFO:tasks.workunit.client.1.vm08.stdout:0/552: write dd/d22/d27/d4f/d6f/fa5 [447255,103385] 0 2026-03-09T19:27:19.961 INFO:tasks.workunit.client.0.vm07.stdout:2/355: creat d3/dd/d16/d30/d64/f7b x:0 0 0 2026-03-09T19:27:19.962 INFO:tasks.workunit.client.0.vm07.stdout:9/346: symlink d0/d6/d57/d5d/l7a 0 2026-03-09T19:27:19.963 INFO:tasks.workunit.client.1.vm08.stdout:6/564: mknod d3/d15/dc2/cca 0 2026-03-09T19:27:19.970 INFO:tasks.workunit.client.1.vm08.stdout:8/498: creat de/d1d/fb0 x:0 0 0 2026-03-09T19:27:19.976 INFO:tasks.workunit.client.1.vm08.stdout:8/499: dread - de/d25/d31/d82/d6d/f88 zero size 2026-03-09T19:27:19.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:19 vm07.local ceph-mon[48545]: pgmap v164: 65 pgs: 65 active+clean; 1.7 GiB data, 6.5 GiB used, 113 GiB / 120 GiB avail; 33 MiB/s rd, 138 MiB/s wr, 239 op/s 2026-03-09T19:27:19.979 INFO:tasks.workunit.client.1.vm08.stdout:5/496: link d16/d1e/f5f d16/d1e/d3b/d61/f9e 0 2026-03-09T19:27:19.985 INFO:tasks.workunit.client.1.vm08.stdout:2/504: rename d3/d4/d23/d2c/d39/laf to d3/d4/d23/d2c/d39/da3/lb2 0 2026-03-09T19:27:19.991 INFO:tasks.workunit.client.0.vm07.stdout:6/274: dwrite d0/d1/fa [4194304,4194304] 0 2026-03-09T19:27:19.998 INFO:tasks.workunit.client.1.vm08.stdout:2/505: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/f2d [4194304,4194304] 0 2026-03-09T19:27:20.001 INFO:tasks.workunit.client.0.vm07.stdout:8/324: mknod d7/d9/d37/d45/d4f/c78 0 2026-03-09T19:27:20.002 INFO:tasks.workunit.client.0.vm07.stdout:4/278: mkdir d3/d4f/d56/d5f 0 2026-03-09T19:27:20.003 INFO:tasks.workunit.client.1.vm08.stdout:4/542: creat da/d10/d26/d38/fa5 x:0 0 0 2026-03-09T19:27:20.008 INFO:tasks.workunit.client.1.vm08.stdout:7/611: unlink d5/d14/dae/fb8 0 2026-03-09T19:27:20.008 INFO:tasks.workunit.client.0.vm07.stdout:0/249: symlink d0/d6/d13/d1c/d52/l59 0 
2026-03-09T19:27:20.009 INFO:tasks.workunit.client.1.vm08.stdout:9/528: creat d0/d1b/d97/d48/d5d/fb3 x:0 0 0 2026-03-09T19:27:20.009 INFO:tasks.workunit.client.1.vm08.stdout:7/612: dread - d5/d14/d38/dad/fc1 zero size 2026-03-09T19:27:20.010 INFO:tasks.workunit.client.0.vm07.stdout:9/347: creat d0/d6/f7b x:0 0 0 2026-03-09T19:27:20.010 INFO:tasks.workunit.client.1.vm08.stdout:1/681: rmdir d9/da/d53/d67/d6c/d93 0 2026-03-09T19:27:20.012 INFO:tasks.workunit.client.0.vm07.stdout:3/348: getdents d1/d6 0 2026-03-09T19:27:20.012 INFO:tasks.workunit.client.1.vm08.stdout:5/497: rmdir d16/d45 39 2026-03-09T19:27:20.013 INFO:tasks.workunit.client.1.vm08.stdout:5/498: truncate d16/d1e/d30/d8a/f98 960254 0 2026-03-09T19:27:20.014 INFO:tasks.workunit.client.1.vm08.stdout:5/499: chown d16/d1e/d30/c49 0 1 2026-03-09T19:27:20.014 INFO:tasks.workunit.client.0.vm07.stdout:6/275: read d0/d13/f1b [418079,66798] 0 2026-03-09T19:27:20.014 INFO:tasks.workunit.client.1.vm08.stdout:5/500: chown d16/d1e/f2e 34321 1 2026-03-09T19:27:20.015 INFO:tasks.workunit.client.0.vm07.stdout:1/284: getdents d1/d11/d37/d5d/d50 0 2026-03-09T19:27:20.022 INFO:tasks.workunit.client.0.vm07.stdout:9/348: dread d0/db/d29/d2c/f54 [0,4194304] 0 2026-03-09T19:27:20.022 INFO:tasks.workunit.client.1.vm08.stdout:0/553: mkdir dd/d22/d27/d6c/dad 0 2026-03-09T19:27:20.025 INFO:tasks.workunit.client.1.vm08.stdout:7/613: chown d5/d14/d2b/c2c 7133165 1 2026-03-09T19:27:20.026 INFO:tasks.workunit.client.0.vm07.stdout:1/285: creat d1/db/d31/d56/f6a x:0 0 0 2026-03-09T19:27:20.027 INFO:tasks.workunit.client.1.vm08.stdout:8/500: unlink de/f11 0 2026-03-09T19:27:20.029 INFO:tasks.workunit.client.0.vm07.stdout:2/356: creat d3/f7c x:0 0 0 2026-03-09T19:27:20.030 INFO:tasks.workunit.client.1.vm08.stdout:8/501: dread de/d1d/f59 [0,4194304] 0 2026-03-09T19:27:20.031 INFO:tasks.workunit.client.1.vm08.stdout:4/543: mkdir da/d10/d26/d27/da6 0 2026-03-09T19:27:20.032 INFO:tasks.workunit.client.0.vm07.stdout:0/250: creat 
d0/d6/d13/d17/d19/d57/f5a x:0 0 0 2026-03-09T19:27:20.032 INFO:tasks.workunit.client.1.vm08.stdout:3/616: creat d0/d8/fc2 x:0 0 0 2026-03-09T19:27:20.033 INFO:tasks.workunit.client.1.vm08.stdout:3/617: chown d0/d6/de/d1a/c31 1230039 1 2026-03-09T19:27:20.035 INFO:tasks.workunit.client.0.vm07.stdout:3/349: rename d1/d6/dd/d51/c5a to d1/d6/dd/d51/c6a 0 2026-03-09T19:27:20.035 INFO:tasks.workunit.client.1.vm08.stdout:3/618: write d0/d6/faa [532758,5983] 0 2026-03-09T19:27:20.038 INFO:tasks.workunit.client.0.vm07.stdout:9/349: fsync d0/db/d29/d2c/d36/f71 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.1.vm08.stdout:5/501: unlink d16/d45/c62 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.1.vm08.stdout:9/529: mknod d0/d2/d8/cb4 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.1.vm08.stdout:7/614: mkdir d5/d14/d27/d78/dc7/dce 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.1.vm08.stdout:3/619: truncate d0/d6/f91 503315 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.1.vm08.stdout:7/615: read d5/d14/d2b/f9f [357485,38247] 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.1.vm08.stdout:5/502: unlink d16/d1e/f44 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.1.vm08.stdout:3/620: dwrite d0/d6/de/d1b/d16/d17/fbc [0,4194304] 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.0.vm07.stdout:1/286: creat d1/d11/d37/d5d/d50/f6b x:0 0 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.0.vm07.stdout:0/251: write d0/d6/d13/f31 [58177,123684] 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.0.vm07.stdout:1/287: dread - d1/db/d31/d56/f6a zero size 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.0.vm07.stdout:3/350: dwrite d1/d6/dd/f3b [4194304,4194304] 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.0.vm07.stdout:9/350: write d0/d17/f5e [1400322,6162] 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.0.vm07.stdout:1/288: truncate d1/d3/d21/f47 208908 0 2026-03-09T19:27:20.059 INFO:tasks.workunit.client.0.vm07.stdout:3/351: creat 
d1/d6/dd/d51/f6b x:0 0 0 2026-03-09T19:27:20.060 INFO:tasks.workunit.client.1.vm08.stdout:7/616: creat d5/d14/d27/d78/dc7/fcf x:0 0 0 2026-03-09T19:27:20.060 INFO:tasks.workunit.client.1.vm08.stdout:4/544: read da/d10/d1b/f37 [325769,5627] 0 2026-03-09T19:27:20.071 INFO:tasks.workunit.client.0.vm07.stdout:4/279: sync 2026-03-09T19:27:20.072 INFO:tasks.workunit.client.1.vm08.stdout:6/565: sync 2026-03-09T19:27:20.074 INFO:tasks.workunit.client.0.vm07.stdout:0/252: unlink d0/d6/l25 0 2026-03-09T19:27:20.075 INFO:tasks.workunit.client.0.vm07.stdout:4/280: dread d3/d11/f1e [0,4194304] 0 2026-03-09T19:27:20.075 INFO:tasks.workunit.client.0.vm07.stdout:4/281: readlink d3/d4f/l4e 0 2026-03-09T19:27:20.076 INFO:tasks.workunit.client.0.vm07.stdout:4/282: write d3/d4f/f5b [72034,81286] 0 2026-03-09T19:27:20.082 INFO:tasks.workunit.client.1.vm08.stdout:8/502: dread de/d1d/d21/f72 [0,4194304] 0 2026-03-09T19:27:20.085 INFO:tasks.workunit.client.1.vm08.stdout:9/530: dread d0/d2/d14/f31 [0,4194304] 0 2026-03-09T19:27:20.088 INFO:tasks.workunit.client.0.vm07.stdout:1/289: dread d1/d11/d37/f40 [0,4194304] 0 2026-03-09T19:27:20.094 INFO:tasks.workunit.client.0.vm07.stdout:0/253: unlink d0/fa 0 2026-03-09T19:27:20.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:19 vm08.local ceph-mon[57794]: pgmap v164: 65 pgs: 65 active+clean; 1.7 GiB data, 6.5 GiB used, 113 GiB / 120 GiB avail; 33 MiB/s rd, 138 MiB/s wr, 239 op/s 2026-03-09T19:27:20.099 INFO:tasks.workunit.client.0.vm07.stdout:4/283: rename d3/d11/d16/f57 to d3/d11/d16/d2f/f60 0 2026-03-09T19:27:20.099 INFO:tasks.workunit.client.0.vm07.stdout:4/284: truncate d3/d11/d16/d2f/d22/f3b 965968 0 2026-03-09T19:27:20.099 INFO:tasks.workunit.client.1.vm08.stdout:6/566: creat d3/d15/fcb x:0 0 0 2026-03-09T19:27:20.099 INFO:tasks.workunit.client.1.vm08.stdout:6/567: stat d3/db/d43/c81 0 2026-03-09T19:27:20.099 INFO:tasks.workunit.client.1.vm08.stdout:5/503: mkdir d16/d1e/d9f 0 2026-03-09T19:27:20.108 
INFO:tasks.workunit.client.1.vm08.stdout:8/503: symlink de/d47/d85/lb1 0 2026-03-09T19:27:20.109 INFO:tasks.workunit.client.1.vm08.stdout:8/504: write de/d25/d31/d82/fa9 [523586,92529] 0 2026-03-09T19:27:20.115 INFO:tasks.workunit.client.0.vm07.stdout:5/291: dwrite d3/dd/f23 [0,4194304] 0 2026-03-09T19:27:20.116 INFO:tasks.workunit.client.0.vm07.stdout:5/292: chown d3/dd/d26/d2d 15 1 2026-03-09T19:27:20.117 INFO:tasks.workunit.client.0.vm07.stdout:5/293: read f2 [614348,81764] 0 2026-03-09T19:27:20.128 INFO:tasks.workunit.client.1.vm08.stdout:9/531: creat d0/d1b/d97/d48/fb5 x:0 0 0 2026-03-09T19:27:20.133 INFO:tasks.workunit.client.0.vm07.stdout:8/325: write d7/f19 [3921628,24476] 0 2026-03-09T19:27:20.139 INFO:tasks.workunit.client.0.vm07.stdout:7/328: write d0/f1 [5201887,103218] 0 2026-03-09T19:27:20.139 INFO:tasks.workunit.client.1.vm08.stdout:1/682: dwrite d9/d40/f57 [0,4194304] 0 2026-03-09T19:27:20.139 INFO:tasks.workunit.client.1.vm08.stdout:2/506: dwrite d3/d4/d23/d2c/d39/d5e/d14/f2b [0,4194304] 0 2026-03-09T19:27:20.140 INFO:tasks.workunit.client.0.vm07.stdout:7/329: readlink d0/d4/l24 0 2026-03-09T19:27:20.140 INFO:tasks.workunit.client.0.vm07.stdout:7/330: write d0/d4/d5/dd/f47 [607680,62724] 0 2026-03-09T19:27:20.140 INFO:tasks.workunit.client.1.vm08.stdout:2/507: fsync d3/d9/f20 0 2026-03-09T19:27:20.141 INFO:tasks.workunit.client.1.vm08.stdout:1/683: write d9/d11/d7a/d89/fb7 [1024763,73985] 0 2026-03-09T19:27:20.145 INFO:tasks.workunit.client.1.vm08.stdout:2/508: readlink d3/d4/d23/d2c/d39/d5e/de/d18/d99/lae 0 2026-03-09T19:27:20.146 INFO:tasks.workunit.client.1.vm08.stdout:2/509: write d3/d9/f20 [3165398,13090] 0 2026-03-09T19:27:20.151 INFO:tasks.workunit.client.1.vm08.stdout:1/684: dwrite d9/d11/d7a/d89/d8d/da3/fab [0,4194304] 0 2026-03-09T19:27:20.172 INFO:tasks.workunit.client.0.vm07.stdout:6/276: dwrite d0/d13/f1c [0,4194304] 0 2026-03-09T19:27:20.173 INFO:tasks.workunit.client.1.vm08.stdout:0/554: dwrite dd/d22/d24/f26 [0,4194304] 0 
2026-03-09T19:27:20.174 INFO:tasks.workunit.client.0.vm07.stdout:1/290: fsync d1/d11/d37/d5d/d50/f6b 0 2026-03-09T19:27:20.186 INFO:tasks.workunit.client.1.vm08.stdout:6/568: symlink d3/d34/d5c/da2/lcc 0 2026-03-09T19:27:20.197 INFO:tasks.workunit.client.0.vm07.stdout:4/285: unlink d3/d11/d16/d2f/d22/f3b 0 2026-03-09T19:27:20.205 INFO:tasks.workunit.client.1.vm08.stdout:5/504: symlink d16/d45/d81/la0 0 2026-03-09T19:27:20.212 INFO:tasks.workunit.client.1.vm08.stdout:8/505: chown de/d7c/l8b 470753775 1 2026-03-09T19:27:20.214 INFO:tasks.workunit.client.0.vm07.stdout:2/357: truncate d3/f27 3164560 0 2026-03-09T19:27:20.215 INFO:tasks.workunit.client.0.vm07.stdout:5/294: truncate d3/dd/f58 921759 0 2026-03-09T19:27:20.217 INFO:tasks.workunit.client.0.vm07.stdout:8/326: rmdir d7/d9/d37/d45/d4f 39 2026-03-09T19:27:20.218 INFO:tasks.workunit.client.0.vm07.stdout:8/327: write d7/d50/f6d [580010,23383] 0 2026-03-09T19:27:20.218 INFO:tasks.workunit.client.0.vm07.stdout:5/295: dwrite d3/dd/f52 [0,4194304] 0 2026-03-09T19:27:20.222 INFO:tasks.workunit.client.0.vm07.stdout:8/328: fsync d7/d9/d10/f41 0 2026-03-09T19:27:20.229 INFO:tasks.workunit.client.0.vm07.stdout:8/329: chown d7/d9/d37/d45/d56/d62/f64 666463 1 2026-03-09T19:27:20.229 INFO:tasks.workunit.client.0.vm07.stdout:9/351: dwrite d0/db/d29/d2c/f30 [0,4194304] 0 2026-03-09T19:27:20.229 INFO:tasks.workunit.client.0.vm07.stdout:9/352: chown d0/d6/d73 1 1 2026-03-09T19:27:20.239 INFO:tasks.workunit.client.0.vm07.stdout:6/277: mknod d0/d1/db/d52/c6f 0 2026-03-09T19:27:20.239 INFO:tasks.workunit.client.0.vm07.stdout:3/352: write d1/d26/f31 [845850,50665] 0 2026-03-09T19:27:20.244 INFO:tasks.workunit.client.0.vm07.stdout:1/291: creat d1/d11/d37/d3f/f6c x:0 0 0 2026-03-09T19:27:20.254 INFO:tasks.workunit.client.1.vm08.stdout:3/621: creat d0/d6/fc3 x:0 0 0 2026-03-09T19:27:20.254 INFO:tasks.workunit.client.0.vm07.stdout:0/254: rename d0/d6/d13/d1c/d11/c45 to d0/d6/d13/d1c/d50/c5b 0 2026-03-09T19:27:20.254 
INFO:tasks.workunit.client.0.vm07.stdout:4/286: unlink d3/d11/d16/d2f/d22/c4c 0 2026-03-09T19:27:20.254 INFO:tasks.workunit.client.0.vm07.stdout:2/358: fsync d3/dd/d16/d29/d2d/f56 0 2026-03-09T19:27:20.255 INFO:tasks.workunit.client.0.vm07.stdout:2/359: read d3/ff [3407011,87432] 0 2026-03-09T19:27:20.259 INFO:tasks.workunit.client.0.vm07.stdout:8/330: creat d7/d9/d37/d34/f79 x:0 0 0 2026-03-09T19:27:20.260 INFO:tasks.workunit.client.0.vm07.stdout:8/331: stat d7/f2e 0 2026-03-09T19:27:20.260 INFO:tasks.workunit.client.1.vm08.stdout:3/622: creat d0/d8/d19/fc4 x:0 0 0 2026-03-09T19:27:20.261 INFO:tasks.workunit.client.0.vm07.stdout:6/278: unlink d0/d1/f19 0 2026-03-09T19:27:20.261 INFO:tasks.workunit.client.0.vm07.stdout:2/360: dwrite d3/dd/d16/d30/d64/f7b [0,4194304] 0 2026-03-09T19:27:20.261 INFO:tasks.workunit.client.1.vm08.stdout:7/617: getdents d5/d14/d2b/d4b 0 2026-03-09T19:27:20.265 INFO:tasks.workunit.client.0.vm07.stdout:3/353: symlink d1/d3d/d47/l6c 0 2026-03-09T19:27:20.265 INFO:tasks.workunit.client.0.vm07.stdout:6/279: dwrite d0/d1/f8 [0,4194304] 0 2026-03-09T19:27:20.266 INFO:tasks.workunit.client.0.vm07.stdout:1/292: rmdir d1/d11 39 2026-03-09T19:27:20.267 INFO:tasks.workunit.client.0.vm07.stdout:1/293: readlink d1/d3/l34 0 2026-03-09T19:27:20.284 INFO:tasks.workunit.client.1.vm08.stdout:0/555: creat dd/d22/d24/d49/fae x:0 0 0 2026-03-09T19:27:20.284 INFO:tasks.workunit.client.1.vm08.stdout:5/505: sync 2026-03-09T19:27:20.285 INFO:tasks.workunit.client.1.vm08.stdout:0/556: chown dd/d22/d24/d49/d50/d78 4151231 1 2026-03-09T19:27:20.285 INFO:tasks.workunit.client.1.vm08.stdout:0/557: chown dd/d22/d63 15655150 1 2026-03-09T19:27:20.294 INFO:tasks.workunit.client.1.vm08.stdout:3/623: rmdir d0/d6/dad 39 2026-03-09T19:27:20.295 INFO:tasks.workunit.client.0.vm07.stdout:5/296: stat d3/dd/f58 0 2026-03-09T19:27:20.296 INFO:tasks.workunit.client.1.vm08.stdout:8/506: creat de/fb2 x:0 0 0 2026-03-09T19:27:20.299 INFO:tasks.workunit.client.0.vm07.stdout:9/353: link 
d0/db/d29/d2c/d36/f71 d0/db/d29/d79/f7c 0 2026-03-09T19:27:20.299 INFO:tasks.workunit.client.0.vm07.stdout:9/354: stat d0/d6/d57/f59 0 2026-03-09T19:27:20.300 INFO:tasks.workunit.client.0.vm07.stdout:9/355: fdatasync d0/db/d29/d68/f6b 0 2026-03-09T19:27:20.300 INFO:tasks.workunit.client.0.vm07.stdout:8/332: creat d7/d9/d37/d45/d56/f7a x:0 0 0 2026-03-09T19:27:20.308 INFO:tasks.workunit.client.1.vm08.stdout:0/558: unlink dd/d22/d27/l6b 0 2026-03-09T19:27:20.310 INFO:tasks.workunit.client.1.vm08.stdout:3/624: symlink d0/d6/de/d6e/d51/d92/lc5 0 2026-03-09T19:27:20.319 INFO:tasks.workunit.client.0.vm07.stdout:2/361: mkdir d3/dd/d16/d29/d2d/d45/d3b/d44/d7d 0 2026-03-09T19:27:20.320 INFO:tasks.workunit.client.1.vm08.stdout:8/507: rename de/d25/d33/d46 to de/d25/d31/d82/d6d/d99/da5/db3 0 2026-03-09T19:27:20.322 INFO:tasks.workunit.client.0.vm07.stdout:1/294: unlink d1/db/d31/d56/c5e 0 2026-03-09T19:27:20.323 INFO:tasks.workunit.client.1.vm08.stdout:5/506: creat d16/fa1 x:0 0 0 2026-03-09T19:27:20.324 INFO:tasks.workunit.client.1.vm08.stdout:5/507: fdatasync d16/d1e/d30/f70 0 2026-03-09T19:27:20.325 INFO:tasks.workunit.client.1.vm08.stdout:5/508: write d16/d1e/f2e [847034,74864] 0 2026-03-09T19:27:20.326 INFO:tasks.workunit.client.1.vm08.stdout:5/509: chown d16/d1e/d30/d6f/c7f 1 1 2026-03-09T19:27:20.328 INFO:tasks.workunit.client.1.vm08.stdout:3/625: read d0/d6/faa [176597,14035] 0 2026-03-09T19:27:20.339 INFO:tasks.workunit.client.1.vm08.stdout:0/559: symlink dd/d22/d24/d49/d50/d78/d86/laf 0 2026-03-09T19:27:20.345 INFO:tasks.workunit.client.1.vm08.stdout:6/569: dread d3/db/f44 [0,4194304] 0 2026-03-09T19:27:20.346 INFO:tasks.workunit.client.1.vm08.stdout:6/570: dread - d3/db/f30 zero size 2026-03-09T19:27:20.350 INFO:tasks.workunit.client.1.vm08.stdout:4/545: write da/d10/d26/d38/f57 [237096,112959] 0 2026-03-09T19:27:20.351 INFO:tasks.workunit.client.0.vm07.stdout:7/331: write d0/d4/d5/dd/f18 [621116,110646] 0 2026-03-09T19:27:20.354 
INFO:tasks.workunit.client.1.vm08.stdout:4/546: truncate da/d10/d26/d38/f93 5093940 0 2026-03-09T19:27:20.359 INFO:tasks.workunit.client.1.vm08.stdout:4/547: dread da/d10/d16/d28/d2f/f80 [0,4194304] 0 2026-03-09T19:27:20.368 INFO:tasks.workunit.client.1.vm08.stdout:1/685: dwrite d9/d11/f9b [0,4194304] 0 2026-03-09T19:27:20.370 INFO:tasks.workunit.client.1.vm08.stdout:9/532: write d0/d2/d8/f2d [480473,10736] 0 2026-03-09T19:27:20.370 INFO:tasks.workunit.client.1.vm08.stdout:2/510: dwrite d3/d9/d79/f98 [0,4194304] 0 2026-03-09T19:27:20.384 INFO:tasks.workunit.client.1.vm08.stdout:7/618: dwrite d5/d14/d2b/d4b/f7e [0,4194304] 0 2026-03-09T19:27:20.388 INFO:tasks.workunit.client.1.vm08.stdout:7/619: stat d5/d14/dae/d1c/d73 0 2026-03-09T19:27:20.398 INFO:tasks.workunit.client.1.vm08.stdout:1/686: dread d9/d11/d7a/d89/d8d/da3/fc2 [0,4194304] 0 2026-03-09T19:27:20.399 INFO:tasks.workunit.client.1.vm08.stdout:1/687: dread - d9/da/d53/d67/d6c/fbc zero size 2026-03-09T19:27:20.400 INFO:tasks.workunit.client.1.vm08.stdout:5/510: rmdir d16/d1e 39 2026-03-09T19:27:20.400 INFO:tasks.workunit.client.1.vm08.stdout:7/620: dwrite d5/d14/d27/d54/f75 [0,4194304] 0 2026-03-09T19:27:20.419 INFO:tasks.workunit.client.1.vm08.stdout:6/571: symlink d3/db/d43/lcd 0 2026-03-09T19:27:20.422 INFO:tasks.workunit.client.1.vm08.stdout:2/511: dread d3/d9/f20 [0,4194304] 0 2026-03-09T19:27:20.432 INFO:tasks.workunit.client.1.vm08.stdout:4/548: dread da/d10/d16/d28/f44 [0,4194304] 0 2026-03-09T19:27:20.442 INFO:tasks.workunit.client.1.vm08.stdout:8/508: truncate de/f10 4482813 0 2026-03-09T19:27:20.442 INFO:tasks.workunit.client.1.vm08.stdout:9/533: creat d0/d1b/daa/fb6 x:0 0 0 2026-03-09T19:27:20.453 INFO:tasks.workunit.client.0.vm07.stdout:5/297: unlink d3/d1a/f51 0 2026-03-09T19:27:20.453 INFO:tasks.workunit.client.0.vm07.stdout:9/356: mkdir d0/db/d29/d2c/d36/d7d 0 2026-03-09T19:27:20.458 INFO:tasks.workunit.client.0.vm07.stdout:8/333: creat d7/d9/d37/d45/d56/d67/f7b x:0 0 0 2026-03-09T19:27:20.458 
INFO:tasks.workunit.client.0.vm07.stdout:8/334: dread - d7/d30/d32/f6b zero size 2026-03-09T19:27:20.463 INFO:tasks.workunit.client.0.vm07.stdout:2/362: dwrite d3/dd/d16/d29/d2d/d45/f62 [0,4194304] 0 2026-03-09T19:27:20.467 INFO:tasks.workunit.client.0.vm07.stdout:1/295: dread - d1/d11/f48 zero size 2026-03-09T19:27:20.467 INFO:tasks.workunit.client.0.vm07.stdout:1/296: dread - d1/d3/d21/f55 zero size 2026-03-09T19:27:20.471 INFO:tasks.workunit.client.0.vm07.stdout:9/357: sync 2026-03-09T19:27:20.472 INFO:tasks.workunit.client.0.vm07.stdout:7/332: creat d0/d4/d5/d26/d3c/d58/f71 x:0 0 0 2026-03-09T19:27:20.472 INFO:tasks.workunit.client.1.vm08.stdout:9/534: chown d0/d1b/d97/d48/d5e/fa1 992069 1 2026-03-09T19:27:20.472 INFO:tasks.workunit.client.1.vm08.stdout:9/535: readlink d0/l2e 0 2026-03-09T19:27:20.475 INFO:tasks.workunit.client.0.vm07.stdout:1/297: dread d1/f1d [0,4194304] 0 2026-03-09T19:27:20.476 INFO:tasks.workunit.client.0.vm07.stdout:1/298: truncate d1/d11/f1b 5417801 0 2026-03-09T19:27:20.480 INFO:tasks.workunit.client.1.vm08.stdout:9/536: dwrite d0/d2/d14/d5c/fd [4194304,4194304] 0 2026-03-09T19:27:20.481 INFO:tasks.workunit.client.1.vm08.stdout:4/549: truncate da/d10/d16/d28/d46/d52/d6e/d40/f41 4930305 0 2026-03-09T19:27:20.491 INFO:tasks.workunit.client.1.vm08.stdout:9/537: dwrite d0/d1b/d97/d48/fb5 [0,4194304] 0 2026-03-09T19:27:20.491 INFO:tasks.workunit.client.1.vm08.stdout:1/688: creat d9/da/d2d/d62/fcc x:0 0 0 2026-03-09T19:27:20.491 INFO:tasks.workunit.client.1.vm08.stdout:1/689: chown d9/da/d17 3920082 1 2026-03-09T19:27:20.495 INFO:tasks.workunit.client.0.vm07.stdout:0/255: dwrite d0/d6/d13/f4c [0,4194304] 0 2026-03-09T19:27:20.503 INFO:tasks.workunit.client.0.vm07.stdout:4/287: creat d3/d11/f61 x:0 0 0 2026-03-09T19:27:20.516 INFO:tasks.workunit.client.0.vm07.stdout:4/288: dread d3/d11/d2b/d37/f4d [0,4194304] 0 2026-03-09T19:27:20.518 INFO:tasks.workunit.client.1.vm08.stdout:6/572: mkdir d3/d34/dce 0 2026-03-09T19:27:20.521 
INFO:tasks.workunit.client.1.vm08.stdout:6/573: chown d3/d34/d5c/da2/lcc 1 1 2026-03-09T19:27:20.521 INFO:tasks.workunit.client.1.vm08.stdout:2/512: fsync d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f7f 0 2026-03-09T19:27:20.530 INFO:tasks.workunit.client.0.vm07.stdout:3/354: creat d1/d1f/f6d x:0 0 0 2026-03-09T19:27:20.531 INFO:tasks.workunit.client.0.vm07.stdout:8/335: mkdir d7/d9/d57/d7c 0 2026-03-09T19:27:20.531 INFO:tasks.workunit.client.1.vm08.stdout:8/509: truncate de/d25/f71 2913445 0 2026-03-09T19:27:20.531 INFO:tasks.workunit.client.0.vm07.stdout:3/355: truncate d1/d26/d4f/f69 330585 0 2026-03-09T19:27:20.537 INFO:tasks.workunit.client.0.vm07.stdout:2/363: read d3/d11/f18 [2102047,93494] 0 2026-03-09T19:27:20.538 INFO:tasks.workunit.client.1.vm08.stdout:4/550: unlink da/d10/d16/d28/d46/f8f 0 2026-03-09T19:27:20.545 INFO:tasks.workunit.client.0.vm07.stdout:6/280: rename d0/d1/d28/f2b to d0/d1/db/f70 0 2026-03-09T19:27:20.548 INFO:tasks.workunit.client.1.vm08.stdout:7/621: rename d5/d14/dae/d3a/d42/d85/l8b to d5/d14/dae/d1c/ld0 0 2026-03-09T19:27:20.556 INFO:tasks.workunit.client.1.vm08.stdout:9/538: rmdir d0/d1b/d97/d48/d5e 39 2026-03-09T19:27:20.559 INFO:tasks.workunit.client.1.vm08.stdout:5/511: link d16/d1e/d30/d8a/f98 d16/d8e/fa2 0 2026-03-09T19:27:20.564 INFO:tasks.workunit.client.0.vm07.stdout:0/256: creat d0/d6/f5c x:0 0 0 2026-03-09T19:27:20.567 INFO:tasks.workunit.client.1.vm08.stdout:8/510: mkdir de/d1d/d2e/db4 0 2026-03-09T19:27:20.567 INFO:tasks.workunit.client.0.vm07.stdout:4/289: symlink d3/d11/d29/l62 0 2026-03-09T19:27:20.571 INFO:tasks.workunit.client.0.vm07.stdout:5/298: write d3/d1a/d28/d48/f50 [800754,126627] 0 2026-03-09T19:27:20.574 INFO:tasks.workunit.client.1.vm08.stdout:6/574: rename d3/d34/d5c/l74 to d3/d34/d5c/da2/lcf 0 2026-03-09T19:27:20.575 INFO:tasks.workunit.client.0.vm07.stdout:8/336: unlink d7/f15 0 2026-03-09T19:27:20.575 INFO:tasks.workunit.client.0.vm07.stdout:5/299: dwrite d3/d1a/f17 [0,4194304] 0 2026-03-09T19:27:20.575 
INFO:tasks.workunit.client.1.vm08.stdout:6/575: stat d3/d34/d3b/c75 0 2026-03-09T19:27:20.583 INFO:tasks.workunit.client.1.vm08.stdout:5/512: truncate d16/d45/f55 1111945 0 2026-03-09T19:27:20.586 INFO:tasks.workunit.client.0.vm07.stdout:3/356: dwrite d1/f20 [0,4194304] 0 2026-03-09T19:27:20.589 INFO:tasks.workunit.client.0.vm07.stdout:7/333: mknod d0/d4/d5/d8/d1a/c72 0 2026-03-09T19:27:20.600 INFO:tasks.workunit.client.0.vm07.stdout:1/299: mkdir d1/d11/d37/d5a/d6d 0 2026-03-09T19:27:20.601 INFO:tasks.workunit.client.0.vm07.stdout:1/300: fdatasync d1/db/d31/d56/f6a 0 2026-03-09T19:27:20.602 INFO:tasks.workunit.client.1.vm08.stdout:4/551: symlink da/d10/d16/d28/d46/d52/d6e/d6d/la7 0 2026-03-09T19:27:20.614 INFO:tasks.workunit.client.1.vm08.stdout:4/552: write da/d10/d16/d28/d46/d52/d6e/d2c/f36 [3671959,42937] 0 2026-03-09T19:27:20.619 INFO:tasks.workunit.client.1.vm08.stdout:0/560: dwrite dd/d22/d27/d2e/d37/f44 [0,4194304] 0 2026-03-09T19:27:20.625 INFO:tasks.workunit.client.1.vm08.stdout:3/626: dwrite d0/d6/de/d6e/d51/f70 [0,4194304] 0 2026-03-09T19:27:20.630 INFO:tasks.workunit.client.0.vm07.stdout:0/257: write d0/d6/d13/d17/f2b [661409,5934] 0 2026-03-09T19:27:20.640 INFO:tasks.workunit.client.1.vm08.stdout:7/622: mkdir d5/d14/dae/dd1 0 2026-03-09T19:27:20.641 INFO:tasks.workunit.client.1.vm08.stdout:6/576: dread - d3/db/d43/d69/da0/fa7 zero size 2026-03-09T19:27:20.653 INFO:tasks.workunit.client.0.vm07.stdout:3/357: fdatasync d1/f7 0 2026-03-09T19:27:20.657 INFO:tasks.workunit.client.0.vm07.stdout:7/334: rename d0/d4/d5/d8/d1a/f1d to d0/d4/d5/d8/d41/f73 0 2026-03-09T19:27:20.688 INFO:tasks.workunit.client.0.vm07.stdout:1/301: chown d1/d11/d37/d5d/d50/c53 2773 1 2026-03-09T19:27:20.688 INFO:tasks.workunit.client.0.vm07.stdout:1/302: write d1/f4c [2654931,48592] 0 2026-03-09T19:27:20.689 INFO:tasks.workunit.client.0.vm07.stdout:4/290: mknod d3/d11/c63 0 2026-03-09T19:27:20.691 INFO:tasks.workunit.client.0.vm07.stdout:3/358: creat d1/d26/f6e x:0 0 0 
2026-03-09T19:27:20.694 INFO:tasks.workunit.client.0.vm07.stdout:5/300: link d3/d1a/d28/f39 d3/dd/d26/d3f/d47/f62 0 2026-03-09T19:27:20.695 INFO:tasks.workunit.client.0.vm07.stdout:3/359: rename d1/d1f/d16/d28/l29 to d1/d26/l6f 0 2026-03-09T19:27:20.696 INFO:tasks.workunit.client.0.vm07.stdout:3/360: chown d1/l46 15 1 2026-03-09T19:27:20.696 INFO:tasks.workunit.client.0.vm07.stdout:5/301: creat d3/d1a/d28/d36/f63 x:0 0 0 2026-03-09T19:27:20.697 INFO:tasks.workunit.client.0.vm07.stdout:4/291: creat d3/f64 x:0 0 0 2026-03-09T19:27:20.699 INFO:tasks.workunit.client.0.vm07.stdout:4/292: dread d3/f7 [0,4194304] 0 2026-03-09T19:27:20.700 INFO:tasks.workunit.client.1.vm08.stdout:6/577: fsync d3/f32 0 2026-03-09T19:27:20.701 INFO:tasks.workunit.client.0.vm07.stdout:5/302: dwrite d3/f19 [4194304,4194304] 0 2026-03-09T19:27:20.701 INFO:tasks.workunit.client.0.vm07.stdout:4/293: rmdir d3/d4f 39 2026-03-09T19:27:20.703 INFO:tasks.workunit.client.1.vm08.stdout:5/513: symlink d16/d1e/d8c/d99/la3 0 2026-03-09T19:27:20.724 INFO:tasks.workunit.client.1.vm08.stdout:5/514: rename d16/d1e/d3b/d61/c7c to d16/d8e/ca4 0 2026-03-09T19:27:20.735 INFO:tasks.workunit.client.1.vm08.stdout:5/515: link d16/d1e/d6e/f72 d16/d1e/fa5 0 2026-03-09T19:27:20.735 INFO:tasks.workunit.client.1.vm08.stdout:5/516: mknod d16/d1e/d8c/ca6 0 2026-03-09T19:27:20.735 INFO:tasks.workunit.client.1.vm08.stdout:5/517: write d16/d45/f54 [1816247,27600] 0 2026-03-09T19:27:20.735 INFO:tasks.workunit.client.1.vm08.stdout:5/518: dwrite d16/d45/f6b [0,4194304] 0 2026-03-09T19:27:20.735 INFO:tasks.workunit.client.1.vm08.stdout:5/519: chown d16/d1e/d30/d6f/c97 51782 1 2026-03-09T19:27:20.735 INFO:tasks.workunit.client.1.vm08.stdout:5/520: chown d16/d1e/d30/d6f/f8f 291 1 2026-03-09T19:27:20.880 INFO:tasks.workunit.client.0.vm07.stdout:6/281: stat d0/d1/db/d24/d53/d31/f6c 0 2026-03-09T19:27:20.927 INFO:tasks.workunit.client.0.vm07.stdout:6/282: dread d0/d1/db/d1d/f22 [0,4194304] 0 2026-03-09T19:27:21.020 
INFO:tasks.workunit.client.0.vm07.stdout:9/358: write d0/db/d29/d79/f7c [904928,92153] 0 2026-03-09T19:27:21.020 INFO:tasks.workunit.client.1.vm08.stdout:1/690: write d9/da/d53/fae [275259,37990] 0 2026-03-09T19:27:21.024 INFO:tasks.workunit.client.0.vm07.stdout:9/359: getdents d0/db/d29 0 2026-03-09T19:27:21.026 INFO:tasks.workunit.client.1.vm08.stdout:1/691: dread d9/d11/d7a/d89/d8d/da3/fc2 [0,4194304] 0 2026-03-09T19:27:21.027 INFO:tasks.workunit.client.1.vm08.stdout:1/692: mkdir d9/da/d95/dcd 0 2026-03-09T19:27:21.046 INFO:tasks.workunit.client.1.vm08.stdout:2/513: truncate d3/d4/d23/d2c/d39/d5e/d14/f58 1441847 0 2026-03-09T19:27:21.083 INFO:tasks.workunit.client.1.vm08.stdout:2/514: dwrite d3/d4/f8 [4194304,4194304] 0 2026-03-09T19:27:21.083 INFO:tasks.workunit.client.0.vm07.stdout:2/364: dwrite d3/d11/f18 [4194304,4194304] 0 2026-03-09T19:27:21.083 INFO:tasks.workunit.client.0.vm07.stdout:2/365: getdents d3/dd/d16/d29/d3c 0 2026-03-09T19:27:21.083 INFO:tasks.workunit.client.0.vm07.stdout:2/366: chown d3/dd/d16/d29/d3c/d5a/d7a/c61 0 1 2026-03-09T19:27:21.083 INFO:tasks.workunit.client.0.vm07.stdout:2/367: write f2 [1363706,128478] 0 2026-03-09T19:27:21.083 INFO:tasks.workunit.client.0.vm07.stdout:2/368: dread - d3/dd/d16/f5f zero size 2026-03-09T19:27:21.210 INFO:tasks.workunit.client.1.vm08.stdout:9/539: truncate d0/d2/d8/f2d 2778465 0 2026-03-09T19:27:21.211 INFO:tasks.workunit.client.0.vm07.stdout:3/361: write d1/d1f/d16/f1e [4773735,52304] 0 2026-03-09T19:27:21.211 INFO:tasks.workunit.client.1.vm08.stdout:5/521: truncate d16/d1e/d3b/f50 1172230 0 2026-03-09T19:27:21.216 INFO:tasks.workunit.client.0.vm07.stdout:3/362: truncate d1/d6/f21 2624077 0 2026-03-09T19:27:21.227 INFO:tasks.workunit.client.1.vm08.stdout:9/540: rename d0/d1b/d97/d48/d5e/f96 to d0/d1b/d97/d48/d5e/fb7 0 2026-03-09T19:27:21.234 INFO:tasks.workunit.client.1.vm08.stdout:9/541: fsync d0/d2/d14/f56 0 2026-03-09T19:27:21.235 INFO:tasks.workunit.client.1.vm08.stdout:9/542: mkdir d0/d1b/daa/db8 
0 2026-03-09T19:27:21.237 INFO:tasks.workunit.client.1.vm08.stdout:9/543: rmdir d0/d2/d14/d98/dac 0 2026-03-09T19:27:21.238 INFO:tasks.workunit.client.1.vm08.stdout:9/544: creat d0/d1b/daa/fb9 x:0 0 0 2026-03-09T19:27:21.278 INFO:tasks.workunit.client.1.vm08.stdout:9/545: rmdir d0/d1b/daa/db8 0 2026-03-09T19:27:21.278 INFO:tasks.workunit.client.1.vm08.stdout:9/546: dwrite d0/d2/d80/d69/f7a [0,4194304] 0 2026-03-09T19:27:21.416 INFO:tasks.workunit.client.1.vm08.stdout:8/511: creat de/d25/d31/d82/fb5 x:0 0 0 2026-03-09T19:27:21.434 INFO:tasks.workunit.client.0.vm07.stdout:0/258: dwrite d0/d6/d13/d17/d19/f34 [0,4194304] 0 2026-03-09T19:27:21.444 INFO:tasks.workunit.client.0.vm07.stdout:0/259: dwrite d0/d6/d13/d33/f39 [0,4194304] 0 2026-03-09T19:27:21.447 INFO:tasks.workunit.client.1.vm08.stdout:1/693: fsync d9/f48 0 2026-03-09T19:27:21.463 INFO:tasks.workunit.client.0.vm07.stdout:0/260: mknod d0/d6/d13/d17/d19/d57/c5d 0 2026-03-09T19:27:21.476 INFO:tasks.workunit.client.1.vm08.stdout:1/694: creat d9/d40/d49/d9e/fce x:0 0 0 2026-03-09T19:27:21.482 INFO:tasks.workunit.client.1.vm08.stdout:1/695: unlink d9/da/d2c/d6a/f9c 0 2026-03-09T19:27:21.488 INFO:tasks.workunit.client.0.vm07.stdout:0/261: getdents d0/d6/d13/d1c/d50 0 2026-03-09T19:27:21.493 INFO:tasks.workunit.client.1.vm08.stdout:7/623: dwrite d5/d14/dae/d3a/d42/fb7 [0,4194304] 0 2026-03-09T19:27:21.493 INFO:tasks.workunit.client.1.vm08.stdout:1/696: creat d9/da/d95/dcd/fcf x:0 0 0 2026-03-09T19:27:21.496 INFO:tasks.workunit.client.1.vm08.stdout:7/624: readlink d5/d14/d38/l79 0 2026-03-09T19:27:21.497 INFO:tasks.workunit.client.0.vm07.stdout:5/303: write d3/dd/d26/d2d/f54 [215381,75857] 0 2026-03-09T19:27:21.498 INFO:tasks.workunit.client.0.vm07.stdout:4/294: dwrite d3/d11/f1e [0,4194304] 0 2026-03-09T19:27:21.499 INFO:tasks.workunit.client.0.vm07.stdout:4/295: chown d3/d11/d16/d2f/d22/f24 1526 1 2026-03-09T19:27:21.502 INFO:tasks.workunit.client.1.vm08.stdout:1/697: fdatasync d9/d11/f29 0 2026-03-09T19:27:21.511 
INFO:tasks.workunit.client.1.vm08.stdout:7/625: mkdir d5/d14/dae/d1c/d83/d9c/dcb/dd2 0 2026-03-09T19:27:21.512 INFO:tasks.workunit.client.1.vm08.stdout:7/626: stat d5/d14/dae/d1c/d83/d90 0 2026-03-09T19:27:21.527 INFO:tasks.workunit.client.1.vm08.stdout:1/698: dread d9/da/d12/d39/fa7 [0,4194304] 0 2026-03-09T19:27:21.539 INFO:tasks.workunit.client.1.vm08.stdout:0/561: write dd/d31/f54 [1153422,114761] 0 2026-03-09T19:27:21.553 INFO:tasks.workunit.client.1.vm08.stdout:1/699: creat d9/d11/d7a/d89/d8d/daa/fd0 x:0 0 0 2026-03-09T19:27:21.553 INFO:tasks.workunit.client.1.vm08.stdout:1/700: fdatasync d9/da/dc/f31 0 2026-03-09T19:27:21.560 INFO:tasks.workunit.client.0.vm07.stdout:5/304: link d3/dd/d26/d3f/d47/c5b d3/dd/d26/d3f/d47/c64 0 2026-03-09T19:27:21.564 INFO:tasks.workunit.client.0.vm07.stdout:5/305: dread - d3/dd/d26/d3f/d47/f62 zero size 2026-03-09T19:27:21.591 INFO:tasks.workunit.client.0.vm07.stdout:2/369: write d3/f4 [3114017,67868] 0 2026-03-09T19:27:21.597 INFO:tasks.workunit.client.1.vm08.stdout:2/515: dwrite d3/d4/d23/d2c/f80 [0,4194304] 0 2026-03-09T19:27:21.623 INFO:tasks.workunit.client.0.vm07.stdout:7/335: write d0/d4/d5/d8/d41/f73 [13057201,116043] 0 2026-03-09T19:27:21.639 INFO:tasks.workunit.client.0.vm07.stdout:4/296: mknod d3/d11/d16/c65 0 2026-03-09T19:27:21.643 INFO:tasks.workunit.client.0.vm07.stdout:4/297: dwrite d3/d11/d29/f3c [0,4194304] 0 2026-03-09T19:27:21.649 INFO:tasks.workunit.client.0.vm07.stdout:7/336: sync 2026-03-09T19:27:21.653 INFO:tasks.workunit.client.0.vm07.stdout:7/337: dread d0/d4/d5/dd/f47 [0,4194304] 0 2026-03-09T19:27:21.666 INFO:tasks.workunit.client.0.vm07.stdout:4/298: truncate d3/f7 1940155 0 2026-03-09T19:27:21.674 INFO:tasks.workunit.client.0.vm07.stdout:7/338: mkdir d0/d4/d5/d8/d41/d64/d74 0 2026-03-09T19:27:21.676 INFO:tasks.workunit.client.0.vm07.stdout:7/339: sync 2026-03-09T19:27:21.680 INFO:tasks.workunit.client.0.vm07.stdout:4/299: dwrite d3/d11/f12 [0,4194304] 0 2026-03-09T19:27:21.681 
INFO:tasks.workunit.client.0.vm07.stdout:4/300: chown d3/d11/d2b/f2c 60255 1 2026-03-09T19:27:21.682 INFO:tasks.workunit.client.0.vm07.stdout:4/301: fdatasync d3/f64 0 2026-03-09T19:27:21.682 INFO:tasks.workunit.client.1.vm08.stdout:4/553: creat da/fa8 x:0 0 0 2026-03-09T19:27:21.685 INFO:tasks.workunit.client.0.vm07.stdout:4/302: dwrite d3/f1a [0,4194304] 0 2026-03-09T19:27:21.732 INFO:tasks.workunit.client.0.vm07.stdout:4/303: link d3/d11/d2b/d37/c1d d3/d11/d16/d2f/c66 0 2026-03-09T19:27:21.740 INFO:tasks.workunit.client.0.vm07.stdout:3/363: creat d1/d6/f70 x:0 0 0 2026-03-09T19:27:21.754 INFO:tasks.workunit.client.1.vm08.stdout:2/516: mknod d3/d4/d23/d2c/d39/d5e/de/cb3 0 2026-03-09T19:27:21.754 INFO:tasks.workunit.client.1.vm08.stdout:2/517: chown d3/d4/d23/d2c/c96 2062765547 1 2026-03-09T19:27:21.755 INFO:tasks.workunit.client.1.vm08.stdout:2/518: truncate d3/d9/f5d 2540281 0 2026-03-09T19:27:21.755 INFO:tasks.workunit.client.0.vm07.stdout:4/304: creat d3/d11/d16/d2f/f67 x:0 0 0 2026-03-09T19:27:21.755 INFO:tasks.workunit.client.0.vm07.stdout:5/306: rmdir d3/dd/d26/d2c 39 2026-03-09T19:27:21.755 INFO:tasks.workunit.client.0.vm07.stdout:3/364: chown d1/d1f/f13 162444636 1 2026-03-09T19:27:21.755 INFO:tasks.workunit.client.0.vm07.stdout:5/307: creat d3/dd/d26/d3f/d47/d56/f65 x:0 0 0 2026-03-09T19:27:21.755 INFO:tasks.workunit.client.0.vm07.stdout:1/303: rmdir d1/d11/d37/d3f/d45 39 2026-03-09T19:27:21.755 INFO:tasks.workunit.client.0.vm07.stdout:1/304: chown d1/d3/l34 2309 1 2026-03-09T19:27:21.755 INFO:tasks.workunit.client.0.vm07.stdout:4/305: mknod d3/d11/c68 0 2026-03-09T19:27:21.761 INFO:tasks.workunit.client.0.vm07.stdout:8/337: link d7/d9/f4c d7/d9/d37/d45/f7d 0 2026-03-09T19:27:21.761 INFO:tasks.workunit.client.0.vm07.stdout:5/308: creat d3/dd/d26/d3f/f66 x:0 0 0 2026-03-09T19:27:21.761 INFO:tasks.workunit.client.0.vm07.stdout:4/306: creat d3/d11/d2b/f69 x:0 0 0 2026-03-09T19:27:21.762 INFO:tasks.workunit.client.0.vm07.stdout:1/305: sync 
2026-03-09T19:27:21.763 INFO:tasks.workunit.client.0.vm07.stdout:1/306: write d1/d11/d37/f2c [3179932,88897] 0 2026-03-09T19:27:21.767 INFO:tasks.workunit.client.0.vm07.stdout:7/340: creat d0/d4/d5/d26/f75 x:0 0 0 2026-03-09T19:27:21.769 INFO:tasks.workunit.client.0.vm07.stdout:7/341: dread d0/d4/d5/f50 [0,4194304] 0 2026-03-09T19:27:21.770 INFO:tasks.workunit.client.0.vm07.stdout:7/342: readlink d0/d4/d5/d8/d1a/l23 0 2026-03-09T19:27:21.771 INFO:tasks.workunit.client.0.vm07.stdout:7/343: dread - d0/d52/f62 zero size 2026-03-09T19:27:21.773 INFO:tasks.workunit.client.0.vm07.stdout:7/344: dread d0/d4/d5/f50 [0,4194304] 0 2026-03-09T19:27:21.787 INFO:tasks.workunit.client.1.vm08.stdout:3/627: symlink d0/d6/de/d6e/lc6 0 2026-03-09T19:27:21.791 INFO:tasks.workunit.client.0.vm07.stdout:7/345: symlink d0/d52/l76 0 2026-03-09T19:27:21.791 INFO:tasks.workunit.client.0.vm07.stdout:8/338: fsync f4 0 2026-03-09T19:27:21.793 INFO:tasks.workunit.client.0.vm07.stdout:1/307: dread d1/d3/d21/f47 [0,4194304] 0 2026-03-09T19:27:21.794 INFO:tasks.workunit.client.0.vm07.stdout:7/346: chown d0/d4/d5/d26/d3c/d39/c49 8 1 2026-03-09T19:27:21.794 INFO:tasks.workunit.client.1.vm08.stdout:3/628: read d0/d6/de/d6e/f81 [351272,122475] 0 2026-03-09T19:27:21.799 INFO:tasks.workunit.client.1.vm08.stdout:0/562: rename dd/d22/d24/d49/d98 to dd/d22/d27/d2e/db0 0 2026-03-09T19:27:21.813 INFO:tasks.workunit.client.0.vm07.stdout:7/347: mknod d0/d4/d5/d8/d41/d64/c77 0 2026-03-09T19:27:21.814 INFO:tasks.workunit.client.0.vm07.stdout:7/348: dread - d0/d4/d5/d26/f75 zero size 2026-03-09T19:27:21.822 INFO:tasks.workunit.client.0.vm07.stdout:0/262: write d0/d6/d13/d33/f35 [894783,33345] 0 2026-03-09T19:27:21.827 INFO:tasks.workunit.client.1.vm08.stdout:0/563: fsync dd/d22/f28 0 2026-03-09T19:27:21.827 INFO:tasks.workunit.client.0.vm07.stdout:1/308: mkdir d1/d11/d37/d3f/d6e 0 2026-03-09T19:27:21.834 INFO:tasks.workunit.client.0.vm07.stdout:0/263: dread - d0/f41 zero size 2026-03-09T19:27:21.837 
INFO:tasks.workunit.client.0.vm07.stdout:6/283: rename d0/d1/l7 to d0/d1/db/l71 0 2026-03-09T19:27:21.837 INFO:tasks.workunit.client.1.vm08.stdout:5/522: rmdir d16/d1e/d30/d8a 39 2026-03-09T19:27:21.844 INFO:tasks.workunit.client.1.vm08.stdout:7/627: dwrite d5/fa [0,4194304] 0 2026-03-09T19:27:21.848 INFO:tasks.workunit.client.0.vm07.stdout:0/264: rmdir d0/d6 39 2026-03-09T19:27:21.850 INFO:tasks.workunit.client.1.vm08.stdout:1/701: truncate d9/d11/d7a/d89/fb7 353347 0 2026-03-09T19:27:21.853 INFO:tasks.workunit.client.1.vm08.stdout:8/512: unlink de/f1c 0 2026-03-09T19:27:21.854 INFO:tasks.workunit.client.1.vm08.stdout:1/702: fdatasync d9/da/d2d/f50 0 2026-03-09T19:27:21.854 INFO:tasks.workunit.client.0.vm07.stdout:2/370: write d3/dd/f24 [2317314,113414] 0 2026-03-09T19:27:21.855 INFO:tasks.workunit.client.1.vm08.stdout:5/523: dwrite d16/d1e/f5c [0,4194304] 0 2026-03-09T19:27:21.856 INFO:tasks.workunit.client.1.vm08.stdout:1/703: stat d9/da/d12/d39/l8c 0 2026-03-09T19:27:21.858 INFO:tasks.workunit.client.0.vm07.stdout:2/371: sync 2026-03-09T19:27:21.891 INFO:tasks.workunit.client.0.vm07.stdout:9/360: rename d0/db/d29/d79 to d0/d6/d3a/d7e 0 2026-03-09T19:27:21.894 INFO:tasks.workunit.client.1.vm08.stdout:9/547: mknod d0/d2/cba 0 2026-03-09T19:27:21.895 INFO:tasks.workunit.client.1.vm08.stdout:9/548: write d0/f13 [2178680,12361] 0 2026-03-09T19:27:21.896 INFO:tasks.workunit.client.1.vm08.stdout:8/513: fsync f6 0 2026-03-09T19:27:21.899 INFO:tasks.workunit.client.0.vm07.stdout:2/372: creat d3/dd/d16/d30/f7e x:0 0 0 2026-03-09T19:27:21.902 INFO:tasks.workunit.client.0.vm07.stdout:0/265: dread d0/d6/d13/d17/f2b [0,4194304] 0 2026-03-09T19:27:21.903 INFO:tasks.workunit.client.0.vm07.stdout:0/266: write d0/d6/d13/d17/d19/f34 [4828990,50267] 0 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.0.vm07.stdout:3/365: rename d1/d26/d4f to d1/d6/d71 0 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.0.vm07.stdout:9/361: mkdir d0/d6/d3a/d7e/d7f 0 2026-03-09T19:27:21.942 
INFO:tasks.workunit.client.0.vm07.stdout:3/366: mknod d1/d6/d4c/c72 0 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.0.vm07.stdout:9/362: truncate d0/db/f1d 3839793 0 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.0.vm07.stdout:9/363: truncate d0/d6/ff 5221408 0 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.0.vm07.stdout:4/307: rmdir d3/d11/d2b 39 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.1.vm08.stdout:9/549: dwrite d0/d2/d80/d69/f7a [0,4194304] 0 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.1.vm08.stdout:5/524: symlink d16/d45/la7 0 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.1.vm08.stdout:6/578: rmdir d3/d68 39 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.1.vm08.stdout:9/550: dwrite d0/d2/f1a [0,4194304] 0 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.1.vm08.stdout:1/704: creat d9/da/d2d/d4e/fd1 x:0 0 0 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.1.vm08.stdout:4/554: write da/d10/f13 [5770190,4185] 0 2026-03-09T19:27:21.942 INFO:tasks.workunit.client.1.vm08.stdout:6/579: rename d3/d34/d3b/c3c to d3/db/d43/d69/cd0 0 2026-03-09T19:27:21.943 INFO:tasks.workunit.client.0.vm07.stdout:4/308: dwrite d3/d11/d16/d2f/d22/f58 [0,4194304] 0 2026-03-09T19:27:21.943 INFO:tasks.workunit.client.0.vm07.stdout:5/309: rename d3/dd/d26/d2d/l41 to d3/dd/d26/d3f/d47/l67 0 2026-03-09T19:27:21.945 INFO:tasks.workunit.client.0.vm07.stdout:2/373: rename d3 to d3/dd/d16/d29/d3c/d5a/d7a/d7f 22 2026-03-09T19:27:21.945 INFO:tasks.workunit.client.0.vm07.stdout:4/309: chown d3/ce 15682959 1 2026-03-09T19:27:21.946 INFO:tasks.workunit.client.1.vm08.stdout:4/555: unlink da/d10/d1b/f29 0 2026-03-09T19:27:21.946 INFO:tasks.workunit.client.1.vm08.stdout:4/556: fsync da/d10/d26/f87 0 2026-03-09T19:27:21.951 INFO:tasks.workunit.client.0.vm07.stdout:9/364: mkdir d0/db/d29/d32/d5c/d80 0 2026-03-09T19:27:21.954 INFO:tasks.workunit.client.1.vm08.stdout:1/705: creat d9/da/d53/db3/fd2 x:0 0 0 2026-03-09T19:27:21.955 
INFO:tasks.workunit.client.1.vm08.stdout:1/706: chown d9/d11/faf 21035541 1 2026-03-09T19:27:21.959 INFO:tasks.workunit.client.1.vm08.stdout:1/707: dread d9/d40/f57 [0,4194304] 0 2026-03-09T19:27:21.961 INFO:tasks.workunit.client.0.vm07.stdout:4/310: fsync d3/d11/d2b/d37/f30 0 2026-03-09T19:27:21.963 INFO:tasks.workunit.client.0.vm07.stdout:2/374: mknod d3/dd/d16/d29/d3c/d4c/c80 0 2026-03-09T19:27:21.964 INFO:tasks.workunit.client.0.vm07.stdout:5/310: truncate d3/d1a/d28/f3c 1628068 0 2026-03-09T19:27:21.965 INFO:tasks.workunit.client.1.vm08.stdout:8/514: getdents de/d1d/d2e 0 2026-03-09T19:27:21.968 INFO:tasks.workunit.client.1.vm08.stdout:1/708: truncate d9/d40/d49/f70 672987 0 2026-03-09T19:27:21.969 INFO:tasks.workunit.client.0.vm07.stdout:0/267: getdents d0/d6/d13/d1c 0 2026-03-09T19:27:21.971 INFO:tasks.workunit.client.0.vm07.stdout:0/268: read - d0/d6/d13/d17/d19/d57/f5a zero size 2026-03-09T19:27:21.973 INFO:tasks.workunit.client.0.vm07.stdout:3/367: dwrite d1/d1f/f1a [4194304,4194304] 0 2026-03-09T19:27:21.973 INFO:tasks.workunit.client.1.vm08.stdout:8/515: fdatasync de/f19 0 2026-03-09T19:27:21.976 INFO:tasks.workunit.client.0.vm07.stdout:0/269: dwrite d0/f3a [0,4194304] 0 2026-03-09T19:27:21.978 INFO:tasks.workunit.client.0.vm07.stdout:3/368: unlink d1/d6/dd/f48 0 2026-03-09T19:27:21.979 INFO:tasks.workunit.client.0.vm07.stdout:3/369: dread d1/d6/d71/f69 [0,4194304] 0 2026-03-09T19:27:21.980 INFO:tasks.workunit.client.0.vm07.stdout:0/270: mknod d0/d6/d13/d1c/d11/c5e 0 2026-03-09T19:27:21.981 INFO:tasks.workunit.client.0.vm07.stdout:5/311: creat d3/f68 x:0 0 0 2026-03-09T19:27:21.983 INFO:tasks.workunit.client.0.vm07.stdout:3/370: creat d1/f73 x:0 0 0 2026-03-09T19:27:22.020 INFO:tasks.workunit.client.0.vm07.stdout:0/271: getdents d0/d6/d13/d17 0 2026-03-09T19:27:22.020 INFO:tasks.workunit.client.0.vm07.stdout:5/312: link d3/d1a/f12 d3/d1a/d28/d48/f69 0 2026-03-09T19:27:22.020 INFO:tasks.workunit.client.0.vm07.stdout:5/313: mknod d3/d1a/d28/c6a 0 
2026-03-09T19:27:22.020 INFO:tasks.workunit.client.0.vm07.stdout:0/272: dwrite d0/d6/f4f [0,4194304] 0 2026-03-09T19:27:22.020 INFO:tasks.workunit.client.0.vm07.stdout:0/273: creat d0/d6/d13/d1c/d11/f5f x:0 0 0 2026-03-09T19:27:22.020 INFO:tasks.workunit.client.0.vm07.stdout:5/314: dwrite d3/d1a/d28/d48/f69 [0,4194304] 0 2026-03-09T19:27:22.020 INFO:tasks.workunit.client.0.vm07.stdout:0/274: fdatasync d0/f41 0 2026-03-09T19:27:22.020 INFO:tasks.workunit.client.0.vm07.stdout:0/275: readlink d0/l32 0 2026-03-09T19:27:22.020 INFO:tasks.workunit.client.0.vm07.stdout:0/276: creat d0/d6/d13/d1c/d50/f60 x:0 0 0 2026-03-09T19:27:22.020 INFO:tasks.workunit.client.0.vm07.stdout:0/277: mkdir d0/d6/d13/d1c/d61 0 2026-03-09T19:27:22.026 INFO:tasks.workunit.client.0.vm07.stdout:5/315: dread d3/d1a/d28/d48/f69 [4194304,4194304] 0 2026-03-09T19:27:22.028 INFO:tasks.workunit.client.0.vm07.stdout:5/316: mknod d3/d1a/d5d/c6b 0 2026-03-09T19:27:22.031 INFO:tasks.workunit.client.0.vm07.stdout:5/317: dwrite d3/f68 [0,4194304] 0 2026-03-09T19:27:22.163 INFO:tasks.workunit.client.0.vm07.stdout:2/375: sync 2026-03-09T19:27:22.171 INFO:tasks.workunit.client.0.vm07.stdout:2/376: dread d3/d11/f2e [0,4194304] 0 2026-03-09T19:27:22.269 INFO:tasks.workunit.client.1.vm08.stdout:2/519: sync 2026-03-09T19:27:22.269 INFO:tasks.workunit.client.1.vm08.stdout:8/516: sync 2026-03-09T19:27:22.277 INFO:tasks.workunit.client.1.vm08.stdout:8/517: read de/d1d/d21/f86 [3265514,110409] 0 2026-03-09T19:27:22.281 INFO:tasks.workunit.client.1.vm08.stdout:2/520: symlink d3/d9/d79/d46/d8c/lb4 0 2026-03-09T19:27:22.282 INFO:tasks.workunit.client.1.vm08.stdout:8/518: chown de/d25/d31/d82/d6d/d99/da5/db3/f50 94541653 1 2026-03-09T19:27:22.285 INFO:tasks.workunit.client.1.vm08.stdout:2/521: unlink d3/d9/l71 0 2026-03-09T19:27:22.285 INFO:tasks.workunit.client.1.vm08.stdout:8/519: rmdir de/d1d/d2e 39 2026-03-09T19:27:22.296 INFO:tasks.workunit.client.1.vm08.stdout:2/522: mknod d3/d4/d23/d2c/d39/d5e/de/d18/d99/cb5 0 
2026-03-09T19:27:22.296 INFO:tasks.workunit.client.1.vm08.stdout:8/520: creat de/d25/d33/fb6 x:0 0 0 2026-03-09T19:27:22.318 INFO:tasks.workunit.client.1.vm08.stdout:2/523: mknod d3/d4/d23/cb6 0 2026-03-09T19:27:22.318 INFO:tasks.workunit.client.1.vm08.stdout:8/521: dwrite de/d1d/f27 [0,4194304] 0 2026-03-09T19:27:22.329 INFO:tasks.workunit.client.1.vm08.stdout:2/524: readlink d3/d9/d26/l95 0 2026-03-09T19:27:22.330 INFO:tasks.workunit.client.1.vm08.stdout:2/525: chown d3/d9/d4a/fa4 4389 1 2026-03-09T19:27:22.342 INFO:tasks.workunit.client.0.vm07.stdout:8/339: write d7/d16/f69 [979859,56860] 0 2026-03-09T19:27:22.342 INFO:tasks.workunit.client.0.vm07.stdout:7/349: write d0/d4/d5/d26/f31 [687199,84603] 0 2026-03-09T19:27:22.344 INFO:tasks.workunit.client.0.vm07.stdout:7/350: write d0/f6c [836765,29735] 0 2026-03-09T19:27:22.346 INFO:tasks.workunit.client.0.vm07.stdout:1/309: dwrite d1/db/f1f [0,4194304] 0 2026-03-09T19:27:22.349 INFO:tasks.workunit.client.0.vm07.stdout:4/311: dread d3/f13 [0,4194304] 0 2026-03-09T19:27:22.350 INFO:tasks.workunit.client.1.vm08.stdout:8/522: rename de/d25/d33/f6a to de/d1d/d69/fb7 0 2026-03-09T19:27:22.350 INFO:tasks.workunit.client.1.vm08.stdout:3/629: write d0/d8/d24/f2d [951896,30089] 0 2026-03-09T19:27:22.350 INFO:tasks.workunit.client.1.vm08.stdout:0/564: write dd/d22/d27/f3d [2035611,36554] 0 2026-03-09T19:27:22.350 INFO:tasks.workunit.client.1.vm08.stdout:3/630: chown d0/d6/de/d1b/d16/d17/l22 10349 1 2026-03-09T19:27:22.353 INFO:tasks.workunit.client.1.vm08.stdout:0/565: chown dd/d22/d27/d2e/d37/l38 1 1 2026-03-09T19:27:22.364 INFO:tasks.workunit.client.0.vm07.stdout:6/284: read d0/d1/d28/f64 [319351,119764] 0 2026-03-09T19:27:22.369 INFO:tasks.workunit.client.0.vm07.stdout:8/340: unlink d7/d30/d32/f6b 0 2026-03-09T19:27:22.370 INFO:tasks.workunit.client.1.vm08.stdout:7/628: write d5/d14/d27/d54/f58 [1466626,116093] 0 2026-03-09T19:27:22.372 INFO:tasks.workunit.client.1.vm08.stdout:7/629: truncate d5/d14/d2b/fb0 1008410 0 
2026-03-09T19:27:22.379 INFO:tasks.workunit.client.1.vm08.stdout:0/566: dread dd/d22/f3e [0,4194304] 0 2026-03-09T19:27:22.381 INFO:tasks.workunit.client.0.vm07.stdout:7/351: mknod d0/d4/d5/d26/d3c/d58/c78 0 2026-03-09T19:27:22.382 INFO:tasks.workunit.client.0.vm07.stdout:1/310: readlink d1/l27 0 2026-03-09T19:27:22.388 INFO:tasks.workunit.client.1.vm08.stdout:3/631: mknod d0/d52/d6d/d77/d88/cc7 0 2026-03-09T19:27:22.389 INFO:tasks.workunit.client.1.vm08.stdout:3/632: chown d0/d6/de/d1b/l2f 20696 1 2026-03-09T19:27:22.401 INFO:tasks.workunit.client.0.vm07.stdout:4/312: symlink d3/d11/d16/d2f/d22/l6a 0 2026-03-09T19:27:22.401 INFO:tasks.workunit.client.0.vm07.stdout:9/365: dread d0/d6/f48 [0,4194304] 0 2026-03-09T19:27:22.402 INFO:tasks.workunit.client.0.vm07.stdout:8/341: read d7/d16/f69 [329843,81861] 0 2026-03-09T19:27:22.405 INFO:tasks.workunit.client.1.vm08.stdout:0/567: dread dd/d22/d27/d6c/f85 [0,4194304] 0 2026-03-09T19:27:22.407 INFO:tasks.workunit.client.1.vm08.stdout:0/568: chown dd/d22/d27/d4f/d6f 10 1 2026-03-09T19:27:22.421 INFO:tasks.workunit.client.1.vm08.stdout:7/630: fsync d5/d14/dae/d1c/d73/fac 0 2026-03-09T19:27:22.423 INFO:tasks.workunit.client.0.vm07.stdout:3/371: rename d1/d26 to d1/d74 0 2026-03-09T19:27:22.424 INFO:tasks.workunit.client.0.vm07.stdout:3/372: readlink d1/d1f/l65 0 2026-03-09T19:27:22.424 INFO:tasks.workunit.client.1.vm08.stdout:7/631: readlink d5/d14/dae/l1b 0 2026-03-09T19:27:22.428 INFO:tasks.workunit.client.0.vm07.stdout:3/373: dwrite d1/d6/dd/f3b [0,4194304] 0 2026-03-09T19:27:22.445 INFO:tasks.workunit.client.1.vm08.stdout:3/633: mknod d0/d6/d25/cc8 0 2026-03-09T19:27:22.448 INFO:tasks.workunit.client.0.vm07.stdout:7/352: mkdir d0/d4/d5/d8/d41/d64/d79 0 2026-03-09T19:27:22.452 INFO:tasks.workunit.client.1.vm08.stdout:7/632: truncate d5/d14/dae/d3a/d42/d6a/f62 816643 0 2026-03-09T19:27:22.455 INFO:tasks.workunit.client.0.vm07.stdout:4/313: rename d3/d11/d2b/d38/c53 to d3/d11/d2b/d38/c6b 0 2026-03-09T19:27:22.455 
INFO:tasks.workunit.client.1.vm08.stdout:7/633: chown d5/fa 164225049 1 2026-03-09T19:27:22.456 INFO:tasks.workunit.client.1.vm08.stdout:5/525: dwrite d16/d8e/fa2 [0,4194304] 0 2026-03-09T19:27:22.459 INFO:tasks.workunit.client.1.vm08.stdout:5/526: readlink d16/d45/d81/la0 0 2026-03-09T19:27:22.468 INFO:tasks.workunit.client.1.vm08.stdout:9/551: dwrite d0/d1b/d97/d48/d5e/fa1 [0,4194304] 0 2026-03-09T19:27:22.470 INFO:tasks.workunit.client.1.vm08.stdout:0/569: dwrite dd/d22/d24/d49/d92/fa7 [4194304,4194304] 0 2026-03-09T19:27:22.483 INFO:tasks.workunit.client.1.vm08.stdout:8/523: mknod de/d1d/d2e/db4/cb8 0 2026-03-09T19:27:22.490 INFO:tasks.workunit.client.0.vm07.stdout:3/374: rename d1/f7 to d1/d1f/d5c/f75 0 2026-03-09T19:27:22.495 INFO:tasks.workunit.client.1.vm08.stdout:6/580: write d3/d34/d5c/f7f [98671,89713] 0 2026-03-09T19:27:22.496 INFO:tasks.workunit.client.1.vm08.stdout:4/557: dwrite da/f18 [0,4194304] 0 2026-03-09T19:27:22.496 INFO:tasks.workunit.client.0.vm07.stdout:3/375: dread - d1/d1f/d16/d28/f64 zero size 2026-03-09T19:27:22.496 INFO:tasks.workunit.client.0.vm07.stdout:3/376: chown d1/d6/dd/d51 2167485 1 2026-03-09T19:27:22.503 INFO:tasks.workunit.client.0.vm07.stdout:6/285: rmdir d0/d1/db 39 2026-03-09T19:27:22.510 INFO:tasks.workunit.client.1.vm08.stdout:2/526: rename d3/d4/d23/d2c/d39/d5e/de/d8b/f70 to d3/d4/d23/d2c/d39/d5e/fb7 0 2026-03-09T19:27:22.514 INFO:tasks.workunit.client.1.vm08.stdout:1/709: dwrite d9/d40/f92 [0,4194304] 0 2026-03-09T19:27:22.516 INFO:tasks.workunit.client.0.vm07.stdout:5/318: truncate d3/d1a/d28/d48/f69 4926972 0 2026-03-09T19:27:22.520 INFO:tasks.workunit.client.0.vm07.stdout:7/353: rmdir d0/d4 39 2026-03-09T19:27:22.522 INFO:tasks.workunit.client.1.vm08.stdout:2/527: dwrite d3/d4/f6 [4194304,4194304] 0 2026-03-09T19:27:22.530 INFO:tasks.workunit.client.0.vm07.stdout:1/311: truncate d1/f6 1547239 0 2026-03-09T19:27:22.532 INFO:tasks.workunit.client.0.vm07.stdout:4/314: rmdir d3/d11/d29 39 2026-03-09T19:27:22.534 
INFO:tasks.workunit.client.0.vm07.stdout:2/377: dwrite d3/dd/d16/d30/f3a [4194304,4194304] 0 2026-03-09T19:27:22.548 INFO:tasks.workunit.client.1.vm08.stdout:5/527: readlink d16/d1e/d3b/d61/l83 0 2026-03-09T19:27:22.548 INFO:tasks.workunit.client.1.vm08.stdout:9/552: mkdir d0/d2/d14/d98/dbb 0 2026-03-09T19:27:22.549 INFO:tasks.workunit.client.1.vm08.stdout:0/570: read dd/d22/f8d [2654774,118723] 0 2026-03-09T19:27:22.575 INFO:tasks.workunit.client.1.vm08.stdout:7/634: rename d5/d14/d2b/f32 to d5/d14/d27/d54/d86/fd3 0 2026-03-09T19:27:22.579 INFO:tasks.workunit.client.0.vm07.stdout:7/354: dwrite d0/d4/d5/d26/d3c/d58/f71 [0,4194304] 0 2026-03-09T19:27:22.590 INFO:tasks.workunit.client.1.vm08.stdout:9/553: fsync d0/f83 0 2026-03-09T19:27:22.590 INFO:tasks.workunit.client.0.vm07.stdout:4/315: dread - d3/d11/d29/f52 zero size 2026-03-09T19:27:22.590 INFO:tasks.workunit.client.0.vm07.stdout:2/378: creat d3/dd/d16/d29/d2d/d45/d3b/d44/f81 x:0 0 0 2026-03-09T19:27:22.590 INFO:tasks.workunit.client.0.vm07.stdout:8/342: creat d7/d9/d37/d45/f7e x:0 0 0 2026-03-09T19:27:22.595 INFO:tasks.workunit.client.1.vm08.stdout:4/558: creat da/d10/d16/d28/d4d/fa9 x:0 0 0 2026-03-09T19:27:22.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:22 vm08.local ceph-mon[57794]: pgmap v165: 65 pgs: 65 active+clean; 1.8 GiB data, 6.6 GiB used, 113 GiB / 120 GiB avail; 31 MiB/s rd, 117 MiB/s wr, 254 op/s 2026-03-09T19:27:22.600 INFO:tasks.workunit.client.0.vm07.stdout:5/319: mkdir d3/d1a/d28/d6c 0 2026-03-09T19:27:22.604 INFO:tasks.workunit.client.0.vm07.stdout:5/320: dwrite d3/dd/f52 [0,4194304] 0 2026-03-09T19:27:22.617 INFO:tasks.workunit.client.1.vm08.stdout:5/528: rename d16/d1e/d6e/d84 to d16/d1e/d8c/d99/da8 0 2026-03-09T19:27:22.622 INFO:tasks.workunit.client.0.vm07.stdout:2/379: rename d3/dd/d16/d29/d2d/d45/f55 to d3/d49/f82 0 2026-03-09T19:27:22.624 INFO:tasks.workunit.client.1.vm08.stdout:2/528: mkdir d3/d4/d23/d2c/d39/d5e/db8 0 2026-03-09T19:27:22.629 
INFO:tasks.workunit.client.1.vm08.stdout:9/554: creat d0/d1b/d68/d7f/d8c/fbc x:0 0 0 2026-03-09T19:27:22.631 INFO:tasks.workunit.client.0.vm07.stdout:0/278: mkdir d0/d6/d13/d1c/d11/d56/d62 0 2026-03-09T19:27:22.634 INFO:tasks.workunit.client.1.vm08.stdout:6/581: creat d3/d34/fd1 x:0 0 0 2026-03-09T19:27:22.644 INFO:tasks.workunit.client.1.vm08.stdout:4/559: creat da/d10/d16/d28/d2f/d4f/d64/d81/faa x:0 0 0 2026-03-09T19:27:22.658 INFO:tasks.workunit.client.1.vm08.stdout:9/555: dread d0/d1b/d68/d7f/fa0 [0,4194304] 0 2026-03-09T19:27:22.659 INFO:tasks.workunit.client.0.vm07.stdout:8/343: rename d7/d9/c5e to d7/d9/d37/d45/d56/c7f 0 2026-03-09T19:27:22.662 INFO:tasks.workunit.client.0.vm07.stdout:5/321: truncate d3/d1a/d28/d48/f69 5911103 0 2026-03-09T19:27:22.663 INFO:tasks.workunit.client.0.vm07.stdout:7/355: dread d0/d4/f12 [0,4194304] 0 2026-03-09T19:27:22.663 INFO:tasks.workunit.client.0.vm07.stdout:9/366: write d0/db/f6c [143691,39608] 0 2026-03-09T19:27:22.668 INFO:tasks.workunit.client.0.vm07.stdout:2/380: mknod d3/dd/d16/d29/d2d/d45/c83 0 2026-03-09T19:27:22.670 INFO:tasks.workunit.client.0.vm07.stdout:9/367: dread d0/db/d29/d2c/f30 [0,4194304] 0 2026-03-09T19:27:22.671 INFO:tasks.workunit.client.0.vm07.stdout:3/377: write d1/d6/f9 [4723412,85396] 0 2026-03-09T19:27:22.672 INFO:tasks.workunit.client.0.vm07.stdout:6/286: write d0/f3b [1200632,19516] 0 2026-03-09T19:27:22.675 INFO:tasks.workunit.client.0.vm07.stdout:0/279: chown d0/d6/d13/d1c/d50/c5b 8808 1 2026-03-09T19:27:22.677 INFO:tasks.workunit.client.0.vm07.stdout:1/312: write d1/d3/f4e [1391750,108500] 0 2026-03-09T19:27:22.684 INFO:tasks.workunit.client.1.vm08.stdout:3/634: dwrite d0/d8/f4a [0,4194304] 0 2026-03-09T19:27:22.685 INFO:tasks.workunit.client.0.vm07.stdout:3/378: dwrite d1/d6/dd/f3b [4194304,4194304] 0 2026-03-09T19:27:22.687 INFO:tasks.workunit.client.0.vm07.stdout:9/368: dwrite d0/d6/ff [0,4194304] 0 2026-03-09T19:27:22.691 INFO:tasks.workunit.client.0.vm07.stdout:8/344: creat d7/d50/f80 
x:0 0 0 2026-03-09T19:27:22.692 INFO:tasks.workunit.client.1.vm08.stdout:1/710: truncate d9/da/d95/fc0 388609 0 2026-03-09T19:27:22.699 INFO:tasks.workunit.client.0.vm07.stdout:7/356: dread d0/d4/d5/d8/f35 [0,4194304] 0 2026-03-09T19:27:22.699 INFO:tasks.workunit.client.0.vm07.stdout:3/379: sync 2026-03-09T19:27:22.705 INFO:tasks.workunit.client.0.vm07.stdout:3/380: dwrite d1/d6/f70 [0,4194304] 0 2026-03-09T19:27:22.714 INFO:tasks.workunit.client.1.vm08.stdout:7/635: dwrite d5/d14/d38/fbb [0,4194304] 0 2026-03-09T19:27:22.722 INFO:tasks.workunit.client.0.vm07.stdout:5/322: dread - d3/f2f zero size 2026-03-09T19:27:22.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:22 vm07.local ceph-mon[48545]: pgmap v165: 65 pgs: 65 active+clean; 1.8 GiB data, 6.6 GiB used, 113 GiB / 120 GiB avail; 31 MiB/s rd, 117 MiB/s wr, 254 op/s 2026-03-09T19:27:22.749 INFO:tasks.workunit.client.0.vm07.stdout:2/381: unlink d3/dd/d16/f48 0 2026-03-09T19:27:22.754 INFO:tasks.workunit.client.0.vm07.stdout:4/316: link d3/d4f/f5e d3/d11/f6c 0 2026-03-09T19:27:22.759 INFO:tasks.workunit.client.0.vm07.stdout:9/369: mkdir d0/d6/d3a/d81 0 2026-03-09T19:27:22.770 INFO:tasks.workunit.client.1.vm08.stdout:0/571: rename dd/d22/d24/d49/f99 to dd/d22/d7b/d82/fb1 0 2026-03-09T19:27:22.770 INFO:tasks.workunit.client.1.vm08.stdout:5/529: dread - d16/d1e/d30/f3a zero size 2026-03-09T19:27:22.770 INFO:tasks.workunit.client.0.vm07.stdout:8/345: creat d7/d9/f81 x:0 0 0 2026-03-09T19:27:22.770 INFO:tasks.workunit.client.0.vm07.stdout:7/357: creat d0/d4/d5/d26/d3c/d39/f7a x:0 0 0 2026-03-09T19:27:22.770 INFO:tasks.workunit.client.0.vm07.stdout:7/358: dread - d0/d4/d5/d26/f75 zero size 2026-03-09T19:27:22.770 INFO:tasks.workunit.client.0.vm07.stdout:3/381: rename d1/d6/f70 to d1/d6/f76 0 2026-03-09T19:27:22.770 INFO:tasks.workunit.client.0.vm07.stdout:0/280: creat d0/d6/d13/d1c/d61/f63 x:0 0 0 2026-03-09T19:27:22.770 INFO:tasks.workunit.client.0.vm07.stdout:0/281: chown d0 2251061 1 2026-03-09T19:27:22.772 
INFO:tasks.workunit.client.0.vm07.stdout:8/346: creat d7/d50/f82 x:0 0 0 2026-03-09T19:27:22.773 INFO:tasks.workunit.client.1.vm08.stdout:8/524: getdents de/d1d/d69 0 2026-03-09T19:27:22.773 INFO:tasks.workunit.client.0.vm07.stdout:0/282: dwrite d0/d6/f5c [0,4194304] 0 2026-03-09T19:27:22.778 INFO:tasks.workunit.client.1.vm08.stdout:3/635: symlink d0/d6/de/d1b/lc9 0 2026-03-09T19:27:22.778 INFO:tasks.workunit.client.1.vm08.stdout:7/636: dread - d5/d14/dae/d3a/f56 zero size 2026-03-09T19:27:22.778 INFO:tasks.workunit.client.0.vm07.stdout:8/347: write d7/d30/f61 [997771,97565] 0 2026-03-09T19:27:22.780 INFO:tasks.workunit.client.0.vm07.stdout:7/359: symlink d0/d52/d54/d5a/l7b 0 2026-03-09T19:27:22.780 INFO:tasks.workunit.client.0.vm07.stdout:3/382: rmdir d1/d3d 39 2026-03-09T19:27:22.783 INFO:tasks.workunit.client.0.vm07.stdout:4/317: unlink d3/d11/d2b/d37/f28 0 2026-03-09T19:27:22.791 INFO:tasks.workunit.client.0.vm07.stdout:6/287: mknod d0/d1/db/d17/c72 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.1.vm08.stdout:3/636: dread d0/d52/d6d/d77/d88/faf [0,4194304] 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.1.vm08.stdout:3/637: write d0/d6/de/f86 [3054115,15234] 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.1.vm08.stdout:8/525: creat de/d47/fb9 x:0 0 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.1.vm08.stdout:6/582: mkdir d3/d34/d6f/dd2 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.1.vm08.stdout:6/583: write d3/d34/d3b/f8d [1577516,46740] 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.1.vm08.stdout:0/572: fsync dd/d22/f2b 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.1.vm08.stdout:4/560: creat da/fab x:0 0 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.1.vm08.stdout:4/561: chown da/d10/d16/d28/d2f/d4f/d64/d81/c97 132263584 1 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.0.vm07.stdout:6/288: chown d0/d13/l68 77 1 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.0.vm07.stdout:0/283: creat d0/d6/d13/d17/f64 x:0 
0 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.0.vm07.stdout:7/360: creat d0/d4/d5/d26/d32/f7c x:0 0 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.0.vm07.stdout:6/289: unlink d0/d4e/c69 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.0.vm07.stdout:6/290: dwrite d0/f3b [0,4194304] 0 2026-03-09T19:27:22.812 INFO:tasks.workunit.client.0.vm07.stdout:7/361: link d0/d4/d5/d26/d3c/d58/f70 d0/d52/d54/f7d 0 2026-03-09T19:27:22.813 INFO:tasks.workunit.client.0.vm07.stdout:0/284: creat d0/f65 x:0 0 0 2026-03-09T19:27:22.813 INFO:tasks.workunit.client.0.vm07.stdout:6/291: stat d0/d1/c65 0 2026-03-09T19:27:22.813 INFO:tasks.workunit.client.0.vm07.stdout:0/285: write d0/d6/d13/d1c/d61/f63 [348000,65988] 0 2026-03-09T19:27:22.813 INFO:tasks.workunit.client.0.vm07.stdout:7/362: rmdir d0/d4/d5/d8 39 2026-03-09T19:27:22.813 INFO:tasks.workunit.client.0.vm07.stdout:0/286: write d0/d6/d13/d1c/d11/f2e [369304,40738] 0 2026-03-09T19:27:22.815 INFO:tasks.workunit.client.0.vm07.stdout:6/292: dwrite d0/d1/db/d24/d53/d31/f3c [4194304,4194304] 0 2026-03-09T19:27:22.836 INFO:tasks.workunit.client.0.vm07.stdout:8/348: read f5 [602728,64997] 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.0.vm07.stdout:8/349: mkdir d7/d1d/d83 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.0.vm07.stdout:0/287: mknod d0/d6/d13/d33/c66 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.0.vm07.stdout:8/350: creat d7/d50/f84 x:0 0 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.0.vm07.stdout:7/363: mknod d0/d4/d5/d8/d41/d64/d74/c7e 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.0.vm07.stdout:7/364: dread - d0/d4/d5/d26/d32/f7c zero size 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.0.vm07.stdout:7/365: mkdir d0/d52/d54/d55/d7f 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.1.vm08.stdout:0/573: dread dd/d22/d24/f60 [0,4194304] 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.1.vm08.stdout:6/584: unlink d3/db/d43/d69/cd0 0 2026-03-09T19:27:22.837 
INFO:tasks.workunit.client.1.vm08.stdout:0/574: stat dd/d22/d24/l4b 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.1.vm08.stdout:4/562: read - da/d10/d26/d27/f96 zero size 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.1.vm08.stdout:0/575: creat dd/d22/d27/d2e/db0/fb2 x:0 0 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.1.vm08.stdout:4/563: dwrite da/fa8 [0,4194304] 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.1.vm08.stdout:8/526: getdents de/d25/d33 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.1.vm08.stdout:0/576: stat dd/l90 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.1.vm08.stdout:8/527: chown de/d25/d31 15256 1 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.1.vm08.stdout:0/577: fdatasync dd/d22/d24/d49/f5f 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.1.vm08.stdout:8/528: creat de/d1d/d2e/d5f/fba x:0 0 0 2026-03-09T19:27:22.837 INFO:tasks.workunit.client.1.vm08.stdout:8/529: creat de/d1d/d2e/d5f/fbb x:0 0 0 2026-03-09T19:27:22.838 INFO:tasks.workunit.client.1.vm08.stdout:8/530: creat de/d25/d87/fbc x:0 0 0 2026-03-09T19:27:22.838 INFO:tasks.workunit.client.1.vm08.stdout:0/578: dwrite dd/d22/f41 [0,4194304] 0 2026-03-09T19:27:22.842 INFO:tasks.workunit.client.1.vm08.stdout:0/579: truncate dd/f1e 3112185 0 2026-03-09T19:27:22.843 INFO:tasks.workunit.client.1.vm08.stdout:0/580: mkdir dd/d22/d24/d49/d50/db3 0 2026-03-09T19:27:22.846 INFO:tasks.workunit.client.1.vm08.stdout:0/581: rename dd/d22/d27/d4f/d6f to dd/d22/d24/d49/d50/d78/db4 0 2026-03-09T19:27:22.847 INFO:tasks.workunit.client.1.vm08.stdout:0/582: mknod dd/d22/d7b/d82/cb5 0 2026-03-09T19:27:22.849 INFO:tasks.workunit.client.1.vm08.stdout:0/583: mknod dd/d22/d27/d65/cb6 0 2026-03-09T19:27:22.878 INFO:tasks.workunit.client.0.vm07.stdout:0/288: sync 2026-03-09T19:27:22.880 INFO:tasks.workunit.client.0.vm07.stdout:0/289: creat d0/d6/d13/d1c/d11/d56/f67 x:0 0 0 2026-03-09T19:27:22.887 INFO:tasks.workunit.client.1.vm08.stdout:8/531: read de/f5d [852343,16385] 
0 2026-03-09T19:27:22.887 INFO:tasks.workunit.client.0.vm07.stdout:8/351: dread f4 [0,4194304] 0 2026-03-09T19:27:22.888 INFO:tasks.workunit.client.1.vm08.stdout:8/532: chown de/f19 647704 1 2026-03-09T19:27:22.888 INFO:tasks.workunit.client.1.vm08.stdout:8/533: stat de/d1d/d4f/f9c 0 2026-03-09T19:27:22.888 INFO:tasks.workunit.client.0.vm07.stdout:8/352: write d7/d9/d37/d34/f55 [2815673,113208] 0 2026-03-09T19:27:22.903 INFO:tasks.workunit.client.1.vm08.stdout:8/534: creat de/d91/fbd x:0 0 0 2026-03-09T19:27:22.932 INFO:tasks.workunit.client.1.vm08.stdout:4/564: dread da/d10/f6b [0,4194304] 0 2026-03-09T19:27:22.936 INFO:tasks.workunit.client.1.vm08.stdout:4/565: dread - da/d10/d16/d28/d46/d52/d6e/d40/f70 zero size 2026-03-09T19:27:22.954 INFO:tasks.workunit.client.1.vm08.stdout:5/530: dread d16/d45/f54 [0,4194304] 0 2026-03-09T19:27:22.954 INFO:tasks.workunit.client.1.vm08.stdout:4/566: rename da/d10/d26/d27/d32/f39 to da/d10/d26/d27/fac 0 2026-03-09T19:27:22.965 INFO:tasks.workunit.client.1.vm08.stdout:1/711: sync 2026-03-09T19:27:22.968 INFO:tasks.workunit.client.1.vm08.stdout:6/585: sync 2026-03-09T19:27:22.972 INFO:tasks.workunit.client.1.vm08.stdout:6/586: creat d3/db/d43/fd3 x:0 0 0 2026-03-09T19:27:22.973 INFO:tasks.workunit.client.0.vm07.stdout:8/353: read d7/f2e [3807618,124183] 0 2026-03-09T19:27:22.975 INFO:tasks.workunit.client.1.vm08.stdout:1/712: symlink d9/da/d12/d91/dc5/ld3 0 2026-03-09T19:27:22.987 INFO:tasks.workunit.client.0.vm07.stdout:8/354: truncate d7/d16/d1e/f33 3130004 0 2026-03-09T19:27:22.987 INFO:tasks.workunit.client.0.vm07.stdout:8/355: chown d7/d9/d57 0 1 2026-03-09T19:27:22.987 INFO:tasks.workunit.client.1.vm08.stdout:1/713: read - d9/da/d53/d67/d6c/fbc zero size 2026-03-09T19:27:22.987 INFO:tasks.workunit.client.1.vm08.stdout:1/714: write d9/d40/d49/f7c [4587997,41577] 0 2026-03-09T19:27:22.987 INFO:tasks.workunit.client.1.vm08.stdout:1/715: truncate d9/da/d12/fac 643587 0 2026-03-09T19:27:22.991 
INFO:tasks.workunit.client.1.vm08.stdout:6/587: dread d3/d94/fb5 [0,4194304] 0 2026-03-09T19:27:22.992 INFO:tasks.workunit.client.1.vm08.stdout:1/716: mknod d9/da/d12/cd4 0 2026-03-09T19:27:23.001 INFO:tasks.workunit.client.0.vm07.stdout:5/323: dread d3/d1a/f12 [4194304,4194304] 0 2026-03-09T19:27:23.001 INFO:tasks.workunit.client.1.vm08.stdout:6/588: rename d3/l7d to d3/d34/d5c/ld4 0 2026-03-09T19:27:23.010 INFO:tasks.workunit.client.0.vm07.stdout:1/313: write d1/d3/d21/f5f [1046298,46077] 0 2026-03-09T19:27:23.019 INFO:tasks.workunit.client.0.vm07.stdout:5/324: symlink d3/dd/d26/d3f/l6d 0 2026-03-09T19:27:23.019 INFO:tasks.workunit.client.0.vm07.stdout:5/325: readlink d3/dd/l4c 0 2026-03-09T19:27:23.019 INFO:tasks.workunit.client.1.vm08.stdout:8/535: truncate de/d1d/d4f/fae 490025 0 2026-03-09T19:27:23.021 INFO:tasks.workunit.client.0.vm07.stdout:2/382: dwrite f0 [0,4194304] 0 2026-03-09T19:27:23.022 INFO:tasks.workunit.client.0.vm07.stdout:2/383: stat d3/dd/d16/d30/f3a 0 2026-03-09T19:27:23.022 INFO:tasks.workunit.client.0.vm07.stdout:2/384: chown d3 2227846 1 2026-03-09T19:27:23.028 INFO:tasks.workunit.client.0.vm07.stdout:5/326: rename d3/d1a/c29 to d3/dd/d26/d2d/d60/c6e 0 2026-03-09T19:27:23.035 INFO:tasks.workunit.client.1.vm08.stdout:2/529: dwrite f2 [0,4194304] 0 2026-03-09T19:27:23.040 INFO:tasks.workunit.client.1.vm08.stdout:9/556: write d0/d1b/d97/d48/d5d/f92 [983462,38238] 0 2026-03-09T19:27:23.046 INFO:tasks.workunit.client.0.vm07.stdout:4/318: rmdir d3/d11 39 2026-03-09T19:27:23.049 INFO:tasks.workunit.client.0.vm07.stdout:9/370: write d0/d17/f42 [264582,111182] 0 2026-03-09T19:27:23.064 INFO:tasks.workunit.client.1.vm08.stdout:7/637: write d5/d14/d2b/fa1 [818535,36881] 0 2026-03-09T19:27:23.064 INFO:tasks.workunit.client.0.vm07.stdout:1/314: mknod d1/d11/d37/d5a/d6d/c6f 0 2026-03-09T19:27:23.064 INFO:tasks.workunit.client.0.vm07.stdout:3/383: write d1/d3d/f5e [7678826,21521] 0 2026-03-09T19:27:23.064 INFO:tasks.workunit.client.1.vm08.stdout:1/717: 
getdents d9/d11/db6 0 2026-03-09T19:27:23.065 INFO:tasks.workunit.client.0.vm07.stdout:2/385: read d3/d11/f31 [1160606,7705] 0 2026-03-09T19:27:23.066 INFO:tasks.workunit.client.1.vm08.stdout:9/557: dread d0/d2/d8/f8e [0,4194304] 0 2026-03-09T19:27:23.068 INFO:tasks.workunit.client.0.vm07.stdout:1/315: symlink d1/d11/l70 0 2026-03-09T19:27:23.068 INFO:tasks.workunit.client.0.vm07.stdout:1/316: stat d1/db/d31/d4f/l5b 0 2026-03-09T19:27:23.069 INFO:tasks.workunit.client.0.vm07.stdout:1/317: chown d1/d11/d37/d5d/d50/f62 526375 1 2026-03-09T19:27:23.072 INFO:tasks.workunit.client.1.vm08.stdout:3/638: dwrite d0/d6/f91 [0,4194304] 0 2026-03-09T19:27:23.077 INFO:tasks.workunit.client.1.vm08.stdout:3/639: write d0/d6/de/d1b/fc0 [51516,101083] 0 2026-03-09T19:27:23.095 INFO:tasks.workunit.client.0.vm07.stdout:0/290: chown d0/d6/d13/d17/l2a 301922489 1 2026-03-09T19:27:23.095 INFO:tasks.workunit.client.0.vm07.stdout:5/327: mknod d3/dd/d26/c6f 0 2026-03-09T19:27:23.095 INFO:tasks.workunit.client.0.vm07.stdout:3/384: write d1/d1f/d16/f39 [1605540,117280] 0 2026-03-09T19:27:23.095 INFO:tasks.workunit.client.0.vm07.stdout:6/293: dwrite d0/f9 [0,4194304] 0 2026-03-09T19:27:23.095 INFO:tasks.workunit.client.0.vm07.stdout:4/319: mknod d3/d11/d51/c6d 0 2026-03-09T19:27:23.095 INFO:tasks.workunit.client.0.vm07.stdout:7/366: write d0/d4/d5/f20 [3365393,7857] 0 2026-03-09T19:27:23.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.094+0000 7f2a92a5b640 1 -- 192.168.123.107:0/4162545494 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a8c071a50 msgr2=0x7f2a8c071e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.094+0000 7f2a92a5b640 1 --2- 192.168.123.107:0/4162545494 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a8c071a50 0x7f2a8c071e50 secure :-1 s=READY pgs=334 cs=0 l=1 rev1=1 crypto rx=0x7f2a88009a00 tx=0x7f2a8802f290 comp rx=0 
tx=0).stop 2026-03-09T19:27:23.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.096+0000 7f2a92a5b640 1 -- 192.168.123.107:0/4162545494 shutdown_connections 2026-03-09T19:27:23.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.096+0000 7f2a92a5b640 1 --2- 192.168.123.107:0/4162545494 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a8c072420 0x7f2a8c077190 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.096+0000 7f2a92a5b640 1 --2- 192.168.123.107:0/4162545494 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a8c071a50 0x7f2a8c071e50 unknown :-1 s=CLOSED pgs=334 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.096+0000 7f2a92a5b640 1 -- 192.168.123.107:0/4162545494 >> 192.168.123.107:0/4162545494 conn(0x7f2a8c06d4f0 msgr2=0x7f2a8c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:23.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.099+0000 7f2a92a5b640 1 -- 192.168.123.107:0/4162545494 shutdown_connections 2026-03-09T19:27:23.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.099+0000 7f2a92a5b640 1 -- 192.168.123.107:0/4162545494 wait complete. 
2026-03-09T19:27:23.098 INFO:tasks.workunit.client.1.vm08.stdout:9/558: symlink d0/d1b/d68/d7f/d8c/da2/da8/lbd 0 2026-03-09T19:27:23.099 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.100+0000 7f2a92a5b640 1 Processor -- start 2026-03-09T19:27:23.099 INFO:tasks.workunit.client.1.vm08.stdout:9/559: readlink d0/l2e 0 2026-03-09T19:27:23.100 INFO:tasks.workunit.client.1.vm08.stdout:7/638: dread d5/d14/d38/f40 [0,4194304] 0 2026-03-09T19:27:23.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.100+0000 7f2a92a5b640 1 -- start start 2026-03-09T19:27:23.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.102+0000 7f2a92a5b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a8c071a50 0x7f2a8c1b4050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.102+0000 7f2a92a5b640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a8c072420 0x7f2a8c1b4590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.102+0000 7f2a92a5b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a8c1b4b60 con 0x7f2a8c071a50 2026-03-09T19:27:23.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.102+0000 7f2a92a5b640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a8c1b4cd0 con 0x7f2a8c072420 2026-03-09T19:27:23.101 INFO:tasks.workunit.client.0.vm07.stdout:7/367: dwrite d0/d4/d5/d26/d32/f45 [0,4194304] 0 2026-03-09T19:27:23.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.102+0000 7f2a91a59640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a8c071a50 0x7f2a8c1b4050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:23.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.102+0000 7f2a91a59640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a8c071a50 0x7f2a8c1b4050 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:50568/0 (socket says 192.168.123.107:50568) 2026-03-09T19:27:23.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.102+0000 7f2a91a59640 1 -- 192.168.123.107:0/1742197470 learned_addr learned my addr 192.168.123.107:0/1742197470 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:23.102 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.103+0000 7f2a91258640 1 --2- 192.168.123.107:0/1742197470 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a8c072420 0x7f2a8c1b4590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:23.102 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.103+0000 7f2a91258640 1 -- 192.168.123.107:0/1742197470 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a8c071a50 msgr2=0x7f2a8c1b4050 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.102 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.103+0000 7f2a91258640 1 --2- 192.168.123.107:0/1742197470 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a8c071a50 0x7f2a8c1b4050 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.102 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.103+0000 7f2a91258640 1 -- 192.168.123.107:0/1742197470 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2a88009660 con 
0x7f2a8c072420 2026-03-09T19:27:23.103 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.103+0000 7f2a91258640 1 --2- 192.168.123.107:0/1742197470 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a8c072420 0x7f2a8c1b4590 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f2a8400bcc0 tx=0x7f2a8400bdc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:23.104 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.104+0000 7f2a82ffd640 1 -- 192.168.123.107:0/1742197470 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a84002c70 con 0x7f2a8c072420 2026-03-09T19:27:23.104 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.104+0000 7f2a82ffd640 1 -- 192.168.123.107:0/1742197470 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2a84002dd0 con 0x7f2a8c072420 2026-03-09T19:27:23.104 INFO:tasks.workunit.client.0.vm07.stdout:9/371: creat d0/db/d29/d2c/d36/d7d/f82 x:0 0 0 2026-03-09T19:27:23.104 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.105+0000 7f2a82ffd640 1 -- 192.168.123.107:0/1742197470 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a84015680 con 0x7f2a8c072420 2026-03-09T19:27:23.105 INFO:tasks.workunit.client.1.vm08.stdout:7/639: write d5/d14/dae/d3a/d42/fb7 [1862583,109504] 0 2026-03-09T19:27:23.106 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.106+0000 7f2a92a5b640 1 -- 192.168.123.107:0/1742197470 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2a8c1b9770 con 0x7f2a8c072420 2026-03-09T19:27:23.106 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.106+0000 7f2a92a5b640 1 -- 192.168.123.107:0/1742197470 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2a8c1b9c40 con 0x7f2a8c072420 
2026-03-09T19:27:23.106 INFO:tasks.workunit.client.0.vm07.stdout:5/328: symlink d3/d1a/d28/d36/l70 0 2026-03-09T19:27:23.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.107+0000 7f2a92a5b640 1 -- 192.168.123.107:0/1742197470 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2a5c005350 con 0x7f2a8c072420 2026-03-09T19:27:23.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.110+0000 7f2a82ffd640 1 -- 192.168.123.107:0/1742197470 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f2a84008030 con 0x7f2a8c072420 2026-03-09T19:27:23.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.110+0000 7f2a82ffd640 1 --2- 192.168.123.107:0/1742197470 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a60075fb0 0x7f2a60078470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.111+0000 7f2a82ffd640 1 -- 192.168.123.107:0/1742197470 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f2a8401a030 con 0x7f2a8c072420 2026-03-09T19:27:23.136 INFO:tasks.workunit.client.0.vm07.stdout:4/320: creat d3/d11/d16/f6e x:0 0 0 2026-03-09T19:27:23.137 INFO:tasks.workunit.client.1.vm08.stdout:9/560: chown d0/d2/d8/cc 0 1 2026-03-09T19:27:23.137 INFO:tasks.workunit.client.1.vm08.stdout:9/561: stat d0/d2/l67 0 2026-03-09T19:27:23.137 INFO:tasks.workunit.client.1.vm08.stdout:0/584: dwrite dd/d22/d63/d6e/f8a [0,4194304] 0 2026-03-09T19:27:23.137 INFO:tasks.workunit.client.1.vm08.stdout:0/585: chown dd/d31/c35 1790890 1 2026-03-09T19:27:23.137 INFO:tasks.workunit.client.1.vm08.stdout:7/640: creat d5/d14/d27/d78/dc7/fd4 x:0 0 0 2026-03-09T19:27:23.137 INFO:tasks.workunit.client.1.vm08.stdout:4/567: write da/d10/d16/d28/d2f/d4f/d56/f9a 
[3723926,14011] 0 2026-03-09T19:27:23.137 INFO:tasks.workunit.client.1.vm08.stdout:5/531: dwrite d16/d45/f5d [4194304,4194304] 0 2026-03-09T19:27:23.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.111+0000 7f2a91a59640 1 --2- 192.168.123.107:0/1742197470 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a60075fb0 0x7f2a60078470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:23.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.112+0000 7f2a91a59640 1 --2- 192.168.123.107:0/1742197470 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a60075fb0 0x7f2a60078470 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f2a88002f40 tx=0x7f2a88005e50 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:23.137 INFO:tasks.workunit.client.0.vm07.stdout:3/385: unlink d1/d6/dd/f53 0 2026-03-09T19:27:23.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.122+0000 7f2a82ffd640 1 -- 192.168.123.107:0/1742197470 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f2a840605e0 con 0x7f2a8c072420 2026-03-09T19:27:23.137 INFO:tasks.workunit.client.0.vm07.stdout:8/356: dwrite d7/d9/f4c [0,4194304] 0 2026-03-09T19:27:23.142 INFO:tasks.workunit.client.1.vm08.stdout:0/586: mknod dd/d22/d63/cb7 0 2026-03-09T19:27:23.146 INFO:tasks.workunit.client.0.vm07.stdout:1/318: dread d1/d3/f12 [0,4194304] 0 2026-03-09T19:27:23.157 INFO:tasks.workunit.client.1.vm08.stdout:5/532: rmdir d16/d1e/d30/d6f 39 2026-03-09T19:27:23.193 INFO:tasks.workunit.client.1.vm08.stdout:0/587: truncate dd/d22/d24/f77 2908720 0 2026-03-09T19:27:23.198 INFO:tasks.workunit.client.0.vm07.stdout:7/368: mkdir d0/d80 0 2026-03-09T19:27:23.205 
INFO:tasks.workunit.client.1.vm08.stdout:5/533: mknod d16/d1e/ca9 0 2026-03-09T19:27:23.210 INFO:tasks.workunit.client.1.vm08.stdout:5/534: dwrite d16/d45/f6b [0,4194304] 0 2026-03-09T19:27:23.212 INFO:tasks.workunit.client.0.vm07.stdout:0/291: creat d0/f68 x:0 0 0 2026-03-09T19:27:23.213 INFO:tasks.workunit.client.0.vm07.stdout:0/292: chown d0/d6/d13/d33/c44 492 1 2026-03-09T19:27:23.217 INFO:tasks.workunit.client.1.vm08.stdout:0/588: mknod dd/d22/cb8 0 2026-03-09T19:27:23.218 INFO:tasks.workunit.client.1.vm08.stdout:0/589: write dd/d22/d24/d49/d50/d78/db4/fa5 [997769,20390] 0 2026-03-09T19:27:23.239 INFO:tasks.workunit.client.1.vm08.stdout:5/535: chown d16/d1e/fa5 56 1 2026-03-09T19:27:23.239 INFO:tasks.workunit.client.1.vm08.stdout:0/590: creat dd/d22/d27/d6c/fb9 x:0 0 0 2026-03-09T19:27:23.239 INFO:tasks.workunit.client.0.vm07.stdout:9/372: creat d0/db/d29/d32/d5c/d69/f83 x:0 0 0 2026-03-09T19:27:23.239 INFO:tasks.workunit.client.0.vm07.stdout:9/373: stat d0/db/d29/d2c/d36/f3c 0 2026-03-09T19:27:23.239 INFO:tasks.workunit.client.0.vm07.stdout:1/319: getdents d1/d11/d37/d3f/d6e 0 2026-03-09T19:27:23.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.239+0000 7f2a92a5b640 1 -- 192.168.123.107:0/1742197470 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f2a5c002bf0 con 0x7f2a60075fb0 2026-03-09T19:27:23.240 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.241+0000 7f2a82ffd640 1 -- 192.168.123.107:0/1742197470 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f2a5c002bf0 con 0x7f2a60075fb0 2026-03-09T19:27:23.242 INFO:tasks.workunit.client.0.vm07.stdout:0/293: mkdir d0/d6/d13/d1c/d61/d69 0 2026-03-09T19:27:23.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.245+0000 7f2a80ff9640 1 -- 192.168.123.107:0/1742197470 >> 
[v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a60075fb0 msgr2=0x7f2a60078470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.245+0000 7f2a80ff9640 1 --2- 192.168.123.107:0/1742197470 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a60075fb0 0x7f2a60078470 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f2a88002f40 tx=0x7f2a88005e50 comp rx=0 tx=0).stop 2026-03-09T19:27:23.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.245+0000 7f2a80ff9640 1 -- 192.168.123.107:0/1742197470 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a8c072420 msgr2=0x7f2a8c1b4590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.245+0000 7f2a80ff9640 1 --2- 192.168.123.107:0/1742197470 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a8c072420 0x7f2a8c1b4590 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f2a8400bcc0 tx=0x7f2a8400bdc0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.244 INFO:tasks.workunit.client.0.vm07.stdout:9/374: mknod d0/d17/c84 0 2026-03-09T19:27:23.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.245+0000 7f2a80ff9640 1 -- 192.168.123.107:0/1742197470 shutdown_connections 2026-03-09T19:27:23.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.245+0000 7f2a80ff9640 1 --2- 192.168.123.107:0/1742197470 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f2a60075fb0 0x7f2a60078470 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.245+0000 7f2a80ff9640 1 --2- 192.168.123.107:0/1742197470 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a8c072420 0x7f2a8c1b4590 
unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.245+0000 7f2a80ff9640 1 --2- 192.168.123.107:0/1742197470 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a8c071a50 0x7f2a8c1b4050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.245+0000 7f2a80ff9640 1 -- 192.168.123.107:0/1742197470 >> 192.168.123.107:0/1742197470 conn(0x7f2a8c06d4f0 msgr2=0x7f2a8c075620 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:23.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.246+0000 7f2a80ff9640 1 -- 192.168.123.107:0/1742197470 shutdown_connections 2026-03-09T19:27:23.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.246+0000 7f2a80ff9640 1 -- 192.168.123.107:0/1742197470 wait complete. 2026-03-09T19:27:23.246 INFO:tasks.workunit.client.0.vm07.stdout:8/357: creat d7/d9/d37/f85 x:0 0 0 2026-03-09T19:27:23.248 INFO:tasks.workunit.client.0.vm07.stdout:7/369: link d0/d4/d5/d8/d1a/f4d d0/d80/f81 0 2026-03-09T19:27:23.249 INFO:tasks.workunit.client.0.vm07.stdout:7/370: write d0/d4/d5/d8/d41/f73 [9464461,78893] 0 2026-03-09T19:27:23.251 INFO:tasks.workunit.client.1.vm08.stdout:0/591: rename dd/d22/d27/d2e/db0/f9c to dd/d22/fba 0 2026-03-09T19:27:23.269 INFO:tasks.workunit.client.0.vm07.stdout:1/320: creat d1/db/d31/d56/f71 x:0 0 0 2026-03-09T19:27:23.269 INFO:tasks.workunit.client.0.vm07.stdout:8/358: write d7/d50/f6f [667914,56138] 0 2026-03-09T19:27:23.269 INFO:tasks.workunit.client.0.vm07.stdout:8/359: fsync d7/d50/f82 0 2026-03-09T19:27:23.270 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:27:23.270 INFO:tasks.workunit.client.0.vm07.stdout:9/375: truncate d0/d17/f1f 1047242 0 2026-03-09T19:27:23.283 INFO:tasks.workunit.client.1.vm08.stdout:5/536: dread d16/f18 [4194304,4194304] 0 
2026-03-09T19:27:23.283 INFO:tasks.workunit.client.1.vm08.stdout:5/537: rmdir d16/d1e/d30/d6f 39 2026-03-09T19:27:23.283 INFO:tasks.workunit.client.1.vm08.stdout:5/538: symlink d16/d8e/laa 0 2026-03-09T19:27:23.283 INFO:tasks.workunit.client.1.vm08.stdout:5/539: rmdir d16/d1e/d8c/d99/da8/d9a 39 2026-03-09T19:27:23.283 INFO:tasks.workunit.client.1.vm08.stdout:5/540: creat d16/d1e/d8c/fab x:0 0 0 2026-03-09T19:27:23.283 INFO:tasks.workunit.client.1.vm08.stdout:5/541: creat d16/d1e/d8c/d99/da8/d9a/fac x:0 0 0 2026-03-09T19:27:23.283 INFO:tasks.workunit.client.1.vm08.stdout:5/542: symlink d16/d45/d81/lad 0 2026-03-09T19:27:23.284 INFO:tasks.workunit.client.1.vm08.stdout:0/592: dread dd/d22/d27/f42 [0,4194304] 0 2026-03-09T19:27:23.284 INFO:tasks.workunit.client.1.vm08.stdout:0/593: creat dd/d22/d24/d49/d50/d78/fbb x:0 0 0 2026-03-09T19:27:23.284 INFO:tasks.workunit.client.1.vm08.stdout:9/562: read d0/d2/d14/f4d [876699,38193] 0 2026-03-09T19:27:23.284 INFO:tasks.workunit.client.0.vm07.stdout:1/321: creat d1/d3/d52/f72 x:0 0 0 2026-03-09T19:27:23.284 INFO:tasks.workunit.client.0.vm07.stdout:1/322: dread - d1/d11/d37/d5d/d50/f63 zero size 2026-03-09T19:27:23.284 INFO:tasks.workunit.client.0.vm07.stdout:8/360: dread d7/d16/d1e/f6e [0,4194304] 0 2026-03-09T19:27:23.284 INFO:tasks.workunit.client.0.vm07.stdout:9/376: dwrite d0/db/d29/d2c/d36/d7d/f82 [0,4194304] 0 2026-03-09T19:27:23.284 INFO:tasks.workunit.client.0.vm07.stdout:9/377: fsync d0/db/d29/d2c/f34 0 2026-03-09T19:27:23.285 INFO:tasks.workunit.client.0.vm07.stdout:8/361: write d7/f19 [4750746,68326] 0 2026-03-09T19:27:23.295 INFO:tasks.workunit.client.0.vm07.stdout:1/323: rename d1/d11/d37/d5d/d50/c53 to d1/db/d31/d56/c73 0 2026-03-09T19:27:23.296 INFO:tasks.workunit.client.0.vm07.stdout:1/324: fdatasync d1/d3/f23 0 2026-03-09T19:27:23.300 INFO:tasks.workunit.client.0.vm07.stdout:1/325: dread d1/d3/d21/f5f [0,4194304] 0 2026-03-09T19:27:23.302 INFO:tasks.workunit.client.0.vm07.stdout:1/326: mknod d1/db/d31/d56/c74 
0 2026-03-09T19:27:23.304 INFO:tasks.workunit.client.0.vm07.stdout:1/327: creat d1/d11/d37/d5a/f75 x:0 0 0 2026-03-09T19:27:23.314 INFO:tasks.workunit.client.0.vm07.stdout:1/328: dwrite d1/db/d31/d56/f6a [0,4194304] 0 2026-03-09T19:27:23.315 INFO:tasks.workunit.client.0.vm07.stdout:1/329: dread - d1/d11/d37/d5d/d50/f6b zero size 2026-03-09T19:27:23.318 INFO:tasks.workunit.client.0.vm07.stdout:1/330: dwrite d1/d11/d37/d5d/d50/f62 [0,4194304] 0 2026-03-09T19:27:23.351 INFO:tasks.workunit.client.0.vm07.stdout:9/378: sync 2026-03-09T19:27:23.352 INFO:tasks.workunit.client.0.vm07.stdout:9/379: stat d0/db/d29/d4d/c4e 0 2026-03-09T19:27:23.359 INFO:tasks.workunit.client.0.vm07.stdout:9/380: dwrite d0/db/d29/d2c/d36/d7d/f82 [0,4194304] 0 2026-03-09T19:27:23.364 INFO:tasks.workunit.client.0.vm07.stdout:9/381: mknod d0/d6f/c85 0 2026-03-09T19:27:23.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.378+0000 7fcb3093c640 1 -- 192.168.123.107:0/3167984287 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb2c0678f0 msgr2=0x7fcb2c0ff9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.378+0000 7fcb3093c640 1 --2- 192.168.123.107:0/3167984287 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb2c0678f0 0x7fcb2c0ff9f0 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fcb140099b0 tx=0x7fcb1402f240 comp rx=0 tx=0).stop 2026-03-09T19:27:23.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.381+0000 7fcb3093c640 1 -- 192.168.123.107:0/3167984287 shutdown_connections 2026-03-09T19:27:23.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.381+0000 7fcb3093c640 1 --2- 192.168.123.107:0/3167984287 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb2c0fff30 0x7fcb2c1003b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.380 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.381+0000 7fcb3093c640 1 --2- 192.168.123.107:0/3167984287 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb2c0678f0 0x7fcb2c0ff9f0 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.381+0000 7fcb3093c640 1 -- 192.168.123.107:0/3167984287 >> 192.168.123.107:0/3167984287 conn(0x7fcb2c0f9f80 msgr2=0x7fcb2c0fc3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:23.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.382+0000 7fcb3093c640 1 -- 192.168.123.107:0/3167984287 shutdown_connections 2026-03-09T19:27:23.381 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.382+0000 7fcb3093c640 1 -- 192.168.123.107:0/3167984287 wait complete. 2026-03-09T19:27:23.381 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.382+0000 7fcb3093c640 1 Processor -- start 2026-03-09T19:27:23.381 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.383+0000 7fcb3093c640 1 -- start start 2026-03-09T19:27:23.382 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.383+0000 7fcb3093c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb2c0678f0 0x7fcb2c1078f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.382 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.383+0000 7fcb3093c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb2c0fff30 0x7fcb2c105f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.382 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.383+0000 7fcb3093c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb2c107ec0 con 0x7fcb2c0fff30 2026-03-09T19:27:23.382 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.383+0000 7fcb3093c640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb2c19bce0 con 0x7fcb2c0678f0 2026-03-09T19:27:23.382 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.383+0000 7fcb2a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb2c0fff30 0x7fcb2c105f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:23.382 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.383+0000 7fcb2a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb2c0fff30 0x7fcb2c105f40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:50598/0 (socket says 192.168.123.107:50598) 2026-03-09T19:27:23.382 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.383+0000 7fcb2a575640 1 -- 192.168.123.107:0/1236476140 learned_addr learned my addr 192.168.123.107:0/1236476140 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:23.383 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.384+0000 7fcb2ad76640 1 --2- 192.168.123.107:0/1236476140 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb2c0678f0 0x7fcb2c1078f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:23.383 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.384+0000 7fcb2a575640 1 -- 192.168.123.107:0/1236476140 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb2c0678f0 msgr2=0x7fcb2c1078f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.383 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.384+0000 7fcb2a575640 1 --2- 
192.168.123.107:0/1236476140 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb2c0678f0 0x7fcb2c1078f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.383 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.384+0000 7fcb2a575640 1 -- 192.168.123.107:0/1236476140 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcb14009660 con 0x7fcb2c0fff30 2026-03-09T19:27:23.384 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.385+0000 7fcb2a575640 1 --2- 192.168.123.107:0/1236476140 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb2c0fff30 0x7fcb2c105f40 secure :-1 s=READY pgs=335 cs=0 l=1 rev1=1 crypto rx=0x7fcb2000e990 tx=0x7fcb2000ee60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:23.394 INFO:tasks.workunit.client.0.vm07.stdout:5/329: dread d3/fe [0,4194304] 0 2026-03-09T19:27:23.399 INFO:tasks.workunit.client.0.vm07.stdout:5/330: dwrite d3/d1a/d28/d36/f61 [0,4194304] 0 2026-03-09T19:27:23.400 INFO:tasks.workunit.client.1.vm08.stdout:3/640: dread d0/d6/de/d15/d96/fa0 [0,4194304] 0 2026-03-09T19:27:23.400 INFO:tasks.workunit.client.1.vm08.stdout:9/563: dread d0/f13 [0,4194304] 0 2026-03-09T19:27:23.401 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.402+0000 7fcb0bfff640 1 -- 192.168.123.107:0/1236476140 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb2000cd30 con 0x7fcb2c0fff30 2026-03-09T19:27:23.401 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.402+0000 7fcb3093c640 1 -- 192.168.123.107:0/1236476140 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcb2c19bfc0 con 0x7fcb2c0fff30 2026-03-09T19:27:23.401 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.402+0000 7fcb3093c640 1 -- 192.168.123.107:0/1236476140 
--> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcb2c1067f0 con 0x7fcb2c0fff30 2026-03-09T19:27:23.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.403+0000 7fcb0bfff640 1 -- 192.168.123.107:0/1236476140 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fcb2000ce90 con 0x7fcb2c0fff30 2026-03-09T19:27:23.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.403+0000 7fcb0bfff640 1 -- 192.168.123.107:0/1236476140 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb20010640 con 0x7fcb2c0fff30 2026-03-09T19:27:23.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.403+0000 7fcb3093c640 1 -- 192.168.123.107:0/1236476140 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcb2c0ff9f0 con 0x7fcb2c0fff30 2026-03-09T19:27:23.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.410+0000 7fcb0bfff640 1 -- 192.168.123.107:0/1236476140 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7fcb200026e0 con 0x7fcb2c0fff30 2026-03-09T19:27:23.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.411+0000 7fcb0bfff640 1 --2- 192.168.123.107:0/1236476140 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcafc075f60 0x7fcafc078420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.412+0000 7fcb0bfff640 1 -- 192.168.123.107:0/1236476140 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fcb20014070 con 0x7fcb2c0fff30 2026-03-09T19:27:23.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.412+0000 7fcb0bfff640 1 -- 192.168.123.107:0/1236476140 <== mon.0 
v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7fcb20098310 con 0x7fcb2c0fff30 2026-03-09T19:27:23.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.413+0000 7fcb2ad76640 1 --2- 192.168.123.107:0/1236476140 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcafc075f60 0x7fcafc078420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:23.412 INFO:tasks.workunit.client.0.vm07.stdout:5/331: mkdir d3/dd/d26/d3f/d47/d71 0 2026-03-09T19:27:23.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.414+0000 7fcb2ad76640 1 --2- 192.168.123.107:0/1236476140 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcafc075f60 0x7fcafc078420 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7fcb14002b60 tx=0x7fcb1403a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:23.425 INFO:tasks.workunit.client.1.vm08.stdout:3/641: creat d0/d6/de/d6e/d51/d7f/fca x:0 0 0 2026-03-09T19:27:23.428 INFO:tasks.workunit.client.1.vm08.stdout:0/594: dread dd/d22/d63/d6e/d72/f8f [0,4194304] 0 2026-03-09T19:27:23.428 INFO:tasks.workunit.client.1.vm08.stdout:8/536: write de/d25/d33/f55 [2173426,99825] 0 2026-03-09T19:27:23.437 INFO:tasks.workunit.client.0.vm07.stdout:5/332: sync 2026-03-09T19:27:23.438 INFO:tasks.workunit.client.1.vm08.stdout:6/589: dwrite d3/f7 [0,4194304] 0 2026-03-09T19:27:23.445 INFO:tasks.workunit.client.1.vm08.stdout:9/564: symlink d0/d1b/d68/d7f/d8c/lbe 0 2026-03-09T19:27:23.453 INFO:tasks.workunit.client.1.vm08.stdout:3/642: mkdir d0/d6/d93/dcb 0 2026-03-09T19:27:23.465 INFO:tasks.workunit.client.0.vm07.stdout:5/333: fdatasync d3/dd/d26/d3f/d47/f62 0 2026-03-09T19:27:23.466 INFO:tasks.workunit.client.0.vm07.stdout:5/334: dread - 
d3/d1a/d28/d36/f63 zero size 2026-03-09T19:27:23.469 INFO:tasks.workunit.client.0.vm07.stdout:5/335: dread d3/d1a/d28/d48/f50 [0,4194304] 0 2026-03-09T19:27:23.474 INFO:tasks.workunit.client.1.vm08.stdout:0/595: dread - dd/d22/d27/f9f zero size 2026-03-09T19:27:23.477 INFO:tasks.workunit.client.1.vm08.stdout:2/530: write d3/d4/d23/d2c/f94 [1008166,80058] 0 2026-03-09T19:27:23.482 INFO:tasks.workunit.client.1.vm08.stdout:8/537: symlink de/d1d/d2e/lbe 0 2026-03-09T19:27:23.484 INFO:tasks.workunit.client.1.vm08.stdout:1/718: dwrite d9/da/f30 [0,4194304] 0 2026-03-09T19:27:23.500 INFO:tasks.workunit.client.1.vm08.stdout:8/538: truncate de/d25/d31/d82/fa9 1560151 0 2026-03-09T19:27:23.516 INFO:tasks.workunit.client.1.vm08.stdout:6/590: unlink d3/d15/d8a/fa1 0 2026-03-09T19:27:23.526 INFO:tasks.workunit.client.0.vm07.stdout:5/336: unlink d3/d1a/fc 0 2026-03-09T19:27:23.531 INFO:tasks.workunit.client.1.vm08.stdout:2/531: mkdir d3/d4/d23/d2c/d39/db9 0 2026-03-09T19:27:23.532 INFO:tasks.workunit.client.1.vm08.stdout:1/719: mknod d9/da/d95/cd5 0 2026-03-09T19:27:23.533 INFO:tasks.workunit.client.1.vm08.stdout:8/539: read - de/d47/faa zero size 2026-03-09T19:27:23.533 INFO:tasks.workunit.client.1.vm08.stdout:1/720: readlink d9/da/dc/l83 0 2026-03-09T19:27:23.534 INFO:tasks.workunit.client.1.vm08.stdout:8/540: dread - de/fb2 zero size 2026-03-09T19:27:23.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.554+0000 7fcb3093c640 1 -- 192.168.123.107:0/1236476140 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fcb2c106d70 con 0x7fcafc075f60 2026-03-09T19:27:23.555 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.556+0000 7fcb0bfff640 1 -- 192.168.123.107:0/1236476140 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fcb2c106d70 con 0x7fcafc075f60 
2026-03-09T19:27:23.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.561+0000 7fcb3093c640 1 -- 192.168.123.107:0/1236476140 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcafc075f60 msgr2=0x7fcafc078420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.561+0000 7fcb3093c640 1 --2- 192.168.123.107:0/1236476140 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcafc075f60 0x7fcafc078420 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7fcb14002b60 tx=0x7fcb1403a040 comp rx=0 tx=0).stop 2026-03-09T19:27:23.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.561+0000 7fcb3093c640 1 -- 192.168.123.107:0/1236476140 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb2c0fff30 msgr2=0x7fcb2c105f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.561+0000 7fcb3093c640 1 --2- 192.168.123.107:0/1236476140 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb2c0fff30 0x7fcb2c105f40 secure :-1 s=READY pgs=335 cs=0 l=1 rev1=1 crypto rx=0x7fcb2000e990 tx=0x7fcb2000ee60 comp rx=0 tx=0).stop 2026-03-09T19:27:23.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.562+0000 7fcb3093c640 1 -- 192.168.123.107:0/1236476140 shutdown_connections 2026-03-09T19:27:23.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.562+0000 7fcb3093c640 1 --2- 192.168.123.107:0/1236476140 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7fcafc075f60 0x7fcafc078420 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.562+0000 7fcb3093c640 1 --2- 192.168.123.107:0/1236476140 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb2c0fff30 0x7fcb2c105f40 secure :-1 s=CLOSED pgs=335 cs=0 l=1 rev1=1 crypto rx=0x7fcb2000e990 tx=0x7fcb2000ee60 comp rx=0 tx=0).stop 2026-03-09T19:27:23.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.562+0000 7fcb3093c640 1 --2- 192.168.123.107:0/1236476140 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb2c0678f0 0x7fcb2c1078f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.562+0000 7fcb3093c640 1 -- 192.168.123.107:0/1236476140 >> 192.168.123.107:0/1236476140 conn(0x7fcb2c0f9f80 msgr2=0x7fcb2c0fbb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:23.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.562+0000 7fcb3093c640 1 -- 192.168.123.107:0/1236476140 shutdown_connections 2026-03-09T19:27:23.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.562+0000 7fcb3093c640 1 -- 192.168.123.107:0/1236476140 wait complete. 
2026-03-09T19:27:23.567 INFO:tasks.workunit.client.0.vm07.stdout:7/371: dread d0/f6c [0,4194304] 0 2026-03-09T19:27:23.568 INFO:tasks.workunit.client.0.vm07.stdout:2/386: write d3/f15 [1454811,125787] 0 2026-03-09T19:27:23.569 INFO:tasks.workunit.client.0.vm07.stdout:5/337: getdents d3/d1a/d5d 0 2026-03-09T19:27:23.569 INFO:tasks.workunit.client.0.vm07.stdout:5/338: stat d3/fe 0 2026-03-09T19:27:23.570 INFO:tasks.workunit.client.0.vm07.stdout:5/339: truncate d3/d1a/d28/d40/f49 608295 0 2026-03-09T19:27:23.576 INFO:tasks.workunit.client.0.vm07.stdout:5/340: chown d3/d1a/d28/d36/f61 13327 1 2026-03-09T19:27:23.580 INFO:tasks.workunit.client.0.vm07.stdout:7/372: dwrite d0/d4/d5/d26/d32/f7c [0,4194304] 0 2026-03-09T19:27:23.585 INFO:tasks.workunit.client.0.vm07.stdout:4/321: write d3/d11/d2b/d37/f4d [654349,17154] 0 2026-03-09T19:27:23.585 INFO:tasks.workunit.client.1.vm08.stdout:4/568: write da/d10/d16/d28/d46/d52/d6e/d40/d6c/f92 [226985,26409] 0 2026-03-09T19:27:23.588 INFO:tasks.workunit.client.0.vm07.stdout:3/386: dwrite d1/d1f/d16/f30 [0,4194304] 0 2026-03-09T19:27:23.592 INFO:tasks.workunit.client.0.vm07.stdout:6/294: truncate d0/d1/db/d1d/f2e 191906 0 2026-03-09T19:27:23.603 INFO:tasks.workunit.client.1.vm08.stdout:7/641: dwrite d5/d14/dae/d3a/f64 [0,4194304] 0 2026-03-09T19:27:23.619 INFO:tasks.workunit.client.0.vm07.stdout:1/331: dread d1/d11/d37/d3f/f4a [0,4194304] 0 2026-03-09T19:27:23.620 INFO:tasks.workunit.client.1.vm08.stdout:1/721: creat d9/da/d2c/fd6 x:0 0 0 2026-03-09T19:27:23.626 INFO:tasks.workunit.client.0.vm07.stdout:2/387: symlink d3/dd/d16/d29/d2d/d45/d3b/d53/l84 0 2026-03-09T19:27:23.632 INFO:tasks.workunit.client.1.vm08.stdout:8/541: rename de/d1d/d21/f4b to de/d1d/d2e/d5f/fbf 0 2026-03-09T19:27:23.633 INFO:tasks.workunit.client.1.vm08.stdout:8/542: stat de/d1d/d2e/d5f/l9b 0 2026-03-09T19:27:23.634 INFO:tasks.workunit.client.1.vm08.stdout:8/543: chown de/d1d/d4f/f5e 78 1 2026-03-09T19:27:23.639 INFO:tasks.workunit.client.1.vm08.stdout:8/544: 
truncate de/d91/fbd 150674 0 2026-03-09T19:27:23.643 INFO:tasks.workunit.client.1.vm08.stdout:8/545: dread - de/d1d/d2e/d5f/fba zero size 2026-03-09T19:27:23.648 INFO:tasks.workunit.client.0.vm07.stdout:7/373: creat d0/d4/d5/d8/d41/d64/d74/f82 x:0 0 0 2026-03-09T19:27:23.648 INFO:tasks.workunit.client.1.vm08.stdout:9/565: link d0/d2/d14/f31 d0/d2/d14/fbf 0 2026-03-09T19:27:23.649 INFO:tasks.workunit.client.0.vm07.stdout:0/294: dwrite d0/d6/d13/d1c/d11/f3f [0,4194304] 0 2026-03-09T19:27:23.662 INFO:tasks.workunit.client.0.vm07.stdout:4/322: rename d3/d11/f61 to d3/d4f/d56/d5f/f6f 0 2026-03-09T19:27:23.663 INFO:tasks.workunit.client.0.vm07.stdout:4/323: dread - d3/f64 zero size 2026-03-09T19:27:23.664 INFO:tasks.workunit.client.0.vm07.stdout:4/324: write d3/d11/d16/d2f/f67 [355085,19645] 0 2026-03-09T19:27:23.686 INFO:tasks.workunit.client.0.vm07.stdout:3/387: creat d1/d74/f77 x:0 0 0 2026-03-09T19:27:23.687 INFO:tasks.workunit.client.1.vm08.stdout:5/543: dwrite f1 [0,4194304] 0 2026-03-09T19:27:23.687 INFO:tasks.workunit.client.0.vm07.stdout:8/362: dwrite d7/d9/fd [0,4194304] 0 2026-03-09T19:27:23.690 INFO:tasks.workunit.client.0.vm07.stdout:3/388: write d1/d3d/d47/f62 [974855,98411] 0 2026-03-09T19:27:23.704 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.704+0000 7f6b61aa3640 1 -- 192.168.123.107:0/872846870 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6b5c071a70 msgr2=0x7f6b5c071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.704+0000 7f6b61aa3640 1 --2- 192.168.123.107:0/872846870 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6b5c071a70 0x7f6b5c071e70 secure :-1 s=READY pgs=336 cs=0 l=1 rev1=1 crypto rx=0x7f6b4c0098e0 tx=0x7f6b4c02f1b0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.704+0000 7f6b61aa3640 1 -- 192.168.123.107:0/872846870 
shutdown_connections 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.704+0000 7f6b61aa3640 1 --2- 192.168.123.107:0/872846870 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b5c072440 0x7f6b5c0771b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.704+0000 7f6b61aa3640 1 --2- 192.168.123.107:0/872846870 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6b5c071a70 0x7f6b5c071e70 unknown :-1 s=CLOSED pgs=336 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.704+0000 7f6b61aa3640 1 -- 192.168.123.107:0/872846870 >> 192.168.123.107:0/872846870 conn(0x7f6b5c06d4f0 msgr2=0x7f6b5c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.704+0000 7f6b61aa3640 1 -- 192.168.123.107:0/872846870 shutdown_connections 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.704+0000 7f6b61aa3640 1 -- 192.168.123.107:0/872846870 wait complete. 
2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.705+0000 7f6b61aa3640 1 Processor -- start 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.705+0000 7f6b61aa3640 1 -- start start 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.705+0000 7f6b61aa3640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6b5c072440 0x7f6b5c1319d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.705+0000 7f6b61aa3640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b5c133380 0x7f6b5c131f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.705+0000 7f6b61aa3640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b5c1324e0 con 0x7f6b5c072440 2026-03-09T19:27:23.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.705+0000 7f6b61aa3640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b5c132650 con 0x7f6b5c133380 2026-03-09T19:27:23.705 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:23 vm07.local ceph-mon[48545]: pgmap v166: 65 pgs: 65 active+clean; 1.9 GiB data, 7.1 GiB used, 113 GiB / 120 GiB avail; 35 MiB/s rd, 122 MiB/s wr, 276 op/s 2026-03-09T19:27:23.705 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:23 vm07.local ceph-mon[48545]: from='client.24433 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:23.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.706+0000 7f6b5affd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b5c133380 0x7f6b5c131f10 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:23.706 INFO:tasks.workunit.client.0.vm07.stdout:8/363: dwrite d7/d9/f81 [0,4194304] 0 2026-03-09T19:27:23.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.707+0000 7f6b5affd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b5c133380 0x7f6b5c131f10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:35370/0 (socket says 192.168.123.107:35370) 2026-03-09T19:27:23.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.707+0000 7f6b5affd640 1 -- 192.168.123.107:0/3458322900 learned_addr learned my addr 192.168.123.107:0/3458322900 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:23.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.707+0000 7f6b5affd640 1 -- 192.168.123.107:0/3458322900 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6b5c072440 msgr2=0x7f6b5c1319d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.707+0000 7f6b5affd640 1 --2- 192.168.123.107:0/3458322900 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6b5c072440 0x7f6b5c1319d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.707+0000 7f6b5affd640 1 -- 192.168.123.107:0/3458322900 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6b4c009590 con 0x7f6b5c133380 2026-03-09T19:27:23.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.709+0000 7f6b5affd640 1 --2- 192.168.123.107:0/3458322900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f6b5c133380 0x7f6b5c131f10 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f6b5400ea40 tx=0x7f6b5400ef10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:23.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.709+0000 7f6b58ff9640 1 -- 192.168.123.107:0/3458322900 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6b5400ce60 con 0x7f6b5c133380 2026-03-09T19:27:23.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.709+0000 7f6b61aa3640 1 -- 192.168.123.107:0/3458322900 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6b5c07faf0 con 0x7f6b5c133380 2026-03-09T19:27:23.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.709+0000 7f6b61aa3640 1 -- 192.168.123.107:0/3458322900 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6b5c080040 con 0x7f6b5c133380 2026-03-09T19:27:23.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.710+0000 7f6b58ff9640 1 -- 192.168.123.107:0/3458322900 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6b540040d0 con 0x7f6b5c133380 2026-03-09T19:27:23.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.710+0000 7f6b58ff9640 1 -- 192.168.123.107:0/3458322900 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6b54016710 con 0x7f6b5c133380 2026-03-09T19:27:23.712 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.710+0000 7f6b61aa3640 1 -- 192.168.123.107:0/3458322900 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6b5c071a70 con 0x7f6b5c133380 2026-03-09T19:27:23.712 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.713+0000 7f6b58ff9640 1 -- 192.168.123.107:0/3458322900 <== mon.1 
v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6b54004240 con 0x7f6b5c133380 2026-03-09T19:27:23.713 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.713+0000 7f6b58ff9640 1 --2- 192.168.123.107:0/3458322900 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6b34076170 0x7f6b34078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.713 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.713+0000 7f6b58ff9640 1 -- 192.168.123.107:0/3458322900 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f6b54098220 con 0x7f6b5c133380 2026-03-09T19:27:23.715 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.716+0000 7f6b5b7fe640 1 --2- 192.168.123.107:0/3458322900 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6b34076170 0x7f6b34078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:23.715 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.716+0000 7f6b5b7fe640 1 --2- 192.168.123.107:0/3458322900 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6b34076170 0x7f6b34078630 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f6b4c02f6c0 tx=0x7f6b4c0028e0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:23.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.724+0000 7f6b58ff9640 1 -- 192.168.123.107:0/3458322900 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6b540617e0 con 0x7f6b5c133380 2026-03-09T19:27:23.746 INFO:tasks.workunit.client.0.vm07.stdout:5/341: mkdir d3/d1a/d28/d6c/d72 0 2026-03-09T19:27:23.769 
INFO:tasks.workunit.client.0.vm07.stdout:9/382: truncate d0/d17/f5e 569237 0 2026-03-09T19:27:23.769 INFO:tasks.workunit.client.0.vm07.stdout:3/389: fdatasync d1/d6/f37 0 2026-03-09T19:27:23.770 INFO:tasks.workunit.client.0.vm07.stdout:3/390: dread - d1/d6/d4c/f61 zero size 2026-03-09T19:27:23.772 INFO:tasks.workunit.client.0.vm07.stdout:8/364: dread d7/d50/f6d [0,4194304] 0 2026-03-09T19:27:23.775 INFO:tasks.workunit.client.0.vm07.stdout:6/295: symlink d0/d1/d28/l73 0 2026-03-09T19:27:23.782 INFO:tasks.workunit.client.1.vm08.stdout:2/532: write d3/f7c [593480,7305] 0 2026-03-09T19:27:23.782 INFO:tasks.workunit.client.0.vm07.stdout:1/332: read d1/d11/d37/d3f/d45/f15 [1717598,107485] 0 2026-03-09T19:27:23.793 INFO:tasks.workunit.client.0.vm07.stdout:0/295: mkdir d0/d6/d13/d17/d19/d57/d6a 0 2026-03-09T19:27:23.794 INFO:tasks.workunit.client.0.vm07.stdout:9/383: mkdir d0/d6f/d86 0 2026-03-09T19:27:23.794 INFO:tasks.workunit.client.0.vm07.stdout:0/296: write d0/d6/d13/d1c/d11/f5f [914445,7076] 0 2026-03-09T19:27:23.815 INFO:tasks.workunit.client.0.vm07.stdout:8/365: rmdir d7/d9/d57 39 2026-03-09T19:27:23.822 INFO:tasks.workunit.client.0.vm07.stdout:6/296: fdatasync d0/d13/f57 0 2026-03-09T19:27:23.822 INFO:tasks.workunit.client.0.vm07.stdout:8/366: dwrite d7/d9/d37/d45/d56/f7a [0,4194304] 0 2026-03-09T19:27:23.825 INFO:tasks.workunit.client.0.vm07.stdout:6/297: dwrite d0/d1/db/f15 [4194304,4194304] 0 2026-03-09T19:27:23.833 INFO:tasks.workunit.client.0.vm07.stdout:1/333: truncate d1/db/f14 605670 0 2026-03-09T19:27:23.834 INFO:tasks.workunit.client.0.vm07.stdout:1/334: write d1/db/f1f [5005280,85942] 0 2026-03-09T19:27:23.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:23 vm08.local ceph-mon[57794]: pgmap v166: 65 pgs: 65 active+clean; 1.9 GiB data, 7.1 GiB used, 113 GiB / 120 GiB avail; 35 MiB/s rd, 122 MiB/s wr, 276 op/s 2026-03-09T19:27:23.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:23 vm08.local ceph-mon[57794]: from='client.24433 -' 
entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:23.846 INFO:tasks.workunit.client.1.vm08.stdout:6/591: write d3/db/f8f [4394073,35968] 0 2026-03-09T19:27:23.847 INFO:tasks.workunit.client.1.vm08.stdout:6/592: chown d3/f5 15189063 1 2026-03-09T19:27:23.859 INFO:tasks.workunit.client.0.vm07.stdout:9/384: symlink d0/db/d29/d2c/d36/d7d/l87 0 2026-03-09T19:27:23.862 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.862+0000 7f6b61aa3640 1 -- 192.168.123.107:0/3458322900 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6b5c075b00 con 0x7f6b34076170 2026-03-09T19:27:23.864 INFO:tasks.workunit.client.0.vm07.stdout:9/385: dwrite d0/db/d29/d32/d5c/d69/f83 [0,4194304] 0 2026-03-09T19:27:23.868 INFO:tasks.workunit.client.0.vm07.stdout:9/386: dread d0/d17/f42 [0,4194304] 0 2026-03-09T19:27:23.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.876+0000 7f6b58ff9640 1 -- 192.168.123.107:0/3458322900 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f6b5c075b00 con 0x7f6b34076170 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (4m) 2m ago 5m 22.6M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (5m) 2m ago 5m 8284k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (4m) 2m ago 4m 8644k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (5m) 
2m ago 5m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2799ea3e4bf3 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (4m) 2m ago 4m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 81fc95c210b6 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (4m) 2m ago 4m 79.7M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (2m) 2m ago 2m 12.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (2m) 2m ago 2m 14.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (2m) 2m ago 2m 16.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (2m) 2m ago 2m 17.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:9283,8765,8443 running (5m) 2m ago 5m 540M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 706e626ecd10 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (4m) 2m ago 4m 485M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 880604c16b45 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (5m) 2m ago 5m 53.6M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ccb644205fb3 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (4m) 2m ago 4m 49.4M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8d7b1da9e1e2 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (5m) 2m ago 5m 13.8M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:27:23.880 
INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (4m) 2m ago 4m 15.3M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:27:23.880 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (4m) 2m ago 4m 46.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d7417e3377af 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (3m) 2m ago 3m 67.5M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (3m) 2m ago 3m 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (3m) 2m ago 3m 45.1M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (3m) 2m ago 3m 67.3M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (3m) 2m ago 3m 65.0M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (4m) 2m ago 4m 39.2M - 2.43.0 a07b618ecd1d 238baaac36ff 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.880+0000 7f6b3a7fc640 1 -- 192.168.123.107:0/3458322900 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6b34076170 msgr2=0x7f6b34078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.880+0000 7f6b3a7fc640 1 --2- 192.168.123.107:0/3458322900 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6b34076170 0x7f6b34078630 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f6b4c02f6c0 tx=0x7f6b4c0028e0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.881 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.880+0000 7f6b3a7fc640 1 -- 192.168.123.107:0/3458322900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b5c133380 msgr2=0x7f6b5c131f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.880+0000 7f6b3a7fc640 1 --2- 192.168.123.107:0/3458322900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b5c133380 0x7f6b5c131f10 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f6b5400ea40 tx=0x7f6b5400ef10 comp rx=0 tx=0).stop 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.880+0000 7f6b3a7fc640 1 -- 192.168.123.107:0/3458322900 shutdown_connections 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.880+0000 7f6b3a7fc640 1 --2- 192.168.123.107:0/3458322900 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6b34076170 0x7f6b34078630 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.880+0000 7f6b3a7fc640 1 --2- 192.168.123.107:0/3458322900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b5c133380 0x7f6b5c131f10 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.880+0000 7f6b3a7fc640 1 --2- 192.168.123.107:0/3458322900 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6b5c072440 0x7f6b5c1319d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.880+0000 7f6b3a7fc640 1 -- 192.168.123.107:0/3458322900 >> 192.168.123.107:0/3458322900 conn(0x7f6b5c06d4f0 msgr2=0x7f6b5c0753f0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.880+0000 7f6b3a7fc640 1 -- 192.168.123.107:0/3458322900 shutdown_connections 2026-03-09T19:27:23.881 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.880+0000 7f6b3a7fc640 1 -- 192.168.123.107:0/3458322900 wait complete. 2026-03-09T19:27:23.882 INFO:tasks.workunit.client.1.vm08.stdout:3/643: getdents d0/d6/d25 0 2026-03-09T19:27:23.887 INFO:tasks.workunit.client.0.vm07.stdout:8/367: symlink d7/d9/d37/d45/d56/d62/l86 0 2026-03-09T19:27:23.890 INFO:tasks.workunit.client.1.vm08.stdout:0/596: rmdir dd/d9d/dab 0 2026-03-09T19:27:23.891 INFO:tasks.workunit.client.1.vm08.stdout:9/566: dread d0/d1b/d97/f3f [4194304,4194304] 0 2026-03-09T19:27:23.895 INFO:tasks.workunit.client.1.vm08.stdout:9/567: write d0/d1b/daa/fb2 [852637,60311] 0 2026-03-09T19:27:23.895 INFO:tasks.workunit.client.1.vm08.stdout:9/568: write d0/d1b/d97/d48/d5e/fa1 [3313862,50189] 0 2026-03-09T19:27:23.895 INFO:tasks.workunit.client.1.vm08.stdout:9/569: write d0/d2/d14/d5c/fb0 [798377,75422] 0 2026-03-09T19:27:23.895 INFO:tasks.workunit.client.0.vm07.stdout:6/298: creat d0/d1/db/d24/d53/f74 x:0 0 0 2026-03-09T19:27:23.901 INFO:tasks.workunit.client.1.vm08.stdout:4/569: symlink da/d10/d26/da0/lad 0 2026-03-09T19:27:23.902 INFO:tasks.workunit.client.1.vm08.stdout:4/570: fsync da/d10/d26/d38/f93 0 2026-03-09T19:27:23.905 INFO:tasks.workunit.client.1.vm08.stdout:5/544: symlink d16/d1e/d8c/d99/lae 0 2026-03-09T19:27:23.907 INFO:tasks.workunit.client.0.vm07.stdout:2/388: getdents d3/dd/d16/d29 0 2026-03-09T19:27:23.907 INFO:tasks.workunit.client.0.vm07.stdout:2/389: chown d3/dd/fe 279816 1 2026-03-09T19:27:23.910 INFO:tasks.workunit.client.1.vm08.stdout:2/533: creat d3/d4/d23/d2c/d39/da3/fba x:0 0 0 2026-03-09T19:27:23.912 INFO:tasks.workunit.client.1.vm08.stdout:6/593: mknod d3/d34/d5c/cd5 0 2026-03-09T19:27:23.913 INFO:tasks.workunit.client.1.vm08.stdout:6/594: write d3/db/f8f [2307754,49019] 0 
2026-03-09T19:27:23.924 INFO:tasks.workunit.client.1.vm08.stdout:7/642: write d5/d14/d38/f4c [1628266,56594] 0 2026-03-09T19:27:23.925 INFO:tasks.workunit.client.0.vm07.stdout:2/390: sync 2026-03-09T19:27:23.926 INFO:tasks.workunit.client.0.vm07.stdout:2/391: chown d3/f4 1637 1 2026-03-09T19:27:23.927 INFO:tasks.workunit.client.0.vm07.stdout:2/392: chown d3/dd/d16/d29/d3c/d5a/d7a/f6e 685148 1 2026-03-09T19:27:23.931 INFO:tasks.workunit.client.0.vm07.stdout:4/325: write d3/d11/d16/d2f/d22/f24 [698649,124106] 0 2026-03-09T19:27:23.932 INFO:tasks.workunit.client.0.vm07.stdout:7/374: dwrite d0/d52/d54/f7d [0,4194304] 0 2026-03-09T19:27:23.943 INFO:tasks.workunit.client.0.vm07.stdout:3/391: dwrite d1/d6/f37 [0,4194304] 0 2026-03-09T19:27:23.944 INFO:tasks.workunit.client.0.vm07.stdout:3/392: dread - d1/d74/f6e zero size 2026-03-09T19:27:23.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.950+0000 7f6eddda1640 1 -- 192.168.123.107:0/4287572601 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ed8071a70 msgr2=0x7f6ed8071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.950+0000 7f6eddda1640 1 --2- 192.168.123.107:0/4287572601 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ed8071a70 0x7f6ed8071e70 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f6ec8007920 tx=0x7f6ec8030040 comp rx=0 tx=0).stop 2026-03-09T19:27:23.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.950+0000 7f6eddda1640 1 -- 192.168.123.107:0/4287572601 shutdown_connections 2026-03-09T19:27:23.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.950+0000 7f6eddda1640 1 --2- 192.168.123.107:0/4287572601 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ed8072440 0x7f6ed80771b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.950 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.950+0000 7f6eddda1640 1 --2- 192.168.123.107:0/4287572601 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ed8071a70 0x7f6ed8071e70 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.950+0000 7f6eddda1640 1 -- 192.168.123.107:0/4287572601 >> 192.168.123.107:0/4287572601 conn(0x7f6ed806d4f0 msgr2=0x7f6ed806f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:23.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.951+0000 7f6eddda1640 1 -- 192.168.123.107:0/4287572601 shutdown_connections 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.951+0000 7f6eddda1640 1 -- 192.168.123.107:0/4287572601 wait complete. 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.951+0000 7f6eddda1640 1 Processor -- start 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.951+0000 7f6eddda1640 1 -- start start 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.951+0000 7f6eddda1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ed8072440 0x7f6ed81319b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.951+0000 7f6eddda1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ed8133360 0x7f6ed8131ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.951+0000 7f6eddda1640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ed81324c0 con 0x7f6ed8133360 2026-03-09T19:27:23.951 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.951+0000 7f6eddda1640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ed8132630 con 0x7f6ed8072440 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.952+0000 7f6ed6ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ed8133360 0x7f6ed8131ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.952+0000 7f6ed77fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ed8072440 0x7f6ed81319b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.952+0000 7f6ed77fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ed8072440 0x7f6ed81319b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:35386/0 (socket says 192.168.123.107:35386) 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.952+0000 7f6ed77fe640 1 -- 192.168.123.107:0/2795042528 learned_addr learned my addr 192.168.123.107:0/2795042528 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.952+0000 7f6ed77fe640 1 -- 192.168.123.107:0/2795042528 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ed8133360 msgr2=0x7f6ed8131ef0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.952+0000 7f6ed77fe640 1 --2- 192.168.123.107:0/2795042528 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ed8133360 0x7f6ed8131ef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:23.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.952+0000 7f6ed77fe640 1 -- 192.168.123.107:0/2795042528 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6ec80075d0 con 0x7f6ed8072440 2026-03-09T19:27:23.952 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.953+0000 7f6ed77fe640 1 --2- 192.168.123.107:0/2795042528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ed8072440 0x7f6ed81319b0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f6ec8033040 tx=0x7f6ec8004050 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:23.952 INFO:tasks.workunit.client.0.vm07.stdout:0/297: write d0/f41 [257096,39399] 0 2026-03-09T19:27:23.952 INFO:tasks.workunit.client.0.vm07.stdout:5/342: dwrite d3/dd/f58 [0,4194304] 0 2026-03-09T19:27:23.954 INFO:tasks.workunit.client.1.vm08.stdout:0/597: creat dd/d22/d27/d2e/db0/fbc x:0 0 0 2026-03-09T19:27:23.955 INFO:tasks.workunit.client.0.vm07.stdout:5/343: readlink d3/dd/d26/d2d/l37 0 2026-03-09T19:27:23.956 INFO:tasks.workunit.client.0.vm07.stdout:6/299: mkdir d0/d4e/d75 0 2026-03-09T19:27:23.959 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.954+0000 7f6ed4ff9640 1 -- 192.168.123.107:0/2795042528 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ec8030af0 con 0x7f6ed8072440 2026-03-09T19:27:23.959 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.956+0000 7f6eddda1640 1 -- 192.168.123.107:0/2795042528 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6ed807fb70 con 0x7f6ed8072440 2026-03-09T19:27:23.959 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.956+0000 7f6eddda1640 1 -- 192.168.123.107:0/2795042528 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6ed8080060 con 0x7f6ed8072440 2026-03-09T19:27:23.959 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.960+0000 7f6ed4ff9640 1 -- 192.168.123.107:0/2795042528 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6ec8030c50 con 0x7f6ed8072440 2026-03-09T19:27:23.959 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.960+0000 7f6ed4ff9640 1 -- 192.168.123.107:0/2795042528 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ec8041730 con 0x7f6ed8072440 2026-03-09T19:27:23.959 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.960+0000 7f6eddda1640 1 -- 192.168.123.107:0/2795042528 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6ed8071a70 con 0x7f6ed8072440 2026-03-09T19:27:23.960 INFO:tasks.workunit.client.1.vm08.stdout:9/570: rename d0/d2/cba to d0/d2/d14/d98/d99/cc0 0 2026-03-09T19:27:23.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.964+0000 7f6ed4ff9640 1 -- 192.168.123.107:0/2795042528 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f6ec8041890 con 0x7f6ed8072440 2026-03-09T19:27:23.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.964+0000 7f6ed4ff9640 1 --2- 192.168.123.107:0/2795042528 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6eb0076170 0x7f6eb0078630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:23.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.964+0000 7f6ed4ff9640 1 -- 192.168.123.107:0/2795042528 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src 
has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f6ec80bc6c0 con 0x7f6ed8072440 2026-03-09T19:27:23.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.966+0000 7f6ed6ffd640 1 --2- 192.168.123.107:0/2795042528 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6eb0076170 0x7f6eb0078630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:23.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.966+0000 7f6ed6ffd640 1 --2- 192.168.123.107:0/2795042528 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6eb0076170 0x7f6eb0078630 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f6ed8133760 tx=0x7f6ed00072f0 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:23.968 INFO:tasks.workunit.client.1.vm08.stdout:8/546: dwrite de/d1d/d69/f8f [0,4194304] 0 2026-03-09T19:27:23.983 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:23.984+0000 7f6ed4ff9640 1 -- 192.168.123.107:0/2795042528 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f6ec8085c80 con 0x7f6ed8072440 2026-03-09T19:27:23.988 INFO:tasks.workunit.client.1.vm08.stdout:4/571: creat da/d10/d16/d28/d46/d52/d6e/d73/fae x:0 0 0 2026-03-09T19:27:23.990 INFO:tasks.workunit.client.1.vm08.stdout:4/572: chown da/d10/d16/d28/d46/d52/d6e/d73/l9c 284 1 2026-03-09T19:27:23.991 INFO:tasks.workunit.client.1.vm08.stdout:4/573: fdatasync da/d10/d16/d28/d46/d52/d6e/d2c/f36 0 2026-03-09T19:27:23.999 INFO:tasks.workunit.client.0.vm07.stdout:2/393: rename d3/dd/d16/d29/d2d/d45/d3b/d44/d7d to d3/dd/d16/d29/d2d/d45/d85 0 2026-03-09T19:27:24.001 INFO:tasks.workunit.client.1.vm08.stdout:5/545: mkdir d16/d45/daf 0 2026-03-09T19:27:24.006 INFO:tasks.workunit.client.0.vm07.stdout:4/326: mkdir 
d3/d11/d16/d2f/d22/d70 0 2026-03-09T19:27:24.018 INFO:tasks.workunit.client.1.vm08.stdout:1/722: dwrite d9/d40/d49/f70 [0,4194304] 0 2026-03-09T19:27:24.020 INFO:tasks.workunit.client.1.vm08.stdout:1/723: readlink l8 0 2026-03-09T19:27:24.020 INFO:tasks.workunit.client.1.vm08.stdout:1/724: stat d9/d40/c61 0 2026-03-09T19:27:24.033 INFO:tasks.workunit.client.1.vm08.stdout:7/643: fsync d5/d14/dae/d1c/fab 0 2026-03-09T19:27:24.036 INFO:tasks.workunit.client.0.vm07.stdout:3/393: mknod d1/d6/d4c/c78 0 2026-03-09T19:27:24.040 INFO:tasks.workunit.client.1.vm08.stdout:3/644: unlink d0/d6/de/d1b/d16/d17/f71 0 2026-03-09T19:27:24.048 INFO:tasks.workunit.client.0.vm07.stdout:9/387: symlink d0/db/d29/d2c/d36/d5a/l88 0 2026-03-09T19:27:24.048 INFO:tasks.workunit.client.1.vm08.stdout:6/595: dread d3/d15/f2b [0,4194304] 0 2026-03-09T19:27:24.048 INFO:tasks.workunit.client.1.vm08.stdout:3/645: chown d0/d4b/f74 979 1 2026-03-09T19:27:24.048 INFO:tasks.workunit.client.1.vm08.stdout:3/646: dwrite d0/d8/f4a [0,4194304] 0 2026-03-09T19:27:24.053 INFO:tasks.workunit.client.1.vm08.stdout:3/647: fdatasync d0/d52/f8a 0 2026-03-09T19:27:24.067 INFO:tasks.workunit.client.0.vm07.stdout:9/388: dread d0/f3 [0,4194304] 0 2026-03-09T19:27:24.068 INFO:tasks.workunit.client.0.vm07.stdout:0/298: rmdir d0/d6/d13/d17/d19 39 2026-03-09T19:27:24.071 INFO:tasks.workunit.client.0.vm07.stdout:9/389: dwrite d0/db/d29/d2c/f4a [0,4194304] 0 2026-03-09T19:27:24.072 INFO:tasks.workunit.client.1.vm08.stdout:8/547: creat de/d25/d31/fc0 x:0 0 0 2026-03-09T19:27:24.075 INFO:tasks.workunit.client.0.vm07.stdout:6/300: mkdir d0/d1/d28/d76 0 2026-03-09T19:27:24.076 INFO:tasks.workunit.client.0.vm07.stdout:6/301: chown d0/d1/db/f43 5663785 1 2026-03-09T19:27:24.080 INFO:tasks.workunit.client.0.vm07.stdout:1/335: creat d1/f76 x:0 0 0 2026-03-09T19:27:24.086 INFO:tasks.workunit.client.0.vm07.stdout:1/336: dread d1/d3e/f49 [0,4194304] 0 2026-03-09T19:27:24.089 INFO:tasks.workunit.client.0.vm07.stdout:1/337: dwrite 
d1/d11/f1b [0,4194304] 0 2026-03-09T19:27:24.093 INFO:tasks.workunit.client.0.vm07.stdout:1/338: chown d1/db/d31/d56/f6a 2297547 1 2026-03-09T19:27:24.094 INFO:tasks.workunit.client.0.vm07.stdout:5/344: rename d3/dd to d3/dd/d26/d73 22 2026-03-09T19:27:24.105 INFO:tasks.workunit.client.0.vm07.stdout:4/327: creat d3/d11/d2b/f71 x:0 0 0 2026-03-09T19:27:24.106 INFO:tasks.workunit.client.0.vm07.stdout:4/328: truncate d3/d4f/d56/d5f/f6f 788495 0 2026-03-09T19:27:24.109 INFO:tasks.workunit.client.1.vm08.stdout:0/598: dwrite dd/d22/d24/d49/f5f [4194304,4194304] 0 2026-03-09T19:27:24.131 INFO:tasks.workunit.client.1.vm08.stdout:6/596: mkdir d3/d34/d5c/da2/dd6 0 2026-03-09T19:27:24.132 INFO:tasks.workunit.client.0.vm07.stdout:2/394: dwrite d3/dd/fe [4194304,4194304] 0 2026-03-09T19:27:24.135 INFO:tasks.workunit.client.0.vm07.stdout:2/395: truncate d3/dd/d16/d29/d2d/d45/d3b/d44/f81 26076 0 2026-03-09T19:27:24.147 INFO:tasks.workunit.client.0.vm07.stdout:2/396: dread d3/f22 [0,4194304] 0 2026-03-09T19:27:24.147 INFO:tasks.workunit.client.0.vm07.stdout:3/394: mknod d1/d6/d71/c79 0 2026-03-09T19:27:24.151 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.151+0000 7f6eddda1640 1 -- 192.168.123.107:0/2795042528 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f6ed807a6e0 con 0x7f6ed8072440 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.153+0000 7f6ed4ff9640 1 -- 192.168.123.107:0/2795042528 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f6ec8085620 con 0x7f6ed8072440 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:27:24.152 
INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 14 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:27:24.152 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:27:24.155 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.156+0000 7f6eb67fc640 1 -- 192.168.123.107:0/2795042528 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6eb0076170 msgr2=0x7f6eb0078630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:24.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.156+0000 7f6eb67fc640 1 --2- 192.168.123.107:0/2795042528 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6eb0076170 0x7f6eb0078630 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f6ed8133760 
tx=0x7f6ed00072f0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.156+0000 7f6eb67fc640 1 -- 192.168.123.107:0/2795042528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ed8072440 msgr2=0x7f6ed81319b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:24.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.156+0000 7f6eb67fc640 1 --2- 192.168.123.107:0/2795042528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ed8072440 0x7f6ed81319b0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f6ec8033040 tx=0x7f6ec8004050 comp rx=0 tx=0).stop 2026-03-09T19:27:24.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.156+0000 7f6eb67fc640 1 -- 192.168.123.107:0/2795042528 shutdown_connections 2026-03-09T19:27:24.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.156+0000 7f6eb67fc640 1 --2- 192.168.123.107:0/2795042528 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f6eb0076170 0x7f6eb0078630 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.156+0000 7f6eb67fc640 1 --2- 192.168.123.107:0/2795042528 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6ed8133360 0x7f6ed8131ef0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.156+0000 7f6eb67fc640 1 --2- 192.168.123.107:0/2795042528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ed8072440 0x7f6ed81319b0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.156+0000 7f6eb67fc640 1 -- 192.168.123.107:0/2795042528 >> 192.168.123.107:0/2795042528 
conn(0x7f6ed806d4f0 msgr2=0x7f6ed80753f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:24.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.156+0000 7f6eb67fc640 1 -- 192.168.123.107:0/2795042528 shutdown_connections 2026-03-09T19:27:24.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.156+0000 7f6eb67fc640 1 -- 192.168.123.107:0/2795042528 wait complete. 2026-03-09T19:27:24.171 INFO:tasks.workunit.client.1.vm08.stdout:1/725: dwrite d9/f48 [0,4194304] 0 2026-03-09T19:27:24.171 INFO:tasks.workunit.client.1.vm08.stdout:1/726: dwrite d9/d11/f9b [0,4194304] 0 2026-03-09T19:27:24.171 INFO:tasks.workunit.client.1.vm08.stdout:1/727: chown d9/da/d12/d39/l3a 27 1 2026-03-09T19:27:24.177 INFO:tasks.workunit.client.0.vm07.stdout:8/368: creat d7/d9/f87 x:0 0 0 2026-03-09T19:27:24.190 INFO:tasks.workunit.client.1.vm08.stdout:3/648: creat d0/d8/d19/fcc x:0 0 0 2026-03-09T19:27:24.190 INFO:tasks.workunit.client.0.vm07.stdout:0/299: dread - d0/d6/d13/d1c/f3e zero size 2026-03-09T19:27:24.190 INFO:tasks.workunit.client.0.vm07.stdout:0/300: chown d0/d6 9148 1 2026-03-09T19:27:24.194 INFO:tasks.workunit.client.0.vm07.stdout:9/390: creat d0/d6/d3a/f89 x:0 0 0 2026-03-09T19:27:24.197 INFO:tasks.workunit.client.0.vm07.stdout:6/302: chown d0/ff 24310013 1 2026-03-09T19:27:24.205 INFO:tasks.workunit.client.0.vm07.stdout:7/375: truncate d0/d52/d54/f7d 2471963 0 2026-03-09T19:27:24.207 INFO:tasks.workunit.client.1.vm08.stdout:4/574: mknod da/caf 0 2026-03-09T19:27:24.213 INFO:tasks.workunit.client.1.vm08.stdout:5/546: creat d16/d1e/d9b/fb0 x:0 0 0 2026-03-09T19:27:24.220 INFO:tasks.workunit.client.0.vm07.stdout:4/329: creat d3/d4f/d56/d5f/f72 x:0 0 0 2026-03-09T19:27:24.222 INFO:tasks.workunit.client.1.vm08.stdout:7/644: mknod d5/d14/d27/d78/dc7/dce/cd5 0 2026-03-09T19:27:24.222 INFO:tasks.workunit.client.1.vm08.stdout:0/599: creat dd/d22/d63/d6e/d72/fbd x:0 0 0 2026-03-09T19:27:24.226 INFO:tasks.workunit.client.1.vm08.stdout:5/547: dwrite 
d16/d1e/d8c/fab [0,4194304] 0 2026-03-09T19:27:24.228 INFO:tasks.workunit.client.1.vm08.stdout:5/548: chown d16/d1e 4537 1 2026-03-09T19:27:24.232 INFO:tasks.workunit.client.0.vm07.stdout:3/395: read d1/d1f/d16/d28/f34 [1047199,66978] 0 2026-03-09T19:27:24.243 INFO:tasks.workunit.client.0.vm07.stdout:9/391: unlink d0/db/l23 0 2026-03-09T19:27:24.244 INFO:tasks.workunit.client.0.vm07.stdout:9/392: chown d0/db/l1b 2512 1 2026-03-09T19:27:24.244 INFO:tasks.workunit.client.0.vm07.stdout:9/393: fdatasync d0/d6/f4c 0 2026-03-09T19:27:24.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.246+0000 7f1e564a6640 1 -- 192.168.123.107:0/566694542 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e5010b100 msgr2=0x7f1e5010d4f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:24.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.246+0000 7f1e564a6640 1 --2- 192.168.123.107:0/566694542 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e5010b100 0x7f1e5010d4f0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f1e300098e0 tx=0x7f1e3002f1b0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.246+0000 7f1e564a6640 1 -- 192.168.123.107:0/566694542 shutdown_connections 2026-03-09T19:27:24.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.246+0000 7f1e564a6640 1 --2- 192.168.123.107:0/566694542 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e5010b100 0x7f1e5010d4f0 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.246+0000 7f1e564a6640 1 --2- 192.168.123.107:0/566694542 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e50108740 0x7f1e5010ab30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.245 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.246+0000 7f1e564a6640 1 -- 192.168.123.107:0/566694542 >> 192.168.123.107:0/566694542 conn(0x7f1e5006d4f0 msgr2=0x7f1e5006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:24.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.246+0000 7f1e564a6640 1 -- 192.168.123.107:0/566694542 shutdown_connections 2026-03-09T19:27:24.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.246+0000 7f1e564a6640 1 -- 192.168.123.107:0/566694542 wait complete. 2026-03-09T19:27:24.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.247+0000 7f1e564a6640 1 Processor -- start 2026-03-09T19:27:24.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.247+0000 7f1e564a6640 1 -- start start 2026-03-09T19:27:24.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.248+0000 7f1e564a6640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e50108740 0x7f1e501a0800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:24.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.248+0000 7f1e564a6640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e5010b100 0x7f1e501a0d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:24.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.248+0000 7f1e564a6640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1e501a1310 con 0x7f1e50108740 2026-03-09T19:27:24.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.248+0000 7f1e564a6640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1e501a1480 con 0x7f1e5010b100 2026-03-09T19:27:24.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.248+0000 7f1e4f7fe640 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e5010b100 0x7f1e501a0d40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:24.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.248+0000 7f1e4f7fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e5010b100 0x7f1e501a0d40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:35404/0 (socket says 192.168.123.107:35404) 2026-03-09T19:27:24.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.248+0000 7f1e4f7fe640 1 -- 192.168.123.107:0/415814876 learned_addr learned my addr 192.168.123.107:0/415814876 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:24.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.248+0000 7f1e4ffff640 1 --2- 192.168.123.107:0/415814876 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e50108740 0x7f1e501a0800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:24.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.249+0000 7f1e4ffff640 1 -- 192.168.123.107:0/415814876 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e5010b100 msgr2=0x7f1e501a0d40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:24.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.249+0000 7f1e4ffff640 1 --2- 192.168.123.107:0/415814876 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e5010b100 0x7f1e501a0d40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.249+0000 7f1e4ffff640 1 -- 
192.168.123.107:0/415814876 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1e30009590 con 0x7f1e50108740 2026-03-09T19:27:24.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.249+0000 7f1e4ffff640 1 --2- 192.168.123.107:0/415814876 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e50108740 0x7f1e501a0800 secure :-1 s=READY pgs=337 cs=0 l=1 rev1=1 crypto rx=0x7f1e3800e9a0 tx=0x7f1e3800ee70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:24.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.249+0000 7f1e4d7fa640 1 -- 192.168.123.107:0/415814876 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1e3800cd30 con 0x7f1e50108740 2026-03-09T19:27:24.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.249+0000 7f1e564a6640 1 -- 192.168.123.107:0/415814876 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1e501a5f20 con 0x7f1e50108740 2026-03-09T19:27:24.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.249+0000 7f1e564a6640 1 -- 192.168.123.107:0/415814876 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1e501a6380 con 0x7f1e50108740 2026-03-09T19:27:24.250 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.250+0000 7f1e4d7fa640 1 -- 192.168.123.107:0/415814876 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1e38004590 con 0x7f1e50108740 2026-03-09T19:27:24.250 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.250+0000 7f1e4d7fa640 1 -- 192.168.123.107:0/415814876 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1e38010640 con 0x7f1e50108740 2026-03-09T19:27:24.250 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.250+0000 7f1e46ffd640 1 -- 192.168.123.107:0/415814876 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1e50111e30 con 0x7f1e50108740 2026-03-09T19:27:24.251 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.251+0000 7f1e4d7fa640 1 -- 192.168.123.107:0/415814876 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f1e380107a0 con 0x7f1e50108740 2026-03-09T19:27:24.251 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.252+0000 7f1e4d7fa640 1 --2- 192.168.123.107:0/415814876 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1e18076290 0x7f1e18078750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:24.251 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.252+0000 7f1e4d7fa640 1 -- 192.168.123.107:0/415814876 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f1e38014070 con 0x7f1e50108740 2026-03-09T19:27:24.251 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.252+0000 7f1e4f7fe640 1 --2- 192.168.123.107:0/415814876 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1e18076290 0x7f1e18078750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:24.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.253+0000 7f1e4f7fe640 1 --2- 192.168.123.107:0/415814876 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1e18076290 0x7f1e18078750 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f1e30002be0 tx=0x7f1e3003a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:24.254 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.255+0000 7f1e4d7fa640 1 -- 192.168.123.107:0/415814876 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f1e38062a30 con 0x7f1e50108740 2026-03-09T19:27:24.259 INFO:tasks.workunit.client.1.vm08.stdout:9/571: creat d0/d2/fc1 x:0 0 0 2026-03-09T19:27:24.264 INFO:tasks.workunit.client.0.vm07.stdout:7/376: rmdir d0/d52/d54/d55 39 2026-03-09T19:27:24.269 INFO:tasks.workunit.client.1.vm08.stdout:9/572: dwrite d0/d1b/daa/fb6 [0,4194304] 0 2026-03-09T19:27:24.291 INFO:tasks.workunit.client.1.vm08.stdout:8/548: dwrite de/d1d/d69/fb7 [0,4194304] 0 2026-03-09T19:27:24.333 INFO:tasks.workunit.client.0.vm07.stdout:1/339: write d1/d11/d37/f40 [332072,74597] 0 2026-03-09T19:27:24.335 INFO:tasks.workunit.client.0.vm07.stdout:1/340: readlink d1/d3e/l54 0 2026-03-09T19:27:24.335 INFO:tasks.workunit.client.0.vm07.stdout:1/341: read d1/d3/f12 [1783280,84826] 0 2026-03-09T19:27:24.338 INFO:tasks.workunit.client.1.vm08.stdout:6/597: write d3/d94/fb2 [933459,34241] 0 2026-03-09T19:27:24.338 INFO:tasks.workunit.client.1.vm08.stdout:3/649: write d0/d8/f66 [631021,62676] 0 2026-03-09T19:27:24.341 INFO:tasks.workunit.client.0.vm07.stdout:6/303: write d0/d1/db/d24/d53/f35 [2666297,72089] 0 2026-03-09T19:27:24.346 INFO:tasks.workunit.client.1.vm08.stdout:6/598: dwrite d3/f32 [0,4194304] 0 2026-03-09T19:27:24.352 INFO:tasks.workunit.client.1.vm08.stdout:2/534: getdents d3/d9/d79/d46/d8c 0 2026-03-09T19:27:24.354 INFO:tasks.workunit.client.1.vm08.stdout:3/650: dwrite d0/d6/de/f86 [0,4194304] 0 2026-03-09T19:27:24.356 INFO:tasks.workunit.client.0.vm07.stdout:5/345: mknod d3/d1a/d28/d48/c74 0 2026-03-09T19:27:24.357 INFO:tasks.workunit.client.0.vm07.stdout:5/346: dread - d3/f2f zero size 2026-03-09T19:27:24.366 INFO:tasks.workunit.client.1.vm08.stdout:4/575: dread da/d10/f25 [4194304,4194304] 0 2026-03-09T19:27:24.369 
INFO:tasks.workunit.client.1.vm08.stdout:7/645: creat d5/d14/d2b/d4b/fd6 x:0 0 0 2026-03-09T19:27:24.370 INFO:tasks.workunit.client.0.vm07.stdout:4/330: rename d3/d11/d16/l1c to d3/d11/d2b/l73 0 2026-03-09T19:27:24.376 INFO:tasks.workunit.client.0.vm07.stdout:8/369: creat d7/d30/d75/f88 x:0 0 0 2026-03-09T19:27:24.381 INFO:tasks.workunit.client.1.vm08.stdout:8/549: truncate de/d1d/d21/f45 1006476 0 2026-03-09T19:27:24.381 INFO:tasks.workunit.client.0.vm07.stdout:7/377: dread - d0/d4/d5/d26/d3c/d58/f5d zero size 2026-03-09T19:27:24.383 INFO:tasks.workunit.client.0.vm07.stdout:9/394: dwrite d0/f56 [0,4194304] 0 2026-03-09T19:27:24.384 INFO:tasks.workunit.client.0.vm07.stdout:9/395: dread - d0/d6/f7b zero size 2026-03-09T19:27:24.384 INFO:tasks.workunit.client.0.vm07.stdout:9/396: read d0/d6/ff [3109995,63864] 0 2026-03-09T19:27:24.386 INFO:tasks.workunit.client.1.vm08.stdout:8/550: dread de/f5d [0,4194304] 0 2026-03-09T19:27:24.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.403+0000 7f1e46ffd640 1 -- 192.168.123.107:0/415814876 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f1e50061980 con 0x7f1e50108740 2026-03-09T19:27:24.409 INFO:tasks.workunit.client.1.vm08.stdout:8/551: write de/d1d/d4f/f9c [173145,130589] 0 2026-03-09T19:27:24.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.405+0000 7f1e4d7fa640 1 -- 192.168.123.107:0/415814876 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1856 (secure 0 0 0) 0x7f1e380623d0 con 0x7f1e50108740 2026-03-09T19:27:24.409 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:27:24.409 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:27:24.409 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on 
dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:27:24.409 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 
2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:27:24.410 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:27:24.410 
INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.407+0000 7f1e46ffd640 1 -- 192.168.123.107:0/415814876 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1e18076290 msgr2=0x7f1e18078750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.407+0000 7f1e46ffd640 1 --2- 192.168.123.107:0/415814876 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f1e18076290 0x7f1e18078750 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f1e30002be0 tx=0x7f1e3003a040 comp rx=0 tx=0).stop 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.407+0000 7f1e46ffd640 1 -- 192.168.123.107:0/415814876 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e50108740 msgr2=0x7f1e501a0800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.407+0000 7f1e46ffd640 1 --2- 192.168.123.107:0/415814876 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e50108740 0x7f1e501a0800 secure :-1 s=READY pgs=337 cs=0 l=1 rev1=1 crypto rx=0x7f1e3800e9a0 tx=0x7f1e3800ee70 comp rx=0 tx=0).stop 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.407+0000 7f1e46ffd640 1 -- 192.168.123.107:0/415814876 shutdown_connections 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.407+0000 7f1e46ffd640 1 --2- 192.168.123.107:0/415814876 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] 
conn(0x7f1e18076290 0x7f1e18078750 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.407+0000 7f1e46ffd640 1 --2- 192.168.123.107:0/415814876 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e5010b100 0x7f1e501a0d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.407+0000 7f1e46ffd640 1 --2- 192.168.123.107:0/415814876 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e50108740 0x7f1e501a0800 unknown :-1 s=CLOSED pgs=337 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.407+0000 7f1e46ffd640 1 -- 192.168.123.107:0/415814876 >> 192.168.123.107:0/415814876 conn(0x7f1e5006d4f0 msgr2=0x7f1e5010b880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.407+0000 7f1e46ffd640 1 -- 192.168.123.107:0/415814876 shutdown_connections 2026-03-09T19:27:24.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.407+0000 7f1e46ffd640 1 -- 192.168.123.107:0/415814876 wait complete. 
2026-03-09T19:27:24.411 INFO:tasks.workunit.client.1.vm08.stdout:2/535: fsync d3/d4/d23/d2c/d39/d5e/de/f7a 0 2026-03-09T19:27:24.412 INFO:tasks.workunit.client.1.vm08.stdout:0/600: link dd/d22/d24/d49/c58 dd/d22/d63/cbe 0 2026-03-09T19:27:24.413 INFO:tasks.workunit.client.1.vm08.stdout:1/728: getdents d9/da/d2d/d62 0 2026-03-09T19:27:24.415 INFO:tasks.workunit.client.1.vm08.stdout:8/552: creat de/d47/fc1 x:0 0 0 2026-03-09T19:27:24.416 INFO:tasks.workunit.client.1.vm08.stdout:6/599: creat d3/d34/d6f/fd7 x:0 0 0 2026-03-09T19:27:24.418 INFO:tasks.workunit.client.1.vm08.stdout:2/536: truncate d3/d9/f20 1239443 0 2026-03-09T19:27:24.419 INFO:tasks.workunit.client.1.vm08.stdout:0/601: creat dd/d22/d27/d6c/fbf x:0 0 0 2026-03-09T19:27:24.421 INFO:tasks.workunit.client.1.vm08.stdout:1/729: fsync d9/d11/d7a/f80 0 2026-03-09T19:27:24.422 INFO:tasks.workunit.client.1.vm08.stdout:8/553: mknod de/d25/d31/cc2 0 2026-03-09T19:27:24.430 INFO:tasks.workunit.client.1.vm08.stdout:6/600: rmdir d3/db/d43/d69/da0 39 2026-03-09T19:27:24.430 INFO:tasks.workunit.client.1.vm08.stdout:0/602: symlink dd/d22/d27/d4f/lc0 0 2026-03-09T19:27:24.430 INFO:tasks.workunit.client.1.vm08.stdout:0/603: dread dd/d22/d24/d49/f5f [4194304,4194304] 0 2026-03-09T19:27:24.430 INFO:tasks.workunit.client.0.vm07.stdout:1/342: creat d1/db/d31/d4f/f77 x:0 0 0 2026-03-09T19:27:24.430 INFO:tasks.workunit.client.0.vm07.stdout:1/343: write d1/d3/f23 [3668906,36106] 0 2026-03-09T19:27:24.430 INFO:tasks.workunit.client.0.vm07.stdout:1/344: stat d1/d3/d21 0 2026-03-09T19:27:24.430 INFO:tasks.workunit.client.0.vm07.stdout:6/304: chown d0/d1/db/d17/f1a 75061 1 2026-03-09T19:27:24.430 INFO:tasks.workunit.client.0.vm07.stdout:5/347: fsync d3/d1a/f1c 0 2026-03-09T19:27:24.431 INFO:tasks.workunit.client.1.vm08.stdout:0/604: write dd/d31/f54 [1690854,21487] 0 2026-03-09T19:27:24.433 INFO:tasks.workunit.client.0.vm07.stdout:3/396: rename d1/d1f/d16/d28/l63 to d1/d6/dd/d51/l7a 0 2026-03-09T19:27:24.436 
INFO:tasks.workunit.client.0.vm07.stdout:0/301: rmdir d0/d6/d13/d17/d19 39 2026-03-09T19:27:24.436 INFO:tasks.workunit.client.0.vm07.stdout:7/378: stat d0/l29 0 2026-03-09T19:27:24.436 INFO:tasks.workunit.client.1.vm08.stdout:6/601: mknod d3/db/cd8 0 2026-03-09T19:27:24.437 INFO:tasks.workunit.client.0.vm07.stdout:7/379: write d0/d4/d5/f20 [3767336,14845] 0 2026-03-09T19:27:24.439 INFO:tasks.workunit.client.0.vm07.stdout:8/370: dwrite d7/d9/d37/d45/d56/f5f [0,4194304] 0 2026-03-09T19:27:24.448 INFO:tasks.workunit.client.0.vm07.stdout:1/345: creat d1/db/d31/f78 x:0 0 0 2026-03-09T19:27:24.448 INFO:tasks.workunit.client.0.vm07.stdout:6/305: mkdir d0/d1/db/d1d/d77 0 2026-03-09T19:27:24.449 INFO:tasks.workunit.client.1.vm08.stdout:6/602: dread d3/f2a [0,4194304] 0 2026-03-09T19:27:24.451 INFO:tasks.workunit.client.0.vm07.stdout:5/348: creat d3/dd/d26/d3f/d47/d56/f75 x:0 0 0 2026-03-09T19:27:24.451 INFO:tasks.workunit.client.0.vm07.stdout:5/349: dread - d3/d1a/d28/d36/f63 zero size 2026-03-09T19:27:24.452 INFO:tasks.workunit.client.1.vm08.stdout:1/730: link d9/da/dc/f1d d9/da/d12/d91/dc5/fd7 0 2026-03-09T19:27:24.452 INFO:tasks.workunit.client.1.vm08.stdout:1/731: chown d9/da/dc/l83 3 1 2026-03-09T19:27:24.455 INFO:tasks.workunit.client.0.vm07.stdout:3/397: symlink d1/d6/d45/l7b 0 2026-03-09T19:27:24.457 INFO:tasks.workunit.client.1.vm08.stdout:6/603: mknod d3/d15/dc2/cd9 0 2026-03-09T19:27:24.459 INFO:tasks.workunit.client.0.vm07.stdout:3/398: dwrite d1/d1f/d16/f39 [0,4194304] 0 2026-03-09T19:27:24.470 INFO:tasks.workunit.client.0.vm07.stdout:0/302: readlink d0/d6/d13/d17/l2a 0 2026-03-09T19:27:24.471 INFO:tasks.workunit.client.1.vm08.stdout:1/732: mknod d9/da/d2d/cd8 0 2026-03-09T19:27:24.472 INFO:tasks.workunit.client.1.vm08.stdout:0/605: dread dd/d22/d27/d2e/f39 [0,4194304] 0 2026-03-09T19:27:24.477 INFO:tasks.workunit.client.1.vm08.stdout:6/604: rename d3/f61 to d3/d34/da9/da4/fda 0 2026-03-09T19:27:24.490 INFO:tasks.workunit.client.1.vm08.stdout:0/606: read - 
dd/d22/d27/d2e/db0/fa4 zero size 2026-03-09T19:27:24.490 INFO:tasks.workunit.client.1.vm08.stdout:1/733: rename d9/da/d17/c3b to d9/da/d2d/cd9 0 2026-03-09T19:27:24.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.479+0000 7f1956655640 1 -- 192.168.123.107:0/3257231906 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19500fe8f0 msgr2=0x7f19500fecf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:24.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.479+0000 7f1956655640 1 --2- 192.168.123.107:0/3257231906 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19500fe8f0 0x7f19500fecf0 secure :-1 s=READY pgs=338 cs=0 l=1 rev1=1 crypto rx=0x7f194400b0a0 tx=0x7f194402f4a0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.479+0000 7f1956655640 1 -- 192.168.123.107:0/3257231906 shutdown_connections 2026-03-09T19:27:24.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.479+0000 7f1956655640 1 --2- 192.168.123.107:0/3257231906 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f19500ff5d0 0x7f19500ffa50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.479+0000 7f1956655640 1 --2- 192.168.123.107:0/3257231906 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19500fe8f0 0x7f19500fecf0 unknown :-1 s=CLOSED pgs=338 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.479+0000 7f1956655640 1 -- 192.168.123.107:0/3257231906 >> 192.168.123.107:0/3257231906 conn(0x7f19500fa410 msgr2=0x7f19500fc830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:24.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.479+0000 7f1956655640 1 -- 192.168.123.107:0/3257231906 
shutdown_connections 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.479+0000 7f1956655640 1 -- 192.168.123.107:0/3257231906 wait complete. 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.480+0000 7f1956655640 1 Processor -- start 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.480+0000 7f1956655640 1 -- start start 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.480+0000 7f1956655640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19500fe8f0 0x7f195019a270 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:24.491 INFO:tasks.workunit.client.0.vm07.stdout:7/380: write d0/d52/d54/f7d [2556716,40734] 0 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.482+0000 7f1956655640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f19500ff5d0 0x7f195019a7b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.482+0000 7f1956655640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f195019ad80 con 0x7f19500fe8f0 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.482+0000 7f1956655640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f195019aef0 con 0x7f19500ff5d0 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.483+0000 7f194f7fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f19500ff5d0 0x7f195019a7b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:24.491 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.483+0000 7f194f7fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f19500ff5d0 0x7f195019a7b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:35414/0 (socket says 192.168.123.107:35414) 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.483+0000 7f194f7fe640 1 -- 192.168.123.107:0/1138716566 learned_addr learned my addr 192.168.123.107:0/1138716566 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.483+0000 7f194f7fe640 1 -- 192.168.123.107:0/1138716566 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19500fe8f0 msgr2=0x7f195019a270 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.483+0000 7f194f7fe640 1 --2- 192.168.123.107:0/1138716566 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19500fe8f0 0x7f195019a270 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.483+0000 7f194f7fe640 1 -- 192.168.123.107:0/1138716566 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1944009d00 con 0x7f19500ff5d0 2026-03-09T19:27:24.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.483+0000 7f194f7fe640 1 --2- 192.168.123.107:0/1138716566 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f19500ff5d0 0x7f195019a7b0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f193c00b850 tx=0x7f193c00bd20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:24.491 
INFO:tasks.workunit.client.0.vm07.stdout:8/371: creat d7/d9/d37/d45/d56/d62/f89 x:0 0 0 2026-03-09T19:27:24.491 INFO:tasks.workunit.client.0.vm07.stdout:8/372: read - d7/d50/f82 zero size 2026-03-09T19:27:24.495 INFO:tasks.workunit.client.1.vm08.stdout:1/734: rmdir d9/d11/d7a/d89 39 2026-03-09T19:27:24.504 INFO:tasks.workunit.client.0.vm07.stdout:4/331: creat d3/d11/f74 x:0 0 0 2026-03-09T19:27:24.504 INFO:tasks.workunit.client.0.vm07.stdout:4/332: dwrite d3/fc [0,4194304] 0 2026-03-09T19:27:24.506 INFO:tasks.workunit.client.0.vm07.stdout:3/399: mkdir d1/d1f/d16/d28/d7c 0 2026-03-09T19:27:24.506 INFO:tasks.workunit.client.1.vm08.stdout:6/605: link d3/d34/da9/f97 d3/db/fdb 0 2026-03-09T19:27:24.523 INFO:tasks.workunit.client.1.vm08.stdout:6/606: write d3/d34/da9/fc7 [845704,75715] 0 2026-03-09T19:27:24.523 INFO:tasks.workunit.client.0.vm07.stdout:5/350: dread d3/dd/f24 [0,4194304] 0 2026-03-09T19:27:24.523 INFO:tasks.workunit.client.0.vm07.stdout:7/381: truncate d0/d4/d5/dd/f47 746157 0 2026-03-09T19:27:24.523 INFO:tasks.workunit.client.0.vm07.stdout:7/382: readlink d0/d4/d5/d26/d32/l3f 0 2026-03-09T19:27:24.523 INFO:tasks.workunit.client.0.vm07.stdout:8/373: fsync d7/d1d/f4d 0 2026-03-09T19:27:24.523 INFO:tasks.workunit.client.0.vm07.stdout:0/303: fdatasync d0/d6/d13/d1c/f27 0 2026-03-09T19:27:24.523 INFO:tasks.workunit.client.0.vm07.stdout:3/400: creat d1/d1f/d5c/f7d x:0 0 0 2026-03-09T19:27:24.523 INFO:tasks.workunit.client.0.vm07.stdout:8/374: dwrite d7/d50/f6f [0,4194304] 0 2026-03-09T19:27:24.524 INFO:tasks.workunit.client.0.vm07.stdout:5/351: mkdir d3/dd/d26/d3f/d47/d71/d76 0 2026-03-09T19:27:24.528 INFO:tasks.workunit.client.0.vm07.stdout:7/383: link d0/d4/d5/d26/d3c/d39/f7a d0/d4/d5/dd/f83 0 2026-03-09T19:27:24.529 INFO:tasks.workunit.client.0.vm07.stdout:7/384: truncate d0/d4/d5/d26/d3c/d58/f71 4415417 0 2026-03-09T19:27:24.533 INFO:tasks.workunit.client.0.vm07.stdout:8/375: chown d7/d9/d37/d45/d4f/l72 613208 1 2026-03-09T19:27:24.534 
INFO:tasks.workunit.client.0.vm07.stdout:5/352: write d3/d1a/f12 [2919655,116358] 0 2026-03-09T19:27:24.535 INFO:tasks.workunit.client.0.vm07.stdout:3/401: fsync d1/d74/f52 0 2026-03-09T19:27:24.538 INFO:tasks.workunit.client.0.vm07.stdout:7/385: unlink d0/d4/fc 0 2026-03-09T19:27:24.545 INFO:tasks.workunit.client.0.vm07.stdout:5/353: mknod d3/dd/d26/d3f/d47/d71/d76/c77 0 2026-03-09T19:27:24.547 INFO:tasks.workunit.client.0.vm07.stdout:8/376: creat d7/d9/d57/d7c/f8a x:0 0 0 2026-03-09T19:27:24.548 INFO:tasks.workunit.client.0.vm07.stdout:5/354: creat d3/d1a/d5d/f78 x:0 0 0 2026-03-09T19:27:24.550 INFO:tasks.workunit.client.0.vm07.stdout:8/377: rename d7/d16/d1e/l59 to d7/d9/d37/d45/d56/l8b 0 2026-03-09T19:27:24.555 INFO:tasks.workunit.client.0.vm07.stdout:7/386: link d0/d4/d5/d8/d41/d64/l68 d0/d52/d54/l84 0 2026-03-09T19:27:24.556 INFO:tasks.workunit.client.0.vm07.stdout:8/378: symlink d7/d30/l8c 0 2026-03-09T19:27:24.559 INFO:tasks.workunit.client.0.vm07.stdout:5/355: getdents d3/dd/d26/d2d/d60 0 2026-03-09T19:27:24.596 INFO:tasks.workunit.client.0.vm07.stdout:5/356: getdents d3/dd/d26/d2c 0 2026-03-09T19:27:24.599 INFO:tasks.workunit.client.0.vm07.stdout:5/357: dwrite d3/f19 [4194304,4194304] 0 2026-03-09T19:27:24.602 INFO:tasks.workunit.client.0.vm07.stdout:0/304: dread d0/d6/d13/d17/f20 [0,4194304] 0 2026-03-09T19:27:24.604 INFO:tasks.workunit.client.0.vm07.stdout:4/333: sync 2026-03-09T19:27:24.605 INFO:tasks.workunit.client.0.vm07.stdout:4/334: stat d3/d11/c27 0 2026-03-09T19:27:24.606 INFO:tasks.workunit.client.0.vm07.stdout:5/358: dwrite d3/d1a/f1c [0,4194304] 0 2026-03-09T19:27:24.608 INFO:tasks.workunit.client.0.vm07.stdout:0/305: creat d0/d6/d13/d1c/d61/d69/f6b x:0 0 0 2026-03-09T19:27:24.612 INFO:tasks.workunit.client.0.vm07.stdout:8/379: sync 2026-03-09T19:27:24.618 INFO:tasks.workunit.client.0.vm07.stdout:4/335: dwrite d3/d11/d16/d2f/d22/f24 [0,4194304] 0 2026-03-09T19:27:24.630 INFO:tasks.workunit.client.0.vm07.stdout:8/380: rename d7/d9/c2a to 
d7/d9/d37/d45/d56/d67/c8d 0 2026-03-09T19:27:24.631 INFO:tasks.workunit.client.0.vm07.stdout:0/306: creat d0/d6/d13/f6c x:0 0 0 2026-03-09T19:27:24.636 INFO:tasks.workunit.client.0.vm07.stdout:4/336: symlink d3/d4f/d56/d5f/l75 0 2026-03-09T19:27:24.641 INFO:tasks.workunit.client.0.vm07.stdout:8/381: fdatasync d7/d9/d37/d45/d4f/f66 0 2026-03-09T19:27:24.641 INFO:tasks.workunit.client.0.vm07.stdout:5/359: getdents d3/d1a/d28/d40 0 2026-03-09T19:27:24.641 INFO:tasks.workunit.client.0.vm07.stdout:0/307: dwrite d0/d6/f4f [0,4194304] 0 2026-03-09T19:27:24.642 INFO:tasks.workunit.client.0.vm07.stdout:4/337: rmdir d3/d11/d51 39 2026-03-09T19:27:24.651 INFO:tasks.workunit.client.0.vm07.stdout:5/360: mkdir d3/dd/d26/d2d/d79 0 2026-03-09T19:27:24.651 INFO:tasks.workunit.client.0.vm07.stdout:5/361: chown d3/dd/l13 301595 1 2026-03-09T19:27:24.689 INFO:tasks.workunit.client.0.vm07.stdout:0/308: fsync d0/d6/d13/d1c/f27 0 2026-03-09T19:27:24.689 INFO:tasks.workunit.client.0.vm07.stdout:4/338: rename d3/d11/d29/d34/d50/l55 to d3/d11/d29/d34/l76 0 2026-03-09T19:27:24.689 INFO:tasks.workunit.client.0.vm07.stdout:5/362: truncate d3/d1a/d5d/f5f 1464792 0 2026-03-09T19:27:24.690 INFO:tasks.workunit.client.0.vm07.stdout:8/382: link d7/d9/d57/l6a d7/d9/l8e 0 2026-03-09T19:27:24.691 INFO:tasks.workunit.client.0.vm07.stdout:8/383: chown d7/d9/l35 345650330 1 2026-03-09T19:27:24.693 INFO:tasks.workunit.client.0.vm07.stdout:4/339: creat d3/d11/d16/f77 x:0 0 0 2026-03-09T19:27:24.698 INFO:tasks.workunit.client.0.vm07.stdout:0/309: dwrite d0/d6/d13/d17/f20 [4194304,4194304] 0 2026-03-09T19:27:24.700 INFO:tasks.workunit.client.0.vm07.stdout:0/310: fsync d0/f3a 0 2026-03-09T19:27:24.704 INFO:tasks.workunit.client.0.vm07.stdout:0/311: stat d0/d6/f4f 0 2026-03-09T19:27:24.704 INFO:tasks.workunit.client.0.vm07.stdout:8/384: chown d7/d9/d10/l26 624 1 2026-03-09T19:27:24.704 INFO:tasks.workunit.client.0.vm07.stdout:5/363: link d3/d1a/d28/d48/f50 d3/d1a/d28/d6c/f7a 0 2026-03-09T19:27:24.705 
INFO:tasks.workunit.client.0.vm07.stdout:4/340: fdatasync d3/d11/d2b/f2c 0 2026-03-09T19:27:24.705 INFO:tasks.workunit.client.0.vm07.stdout:4/341: stat d3/d11/d2b/d38 0 2026-03-09T19:27:24.706 INFO:tasks.workunit.client.0.vm07.stdout:4/342: chown d3/d11/d29/l62 1223 1 2026-03-09T19:27:24.706 INFO:tasks.workunit.client.0.vm07.stdout:4/343: dread - d3/d11/f74 zero size 2026-03-09T19:27:24.712 INFO:tasks.workunit.client.0.vm07.stdout:2/397: write d3/f1a [36376,46997] 0 2026-03-09T19:27:24.723 INFO:tasks.workunit.client.1.vm08.stdout:5/549: write d16/d1e/d3b/f3c [1594726,122336] 0 2026-03-09T19:27:24.723 INFO:tasks.workunit.client.1.vm08.stdout:5/550: creat d16/d45/fb1 x:0 0 0 2026-03-09T19:27:24.723 INFO:tasks.workunit.client.0.vm07.stdout:8/385: creat d7/d50/f8f x:0 0 0 2026-03-09T19:27:24.723 INFO:tasks.workunit.client.0.vm07.stdout:5/364: dread d3/d1a/d28/d6c/f7a [0,4194304] 0 2026-03-09T19:27:24.723 INFO:tasks.workunit.client.0.vm07.stdout:2/398: dread - d3/dd/d16/d29/d3c/d5a/d7a/f6e zero size 2026-03-09T19:27:24.723 INFO:tasks.workunit.client.0.vm07.stdout:5/365: dread - d3/d1a/d5d/f78 zero size 2026-03-09T19:27:24.723 INFO:tasks.workunit.client.0.vm07.stdout:2/399: dread - d3/dd/d16/d29/d3c/d5a/d7a/f6e zero size 2026-03-09T19:27:24.723 INFO:tasks.workunit.client.0.vm07.stdout:0/312: truncate d0/d6/d13/d1c/f36 757339 0 2026-03-09T19:27:24.724 INFO:tasks.workunit.client.0.vm07.stdout:4/344: dwrite d3/d11/d2b/f49 [4194304,4194304] 0 2026-03-09T19:27:24.725 INFO:tasks.workunit.client.0.vm07.stdout:0/313: chown d0/d6/d13/d1c/f3e 1106116596 1 2026-03-09T19:27:24.729 INFO:tasks.workunit.client.0.vm07.stdout:0/314: mknod d0/d6/d13/d1c/c6d 0 2026-03-09T19:27:24.730 INFO:tasks.workunit.client.0.vm07.stdout:0/315: unlink d0/d6/d13/d1c/d52/l59 0 2026-03-09T19:27:24.743 INFO:tasks.workunit.client.0.vm07.stdout:0/316: dwrite d0/d6/d13/f31 [0,4194304] 0 2026-03-09T19:27:24.755 INFO:tasks.workunit.client.0.vm07.stdout:4/345: dread d3/d11/d2b/d38/f4a [0,4194304] 0 
2026-03-09T19:27:24.758 INFO:tasks.workunit.client.0.vm07.stdout:0/317: dwrite d0/d6/d13/d1c/d11/f2e [0,4194304] 0 2026-03-09T19:27:24.760 INFO:tasks.workunit.client.0.vm07.stdout:0/318: fdatasync d0/d6/d13/d1c/d61/d69/f6b 0 2026-03-09T19:27:24.780 INFO:tasks.workunit.client.1.vm08.stdout:3/651: write d0/d4b/fb0 [72069,76216] 0 2026-03-09T19:27:24.781 INFO:tasks.workunit.client.1.vm08.stdout:4/576: write da/d10/d26/d38/f43 [1971742,52622] 0 2026-03-09T19:27:24.782 INFO:tasks.workunit.client.1.vm08.stdout:7/646: write d5/d14/dae/f49 [594053,76572] 0 2026-03-09T19:27:24.784 INFO:tasks.workunit.client.0.vm07.stdout:4/346: read - d3/d11/d29/f42 zero size 2026-03-09T19:27:24.785 INFO:tasks.workunit.client.1.vm08.stdout:4/577: chown da/d10/d16/d28/fa3 1 1 2026-03-09T19:27:24.786 INFO:tasks.workunit.client.0.vm07.stdout:0/319: unlink d0/d6/f38 0 2026-03-09T19:27:24.803 INFO:tasks.workunit.client.1.vm08.stdout:7/647: dread d5/d14/d2b/f37 [4194304,4194304] 0 2026-03-09T19:27:24.824 INFO:tasks.workunit.client.0.vm07.stdout:0/320: rename d0/d6/d13/d1c/d11/c5e to d0/d6/d13/d1c/d11/d56/d62/c6e 0 2026-03-09T19:27:24.828 INFO:tasks.workunit.client.1.vm08.stdout:2/537: dwrite d3/d9/d4a/f89 [0,4194304] 0 2026-03-09T19:27:24.850 INFO:tasks.workunit.client.1.vm08.stdout:7/648: dread - d5/d14/f46 zero size 2026-03-09T19:27:24.850 INFO:tasks.workunit.client.1.vm08.stdout:2/538: truncate d3/d9/d26/f6a 3032973 0 2026-03-09T19:27:24.853 INFO:tasks.workunit.client.0.vm07.stdout:0/321: creat d0/d6/d13/d17/d19/d57/f6f x:0 0 0 2026-03-09T19:27:24.853 INFO:tasks.workunit.client.1.vm08.stdout:2/539: creat d3/d9/d79/d46/d8c/fbb x:0 0 0 2026-03-09T19:27:24.854 INFO:tasks.workunit.client.0.vm07.stdout:0/322: chown d0/d6/d13/d17/d19/d57/d6a 14 1 2026-03-09T19:27:24.854 INFO:tasks.workunit.client.1.vm08.stdout:2/540: write d3/d4/f8 [306973,63177] 0 2026-03-09T19:27:24.856 INFO:tasks.workunit.client.0.vm07.stdout:9/397: write d0/d17/f33 [14982,20077] 0 2026-03-09T19:27:24.863 
INFO:tasks.workunit.client.0.vm07.stdout:9/398: dwrite d0/db/d29/d32/d5c/f78 [0,4194304] 0 2026-03-09T19:27:24.868 INFO:tasks.workunit.client.1.vm08.stdout:4/578: dread da/d10/d1b/f79 [0,4194304] 0 2026-03-09T19:27:24.868 INFO:tasks.workunit.client.1.vm08.stdout:7/649: symlink d5/d14/dae/d3a/d42/d6a/d8f/ld7 0 2026-03-09T19:27:24.877 INFO:tasks.workunit.client.1.vm08.stdout:2/541: dread d3/d4/d23/d2c/f5b [0,4194304] 0 2026-03-09T19:27:24.887 INFO:tasks.workunit.client.0.vm07.stdout:0/323: link d0/d6/d13/d17/d19/d57/c5d d0/d6/d13/d1c/d11/c70 0 2026-03-09T19:27:24.890 INFO:tasks.workunit.client.1.vm08.stdout:0/607: dwrite dd/d7e/f8e [0,4194304] 0 2026-03-09T19:27:24.891 INFO:tasks.workunit.client.0.vm07.stdout:1/346: dwrite d1/f1d [0,4194304] 0 2026-03-09T19:27:24.898 INFO:tasks.workunit.client.0.vm07.stdout:1/347: creat d1/db/d31/d4f/f79 x:0 0 0 2026-03-09T19:27:24.899 INFO:tasks.workunit.client.0.vm07.stdout:1/348: chown d1/db/d31/d56 19 1 2026-03-09T19:27:24.899 INFO:tasks.workunit.client.0.vm07.stdout:1/349: fdatasync d1/db/d31/d56/f6a 0 2026-03-09T19:27:24.902 INFO:tasks.workunit.client.0.vm07.stdout:1/350: mkdir d1/db/d31/d4f/d7a 0 2026-03-09T19:27:24.919 INFO:tasks.workunit.client.0.vm07.stdout:9/399: dread d0/db/d29/d2c/d36/f3c [0,4194304] 0 2026-03-09T19:27:24.920 INFO:tasks.workunit.client.0.vm07.stdout:9/400: unlink d0/f3 0 2026-03-09T19:27:24.921 INFO:tasks.workunit.client.0.vm07.stdout:9/401: mknod d0/db/d29/c8a 0 2026-03-09T19:27:24.923 INFO:tasks.workunit.client.0.vm07.stdout:9/402: creat d0/db/d29/d2c/d36/d7d/f8b x:0 0 0 2026-03-09T19:27:24.967 INFO:tasks.workunit.client.0.vm07.stdout:9/403: mknod d0/db/d29/d4d/c8c 0 2026-03-09T19:27:24.995 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.996+0000 7f194d7fa640 1 -- 192.168.123.107:0/1138716566 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f193c004490 con 0x7f19500ff5d0 2026-03-09T19:27:24.995 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.996+0000 7f194d7fa640 1 -- 192.168.123.107:0/1138716566 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f193c009ce0 con 0x7f19500ff5d0 2026-03-09T19:27:24.995 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.996+0000 7f1956655640 1 -- 192.168.123.107:0/1138716566 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1950101960 con 0x7f19500ff5d0 2026-03-09T19:27:24.995 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.996+0000 7f1956655640 1 -- 192.168.123.107:0/1138716566 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1950101eb0 con 0x7f19500ff5d0 2026-03-09T19:27:24.995 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.996+0000 7f194d7fa640 1 -- 192.168.123.107:0/1138716566 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f193c01d410 con 0x7f19500ff5d0 2026-03-09T19:27:24.996 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.997+0000 7f1956655640 1 -- 192.168.123.107:0/1138716566 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f191c005350 con 0x7f19500ff5d0 2026-03-09T19:27:24.997 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.998+0000 7f194d7fa640 1 -- 192.168.123.107:0/1138716566 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f193c045020 con 0x7f19500ff5d0 2026-03-09T19:27:24.997 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.998+0000 7f194d7fa640 1 --2- 192.168.123.107:0/1138716566 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f19240761c0 0x7f1924078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:24.998 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.999+0000 7f194d7fa640 1 -- 192.168.123.107:0/1138716566 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f193c09ff50 con 0x7f19500ff5d0 2026-03-09T19:27:24.998 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.999+0000 7f194ffff640 1 --2- 192.168.123.107:0/1138716566 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f19240761c0 0x7f1924078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:24.998 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:24.999+0000 7f194ffff640 1 --2- 192.168.123.107:0/1138716566 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f19240761c0 0x7f1924078680 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f1944004480 tx=0x7f1944004550 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:25.005 INFO:tasks.workunit.client.0.vm07.stdout:9/404: sync 2026-03-09T19:27:25.005 INFO:tasks.workunit.client.0.vm07.stdout:1/351: sync 2026-03-09T19:27:25.006 INFO:tasks.workunit.client.1.vm08.stdout:8/554: sync 2026-03-09T19:27:25.006 INFO:tasks.workunit.client.1.vm08.stdout:6/607: sync 2026-03-09T19:27:25.006 INFO:tasks.workunit.client.1.vm08.stdout:5/551: sync 2026-03-09T19:27:25.006 INFO:tasks.workunit.client.1.vm08.stdout:9/573: sync 2026-03-09T19:27:25.006 INFO:tasks.workunit.client.1.vm08.stdout:7/650: sync 2026-03-09T19:27:25.006 INFO:tasks.workunit.client.1.vm08.stdout:0/608: sync 2026-03-09T19:27:25.007 INFO:tasks.workunit.client.0.vm07.stdout:9/405: creat d0/db/d29/d32/d5c/d69/f8d x:0 0 0 2026-03-09T19:27:25.008 INFO:tasks.workunit.client.0.vm07.stdout:9/406: chown d0/db/d29/d2c/d36/d5a/l88 9608274 1 2026-03-09T19:27:25.011 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.010+0000 7f194d7fa640 1 -- 192.168.123.107:0/1138716566 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f193c0695c0 con 0x7f19500ff5d0 2026-03-09T19:27:25.012 INFO:tasks.workunit.client.0.vm07.stdout:9/407: creat d0/db/d29/d68/f8e x:0 0 0 2026-03-09T19:27:25.012 INFO:tasks.workunit.client.0.vm07.stdout:1/352: mknod d1/d11/d37/d5a/d6d/c7b 0 2026-03-09T19:27:25.012 INFO:tasks.workunit.client.1.vm08.stdout:7/651: creat d5/d14/dae/d1c/d83/d9c/fd8 x:0 0 0 2026-03-09T19:27:25.014 INFO:tasks.workunit.client.0.vm07.stdout:9/408: write d0/d6/f7b [404881,106076] 0 2026-03-09T19:27:25.014 INFO:tasks.workunit.client.1.vm08.stdout:8/555: creat de/d25/d31/d82/d6d/fc3 x:0 0 0 2026-03-09T19:27:25.015 INFO:tasks.workunit.client.0.vm07.stdout:9/409: dread - d0/db/d29/d2c/d36/d7d/f8b zero size 2026-03-09T19:27:25.018 INFO:tasks.workunit.client.1.vm08.stdout:9/574: mknod d0/d1b/d97/d48/d5d/d74/cc2 0 2026-03-09T19:27:25.018 INFO:tasks.workunit.client.1.vm08.stdout:5/552: creat d16/d8e/fb2 x:0 0 0 2026-03-09T19:27:25.018 INFO:tasks.workunit.client.0.vm07.stdout:9/410: truncate d0/db/f41 643192 0 2026-03-09T19:27:25.018 INFO:tasks.workunit.client.0.vm07.stdout:1/353: creat d1/d11/d37/d3f/d6e/f7c x:0 0 0 2026-03-09T19:27:25.018 INFO:tasks.workunit.client.0.vm07.stdout:1/354: chown d1/d11/d37/d5a/d6d/c6f 1965026 1 2026-03-09T19:27:25.022 INFO:tasks.workunit.client.1.vm08.stdout:7/652: rename d5/d14/f6c to d5/d14/d27/d78/dc7/fd9 0 2026-03-09T19:27:25.029 INFO:tasks.workunit.client.0.vm07.stdout:1/355: stat d1/d3/d21/l2a 0 2026-03-09T19:27:25.043 INFO:tasks.workunit.client.1.vm08.stdout:9/575: read d0/d2/d8/f2d [290900,42670] 0 2026-03-09T19:27:25.043 INFO:tasks.workunit.client.1.vm08.stdout:6/608: creat d3/db/fdc x:0 0 0 2026-03-09T19:27:25.043 INFO:tasks.workunit.client.1.vm08.stdout:9/576: chown d0 21997015 1 2026-03-09T19:27:25.049 
INFO:tasks.workunit.client.1.vm08.stdout:0/609: symlink dd/d22/lc1 0 2026-03-09T19:27:25.050 INFO:tasks.workunit.client.0.vm07.stdout:2/400: dread d3/f4 [0,4194304] 0 2026-03-09T19:27:25.051 INFO:tasks.workunit.client.0.vm07.stdout:2/401: write d3/dd/f24 [1582438,36987] 0 2026-03-09T19:27:25.057 INFO:tasks.workunit.client.0.vm07.stdout:4/347: dread d3/d11/d2b/f2c [0,4194304] 0 2026-03-09T19:27:25.057 INFO:tasks.workunit.client.1.vm08.stdout:1/735: truncate d9/da/d53/d67/f77 2555510 0 2026-03-09T19:27:25.063 INFO:tasks.workunit.client.0.vm07.stdout:6/306: dwrite d0/d1/db/d1d/f2e [0,4194304] 0 2026-03-09T19:27:25.067 INFO:tasks.workunit.client.0.vm07.stdout:6/307: chown d0/d13/f1c 18594299 1 2026-03-09T19:27:25.074 INFO:tasks.workunit.client.1.vm08.stdout:7/653: creat d5/d14/dae/d1c/d83/d9c/dcb/fda x:0 0 0 2026-03-09T19:27:25.076 INFO:tasks.workunit.client.0.vm07.stdout:0/324: dread d0/d6/d13/d17/d19/f1f [0,4194304] 0 2026-03-09T19:27:25.082 INFO:tasks.workunit.client.0.vm07.stdout:3/402: dwrite d1/d6/f1b [0,4194304] 0 2026-03-09T19:27:25.086 INFO:tasks.workunit.client.1.vm08.stdout:5/553: mkdir d16/d1e/db3 0 2026-03-09T19:27:25.087 INFO:tasks.workunit.client.0.vm07.stdout:7/387: dwrite d0/d4/d5/d8/d1a/d2a/f34 [0,4194304] 0 2026-03-09T19:27:25.121 INFO:tasks.workunit.client.1.vm08.stdout:6/609: chown d3/db/cd 120557124 1 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.1.vm08.stdout:8/556: rename de/d47/d85/laf to de/d25/d87/lc4 0 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.1.vm08.stdout:1/736: symlink d9/da/d17/d60/lda 0 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.1.vm08.stdout:9/577: creat d0/d1b/d4e/da7/fc3 x:0 0 0 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.0.vm07.stdout:2/402: mkdir d3/d11/d38/d86 0 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.0.vm07.stdout:7/388: stat d0/d4/d5/d8 0 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.0.vm07.stdout:1/356: mkdir d1/d11/d37/d3f/d45/d7d 0 2026-03-09T19:27:25.122 
INFO:tasks.workunit.client.0.vm07.stdout:6/308: creat d0/d4e/f78 x:0 0 0 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.0.vm07.stdout:3/403: rename d1/d1f/d16/c59 to d1/d6/d45/c7e 0 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.0.vm07.stdout:2/403: write d3/f22 [3880566,72315] 0 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.0.vm07.stdout:2/404: fdatasync d3/f7c 0 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.0.vm07.stdout:8/386: write d7/d9/d37/d45/f4e [924286,97866] 0 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.0.vm07.stdout:5/366: truncate d3/fe 2248072 0 2026-03-09T19:27:25.122 INFO:tasks.workunit.client.0.vm07.stdout:4/348: symlink d3/l78 0 2026-03-09T19:27:25.126 INFO:tasks.workunit.client.1.vm08.stdout:3/652: truncate d0/d6/f91 2782229 0 2026-03-09T19:27:25.139 INFO:tasks.workunit.client.0.vm07.stdout:2/405: creat d3/dd/d16/d29/d3c/d4c/f87 x:0 0 0 2026-03-09T19:27:25.141 INFO:tasks.workunit.client.1.vm08.stdout:6/610: creat d3/d34/dce/fdd x:0 0 0 2026-03-09T19:27:25.141 INFO:tasks.workunit.client.1.vm08.stdout:6/611: chown d3/d34/dce/fdd 664702 1 2026-03-09T19:27:25.146 INFO:tasks.workunit.client.1.vm08.stdout:4/579: dwrite da/d10/d16/d28/d2f/f80 [0,4194304] 0 2026-03-09T19:27:25.148 INFO:tasks.workunit.client.0.vm07.stdout:8/387: mknod d7/d9/d37/d45/d56/c90 0 2026-03-09T19:27:25.149 INFO:tasks.workunit.client.0.vm07.stdout:8/388: chown d7/d9/d37 226 1 2026-03-09T19:27:25.150 INFO:tasks.workunit.client.1.vm08.stdout:9/578: creat d0/d2/d14/d5c/fc4 x:0 0 0 2026-03-09T19:27:25.150 INFO:tasks.workunit.client.1.vm08.stdout:2/542: write d3/d9/d4a/fa4 [473528,61964] 0 2026-03-09T19:27:25.154 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.153+0000 7f1956655640 1 -- 192.168.123.107:0/1138716566 --> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f191c002bf0 con 0x7f19240761c0 2026-03-09T19:27:25.154 
INFO:tasks.workunit.client.1.vm08.stdout:0/610: creat dd/d22/d24/d49/d50/d78/fc2 x:0 0 0 2026-03-09T19:27:25.156 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.156+0000 7f194d7fa640 1 -- 192.168.123.107:0/1138716566 <== mgr.14227 v2:192.168.123.107:6800/2885771920 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f191c002bf0 con 0x7f19240761c0 2026-03-09T19:27:25.156 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:27:25.156 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:27:25.156 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:27:25.156 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:27:25.156 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [], 2026-03-09T19:27:25.156 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "0/23 daemons upgraded", 2026-03-09T19:27:25.156 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm08", 2026-03-09T19:27:25.156 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:27:25.156 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:27:25.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.160+0000 7f1956655640 1 -- 192.168.123.107:0/1138716566 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f19240761c0 msgr2=0x7f1924078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:25.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.160+0000 7f1956655640 1 --2- 192.168.123.107:0/1138716566 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f19240761c0 0x7f1924078680 secure :-1 s=READY pgs=160 cs=0 
l=1 rev1=1 crypto rx=0x7f1944004480 tx=0x7f1944004550 comp rx=0 tx=0).stop 2026-03-09T19:27:25.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.160+0000 7f1956655640 1 -- 192.168.123.107:0/1138716566 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f19500ff5d0 msgr2=0x7f195019a7b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:25.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.160+0000 7f1956655640 1 --2- 192.168.123.107:0/1138716566 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f19500ff5d0 0x7f195019a7b0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f193c00b850 tx=0x7f193c00bd20 comp rx=0 tx=0).stop 2026-03-09T19:27:25.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.160+0000 7f1956655640 1 -- 192.168.123.107:0/1138716566 shutdown_connections 2026-03-09T19:27:25.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.160+0000 7f1956655640 1 --2- 192.168.123.107:0/1138716566 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f19240761c0 0x7f1924078680 secure :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f1944004480 tx=0x7f1944004550 comp rx=0 tx=0).stop 2026-03-09T19:27:25.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.160+0000 7f1956655640 1 --2- 192.168.123.107:0/1138716566 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f19500ff5d0 0x7f195019a7b0 secure :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f193c00b850 tx=0x7f193c00bd20 comp rx=0 tx=0).stop 2026-03-09T19:27:25.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.160+0000 7f1956655640 1 --2- 192.168.123.107:0/1138716566 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19500fe8f0 0x7f195019a270 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:25.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.160+0000 
7f1956655640 1 -- 192.168.123.107:0/1138716566 >> 192.168.123.107:0/1138716566 conn(0x7f19500fa410 msgr2=0x7f19500fbf20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:25.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.161+0000 7f1956655640 1 -- 192.168.123.107:0/1138716566 shutdown_connections 2026-03-09T19:27:25.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.162+0000 7f1956655640 1 -- 192.168.123.107:0/1138716566 wait complete. 2026-03-09T19:27:25.163 INFO:tasks.workunit.client.1.vm08.stdout:4/580: dwrite da/d10/d26/d27/f35 [4194304,4194304] 0 2026-03-09T19:27:25.166 INFO:tasks.workunit.client.1.vm08.stdout:1/737: creat d9/d11/d7a/d89/fdb x:0 0 0 2026-03-09T19:27:25.166 INFO:tasks.workunit.client.0.vm07.stdout:0/325: sync 2026-03-09T19:27:25.168 INFO:tasks.workunit.client.0.vm07.stdout:1/357: mkdir d1/d11/d37/d3f/d7e 0 2026-03-09T19:27:25.173 INFO:tasks.workunit.client.1.vm08.stdout:3/653: dread - d0/d6/de/d6e/d51/fb5 zero size 2026-03-09T19:27:25.178 INFO:tasks.workunit.client.1.vm08.stdout:3/654: dread d0/d4b/fb0 [0,4194304] 0 2026-03-09T19:27:25.188 INFO:tasks.workunit.client.0.vm07.stdout:6/309: mknod d0/c79 0 2026-03-09T19:27:25.189 INFO:tasks.workunit.client.1.vm08.stdout:5/554: creat d16/d1e/d30/fb4 x:0 0 0 2026-03-09T19:27:25.199 INFO:tasks.workunit.client.0.vm07.stdout:2/406: symlink d3/dd/d16/d29/d2d/d45/d85/l88 0 2026-03-09T19:27:25.205 INFO:tasks.workunit.client.1.vm08.stdout:6/612: mknod d3/d34/d3b/cde 0 2026-03-09T19:27:25.225 INFO:tasks.workunit.client.0.vm07.stdout:9/411: truncate d0/db/d29/d2c/f34 5761472 0 2026-03-09T19:27:25.245 INFO:tasks.workunit.client.0.vm07.stdout:0/326: dread d0/d6/d13/d17/f2b [0,4194304] 0 2026-03-09T19:27:25.245 INFO:tasks.workunit.client.1.vm08.stdout:8/557: creat de/d25/d31/d82/fc5 x:0 0 0 2026-03-09T19:27:25.246 INFO:tasks.workunit.client.0.vm07.stdout:0/327: truncate d0/d6/d13/d1c/d50/f60 994832 0 2026-03-09T19:27:25.247 INFO:tasks.workunit.client.1.vm08.stdout:8/558: 
chown de/d47/faa 22 1 2026-03-09T19:27:25.247 INFO:tasks.workunit.client.0.vm07.stdout:6/310: symlink d0/d1/db/d24/l7a 0 2026-03-09T19:27:25.251 INFO:tasks.workunit.client.1.vm08.stdout:1/738: mknod d9/da/d2d/d62/cdc 0 2026-03-09T19:27:25.254 INFO:tasks.workunit.client.0.vm07.stdout:0/328: dwrite d0/d6/d13/f6c [0,4194304] 0 2026-03-09T19:27:25.256 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:25 vm07.local ceph-mon[48545]: from='client.14682 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:25.256 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:25 vm07.local ceph-mon[48545]: from='client.24441 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:25.256 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:25 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/2795042528' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:27:25.256 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:25 vm07.local ceph-mon[48545]: from='client.? 
192.168.123.107:0/415814876' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:27:25.256 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.255+0000 7f835733a640 1 -- 192.168.123.107:0/872619913 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8350072370 msgr2=0x7f835010c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:25.256 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.255+0000 7f835733a640 1 --2- 192.168.123.107:0/872619913 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8350072370 0x7f835010c590 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f83440099b0 tx=0x7f834402f240 comp rx=0 tx=0).stop 2026-03-09T19:27:25.256 INFO:tasks.workunit.client.0.vm07.stdout:8/389: truncate d7/d9/d10/f1b 2671706 0 2026-03-09T19:27:25.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.257+0000 7f835733a640 1 -- 192.168.123.107:0/872619913 shutdown_connections 2026-03-09T19:27:25.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.257+0000 7f835733a640 1 --2- 192.168.123.107:0/872619913 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8350072370 0x7f835010c590 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:25.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.257+0000 7f835733a640 1 --2- 192.168.123.107:0/872619913 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f83500719a0 0x7f8350071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:25.257 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.257+0000 7f835733a640 1 -- 192.168.123.107:0/872619913 >> 192.168.123.107:0/872619913 conn(0x7f835006d4f0 msgr2=0x7f835006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:25.257 INFO:tasks.workunit.client.0.vm07.stdout:7/389: write d0/d4/d5/d26/f42 [636024,54309] 0 
2026-03-09T19:27:25.259 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.258+0000 7f835733a640 1 -- 192.168.123.107:0/872619913 shutdown_connections 2026-03-09T19:27:25.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.260+0000 7f835733a640 1 -- 192.168.123.107:0/872619913 wait complete. 2026-03-09T19:27:25.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.261+0000 7f835733a640 1 Processor -- start 2026-03-09T19:27:25.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.262+0000 7f835733a640 1 -- start start 2026-03-09T19:27:25.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.262+0000 7f835733a640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83500719a0 0x7f8350115900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:25.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.262+0000 7f835733a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8350072370 0x7f8350115e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:25.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.262+0000 7f835733a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8350117340 con 0x7f8350072370 2026-03-09T19:27:25.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.262+0000 7f835733a640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f83501174b0 con 0x7f83500719a0 2026-03-09T19:27:25.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.263+0000 7f8355b37640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8350072370 0x7f8350115e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:25.262 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.263+0000 7f8356338640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83500719a0 0x7f8350115900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:25.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.263+0000 7f8356338640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83500719a0 0x7f8350115900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:35444/0 (socket says 192.168.123.107:35444) 2026-03-09T19:27:25.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.263+0000 7f8356338640 1 -- 192.168.123.107:0/1353960079 learned_addr learned my addr 192.168.123.107:0/1353960079 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:25.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.265+0000 7f8356338640 1 -- 192.168.123.107:0/1353960079 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8350072370 msgr2=0x7f8350115e40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:25.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.265+0000 7f8356338640 1 --2- 192.168.123.107:0/1353960079 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8350072370 0x7f8350115e40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:25.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.265+0000 7f8356338640 1 -- 192.168.123.107:0/1353960079 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8344009660 con 0x7f83500719a0 2026-03-09T19:27:25.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.265+0000 7f8356338640 1 --2- 
192.168.123.107:0/1353960079 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83500719a0 0x7f8350115900 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f834000d900 tx=0x7f834000ddd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:25.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.266+0000 7f833f7fe640 1 -- 192.168.123.107:0/1353960079 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8340004490 con 0x7f83500719a0 2026-03-09T19:27:25.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.266+0000 7f835733a640 1 -- 192.168.123.107:0/1353960079 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8350116440 con 0x7f83500719a0 2026-03-09T19:27:25.266 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.266+0000 7f835733a640 1 -- 192.168.123.107:0/1353960079 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8350116770 con 0x7f83500719a0 2026-03-09T19:27:25.266 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.267+0000 7f833f7fe640 1 -- 192.168.123.107:0/1353960079 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8340007620 con 0x7f83500719a0 2026-03-09T19:27:25.266 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.267+0000 7f833f7fe640 1 -- 192.168.123.107:0/1353960079 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8340002e60 con 0x7f83500719a0 2026-03-09T19:27:25.268 INFO:tasks.workunit.client.0.vm07.stdout:4/349: creat d3/d11/f79 x:0 0 0 2026-03-09T19:27:25.268 INFO:tasks.workunit.client.1.vm08.stdout:3/655: mknod d0/d6/de/d1b/d16/ccd 0 2026-03-09T19:27:25.269 INFO:tasks.workunit.client.0.vm07.stdout:4/350: dread - d3/d11/d2b/f69 zero size 2026-03-09T19:27:25.269 
INFO:tasks.workunit.client.1.vm08.stdout:7/654: dwrite d5/d14/dae/d3a/d42/d6a/f62 [0,4194304] 0 2026-03-09T19:27:25.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.271+0000 7f833f7fe640 1 -- 192.168.123.107:0/1353960079 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 98480+0+0 (secure 0 0 0) 0x7f8340010550 con 0x7f83500719a0 2026-03-09T19:27:25.271 INFO:tasks.workunit.client.0.vm07.stdout:3/404: dwrite d1/f68 [0,4194304] 0 2026-03-09T19:27:25.273 INFO:tasks.workunit.client.0.vm07.stdout:3/405: chown d1/d6/f19 7670 1 2026-03-09T19:27:25.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.277+0000 7f833f7fe640 1 --2- 192.168.123.107:0/1353960079 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f83280761c0 0x7f8328078680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:25.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.278+0000 7f833f7fe640 1 -- 192.168.123.107:0/1353960079 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f8340098900 con 0x7f83500719a0 2026-03-09T19:27:25.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.278+0000 7f8355b37640 1 --2- 192.168.123.107:0/1353960079 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f83280761c0 0x7f8328078680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:25.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.278+0000 7f835733a640 1 -- 192.168.123.107:0/1353960079 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8318005350 con 0x7f83500719a0 2026-03-09T19:27:25.278 INFO:tasks.workunit.client.0.vm07.stdout:3/406: stat d1/d6 0 2026-03-09T19:27:25.278 
INFO:tasks.workunit.client.0.vm07.stdout:9/412: mkdir d0/d6/d57/d8f 0 2026-03-09T19:27:25.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.282+0000 7f8355b37640 1 --2- 192.168.123.107:0/1353960079 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f83280761c0 0x7f8328078680 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f8344002c30 tx=0x7f834403a040 comp rx=0 tx=0).ready entity=mgr.14227 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:25.298 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.296+0000 7f833f7fe640 1 -- 192.168.123.107:0/1353960079 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+186382 (secure 0 0 0) 0x7f8340061e40 con 0x7f83500719a0 2026-03-09T19:27:25.321 INFO:tasks.workunit.client.1.vm08.stdout:0/611: unlink dd/d22/d24/d49/d92/c9a 0 2026-03-09T19:27:25.321 INFO:tasks.workunit.client.1.vm08.stdout:0/612: stat dd/d31 0 2026-03-09T19:27:25.332 INFO:tasks.workunit.client.0.vm07.stdout:1/358: write d1/d3/d21/f47 [1120765,7780] 0 2026-03-09T19:27:25.333 INFO:tasks.workunit.client.0.vm07.stdout:1/359: dread - d1/db/d31/f78 zero size 2026-03-09T19:27:25.334 INFO:tasks.workunit.client.0.vm07.stdout:1/360: write d1/d11/d37/d5d/d50/f63 [1026780,114610] 0 2026-03-09T19:27:25.335 INFO:tasks.workunit.client.1.vm08.stdout:2/543: dwrite d3/d4/f61 [0,4194304] 0 2026-03-09T19:27:25.335 INFO:tasks.workunit.client.0.vm07.stdout:1/361: write d1/d11/d37/f40 [495734,11885] 0 2026-03-09T19:27:25.337 INFO:tasks.workunit.client.0.vm07.stdout:1/362: write d1/db/d31/d56/f6a [2076842,85338] 0 2026-03-09T19:27:25.339 INFO:tasks.workunit.client.0.vm07.stdout:1/363: stat d1/d11/d37/d5d/d50 0 2026-03-09T19:27:25.346 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:25 vm08.local ceph-mon[57794]: from='client.14682 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 
2026-03-09T19:27:25.346 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:25 vm08.local ceph-mon[57794]: from='client.24441 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:25.346 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:25 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/2795042528' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:27:25.346 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:25 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/415814876' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:27:25.362 INFO:tasks.workunit.client.0.vm07.stdout:6/311: mkdir d0/d1/db/d17/d4c/d7b 0 2026-03-09T19:27:25.369 INFO:tasks.workunit.client.1.vm08.stdout:1/739: creat d9/d40/d49/d9e/fdd x:0 0 0 2026-03-09T19:27:25.385 INFO:tasks.workunit.client.0.vm07.stdout:2/407: rename d3/l4d to d3/d49/l89 0 2026-03-09T19:27:25.389 INFO:tasks.workunit.client.1.vm08.stdout:8/559: dread f1 [0,4194304] 0 2026-03-09T19:27:25.393 INFO:tasks.workunit.client.0.vm07.stdout:0/329: chown d0/d6/d13/d33/c44 424 1 2026-03-09T19:27:25.394 INFO:tasks.workunit.client.1.vm08.stdout:9/579: dwrite d0/d1b/f9f [0,4194304] 0 2026-03-09T19:27:25.398 INFO:tasks.workunit.client.0.vm07.stdout:8/390: creat d7/d9/d37/d34/f91 x:0 0 0 2026-03-09T19:27:25.399 INFO:tasks.workunit.client.0.vm07.stdout:5/367: link d3/d1a/l27 d3/d1a/d28/d36/l7b 0 2026-03-09T19:27:25.400 INFO:tasks.workunit.client.1.vm08.stdout:3/656: truncate d0/d6/de/d1b/d16/d17/f8c 620202 0 2026-03-09T19:27:25.404 INFO:tasks.workunit.client.1.vm08.stdout:9/580: dread d0/d1b/d97/d48/d5e/fb7 [0,4194304] 0 2026-03-09T19:27:25.405 INFO:tasks.workunit.client.0.vm07.stdout:8/391: dread d7/d1d/f4d [0,4194304] 0 2026-03-09T19:27:25.410 INFO:tasks.workunit.client.1.vm08.stdout:5/555: symlink d16/d1e/d30/lb5 0 2026-03-09T19:27:25.411 INFO:tasks.workunit.client.0.vm07.stdout:4/351: creat 
d3/d11/d16/d2f/d22/f7a x:0 0 0 2026-03-09T19:27:25.412 INFO:tasks.workunit.client.0.vm07.stdout:4/352: chown d3/d4f/d56/d5f/f72 166977 1 2026-03-09T19:27:25.412 INFO:tasks.workunit.client.1.vm08.stdout:6/613: creat d3/db/d43/d69/da0/fdf x:0 0 0 2026-03-09T19:27:25.412 INFO:tasks.workunit.client.1.vm08.stdout:3/657: dwrite d0/d52/f8a [0,4194304] 0 2026-03-09T19:27:25.414 INFO:tasks.workunit.client.0.vm07.stdout:4/353: truncate d3/d11/d16/d2f/d22/f7a 663936 0 2026-03-09T19:27:25.430 INFO:tasks.workunit.client.1.vm08.stdout:2/544: dread - d3/d4/d23/d2c/d39/d5e/de/d8b/f7e zero size 2026-03-09T19:27:25.434 INFO:tasks.workunit.client.1.vm08.stdout:2/545: dread d3/d9/d4a/f89 [0,4194304] 0 2026-03-09T19:27:25.440 INFO:tasks.workunit.client.1.vm08.stdout:1/740: dread d9/da/d17/fa9 [0,4194304] 0 2026-03-09T19:27:25.441 INFO:tasks.workunit.client.1.vm08.stdout:8/560: rmdir de/d25/d33 39 2026-03-09T19:27:25.447 INFO:tasks.workunit.client.1.vm08.stdout:9/581: creat d0/d2/d14/d98/fc5 x:0 0 0 2026-03-09T19:27:25.448 INFO:tasks.workunit.client.1.vm08.stdout:9/582: readlink d0/d1b/d4e/l95 0 2026-03-09T19:27:25.450 INFO:tasks.workunit.client.1.vm08.stdout:5/556: dread - d16/d1e/f5a zero size 2026-03-09T19:27:25.454 INFO:tasks.workunit.client.1.vm08.stdout:6/614: creat d3/d34/da9/da4/fe0 x:0 0 0 2026-03-09T19:27:25.461 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.462+0000 7f835733a640 1 -- 192.168.123.107:0/1353960079 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f83180051c0 con 0x7f83500719a0 2026-03-09T19:27:25.461 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.462+0000 7f833f7fe640 1 -- 192.168.123.107:0/1353960079 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f83400617e0 con 0x7f83500719a0 2026-03-09T19:27:25.461 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 
2026-03-09T19:27:25.464 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.465+0000 7f835733a640 1 -- 192.168.123.107:0/1353960079 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f83280761c0 msgr2=0x7f8328078680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:25.464 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.465+0000 7f835733a640 1 --2- 192.168.123.107:0/1353960079 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f83280761c0 0x7f8328078680 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f8344002c30 tx=0x7f834403a040 comp rx=0 tx=0).stop 2026-03-09T19:27:25.464 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.465+0000 7f835733a640 1 -- 192.168.123.107:0/1353960079 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83500719a0 msgr2=0x7f8350115900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:25.464 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.465+0000 7f835733a640 1 --2- 192.168.123.107:0/1353960079 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83500719a0 0x7f8350115900 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f834000d900 tx=0x7f834000ddd0 comp rx=0 tx=0).stop 2026-03-09T19:27:25.464 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.465+0000 7f835733a640 1 -- 192.168.123.107:0/1353960079 shutdown_connections 2026-03-09T19:27:25.464 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.465+0000 7f835733a640 1 --2- 192.168.123.107:0/1353960079 >> [v2:192.168.123.107:6800/2885771920,v1:192.168.123.107:6801/2885771920] conn(0x7f83280761c0 0x7f8328078680 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:25.464 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.465+0000 7f835733a640 1 --2- 192.168.123.107:0/1353960079 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8350072370 0x7f8350115e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:25.464 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.465+0000 7f835733a640 1 --2- 192.168.123.107:0/1353960079 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83500719a0 0x7f8350115900 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:25.464 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.465+0000 7f835733a640 1 -- 192.168.123.107:0/1353960079 >> 192.168.123.107:0/1353960079 conn(0x7f835006d4f0 msgr2=0x7f835010a770 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:25.465 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.466+0000 7f835733a640 1 -- 192.168.123.107:0/1353960079 shutdown_connections 2026-03-09T19:27:25.465 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:25.466+0000 7f835733a640 1 -- 192.168.123.107:0/1353960079 wait complete. 
2026-03-09T19:27:25.475 INFO:tasks.workunit.client.1.vm08.stdout:4/581: getdents da/d10/d16/d28/d2f/d4f/d64 0 2026-03-09T19:27:25.478 INFO:tasks.workunit.client.1.vm08.stdout:7/655: write d5/d14/dae/d1c/d73/fac [925115,33484] 0 2026-03-09T19:27:25.501 INFO:tasks.workunit.client.1.vm08.stdout:8/561: rmdir de/d1d/d69 39 2026-03-09T19:27:25.509 INFO:tasks.workunit.client.1.vm08.stdout:1/741: truncate d9/da/d17/fa9 1298973 0 2026-03-09T19:27:25.510 INFO:tasks.workunit.client.1.vm08.stdout:2/546: write d3/d4/d23/d2c/d39/d5e/de/d18/f1a [7192150,85151] 0 2026-03-09T19:27:25.510 INFO:tasks.workunit.client.0.vm07.stdout:6/312: creat d0/d1/db/d17/d4c/f7c x:0 0 0 2026-03-09T19:27:25.511 INFO:tasks.workunit.client.1.vm08.stdout:3/658: chown d0/d6/f91 225 1 2026-03-09T19:27:25.511 INFO:tasks.workunit.client.1.vm08.stdout:2/547: chown d3/d4/d23/d2c/d39 1 1 2026-03-09T19:27:25.511 INFO:tasks.workunit.client.1.vm08.stdout:1/742: chown d9/da/d12/f98 3316 1 2026-03-09T19:27:25.511 INFO:tasks.workunit.client.1.vm08.stdout:9/583: creat d0/d1b/daa/fc6 x:0 0 0 2026-03-09T19:27:25.513 INFO:tasks.workunit.client.1.vm08.stdout:9/584: chown d0/d2/d14/d98/f85 390073443 1 2026-03-09T19:27:25.514 INFO:tasks.workunit.client.1.vm08.stdout:5/557: chown d16/d1e/c31 1025 1 2026-03-09T19:27:25.514 INFO:tasks.workunit.client.1.vm08.stdout:6/615: creat d3/d55/fe1 x:0 0 0 2026-03-09T19:27:25.518 INFO:tasks.workunit.client.0.vm07.stdout:2/408: truncate d3/dd/d16/f25 918330 0 2026-03-09T19:27:25.519 INFO:tasks.workunit.client.0.vm07.stdout:2/409: dread - d3/dd/d16/d29/d3c/d4c/f87 zero size 2026-03-09T19:27:25.527 INFO:tasks.workunit.client.0.vm07.stdout:0/330: mkdir d0/d6/d13/d1c/d11/d56/d71 0 2026-03-09T19:27:25.540 INFO:tasks.workunit.client.1.vm08.stdout:7/656: dread d5/fb [4194304,4194304] 0 2026-03-09T19:27:25.542 INFO:tasks.workunit.client.1.vm08.stdout:2/548: dread d3/d9/d79/f98 [0,4194304] 0 2026-03-09T19:27:25.545 INFO:tasks.workunit.client.1.vm08.stdout:4/582: mkdir da/d10/d26/d50/db0 0 
2026-03-09T19:27:25.545 INFO:tasks.workunit.client.1.vm08.stdout:6/616: dwrite d3/db/d43/fd3 [0,4194304] 0 2026-03-09T19:27:25.550 INFO:tasks.workunit.client.0.vm07.stdout:5/368: fdatasync d3/d1a/d28/d48/f50 0 2026-03-09T19:27:25.557 INFO:tasks.workunit.client.1.vm08.stdout:3/659: unlink d0/d6/de/c30 0 2026-03-09T19:27:25.558 INFO:tasks.workunit.client.0.vm07.stdout:5/369: dwrite d3/d1a/f17 [0,4194304] 0 2026-03-09T19:27:25.562 INFO:tasks.workunit.client.1.vm08.stdout:3/660: chown d0/d6/de/d1b/l49 4 1 2026-03-09T19:27:25.571 INFO:tasks.workunit.client.1.vm08.stdout:1/743: unlink d9/da/d2c/f58 0 2026-03-09T19:27:25.571 INFO:tasks.workunit.client.0.vm07.stdout:4/354: creat d3/d4f/d56/d5f/f7b x:0 0 0 2026-03-09T19:27:25.571 INFO:tasks.workunit.client.0.vm07.stdout:4/355: truncate d3/d11/f79 103596 0 2026-03-09T19:27:25.577 INFO:tasks.workunit.client.1.vm08.stdout:0/613: rename dd/d22/d63/l7d to dd/lc3 0 2026-03-09T19:27:25.579 INFO:tasks.workunit.client.0.vm07.stdout:1/364: creat d1/d11/d37/d3f/d7e/f7f x:0 0 0 2026-03-09T19:27:25.579 INFO:tasks.workunit.client.0.vm07.stdout:1/365: stat d1/db/d31/c46 0 2026-03-09T19:27:25.581 INFO:tasks.workunit.client.1.vm08.stdout:2/549: mkdir d3/d9/d79/dbc 0 2026-03-09T19:27:25.582 INFO:tasks.workunit.client.0.vm07.stdout:2/410: mkdir d3/dd/d16/d29/d2d/d45/d85/d8a 0 2026-03-09T19:27:25.590 INFO:tasks.workunit.client.1.vm08.stdout:4/583: rmdir da/d10/d16/d28/d4d 39 2026-03-09T19:27:25.590 INFO:tasks.workunit.client.1.vm08.stdout:9/585: getdents d0/d2/d14/d98/dbb 0 2026-03-09T19:27:25.590 INFO:tasks.workunit.client.0.vm07.stdout:7/390: creat d0/d4/d5/f85 x:0 0 0 2026-03-09T19:27:25.590 INFO:tasks.workunit.client.0.vm07.stdout:7/391: write d0/d4/d5/d26/f42 [4679722,99558] 0 2026-03-09T19:27:25.590 INFO:tasks.workunit.client.0.vm07.stdout:7/392: readlink d0/d4/d5/l6e 0 2026-03-09T19:27:25.595 INFO:tasks.workunit.client.1.vm08.stdout:1/744: creat d9/d11/d7a/d89/d8d/da3/fde x:0 0 0 2026-03-09T19:27:25.596 
INFO:tasks.workunit.client.1.vm08.stdout:5/558: symlink d16/d1e/d30/d8a/lb6 0 2026-03-09T19:27:25.598 INFO:tasks.workunit.client.0.vm07.stdout:5/370: symlink d3/d1a/d5d/l7c 0 2026-03-09T19:27:25.603 INFO:tasks.workunit.client.0.vm07.stdout:4/356: unlink d3/d11/d2b/d37/f4d 0 2026-03-09T19:27:25.637 INFO:tasks.workunit.client.1.vm08.stdout:4/584: stat da/d10/d16/d28/d4d 0 2026-03-09T19:27:25.637 INFO:tasks.workunit.client.1.vm08.stdout:9/586: creat d0/d1b/d97/d48/d5e/fc7 x:0 0 0 2026-03-09T19:27:25.637 INFO:tasks.workunit.client.1.vm08.stdout:3/661: fsync d0/d6/dad/fb6 0 2026-03-09T19:27:25.637 INFO:tasks.workunit.client.1.vm08.stdout:1/745: rmdir d9/da/d53/db3 39 2026-03-09T19:27:25.637 INFO:tasks.workunit.client.1.vm08.stdout:0/614: fdatasync dd/d22/d27/d2e/f39 0 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:4/357: truncate d3/d11/f74 1011441 0 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:4/358: read d3/f13 [2141723,48070] 0 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:9/413: link d0/db/d29/d4d/c8c d0/d6f/c90 0 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:6/313: mkdir d0/d1/db/d17/d4c/d7b/d7d 0 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:6/314: readlink d0/l67 0 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:6/315: chown d0/d4e/f78 11483 1 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:2/411: rename d3/dd/d16/d30/d64 to d3/dd/d16/d29/d2d/d45/d8b 0 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:0/331: mknod d0/c72 0 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:7/393: creat d0/d4/f86 x:0 0 0 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:3/407: link d1/d74/l6f d1/d1f/d16/d28/l7f 0 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:9/414: readlink d0/l19 0 2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:4/359: truncate d3/d11/d29/d34/f5c 816579 0 
2026-03-09T19:27:25.638 INFO:tasks.workunit.client.0.vm07.stdout:2/412: symlink d3/dd/d16/d29/d2d/d45/d8b/l8c 0 2026-03-09T19:27:25.640 INFO:tasks.workunit.client.0.vm07.stdout:2/413: chown d3/dd/d16/d30/d40/c4b 5245 1 2026-03-09T19:27:25.644 INFO:tasks.workunit.client.0.vm07.stdout:7/394: mkdir d0/d52/d54/d5a/d87 0 2026-03-09T19:27:25.645 INFO:tasks.workunit.client.1.vm08.stdout:2/550: dread d3/f7 [0,4194304] 0 2026-03-09T19:27:25.646 INFO:tasks.workunit.client.0.vm07.stdout:8/392: link d7/d9/l12 d7/d9/d57/d7c/l92 0 2026-03-09T19:27:25.646 INFO:tasks.workunit.client.0.vm07.stdout:7/395: dread d0/d4/d5/d26/d3c/d58/f70 [0,4194304] 0 2026-03-09T19:27:25.651 INFO:tasks.workunit.client.0.vm07.stdout:9/415: fdatasync d0/f4 0 2026-03-09T19:27:25.653 INFO:tasks.workunit.client.0.vm07.stdout:4/360: creat d3/d4f/f7c x:0 0 0 2026-03-09T19:27:25.658 INFO:tasks.workunit.client.0.vm07.stdout:7/396: rename d0/d4/d5/d26/f42 to d0/d4/d5/d8/d41/d64/d74/f88 0 2026-03-09T19:27:25.687 INFO:tasks.workunit.client.0.vm07.stdout:7/397: readlink d0/d4/d5/l11 0 2026-03-09T19:27:25.687 INFO:tasks.workunit.client.0.vm07.stdout:3/408: mknod d1/d1f/d16/d28/d7c/c80 0 2026-03-09T19:27:25.688 INFO:tasks.workunit.client.0.vm07.stdout:3/409: chown d1/d6/dd/f2b 93 1 2026-03-09T19:27:25.688 INFO:tasks.workunit.client.0.vm07.stdout:0/332: rmdir d0/d6/d13/d1c/d11/d56/d71 0 2026-03-09T19:27:25.688 INFO:tasks.workunit.client.0.vm07.stdout:8/393: mknod d7/d9/d37/d45/c93 0 2026-03-09T19:27:25.688 INFO:tasks.workunit.client.0.vm07.stdout:0/333: dread d0/d6/d13/f6c [0,4194304] 0 2026-03-09T19:27:25.688 INFO:tasks.workunit.client.0.vm07.stdout:2/414: getdents d3/dd/d16/d29/d2d/d45/d3b/d44 0 2026-03-09T19:27:25.688 INFO:tasks.workunit.client.0.vm07.stdout:2/415: chown d3/d49 5592 1 2026-03-09T19:27:25.688 INFO:tasks.workunit.client.0.vm07.stdout:0/334: chown d0/d6/d13/d17/c26 438130 1 2026-03-09T19:27:25.688 INFO:tasks.workunit.client.0.vm07.stdout:2/416: chown d3/dd/d16/d29/d2d/d45/d85 3071965 1 
2026-03-09T19:27:25.688 INFO:tasks.workunit.client.1.vm08.stdout:4/585: fsync da/d10/d1b/f37 0 2026-03-09T19:27:25.689 INFO:tasks.workunit.client.1.vm08.stdout:1/746: symlink d9/da/d2d/d4e/ldf 0 2026-03-09T19:27:25.690 INFO:tasks.workunit.client.1.vm08.stdout:2/551: symlink d3/d4/d3e/d4e/lbd 0 2026-03-09T19:27:25.690 INFO:tasks.workunit.client.0.vm07.stdout:8/394: dwrite d7/f40 [4194304,4194304] 0 2026-03-09T19:27:25.691 INFO:tasks.workunit.client.1.vm08.stdout:2/552: chown d3/d4/d23/d2c/d39/d5e/de/c6f 1335 1 2026-03-09T19:27:25.691 INFO:tasks.workunit.client.0.vm07.stdout:8/395: stat d7/d16/l21 0 2026-03-09T19:27:25.693 INFO:tasks.workunit.client.0.vm07.stdout:2/417: dwrite d3/f1a [0,4194304] 0 2026-03-09T19:27:25.694 INFO:tasks.workunit.client.1.vm08.stdout:9/587: mkdir d0/d2/dc8 0 2026-03-09T19:27:25.694 INFO:tasks.workunit.client.0.vm07.stdout:0/335: rename d0/d6/d13/d1c/d61/d69/f6b to d0/d6/d13/d1c/d50/f73 0 2026-03-09T19:27:25.695 INFO:tasks.workunit.client.1.vm08.stdout:3/662: creat d0/d6/d93/dcb/fce x:0 0 0 2026-03-09T19:27:25.695 INFO:tasks.workunit.client.1.vm08.stdout:3/663: stat d0/d8/d24/c6f 0 2026-03-09T19:27:25.697 INFO:tasks.workunit.client.1.vm08.stdout:0/615: creat dd/d22/d24/d49/d50/db3/fc4 x:0 0 0 2026-03-09T19:27:25.703 INFO:tasks.workunit.client.1.vm08.stdout:1/747: dread d9/da/f6f [0,4194304] 0 2026-03-09T19:27:25.707 INFO:tasks.workunit.client.0.vm07.stdout:0/336: dread d0/d6/d13/d1c/d11/f2e [0,4194304] 0 2026-03-09T19:27:25.714 INFO:tasks.workunit.client.1.vm08.stdout:8/562: sync 2026-03-09T19:27:25.730 INFO:tasks.workunit.client.0.vm07.stdout:1/366: sync 2026-03-09T19:27:25.760 INFO:tasks.workunit.client.0.vm07.stdout:1/367: dwrite d1/d3/d21/f47 [0,4194304] 0 2026-03-09T19:27:25.760 INFO:tasks.workunit.client.0.vm07.stdout:1/368: dread d1/d11/f1b [0,4194304] 0 2026-03-09T19:27:25.760 INFO:tasks.workunit.client.0.vm07.stdout:1/369: write d1/f76 [572830,48418] 0 2026-03-09T19:27:25.760 INFO:tasks.workunit.client.0.vm07.stdout:1/370: mknod 
d1/d11/c80 0 2026-03-09T19:27:25.760 INFO:tasks.workunit.client.1.vm08.stdout:3/664: creat d0/d6/de/d1b/d16/fcf x:0 0 0 2026-03-09T19:27:25.760 INFO:tasks.workunit.client.1.vm08.stdout:3/665: dwrite d0/d6/de/d1b/fc0 [0,4194304] 0 2026-03-09T19:27:25.760 INFO:tasks.workunit.client.1.vm08.stdout:8/563: mknod de/d7c/cc6 0 2026-03-09T19:27:25.760 INFO:tasks.workunit.client.1.vm08.stdout:1/748: dread - d9/d11/faf zero size 2026-03-09T19:27:25.760 INFO:tasks.workunit.client.1.vm08.stdout:9/588: creat d0/d2/dc8/fc9 x:0 0 0 2026-03-09T19:27:25.760 INFO:tasks.workunit.client.1.vm08.stdout:3/666: dread d0/d6/f91 [0,4194304] 0 2026-03-09T19:27:25.767 INFO:tasks.workunit.client.1.vm08.stdout:3/667: dread - d0/d6/fc3 zero size 2026-03-09T19:27:25.767 INFO:tasks.workunit.client.1.vm08.stdout:3/668: chown d0/d4b/f74 25536 1 2026-03-09T19:27:25.768 INFO:tasks.workunit.client.1.vm08.stdout:3/669: chown d0/d6 29548 1 2026-03-09T19:27:25.780 INFO:tasks.workunit.client.0.vm07.stdout:3/410: sync 2026-03-09T19:27:25.862 INFO:tasks.workunit.client.1.vm08.stdout:0/616: dread dd/fe [0,4194304] 0 2026-03-09T19:27:25.992 INFO:tasks.workunit.client.0.vm07.stdout:9/416: dread d0/db/d29/d2c/f34 [0,4194304] 0 2026-03-09T19:27:25.993 INFO:tasks.workunit.client.0.vm07.stdout:9/417: fsync d0/db/d29/d68/f8e 0 2026-03-09T19:27:26.008 INFO:tasks.workunit.client.0.vm07.stdout:9/418: sync 2026-03-09T19:27:26.012 INFO:tasks.workunit.client.0.vm07.stdout:1/371: dread d1/d3/f4e [0,4194304] 0 2026-03-09T19:27:26.017 INFO:tasks.workunit.client.0.vm07.stdout:1/372: dwrite d1/db/d31/d56/f6a [0,4194304] 0 2026-03-09T19:27:26.018 INFO:tasks.workunit.client.0.vm07.stdout:1/373: fdatasync d1/db/d31/f78 0 2026-03-09T19:27:26.019 INFO:tasks.workunit.client.0.vm07.stdout:1/374: chown d1/d11/d37/d3f/d45/le 172499749 1 2026-03-09T19:27:26.028 INFO:tasks.workunit.client.0.vm07.stdout:7/398: dread d0/d4/d5/d26/d3c/d58/f71 [0,4194304] 0 2026-03-09T19:27:26.029 INFO:tasks.workunit.client.0.vm07.stdout:7/399: stat d0/c28 0 
2026-03-09T19:27:26.072 INFO:tasks.workunit.client.0.vm07.stdout:5/371: dwrite d3/f4d [0,4194304] 0 2026-03-09T19:27:26.092 INFO:tasks.workunit.client.0.vm07.stdout:6/316: dwrite d0/d1/db/f43 [0,4194304] 0 2026-03-09T19:27:26.109 INFO:tasks.workunit.client.0.vm07.stdout:6/317: mknod d0/d13/d1e/c7e 0 2026-03-09T19:27:26.113 INFO:tasks.workunit.client.0.vm07.stdout:5/372: creat d3/dd/d26/f7d x:0 0 0 2026-03-09T19:27:26.113 INFO:tasks.workunit.client.0.vm07.stdout:5/373: chown d3/d1a/d28/d36 3 1 2026-03-09T19:27:26.153 INFO:tasks.workunit.client.1.vm08.stdout:6/617: creat d3/d68/fe2 x:0 0 0 2026-03-09T19:27:26.163 INFO:tasks.workunit.client.1.vm08.stdout:5/559: mknod d16/d1e/d8c/d99/da8/cb7 0 2026-03-09T19:27:26.172 INFO:tasks.workunit.client.1.vm08.stdout:6/618: mkdir d3/d34/dce/de3 0 2026-03-09T19:27:26.174 INFO:tasks.workunit.client.1.vm08.stdout:6/619: write d3/db/f8f [2093981,70437] 0 2026-03-09T19:27:26.175 INFO:tasks.workunit.client.1.vm08.stdout:5/560: dread d16/f34 [0,4194304] 0 2026-03-09T19:27:26.178 INFO:tasks.workunit.client.0.vm07.stdout:0/337: rename d0/d6/d13/d17/d19/f4d to d0/d6/d13/d17/d19/d57/d6a/f74 0 2026-03-09T19:27:26.179 INFO:tasks.workunit.client.1.vm08.stdout:5/561: read d16/d1e/f37 [800968,20040] 0 2026-03-09T19:27:26.186 INFO:tasks.workunit.client.1.vm08.stdout:5/562: dread d16/f4d [0,4194304] 0 2026-03-09T19:27:26.187 INFO:tasks.workunit.client.1.vm08.stdout:5/563: stat d16/d45/f6b 0 2026-03-09T19:27:26.193 INFO:tasks.workunit.client.1.vm08.stdout:5/564: dwrite d16/d45/fb1 [0,4194304] 0 2026-03-09T19:27:26.202 INFO:tasks.workunit.client.0.vm07.stdout:0/338: link d0/d6/d13/d1c/d11/f5f d0/d6/d13/d17/d19/d57/f75 0 2026-03-09T19:27:26.205 INFO:tasks.workunit.client.1.vm08.stdout:6/620: link d3/d15/f40 d3/fe4 0 2026-03-09T19:27:26.206 INFO:tasks.workunit.client.1.vm08.stdout:6/621: readlink d3/d34/d5c/da2/lcc 0 2026-03-09T19:27:26.215 INFO:tasks.workunit.client.0.vm07.stdout:0/339: chown d0/d6/d13/d1c/c1d 4062 1 2026-03-09T19:27:26.217 
INFO:tasks.workunit.client.0.vm07.stdout:7/400: dread d0/d4/d5/f20 [0,4194304] 0 2026-03-09T19:27:26.217 INFO:tasks.workunit.client.0.vm07.stdout:0/340: mkdir d0/d6/d13/d1c/d11/d76 0 2026-03-09T19:27:26.221 INFO:tasks.workunit.client.1.vm08.stdout:5/565: creat d16/fb8 x:0 0 0 2026-03-09T19:27:26.227 INFO:tasks.workunit.client.1.vm08.stdout:6/622: creat d3/d15/d8a/dc4/fe5 x:0 0 0 2026-03-09T19:27:26.235 INFO:tasks.workunit.client.1.vm08.stdout:5/566: truncate d16/d1e/f57 4245893 0 2026-03-09T19:27:26.243 INFO:tasks.workunit.client.0.vm07.stdout:0/341: creat d0/d6/d13/d17/d19/d58/f77 x:0 0 0 2026-03-09T19:27:26.260 INFO:tasks.workunit.client.0.vm07.stdout:7/401: getdents d0/d52/d54/d5a 0 2026-03-09T19:27:26.260 INFO:tasks.workunit.client.1.vm08.stdout:6/623: link d3/d34/c5a d3/d15/dc2/ce6 0 2026-03-09T19:27:26.276 INFO:tasks.workunit.client.1.vm08.stdout:6/624: symlink d3/d68/le7 0 2026-03-09T19:27:26.276 INFO:tasks.workunit.client.1.vm08.stdout:6/625: readlink d3/d15/l92 0 2026-03-09T19:27:26.284 INFO:tasks.workunit.client.0.vm07.stdout:8/396: dwrite d7/d50/f6d [0,4194304] 0 2026-03-09T19:27:26.286 INFO:tasks.workunit.client.0.vm07.stdout:2/418: dwrite d3/dd/d16/d29/f42 [0,4194304] 0 2026-03-09T19:27:26.293 INFO:tasks.workunit.client.1.vm08.stdout:8/564: write de/d1d/d21/d73/fa7 [744289,117691] 0 2026-03-09T19:27:26.295 INFO:tasks.workunit.client.0.vm07.stdout:2/419: write d3/f4 [1806694,93399] 0 2026-03-09T19:27:26.295 INFO:tasks.workunit.client.0.vm07.stdout:2/420: fdatasync d3/f1a 0 2026-03-09T19:27:26.301 INFO:tasks.workunit.client.1.vm08.stdout:8/565: symlink de/d1d/d4f/lc7 0 2026-03-09T19:27:26.301 INFO:tasks.workunit.client.0.vm07.stdout:8/397: read d7/d9/f81 [1638272,43822] 0 2026-03-09T19:27:26.303 INFO:tasks.workunit.client.0.vm07.stdout:8/398: write d7/d9/d37/f85 [554937,111764] 0 2026-03-09T19:27:26.304 INFO:tasks.workunit.client.1.vm08.stdout:3/670: truncate d0/d6/de/f86 1928839 0 2026-03-09T19:27:26.305 INFO:tasks.workunit.client.1.vm08.stdout:3/671: 
read - d0/d6/de/d1b/d16/fcf zero size 2026-03-09T19:27:26.308 INFO:tasks.workunit.client.1.vm08.stdout:9/589: dwrite d0/d1b/d97/d48/d6f/f84 [0,4194304] 0 2026-03-09T19:27:26.310 INFO:tasks.workunit.client.1.vm08.stdout:9/590: chown d0/d1b/d4e/da7 124298975 1 2026-03-09T19:27:26.322 INFO:tasks.workunit.client.0.vm07.stdout:3/411: truncate d1/d1f/f1a 3040969 0 2026-03-09T19:27:26.325 INFO:tasks.workunit.client.1.vm08.stdout:3/672: unlink d0/d6/de/d1b/l8e 0 2026-03-09T19:27:26.329 INFO:tasks.workunit.client.1.vm08.stdout:0/617: dwrite dd/f6d [0,4194304] 0 2026-03-09T19:27:26.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:26 vm08.local ceph-mon[57794]: pgmap v167: 65 pgs: 65 active+clean; 2.0 GiB data, 7.4 GiB used, 113 GiB / 120 GiB avail; 31 MiB/s rd, 109 MiB/s wr, 207 op/s 2026-03-09T19:27:26.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:26 vm08.local ceph-mon[57794]: from='client.24453 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:26.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:26 vm08.local ceph-mon[57794]: from='client.? 
192.168.123.107:0/1353960079' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:27:26.355 INFO:tasks.workunit.client.1.vm08.stdout:0/618: symlink dd/d22/d63/d93/lc5 0 2026-03-09T19:27:26.362 INFO:tasks.workunit.client.0.vm07.stdout:1/375: dwrite d1/f2f [4194304,4194304] 0 2026-03-09T19:27:26.405 INFO:tasks.workunit.client.0.vm07.stdout:8/399: symlink d7/d30/d75/l94 0 2026-03-09T19:27:26.406 INFO:tasks.workunit.client.0.vm07.stdout:8/400: dread - d7/d30/d75/f88 zero size 2026-03-09T19:27:26.409 INFO:tasks.workunit.client.0.vm07.stdout:1/376: rename d1/d11/d37/d3f/d6e/f7c to d1/db/f81 0 2026-03-09T19:27:26.409 INFO:tasks.workunit.client.0.vm07.stdout:1/377: read d1/d3/f12 [1538831,59939] 0 2026-03-09T19:27:26.427 INFO:tasks.workunit.client.0.vm07.stdout:8/401: mknod d7/d9/d57/c95 0 2026-03-09T19:27:26.430 INFO:tasks.workunit.client.0.vm07.stdout:8/402: creat d7/d1d/f96 x:0 0 0 2026-03-09T19:27:26.432 INFO:tasks.workunit.client.0.vm07.stdout:8/403: readlink d7/d9/d37/l47 0 2026-03-09T19:27:26.432 INFO:tasks.workunit.client.0.vm07.stdout:5/374: write d3/f4e [533393,120853] 0 2026-03-09T19:27:26.435 INFO:tasks.workunit.client.0.vm07.stdout:6/318: dwrite d0/d2d/f4a [0,4194304] 0 2026-03-09T19:27:26.438 INFO:tasks.workunit.client.0.vm07.stdout:5/375: truncate d3/dd/d26/d3f/d47/d56/f65 824727 0 2026-03-09T19:27:26.439 INFO:tasks.workunit.client.0.vm07.stdout:5/376: stat d3/d1a/d28/d48/c74 0 2026-03-09T19:27:26.445 INFO:tasks.workunit.client.0.vm07.stdout:1/378: rename d1/d3/d52/f72 to d1/d11/d37/d3f/f82 0 2026-03-09T19:27:26.458 INFO:tasks.workunit.client.0.vm07.stdout:7/402: creat d0/d4/d5/d8/d41/f89 x:0 0 0 2026-03-09T19:27:26.458 INFO:tasks.workunit.client.0.vm07.stdout:6/319: mkdir d0/d4e/d7f 0 2026-03-09T19:27:26.459 INFO:tasks.workunit.client.0.vm07.stdout:6/320: dread - d0/d4e/f78 zero size 2026-03-09T19:27:26.468 INFO:tasks.workunit.client.0.vm07.stdout:1/379: rename d1/d11/f1b to d1/d11/f83 0 2026-03-09T19:27:26.472 
INFO:tasks.workunit.client.0.vm07.stdout:1/380: dread - d1/d11/d37/d5d/f59 zero size 2026-03-09T19:27:26.478 INFO:tasks.workunit.client.1.vm08.stdout:9/591: creat d0/d1b/d97/fca x:0 0 0 2026-03-09T19:27:26.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:26 vm07.local ceph-mon[48545]: pgmap v167: 65 pgs: 65 active+clean; 2.0 GiB data, 7.4 GiB used, 113 GiB / 120 GiB avail; 31 MiB/s rd, 109 MiB/s wr, 207 op/s 2026-03-09T19:27:26.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:26 vm07.local ceph-mon[48545]: from='client.24453 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:26.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:26 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/1353960079' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:27:26.478 INFO:tasks.workunit.client.1.vm08.stdout:9/592: chown d0/d2/d14/ca6 15210 1 2026-03-09T19:27:26.479 INFO:tasks.workunit.client.0.vm07.stdout:5/377: link d3/dd/d26/d2d/c3d d3/d1a/d5a/c7e 0 2026-03-09T19:27:26.480 INFO:tasks.workunit.client.0.vm07.stdout:7/403: rename d0/d4/d5/d26/d32/l3f to d0/l8a 0 2026-03-09T19:27:26.483 INFO:tasks.workunit.client.0.vm07.stdout:5/378: dwrite d3/d1a/d28/d48/f69 [0,4194304] 0 2026-03-09T19:27:26.516 INFO:tasks.workunit.client.1.vm08.stdout:9/593: mkdir d0/d2/d14/dcb 0 2026-03-09T19:27:26.516 INFO:tasks.workunit.client.1.vm08.stdout:9/594: truncate d0/d1b/d68/d7f/d8c/fbc 258220 0 2026-03-09T19:27:26.516 INFO:tasks.workunit.client.1.vm08.stdout:9/595: read - d0/d2/d14/d98/f9d zero size 2026-03-09T19:27:26.516 INFO:tasks.workunit.client.0.vm07.stdout:5/379: dwrite d3/dd/d26/d3f/d47/d56/f75 [0,4194304] 0 2026-03-09T19:27:26.516 INFO:tasks.workunit.client.0.vm07.stdout:5/380: readlink d3/dd/d26/d2c/l3a 0 2026-03-09T19:27:26.517 INFO:tasks.workunit.client.0.vm07.stdout:5/381: chown d3/dd/d26/d2d/d79 2306005 1 2026-03-09T19:27:26.517 
INFO:tasks.workunit.client.0.vm07.stdout:5/382: write d3/dd/d26/f7d [956484,81473] 0 2026-03-09T19:27:26.517 INFO:tasks.workunit.client.0.vm07.stdout:5/383: write d3/dd/f58 [238795,17845] 0 2026-03-09T19:27:26.517 INFO:tasks.workunit.client.0.vm07.stdout:7/404: unlink d0/d4/d5/d8/d41/d64/c77 0 2026-03-09T19:27:26.518 INFO:tasks.workunit.client.0.vm07.stdout:7/405: dread d0/d4/d5/d8/d1a/f4d [0,4194304] 0 2026-03-09T19:27:26.521 INFO:tasks.workunit.client.0.vm07.stdout:4/361: creat d3/d11/f7d x:0 0 0 2026-03-09T19:27:26.536 INFO:tasks.workunit.client.0.vm07.stdout:5/384: fdatasync d3/d1a/d28/f2e 0 2026-03-09T19:27:26.539 INFO:tasks.workunit.client.0.vm07.stdout:5/385: creat d3/dd/d26/d2d/d60/f7f x:0 0 0 2026-03-09T19:27:26.541 INFO:tasks.workunit.client.0.vm07.stdout:5/386: creat d3/d1a/d5d/f80 x:0 0 0 2026-03-09T19:27:26.541 INFO:tasks.workunit.client.0.vm07.stdout:5/387: chown d3/dd/l13 46459 1 2026-03-09T19:27:26.542 INFO:tasks.workunit.client.0.vm07.stdout:5/388: write d3/d1a/d5d/f80 [678263,37400] 0 2026-03-09T19:27:26.552 INFO:tasks.workunit.client.0.vm07.stdout:7/406: dread d0/f1 [0,4194304] 0 2026-03-09T19:27:26.558 INFO:tasks.workunit.client.0.vm07.stdout:7/407: mknod d0/d4/d5/d8/d41/c8b 0 2026-03-09T19:27:26.564 INFO:tasks.workunit.client.0.vm07.stdout:7/408: chown d0/d4/l4b 20617 1 2026-03-09T19:27:26.567 INFO:tasks.workunit.client.0.vm07.stdout:7/409: link d0/d4/d5/d8/d41/l51 d0/d4/d5/d8/d1a/l8c 0 2026-03-09T19:27:26.612 INFO:tasks.workunit.client.0.vm07.stdout:0/342: rmdir d0/d6/d13/d1c/d11 39 2026-03-09T19:27:26.620 INFO:tasks.workunit.client.1.vm08.stdout:6/626: write d3/f9 [3164087,4848] 0 2026-03-09T19:27:26.625 INFO:tasks.workunit.client.1.vm08.stdout:5/567: dwrite d16/d1e/f5b [0,4194304] 0 2026-03-09T19:27:26.628 INFO:tasks.workunit.client.1.vm08.stdout:5/568: chown d16/d1e/d30/l52 32571764 1 2026-03-09T19:27:26.633 INFO:tasks.workunit.client.1.vm08.stdout:8/566: write de/d1d/d2e/f9f [4430574,101399] 0 2026-03-09T19:27:26.638 
INFO:tasks.workunit.client.0.vm07.stdout:9/419: creat d0/db/f91 x:0 0 0 2026-03-09T19:27:26.653 INFO:tasks.workunit.client.0.vm07.stdout:0/343: mkdir d0/d6/d13/d1c/d11/d56/d78 0 2026-03-09T19:27:26.654 INFO:tasks.workunit.client.0.vm07.stdout:0/344: chown d0/f65 4981 1 2026-03-09T19:27:26.654 INFO:tasks.workunit.client.0.vm07.stdout:4/362: creat d3/d11/d2b/f7e x:0 0 0 2026-03-09T19:27:26.655 INFO:tasks.workunit.client.0.vm07.stdout:4/363: chown d3/d11/d16/c65 428048815 1 2026-03-09T19:27:26.658 INFO:tasks.workunit.client.1.vm08.stdout:0/619: dwrite dd/d22/d7b/f83 [0,4194304] 0 2026-03-09T19:27:26.659 INFO:tasks.workunit.client.0.vm07.stdout:9/420: sync 2026-03-09T19:27:26.659 INFO:tasks.workunit.client.0.vm07.stdout:9/421: readlink d0/d17/l50 0 2026-03-09T19:27:26.661 INFO:tasks.workunit.client.0.vm07.stdout:9/422: dread d0/db/d29/d2c/d36/f3c [0,4194304] 0 2026-03-09T19:27:26.664 INFO:tasks.workunit.client.0.vm07.stdout:3/412: write d1/d6/dd/f2b [3922317,65934] 0 2026-03-09T19:27:26.669 INFO:tasks.workunit.client.1.vm08.stdout:7/657: rename d5/d14/dae/c1d to d5/cdb 0 2026-03-09T19:27:26.673 INFO:tasks.workunit.client.0.vm07.stdout:6/321: symlink d0/d1/db/l80 0 2026-03-09T19:27:26.678 INFO:tasks.workunit.client.0.vm07.stdout:0/345: rename d0/d6/d13/d1c/d11/f15 to d0/d6/f79 0 2026-03-09T19:27:26.679 INFO:tasks.workunit.client.0.vm07.stdout:2/421: write d3/dd/d16/f25 [393271,50121] 0 2026-03-09T19:27:26.680 INFO:tasks.workunit.client.0.vm07.stdout:4/364: creat d3/d4f/d56/f7f x:0 0 0 2026-03-09T19:27:26.681 INFO:tasks.workunit.client.0.vm07.stdout:0/346: write d0/d6/d13/d1c/d61/f63 [466383,38720] 0 2026-03-09T19:27:26.691 INFO:tasks.workunit.client.1.vm08.stdout:3/673: creat d0/d52/fd0 x:0 0 0 2026-03-09T19:27:26.696 INFO:tasks.workunit.client.0.vm07.stdout:8/404: write d7/d30/f3e [1581717,24764] 0 2026-03-09T19:27:26.698 INFO:tasks.workunit.client.1.vm08.stdout:8/567: dread de/d25/d31/d82/d6d/d99/da5/db3/f2d [0,4194304] 0 2026-03-09T19:27:26.698 
INFO:tasks.workunit.client.1.vm08.stdout:0/620: truncate dd/d22/d27/d4f/f97 3501160 0 2026-03-09T19:27:26.712 INFO:tasks.workunit.client.1.vm08.stdout:7/658: creat d5/d14/d27/fdc x:0 0 0 2026-03-09T19:27:26.717 INFO:tasks.workunit.client.0.vm07.stdout:4/365: symlink d3/d4f/d56/d5f/l80 0 2026-03-09T19:27:26.717 INFO:tasks.workunit.client.0.vm07.stdout:7/410: symlink d0/d4/d5/d8/l8d 0 2026-03-09T19:27:26.721 INFO:tasks.workunit.client.0.vm07.stdout:6/322: dread d0/d1/db/f70 [0,4194304] 0 2026-03-09T19:27:26.728 INFO:tasks.workunit.client.0.vm07.stdout:4/366: dread d3/d11/d2b/f49 [0,4194304] 0 2026-03-09T19:27:26.729 INFO:tasks.workunit.client.0.vm07.stdout:1/381: write d1/d11/d37/d5d/f59 [241665,39928] 0 2026-03-09T19:27:26.732 INFO:tasks.workunit.client.0.vm07.stdout:9/423: symlink d0/db/d29/d32/l92 0 2026-03-09T19:27:26.732 INFO:tasks.workunit.client.0.vm07.stdout:0/347: creat d0/d6/d13/d17/d19/d57/d6a/f7a x:0 0 0 2026-03-09T19:27:26.739 INFO:tasks.workunit.client.1.vm08.stdout:3/674: mkdir d0/d6/de/d1b/d16/dd1 0 2026-03-09T19:27:26.739 INFO:tasks.workunit.client.1.vm08.stdout:9/596: write d0/d1b/f65 [608868,51500] 0 2026-03-09T19:27:26.745 INFO:tasks.workunit.client.0.vm07.stdout:5/389: truncate d3/f4d 4120616 0 2026-03-09T19:27:26.746 INFO:tasks.workunit.client.0.vm07.stdout:5/390: chown d3/d1a/d5a 4 1 2026-03-09T19:27:26.751 INFO:tasks.workunit.client.1.vm08.stdout:7/659: creat d5/d14/d2b/d4b/fdd x:0 0 0 2026-03-09T19:27:26.756 INFO:tasks.workunit.client.1.vm08.stdout:5/569: creat d16/d1e/d30/fb9 x:0 0 0 2026-03-09T19:27:26.760 INFO:tasks.workunit.client.0.vm07.stdout:7/411: dread d0/d4/d5/f36 [0,4194304] 0 2026-03-09T19:27:26.760 INFO:tasks.workunit.client.0.vm07.stdout:7/412: stat d0/d52/d54/f7d 0 2026-03-09T19:27:26.766 INFO:tasks.workunit.client.0.vm07.stdout:3/413: write d1/d6/dd/f33 [555182,6063] 0 2026-03-09T19:27:26.770 INFO:tasks.workunit.client.1.vm08.stdout:6/627: dwrite d3/d34/d5c/da2/f5e [0,4194304] 0 2026-03-09T19:27:26.775 
INFO:tasks.workunit.client.1.vm08.stdout:0/621: mkdir dd/dc6 0 2026-03-09T19:27:26.776 INFO:tasks.workunit.client.0.vm07.stdout:4/367: symlink d3/d4f/d56/d5f/l81 0 2026-03-09T19:27:26.779 INFO:tasks.workunit.client.0.vm07.stdout:0/348: mknod d0/d6/d13/d1c/d50/c7b 0 2026-03-09T19:27:26.779 INFO:tasks.workunit.client.0.vm07.stdout:0/349: write d0/d6/d13/d1c/d61/f63 [248797,116936] 0 2026-03-09T19:27:26.780 INFO:tasks.workunit.client.0.vm07.stdout:0/350: fsync d0/d6/d13/d33/f35 0 2026-03-09T19:27:26.781 INFO:tasks.workunit.client.0.vm07.stdout:8/405: mkdir d7/d9/d37/d45/d97 0 2026-03-09T19:27:26.782 INFO:tasks.workunit.client.1.vm08.stdout:4/586: rename da/d10/f6b to da/d10/d16/d28/d46/fb1 0 2026-03-09T19:27:26.782 INFO:tasks.workunit.client.1.vm08.stdout:2/553: rename d3 to d3/d4/d23/d2c/d39/d5e/d87/dbe 22 2026-03-09T19:27:26.783 INFO:tasks.workunit.client.1.vm08.stdout:4/587: write da/d10/d16/d28/d2f/d4f/d56/f9a [5231851,73103] 0 2026-03-09T19:27:26.796 INFO:tasks.workunit.client.0.vm07.stdout:2/422: write d3/f27 [4139477,51657] 0 2026-03-09T19:27:26.796 INFO:tasks.workunit.client.1.vm08.stdout:7/660: truncate d5/d14/dae/d3a/d42/fa9 280248 0 2026-03-09T19:27:26.797 INFO:tasks.workunit.client.1.vm08.stdout:5/570: creat d16/d1e/d30/d8a/fba x:0 0 0 2026-03-09T19:27:26.813 INFO:tasks.workunit.client.0.vm07.stdout:0/351: dread d0/d6/d13/d17/d19/d57/f75 [0,4194304] 0 2026-03-09T19:27:26.814 INFO:tasks.workunit.client.0.vm07.stdout:0/352: chown d0/d6/d13/d1c/c1d 2547298 1 2026-03-09T19:27:26.818 INFO:tasks.workunit.client.1.vm08.stdout:6/628: mkdir d3/d34/d5c/de8 0 2026-03-09T19:27:26.833 INFO:tasks.workunit.client.0.vm07.stdout:7/413: unlink d0/d4/d5/d26/d3c/d39/l30 0 2026-03-09T19:27:26.833 INFO:tasks.workunit.client.1.vm08.stdout:3/675: mkdir d0/d6/de/d1b/d16/d17/dac/dd2 0 2026-03-09T19:27:26.843 INFO:tasks.workunit.client.0.vm07.stdout:1/382: dwrite d1/d3/f4 [0,4194304] 0 2026-03-09T19:27:26.844 INFO:tasks.workunit.client.0.vm07.stdout:6/323: truncate 
d0/d1/db/d24/d53/f35 2785010 0 2026-03-09T19:27:26.845 INFO:tasks.workunit.client.0.vm07.stdout:6/324: stat d0/d1/db/d24/d53/d31 0 2026-03-09T19:27:26.845 INFO:tasks.workunit.client.1.vm08.stdout:9/597: dwrite d0/d1b/d97/d48/d5d/f9b [0,4194304] 0 2026-03-09T19:27:26.852 INFO:tasks.workunit.client.0.vm07.stdout:4/368: creat d3/d11/d16/f82 x:0 0 0 2026-03-09T19:27:26.853 INFO:tasks.workunit.client.0.vm07.stdout:6/325: dread d0/ff [0,4194304] 0 2026-03-09T19:27:26.859 INFO:tasks.workunit.client.0.vm07.stdout:4/369: dwrite d3/d11/d16/f77 [0,4194304] 0 2026-03-09T19:27:26.865 INFO:tasks.workunit.client.0.vm07.stdout:8/406: truncate d7/d9/d37/d45/d4f/f66 4082521 0 2026-03-09T19:27:26.872 INFO:tasks.workunit.client.1.vm08.stdout:6/629: symlink d3/d55/le9 0 2026-03-09T19:27:26.888 INFO:tasks.workunit.client.0.vm07.stdout:5/391: symlink d3/d1a/d28/d48/l81 0 2026-03-09T19:27:26.888 INFO:tasks.workunit.client.1.vm08.stdout:8/568: getdents de/d25/d31 0 2026-03-09T19:27:26.895 INFO:tasks.workunit.client.0.vm07.stdout:8/407: sync 2026-03-09T19:27:26.897 INFO:tasks.workunit.client.0.vm07.stdout:0/353: unlink d0/d6/d13/d17/f2b 0 2026-03-09T19:27:26.900 INFO:tasks.workunit.client.0.vm07.stdout:7/414: fsync d0/d52/d54/d55/f6d 0 2026-03-09T19:27:26.901 INFO:tasks.workunit.client.0.vm07.stdout:0/354: dwrite d0/f3a [0,4194304] 0 2026-03-09T19:27:26.905 INFO:tasks.workunit.client.0.vm07.stdout:3/414: symlink d1/l81 0 2026-03-09T19:27:26.907 INFO:tasks.workunit.client.0.vm07.stdout:3/415: truncate d1/d1f/d16/d28/f64 951603 0 2026-03-09T19:27:26.911 INFO:tasks.workunit.client.1.vm08.stdout:1/749: rename d9/da/l23 to d9/d40/d49/le0 0 2026-03-09T19:27:26.911 INFO:tasks.workunit.client.0.vm07.stdout:3/416: dwrite d1/f68 [0,4194304] 0 2026-03-09T19:27:26.927 INFO:tasks.workunit.client.0.vm07.stdout:9/424: rename d0/c5b to d0/c93 0 2026-03-09T19:27:26.931 INFO:tasks.workunit.client.1.vm08.stdout:2/554: mknod d3/d4/d23/d2c/cbf 0 2026-03-09T19:27:26.943 
INFO:tasks.workunit.client.0.vm07.stdout:4/370: fdatasync d3/d11/d16/d2f/f44 0 2026-03-09T19:27:26.943 INFO:tasks.workunit.client.1.vm08.stdout:5/571: creat d16/d1e/d30/d6f/fbb x:0 0 0 2026-03-09T19:27:26.943 INFO:tasks.workunit.client.0.vm07.stdout:4/371: readlink d3/d11/d2b/d38/l40 0 2026-03-09T19:27:26.944 INFO:tasks.workunit.client.1.vm08.stdout:8/569: mkdir de/d91/dc8 0 2026-03-09T19:27:26.944 INFO:tasks.workunit.client.0.vm07.stdout:4/372: write d3/d11/d2b/f69 [461675,96213] 0 2026-03-09T19:27:26.945 INFO:tasks.workunit.client.1.vm08.stdout:3/676: mkdir d0/d6/de/d1b/d16/d17/dac/dd2/dd3 0 2026-03-09T19:27:26.962 INFO:tasks.workunit.client.1.vm08.stdout:2/555: dread - d3/d4/d23/d2c/f64 zero size 2026-03-09T19:27:26.968 INFO:tasks.workunit.client.0.vm07.stdout:0/355: dread d0/f1e [0,4194304] 0 2026-03-09T19:27:26.975 INFO:tasks.workunit.client.1.vm08.stdout:7/661: dread d5/d14/dae/d3a/d42/f65 [0,4194304] 0 2026-03-09T19:27:26.987 INFO:tasks.workunit.client.1.vm08.stdout:8/570: mkdir de/d25/d87/dc9 0 2026-03-09T19:27:26.987 INFO:tasks.workunit.client.1.vm08.stdout:8/571: dread - de/d47/fb9 zero size 2026-03-09T19:27:26.991 INFO:tasks.workunit.client.0.vm07.stdout:3/417: creat d1/d1f/d5c/f82 x:0 0 0 2026-03-09T19:27:26.997 INFO:tasks.workunit.client.1.vm08.stdout:0/622: write dd/d22/d27/d6c/f7f [92904,42674] 0 2026-03-09T19:27:27.003 INFO:tasks.workunit.client.1.vm08.stdout:4/588: dread da/d10/d16/d28/d46/d52/d6e/d40/d6c/f71 [4194304,4194304] 0 2026-03-09T19:27:27.007 INFO:tasks.workunit.client.0.vm07.stdout:9/425: mkdir d0/d6/d3a/d94 0 2026-03-09T19:27:27.009 INFO:tasks.workunit.client.0.vm07.stdout:6/326: mkdir d0/d1/db/d52/d6a/d81 0 2026-03-09T19:27:27.010 INFO:tasks.workunit.client.0.vm07.stdout:6/327: chown d0/d44 45104394 1 2026-03-09T19:27:27.012 INFO:tasks.workunit.client.1.vm08.stdout:6/630: write d3/db/f30 [115630,99672] 0 2026-03-09T19:27:27.014 INFO:tasks.workunit.client.1.vm08.stdout:9/598: dwrite d0/d1b/d97/f3f [4194304,4194304] 0 
2026-03-09T19:27:27.014 INFO:tasks.workunit.client.1.vm08.stdout:3/677: dread d0/d8/f4c [0,4194304] 0 2026-03-09T19:27:27.036 INFO:tasks.workunit.client.0.vm07.stdout:8/408: mknod d7/d9/c98 0 2026-03-09T19:27:27.040 INFO:tasks.workunit.client.0.vm07.stdout:8/409: dread d7/d50/f6d [0,4194304] 0 2026-03-09T19:27:27.040 INFO:tasks.workunit.client.1.vm08.stdout:4/589: write da/d10/d1b/f79 [3439050,60114] 0 2026-03-09T19:27:27.041 INFO:tasks.workunit.client.0.vm07.stdout:8/410: dread d7/d1d/f4d [0,4194304] 0 2026-03-09T19:27:27.042 INFO:tasks.workunit.client.0.vm07.stdout:4/373: mknod d3/d11/d2b/d37/c83 0 2026-03-09T19:27:27.043 INFO:tasks.workunit.client.0.vm07.stdout:4/374: read d3/d11/d16/f77 [452775,38712] 0 2026-03-09T19:27:27.046 INFO:tasks.workunit.client.1.vm08.stdout:4/590: read da/f18 [185420,19714] 0 2026-03-09T19:27:27.048 INFO:tasks.workunit.client.1.vm08.stdout:1/750: rename d9/da/dc/fa5 to d9/da/d17/fe1 0 2026-03-09T19:27:27.049 INFO:tasks.workunit.client.1.vm08.stdout:0/623: dread dd/f7a [0,4194304] 0 2026-03-09T19:27:27.050 INFO:tasks.workunit.client.1.vm08.stdout:1/751: truncate d9/da/d2d/f50 415304 0 2026-03-09T19:27:27.067 INFO:tasks.workunit.client.0.vm07.stdout:3/418: read - d1/d6/dd/f44 zero size 2026-03-09T19:27:27.068 INFO:tasks.workunit.client.0.vm07.stdout:9/426: creat d0/db/d29/d4d/f95 x:0 0 0 2026-03-09T19:27:27.070 INFO:tasks.workunit.client.0.vm07.stdout:2/423: getdents d3/dd/d16/d29/d2d/d45/d85 0 2026-03-09T19:27:27.071 INFO:tasks.workunit.client.0.vm07.stdout:2/424: stat d3/dd/d16/d29/f42 0 2026-03-09T19:27:27.071 INFO:tasks.workunit.client.1.vm08.stdout:8/572: rename de/d1d/l3a to de/d25/d31/d82/d6d/d99/da0/lca 0 2026-03-09T19:27:27.071 INFO:tasks.workunit.client.1.vm08.stdout:5/572: creat d16/d1e/d8c/d99/da8/fbc x:0 0 0 2026-03-09T19:27:27.071 INFO:tasks.workunit.client.1.vm08.stdout:4/591: creat da/d10/d16/d28/d2f/d4f/d64/d81/fb2 x:0 0 0 2026-03-09T19:27:27.072 INFO:tasks.workunit.client.1.vm08.stdout:5/573: chown d16/d1e/f37 4 1 
2026-03-09T19:27:27.073 INFO:tasks.workunit.client.1.vm08.stdout:8/573: dread - de/d1d/d2e/d5f/fbb zero size 2026-03-09T19:27:27.075 INFO:tasks.workunit.client.1.vm08.stdout:1/752: rename d9/da/d2d/d4e/ldf to d9/da/d2d/d4e/le2 0 2026-03-09T19:27:27.093 INFO:tasks.workunit.client.0.vm07.stdout:4/375: readlink d3/d11/d51/l5d 0 2026-03-09T19:27:27.093 INFO:tasks.workunit.client.1.vm08.stdout:2/556: creat d3/d4/d23/fc0 x:0 0 0 2026-03-09T19:27:27.093 INFO:tasks.workunit.client.1.vm08.stdout:6/631: truncate d3/fc 5785985 0 2026-03-09T19:27:27.094 INFO:tasks.workunit.client.0.vm07.stdout:4/376: chown d3/d11/d29/f52 49049 1 2026-03-09T19:27:27.094 INFO:tasks.workunit.client.1.vm08.stdout:9/599: symlink d0/d2/lcc 0 2026-03-09T19:27:27.094 INFO:tasks.workunit.client.0.vm07.stdout:4/377: stat d3/d11/d16/d2f 0 2026-03-09T19:27:27.095 INFO:tasks.workunit.client.1.vm08.stdout:6/632: chown d3/d68/d7e/fbb 12657 1 2026-03-09T19:27:27.104 INFO:tasks.workunit.client.0.vm07.stdout:7/415: creat d0/d4/d5/d26/f8e x:0 0 0 2026-03-09T19:27:27.104 INFO:tasks.workunit.client.0.vm07.stdout:1/383: getdents d1/db/d31/d4f 0 2026-03-09T19:27:27.105 INFO:tasks.workunit.client.1.vm08.stdout:3/678: rename d0/d6/de/d6e/l98 to d0/d8/d19/ld4 0 2026-03-09T19:27:27.108 INFO:tasks.workunit.client.1.vm08.stdout:4/592: symlink da/d10/d16/d28/d46/d52/d6e/d40/d6c/lb3 0 2026-03-09T19:27:27.108 INFO:tasks.workunit.client.1.vm08.stdout:4/593: chown da/d10/l3c 0 1 2026-03-09T19:27:27.110 INFO:tasks.workunit.client.1.vm08.stdout:8/574: read de/d25/d33/f55 [2453599,88750] 0 2026-03-09T19:27:27.118 INFO:tasks.workunit.client.1.vm08.stdout:4/594: dwrite da/fab [0,4194304] 0 2026-03-09T19:27:27.136 INFO:tasks.workunit.client.0.vm07.stdout:9/427: rename d0/d17/f4f to d0/db/d29/d32/d5c/d69/f96 0 2026-03-09T19:27:27.137 INFO:tasks.workunit.client.0.vm07.stdout:9/428: chown d0/db/d29/d2c/d36/d5a 124833491 1 2026-03-09T19:27:27.138 INFO:tasks.workunit.client.1.vm08.stdout:0/624: write dd/d22/d24/d49/d50/f9b [84604,42525] 
0 2026-03-09T19:27:27.144 INFO:tasks.workunit.client.0.vm07.stdout:5/392: getdents d3/d1a/d5d 0 2026-03-09T19:27:27.145 INFO:tasks.workunit.client.0.vm07.stdout:6/328: mkdir d0/d1/d82 0 2026-03-09T19:27:27.146 INFO:tasks.workunit.client.1.vm08.stdout:7/662: getdents d5/d14/dae/d1c/d73 0 2026-03-09T19:27:27.152 INFO:tasks.workunit.client.1.vm08.stdout:5/574: dwrite d16/d1e/d3b/d61/f7a [0,4194304] 0 2026-03-09T19:27:27.154 INFO:tasks.workunit.client.1.vm08.stdout:5/575: fsync d16/d45/f6b 0 2026-03-09T19:27:27.154 INFO:tasks.workunit.client.0.vm07.stdout:2/425: mknod d3/dd/d16/d29/d3c/c8d 0 2026-03-09T19:27:27.156 INFO:tasks.workunit.client.1.vm08.stdout:5/576: truncate d16/d1e/d30/d8a/fba 417701 0 2026-03-09T19:27:27.162 INFO:tasks.workunit.client.1.vm08.stdout:6/633: creat d3/d94/fea x:0 0 0 2026-03-09T19:27:27.167 INFO:tasks.workunit.client.1.vm08.stdout:6/634: dread d3/db/f30 [0,4194304] 0 2026-03-09T19:27:27.168 INFO:tasks.workunit.client.0.vm07.stdout:4/378: readlink d3/d11/d16/d2f/l3a 0 2026-03-09T19:27:27.171 INFO:tasks.workunit.client.0.vm07.stdout:4/379: dwrite d3/d4f/f7c [0,4194304] 0 2026-03-09T19:27:27.172 INFO:tasks.workunit.client.1.vm08.stdout:4/595: read - da/d10/d16/d28/fa3 zero size 2026-03-09T19:27:27.173 INFO:tasks.workunit.client.0.vm07.stdout:7/416: mknod d0/d52/d54/d5a/c8f 0 2026-03-09T19:27:27.173 INFO:tasks.workunit.client.0.vm07.stdout:4/380: dread - d3/d11/d16/f82 zero size 2026-03-09T19:27:27.173 INFO:tasks.workunit.client.0.vm07.stdout:1/384: truncate d1/d11/d37/d3f/d45/f15 1040965 0 2026-03-09T19:27:27.173 INFO:tasks.workunit.client.1.vm08.stdout:1/753: truncate d9/f36 107960 0 2026-03-09T19:27:27.174 INFO:tasks.workunit.client.0.vm07.stdout:0/356: link d0/d6/d13/d1c/d11/f29 d0/d6/d13/d17/d19/f7c 0 2026-03-09T19:27:27.175 INFO:tasks.workunit.client.0.vm07.stdout:1/385: truncate d1/d11/d37/d5d/f59 626943 0 2026-03-09T19:27:27.184 INFO:tasks.workunit.client.0.vm07.stdout:1/386: dwrite d1/db/d31/d56/f71 [0,4194304] 0 2026-03-09T19:27:27.192 
INFO:tasks.workunit.client.0.vm07.stdout:3/419: unlink d1/d6/dd/d51/c6a 0 2026-03-09T19:27:27.193 INFO:tasks.workunit.client.1.vm08.stdout:0/625: creat dd/d22/d7b/d82/fc7 x:0 0 0 2026-03-09T19:27:27.211 INFO:tasks.workunit.client.0.vm07.stdout:6/329: read d0/d1/d28/f64 [318050,115787] 0 2026-03-09T19:27:27.220 INFO:tasks.workunit.client.0.vm07.stdout:6/330: chown d0/f3b 1337 1 2026-03-09T19:27:27.220 INFO:tasks.workunit.client.0.vm07.stdout:3/420: dread d1/d1f/d16/f30 [0,4194304] 0 2026-03-09T19:27:27.220 INFO:tasks.workunit.client.0.vm07.stdout:2/426: dread - d3/f63 zero size 2026-03-09T19:27:27.220 INFO:tasks.workunit.client.0.vm07.stdout:8/411: link d7/d9/d37/d45/d56/c90 d7/d16/c99 0 2026-03-09T19:27:27.220 INFO:tasks.workunit.client.1.vm08.stdout:5/577: creat d16/d1e/d3b/fbd x:0 0 0 2026-03-09T19:27:27.258 INFO:tasks.workunit.client.0.vm07.stdout:7/417: dread d0/d4/d5/dd/f1f [0,4194304] 0 2026-03-09T19:27:27.259 INFO:tasks.workunit.client.1.vm08.stdout:2/557: dwrite d3/d9/f5d [0,4194304] 0 2026-03-09T19:27:27.266 INFO:tasks.workunit.client.1.vm08.stdout:8/575: mknod de/d1d/d69/ccb 0 2026-03-09T19:27:27.266 INFO:tasks.workunit.client.1.vm08.stdout:2/558: readlink d3/d9/d26/l95 0 2026-03-09T19:27:27.270 INFO:tasks.workunit.client.0.vm07.stdout:1/387: symlink d1/d3/l84 0 2026-03-09T19:27:27.272 INFO:tasks.workunit.client.0.vm07.stdout:9/429: fsync d0/db/f41 0 2026-03-09T19:27:27.273 INFO:tasks.workunit.client.1.vm08.stdout:7/663: symlink d5/d14/dae/d1c/lde 0 2026-03-09T19:27:27.275 INFO:tasks.workunit.client.0.vm07.stdout:6/331: creat d0/d1/db/d52/d6a/f83 x:0 0 0 2026-03-09T19:27:27.276 INFO:tasks.workunit.client.0.vm07.stdout:1/388: dwrite d1/d11/d37/d5d/f59 [0,4194304] 0 2026-03-09T19:27:27.283 INFO:tasks.workunit.client.0.vm07.stdout:3/421: fdatasync d1/d6/dd/f67 0 2026-03-09T19:27:27.285 INFO:tasks.workunit.client.0.vm07.stdout:2/427: write d3/dd/d16/d30/f67 [1138862,46752] 0 2026-03-09T19:27:27.285 INFO:tasks.workunit.client.0.vm07.stdout:3/422: stat 
d1/d3d/d47/f62 0 2026-03-09T19:27:27.286 INFO:tasks.workunit.client.0.vm07.stdout:1/389: dread d1/f2f [4194304,4194304] 0 2026-03-09T19:27:27.311 INFO:tasks.workunit.client.1.vm08.stdout:9/600: rename d0/d1b/daa to d0/d2/d8/dcd 0 2026-03-09T19:27:27.324 INFO:tasks.workunit.client.0.vm07.stdout:7/418: fdatasync d0/d4/d5/dd/f18 0 2026-03-09T19:27:27.324 INFO:tasks.workunit.client.0.vm07.stdout:0/357: creat d0/d6/d13/d1c/d11/d56/d78/f7d x:0 0 0 2026-03-09T19:27:27.324 INFO:tasks.workunit.client.0.vm07.stdout:7/419: chown d0/d4/d5/d8/c4e 844239 1 2026-03-09T19:27:27.335 INFO:tasks.workunit.client.1.vm08.stdout:1/754: symlink d9/da/d53/db3/le3 0 2026-03-09T19:27:27.343 INFO:tasks.workunit.client.1.vm08.stdout:2/559: write d3/d9/d79/d46/d8c/fa5 [523226,89849] 0 2026-03-09T19:27:27.344 INFO:tasks.workunit.client.0.vm07.stdout:9/430: symlink d0/db/d29/d4d/l97 0 2026-03-09T19:27:27.344 INFO:tasks.workunit.client.1.vm08.stdout:2/560: stat d3/d4/d23 0 2026-03-09T19:27:27.345 INFO:tasks.workunit.client.1.vm08.stdout:8/576: dwrite de/d1d/d69/f9a [4194304,4194304] 0 2026-03-09T19:27:27.348 INFO:tasks.workunit.client.1.vm08.stdout:3/679: rename d0/d6/de/d6e/d51/d92/lc5 to d0/d6/d93/ld5 0 2026-03-09T19:27:27.362 INFO:tasks.workunit.client.0.vm07.stdout:6/332: creat d0/d1/db/d52/d6a/f84 x:0 0 0 2026-03-09T19:27:27.376 INFO:tasks.workunit.client.1.vm08.stdout:1/755: fdatasync d9/da/d12/f98 0 2026-03-09T19:27:27.379 INFO:tasks.workunit.client.0.vm07.stdout:3/423: mknod d1/d3d/d47/c83 0 2026-03-09T19:27:27.381 INFO:tasks.workunit.client.1.vm08.stdout:0/626: creat dd/d22/d27/fc8 x:0 0 0 2026-03-09T19:27:27.382 INFO:tasks.workunit.client.0.vm07.stdout:1/390: creat d1/d11/d37/d5a/d6d/f85 x:0 0 0 2026-03-09T19:27:27.383 INFO:tasks.workunit.client.0.vm07.stdout:3/424: dwrite d1/d6/dd/f3b [0,4194304] 0 2026-03-09T19:27:27.388 INFO:tasks.workunit.client.1.vm08.stdout:1/756: dwrite d9/da/d53/d67/d6c/fbc [0,4194304] 0 2026-03-09T19:27:27.408 INFO:tasks.workunit.client.1.vm08.stdout:4/596: 
creat da/d10/d26/d3a/d91/fb4 x:0 0 0 2026-03-09T19:27:27.409 INFO:tasks.workunit.client.0.vm07.stdout:2/428: write d3/f5 [407824,30320] 0 2026-03-09T19:27:27.409 INFO:tasks.workunit.client.1.vm08.stdout:4/597: fsync da/d10/d1b/f79 0 2026-03-09T19:27:27.412 INFO:tasks.workunit.client.1.vm08.stdout:9/601: dwrite d0/d1b/d97/f34 [0,4194304] 0 2026-03-09T19:27:27.414 INFO:tasks.workunit.client.1.vm08.stdout:4/598: readlink da/d10/d16/d28/d2f/d4f/d64/l8e 0 2026-03-09T19:27:27.414 INFO:tasks.workunit.client.1.vm08.stdout:9/602: chown d0/d1b/d68/d7f/d8c/da2 35 1 2026-03-09T19:27:27.415 INFO:tasks.workunit.client.1.vm08.stdout:9/603: write d0/d1b/d97/f22 [4164589,116833] 0 2026-03-09T19:27:27.430 INFO:tasks.workunit.client.1.vm08.stdout:3/680: rmdir d0/d52/d7c/d7e 39 2026-03-09T19:27:27.431 INFO:tasks.workunit.client.1.vm08.stdout:3/681: chown d0/d6/de/d1b/d16/d17/dac/dd2 11 1 2026-03-09T19:27:27.435 INFO:tasks.workunit.client.1.vm08.stdout:6/635: getdents d3/d34/d5c/da2 0 2026-03-09T19:27:27.469 INFO:tasks.workunit.client.1.vm08.stdout:9/604: write d0/d2/d8/f61 [4708765,88608] 0 2026-03-09T19:27:27.469 INFO:tasks.workunit.client.1.vm08.stdout:9/605: chown d0/d1b/d68/d7f 114 1 2026-03-09T19:27:27.476 INFO:tasks.workunit.client.0.vm07.stdout:8/412: rename d7/d9/d57/d7c to d7/d9/d10/d44/d9a 0 2026-03-09T19:27:27.483 INFO:tasks.workunit.client.0.vm07.stdout:8/413: read d7/f2e [4027812,120821] 0 2026-03-09T19:27:27.488 INFO:tasks.workunit.client.0.vm07.stdout:0/358: symlink d0/d6/l7e 0 2026-03-09T19:27:27.489 INFO:tasks.workunit.client.0.vm07.stdout:0/359: write d0/d6/d13/d17/d19/d57/f6f [510655,81731] 0 2026-03-09T19:27:27.496 INFO:tasks.workunit.client.0.vm07.stdout:4/381: link d3/d11/d2b/d38/c48 d3/d4f/d56/c84 0 2026-03-09T19:27:27.496 INFO:tasks.workunit.client.1.vm08.stdout:1/757: symlink d9/d11/d7a/le4 0 2026-03-09T19:27:27.496 INFO:tasks.workunit.client.0.vm07.stdout:4/382: write d3/d11/d2b/f71 [310462,119233] 0 2026-03-09T19:27:27.497 
INFO:tasks.workunit.client.1.vm08.stdout:4/599: unlink da/d10/d26/d3a/c5f 0 2026-03-09T19:27:27.497 INFO:tasks.workunit.client.0.vm07.stdout:4/383: truncate d3/d11/d16/f82 395417 0 2026-03-09T19:27:27.498 INFO:tasks.workunit.client.1.vm08.stdout:4/600: fsync da/d10/d16/d28/d4d/fa9 0 2026-03-09T19:27:27.498 INFO:tasks.workunit.client.0.vm07.stdout:4/384: fdatasync d3/d11/d16/d2f/d22/f24 0 2026-03-09T19:27:27.499 INFO:tasks.workunit.client.1.vm08.stdout:4/601: write da/fa8 [681395,355] 0 2026-03-09T19:27:27.502 INFO:tasks.workunit.client.1.vm08.stdout:4/602: fsync da/d10/d16/d28/d46/d52/d6e/d73/fae 0 2026-03-09T19:27:27.525 INFO:tasks.workunit.client.0.vm07.stdout:5/393: getdents d3/dd/d26/d3f/d47/d71/d76 0 2026-03-09T19:27:27.525 INFO:tasks.workunit.client.0.vm07.stdout:5/394: readlink d3/d1a/d28/d48/l81 0 2026-03-09T19:27:27.527 INFO:tasks.workunit.client.1.vm08.stdout:7/664: getdents d5/d14/dae/d1c/d83/d9c 0 2026-03-09T19:27:27.527 INFO:tasks.workunit.client.1.vm08.stdout:7/665: readlink d5/d14/dae/l1b 0 2026-03-09T19:27:27.530 INFO:tasks.workunit.client.1.vm08.stdout:5/578: rename d16/d1e/f5f to d16/fbe 0 2026-03-09T19:27:27.533 INFO:tasks.workunit.client.1.vm08.stdout:0/627: write dd/d22/d27/f9f [1036132,33775] 0 2026-03-09T19:27:27.550 INFO:tasks.workunit.client.1.vm08.stdout:6/636: write d3/db/d43/f51 [345828,75489] 0 2026-03-09T19:27:27.552 INFO:tasks.workunit.client.1.vm08.stdout:8/577: dwrite de/f19 [0,4194304] 0 2026-03-09T19:27:27.558 INFO:tasks.workunit.client.1.vm08.stdout:7/666: truncate d5/fc 2939209 0 2026-03-09T19:27:27.564 INFO:tasks.workunit.client.1.vm08.stdout:9/606: dwrite d0/d1b/d97/f63 [0,4194304] 0 2026-03-09T19:27:27.566 INFO:tasks.workunit.client.0.vm07.stdout:7/420: symlink d0/d4/d5/d8/d41/d64/d79/l90 0 2026-03-09T19:27:27.572 INFO:tasks.workunit.client.1.vm08.stdout:4/603: write da/d10/d16/d28/d2f/d4f/f65 [4009222,48972] 0 2026-03-09T19:27:27.584 INFO:tasks.workunit.client.0.vm07.stdout:9/431: mknod d0/c98 0 2026-03-09T19:27:27.584 
INFO:tasks.workunit.client.1.vm08.stdout:1/758: truncate d9/da/d53/d67/f77 3276207 0 2026-03-09T19:27:27.590 INFO:tasks.workunit.client.0.vm07.stdout:1/391: symlink d1/d3e/d5c/l86 0 2026-03-09T19:27:27.592 INFO:tasks.workunit.client.0.vm07.stdout:3/425: mknod d1/d1f/c84 0 2026-03-09T19:27:27.595 INFO:tasks.workunit.client.0.vm07.stdout:8/414: mknod d7/d30/c9b 0 2026-03-09T19:27:27.595 INFO:tasks.workunit.client.0.vm07.stdout:8/415: fdatasync d7/d9/d37/d45/d56/f7a 0 2026-03-09T19:27:27.596 INFO:tasks.workunit.client.1.vm08.stdout:7/667: readlink d5/d14/d38/l77 0 2026-03-09T19:27:27.602 INFO:tasks.workunit.client.0.vm07.stdout:0/360: link d0/d6/d13/d33/f35 d0/d6/d13/d1c/d11/d56/f7f 0 2026-03-09T19:27:27.606 INFO:tasks.workunit.client.0.vm07.stdout:4/385: mknod d3/d11/c85 0 2026-03-09T19:27:27.609 INFO:tasks.workunit.client.0.vm07.stdout:9/432: truncate d0/db/d29/d68/f6b 118573 0 2026-03-09T19:27:27.610 INFO:tasks.workunit.client.1.vm08.stdout:5/579: mknod d16/d1e/d8c/d99/da8/cbf 0 2026-03-09T19:27:27.614 INFO:tasks.workunit.client.1.vm08.stdout:5/580: read d16/f18 [330307,62907] 0 2026-03-09T19:27:27.614 INFO:tasks.workunit.client.1.vm08.stdout:0/628: mknod dd/d22/d27/cc9 0 2026-03-09T19:27:27.616 INFO:tasks.workunit.client.0.vm07.stdout:1/392: dwrite d1/db/d31/f64 [4194304,4194304] 0 2026-03-09T19:27:27.617 INFO:tasks.workunit.client.0.vm07.stdout:1/393: read - d1/d11/d37/d3f/f82 zero size 2026-03-09T19:27:27.624 INFO:tasks.workunit.client.0.vm07.stdout:1/394: dwrite d1/d3/f23 [0,4194304] 0 2026-03-09T19:27:27.624 INFO:tasks.workunit.client.1.vm08.stdout:3/682: link d0/d8/d19/c3a d0/d52/d7c/d7e/cd6 0 2026-03-09T19:27:27.624 INFO:tasks.workunit.client.0.vm07.stdout:6/333: rename d0/d1/db/d24/c5e to d0/d1/db/c85 0 2026-03-09T19:27:27.630 INFO:tasks.workunit.client.1.vm08.stdout:1/759: creat d9/da/dc/fe5 x:0 0 0 2026-03-09T19:27:27.643 INFO:tasks.workunit.client.1.vm08.stdout:6/637: mkdir d3/dbc/deb 0 2026-03-09T19:27:27.644 
INFO:tasks.workunit.client.0.vm07.stdout:5/395: write d3/f68 [2623561,119684] 0 2026-03-09T19:27:27.644 INFO:tasks.workunit.client.0.vm07.stdout:7/421: write d0/d4/d5/d8/f37 [806089,2267] 0 2026-03-09T19:27:27.646 INFO:tasks.workunit.client.0.vm07.stdout:0/361: creat d0/d6/d13/d1c/d11/f80 x:0 0 0 2026-03-09T19:27:27.647 INFO:tasks.workunit.client.0.vm07.stdout:0/362: readlink d0/d6/d13/d1c/d11/l2c 0 2026-03-09T19:27:27.647 INFO:tasks.workunit.client.0.vm07.stdout:0/363: chown d0 10 1 2026-03-09T19:27:27.650 INFO:tasks.workunit.client.0.vm07.stdout:5/396: dread d3/f4e [0,4194304] 0 2026-03-09T19:27:27.652 INFO:tasks.workunit.client.0.vm07.stdout:9/433: mkdir d0/db/d29/d68/d99 0 2026-03-09T19:27:27.652 INFO:tasks.workunit.client.0.vm07.stdout:9/434: stat d0/db/d29/d2c/l40 0 2026-03-09T19:27:27.657 INFO:tasks.workunit.client.1.vm08.stdout:7/668: rmdir d5/d14/d2b 39 2026-03-09T19:27:27.657 INFO:tasks.workunit.client.1.vm08.stdout:9/607: symlink d0/d1b/d97/lce 0 2026-03-09T19:27:27.658 INFO:tasks.workunit.client.1.vm08.stdout:2/561: rename d3/d9/d79/dbc to d3/d4/d23/d2c/dc1 0 2026-03-09T19:27:27.659 INFO:tasks.workunit.client.1.vm08.stdout:9/608: truncate d0/d1b/f65 810908 0 2026-03-09T19:27:27.659 INFO:tasks.workunit.client.1.vm08.stdout:4/604: mkdir da/d10/d26/d3a/db5 0 2026-03-09T19:27:27.661 INFO:tasks.workunit.client.1.vm08.stdout:4/605: chown da/d10/d26/d50 9871 1 2026-03-09T19:27:27.661 INFO:tasks.workunit.client.1.vm08.stdout:7/669: read d5/d14/dae/d3a/f64 [3340030,67544] 0 2026-03-09T19:27:27.662 INFO:tasks.workunit.client.0.vm07.stdout:2/429: rename d3/dd/d16/d29/d2d/l71 to d3/d11/d38/l8e 0 2026-03-09T19:27:27.681 INFO:tasks.workunit.client.0.vm07.stdout:5/397: dread d3/dd/f24 [0,4194304] 0 2026-03-09T19:27:27.690 INFO:tasks.workunit.client.0.vm07.stdout:6/334: fdatasync d0/d13/f5f 0 2026-03-09T19:27:27.707 INFO:tasks.workunit.client.0.vm07.stdout:8/416: unlink d7/d9/d37/d45/d4f/f66 0 2026-03-09T19:27:27.709 INFO:tasks.workunit.client.0.vm07.stdout:3/426: 
dwrite d1/d6/dd/f4d [0,4194304] 0 2026-03-09T19:27:27.717 INFO:tasks.workunit.client.0.vm07.stdout:7/422: truncate d0/d4/d5/dd/f1f 519402 0 2026-03-09T19:27:27.734 INFO:tasks.workunit.client.0.vm07.stdout:4/386: dwrite d3/d11/f6c [0,4194304] 0 2026-03-09T19:27:27.770 INFO:tasks.workunit.client.0.vm07.stdout:2/430: mknod d3/dd/d16/d29/d2d/d45/c8f 0 2026-03-09T19:27:27.773 INFO:tasks.workunit.client.0.vm07.stdout:2/431: dread d3/f1a [0,4194304] 0 2026-03-09T19:27:27.783 INFO:tasks.workunit.client.1.vm08.stdout:5/581: dwrite d16/d1e/d3b/f5e [0,4194304] 0 2026-03-09T19:27:27.804 INFO:tasks.workunit.client.0.vm07.stdout:1/395: mkdir d1/d11/d37/d3f/d45/d87 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:3/683: read d0/d6/de/d15/fa4 [224424,32150] 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:1/760: rmdir d9/d11/d7a/d89/d8d/daa 39 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:0/629: mkdir dd/d31/dca 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:7/670: creat d5/d14/dae/d3a/d42/d85/fdf x:0 0 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:7/671: chown d5/d14/dae/d1c/d83/d9c/f9d 104400 1 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:9/609: rename d0/d1b/d97/d48/c91 to d0/d2/d14/ccf 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:9/610: dread d0/d1b/f82 [0,4194304] 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:9/611: read d0/d2/d8/dcd/fb2 [483326,114945] 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:9/612: readlink d0/l7b 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:8/578: getdents de/d25/d87 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:9/613: unlink d0/d1b/d68/d7f/d8c/c78 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:7/672: dread d5/d14/d27/d54/f58 [0,4194304] 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.1.vm08.stdout:3/684: 
symlink d0/d6/de/d54/ld7 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.0.vm07.stdout:1/396: stat d1/db/d31/d56/c74 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.0.vm07.stdout:1/397: read - d1/db/d31/d4f/f77 zero size 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.0.vm07.stdout:0/364: truncate d0/d6/d13/d17/d19/d57/d6a/f74 459632 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.0.vm07.stdout:0/365: truncate d0/d6/d13/d17/d19/d58/f77 331352 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.0.vm07.stdout:0/366: write d0/d6/d13/d1c/d50/f60 [729658,74202] 0 2026-03-09T19:27:27.859 INFO:tasks.workunit.client.0.vm07.stdout:0/367: readlink d0/d6/d13/d1c/d11/l28 0 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:0/368: read - d0/d6/d13/d1c/d11/f29 zero size 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:4/387: truncate d3/d11/d29/f42 361870 0 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:0/369: write d0/d6/d13/d1c/d11/f80 [199013,13814] 0 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:4/388: mkdir d3/d11/d16/d2f/d22/d86 0 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:4/389: dread - d3/d11/f7d zero size 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:0/370: mkdir d0/d6/d13/d1c/d52/d81 0 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:2/432: getdents d3/dd/d16/d29/d2d/d45/d3b/d44 0 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:5/398: link d3/d1a/d28/c43 d3/d1a/d5d/c82 0 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:6/335: dread d0/d1/db/d17/d4c/f60 [0,4194304] 0 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:6/336: chown d0/d1/db/d24/d53/f48 0 1 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:3/427: link d1/d1f/d16/c4e d1/d6/d71/c85 0 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:4/390: fdatasync d3/f1a 0 2026-03-09T19:27:27.860 
INFO:tasks.workunit.client.0.vm07.stdout:0/371: read d0/d6/f79 [661419,40251] 0 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:9/435: getdents d0/db/d29/d32 0 2026-03-09T19:27:27.860 INFO:tasks.workunit.client.0.vm07.stdout:5/399: mkdir d3/dd/d26/d2d/d60/d83 0 2026-03-09T19:27:27.863 INFO:tasks.workunit.client.0.vm07.stdout:2/433: rename d3/dd/d16/d29/c6a to d3/dd/d16/d29/d2d/c90 0 2026-03-09T19:27:27.867 INFO:tasks.workunit.client.0.vm07.stdout:6/337: rmdir d0/d1/db/d24/d53/d31 39 2026-03-09T19:27:27.868 INFO:tasks.workunit.client.0.vm07.stdout:6/338: chown d0/d1/db/f14 26311482 1 2026-03-09T19:27:27.871 INFO:tasks.workunit.client.1.vm08.stdout:9/614: symlink d0/d1b/d97/d48/d5e/ld0 0 2026-03-09T19:27:27.873 INFO:tasks.workunit.client.0.vm07.stdout:4/391: truncate d3/d11/d2b/d38/f4a 2089238 0 2026-03-09T19:27:27.874 INFO:tasks.workunit.client.0.vm07.stdout:0/372: unlink d0/d6/d13/d1c/d11/f3f 0 2026-03-09T19:27:27.875 INFO:tasks.workunit.client.0.vm07.stdout:5/400: readlink d3/d1a/d28/d40/l55 0 2026-03-09T19:27:27.876 INFO:tasks.workunit.client.0.vm07.stdout:3/428: rename d1/d3d/l3f to d1/d6/dd/l86 0 2026-03-09T19:27:27.876 INFO:tasks.workunit.client.1.vm08.stdout:3/685: symlink d0/d8/d19/ld8 0 2026-03-09T19:27:27.876 INFO:tasks.workunit.client.1.vm08.stdout:9/615: chown d0/d2/f2a 1 1 2026-03-09T19:27:27.877 INFO:tasks.workunit.client.1.vm08.stdout:8/579: fdatasync de/d25/f71 0 2026-03-09T19:27:27.877 INFO:tasks.workunit.client.1.vm08.stdout:8/580: chown de/d47/d85 6920741 1 2026-03-09T19:27:27.882 INFO:tasks.workunit.client.0.vm07.stdout:2/434: creat d3/dd/d16/d29/f91 x:0 0 0 2026-03-09T19:27:27.883 INFO:tasks.workunit.client.0.vm07.stdout:6/339: creat d0/d4e/f86 x:0 0 0 2026-03-09T19:27:27.883 INFO:tasks.workunit.client.0.vm07.stdout:2/435: dread - d3/dd/d16/d29/d2d/d45/d3b/d44/f5d zero size 2026-03-09T19:27:27.888 INFO:tasks.workunit.client.0.vm07.stdout:9/436: rename d0/db/d29/d2c/d36/c4b to d0/d6/d57/c9a 0 2026-03-09T19:27:27.888 
INFO:tasks.workunit.client.1.vm08.stdout:8/581: mknod de/d1d/d4f/ccc 0 2026-03-09T19:27:27.888 INFO:tasks.workunit.client.0.vm07.stdout:3/429: mkdir d1/d6/dd/d51/d87 0 2026-03-09T19:27:27.890 INFO:tasks.workunit.client.0.vm07.stdout:9/437: dread d0/d17/f33 [0,4194304] 0 2026-03-09T19:27:27.891 INFO:tasks.workunit.client.0.vm07.stdout:2/436: symlink d3/dd/d16/d29/d3c/d5a/d7a/l92 0 2026-03-09T19:27:27.893 INFO:tasks.workunit.client.0.vm07.stdout:3/430: dwrite d1/d74/f6e [0,4194304] 0 2026-03-09T19:27:27.894 INFO:tasks.workunit.client.0.vm07.stdout:5/401: truncate d3/d1a/d28/f3c 781628 0 2026-03-09T19:27:27.895 INFO:tasks.workunit.client.0.vm07.stdout:3/431: write d1/d1f/d16/d28/f64 [811330,112489] 0 2026-03-09T19:27:27.900 INFO:tasks.workunit.client.1.vm08.stdout:9/616: getdents d0/d1b/d97/d48/d5d 0 2026-03-09T19:27:27.963 INFO:tasks.workunit.client.1.vm08.stdout:9/617: creat d0/d1b/d97/d48/d5d/fd1 x:0 0 0 2026-03-09T19:27:27.963 INFO:tasks.workunit.client.1.vm08.stdout:8/582: fdatasync de/d1d/d2e/f61 0 2026-03-09T19:27:27.963 INFO:tasks.workunit.client.1.vm08.stdout:8/583: dread - de/d25/d31/d82/d6d/f88 zero size 2026-03-09T19:27:27.963 INFO:tasks.workunit.client.1.vm08.stdout:8/584: mknod de/d25/d31/d82/d6d/d99/ccd 0 2026-03-09T19:27:27.963 INFO:tasks.workunit.client.1.vm08.stdout:8/585: stat de/c17 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:9/438: creat d0/d6/f9b x:0 0 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:4/392: creat d3/d11/f87 x:0 0 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:3/432: creat d1/d6/d45/f88 x:0 0 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:3/433: dread d1/d6/dd/f4d [0,4194304] 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:9/439: mknod d0/c9c 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:2/437: creat d3/f93 x:0 0 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:2/438: chown 
d3/dd/d16/d29/d2d/d45/c8f 363 1 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:3/434: mkdir d1/d89 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:3/435: chown d1/d6 26108 1 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:3/436: write d1/d74/f55 [1422778,88056] 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:6/340: getdents d0/d1/db/d24/d53/d31 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:3/437: dwrite d1/d6/dd/f3b [8388608,4194304] 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:4/393: getdents d3/d4f/d56/d5f 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:4/394: readlink d3/lf 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:6/341: fsync d0/d1/db/d17/f1a 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:3/438: creat d1/d6/dd/f8a x:0 0 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:4/395: mkdir d3/d4f/d56/d5f/d88 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:6/342: chown d0/d1/db/d52/d6a/f83 318980 1 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:3/439: symlink d1/d1f/l8b 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:6/343: rename d0/d1/d82 to d0/d1/db/d52/d6a/d87 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:3/440: rename d1/d1f/c4b to d1/d3d/d47/c8c 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:6/344: rename d0/d4e/f86 to d0/d2d/f88 0 2026-03-09T19:27:27.964 INFO:tasks.workunit.client.0.vm07.stdout:3/441: symlink d1/d1f/l8d 0 2026-03-09T19:27:28.241 INFO:tasks.workunit.client.0.vm07.stdout:9/440: sync 2026-03-09T19:27:28.345 INFO:tasks.workunit.client.0.vm07.stdout:9/441: unlink d0/d6/d3a/f6d 0 2026-03-09T19:27:28.345 INFO:tasks.workunit.client.0.vm07.stdout:9/442: stat d0/d6/l31 0 2026-03-09T19:27:28.345 INFO:tasks.workunit.client.0.vm07.stdout:9/443: symlink d0/db/d29/d32/l9d 
0 2026-03-09T19:27:28.513 INFO:tasks.workunit.client.0.vm07.stdout:8/417: write d7/d9/d37/d45/f73 [681098,122582] 0 2026-03-09T19:27:28.519 INFO:tasks.workunit.client.1.vm08.stdout:4/606: dwrite da/f21 [4194304,4194304] 0 2026-03-09T19:27:28.525 INFO:tasks.workunit.client.0.vm07.stdout:6/345: read d0/d1/d28/f64 [712552,72041] 0 2026-03-09T19:27:28.525 INFO:tasks.workunit.client.0.vm07.stdout:8/418: dread - d7/d9/d37/d45/f7e zero size 2026-03-09T19:27:28.526 INFO:tasks.workunit.client.0.vm07.stdout:8/419: readlink d7/d9/l35 0 2026-03-09T19:27:28.527 INFO:tasks.workunit.client.1.vm08.stdout:4/607: mknod da/d10/d26/d38/cb6 0 2026-03-09T19:27:28.528 INFO:tasks.workunit.client.1.vm08.stdout:4/608: write da/d10/d16/d28/d46/d52/d6e/d73/fae [454127,22049] 0 2026-03-09T19:27:28.530 INFO:tasks.workunit.client.0.vm07.stdout:1/398: write d1/d3/d21/f55 [574092,129116] 0 2026-03-09T19:27:28.531 INFO:tasks.workunit.client.0.vm07.stdout:6/346: rename d0/d13/f1c to d0/d1/db/d24/d53/d31/f89 0 2026-03-09T19:27:28.531 INFO:tasks.workunit.client.0.vm07.stdout:1/399: chown d1/d11/d37/d5d/d50/f62 37253 1 2026-03-09T19:27:28.534 INFO:tasks.workunit.client.0.vm07.stdout:1/400: dread d1/db/d31/d56/f6a [0,4194304] 0 2026-03-09T19:27:28.535 INFO:tasks.workunit.client.0.vm07.stdout:8/420: symlink d7/d9/d37/d45/d56/d67/l9c 0 2026-03-09T19:27:28.535 INFO:tasks.workunit.client.1.vm08.stdout:5/582: dwrite d16/d1e/d3b/f3c [0,4194304] 0 2026-03-09T19:27:28.536 INFO:tasks.workunit.client.0.vm07.stdout:8/421: write d7/d9/d37/d45/d56/d62/f89 [672421,7028] 0 2026-03-09T19:27:28.536 INFO:tasks.workunit.client.0.vm07.stdout:8/422: stat d7/d50 0 2026-03-09T19:27:28.538 INFO:tasks.workunit.client.1.vm08.stdout:4/609: truncate da/d10/d16/d28/f34 1960058 0 2026-03-09T19:27:28.540 INFO:tasks.workunit.client.1.vm08.stdout:5/583: rmdir d16/d1e/d8c/d99/da8/d9a 39 2026-03-09T19:27:28.541 INFO:tasks.workunit.client.0.vm07.stdout:8/423: write d7/d9/d37/d45/d56/f7a [3185316,101823] 0 2026-03-09T19:27:28.543 
INFO:tasks.workunit.client.1.vm08.stdout:5/584: dread d16/d1e/d6e/f72 [0,4194304] 0 2026-03-09T19:27:28.556 INFO:tasks.workunit.client.0.vm07.stdout:1/401: rename d1/d11/d37/d3f/d45/d7d to d1/d11/d37/d3f/d45/d87/d88 0 2026-03-09T19:27:28.557 INFO:tasks.workunit.client.0.vm07.stdout:6/347: dread d0/d13/f26 [0,4194304] 0 2026-03-09T19:27:28.557 INFO:tasks.workunit.client.0.vm07.stdout:5/402: getdents d3/dd/d26/d2d/d60 0 2026-03-09T19:27:28.557 INFO:tasks.workunit.client.1.vm08.stdout:5/585: read - d16/d1e/d30/fb4 zero size 2026-03-09T19:27:28.557 INFO:tasks.workunit.client.1.vm08.stdout:5/586: write d16/d45/fb1 [1403989,58055] 0 2026-03-09T19:27:28.557 INFO:tasks.workunit.client.1.vm08.stdout:0/630: truncate dd/d22/f41 1949191 0 2026-03-09T19:27:28.557 INFO:tasks.workunit.client.1.vm08.stdout:1/761: dwrite d9/da/d95/fc0 [0,4194304] 0 2026-03-09T19:27:28.561 INFO:tasks.workunit.client.1.vm08.stdout:7/673: write d5/fb [4282025,50132] 0 2026-03-09T19:27:28.564 INFO:tasks.workunit.client.1.vm08.stdout:5/587: symlink d16/d1e/d8c/d99/lc0 0 2026-03-09T19:27:28.566 INFO:tasks.workunit.client.1.vm08.stdout:4/610: mknod da/d10/d26/d27/da6/cb7 0 2026-03-09T19:27:28.593 INFO:tasks.workunit.client.1.vm08.stdout:3/686: write d0/d52/fa8 [787710,127618] 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.1.vm08.stdout:1/762: unlink d9/da/d53/d67/d6c/fbc 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.1.vm08.stdout:3/687: creat d0/d6/de/d15/d96/fd9 x:0 0 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.1.vm08.stdout:0/631: truncate dd/f19 1281715 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.1.vm08.stdout:0/632: read - dd/d22/d27/d2e/db0/fa4 zero size 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.1.vm08.stdout:1/763: symlink d9/da/d12/d91/dc5/le6 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.1.vm08.stdout:5/588: creat d16/d1e/d9f/fc1 x:0 0 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.1.vm08.stdout:1/764: mkdir d9/d11/d7a/d89/de7 0 
2026-03-09T19:27:28.594 INFO:tasks.workunit.client.1.vm08.stdout:7/674: rename d5/d14/d2b/d4b/f96 to d5/d14/dae/d3a/d42/d6a/d8f/fe0 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.0.vm07.stdout:1/402: mkdir d1/d11/d37/d3f/d45/d87/d89 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.0.vm07.stdout:5/403: unlink d3/d1a/d28/d36/l3b 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.0.vm07.stdout:5/404: write d3/dd/f58 [4782713,74308] 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.0.vm07.stdout:0/373: dwrite d0/d6/d13/d17/d19/d57/f75 [0,4194304] 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.0.vm07.stdout:1/403: creat d1/d11/d37/d5d/f8a x:0 0 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.0.vm07.stdout:5/405: dread d3/d1a/f17 [0,4194304] 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.0.vm07.stdout:0/374: dread d0/d6/d13/d1c/d11/f2e [0,4194304] 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.0.vm07.stdout:5/406: dwrite d3/dd/f23 [0,4194304] 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.0.vm07.stdout:0/375: fsync d0/d6/f5c 0 2026-03-09T19:27:28.594 INFO:tasks.workunit.client.0.vm07.stdout:5/407: dread d3/f4e [0,4194304] 0 2026-03-09T19:27:28.597 INFO:tasks.workunit.client.0.vm07.stdout:5/408: dwrite d3/dd/d26/d3f/d47/d56/f75 [0,4194304] 0 2026-03-09T19:27:28.601 INFO:tasks.workunit.client.1.vm08.stdout:5/589: creat d16/d1e/d30/fc2 x:0 0 0 2026-03-09T19:27:28.603 INFO:tasks.workunit.client.1.vm08.stdout:7/675: creat d5/fe1 x:0 0 0 2026-03-09T19:27:28.603 INFO:tasks.workunit.client.0.vm07.stdout:0/376: link d0/d6/d13/d33/l48 d0/d6/d13/d17/d19/d57/d6a/l82 0 2026-03-09T19:27:28.606 INFO:tasks.workunit.client.1.vm08.stdout:7/676: link d5/d14/dae/d3a/d42/f9a d5/d14/d2b/d4b/fe2 0 2026-03-09T19:27:28.606 INFO:tasks.workunit.client.0.vm07.stdout:0/377: symlink d0/d6/d13/d1c/d11/d56/l83 0 2026-03-09T19:27:28.608 INFO:tasks.workunit.client.1.vm08.stdout:7/677: creat d5/d14/d27/d78/dc7/dce/fe3 x:0 0 0 2026-03-09T19:27:28.611 
INFO:tasks.workunit.client.1.vm08.stdout:5/590: dread d16/d8e/fa2 [0,4194304] 0 2026-03-09T19:27:28.613 INFO:tasks.workunit.client.1.vm08.stdout:5/591: link d16/d8e/laa d16/lc3 0 2026-03-09T19:27:28.614 INFO:tasks.workunit.client.1.vm08.stdout:5/592: symlink d16/d1e/d6e/lc4 0 2026-03-09T19:27:28.615 INFO:tasks.workunit.client.1.vm08.stdout:5/593: creat d16/d45/daf/fc5 x:0 0 0 2026-03-09T19:27:28.616 INFO:tasks.workunit.client.1.vm08.stdout:5/594: creat d16/d1e/db3/fc6 x:0 0 0 2026-03-09T19:27:28.618 INFO:tasks.workunit.client.1.vm08.stdout:5/595: rename d16/d1e/f57 to d16/d1e/d6e/fc7 0 2026-03-09T19:27:28.619 INFO:tasks.workunit.client.1.vm08.stdout:5/596: mkdir d16/d1e/db3/dc8 0 2026-03-09T19:27:28.623 INFO:tasks.workunit.client.1.vm08.stdout:5/597: dwrite d16/d45/fb1 [0,4194304] 0 2026-03-09T19:27:28.624 INFO:tasks.workunit.client.1.vm08.stdout:5/598: chown d16/d45/daf/fc5 7401193 1 2026-03-09T19:27:28.624 INFO:tasks.workunit.client.1.vm08.stdout:5/599: dread - d16/d1e/d30/fb9 zero size 2026-03-09T19:27:28.800 INFO:tasks.workunit.client.1.vm08.stdout:9/618: dwrite d0/d1b/d68/d7f/fa0 [0,4194304] 0 2026-03-09T19:27:28.812 INFO:tasks.workunit.client.0.vm07.stdout:2/439: write d3/dd/d16/f5f [513292,108218] 0 2026-03-09T19:27:28.812 INFO:tasks.workunit.client.1.vm08.stdout:9/619: dwrite d0/d1b/d97/f22 [4194304,4194304] 0 2026-03-09T19:27:28.812 INFO:tasks.workunit.client.1.vm08.stdout:8/586: dwrite de/f54 [0,4194304] 0 2026-03-09T19:27:28.812 INFO:tasks.workunit.client.1.vm08.stdout:8/587: chown de/d1d/d69/f84 486931089 1 2026-03-09T19:27:28.813 INFO:tasks.workunit.client.0.vm07.stdout:4/396: dwrite d3/d11/d2b/d37/f25 [0,4194304] 0 2026-03-09T19:27:28.814 INFO:tasks.workunit.client.1.vm08.stdout:9/620: read d0/d2/d14/f4d [792473,27957] 0 2026-03-09T19:27:28.824 INFO:tasks.workunit.client.0.vm07.stdout:3/442: truncate d1/d6/f9 4300907 0 2026-03-09T19:27:28.837 INFO:tasks.workunit.client.1.vm08.stdout:8/588: fsync de/d1d/d21/d73/fa6 0 2026-03-09T19:27:28.837 
INFO:tasks.workunit.client.1.vm08.stdout:8/589: dread - de/d25/d33/fb6 zero size 2026-03-09T19:27:28.837 INFO:tasks.workunit.client.0.vm07.stdout:3/443: chown d1/d1f/d5c/f75 6188 1 2026-03-09T19:27:28.837 INFO:tasks.workunit.client.0.vm07.stdout:3/444: read - d1/d74/f77 zero size 2026-03-09T19:27:28.837 INFO:tasks.workunit.client.0.vm07.stdout:2/440: mknod d3/dd/d16/c94 0 2026-03-09T19:27:28.837 INFO:tasks.workunit.client.0.vm07.stdout:3/445: dread d1/d1f/d16/f30 [0,4194304] 0 2026-03-09T19:27:28.837 INFO:tasks.workunit.client.0.vm07.stdout:3/446: dread - d1/d6/dd/f67 zero size 2026-03-09T19:27:28.855 INFO:tasks.workunit.client.0.vm07.stdout:2/441: dread d3/dd/f73 [0,4194304] 0 2026-03-09T19:27:28.874 INFO:tasks.workunit.client.0.vm07.stdout:2/442: dread d3/f22 [0,4194304] 0 2026-03-09T19:27:28.876 INFO:tasks.workunit.client.0.vm07.stdout:2/443: mknod d3/dd/d16/d29/d2d/d45/d3b/d44/c95 0 2026-03-09T19:27:28.878 INFO:tasks.workunit.client.0.vm07.stdout:2/444: mkdir d3/dd/d16/d29/d2d/d45/d3b/d44/d96 0 2026-03-09T19:27:28.879 INFO:tasks.workunit.client.0.vm07.stdout:2/445: dread - d3/dd/d16/d29/f91 zero size 2026-03-09T19:27:28.894 INFO:tasks.workunit.client.0.vm07.stdout:2/446: dread d3/f5 [0,4194304] 0 2026-03-09T19:27:28.896 INFO:tasks.workunit.client.0.vm07.stdout:2/447: mkdir d3/dd/d16/d29/d2d/d45/d3b/d44/d97 0 2026-03-09T19:27:28.900 INFO:tasks.workunit.client.0.vm07.stdout:2/448: dwrite d3/dd/d16/d30/f3a [0,4194304] 0 2026-03-09T19:27:28.904 INFO:tasks.workunit.client.0.vm07.stdout:2/449: mkdir d3/dd/d16/d29/d2d/d45/d8b/d98 0 2026-03-09T19:27:28.949 INFO:tasks.workunit.client.0.vm07.stdout:4/397: sync 2026-03-09T19:27:28.949 INFO:tasks.workunit.client.0.vm07.stdout:4/398: write d3/fc [1592031,11931] 0 2026-03-09T19:27:28.952 INFO:tasks.workunit.client.0.vm07.stdout:4/399: getdents d3/d4f 0 2026-03-09T19:27:28.953 INFO:tasks.workunit.client.0.vm07.stdout:4/400: mknod d3/c89 0 2026-03-09T19:27:28.955 INFO:tasks.workunit.client.0.vm07.stdout:4/401: rename 
d3/d11/f12 to d3/d11/d2b/d38/f8a 0 2026-03-09T19:27:28.956 INFO:tasks.workunit.client.0.vm07.stdout:4/402: dread d3/d11/d16/d2f/d22/f7a [0,4194304] 0 2026-03-09T19:27:28.957 INFO:tasks.workunit.client.0.vm07.stdout:4/403: unlink d3/f8 0 2026-03-09T19:27:28.973 INFO:tasks.workunit.client.0.vm07.stdout:3/447: sync 2026-03-09T19:27:29.015 INFO:tasks.workunit.client.0.vm07.stdout:3/448: sync 2026-03-09T19:27:29.016 INFO:tasks.workunit.client.0.vm07.stdout:3/449: stat d1/d3d/d47/l50 0 2026-03-09T19:27:29.017 INFO:tasks.workunit.client.0.vm07.stdout:3/450: fsync d1/d6/fb 0 2026-03-09T19:27:29.017 INFO:tasks.workunit.client.0.vm07.stdout:3/451: write d1/d74/f77 [614114,31413] 0 2026-03-09T19:27:29.020 INFO:tasks.workunit.client.0.vm07.stdout:3/452: unlink d1/d6/d45/f88 0 2026-03-09T19:27:29.021 INFO:tasks.workunit.client.0.vm07.stdout:3/453: getdents d1/d1f/d16 0 2026-03-09T19:27:29.024 INFO:tasks.workunit.client.0.vm07.stdout:3/454: dread d1/d74/f6e [0,4194304] 0 2026-03-09T19:27:29.056 INFO:tasks.workunit.client.0.vm07.stdout:8/424: dread d7/d16/f71 [0,4194304] 0 2026-03-09T19:27:29.060 INFO:tasks.workunit.client.0.vm07.stdout:8/425: dwrite d7/f40 [0,4194304] 0 2026-03-09T19:27:29.063 INFO:tasks.workunit.client.0.vm07.stdout:8/426: dread d7/f40 [0,4194304] 0 2026-03-09T19:27:29.065 INFO:tasks.workunit.client.0.vm07.stdout:8/427: getdents d7/d50 0 2026-03-09T19:27:29.066 INFO:tasks.workunit.client.0.vm07.stdout:8/428: write d7/f19 [3116261,5236] 0 2026-03-09T19:27:29.067 INFO:tasks.workunit.client.1.vm08.stdout:6/638: sync 2026-03-09T19:27:29.068 INFO:tasks.workunit.client.0.vm07.stdout:5/409: fsync d3/d1a/f17 0 2026-03-09T19:27:29.070 INFO:tasks.workunit.client.0.vm07.stdout:8/429: write f5 [3330072,17586] 0 2026-03-09T19:27:29.073 INFO:tasks.workunit.client.0.vm07.stdout:8/430: creat d7/f9d x:0 0 0 2026-03-09T19:27:29.077 INFO:tasks.workunit.client.1.vm08.stdout:2/562: sync 2026-03-09T19:27:29.077 INFO:tasks.workunit.client.0.vm07.stdout:8/431: creat d7/d9/f9e x:0 0 0 
2026-03-09T19:27:29.078 INFO:tasks.workunit.client.1.vm08.stdout:6/639: getdents d3/db/d43 0 2026-03-09T19:27:29.080 INFO:tasks.workunit.client.1.vm08.stdout:6/640: rmdir d3/d34/dce 39 2026-03-09T19:27:29.085 INFO:tasks.workunit.client.1.vm08.stdout:6/641: dread d3/d15/f2b [0,4194304] 0 2026-03-09T19:27:29.087 INFO:tasks.workunit.client.1.vm08.stdout:6/642: creat d3/d68/fec x:0 0 0 2026-03-09T19:27:29.101 INFO:tasks.workunit.client.1.vm08.stdout:6/643: chown d3/d68/d7e/fbb 2581907 1 2026-03-09T19:27:29.101 INFO:tasks.workunit.client.1.vm08.stdout:6/644: symlink d3/d94/led 0 2026-03-09T19:27:29.101 INFO:tasks.workunit.client.1.vm08.stdout:6/645: unlink d3/f32 0 2026-03-09T19:27:29.102 INFO:tasks.workunit.client.1.vm08.stdout:6/646: rename d3/d15/dc2/cd9 to d3/d34/da9/da4/cee 0 2026-03-09T19:27:29.102 INFO:tasks.workunit.client.1.vm08.stdout:6/647: dwrite d3/d34/dce/fdd [0,4194304] 0 2026-03-09T19:27:29.102 INFO:tasks.workunit.client.1.vm08.stdout:6/648: rename d3/d15/d8a to d3/d94/def 0 2026-03-09T19:27:29.102 INFO:tasks.workunit.client.1.vm08.stdout:6/649: truncate d3/d34/d6f/f2f 768523 0 2026-03-09T19:27:29.105 INFO:tasks.workunit.client.1.vm08.stdout:6/650: unlink d3/d34/d3b/f67 0 2026-03-09T19:27:29.116 INFO:tasks.workunit.client.1.vm08.stdout:6/651: dread d3/db/f44 [0,4194304] 0 2026-03-09T19:27:29.127 INFO:tasks.workunit.client.1.vm08.stdout:6/652: unlink d3/d94/fb2 0 2026-03-09T19:27:29.127 INFO:tasks.workunit.client.1.vm08.stdout:6/653: read - d3/db/d43/d69/da0/fb4 zero size 2026-03-09T19:27:29.127 INFO:tasks.workunit.client.1.vm08.stdout:6/654: symlink d3/d94/lf0 0 2026-03-09T19:27:29.127 INFO:tasks.workunit.client.1.vm08.stdout:6/655: readlink d3/d68/l9e 0 2026-03-09T19:27:29.139 INFO:tasks.workunit.client.1.vm08.stdout:6/656: read d3/db/f8f [1962876,98407] 0 2026-03-09T19:27:29.144 INFO:tasks.workunit.client.1.vm08.stdout:6/657: creat d3/ff1 x:0 0 0 2026-03-09T19:27:29.185 INFO:tasks.workunit.client.0.vm07.stdout:9/444: dwrite d0/db/d29/d32/d5c/d69/f83 
[0,4194304] 0 2026-03-09T19:27:29.197 INFO:tasks.workunit.client.0.vm07.stdout:7/423: truncate d0/d4/d5/dd/f1f 260553 0 2026-03-09T19:27:29.198 INFO:tasks.workunit.client.0.vm07.stdout:9/445: mkdir d0/db/d9e 0 2026-03-09T19:27:29.202 INFO:tasks.workunit.client.0.vm07.stdout:6/348: dread d0/d1/db/d24/d53/d31/f89 [0,4194304] 0 2026-03-09T19:27:29.205 INFO:tasks.workunit.client.0.vm07.stdout:6/349: dread d0/d1/db/d17/d4c/f60 [0,4194304] 0 2026-03-09T19:27:29.209 INFO:tasks.workunit.client.0.vm07.stdout:9/446: creat d0/d6/d57/d8f/f9f x:0 0 0 2026-03-09T19:27:29.211 INFO:tasks.workunit.client.0.vm07.stdout:6/350: unlink d0/d1/db/d1d/f22 0 2026-03-09T19:27:29.213 INFO:tasks.workunit.client.0.vm07.stdout:9/447: fdatasync d0/db/d29/d2c/f4a 0 2026-03-09T19:27:29.220 INFO:tasks.workunit.client.0.vm07.stdout:6/351: creat d0/f8a x:0 0 0 2026-03-09T19:27:29.232 INFO:tasks.workunit.client.1.vm08.stdout:3/688: write d0/d6/de/d6e/d51/f70 [4136290,80588] 0 2026-03-09T19:27:29.233 INFO:tasks.workunit.client.1.vm08.stdout:0/633: write dd/d22/d27/f3d [571511,62104] 0 2026-03-09T19:27:29.233 INFO:tasks.workunit.client.1.vm08.stdout:3/689: chown d0/d8/c43 244262 1 2026-03-09T19:27:29.233 INFO:tasks.workunit.client.0.vm07.stdout:6/352: creat d0/f8b x:0 0 0 2026-03-09T19:27:29.233 INFO:tasks.workunit.client.0.vm07.stdout:6/353: mknod d0/d13/d1e/c8c 0 2026-03-09T19:27:29.233 INFO:tasks.workunit.client.0.vm07.stdout:1/404: write d1/f6 [1044387,96468] 0 2026-03-09T19:27:29.233 INFO:tasks.workunit.client.0.vm07.stdout:6/354: creat d0/d13/d1e/f8d x:0 0 0 2026-03-09T19:27:29.233 INFO:tasks.workunit.client.0.vm07.stdout:1/405: creat d1/d11/d37/d5d/d50/f8b x:0 0 0 2026-03-09T19:27:29.235 INFO:tasks.workunit.client.1.vm08.stdout:1/765: dwrite d9/da/d2d/d62/fcc [0,4194304] 0 2026-03-09T19:27:29.245 INFO:tasks.workunit.client.1.vm08.stdout:0/634: dread dd/d22/d27/d2e/d37/f40 [0,4194304] 0 2026-03-09T19:27:29.245 INFO:tasks.workunit.client.0.vm07.stdout:6/355: rename d0/f9 to d0/d1/db/d52/d6a/d87/f8e 
0 2026-03-09T19:27:29.245 INFO:tasks.workunit.client.0.vm07.stdout:1/406: creat d1/db/d31/d4f/f8c x:0 0 0 2026-03-09T19:27:29.245 INFO:tasks.workunit.client.0.vm07.stdout:1/407: symlink d1/d11/d37/l8d 0 2026-03-09T19:27:29.245 INFO:tasks.workunit.client.0.vm07.stdout:6/356: dwrite d0/d1/db/f15 [4194304,4194304] 0 2026-03-09T19:27:29.250 INFO:tasks.workunit.client.0.vm07.stdout:1/408: truncate d1/d11/d37/f2c 557084 0 2026-03-09T19:27:29.250 INFO:tasks.workunit.client.0.vm07.stdout:6/357: truncate d0/d1/db/d17/d4c/f60 1375229 0 2026-03-09T19:27:29.255 INFO:tasks.workunit.client.0.vm07.stdout:6/358: rmdir d0/d2d 39 2026-03-09T19:27:29.256 INFO:tasks.workunit.client.0.vm07.stdout:1/409: creat d1/d11/d37/d3f/f8e x:0 0 0 2026-03-09T19:27:29.256 INFO:tasks.workunit.client.0.vm07.stdout:6/359: truncate d0/d1/db/f14 2138951 0 2026-03-09T19:27:29.257 INFO:tasks.workunit.client.0.vm07.stdout:6/360: readlink d0/d1/d28/l73 0 2026-03-09T19:27:29.257 INFO:tasks.workunit.client.0.vm07.stdout:1/410: symlink d1/db/d31/d56/l8f 0 2026-03-09T19:27:29.258 INFO:tasks.workunit.client.0.vm07.stdout:6/361: creat d0/d1/db/d17/d4c/f8f x:0 0 0 2026-03-09T19:27:29.259 INFO:tasks.workunit.client.0.vm07.stdout:1/411: unlink d1/d3/l3d 0 2026-03-09T19:27:29.260 INFO:tasks.workunit.client.0.vm07.stdout:6/362: dread d0/d13/f18 [0,4194304] 0 2026-03-09T19:27:29.260 INFO:tasks.workunit.client.0.vm07.stdout:6/363: read d0/d1/d28/f64 [801764,58551] 0 2026-03-09T19:27:29.261 INFO:tasks.workunit.client.0.vm07.stdout:1/412: write d1/d11/d37/d3f/f82 [773958,61569] 0 2026-03-09T19:27:29.267 INFO:tasks.workunit.client.0.vm07.stdout:1/413: mknod d1/d11/d37/d3f/c90 0 2026-03-09T19:27:29.271 INFO:tasks.workunit.client.0.vm07.stdout:1/414: readlink d1/d3/l25 0 2026-03-09T19:27:29.271 INFO:tasks.workunit.client.0.vm07.stdout:1/415: chown d1/d3e/d5c 6108 1 2026-03-09T19:27:29.271 INFO:tasks.workunit.client.0.vm07.stdout:1/416: write d1/db/d31/d4f/f79 [939712,23552] 0 2026-03-09T19:27:29.274 
INFO:tasks.workunit.client.0.vm07.stdout:1/417: rename d1/d3/d52 to d1/d91 0 2026-03-09T19:27:29.275 INFO:tasks.workunit.client.0.vm07.stdout:1/418: mknod d1/d91/c92 0 2026-03-09T19:27:29.276 INFO:tasks.workunit.client.0.vm07.stdout:1/419: creat d1/db/d31/d4f/f93 x:0 0 0 2026-03-09T19:27:29.276 INFO:tasks.workunit.client.0.vm07.stdout:1/420: stat d1/d3/d21/f5f 0 2026-03-09T19:27:29.277 INFO:tasks.workunit.client.0.vm07.stdout:1/421: symlink d1/d11/d37/d5a/l94 0 2026-03-09T19:27:29.285 INFO:tasks.workunit.client.0.vm07.stdout:1/422: dwrite d1/d3/f23 [0,4194304] 0 2026-03-09T19:27:29.285 INFO:tasks.workunit.client.0.vm07.stdout:1/423: write d1/db/d31/d4f/f93 [609564,122420] 0 2026-03-09T19:27:29.289 INFO:tasks.workunit.client.0.vm07.stdout:1/424: dread d1/f76 [0,4194304] 0 2026-03-09T19:27:29.291 INFO:tasks.workunit.client.0.vm07.stdout:1/425: unlink d1/d3/f4e 0 2026-03-09T19:27:29.292 INFO:tasks.workunit.client.0.vm07.stdout:1/426: symlink d1/d91/l95 0 2026-03-09T19:27:29.294 INFO:tasks.workunit.client.0.vm07.stdout:1/427: creat d1/f96 x:0 0 0 2026-03-09T19:27:29.298 INFO:tasks.workunit.client.0.vm07.stdout:1/428: dread d1/db/d31/d56/f71 [0,4194304] 0 2026-03-09T19:27:29.308 INFO:tasks.workunit.client.0.vm07.stdout:1/429: write d1/d11/d37/d5d/d50/f62 [2176907,17916] 0 2026-03-09T19:27:29.311 INFO:tasks.workunit.client.0.vm07.stdout:1/430: dread d1/d3/f4 [0,4194304] 0 2026-03-09T19:27:29.314 INFO:tasks.workunit.client.0.vm07.stdout:1/431: link d1/db/d31/f78 d1/db/d31/d56/f97 0 2026-03-09T19:27:29.317 INFO:tasks.workunit.client.0.vm07.stdout:1/432: dwrite d1/d11/d37/d5d/f59 [0,4194304] 0 2026-03-09T19:27:29.319 INFO:tasks.workunit.client.0.vm07.stdout:1/433: chown d1/f96 22 1 2026-03-09T19:27:29.321 INFO:tasks.workunit.client.0.vm07.stdout:1/434: write d1/d11/d37/d5d/f59 [266006,17306] 0 2026-03-09T19:27:29.328 INFO:tasks.workunit.client.0.vm07.stdout:1/435: creat d1/d11/d37/d3f/d45/f98 x:0 0 0 2026-03-09T19:27:29.328 INFO:tasks.workunit.client.0.vm07.stdout:1/436: 
chown d1/f51 1 1 2026-03-09T19:27:29.329 INFO:tasks.workunit.client.0.vm07.stdout:1/437: readlink d1/d11/d37/d3f/d45/l1e 0 2026-03-09T19:27:29.330 INFO:tasks.workunit.client.0.vm07.stdout:1/438: mkdir d1/d11/d37/d3f/d6e/d99 0 2026-03-09T19:27:29.333 INFO:tasks.workunit.client.0.vm07.stdout:1/439: rename d1/d11/d37/d3f/d6e/d99 to d1/d11/d37/d5a/d9a 0 2026-03-09T19:27:29.337 INFO:tasks.workunit.client.0.vm07.stdout:1/440: dwrite d1/db/d31/d4f/f77 [0,4194304] 0 2026-03-09T19:27:29.339 INFO:tasks.workunit.client.0.vm07.stdout:1/441: creat d1/db/f9b x:0 0 0 2026-03-09T19:27:29.340 INFO:tasks.workunit.client.0.vm07.stdout:1/442: chown d1/db/l67 90881098 1 2026-03-09T19:27:29.341 INFO:tasks.workunit.client.0.vm07.stdout:1/443: fsync d1/d3/f4 0 2026-03-09T19:27:29.344 INFO:tasks.workunit.client.0.vm07.stdout:1/444: dwrite d1/db/f1f [0,4194304] 0 2026-03-09T19:27:29.359 INFO:tasks.workunit.client.0.vm07.stdout:0/378: write d0/d6/d13/d17/d19/f53 [792050,89474] 0 2026-03-09T19:27:29.360 INFO:tasks.workunit.client.1.vm08.stdout:4/611: dread da/d10/d26/d27/f3b [0,4194304] 0 2026-03-09T19:27:29.361 INFO:tasks.workunit.client.1.vm08.stdout:4/612: read - da/d10/d16/d28/d2f/d4f/d64/d81/fb2 zero size 2026-03-09T19:27:29.362 INFO:tasks.workunit.client.0.vm07.stdout:1/445: getdents d1/d11/d37/d3f/d45/d87/d88 0 2026-03-09T19:27:29.363 INFO:tasks.workunit.client.0.vm07.stdout:0/379: write d0/d6/d13/d1c/d50/f73 [20175,29623] 0 2026-03-09T19:27:29.364 INFO:tasks.workunit.client.0.vm07.stdout:0/380: unlink d0/l32 0 2026-03-09T19:27:29.365 INFO:tasks.workunit.client.1.vm08.stdout:4/613: mknod da/d10/d16/d28/d46/d52/d6e/d6d/cb8 0 2026-03-09T19:27:29.373 INFO:tasks.workunit.client.1.vm08.stdout:4/614: creat da/d10/d16/d28/d2f/d4f/d64/d84/d8a/fb9 x:0 0 0 2026-03-09T19:27:29.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:29 vm07.local ceph-mon[48545]: pgmap v168: 65 pgs: 65 active+clean; 2.1 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 30 MiB/s rd, 109 MiB/s wr, 214 op/s 
2026-03-09T19:27:29.499 INFO:tasks.workunit.client.0.vm07.stdout:3/455: dread d1/d1f/d16/f1e [0,4194304] 0
2026-03-09T19:27:29.499 INFO:tasks.workunit.client.1.vm08.stdout:0/635: dread dd/d22/d27/d6c/f7f [0,4194304] 0
2026-03-09T19:27:29.502 INFO:tasks.workunit.client.1.vm08.stdout:0/636: unlink dd/d22/d63/f94 0
2026-03-09T19:27:29.503 INFO:tasks.workunit.client.1.vm08.stdout:0/637: read - dd/d22/d63/d6e/d72/fbd zero size
2026-03-09T19:27:29.504 INFO:tasks.workunit.client.0.vm07.stdout:3/456: dwrite d1/d6/dd/f33 [0,4194304] 0
2026-03-09T19:27:29.511 INFO:tasks.workunit.client.0.vm07.stdout:3/457: mkdir d1/d6/dd/d51/d8e 0
2026-03-09T19:27:29.520 INFO:tasks.workunit.client.0.vm07.stdout:3/458: truncate d1/d6/dd/f67 324599 0
2026-03-09T19:27:29.537 INFO:tasks.workunit.client.1.vm08.stdout:7/678: write d5/d14/dae/d3a/f64 [904085,63568] 0
2026-03-09T19:27:29.572 INFO:tasks.workunit.client.1.vm08.stdout:7/679: getdents d5/d14/d27 0
2026-03-09T19:27:29.582 INFO:tasks.workunit.client.1.vm08.stdout:7/680: rename d5/d14/dae/d3a/d42/d85/da0/fa4 to d5/d14/dae/d3a/d42/d85/da0/fe4 0
2026-03-09T19:27:29.597 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:29 vm08.local ceph-mon[57794]: pgmap v168: 65 pgs: 65 active+clean; 2.1 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 30 MiB/s rd, 109 MiB/s wr, 214 op/s
2026-03-09T19:27:29.598 INFO:tasks.workunit.client.1.vm08.stdout:8/590: write de/d25/d31/d82/d6d/d99/da5/db3/f9e [354141,21660] 0
2026-03-09T19:27:29.598 INFO:tasks.workunit.client.0.vm07.stdout:2/450: write d3/dd/d16/d29/f58 [1042213,117774] 0
2026-03-09T19:27:29.598 INFO:tasks.workunit.client.0.vm07.stdout:4/404: write d3/d4f/f5b [833513,27463] 0
2026-03-09T19:27:29.600 INFO:tasks.workunit.client.0.vm07.stdout:2/451: truncate d3/dd/f24 3035274 0
2026-03-09T19:27:29.606 INFO:tasks.workunit.client.0.vm07.stdout:2/452: dwrite d3/dd/d16/d29/f91 [0,4194304] 0
2026-03-09T19:27:29.610 INFO:tasks.workunit.client.1.vm08.stdout:9/621: dwrite d0/d1b/d97/d48/d5d/fb3 [0,4194304] 0
2026-03-09T19:27:29.617 INFO:tasks.workunit.client.0.vm07.stdout:5/410: write d3/f4e [990850,98105] 0
2026-03-09T19:27:29.617 INFO:tasks.workunit.client.0.vm07.stdout:8/432: truncate d7/d9/fd 6597667 0
2026-03-09T19:27:29.628 INFO:tasks.workunit.client.1.vm08.stdout:7/681: dread d5/d14/dae/d1c/f5a [0,4194304] 0
2026-03-09T19:27:29.629 INFO:tasks.workunit.client.1.vm08.stdout:8/591: fdatasync de/d1d/d2e/f56 0
2026-03-09T19:27:29.630 INFO:tasks.workunit.client.1.vm08.stdout:2/563: dwrite d3/d9/d79/d46/d8c/f90 [0,4194304] 0
2026-03-09T19:27:29.657 INFO:tasks.workunit.client.0.vm07.stdout:2/453: symlink d3/d49/l99 0
2026-03-09T19:27:29.657 INFO:tasks.workunit.client.0.vm07.stdout:2/454: chown d3/dd/d16/f5f 619542504 1
2026-03-09T19:27:29.658 INFO:tasks.workunit.client.0.vm07.stdout:2/455: chown d3/dd/d16/d29/d2d/d45/d3b/l77 56 1
2026-03-09T19:27:29.669 INFO:tasks.workunit.client.0.vm07.stdout:4/405: creat d3/d11/d16/d2f/d22/d70/f8b x:0 0 0
2026-03-09T19:27:29.686 INFO:tasks.workunit.client.1.vm08.stdout:6/658: dwrite d3/db/fb0 [0,4194304] 0
2026-03-09T19:27:29.687 INFO:tasks.workunit.client.0.vm07.stdout:7/424: dwrite d0/d4/d5/d8/f35 [0,4194304] 0
2026-03-09T19:27:29.687 INFO:tasks.workunit.client.0.vm07.stdout:2/456: rename d3/dd/d16/d29/d3c/d4c/f87 to d3/dd/f9a 0
2026-03-09T19:27:29.687 INFO:tasks.workunit.client.0.vm07.stdout:8/433: mkdir d7/d1d/d83/d9f 0
2026-03-09T19:27:29.700 INFO:tasks.workunit.client.0.vm07.stdout:9/448: write d0/db/f6c [833111,114400] 0
2026-03-09T19:27:29.709 INFO:tasks.workunit.client.1.vm08.stdout:3/690: write d0/d6/de/d6e/f81 [1237945,87798] 0
2026-03-09T19:27:29.709 INFO:tasks.workunit.client.0.vm07.stdout:8/434: fdatasync d7/d50/f84 0
2026-03-09T19:27:29.709 INFO:tasks.workunit.client.0.vm07.stdout:8/435: truncate d7/d9/d10/d44/d9a/f8a 875531 0
2026-03-09T19:27:29.712 INFO:tasks.workunit.client.0.vm07.stdout:8/436: dread d7/d9/d10/d44/d9a/f8a [0,4194304] 0
2026-03-09T19:27:29.740 INFO:tasks.workunit.client.1.vm08.stdout:1/766: write d9/da/d12/d39/fa7 [1536180,100256] 0
2026-03-09T19:27:29.740 INFO:tasks.workunit.client.1.vm08.stdout:1/767: readlink d9/da/dc/l28 0
2026-03-09T19:27:29.741 INFO:tasks.workunit.client.0.vm07.stdout:5/411: dread d3/dd/f52 [0,4194304] 0
2026-03-09T19:27:29.741 INFO:tasks.workunit.client.0.vm07.stdout:4/406: rename d3/d11/d2b/d38/f4a to d3/d11/d16/d2f/d22/d86/f8c 0
2026-03-09T19:27:29.741 INFO:tasks.workunit.client.0.vm07.stdout:6/364: write d0/d1/db/d17/f1a [3826257,59490] 0
2026-03-09T19:27:29.741 INFO:tasks.workunit.client.0.vm07.stdout:8/437: readlink d7/d9/d57/l6a 0
2026-03-09T19:27:29.741 INFO:tasks.workunit.client.0.vm07.stdout:7/425: creat d0/d4/d5/d26/f91 x:0 0
2026-03-09T19:27:29.741 INFO:tasks.workunit.client.0.vm07.stdout:2/457: getdents d3/dd/d16/d29/d2d/d45/d3b/d44 0
2026-03-09T19:27:29.742 INFO:tasks.workunit.client.0.vm07.stdout:9/449: truncate d0/d6/d3a/f89 700663 0
2026-03-09T19:27:29.748 INFO:tasks.workunit.client.0.vm07.stdout:8/438: chown d7/d9/d37/d45/f76 51850 1
2026-03-09T19:27:29.748 INFO:tasks.workunit.client.0.vm07.stdout:8/439: chown d7/d16/d1e 3251369 1
2026-03-09T19:27:29.752 INFO:tasks.workunit.client.1.vm08.stdout:8/592: rename de/d47/fb9 to de/d91/dc8/fce 0
2026-03-09T19:27:29.755 INFO:tasks.workunit.client.1.vm08.stdout:2/564: creat d3/d9/d79/fc2 x:0 0 0
2026-03-09T19:27:29.757 INFO:tasks.workunit.client.0.vm07.stdout:2/458: unlink f2 0
2026-03-09T19:27:29.763 INFO:tasks.workunit.client.0.vm07.stdout:9/450: unlink d0/d6/d57/d5d/l70 0
2026-03-09T19:27:29.763 INFO:tasks.workunit.client.1.vm08.stdout:4/615: write da/d10/d16/d28/d46/d52/d6e/d2c/f4a [113573,58679] 0
2026-03-09T19:27:29.766 INFO:tasks.workunit.client.0.vm07.stdout:0/381: dwrite d0/d6/d13/d1c/d11/d56/f7f [4194304,4194304] 0
2026-03-09T19:27:29.773 INFO:tasks.workunit.client.0.vm07.stdout:0/382: write d0/d6/f5c [2641074,106058] 0
2026-03-09T19:27:29.780 INFO:tasks.workunit.client.0.vm07.stdout:8/440: rename d7/d9/d37/d45/d56/d67/f70 to d7/d50/fa0 0
2026-03-09T19:27:29.781 INFO:tasks.workunit.client.0.vm07.stdout:5/412: getdents d3/dd/d26/d2d/d60 0
2026-03-09T19:27:29.782 INFO:tasks.workunit.client.0.vm07.stdout:5/413: dread - d3/d1a/d28/d40/f46 zero size
2026-03-09T19:27:29.783 INFO:tasks.workunit.client.0.vm07.stdout:3/459: write d1/d6/d71/f69 [538287,10076] 0
2026-03-09T19:27:29.784 INFO:tasks.workunit.client.0.vm07.stdout:7/426: mkdir d0/d52/d54/d5a/d87/d92 0
2026-03-09T19:27:29.785 INFO:tasks.workunit.client.0.vm07.stdout:4/407: link d3/d11/f6c d3/f8d 0
2026-03-09T19:27:29.788 INFO:tasks.workunit.client.1.vm08.stdout:5/600: write d16/d1e/d6e/fc7 [3549352,60969] 0
2026-03-09T19:27:29.788 INFO:tasks.workunit.client.1.vm08.stdout:1/768: rename d9/da/d95/cb5 to d9/da/d53/d67/d6c/d76/ce8 0
2026-03-09T19:27:29.790 INFO:tasks.workunit.client.0.vm07.stdout:9/451: dread - d0/db/d29/f67 zero size
2026-03-09T19:27:29.790 INFO:tasks.workunit.client.1.vm08.stdout:0/638: dwrite dd/d22/d24/d49/d92/fa7 [8388608,4194304] 0
2026-03-09T19:27:29.806 INFO:tasks.workunit.client.1.vm08.stdout:8/593: symlink de/d91/dc8/lcf 0
2026-03-09T19:27:29.810 INFO:tasks.workunit.client.0.vm07.stdout:2/459: rename d3/dd/d16/d29/d2d/d45/d3b/d53/c76 to d3/dd/d16/d29/d3c/d5a/d7a/d74/c9b 0
2026-03-09T19:27:29.818 INFO:tasks.workunit.client.1.vm08.stdout:1/769: mkdir d9/da/d95/dcd/de9 0
2026-03-09T19:27:29.834 INFO:tasks.workunit.client.1.vm08.stdout:6/659: creat d3/d34/d5c/de8/ff2 x:0 0 0
2026-03-09T19:27:29.835 INFO:tasks.workunit.client.0.vm07.stdout:3/460: mknod d1/d1f/d16/d28/d7c/c8f 0
2026-03-09T19:27:29.836 INFO:tasks.workunit.client.0.vm07.stdout:4/408: rename d3/d11/d2b/d37/f25 to d3/d11/d51/f8e 0
2026-03-09T19:27:29.836 INFO:tasks.workunit.client.0.vm07.stdout:2/460: creat d3/dd/d16/d29/d2d/d45/d3b/d53/f9c x:0 0 0
2026-03-09T19:27:29.836 INFO:tasks.workunit.client.1.vm08.stdout:1/770: chown d9/da/d95/dcd/de9 1632 1
2026-03-09T19:27:29.836 INFO:tasks.workunit.client.1.vm08.stdout:2/565: symlink d3/d9/lc3 0
2026-03-09T19:27:29.836 INFO:tasks.workunit.client.1.vm08.stdout:8/594: rename de/d25/d87/fbc to de/d25/d31/d82/d6d/d99/fd0 0
2026-03-09T19:27:29.836 INFO:tasks.workunit.client.1.vm08.stdout:9/622: link d0/d2/d8/l25 d0/ld2 0
2026-03-09T19:27:29.841 INFO:tasks.workunit.client.1.vm08.stdout:1/771: truncate d9/da/f8e 4175438 0
2026-03-09T19:27:29.845 INFO:tasks.workunit.client.0.vm07.stdout:7/427: dread d0/d4/d5/dd/f47 [0,4194304] 0
2026-03-09T19:27:29.846 INFO:tasks.workunit.client.0.vm07.stdout:5/414: fsync d3/d1a/d28/d48/f4f 0
2026-03-09T19:27:29.849 INFO:tasks.workunit.client.0.vm07.stdout:3/461: symlink d1/d6/dd/l90 0
2026-03-09T19:27:29.851 INFO:tasks.workunit.client.1.vm08.stdout:2/566: truncate d3/d9/d26/f35 3068971 0
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.0.vm07.stdout:9/452: creat d0/d6/d3a/d94/fa0 x:0 0 0
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.0.vm07.stdout:2/461: creat d3/dd/d16/d29/d3c/d4c/f9d x:0 0 0
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.0.vm07.stdout:8/441: link d7/d16/f69 d7/d9/d10/d44/d9a/fa1 0
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.0.vm07.stdout:7/428: dread - d0/d4/d5/d26/d3c/f60 zero size
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.0.vm07.stdout:5/415: symlink d3/d1a/d28/d40/l84 0
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.1.vm08.stdout:8/595: creat de/d91/fd1 x:0 0 0
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.1.vm08.stdout:6/660: symlink d3/d34/dce/de3/lf3 0
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.1.vm08.stdout:8/596: dwrite de/d1d/d69/f9a [4194304,4194304] 0
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.1.vm08.stdout:8/597: chown de/d1d/d21/d73/l79 0 1
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.1.vm08.stdout:6/661: rmdir d3/d34/d5c/da2 39
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.1.vm08.stdout:1/772: rename d9/d11/d7a/d89/d8d/daa/fd0 to d9/da/d17/d60/fea 0
2026-03-09T19:27:29.880 INFO:tasks.workunit.client.1.vm08.stdout:8/598: symlink de/d25/d31/ld2 0
2026-03-09T19:27:29.889 INFO:tasks.workunit.client.0.vm07.stdout:8/442: rmdir d7/d30/d32 39
2026-03-09T19:27:29.897 INFO:tasks.workunit.client.1.vm08.stdout:8/599: creat de/d1d/d2e/d5f/fd3 x:0 0 0
2026-03-09T19:27:29.897 INFO:tasks.workunit.client.0.vm07.stdout:7/429: creat d0/d4/d5/d8/f93 x:0 0 0
2026-03-09T19:27:29.897 INFO:tasks.workunit.client.0.vm07.stdout:5/416: symlink d3/d1a/d5d/l85 0
2026-03-09T19:27:29.902 INFO:tasks.workunit.client.1.vm08.stdout:1/773: mknod d9/d11/d7a/d89/de7/ceb 0
2026-03-09T19:27:29.912 INFO:tasks.workunit.client.1.vm08.stdout:8/600: write de/d91/fbd [218173,60619] 0
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.1.vm08.stdout:6/662: creat d3/d68/ff4 x:0 0 0
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.1.vm08.stdout:1/774: symlink d9/d11/d7a/d89/d8d/da3/lec 0
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.0.vm07.stdout:8/443: mknod d7/d9/d37/d45/d56/d62/ca2 0
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.0.vm07.stdout:7/430: symlink d0/d4/d5/d26/d3c/l94 0
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.0.vm07.stdout:7/431: stat d0/d4/d5/d26/d3c/d39 0
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.0.vm07.stdout:7/432: chown d0/d4/l48 305801651 1
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.0.vm07.stdout:5/417: creat d3/d1a/f86 x:0 0 0
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.0.vm07.stdout:8/444: creat d7/d9/d37/d45/d56/d67/fa3 x:0 0 0
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.0.vm07.stdout:7/433: mkdir d0/d52/d54/d95 0
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.0.vm07.stdout:7/434: rmdir d0 39
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.0.vm07.stdout:7/435: creat d0/d52/d54/d5a/f96 x:0 0 0
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.0.vm07.stdout:7/436: fsync d0/f13 0
2026-03-09T19:27:29.947 INFO:tasks.workunit.client.0.vm07.stdout:7/437: creat d0/d4/d5/d26/d3c/d39/f97 x:0 0 0
2026-03-09T19:27:29.954 INFO:tasks.workunit.client.0.vm07.stdout:7/438: dread d0/d52/d54/d55/f67 [0,4194304] 0
2026-03-09T19:27:29.957 INFO:tasks.workunit.client.0.vm07.stdout:7/439: rename d0/d4/d5/dd to d0/d4/d5/d8/d41/d64/d74/d98 0
2026-03-09T19:27:29.957 INFO:tasks.workunit.client.0.vm07.stdout:7/440: readlink d0/d4/d5/d8/d1a/d2a/l53 0
2026-03-09T19:27:30.098 INFO:tasks.workunit.client.0.vm07.stdout:0/383: read d0/f41 [245644,42355] 0
2026-03-09T19:27:30.102 INFO:tasks.workunit.client.0.vm07.stdout:0/384: dwrite d0/d6/d13/d33/f35 [0,4194304] 0
2026-03-09T19:27:30.104 INFO:tasks.workunit.client.0.vm07.stdout:0/385: dread - d0/f65 zero size
2026-03-09T19:27:30.111 INFO:tasks.workunit.client.0.vm07.stdout:0/386: readlink d0/d6/d13/l46 0
2026-03-09T19:27:30.111 INFO:tasks.workunit.client.0.vm07.stdout:0/387: chown d0/d6/d13/d17/d19/d57 3218 1
2026-03-09T19:27:30.116 INFO:tasks.workunit.client.0.vm07.stdout:0/388: rename d0/d6/d13/d1c/ce to d0/d6/d13/d1c/d61/c84 0
2026-03-09T19:27:30.119 INFO:tasks.workunit.client.0.vm07.stdout:0/389: rename d0/d6/d13/d1c/f36 to d0/d6/d13/d1c/d50/f85 0
2026-03-09T19:27:30.122 INFO:tasks.workunit.client.0.vm07.stdout:0/390: unlink d0/d6/f5c 0
2026-03-09T19:27:30.228 INFO:tasks.workunit.client.0.vm07.stdout:4/409: sync
2026-03-09T19:27:30.233 INFO:tasks.workunit.client.0.vm07.stdout:4/410: mkdir d3/d11/d2b/d38/d8f 0
2026-03-09T19:27:30.235 INFO:tasks.workunit.client.0.vm07.stdout:4/411: fsync d3/d11/d16/d2f/d22/f7a 0
2026-03-09T19:27:30.237 INFO:tasks.workunit.client.0.vm07.stdout:4/412: symlink d3/l90 0
2026-03-09T19:27:30.239 INFO:tasks.workunit.client.0.vm07.stdout:4/413: chown d3/d11/d2b/d38/f8a 236 1
2026-03-09T19:27:30.239 INFO:tasks.workunit.client.0.vm07.stdout:4/414: read - d3/d11/d16/d2f/d22/d70/f8b zero size
2026-03-09T19:27:30.241 INFO:tasks.workunit.client.0.vm07.stdout:4/415: fsync d3/d11/d2b/f2c 0
2026-03-09T19:27:30.241 INFO:tasks.workunit.client.0.vm07.stdout:4/416: chown d3/l6 0 1
2026-03-09T19:27:30.242 INFO:tasks.workunit.client.0.vm07.stdout:4/417: mkdir d3/d11/d16/d2f/d91 0
2026-03-09T19:27:30.245 INFO:tasks.workunit.client.0.vm07.stdout:4/418: dread d3/d11/f6c [0,4194304] 0
2026-03-09T19:27:30.249 INFO:tasks.workunit.client.0.vm07.stdout:8/445: sync
2026-03-09T19:27:30.388 INFO:tasks.workunit.client.0.vm07.stdout:8/446: sync
2026-03-09T19:27:30.391 INFO:tasks.workunit.client.0.vm07.stdout:8/447: creat d7/d1d/d83/d9f/fa4 x:0 0 0
2026-03-09T19:27:30.438 INFO:tasks.workunit.client.0.vm07.stdout:1/446: dwrite d1/d11/d37/f2c [0,4194304] 0
2026-03-09T19:27:30.451 INFO:tasks.workunit.client.0.vm07.stdout:6/365: write d0/d1/db/d24/d53/f35 [2071015,58117] 0
2026-03-09T19:27:30.467 INFO:tasks.workunit.client.1.vm08.stdout:7/682: write d5/d14/dae/f6b [693949,40823] 0
2026-03-09T19:27:30.468 INFO:tasks.workunit.client.1.vm08.stdout:3/691: write d0/d6/de/d1b/f7d [3771551,90713] 0
2026-03-09T19:27:30.468 INFO:tasks.workunit.client.1.vm08.stdout:7/683: creat d5/d14/d38/dad/fe5 x:0 0 0
2026-03-09T19:27:30.468 INFO:tasks.workunit.client.1.vm08.stdout:3/692: mknod d0/d6/de/d6e/d51/cda 0
2026-03-09T19:27:30.468 INFO:tasks.workunit.client.1.vm08.stdout:7/684: fdatasync d5/d14/dae/d3a/d42/fb7 0
2026-03-09T19:27:30.468 INFO:tasks.workunit.client.1.vm08.stdout:3/693: rmdir d0/d4b 39
2026-03-09T19:27:30.468 INFO:tasks.workunit.client.1.vm08.stdout:7/685: mknod d5/d14/dae/d1c/d83/ce6 0
2026-03-09T19:27:30.468 INFO:tasks.workunit.client.1.vm08.stdout:0/639: write dd/f15 [318781,70374] 0
2026-03-09T19:27:30.468 INFO:tasks.workunit.client.1.vm08.stdout:3/694: truncate d0/d6/de/d1a/f5a 2155832 0
2026-03-09T19:27:30.469 INFO:tasks.workunit.client.0.vm07.stdout:1/447: mkdir d1/d11/d37/d3f/d6e/d9c 0
2026-03-09T19:27:30.469 INFO:tasks.workunit.client.0.vm07.stdout:1/448: dread - d1/f96 zero size
2026-03-09T19:27:30.469 INFO:tasks.workunit.client.0.vm07.stdout:6/366: mknod d0/d1/db/d1d/d77/c90 0
2026-03-09T19:27:30.469 INFO:tasks.workunit.client.0.vm07.stdout:1/449: rename d1/db/d31/f78 to d1/d11/d37/d3f/d45/f9d 0
2026-03-09T19:27:30.469 INFO:tasks.workunit.client.0.vm07.stdout:1/450: write d1/db/d31/d4f/f8c [812099,13361] 0
2026-03-09T19:27:30.469 INFO:tasks.workunit.client.0.vm07.stdout:1/451: stat d1/db/d31 0
2026-03-09T19:27:30.469 INFO:tasks.workunit.client.0.vm07.stdout:6/367: mkdir d0/d1/db/d91 0
2026-03-09T19:27:30.469 INFO:tasks.workunit.client.1.vm08.stdout:2/567: write d3/d9/d79/f98 [87282,7341] 0
2026-03-09T19:27:30.470 INFO:tasks.workunit.client.1.vm08.stdout:2/568: readlink d3/d4/d23/d2c/d39/d5e/d14/l85 0
2026-03-09T19:27:30.472 INFO:tasks.workunit.client.1.vm08.stdout:0/640: creat dd/d22/d27/d65/fcb x:0 0 0
2026-03-09T19:27:30.473 INFO:tasks.workunit.client.1.vm08.stdout:3/695: truncate d0/d6/de/d15/d96/fa0 1769086 0
2026-03-09T19:27:30.475 INFO:tasks.workunit.client.1.vm08.stdout:9/623: mkdir d0/d1b/d97/dd3 0
2026-03-09T19:27:30.476 INFO:tasks.workunit.client.1.vm08.stdout:9/624: write d0/d1b/d4e/da7/fc3 [440338,105145] 0
2026-03-09T19:27:30.477 INFO:tasks.workunit.client.1.vm08.stdout:7/686: rename d5/cdb to d5/d14/d2b/d5d/ce7 0
2026-03-09T19:27:30.479 INFO:tasks.workunit.client.0.vm07.stdout:9/453: dwrite d0/db/f41 [0,4194304] 0
2026-03-09T19:27:30.487 INFO:tasks.workunit.client.0.vm07.stdout:2/462: truncate d3/ff 2451841 0
2026-03-09T19:27:30.487 INFO:tasks.workunit.client.0.vm07.stdout:3/462: write d1/d1f/f13 [2597682,52500] 0
2026-03-09T19:27:30.490 INFO:tasks.workunit.client.0.vm07.stdout:2/463: dwrite d3/f93 [0,4194304] 0
2026-03-09T19:27:30.500 INFO:tasks.workunit.client.0.vm07.stdout:2/464: dwrite d3/f93 [0,4194304] 0
2026-03-09T19:27:30.519 INFO:tasks.workunit.client.1.vm08.stdout:3/696: truncate d0/d6/de/d6e/d51/d7f/fca 931938 0
2026-03-09T19:27:30.519 INFO:tasks.workunit.client.1.vm08.stdout:8/601: write de/d25/d33/f55 [1523224,106200] 0
2026-03-09T19:27:30.519 INFO:tasks.workunit.client.1.vm08.stdout:8/602: chown de/d1d/d69 119880 1
2026-03-09T19:27:30.519 INFO:tasks.workunit.client.1.vm08.stdout:6/663: write d3/db/f30 [924730,65559] 0
2026-03-09T19:27:30.519 INFO:tasks.workunit.client.1.vm08.stdout:7/687: truncate d5/d14/d2b/fb0 1000473 0
2026-03-09T19:27:30.519 INFO:tasks.workunit.client.0.vm07.stdout:6/368: creat d0/d1/f92 x:0 0 0
2026-03-09T19:27:30.519 INFO:tasks.workunit.client.0.vm07.stdout:6/369: chown d0/d1/db/d24/d53/d31 10639 1
2026-03-09T19:27:30.519 INFO:tasks.workunit.client.0.vm07.stdout:6/370: chown d0/d1/db/d24/d53/f74 439246720 1
2026-03-09T19:27:30.519 INFO:tasks.workunit.client.0.vm07.stdout:9/454: read d0/d17/f33 [27968,65812] 0
2026-03-09T19:27:30.519 INFO:tasks.workunit.client.0.vm07.stdout:3/463: rename d1/d6/d71/c79 to d1/d6/c91 0
2026-03-09T19:27:30.524 INFO:tasks.workunit.client.1.vm08.stdout:3/697: rename d0/d6/d93/fa1 to d0/d8/d19/fdb 0
2026-03-09T19:27:30.524 INFO:tasks.workunit.client.1.vm08.stdout:3/698: chown d0/d6/de/c69 0 1
2026-03-09T19:27:30.526 INFO:tasks.workunit.client.0.vm07.stdout:6/371: fdatasync d0/d13/f5f 0
2026-03-09T19:27:30.527 INFO:tasks.workunit.client.1.vm08.stdout:6/664: mkdir d3/d34/d3b/df5 0
2026-03-09T19:27:30.527 INFO:tasks.workunit.client.0.vm07.stdout:6/372: fdatasync d0/d1/db/d24/d53/d31/f3c 0
2026-03-09T19:27:30.531 INFO:tasks.workunit.client.1.vm08.stdout:3/699: fdatasync d0/d52/d7c/fc1 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.0.vm07.stdout:2/465: creat d3/dd/d16/d29/d2d/d45/d85/d8a/f9e x:0 0 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.0.vm07.stdout:3/464: creat d1/d1f/d5c/f92 x:0 0 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.0.vm07.stdout:3/465: rename d1/d6/d45/f56 to d1/d1f/d5c/f93 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.0.vm07.stdout:2/466: mknod d3/dd/d16/d29/d2d/d45/d3b/d44/c9f 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.0.vm07.stdout:3/466: unlink d1/d6/dd/c32 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.1.vm08.stdout:7/688: symlink d5/d14/dae/le8 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.1.vm08.stdout:2/569: getdents d3/d4/d23/d2c/d39/d5e/d14 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.1.vm08.stdout:7/689: dread - d5/d14/dae/d3a/f56 zero size
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.1.vm08.stdout:3/700: chown d0/d8/d19/ld4 3702 1
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.1.vm08.stdout:6/665: link d3/d15/c47 d3/d15/cf6 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.1.vm08.stdout:3/701: creat d0/d4b/fdc x:0 0 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.1.vm08.stdout:3/702: dread d0/d8/f4c [0,4194304] 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.1.vm08.stdout:6/666: creat d3/db/ff7 x:0 0 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.1.vm08.stdout:3/703: creat d0/d52/d6d/d77/d88/fdd x:0 0 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.1.vm08.stdout:3/704: mkdir d0/d6/d93/dcb/dde 0
2026-03-09T19:27:30.581 INFO:tasks.workunit.client.1.vm08.stdout:3/705: mkdir d0/d52/d6d/d77/ddf 0
2026-03-09T19:27:30.585 INFO:tasks.workunit.client.1.vm08.stdout:9/625: dread d0/d2/d80/f6a [0,4194304] 0
2026-03-09T19:27:30.588 INFO:tasks.workunit.client.1.vm08.stdout:9/626: getdents d0/d2/d14/d98 0
2026-03-09T19:27:30.589 INFO:tasks.workunit.client.1.vm08.stdout:9/627: read d0/d1b/f65 [681577,95448] 0
2026-03-09T19:27:30.589 INFO:tasks.workunit.client.1.vm08.stdout:9/628: chown d0/d1b/c37 4890 1
2026-03-09T19:27:30.598 INFO:tasks.workunit.client.1.vm08.stdout:9/629: dread d0/d2/f2a [0,4194304] 0
2026-03-09T19:27:30.599 INFO:tasks.workunit.client.1.vm08.stdout:9/630: creat d0/d2/d14/d98/d99/fd4 x:0 0 0
2026-03-09T19:27:30.600 INFO:tasks.workunit.client.1.vm08.stdout:9/631: mknod d0/d1b/d97/dd3/cd5 0
2026-03-09T19:27:30.600 INFO:tasks.workunit.client.1.vm08.stdout:9/632: readlink d0/d1b/d68/d7f/d8c/da2/da8/lbd 0
2026-03-09T19:27:30.601 INFO:tasks.workunit.client.1.vm08.stdout:9/633: truncate d0/d2/f21 8742590 0
2026-03-09T19:27:30.603 INFO:tasks.workunit.client.1.vm08.stdout:9/634: truncate d0/d2/d14/f31 4532553 0
2026-03-09T19:27:30.604 INFO:tasks.workunit.client.1.vm08.stdout:9/635: mknod d0/d2/d8/cd6 0
2026-03-09T19:27:30.605 INFO:tasks.workunit.client.1.vm08.stdout:9/636: write d0/d1b/f65 [939364,117305] 0
2026-03-09T19:27:30.607 INFO:tasks.workunit.client.1.vm08.stdout:9/637: symlink d0/d1b/d68/d7f/d8c/da2/da8/ld7 0
2026-03-09T19:27:30.610 INFO:tasks.workunit.client.1.vm08.stdout:9/638: dwrite d0/d1b/d97/f3f [4194304,4194304] 0
2026-03-09T19:27:30.618 INFO:tasks.workunit.client.1.vm08.stdout:9/639: dread d0/d2/f1a [4194304,4194304] 0
2026-03-09T19:27:30.619 INFO:tasks.workunit.client.1.vm08.stdout:9/640: mkdir d0/d2/d14/d98/d99/dd8 0
2026-03-09T19:27:30.620 INFO:tasks.workunit.client.1.vm08.stdout:9/641: read d0/d1b/d97/d48/d6f/f84 [3910506,13954] 0
2026-03-09T19:27:30.698 INFO:tasks.workunit.client.0.vm07.stdout:5/418: dwrite d3/d1a/d28/d48/f50 [0,4194304] 0
2026-03-09T19:27:30.701 INFO:tasks.workunit.client.0.vm07.stdout:7/441: write d0/d4/d5/f50 [260445,80404] 0
2026-03-09T19:27:30.703 INFO:tasks.workunit.client.1.vm08.stdout:1/775: dwrite d9/da/d12/f5c [0,4194304] 0
2026-03-09T19:27:30.703 INFO:tasks.workunit.client.0.vm07.stdout:5/419: dwrite d3/f4e [0,4194304] 0
2026-03-09T19:27:30.710 INFO:tasks.workunit.client.0.vm07.stdout:0/391: dwrite d0/d6/d13/f4c [0,4194304] 0
2026-03-09T19:27:30.711 INFO:tasks.workunit.client.0.vm07.stdout:5/420: getdents d3 0
2026-03-09T19:27:30.718 INFO:tasks.workunit.client.0.vm07.stdout:4/419: dwrite d3/d11/f1e [4194304,4194304] 0
2026-03-09T19:27:30.836 INFO:tasks.workunit.client.0.vm07.stdout:2/467: sync
2026-03-09T19:27:30.836 INFO:tasks.workunit.client.0.vm07.stdout:2/468: chown d3/c78 75 1
2026-03-09T19:27:30.837 INFO:tasks.workunit.client.0.vm07.stdout:2/469: fdatasync d3/dd/d16/f25 0
2026-03-09T19:27:30.883 INFO:tasks.workunit.client.0.vm07.stdout:8/448: dwrite d7/d50/f80 [0,4194304] 0
2026-03-09T19:27:30.987 INFO:tasks.workunit.client.1.vm08.stdout:4/616: sync
2026-03-09T19:27:30.987 INFO:tasks.workunit.client.1.vm08.stdout:5/601: sync
2026-03-09T19:27:30.987 INFO:tasks.workunit.client.1.vm08.stdout:6/667: sync
2026-03-09T19:27:30.989 INFO:tasks.workunit.client.1.vm08.stdout:4/617: stat da/d10/d1b/f79 0
2026-03-09T19:27:30.991 INFO:tasks.workunit.client.1.vm08.stdout:5/602: dread - d16/d8e/fb2 zero size
2026-03-09T19:27:30.992 INFO:tasks.workunit.client.1.vm08.stdout:4/618: mknod da/d10/d1b/d23/cba 0
2026-03-09T19:27:30.992 INFO:tasks.workunit.client.1.vm08.stdout:4/619: chown da/d10/d26/d3a 3715167 1
2026-03-09T19:27:30.995 INFO:tasks.workunit.client.1.vm08.stdout:5/603: mkdir d16/d1e/dc9 0
2026-03-09T19:27:30.996 INFO:tasks.workunit.client.1.vm08.stdout:4/620: link da/fa8 da/d10/d26/d50/fbb 0
2026-03-09T19:27:30.998 INFO:tasks.workunit.client.1.vm08.stdout:4/621: symlink da/d10/lbc 0
2026-03-09T19:27:31.001 INFO:tasks.workunit.client.1.vm08.stdout:6/668: link d3/d34/d5c/da2/lcc d3/d68/lf8 0
2026-03-09T19:27:31.003 INFO:tasks.workunit.client.1.vm08.stdout:5/604: fsync d16/fbe 0
2026-03-09T19:27:31.005 INFO:tasks.workunit.client.1.vm08.stdout:4/622: dwrite da/d10/d16/d28/d2f/d4f/d56/f9a [0,4194304] 0
2026-03-09T19:27:31.012 INFO:tasks.workunit.client.1.vm08.stdout:5/605: dread d16/d1e/d30/d8a/f98 [0,4194304] 0
2026-03-09T19:27:31.013 INFO:tasks.workunit.client.1.vm08.stdout:5/606: chown d16/d1e/d3b/fbd 0 1
2026-03-09T19:27:31.013 INFO:tasks.workunit.client.1.vm08.stdout:4/623: mknod da/d10/d26/d27/da6/cbd 0
2026-03-09T19:27:31.014 INFO:tasks.workunit.client.1.vm08.stdout:5/607: chown d16/d1e/d6e/lc4 8500 1
2026-03-09T19:27:31.014 INFO:tasks.workunit.client.1.vm08.stdout:4/624: chown da/d10/d16/d28/d46 1746 1
2026-03-09T19:27:31.017 INFO:tasks.workunit.client.1.vm08.stdout:5/608: truncate d16/d1e/d30/fb4 185671 0
2026-03-09T19:27:31.023 INFO:tasks.workunit.client.1.vm08.stdout:4/625: fdatasync da/d10/f77 0
2026-03-09T19:27:31.033 INFO:tasks.workunit.client.0.vm07.stdout:1/452: dwrite d1/d11/d37/d3f/d45/f26 [0,4194304] 0
2026-03-09T19:27:31.042 INFO:tasks.workunit.client.0.vm07.stdout:1/453: symlink d1/d11/d37/d3f/d6e/d9c/l9e 0
2026-03-09T19:27:31.055 INFO:tasks.workunit.client.1.vm08.stdout:8/603: write de/d25/f71 [2836423,120527] 0
2026-03-09T19:27:31.058 INFO:tasks.workunit.client.1.vm08.stdout:8/604: truncate de/d1d/d21/d73/fa7 1569407 0
2026-03-09T19:27:31.059 INFO:tasks.workunit.client.1.vm08.stdout:0/641: dwrite dd/d22/f5c [4194304,4194304] 0
2026-03-09T19:27:31.060 INFO:tasks.workunit.client.1.vm08.stdout:0/642: stat dd/d22/d24 0
2026-03-09T19:27:31.061 INFO:tasks.workunit.client.1.vm08.stdout:8/605: write de/d1d/d69/f9a [1854869,78654] 0
2026-03-09T19:27:31.068 INFO:tasks.workunit.client.1.vm08.stdout:0/643: dwrite dd/d22/d27/d6c/f7f [0,4194304] 0
2026-03-09T19:27:31.098 INFO:tasks.workunit.client.0.vm07.stdout:9/455: truncate d0/db/d29/d32/d5c/d69/f83 2479943 0
2026-03-09T19:27:31.098 INFO:tasks.workunit.client.0.vm07.stdout:6/373: dwrite d0/ff [0,4194304] 0
2026-03-09T19:27:31.098 INFO:tasks.workunit.client.1.vm08.stdout:8/606: getdents de/d25/d31 0
2026-03-09T19:27:31.098 INFO:tasks.workunit.client.1.vm08.stdout:2/570: dwrite d3/d9/d79/f7d [0,4194304] 0
2026-03-09T19:27:31.098 INFO:tasks.workunit.client.1.vm08.stdout:0/644: mkdir dd/d9d/dcc 0
2026-03-09T19:27:31.098 INFO:tasks.workunit.client.1.vm08.stdout:8/607: dread - de/d1d/fb0 zero size
2026-03-09T19:27:31.098 INFO:tasks.workunit.client.1.vm08.stdout:0/645: rename dd/d22/d63/d6e/d72/f8f to dd/d22/d24/d49/d92/fcd 0
2026-03-09T19:27:31.098 INFO:tasks.workunit.client.1.vm08.stdout:8/608: mkdir de/d47/dd4 0
2026-03-09T19:27:31.098 INFO:tasks.workunit.client.1.vm08.stdout:7/690: dwrite d5/d14/d38/f40 [0,4194304] 0
2026-03-09T19:27:31.098 INFO:tasks.workunit.client.1.vm08.stdout:7/691: chown d5/d14/d2b/f30 0 1
2026-03-09T19:27:31.103 INFO:tasks.workunit.client.0.vm07.stdout:3/467: write d1/d3d/f5e [7912631,23970] 0
2026-03-09T19:27:31.104 INFO:tasks.workunit.client.0.vm07.stdout:3/468: mkdir d1/d3d/d47/d94 0
2026-03-09T19:27:31.106 INFO:tasks.workunit.client.0.vm07.stdout:3/469: creat d1/d3d/f95 x:0 0 0
2026-03-09T19:27:31.107 INFO:tasks.workunit.client.1.vm08.stdout:3/706: write d0/d6/de/d1b/d16/f7b [141893,61646] 0
2026-03-09T19:27:31.109 INFO:tasks.workunit.client.0.vm07.stdout:3/470: symlink d1/d6/d71/l96 0
2026-03-09T19:27:31.232 INFO:tasks.workunit.client.0.vm07.stdout:6/374: sync
2026-03-09T19:27:31.265 INFO:tasks.workunit.client.0.vm07.stdout:6/375: mknod d0/d4e/d75/c93 0
2026-03-09T19:27:31.265 INFO:tasks.workunit.client.0.vm07.stdout:6/376: rename d0/d1/db/d52/d6a to d0/d1/db/d52/d94 0
2026-03-09T19:27:31.298 INFO:tasks.workunit.client.1.vm08.stdout:6/669: sync
2026-03-09T19:27:31.300 INFO:tasks.workunit.client.1.vm08.stdout:6/670: fsync d3/d34/f35 0
2026-03-09T19:27:31.308 INFO:tasks.workunit.client.1.vm08.stdout:8/609: sync
2026-03-09T19:27:31.311 INFO:tasks.workunit.client.1.vm08.stdout:8/610: mkdir de/d91/dd5 0
2026-03-09T19:27:31.320 INFO:tasks.workunit.client.1.vm08.stdout:8/611: dread f6 [4194304,4194304] 0
2026-03-09T19:27:31.331 INFO:tasks.workunit.client.1.vm08.stdout:8/612: link f6 de/d91/fd6 0
2026-03-09T19:27:31.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:31 vm07.local ceph-mon[48545]: pgmap v169: 65 pgs: 65 active+clean; 2.2 GiB data, 7.9 GiB used, 112 GiB / 120 GiB avail; 36 MiB/s rd, 118 MiB/s wr, 273 op/s
2026-03-09T19:27:31.527 INFO:tasks.workunit.client.1.vm08.stdout:9/642: dread d0/d1b/d97/d48/fb5 [0,4194304] 0
2026-03-09T19:27:31.528 INFO:tasks.workunit.client.1.vm08.stdout:9/643: truncate d0/d2/d8/dcd/fb6 1901895 0
2026-03-09T19:27:31.586 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:31 vm08.local ceph-mon[57794]: pgmap v169: 65 pgs: 65 active+clean; 2.2 GiB data, 7.9 GiB used, 112 GiB / 120 GiB avail; 36 MiB/s rd, 118 MiB/s wr, 273 op/s
2026-03-09T19:27:31.660 INFO:tasks.workunit.client.0.vm07.stdout:3/471: dread d1/d1f/d16/d28/f34 [0,4194304] 0
2026-03-09T19:27:31.669 INFO:tasks.workunit.client.0.vm07.stdout:0/392: dread d0/d6/d13/d17/f20 [0,4194304] 0
2026-03-09T19:27:31.673 INFO:tasks.workunit.client.1.vm08.stdout:6/671: truncate d3/db/d43/fd3 2973159 0
2026-03-09T19:27:31.673 INFO:tasks.workunit.client.0.vm07.stdout:3/472: rmdir d1/d3d/d47/d94 0
2026-03-09T19:27:31.678 INFO:tasks.workunit.client.0.vm07.stdout:0/393: dread d0/d6/d13/d17/d19/d57/d6a/f74 [0,4194304] 0
2026-03-09T19:27:31.681 INFO:tasks.workunit.client.0.vm07.stdout:8/449: symlink d7/d9/d57/la5 0
2026-03-09T19:27:31.683 INFO:tasks.workunit.client.0.vm07.stdout:8/450: mkdir d7/d50/da6 0
2026-03-09T19:27:31.684 INFO:tasks.workunit.client.0.vm07.stdout:8/451: write d7/d1d/f96 [144692,121621] 0
2026-03-09T19:27:31.688 INFO:tasks.workunit.client.0.vm07.stdout:0/394: dread d0/f3a [0,4194304] 0
2026-03-09T19:27:31.689 INFO:tasks.workunit.client.0.vm07.stdout:0/395: symlink d0/d6/d13/d1c/d11/d56/l86 0
2026-03-09T19:27:31.691 INFO:tasks.workunit.client.0.vm07.stdout:0/396: getdents d0/d6/d13/d1c/d52 0
2026-03-09T19:27:31.714 INFO:tasks.workunit.client.0.vm07.stdout:0/397: dread d0/d6/f43 [0,4194304] 0
2026-03-09T19:27:31.715 INFO:tasks.workunit.client.0.vm07.stdout:0/398: chown d0/d6/d13/d17/c26 119 1
2026-03-09T19:27:31.716 INFO:tasks.workunit.client.0.vm07.stdout:0/399: rmdir d0/d6/d13/d17/d19 39
2026-03-09T19:27:31.717 INFO:tasks.workunit.client.0.vm07.stdout:0/400: symlink d0/d6/d13/d1c/d11/l87 0
2026-03-09T19:27:31.764 INFO:tasks.workunit.client.1.vm08.stdout:1/776: dwrite d9/da/d12/f72 [0,4194304] 0
2026-03-09T19:27:31.769 INFO:tasks.workunit.client.1.vm08.stdout:1/777: dread d9/da/d2d/f50 [0,4194304] 0
2026-03-09T19:27:31.771 INFO:tasks.workunit.client.1.vm08.stdout:1/778: symlink d9/d11/d7a/d89/d8d/daa/led 0
2026-03-09T19:27:31.772 INFO:tasks.workunit.client.1.vm08.stdout:1/779: chown d9/da/d53/d67/l84 15 1
2026-03-09T19:27:31.774 INFO:tasks.workunit.client.1.vm08.stdout:1/780: dread - d9/da/d95/dcd/fcf zero size
2026-03-09T19:27:31.775 INFO:tasks.workunit.client.1.vm08.stdout:1/781: chown d9/da/d12/c66 1705 1
2026-03-09T19:27:31.779 INFO:tasks.workunit.client.1.vm08.stdout:1/782: creat d9/da/d95/dcd/fee x:0 0 0
2026-03-09T19:27:31.779 INFO:tasks.workunit.client.1.vm08.stdout:1/783: creat d9/d40/fef x:0 0 0
2026-03-09T19:27:31.785 INFO:tasks.workunit.client.1.vm08.stdout:1/784: link d9/d11/f73 d9/da/d95/dcd/de9/ff0 0
2026-03-09T19:27:31.966 INFO:tasks.workunit.client.0.vm07.stdout:4/420: write d3/d4f/d56/d5f/f6f [332676,28809] 0
2026-03-09T19:27:31.970 INFO:tasks.workunit.client.0.vm07.stdout:2/470: dwrite d3/dd/d16/d29/d2d/d45/d3b/d44/f81 [0,4194304] 0
2026-03-09T19:27:31.979 INFO:tasks.workunit.client.0.vm07.stdout:2/471: mknod d3/dd/d16/d29/d3c/d4c/ca0 0
2026-03-09T19:27:31.979 INFO:tasks.workunit.client.0.vm07.stdout:9/456: creat d0/db/d29/d2c/d36/fa1 x:0 0 0
2026-03-09T19:27:31.998 INFO:tasks.workunit.client.0.vm07.stdout:9/457: getdents d0/db/d29/d4d 0
2026-03-09T19:27:32.002 INFO:tasks.workunit.client.0.vm07.stdout:9/458: mknod d0/d6/d3a/ca2 0
2026-03-09T19:27:32.016 INFO:tasks.workunit.client.1.vm08.stdout:5/609: truncate d16/d45/fb1 1603661 0
2026-03-09T19:27:32.022 INFO:tasks.workunit.client.1.vm08.stdout:3/707: write d0/d6/de/d15/d96/fa0 [2307103,48054] 0
2026-03-09T19:27:32.056 INFO:tasks.workunit.client.1.vm08.stdout:0/646: write dd/d22/d24/d49/d50/d78/fbb [980409,6295] 0
2026-03-09T19:27:32.064 INFO:tasks.workunit.client.1.vm08.stdout:2/571: dwrite d3/d4/fa7 [0,4194304] 0
2026-03-09T19:27:32.084 INFO:tasks.workunit.client.1.vm08.stdout:0/647: dread dd/d22/d24/d49/d92/fcd [0,4194304] 0
2026-03-09T19:27:32.133 INFO:tasks.workunit.client.1.vm08.stdout:0/648: dwrite dd/f6d [0,4194304] 0
2026-03-09T19:27:32.141 INFO:tasks.workunit.client.1.vm08.stdout:1/785: dread d9/d40/f57 [0,4194304] 0
2026-03-09T19:27:32.155 INFO:tasks.workunit.client.1.vm08.stdout:7/692: mkdir d5/d14/dae/d1c/d83/de9 0
2026-03-09T19:27:32.169 INFO:tasks.workunit.client.1.vm08.stdout:9/644: dwrite d0/d1b/f7c [0,4194304] 0
2026-03-09T19:27:32.178 INFO:tasks.workunit.client.1.vm08.stdout:4/626: unlink da/d10/d16/d28/d2f/f7e 0
2026-03-09T19:27:32.233 INFO:tasks.workunit.client.1.vm08.stdout:7/693: symlink d5/d14/dae/d1c/d83/d90/lea 0
2026-03-09T19:27:32.233 INFO:tasks.workunit.client.1.vm08.stdout:9/645: creat d0/d1b/d4e/fd9 x:0 0 0
2026-03-09T19:27:32.233 INFO:tasks.workunit.client.1.vm08.stdout:8/613: link de/d1d/c2b de/d25/d31/d82/d6d/cd7 0
2026-03-09T19:27:32.233 INFO:tasks.workunit.client.1.vm08.stdout:7/694: unlink d5/d14/d27/d78/dc7/dce/fe3 0
2026-03-09T19:27:32.233 INFO:tasks.workunit.client.1.vm08.stdout:4/627: creat da/d10/d16/d28/d46/fbe x:0 0 0
2026-03-09T19:27:32.233 INFO:tasks.workunit.client.1.vm08.stdout:7/695: rename d5/d14/d27/d54/l8c to d5/d14/d27/d54/leb 0
2026-03-09T19:27:32.233 INFO:tasks.workunit.client.1.vm08.stdout:9/646: rmdir d0/d2/d14/dcb 0
2026-03-09T19:27:32.233 INFO:tasks.workunit.client.1.vm08.stdout:9/647: rmdir d0/d2/d14/d5c 39
2026-03-09T19:27:32.233 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:32 vm08.local ceph-mon[57794]: pgmap v170: 65 pgs: 65 active+clean; 2.3 GiB data, 8.0 GiB used, 112 GiB / 120 GiB avail; 31 MiB/s rd, 92 MiB/s wr, 220 op/s
2026-03-09T19:27:32.234 INFO:tasks.workunit.client.1.vm08.stdout:0/649: read dd/d22/f41 [137484,53189] 0
2026-03-09T19:27:32.237 INFO:tasks.workunit.client.1.vm08.stdout:0/650: link dd/d22/d27/f9f dd/d22/d24/d49/d50/db3/fce 0
2026-03-09T19:27:32.238 INFO:tasks.workunit.client.1.vm08.stdout:0/651: truncate dd/f7a 1118927 0
2026-03-09T19:27:32.240 INFO:tasks.workunit.client.1.vm08.stdout:6/672: write d3/db/d43/d69/da0/fa7 [84402,101782] 0
2026-03-09T19:27:32.241 INFO:tasks.workunit.client.1.vm08.stdout:6/673: creat d3/d94/def/ff9 x:0 0 0
2026-03-09T19:27:32.244 INFO:tasks.workunit.client.1.vm08.stdout:6/674: creat d3/d34/d5c/da2/dd6/ffa x:0 0 0
2026-03-09T19:27:32.255 INFO:tasks.workunit.client.0.vm07.stdout:3/473: write d1/d6/f37 [2957876,84196] 0
2026-03-09T19:27:32.255 INFO:tasks.workunit.client.0.vm07.stdout:3/474: mkdir d1/d6/d4c/d97 0
2026-03-09T19:27:32.255 INFO:tasks.workunit.client.1.vm08.stdout:6/675: mknod d3/db/d43/d69/da0/cfb 0
2026-03-09T19:27:32.255 INFO:tasks.workunit.client.1.vm08.stdout:6/676: creat d3/dbc/ffc x:0 0 0
2026-03-09T19:27:32.255 INFO:tasks.workunit.client.1.vm08.stdout:6/677: creat d3/d15/ffd x:0 0 0
2026-03-09T19:27:32.255 INFO:tasks.workunit.client.1.vm08.stdout:0/652: dread fc [0,4194304] 0
2026-03-09T19:27:32.255 INFO:tasks.workunit.client.1.vm08.stdout:6/678: mkdir d3/d34/da9/dfe 0
2026-03-09T19:27:32.256 INFO:tasks.workunit.client.0.vm07.stdout:5/421: symlink d3/dd/d26/l87 0
2026-03-09T19:27:32.261 INFO:tasks.workunit.client.1.vm08.stdout:0/653: rename dd/d22/d24/d49/d92/fa7 to dd/d7e/fcf 0
2026-03-09T19:27:32.263 INFO:tasks.workunit.client.1.vm08.stdout:0/654: mknod dd/d22/d27/d6c/cd0 0
2026-03-09T19:27:32.265 INFO:tasks.workunit.client.0.vm07.stdout:7/442: mkdir d0/d4/d5/d99 0
2026-03-09T19:27:32.267 INFO:tasks.workunit.client.0.vm07.stdout:7/443: getdents d0/d4/d5/d8/d1a/d2a 0
2026-03-09T19:27:32.270 INFO:tasks.workunit.client.0.vm07.stdout:7/444: fsync d0/d4/d5/d8/d41/d64/d74/f82 0
2026-03-09T19:27:32.270 INFO:tasks.workunit.client.0.vm07.stdout:7/445: stat d0/d4/f86 0
2026-03-09T19:27:32.275 INFO:tasks.workunit.client.0.vm07.stdout:0/401: symlink d0/l88 0
2026-03-09T19:27:32.285 INFO:tasks.workunit.client.0.vm07.stdout:0/402: chown d0/d6/d13/d1c/d11/f2e 31 1
2026-03-09T19:27:32.285 INFO:tasks.workunit.client.0.vm07.stdout:2/472: dwrite d3/dd/d16/d29/d2d/f6d [0,4194304] 0
2026-03-09T19:27:32.288 INFO:tasks.workunit.client.0.vm07.stdout:1/454: rename d1/f4c to d1/d11/d37/d3f/d6e/f9f 0
2026-03-09T19:27:32.289 INFO:tasks.workunit.client.0.vm07.stdout:1/455: chown d1/d3/d21/l2a 459376 1
2026-03-09T19:27:32.290 INFO:tasks.workunit.client.0.vm07.stdout:2/473: creat d3/d49/fa1 x:0 0 0
2026-03-09T19:27:32.290 INFO:tasks.workunit.client.0.vm07.stdout:1/456: fdatasync d1/d11/d37/d5d/f59 0
2026-03-09T19:27:32.290 INFO:tasks.workunit.client.0.vm07.stdout:4/421: rename d3/d11/d29/f52 to d3/d4f/d56/d5f/d88/f92 0
2026-03-09T19:27:32.292 INFO:tasks.workunit.client.0.vm07.stdout:4/422: fdatasync d3/f7 0
2026-03-09T19:27:32.292 INFO:tasks.workunit.client.0.vm07.stdout:1/457: rename d1/d11/d37/d5a/d6d/c7b to d1/d11/d37/d5d/d50/ca0 0
2026-03-09T19:27:32.293 INFO:tasks.workunit.client.0.vm07.stdout:4/423: mkdir d3/d11/d16/d2f/d22/d70/d93 0
2026-03-09T19:27:32.293 INFO:tasks.workunit.client.0.vm07.stdout:1/458: dread - d1/d11/d37/d3f/d7e/f7f zero size
2026-03-09T19:27:32.294 INFO:tasks.workunit.client.0.vm07.stdout:4/424: symlink d3/d11/d2b/d37/l94 0
2026-03-09T19:27:32.319 INFO:tasks.workunit.client.0.vm07.stdout:1/459: dread d1/d3/d21/f47 [0,4194304] 0
2026-03-09T19:27:32.330 INFO:tasks.workunit.client.0.vm07.stdout:1/460: write d1/d11/d37/d5d/f59 [3268856,64716] 0
2026-03-09T19:27:32.330 INFO:tasks.workunit.client.0.vm07.stdout:1/461: symlink d1/la1 0
2026-03-09T19:27:32.330 INFO:tasks.workunit.client.0.vm07.stdout:1/462: creat d1/d11/d37/d3f/d45/d87/fa2 x:0 0 0
2026-03-09T19:27:32.330 INFO:tasks.workunit.client.0.vm07.stdout:1/463: dread d1/db/d31/f64 [0,4194304] 0
2026-03-09T19:27:32.333 INFO:tasks.workunit.client.0.vm07.stdout:1/464: mknod d1/d11/ca3 0
2026-03-09T19:27:32.335 INFO:tasks.workunit.client.0.vm07.stdout:1/465: fsync d1/d11/d37/d5a/f75 0
2026-03-09T19:27:32.376 INFO:tasks.workunit.client.0.vm07.stdout:1/466: unlink d1/d11/d37/d3f/f6c 0
2026-03-09T19:27:32.493 INFO:tasks.workunit.client.0.vm07.stdout:8/452: mkdir d7/d9/da7 0
2026-03-09T19:27:32.493 INFO:tasks.workunit.client.0.vm07.stdout:9/459: dwrite d0/f56 [0,4194304] 0
2026-03-09T19:27:32.493 INFO:tasks.workunit.client.0.vm07.stdout:8/453: link d7/d9/d37/d45/d4f/l72 d7/d9/da7/la8 0
2026-03-09T19:27:32.493 INFO:tasks.workunit.client.0.vm07.stdout:8/454: dread - d7/d9/d37/d34/f91 zero size
2026-03-09T19:27:32.493 INFO:tasks.workunit.client.0.vm07.stdout:8/455: stat d7/d9/d37/d45/d56/d67/c8d 0
2026-03-09T19:27:32.493 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:32 vm07.local
ceph-mon[48545]: pgmap v170: 65 pgs: 65 active+clean; 2.3 GiB data, 8.0 GiB used, 112 GiB / 120 GiB avail; 31 MiB/s rd, 92 MiB/s wr, 220 op/s 2026-03-09T19:27:32.493 INFO:tasks.workunit.client.1.vm08.stdout:5/610: write d16/d45/f55 [1295624,67235] 0 2026-03-09T19:27:32.494 INFO:tasks.workunit.client.1.vm08.stdout:5/611: write d16/d45/f6b [549637,3119] 0 2026-03-09T19:27:32.494 INFO:tasks.workunit.client.1.vm08.stdout:3/708: dwrite d0/d6/faa [0,4194304] 0 2026-03-09T19:27:32.494 INFO:tasks.workunit.client.1.vm08.stdout:3/709: chown d0/d6/de/d6e 186775000 1 2026-03-09T19:27:32.494 INFO:tasks.workunit.client.1.vm08.stdout:3/710: stat d0/d8/d24/c6f 0 2026-03-09T19:27:32.494 INFO:tasks.workunit.client.1.vm08.stdout:5/612: dwrite d16/d45/daf/fc5 [0,4194304] 0 2026-03-09T19:27:32.494 INFO:tasks.workunit.client.1.vm08.stdout:3/711: creat d0/d52/d6d/d77/d88/fe0 x:0 0 0 2026-03-09T19:27:32.494 INFO:tasks.workunit.client.1.vm08.stdout:5/613: chown d16/c6d 200 1 2026-03-09T19:27:32.494 INFO:tasks.workunit.client.1.vm08.stdout:5/614: chown d16/d45/la7 60 1 2026-03-09T19:27:32.494 INFO:tasks.workunit.client.1.vm08.stdout:5/615: rename d16/d1e/d8c/d99/da8/c95 to d16/d45/d81/cca 0 2026-03-09T19:27:32.494 INFO:tasks.workunit.client.1.vm08.stdout:5/616: getdents d16/d1e/d3b/d61 0 2026-03-09T19:27:32.494 INFO:tasks.workunit.client.1.vm08.stdout:5/617: getdents d16/d1e/d6e 0 2026-03-09T19:27:32.696 INFO:tasks.workunit.client.0.vm07.stdout:3/475: sync 2026-03-09T19:27:32.697 INFO:tasks.workunit.client.0.vm07.stdout:3/476: readlink d1/d1f/d16/l42 0 2026-03-09T19:27:32.698 INFO:tasks.workunit.client.0.vm07.stdout:3/477: fsync d1/f20 0 2026-03-09T19:27:32.698 INFO:tasks.workunit.client.0.vm07.stdout:3/478: chown d1/d6/d71 52183 1 2026-03-09T19:27:32.699 INFO:tasks.workunit.client.0.vm07.stdout:3/479: write d1/d1f/d16/f30 [4411590,86373] 0 2026-03-09T19:27:32.717 INFO:tasks.workunit.client.0.vm07.stdout:4/425: sync 2026-03-09T19:27:32.717 INFO:tasks.workunit.client.0.vm07.stdout:8/456: 
sync 2026-03-09T19:27:32.717 INFO:tasks.workunit.client.0.vm07.stdout:0/403: sync 2026-03-09T19:27:32.718 INFO:tasks.workunit.client.0.vm07.stdout:8/457: chown d7/d9/d10/d44 460907079 1 2026-03-09T19:27:32.718 INFO:tasks.workunit.client.0.vm07.stdout:4/426: chown d3/d11/d16/d2f/d22/d86 4585290 1 2026-03-09T19:27:32.719 INFO:tasks.workunit.client.0.vm07.stdout:8/458: truncate f4 273433 0 2026-03-09T19:27:32.720 INFO:tasks.workunit.client.0.vm07.stdout:0/404: dread - d0/d6/d13/d17/d19/d57/f5a zero size 2026-03-09T19:27:32.720 INFO:tasks.workunit.client.0.vm07.stdout:0/405: stat d0/d6/d13/d1c/d61/f63 0 2026-03-09T19:27:32.720 INFO:tasks.workunit.client.0.vm07.stdout:8/459: read - d7/d9/f87 zero size 2026-03-09T19:27:32.721 INFO:tasks.workunit.client.0.vm07.stdout:0/406: mknod d0/d6/c89 0 2026-03-09T19:27:32.722 INFO:tasks.workunit.client.0.vm07.stdout:8/460: chown d7/d9/d10/f41 10 1 2026-03-09T19:27:32.722 INFO:tasks.workunit.client.0.vm07.stdout:0/407: unlink d0/d6/d13/d1c/d11/d56/l86 0 2026-03-09T19:27:32.757 INFO:tasks.workunit.client.0.vm07.stdout:0/408: dread d0/d6/d13/d33/f35 [0,4194304] 0 2026-03-09T19:27:32.954 INFO:tasks.workunit.client.1.vm08.stdout:9/648: dread d0/d2/f1d [0,4194304] 0 2026-03-09T19:27:32.978 INFO:tasks.workunit.client.0.vm07.stdout:6/377: unlink d0/l6e 0 2026-03-09T19:27:32.980 INFO:tasks.workunit.client.0.vm07.stdout:6/378: truncate d0/d1/db/d1d/f27 282143 0 2026-03-09T19:27:32.980 INFO:tasks.workunit.client.1.vm08.stdout:2/572: write d3/d4/d23/d2c/d39/d5e/de/f7a [1973675,58471] 0 2026-03-09T19:27:32.982 INFO:tasks.workunit.client.1.vm08.stdout:2/573: symlink d3/d9/d79/d46/d8c/d92/lc4 0 2026-03-09T19:27:32.986 INFO:tasks.workunit.client.1.vm08.stdout:2/574: dwrite d3/d9/d79/fc2 [0,4194304] 0 2026-03-09T19:27:32.990 INFO:tasks.workunit.client.1.vm08.stdout:1/786: write d9/d11/d7a/d89/fdb [788755,123625] 0 2026-03-09T19:27:33.004 INFO:tasks.workunit.client.1.vm08.stdout:0/655: mkdir dd/d22/d24/d49/dd1 0 2026-03-09T19:27:33.008 
INFO:tasks.workunit.client.1.vm08.stdout:0/656: symlink dd/d22/d63/d6e/ld2 0 2026-03-09T19:27:33.008 INFO:tasks.workunit.client.1.vm08.stdout:0/657: stat dd/d22/d7b/d82/fb1 0 2026-03-09T19:27:33.008 INFO:tasks.workunit.client.1.vm08.stdout:2/575: getdents d3/d9/d79 0 2026-03-09T19:27:33.008 INFO:tasks.workunit.client.1.vm08.stdout:0/658: readlink dd/l14 0 2026-03-09T19:27:33.013 INFO:tasks.workunit.client.1.vm08.stdout:8/614: dwrite de/d1d/f1e [0,4194304] 0 2026-03-09T19:27:33.016 INFO:tasks.workunit.client.1.vm08.stdout:8/615: truncate de/d1d/d2e/d5f/fd3 650843 0 2026-03-09T19:27:33.017 INFO:tasks.workunit.client.1.vm08.stdout:8/616: readlink de/d25/l63 0 2026-03-09T19:27:33.018 INFO:tasks.workunit.client.1.vm08.stdout:7/696: dwrite d5/d14/dae/d1c/d83/d9c/dcb/fda [0,4194304] 0 2026-03-09T19:27:33.020 INFO:tasks.workunit.client.1.vm08.stdout:4/628: write da/d10/d16/d28/d46/fb1 [1106798,51125] 0 2026-03-09T19:27:33.021 INFO:tasks.workunit.client.0.vm07.stdout:6/379: sync 2026-03-09T19:27:33.025 INFO:tasks.workunit.client.1.vm08.stdout:4/629: dread da/d10/d16/d28/d2f/d4f/d56/f9a [0,4194304] 0 2026-03-09T19:27:33.035 INFO:tasks.workunit.client.1.vm08.stdout:7/697: chown d5/d14/d2b/daa/lb1 95 1 2026-03-09T19:27:33.036 INFO:tasks.workunit.client.1.vm08.stdout:4/630: rename da/d10/d26/d38/fa5 to da/d10/d16/fbf 0 2026-03-09T19:27:33.040 INFO:tasks.workunit.client.0.vm07.stdout:6/380: dread d0/d1/db/f14 [0,4194304] 0 2026-03-09T19:27:33.041 INFO:tasks.workunit.client.1.vm08.stdout:7/698: mknod d5/d14/dae/d3a/d42/cec 0 2026-03-09T19:27:33.044 INFO:tasks.workunit.client.1.vm08.stdout:4/631: chown da/d10/d16/d28/d46/d52/d6e/d6d/cb8 3230 1 2026-03-09T19:27:33.047 INFO:tasks.workunit.client.1.vm08.stdout:1/787: dread d9/da/f8e [0,4194304] 0 2026-03-09T19:27:33.053 INFO:tasks.workunit.client.1.vm08.stdout:6/679: write d3/d15/f6a [633554,97198] 0 2026-03-09T19:27:33.056 INFO:tasks.workunit.client.1.vm08.stdout:7/699: mknod d5/d14/dae/d1c/d83/d90/ced 0 2026-03-09T19:27:33.056 
INFO:tasks.workunit.client.1.vm08.stdout:4/632: unlink da/d10/d1b/d23/l30 0 2026-03-09T19:27:33.056 INFO:tasks.workunit.client.1.vm08.stdout:8/617: getdents de/d25/d31/d82 0 2026-03-09T19:27:33.058 INFO:tasks.workunit.client.1.vm08.stdout:8/618: stat de/d1d/d2e/l3c 0 2026-03-09T19:27:33.059 INFO:tasks.workunit.client.1.vm08.stdout:4/633: chown da/d10/d26/d27/d32/f45 718 1 2026-03-09T19:27:33.062 INFO:tasks.workunit.client.0.vm07.stdout:6/381: rename d0/d1/db/d24/d53 to d0/d13/d1e/d95 0 2026-03-09T19:27:33.062 INFO:tasks.workunit.client.0.vm07.stdout:5/422: write d3/dd/d26/d3f/f66 [66823,122986] 0 2026-03-09T19:27:33.067 INFO:tasks.workunit.client.1.vm08.stdout:4/634: write da/d10/d16/d28/d2f/d4f/d64/d84/d8a/fb9 [165348,13244] 0 2026-03-09T19:27:33.075 INFO:tasks.workunit.client.0.vm07.stdout:6/382: symlink d0/d13/l96 0 2026-03-09T19:27:33.075 INFO:tasks.workunit.client.1.vm08.stdout:4/635: chown da/d10/f53 201 1 2026-03-09T19:27:33.079 INFO:tasks.workunit.client.0.vm07.stdout:7/446: dwrite d0/d4/d5/d26/d3c/f60 [0,4194304] 0 2026-03-09T19:27:33.092 INFO:tasks.workunit.client.0.vm07.stdout:2/474: write d3/dd/d16/d30/d40/f4f [29446,61406] 0 2026-03-09T19:27:33.092 INFO:tasks.workunit.client.0.vm07.stdout:2/475: dwrite d3/dd/d16/d29/d2d/d45/d85/d8a/f9e [0,4194304] 0 2026-03-09T19:27:33.098 INFO:tasks.workunit.client.1.vm08.stdout:8/619: fsync de/d25/d31/d82/d6d/d99/da5/db3/f50 0 2026-03-09T19:27:33.101 INFO:tasks.workunit.client.1.vm08.stdout:1/788: creat d9/d11/d7a/ff1 x:0 0 0 2026-03-09T19:27:33.102 INFO:tasks.workunit.client.1.vm08.stdout:1/789: chown d9/d11/d7a/d89/d8d/daa 125081389 1 2026-03-09T19:27:33.103 INFO:tasks.workunit.client.0.vm07.stdout:7/447: creat d0/d52/f9a x:0 0 0 2026-03-09T19:27:33.104 INFO:tasks.workunit.client.1.vm08.stdout:7/700: mknod d5/dc4/cee 0 2026-03-09T19:27:33.109 INFO:tasks.workunit.client.1.vm08.stdout:4/636: rmdir da/d10/d16/d28/d2f/d4f/d64/d84/d8a 39 2026-03-09T19:27:33.113 INFO:tasks.workunit.client.1.vm08.stdout:8/620: truncate 
de/d25/d33/f41 2741702 0 2026-03-09T19:27:33.115 INFO:tasks.workunit.client.1.vm08.stdout:1/790: fdatasync d9/da/d12/d39/f47 0 2026-03-09T19:27:33.115 INFO:tasks.workunit.client.1.vm08.stdout:1/791: rename d9 to d9/da/d2d/d62/df2 22 2026-03-09T19:27:33.116 INFO:tasks.workunit.client.1.vm08.stdout:7/701: creat d5/d14/dae/d1c/d83/d9c/fef x:0 0 0 2026-03-09T19:27:33.117 INFO:tasks.workunit.client.0.vm07.stdout:7/448: symlink d0/d4/d5/d26/l9b 0 2026-03-09T19:27:33.117 INFO:tasks.workunit.client.0.vm07.stdout:1/467: truncate d1/d11/d37/d5a/f75 920992 0 2026-03-09T19:27:33.121 INFO:tasks.workunit.client.0.vm07.stdout:9/460: write d0/d6/f10 [4072285,107872] 0 2026-03-09T19:27:33.121 INFO:tasks.workunit.client.1.vm08.stdout:7/702: rename d5/d14/d27/d54/d86/cc5 to d5/d14/dae/d1c/d83/d9c/dcb/dd2/cf0 0 2026-03-09T19:27:33.123 INFO:tasks.workunit.client.0.vm07.stdout:7/449: rename d0/d4/d5/d8/d1a/l23 to d0/d4/d5/d8/d41/d64/d74/d98/l9c 0 2026-03-09T19:27:33.124 INFO:tasks.workunit.client.0.vm07.stdout:7/450: chown d0/d4/d5/d26/f4a 18189791 1 2026-03-09T19:27:33.128 INFO:tasks.workunit.client.1.vm08.stdout:3/712: dwrite d0/d6/de/d15/fa3 [0,4194304] 0 2026-03-09T19:27:33.129 INFO:tasks.workunit.client.1.vm08.stdout:3/713: chown d0/d6/de/d15 3 1 2026-03-09T19:27:33.138 INFO:tasks.workunit.client.1.vm08.stdout:5/618: dwrite d16/d1e/d3b/f50 [0,4194304] 0 2026-03-09T19:27:33.139 INFO:tasks.workunit.client.1.vm08.stdout:7/703: fsync d5/d14/dae/d1c/fab 0 2026-03-09T19:27:33.140 INFO:tasks.workunit.client.1.vm08.stdout:5/619: chown d16/d1e/d30/d8a 477 1 2026-03-09T19:27:33.141 INFO:tasks.workunit.client.1.vm08.stdout:5/620: chown d16/d1e/d9b/fb0 3844740 1 2026-03-09T19:27:33.145 INFO:tasks.workunit.client.0.vm07.stdout:9/461: creat d0/d6/d3a/d81/fa3 x:0 0 0 2026-03-09T19:27:33.145 INFO:tasks.workunit.client.0.vm07.stdout:9/462: chown d0/db/d29/d68/d99 255261308 1 2026-03-09T19:27:33.153 INFO:tasks.workunit.client.0.vm07.stdout:7/451: rename d0/d4/d5/d26/d3c/d39/c49 to d0/d4/d5/c9d 0 
2026-03-09T19:27:33.164 INFO:tasks.workunit.client.0.vm07.stdout:9/463: rename d0/d17/c38 to d0/db/d29/d2c/d36/d5a/ca4 0 2026-03-09T19:27:33.164 INFO:tasks.workunit.client.0.vm07.stdout:4/427: write d3/d11/f6c [3796237,43262] 0 2026-03-09T19:27:33.164 INFO:tasks.workunit.client.0.vm07.stdout:8/461: write d7/d9/d10/d44/f4a [1056721,29961] 0 2026-03-09T19:27:33.164 INFO:tasks.workunit.client.0.vm07.stdout:3/480: dwrite d1/d1f/d16/d28/f3c [0,4194304] 0 2026-03-09T19:27:33.164 INFO:tasks.workunit.client.0.vm07.stdout:7/452: link d0/d52/d54/f5e d0/d52/d54/f9e 0 2026-03-09T19:27:33.164 INFO:tasks.workunit.client.0.vm07.stdout:0/409: dwrite d0/d6/d13/d1c/d11/f2e [0,4194304] 0 2026-03-09T19:27:33.166 INFO:tasks.workunit.client.0.vm07.stdout:4/428: getdents d3/d11/d29 0 2026-03-09T19:27:33.168 INFO:tasks.workunit.client.0.vm07.stdout:4/429: chown d3/d11/d29/d34 23376275 1 2026-03-09T19:27:33.169 INFO:tasks.workunit.client.0.vm07.stdout:0/410: stat d0/d6/d13/d1c/d61 0 2026-03-09T19:27:33.170 INFO:tasks.workunit.client.0.vm07.stdout:8/462: creat d7/d30/d32/fa9 x:0 0 0 2026-03-09T19:27:33.170 INFO:tasks.workunit.client.0.vm07.stdout:3/481: creat d1/f98 x:0 0 0 2026-03-09T19:27:33.175 INFO:tasks.workunit.client.0.vm07.stdout:4/430: creat d3/d11/d2b/d37/f95 x:0 0 0 2026-03-09T19:27:33.213 INFO:tasks.workunit.client.0.vm07.stdout:8/463: creat d7/d9/d37/d34/faa x:0 0 0 2026-03-09T19:27:33.213 INFO:tasks.workunit.client.0.vm07.stdout:3/482: rename d1/d6/d71/l96 to d1/d6/dd/l99 0 2026-03-09T19:27:33.213 INFO:tasks.workunit.client.0.vm07.stdout:4/431: truncate d3/f7 2109347 0 2026-03-09T19:27:33.213 INFO:tasks.workunit.client.0.vm07.stdout:3/483: symlink d1/d6/d45/l9a 0 2026-03-09T19:27:33.213 INFO:tasks.workunit.client.0.vm07.stdout:3/484: write d1/f98 [672838,12258] 0 2026-03-09T19:27:33.213 INFO:tasks.workunit.client.0.vm07.stdout:9/464: link d0/d6/d3a/d7e/f7c d0/db/d29/d4d/fa5 0 2026-03-09T19:27:33.290 INFO:tasks.workunit.client.0.vm07.stdout:6/383: sync 2026-03-09T19:27:33.291 
INFO:tasks.workunit.client.0.vm07.stdout:6/384: chown d0/d1/db/d52/c66 3 1 2026-03-09T19:27:33.292 INFO:tasks.workunit.client.0.vm07.stdout:6/385: fsync d0/d13/f3f 0 2026-03-09T19:27:33.294 INFO:tasks.workunit.client.0.vm07.stdout:6/386: chown d0/d1/db/d52/d94/d87/f8e 1 1 2026-03-09T19:27:33.296 INFO:tasks.workunit.client.0.vm07.stdout:6/387: rename d0/d1/db/d52/d94/f83 to d0/d1/d28/d76/f97 0 2026-03-09T19:27:33.297 INFO:tasks.workunit.client.0.vm07.stdout:4/432: sync 2026-03-09T19:27:33.297 INFO:tasks.workunit.client.0.vm07.stdout:3/485: sync 2026-03-09T19:27:33.299 INFO:tasks.workunit.client.0.vm07.stdout:6/388: link d0/d1/db/d1d/l58 d0/d1/db/d17/d4c/d7b/l98 0 2026-03-09T19:27:33.300 INFO:tasks.workunit.client.0.vm07.stdout:3/486: mknod d1/d6/d4c/c9b 0 2026-03-09T19:27:33.300 INFO:tasks.workunit.client.0.vm07.stdout:4/433: creat d3/d11/d2b/d38/f96 x:0 0 0 2026-03-09T19:27:33.301 INFO:tasks.workunit.client.0.vm07.stdout:6/389: chown d0/d1/db/d1d/l63 38752 1 2026-03-09T19:27:33.302 INFO:tasks.workunit.client.1.vm08.stdout:6/680: sync 2026-03-09T19:27:33.303 INFO:tasks.workunit.client.1.vm08.stdout:1/792: sync 2026-03-09T19:27:33.304 INFO:tasks.workunit.client.1.vm08.stdout:6/681: creat d3/d94/fff x:0 0 0 2026-03-09T19:27:33.306 INFO:tasks.workunit.client.1.vm08.stdout:1/793: rename d9/da/dc/f78 to d9/da/d53/d67/d6c/d76/ff3 0 2026-03-09T19:27:33.306 INFO:tasks.workunit.client.1.vm08.stdout:1/794: read - d9/d40/fef zero size 2026-03-09T19:27:33.309 INFO:tasks.workunit.client.1.vm08.stdout:1/795: mkdir d9/da/d2d/d4e/df4 0 2026-03-09T19:27:33.313 INFO:tasks.workunit.client.1.vm08.stdout:1/796: dread d9/da/d12/d39/f52 [0,4194304] 0 2026-03-09T19:27:33.315 INFO:tasks.workunit.client.1.vm08.stdout:1/797: dread d9/da/d95/fc0 [0,4194304] 0 2026-03-09T19:27:33.334 INFO:tasks.workunit.client.1.vm08.stdout:9/649: write d0/d2/d14/d98/f9e [6226,130028] 0 2026-03-09T19:27:33.334 INFO:tasks.workunit.client.1.vm08.stdout:2/576: write d3/d9/d26/f69 [1201641,26095] 0 
2026-03-09T19:27:33.334 INFO:tasks.workunit.client.1.vm08.stdout:2/577: fdatasync d3/d4/d23/d2c/d39/d5e/de/f17 0 2026-03-09T19:27:33.334 INFO:tasks.workunit.client.1.vm08.stdout:2/578: chown d3/d9/d4a/c82 75704 1 2026-03-09T19:27:33.334 INFO:tasks.workunit.client.0.vm07.stdout:6/390: creat d0/d4e/f99 x:0 0 0 2026-03-09T19:27:33.334 INFO:tasks.workunit.client.0.vm07.stdout:3/487: truncate d1/d6/f9 1306161 0 2026-03-09T19:27:33.334 INFO:tasks.workunit.client.0.vm07.stdout:6/391: creat d0/d1/db/d1d/d77/f9a x:0 0 0 2026-03-09T19:27:33.334 INFO:tasks.workunit.client.0.vm07.stdout:3/488: truncate d1/d74/f52 219243 0 2026-03-09T19:27:33.334 INFO:tasks.workunit.client.0.vm07.stdout:6/392: fdatasync d0/d1/db/f43 0 2026-03-09T19:27:33.334 INFO:tasks.workunit.client.0.vm07.stdout:3/489: rmdir d1/d6 39 2026-03-09T19:27:33.339 INFO:tasks.workunit.client.0.vm07.stdout:3/490: fdatasync d1/d6/dd/f8a 0 2026-03-09T19:27:33.340 INFO:tasks.workunit.client.0.vm07.stdout:3/491: creat d1/d1f/f9c x:0 0 0 2026-03-09T19:27:33.341 INFO:tasks.workunit.client.0.vm07.stdout:3/492: readlink d1/d1f/l4a 0 2026-03-09T19:27:33.344 INFO:tasks.workunit.client.0.vm07.stdout:3/493: dwrite d1/d6/dd/f33 [0,4194304] 0 2026-03-09T19:27:33.347 INFO:tasks.workunit.client.0.vm07.stdout:3/494: stat d1/d6/d4c/d97 0 2026-03-09T19:27:33.348 INFO:tasks.workunit.client.0.vm07.stdout:3/495: creat d1/d6/f9d x:0 0 0 2026-03-09T19:27:33.349 INFO:tasks.workunit.client.0.vm07.stdout:3/496: creat d1/d3d/d47/f9e x:0 0 0 2026-03-09T19:27:33.349 INFO:tasks.workunit.client.0.vm07.stdout:3/497: rmdir d1/d1f/d16 39 2026-03-09T19:27:33.351 INFO:tasks.workunit.client.0.vm07.stdout:3/498: mknod d1/d1f/d16/d28/d7c/c9f 0 2026-03-09T19:27:33.386 INFO:tasks.workunit.client.0.vm07.stdout:9/465: dread d0/d17/f42 [0,4194304] 0 2026-03-09T19:27:33.387 INFO:tasks.workunit.client.0.vm07.stdout:9/466: chown d0/db/d29/d2c/f4a 191 1 2026-03-09T19:27:33.399 INFO:tasks.workunit.client.0.vm07.stdout:9/467: creat d0/d6/d73/fa6 x:0 0 0 
2026-03-09T19:27:33.399 INFO:tasks.workunit.client.1.vm08.stdout:0/659: write dd/d22/d24/f60 [57524,65608] 0 2026-03-09T19:27:33.405 INFO:tasks.workunit.client.0.vm07.stdout:9/468: chown d0/db/f1d 60459267 1 2026-03-09T19:27:33.416 INFO:tasks.workunit.client.1.vm08.stdout:0/660: link dd/d22/d24/d49/l4e dd/d22/d63/d6e/ld3 0 2026-03-09T19:27:33.416 INFO:tasks.workunit.client.1.vm08.stdout:0/661: chown dd/d22/d27/c47 8 1 2026-03-09T19:27:33.416 INFO:tasks.workunit.client.0.vm07.stdout:9/469: read d0/d17/f33 [5702,1296] 0 2026-03-09T19:27:33.416 INFO:tasks.workunit.client.0.vm07.stdout:9/470: read d0/db/d29/d2c/f4a [1340878,62772] 0 2026-03-09T19:27:33.416 INFO:tasks.workunit.client.0.vm07.stdout:9/471: symlink d0/d6/d3a/d7e/la7 0 2026-03-09T19:27:33.417 INFO:tasks.workunit.client.1.vm08.stdout:0/662: read dd/d22/f29 [2700116,83326] 0 2026-03-09T19:27:33.426 INFO:tasks.workunit.client.1.vm08.stdout:0/663: dread dd/d22/f3e [0,4194304] 0 2026-03-09T19:27:33.427 INFO:tasks.workunit.client.1.vm08.stdout:0/664: mkdir dd/d22/d24/d49/d50/dd4 0 2026-03-09T19:27:33.470 INFO:tasks.workunit.client.0.vm07.stdout:5/423: dwrite d3/d1a/d28/f2e [0,4194304] 0 2026-03-09T19:27:33.475 INFO:tasks.workunit.client.0.vm07.stdout:5/424: unlink d3/d1a/d28/f39 0 2026-03-09T19:27:33.481 INFO:tasks.workunit.client.0.vm07.stdout:5/425: rename d3/dd/d26/d2c/l3a to d3/dd/d26/d2d/d60/d83/l88 0 2026-03-09T19:27:33.486 INFO:tasks.workunit.client.0.vm07.stdout:5/426: dwrite d3/f4e [0,4194304] 0 2026-03-09T19:27:33.491 INFO:tasks.workunit.client.1.vm08.stdout:4/637: dread da/d10/d16/d28/d2f/f80 [0,4194304] 0 2026-03-09T19:27:33.495 INFO:tasks.workunit.client.0.vm07.stdout:5/427: fsync d3/f2f 0 2026-03-09T19:27:33.495 INFO:tasks.workunit.client.1.vm08.stdout:4/638: fsync da/d10/d26/d38/f43 0 2026-03-09T19:27:33.500 INFO:tasks.workunit.client.0.vm07.stdout:1/468: write d1/d11/d37/d3f/d45/f3b [1142909,83490] 0 2026-03-09T19:27:33.508 INFO:tasks.workunit.client.1.vm08.stdout:3/714: write d0/d6/de/d1b/fc0 
[5088361,116398] 0 2026-03-09T19:27:33.509 INFO:tasks.workunit.client.0.vm07.stdout:1/469: fdatasync d1/db/f9b 0 2026-03-09T19:27:33.509 INFO:tasks.workunit.client.0.vm07.stdout:7/453: write d0/d4/d5/d26/f75 [703672,47941] 0 2026-03-09T19:27:33.509 INFO:tasks.workunit.client.0.vm07.stdout:1/470: write d1/db/d31/d4f/f8c [1399237,5161] 0 2026-03-09T19:27:33.509 INFO:tasks.workunit.client.0.vm07.stdout:2/476: dwrite d3/f15 [0,4194304] 0 2026-03-09T19:27:33.510 INFO:tasks.workunit.client.1.vm08.stdout:3/715: write d0/d52/d6d/d77/d88/fe0 [474479,36686] 0 2026-03-09T19:27:33.510 INFO:tasks.workunit.client.0.vm07.stdout:3/499: getdents d1 0 2026-03-09T19:27:33.511 INFO:tasks.workunit.client.0.vm07.stdout:2/477: chown d3/dd/d16/d29/d2d/d45/c83 0 1 2026-03-09T19:27:33.512 INFO:tasks.workunit.client.0.vm07.stdout:0/411: write d0/d6/d13/d1c/d50/f85 [158972,128541] 0 2026-03-09T19:27:33.519 INFO:tasks.workunit.client.0.vm07.stdout:1/471: dwrite d1/f96 [0,4194304] 0 2026-03-09T19:27:33.539 INFO:tasks.workunit.client.1.vm08.stdout:5/621: dwrite d16/d1e/d8c/d99/da8/fbc [0,4194304] 0 2026-03-09T19:27:33.539 INFO:tasks.workunit.client.1.vm08.stdout:8/621: dwrite de/d1d/d21/f62 [0,4194304] 0 2026-03-09T19:27:33.539 INFO:tasks.workunit.client.1.vm08.stdout:3/716: chown d0/d6/de/d6e/d51/d7f 75850450 1 2026-03-09T19:27:33.539 INFO:tasks.workunit.client.1.vm08.stdout:7/704: dwrite d5/d14/d2b/d4b/fe2 [0,4194304] 0 2026-03-09T19:27:33.540 INFO:tasks.workunit.client.0.vm07.stdout:1/472: dwrite d1/d11/d37/d5d/f8a [0,4194304] 0 2026-03-09T19:27:33.540 INFO:tasks.workunit.client.0.vm07.stdout:8/464: write d7/d9/d10/d44/d9a/fa1 [695433,42554] 0 2026-03-09T19:27:33.569 INFO:tasks.workunit.client.1.vm08.stdout:6/682: write d3/d15/f19 [7447690,88535] 0 2026-03-09T19:27:33.579 INFO:tasks.workunit.client.1.vm08.stdout:1/798: rmdir d9/da/d2d/d4e 39 2026-03-09T19:27:33.590 INFO:tasks.workunit.client.1.vm08.stdout:3/717: creat d0/d6/de/d1b/d16/dd1/fe1 x:0 0 0 2026-03-09T19:27:33.590 
INFO:tasks.workunit.client.1.vm08.stdout:7/705: symlink d5/lf1 0 2026-03-09T19:27:33.590 INFO:tasks.workunit.client.0.vm07.stdout:2/478: fsync d3/fa 0 2026-03-09T19:27:33.590 INFO:tasks.workunit.client.0.vm07.stdout:1/473: symlink d1/d11/d37/d5d/la4 0 2026-03-09T19:27:33.590 INFO:tasks.workunit.client.0.vm07.stdout:1/474: readlink d1/d11/d37/d3f/d45/le 0 2026-03-09T19:27:33.591 INFO:tasks.workunit.client.0.vm07.stdout:0/412: dread d0/d6/d13/d17/d19/d57/f6f [0,4194304] 0 2026-03-09T19:27:33.591 INFO:tasks.workunit.client.0.vm07.stdout:4/434: write d3/d11/d2b/f69 [831068,59763] 0 2026-03-09T19:27:33.593 INFO:tasks.workunit.client.1.vm08.stdout:9/650: write d0/d1b/d97/f22 [7275240,3842] 0 2026-03-09T19:27:33.596 INFO:tasks.workunit.client.0.vm07.stdout:8/465: write d7/f40 [7667799,50450] 0 2026-03-09T19:27:33.597 INFO:tasks.workunit.client.1.vm08.stdout:3/718: dread - d0/d6/d93/dcb/fce zero size 2026-03-09T19:27:33.597 INFO:tasks.workunit.client.1.vm08.stdout:7/706: creat d5/d14/dae/d3a/d42/ff2 x:0 0 0 2026-03-09T19:27:33.600 INFO:tasks.workunit.client.1.vm08.stdout:3/719: write d0/d6/de/d1b/d16/dd1/fe1 [368399,48525] 0 2026-03-09T19:27:33.602 INFO:tasks.workunit.client.1.vm08.stdout:2/579: write d3/d4/f91 [483566,126375] 0 2026-03-09T19:27:33.602 INFO:tasks.workunit.client.0.vm07.stdout:7/454: getdents d0/d52/d54/d55/d7f 0 2026-03-09T19:27:33.604 INFO:tasks.workunit.client.1.vm08.stdout:9/651: truncate d0/d1b/d97/d48/d5e/fc7 385229 0 2026-03-09T19:27:33.604 INFO:tasks.workunit.client.1.vm08.stdout:9/652: fdatasync d0/d1b/f65 0 2026-03-09T19:27:33.605 INFO:tasks.workunit.client.0.vm07.stdout:6/393: dwrite d0/d1/db/d17/f38 [0,4194304] 0 2026-03-09T19:27:33.614 INFO:tasks.workunit.client.1.vm08.stdout:3/720: unlink d0/d6/f91 0 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.1.vm08.stdout:3/721: chown d0/d6/dad/cb3 0 1 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.1.vm08.stdout:6/683: creat d3/d34/f100 x:0 0 0 2026-03-09T19:27:33.635 
INFO:tasks.workunit.client.1.vm08.stdout:3/722: mknod d0/d4b/ce2 0 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.0.vm07.stdout:2/479: mkdir d3/dd/d16/d29/d3c/da2 0 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.0.vm07.stdout:2/480: chown d3/dd/d16/d29/d3c/d5a/d7a/f6e 22 1 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.0.vm07.stdout:1/475: creat d1/d11/d37/d3f/d45/d87/d88/fa5 x:0 0 0 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.0.vm07.stdout:1/476: dread - d1/d11/d37/d3f/d45/f9d zero size 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.0.vm07.stdout:1/477: write d1/d3/d21/f47 [278660,2734] 0 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.0.vm07.stdout:1/478: write d1/f96 [379280,124580] 0 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.0.vm07.stdout:4/435: creat d3/d11/d16/d2f/d22/d86/f97 x:0 0 0 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.0.vm07.stdout:0/413: rename d0/d6/d13/d1c/d11/d76 to d0/d6/d13/d1c/d11/d56/d78/d8a 0 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.0.vm07.stdout:7/455: dread - d0/d4/d5/d8/d41/f89 zero size 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.0.vm07.stdout:7/456: chown d0/d4/d5/f20 1 1 2026-03-09T19:27:33.635 INFO:tasks.workunit.client.0.vm07.stdout:7/457: chown d0/d4/d5/f20 326 1 2026-03-09T19:27:33.640 INFO:tasks.workunit.client.0.vm07.stdout:1/479: symlink d1/d11/d37/d3f/d6e/la6 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:1/480: chown d1/d3/l41 5647 1 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:4/436: creat d3/d11/d2b/f98 x:0 0 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:1/481: chown d1/d3e/c57 714149 1 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:4/437: mkdir d3/d11/d16/d2f/d22/d70/d99 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:4/438: stat d3/d11/d29/d34 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:1/482: dwrite d1/d3/d21/f55 [0,4194304] 0 
2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:2/481: getdents d3/d49 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:1/483: chown d1/d11/d37/d5d/f8a 31 1 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:4/439: creat d3/d11/d51/f9a x:0 0 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:6/394: rename d0/d1/db/d1d/l21 to d0/d13/d1e/d95/l9b 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:2/482: unlink d3/dd/d16/d29/d2d/f56 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:6/395: dread - d0/d13/f57 zero size 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:6/396: fdatasync d0/d1/db/d1d/d77/f9a 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:4/440: truncate d3/d11/d29/f42 614336 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:6/397: symlink d0/d4e/d75/l9c 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:6/398: creat d0/d1/db/f9d x:0 0 0 2026-03-09T19:27:33.657 INFO:tasks.workunit.client.0.vm07.stdout:1/484: dwrite d1/db/d31/d4f/f8c [0,4194304] 0 2026-03-09T19:27:33.658 INFO:tasks.workunit.client.0.vm07.stdout:2/483: dread d3/dd/d16/d29/d2d/d45/d85/d8a/f9e [0,4194304] 0 2026-03-09T19:27:33.659 INFO:tasks.workunit.client.0.vm07.stdout:6/399: truncate d0/d13/f26 4657837 0 2026-03-09T19:27:33.661 INFO:tasks.workunit.client.0.vm07.stdout:2/484: readlink d3/dd/d16/d29/d3c/l41 0 2026-03-09T19:27:33.696 INFO:tasks.workunit.client.1.vm08.stdout:7/707: sync 2026-03-09T19:27:33.697 INFO:tasks.workunit.client.1.vm08.stdout:7/708: chown d5/d14/d27 3 1 2026-03-09T19:27:33.700 INFO:tasks.workunit.client.1.vm08.stdout:7/709: dread d5/d14/d2b/d4b/f66 [0,4194304] 0 2026-03-09T19:27:33.712 INFO:tasks.workunit.client.0.vm07.stdout:2/485: sync 2026-03-09T19:27:33.713 INFO:tasks.workunit.client.1.vm08.stdout:7/710: creat d5/d14/dae/d3a/d42/ff3 x:0 0 0 2026-03-09T19:27:33.715 
INFO:tasks.workunit.client.0.vm07.stdout:2/486: creat d3/dd/d16/d29/fa3 x:0 0 0 2026-03-09T19:27:33.721 INFO:tasks.workunit.client.0.vm07.stdout:2/487: mkdir d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4 0 2026-03-09T19:27:33.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:33 vm07.local ceph-mon[48545]: pgmap v171: 65 pgs: 65 active+clean; 2.3 GiB data, 8.3 GiB used, 112 GiB / 120 GiB avail; 31 MiB/s rd, 97 MiB/s wr, 212 op/s 2026-03-09T19:27:33.722 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:33 vm07.local ceph-mon[48545]: Upgrade: Updating mgr.vm08.mxylvw 2026-03-09T19:27:33.723 INFO:tasks.workunit.client.0.vm07.stdout:2/488: dwrite d3/dd/d16/d29/f58 [0,4194304] 0 2026-03-09T19:27:33.731 INFO:tasks.workunit.client.0.vm07.stdout:2/489: creat d3/dd/d16/d29/d2d/d45/d85/fa5 x:0 0 0 2026-03-09T19:27:33.735 INFO:tasks.workunit.client.0.vm07.stdout:2/490: symlink d3/dd/d16/d29/d3c/d5a/d7a/d74/la6 0 2026-03-09T19:27:33.735 INFO:tasks.workunit.client.0.vm07.stdout:2/491: truncate d3/dd/d16/d30/f7e 782015 0 2026-03-09T19:27:33.737 INFO:tasks.workunit.client.0.vm07.stdout:2/492: rename d3/d11/d38/d86 to d3/dd/d16/d30/da7 0 2026-03-09T19:27:33.741 INFO:tasks.workunit.client.0.vm07.stdout:2/493: mknod d3/dd/d16/d29/d3c/d4c/ca8 0 2026-03-09T19:27:33.742 INFO:tasks.workunit.client.0.vm07.stdout:2/494: rmdir d3/dd/d16/d29/d2d/d45/d3b 39 2026-03-09T19:27:33.746 INFO:tasks.workunit.client.0.vm07.stdout:2/495: mknod d3/dd/d16/d29/d2d/d45/d3b/d44/ca9 0 2026-03-09T19:27:33.750 INFO:tasks.workunit.client.0.vm07.stdout:2/496: mkdir d3/dd/daa 0 2026-03-09T19:27:33.756 INFO:tasks.workunit.client.0.vm07.stdout:2/497: creat d3/dd/d16/d29/d2d/d45/d8b/d98/fab x:0 0 0 2026-03-09T19:27:33.758 INFO:tasks.workunit.client.0.vm07.stdout:2/498: mknod d3/dd/d16/d29/d2d/d45/d3b/d44/d97/cac 0 2026-03-09T19:27:33.765 INFO:tasks.workunit.client.0.vm07.stdout:2/499: dread d3/dd/f73 [0,4194304] 0 2026-03-09T19:27:33.765 INFO:tasks.workunit.client.0.vm07.stdout:2/500: stat d3/dd/f1e 0 
2026-03-09T19:27:33.775 INFO:tasks.workunit.client.0.vm07.stdout:5/428: dread d3/dd/f58 [0,4194304] 0 2026-03-09T19:27:33.775 INFO:tasks.workunit.client.0.vm07.stdout:2/501: mkdir d3/dd/d16/d30/da7/dad 0 2026-03-09T19:27:33.776 INFO:tasks.workunit.client.0.vm07.stdout:2/502: fsync d3/f93 0 2026-03-09T19:27:33.778 INFO:tasks.workunit.client.0.vm07.stdout:5/429: mkdir d3/dd/d26/d2c/d89 0 2026-03-09T19:27:33.787 INFO:tasks.workunit.client.0.vm07.stdout:5/430: rmdir d3/d1a/d5a 39 2026-03-09T19:27:33.795 INFO:tasks.workunit.client.0.vm07.stdout:5/431: truncate d3/d1a/fa 5204244 0 2026-03-09T19:27:33.796 INFO:tasks.workunit.client.1.vm08.stdout:0/665: write dd/f19 [983593,79788] 0 2026-03-09T19:27:33.796 INFO:tasks.workunit.client.0.vm07.stdout:9/472: write d0/db/d29/d2c/f4a [3872600,98985] 0 2026-03-09T19:27:33.798 INFO:tasks.workunit.client.0.vm07.stdout:9/473: mkdir d0/db/d29/da8 0 2026-03-09T19:27:33.799 INFO:tasks.workunit.client.1.vm08.stdout:0/666: creat dd/d22/d24/d49/d50/d78/d86/fd5 x:0 0 0 2026-03-09T19:27:33.801 INFO:tasks.workunit.client.0.vm07.stdout:9/474: creat d0/d6f/fa9 x:0 0 0 2026-03-09T19:27:33.801 INFO:tasks.workunit.client.1.vm08.stdout:0/667: symlink dd/d22/d7b/ld6 0 2026-03-09T19:27:33.802 INFO:tasks.workunit.client.0.vm07.stdout:9/475: mknod d0/d6/d3a/d94/caa 0 2026-03-09T19:27:33.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:33 vm08.local ceph-mon[57794]: pgmap v171: 65 pgs: 65 active+clean; 2.3 GiB data, 8.3 GiB used, 112 GiB / 120 GiB avail; 31 MiB/s rd, 97 MiB/s wr, 212 op/s 2026-03-09T19:27:33.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:33 vm08.local ceph-mon[57794]: Upgrade: Updating mgr.vm08.mxylvw 2026-03-09T19:27:33.884 INFO:tasks.workunit.client.1.vm08.stdout:6/684: dread d3/f6e [0,4194304] 0 2026-03-09T19:27:33.891 INFO:tasks.workunit.client.1.vm08.stdout:1/799: dread d9/d11/f56 [0,4194304] 0 2026-03-09T19:27:33.907 INFO:tasks.workunit.client.1.vm08.stdout:1/800: getdents d9/d40/d49/d9e 0 
2026-03-09T19:27:33.907 INFO:tasks.workunit.client.1.vm08.stdout:1/801: unlink d9/da/d2d/d4e/ca2 0 2026-03-09T19:27:33.925 INFO:tasks.workunit.client.0.vm07.stdout:5/432: sync 2026-03-09T19:27:33.953 INFO:tasks.workunit.client.0.vm07.stdout:5/433: creat d3/dd/f8a x:0 0 0 2026-03-09T19:27:33.953 INFO:tasks.workunit.client.0.vm07.stdout:5/434: dread - d3/dd/d26/d2d/d60/f7f zero size 2026-03-09T19:27:33.953 INFO:tasks.workunit.client.0.vm07.stdout:5/435: dread d3/d1a/f12 [0,4194304] 0 2026-03-09T19:27:33.953 INFO:tasks.workunit.client.0.vm07.stdout:5/436: mknod d3/d1a/c8b 0 2026-03-09T19:27:33.953 INFO:tasks.workunit.client.0.vm07.stdout:5/437: dread d3/dd/f24 [0,4194304] 0 2026-03-09T19:27:33.953 INFO:tasks.workunit.client.0.vm07.stdout:5/438: rename d3/f2f to d3/d1a/d28/d36/f8c 0 2026-03-09T19:27:33.969 INFO:tasks.workunit.client.1.vm08.stdout:4/639: write da/d10/d16/d28/d2f/d4f/f83 [568398,76808] 0 2026-03-09T19:27:33.970 INFO:tasks.workunit.client.1.vm08.stdout:4/640: chown f2 11 1 2026-03-09T19:27:33.972 INFO:tasks.workunit.client.1.vm08.stdout:4/641: getdents da/d10/d16/d28/d2f/d4f/d64/d84/d8a/da2 0 2026-03-09T19:27:33.973 INFO:tasks.workunit.client.1.vm08.stdout:4/642: mkdir da/d10/d26/da0/dc0 0 2026-03-09T19:27:33.974 INFO:tasks.workunit.client.1.vm08.stdout:4/643: truncate da/d10/d26/d50/fbb 2241261 0 2026-03-09T19:27:33.975 INFO:tasks.workunit.client.1.vm08.stdout:4/644: write da/d10/d26/d27/fac [3014512,17659] 0 2026-03-09T19:27:33.976 INFO:tasks.workunit.client.1.vm08.stdout:4/645: chown da/d10/l3c 82307 1 2026-03-09T19:27:33.990 INFO:tasks.workunit.client.1.vm08.stdout:6/685: sync 2026-03-09T19:27:33.992 INFO:tasks.workunit.client.0.vm07.stdout:7/458: dread d0/d4/d5/d26/f31 [0,4194304] 0 2026-03-09T19:27:33.994 INFO:tasks.workunit.client.0.vm07.stdout:7/459: dread d0/d4/d5/f50 [0,4194304] 0 2026-03-09T19:27:33.999 INFO:tasks.workunit.client.1.vm08.stdout:6/686: creat d3/db/d43/d69/f101 x:0 0 0 2026-03-09T19:27:33.999 
INFO:tasks.workunit.client.1.vm08.stdout:6/687: stat d3/d15/f40 0 2026-03-09T19:27:34.023 INFO:tasks.workunit.client.1.vm08.stdout:5/622: write d16/d1e/d3b/f43 [3415026,98159] 0 2026-03-09T19:27:34.031 INFO:tasks.workunit.client.0.vm07.stdout:3/500: write d1/d6/f21 [3066625,47837] 0 2026-03-09T19:27:34.032 INFO:tasks.workunit.client.0.vm07.stdout:3/501: readlink d1/d6/d45/l9a 0 2026-03-09T19:27:34.036 INFO:tasks.workunit.client.1.vm08.stdout:8/622: dwrite de/fb2 [0,4194304] 0 2026-03-09T19:27:34.039 INFO:tasks.workunit.client.0.vm07.stdout:9/476: dread d0/db/d29/d2c/f34 [0,4194304] 0 2026-03-09T19:27:34.051 INFO:tasks.workunit.client.1.vm08.stdout:4/646: dread da/d10/d26/d27/fac [0,4194304] 0 2026-03-09T19:27:34.057 INFO:tasks.workunit.client.1.vm08.stdout:5/623: dread d16/d1e/d8c/d99/da8/f8b [0,4194304] 0 2026-03-09T19:27:34.085 INFO:tasks.workunit.client.1.vm08.stdout:2/580: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/f50 [0,4194304] 0 2026-03-09T19:27:34.089 INFO:tasks.workunit.client.0.vm07.stdout:8/466: write d7/d9/d10/f41 [762362,1617] 0 2026-03-09T19:27:34.102 INFO:tasks.workunit.client.1.vm08.stdout:9/653: truncate d0/d1b/d97/d48/d5d/fb3 758884 0 2026-03-09T19:27:34.107 INFO:tasks.workunit.client.0.vm07.stdout:3/502: rename d1/d6/d4c/c78 to d1/d6/d4c/d97/ca0 0 2026-03-09T19:27:34.108 INFO:tasks.workunit.client.1.vm08.stdout:9/654: dwrite d0/d1b/f4b [4194304,4194304] 0 2026-03-09T19:27:34.123 INFO:tasks.workunit.client.0.vm07.stdout:8/467: mkdir d7/d16/d1e/dab 0 2026-03-09T19:27:34.124 INFO:tasks.workunit.client.1.vm08.stdout:4/647: rename f1 to da/d10/d16/fc1 0 2026-03-09T19:27:34.124 INFO:tasks.workunit.client.1.vm08.stdout:5/624: mknod d16/d1e/d3b/ccb 0 2026-03-09T19:27:34.127 INFO:tasks.workunit.client.0.vm07.stdout:8/468: dread d7/d16/f69 [0,4194304] 0 2026-03-09T19:27:34.138 INFO:tasks.workunit.client.1.vm08.stdout:2/581: truncate d3/d9/d79/f6b 1059579 0 2026-03-09T19:27:34.138 INFO:tasks.workunit.client.1.vm08.stdout:2/582: fdatasync d3/d9/d79/fc2 0 
2026-03-09T19:27:34.138 INFO:tasks.workunit.client.1.vm08.stdout:3/723: dwrite d0/d6/f39 [0,4194304] 0 2026-03-09T19:27:34.138 INFO:tasks.workunit.client.1.vm08.stdout:3/724: readlink d0/d6/de/d1b/d16/l61 0 2026-03-09T19:27:34.138 INFO:tasks.workunit.client.0.vm07.stdout:8/469: fsync d7/d16/f69 0 2026-03-09T19:27:34.138 INFO:tasks.workunit.client.0.vm07.stdout:3/503: mknod d1/d6/d45/d54/ca1 0 2026-03-09T19:27:34.138 INFO:tasks.workunit.client.0.vm07.stdout:9/477: truncate d0/db/d29/d68/f6b 374208 0 2026-03-09T19:27:34.138 INFO:tasks.workunit.client.0.vm07.stdout:9/478: dwrite d0/d6/f10 [4194304,4194304] 0 2026-03-09T19:27:34.138 INFO:tasks.workunit.client.0.vm07.stdout:0/414: dwrite d0/d6/d13/d17/d19/f7c [0,4194304] 0 2026-03-09T19:27:34.160 INFO:tasks.workunit.client.1.vm08.stdout:5/625: mkdir d16/d1e/d8c/d99/dcc 0 2026-03-09T19:27:34.163 INFO:tasks.workunit.client.0.vm07.stdout:0/415: dread d0/d6/d13/d1c/d11/f5f [0,4194304] 0 2026-03-09T19:27:34.172 INFO:tasks.workunit.client.0.vm07.stdout:4/441: dwrite d3/d4f/d56/f7f [0,4194304] 0 2026-03-09T19:27:34.172 INFO:tasks.workunit.client.0.vm07.stdout:4/442: readlink d3/ld 0 2026-03-09T19:27:34.177 INFO:tasks.workunit.client.0.vm07.stdout:1/485: dwrite d1/d11/d37/d3f/f4a [0,4194304] 0 2026-03-09T19:27:34.178 INFO:tasks.workunit.client.1.vm08.stdout:7/711: write d5/d14/dae/d3a/d42/d6a/f62 [3716467,4720] 0 2026-03-09T19:27:34.208 INFO:tasks.workunit.client.1.vm08.stdout:5/626: dread - d16/d1e/d30/fc2 zero size 2026-03-09T19:27:34.210 INFO:tasks.workunit.client.0.vm07.stdout:0/416: mkdir d0/d6/d13/d1c/d11/d8b 0 2026-03-09T19:27:34.215 INFO:tasks.workunit.client.0.vm07.stdout:8/470: rename d7/d9/l35 to d7/d9/d37/d45/lac 0 2026-03-09T19:27:34.216 INFO:tasks.workunit.client.1.vm08.stdout:8/623: getdents de/d25/d31/d82/d6d/d99/da5 0 2026-03-09T19:27:34.237 INFO:tasks.workunit.client.0.vm07.stdout:2/503: write d3/f4 [439167,41102] 0 2026-03-09T19:27:34.241 INFO:tasks.workunit.client.0.vm07.stdout:4/443: creat d3/d11/d29/f9b 
x:0 0 0 2026-03-09T19:27:34.242 INFO:tasks.workunit.client.0.vm07.stdout:4/444: write d3/d4f/d56/f7f [884466,23570] 0 2026-03-09T19:27:34.255 INFO:tasks.workunit.client.0.vm07.stdout:9/479: creat d0/db/d29/da8/fab x:0 0 0 2026-03-09T19:27:34.255 INFO:tasks.workunit.client.0.vm07.stdout:9/480: stat d0/db/d29 0 2026-03-09T19:27:34.257 INFO:tasks.workunit.client.1.vm08.stdout:9/655: getdents d0/d1b/d68/d7f/d8c/da2 0 2026-03-09T19:27:34.265 INFO:tasks.workunit.client.1.vm08.stdout:0/668: write dd/d22/d24/f26 [3544918,78904] 0 2026-03-09T19:27:34.267 INFO:tasks.workunit.client.1.vm08.stdout:7/712: rename d5/d14/dae/d3a/d42/d85/c22 to d5/d14/d2b/d5d/cf4 0 2026-03-09T19:27:34.268 INFO:tasks.workunit.client.1.vm08.stdout:1/802: write d9/da/d53/d67/d6c/d76/f99 [779620,94134] 0 2026-03-09T19:27:34.286 INFO:tasks.workunit.client.1.vm08.stdout:9/656: fdatasync d0/d2/dc8/fc9 0 2026-03-09T19:27:34.291 INFO:tasks.workunit.client.0.vm07.stdout:8/471: symlink d7/d9/d57/lad 0 2026-03-09T19:27:34.293 INFO:tasks.workunit.client.0.vm07.stdout:8/472: chown d7/d9/d37/d45/d4f/c78 30 1 2026-03-09T19:27:34.293 INFO:tasks.workunit.client.0.vm07.stdout:2/504: mkdir d3/dd/d16/d29/d2d/d45/d3b/dae 0 2026-03-09T19:27:34.296 INFO:tasks.workunit.client.1.vm08.stdout:0/669: fdatasync dd/d22/d27/d2e/f39 0 2026-03-09T19:27:34.296 INFO:tasks.workunit.client.1.vm08.stdout:0/670: stat dd/d9d 0 2026-03-09T19:27:34.297 INFO:tasks.workunit.client.1.vm08.stdout:0/671: chown dd/d22/d27/d2e/f39 596909617 1 2026-03-09T19:27:34.317 INFO:tasks.workunit.client.0.vm07.stdout:7/460: write d0/d4/d5/d8/d41/f89 [314994,90969] 0 2026-03-09T19:27:34.325 INFO:tasks.workunit.client.1.vm08.stdout:6/688: write d3/f3e [5006123,20758] 0 2026-03-09T19:27:34.327 INFO:tasks.workunit.client.1.vm08.stdout:9/657: creat d0/d2/d8/dcd/fda x:0 0 0 2026-03-09T19:27:34.360 INFO:tasks.workunit.client.0.vm07.stdout:2/505: unlink d3/dd/d16/d29/d2d/d45/d8b/f7b 0 2026-03-09T19:27:34.370 INFO:tasks.workunit.client.1.vm08.stdout:4/648: dwrite 
da/d10/d16/d28/d46/d52/d6e/d2c/f36 [0,4194304] 0 2026-03-09T19:27:34.372 INFO:tasks.workunit.client.1.vm08.stdout:9/658: creat d0/d1b/d97/d48/d6f/fdb x:0 0 0 2026-03-09T19:27:34.372 INFO:tasks.workunit.client.1.vm08.stdout:2/583: write d3/d9/d4a/f59 [1752017,71776] 0 2026-03-09T19:27:34.372 INFO:tasks.workunit.client.1.vm08.stdout:9/659: chown d0/d1b/d4e/c5b 62 1 2026-03-09T19:27:34.381 INFO:tasks.workunit.client.1.vm08.stdout:3/725: write d0/d6/de/d1b/d16/d17/fbc [3576423,48097] 0 2026-03-09T19:27:34.382 INFO:tasks.workunit.client.0.vm07.stdout:3/504: truncate d1/d6/dd/f33 4059394 0 2026-03-09T19:27:34.383 INFO:tasks.workunit.client.0.vm07.stdout:6/400: dwrite d0/d13/f26 [0,4194304] 0 2026-03-09T19:27:34.384 INFO:tasks.workunit.client.0.vm07.stdout:6/401: write d0/d1/db/d17/f38 [4800237,96115] 0 2026-03-09T19:27:34.400 INFO:tasks.workunit.client.1.vm08.stdout:9/660: rename d0/d2/c3d to d0/d1b/d68/cdc 0 2026-03-09T19:27:34.400 INFO:tasks.workunit.client.1.vm08.stdout:5/627: write d16/d1e/f7d [278855,88493] 0 2026-03-09T19:27:34.405 INFO:tasks.workunit.client.1.vm08.stdout:8/624: write de/d25/d31/fc0 [295104,109130] 0 2026-03-09T19:27:34.413 INFO:tasks.workunit.client.1.vm08.stdout:7/713: write d5/d14/d27/d78/dc7/fd4 [883550,90820] 0 2026-03-09T19:27:34.417 INFO:tasks.workunit.client.0.vm07.stdout:0/417: dwrite d0/d6/d13/d1c/f27 [0,4194304] 0 2026-03-09T19:27:34.431 INFO:tasks.workunit.client.1.vm08.stdout:2/584: creat d3/d4/d3e/d9d/fc5 x:0 0 0 2026-03-09T19:27:34.433 INFO:tasks.workunit.client.0.vm07.stdout:8/473: dwrite d7/d9/d37/d34/f5a [0,4194304] 0 2026-03-09T19:27:34.433 INFO:tasks.workunit.client.1.vm08.stdout:0/672: dwrite dd/d22/f28 [4194304,4194304] 0 2026-03-09T19:27:34.435 INFO:tasks.workunit.client.1.vm08.stdout:0/673: chown dd/d22/d7b/f83 836867 1 2026-03-09T19:27:34.437 INFO:tasks.workunit.client.1.vm08.stdout:2/585: read d3/d4/d23/d2c/d39/d5e/d14/f58 [110697,41750] 0 2026-03-09T19:27:34.454 INFO:tasks.workunit.client.1.vm08.stdout:1/803: write 
d9/da/f1e [5150350,120464] 0 2026-03-09T19:27:34.459 INFO:tasks.workunit.client.1.vm08.stdout:5/628: mkdir d16/d1e/d6e/dcd 0 2026-03-09T19:27:34.460 INFO:tasks.workunit.client.1.vm08.stdout:5/629: chown d16/d45/daf 1848658 1 2026-03-09T19:27:34.461 INFO:tasks.workunit.client.1.vm08.stdout:5/630: dwrite d16/d1e/d3b/f43 [0,4194304] 0 2026-03-09T19:27:34.463 INFO:tasks.workunit.client.1.vm08.stdout:5/631: chown d16/d1e/d30/d6f/fbb 4 1 2026-03-09T19:27:34.464 INFO:tasks.workunit.client.1.vm08.stdout:0/674: read - dd/d22/d24/d49/fae zero size 2026-03-09T19:27:34.465 INFO:tasks.workunit.client.1.vm08.stdout:5/632: chown d16/d45/d81/c91 795463691 1 2026-03-09T19:27:34.466 INFO:tasks.workunit.client.1.vm08.stdout:5/633: chown d16/d1e/d6e/l77 95277644 1 2026-03-09T19:27:34.468 INFO:tasks.workunit.client.1.vm08.stdout:9/661: fsync d0/d1b/d97/d48/d5e/fc7 0 2026-03-09T19:27:34.473 INFO:tasks.workunit.client.1.vm08.stdout:8/625: mkdir de/d25/d87/dc9/dd8 0 2026-03-09T19:27:34.479 INFO:tasks.workunit.client.1.vm08.stdout:2/586: mknod d3/cc6 0 2026-03-09T19:27:34.481 INFO:tasks.workunit.client.1.vm08.stdout:0/675: creat dd/d22/d27/d4f/fd7 x:0 0 0 2026-03-09T19:27:34.487 INFO:tasks.workunit.client.0.vm07.stdout:4/445: creat d3/d11/d16/d2f/d22/d70/d93/f9c x:0 0 0 2026-03-09T19:27:34.491 INFO:tasks.workunit.client.1.vm08.stdout:9/662: creat d0/d1b/d68/d7f/d8c/da2/fdd x:0 0 0 2026-03-09T19:27:34.492 INFO:tasks.workunit.client.1.vm08.stdout:9/663: write d0/d2/d14/d98/f9e [478596,5925] 0 2026-03-09T19:27:34.500 INFO:tasks.workunit.client.0.vm07.stdout:7/461: mknod d0/d4/d5/d26/d3c/c9f 0 2026-03-09T19:27:34.501 INFO:tasks.workunit.client.1.vm08.stdout:6/689: write d3/f2a [2807169,1092] 0 2026-03-09T19:27:34.505 INFO:tasks.workunit.client.1.vm08.stdout:4/649: dwrite da/d10/f13 [4194304,4194304] 0 2026-03-09T19:27:34.507 INFO:tasks.workunit.client.0.vm07.stdout:1/486: link d1/d11/f83 d1/d3e/d5c/fa7 0 2026-03-09T19:27:34.507 INFO:tasks.workunit.client.1.vm08.stdout:8/626: creat 
de/d1d/d4f/fd9 x:0 0 0 2026-03-09T19:27:34.521 INFO:tasks.workunit.client.1.vm08.stdout:0/676: mknod dd/d22/d63/d6e/cd8 0 2026-03-09T19:27:34.537 INFO:tasks.workunit.client.0.vm07.stdout:2/506: creat d3/d49/faf x:0 0 0 2026-03-09T19:27:34.537 INFO:tasks.workunit.client.0.vm07.stdout:3/505: fsync d1/d6/f1b 0 2026-03-09T19:27:34.537 INFO:tasks.workunit.client.1.vm08.stdout:6/690: fdatasync d3/d34/da9/da4/fe0 0 2026-03-09T19:27:34.537 INFO:tasks.workunit.client.1.vm08.stdout:8/627: creat de/d25/d31/d82/d6d/d99/da0/fda x:0 0 0 2026-03-09T19:27:34.538 INFO:tasks.workunit.client.1.vm08.stdout:0/677: truncate dd/d22/f29 5038377 0 2026-03-09T19:27:34.540 INFO:tasks.workunit.client.1.vm08.stdout:6/691: creat d3/d94/f102 x:0 0 0 2026-03-09T19:27:34.540 INFO:tasks.workunit.client.1.vm08.stdout:8/628: dread - de/d47/fc1 zero size 2026-03-09T19:27:34.541 INFO:tasks.workunit.client.0.vm07.stdout:8/474: creat d7/d30/d75/fae x:0 0 0 2026-03-09T19:27:34.542 INFO:tasks.workunit.client.1.vm08.stdout:2/587: link d3/d4/d23/d2c/d39/d5e/de/d18/d1f/l43 d3/d4/d23/d2c/d39/d5e/de/d18/d1f/lc7 0 2026-03-09T19:27:34.542 INFO:tasks.workunit.client.0.vm07.stdout:4/446: creat d3/d4f/f9d x:0 0 0 2026-03-09T19:27:34.544 INFO:tasks.workunit.client.0.vm07.stdout:7/462: rename d0/d4/d5/f50 to d0/d52/d54/d5a/d87/d92/fa0 0 2026-03-09T19:27:34.544 INFO:tasks.workunit.client.1.vm08.stdout:6/692: rmdir d3/db/d43 39 2026-03-09T19:27:34.548 INFO:tasks.workunit.client.1.vm08.stdout:8/629: creat de/d91/fdb x:0 0 0 2026-03-09T19:27:34.561 INFO:tasks.workunit.client.0.vm07.stdout:3/506: rmdir d1/d6/dd 39 2026-03-09T19:27:34.561 INFO:tasks.workunit.client.1.vm08.stdout:8/630: fsync de/d1d/d21/d73/fa7 0 2026-03-09T19:27:34.561 INFO:tasks.workunit.client.1.vm08.stdout:2/588: unlink d3/d4/d23/d2c/d39/d5e/d14/c40 0 2026-03-09T19:27:34.561 INFO:tasks.workunit.client.1.vm08.stdout:2/589: unlink d3/d4/d23/cb6 0 2026-03-09T19:27:34.561 INFO:tasks.workunit.client.1.vm08.stdout:6/693: truncate d3/fc 1595382 0 
2026-03-09T19:27:34.562 INFO:tasks.workunit.client.0.vm07.stdout:2/507: rename d3/d11/f1f to d3/dd/d16/d29/d2d/d45/d3b/fb0 0 2026-03-09T19:27:34.564 INFO:tasks.workunit.client.1.vm08.stdout:2/590: dread d3/f7c [0,4194304] 0 2026-03-09T19:27:34.568 INFO:tasks.workunit.client.1.vm08.stdout:8/631: dread de/d25/f71 [0,4194304] 0 2026-03-09T19:27:34.568 INFO:tasks.workunit.client.1.vm08.stdout:8/632: read f1 [555596,86254] 0 2026-03-09T19:27:34.571 INFO:tasks.workunit.client.0.vm07.stdout:6/402: dread d0/d2d/f4a [0,4194304] 0 2026-03-09T19:27:34.572 INFO:tasks.workunit.client.0.vm07.stdout:8/475: creat d7/d50/da6/faf x:0 0 0 2026-03-09T19:27:34.575 INFO:tasks.workunit.client.1.vm08.stdout:8/633: dread - de/d1d/d2e/d5f/fbb zero size 2026-03-09T19:27:34.576 INFO:tasks.workunit.client.0.vm07.stdout:6/403: chown d0/d1/db/l71 1 1 2026-03-09T19:27:34.577 INFO:tasks.workunit.client.0.vm07.stdout:3/507: truncate d1/d1f/d16/f3a 1207668 0 2026-03-09T19:27:34.579 INFO:tasks.workunit.client.0.vm07.stdout:4/447: link d3/d11/c85 d3/d11/d29/c9e 0 2026-03-09T19:27:34.590 INFO:tasks.workunit.client.0.vm07.stdout:2/508: creat d3/dd/daa/fb1 x:0 0 0 2026-03-09T19:27:34.590 INFO:tasks.workunit.client.0.vm07.stdout:3/508: mknod d1/d6/d4c/d97/ca2 0 2026-03-09T19:27:34.590 INFO:tasks.workunit.client.0.vm07.stdout:4/448: rename d3/d11/d2b/d37/l2d to d3/d11/d16/d2f/d91/l9f 0 2026-03-09T19:27:34.590 INFO:tasks.workunit.client.0.vm07.stdout:2/509: rename d3/dd/d16/d29/d2d/d45/d8b/l8c to d3/dd/d16/d30/d40/lb2 0 2026-03-09T19:27:34.590 INFO:tasks.workunit.client.0.vm07.stdout:3/509: dread d1/d74/f52 [0,4194304] 0 2026-03-09T19:27:34.590 INFO:tasks.workunit.client.0.vm07.stdout:3/510: readlink d1/d74/l6f 0 2026-03-09T19:27:34.597 INFO:tasks.workunit.client.1.vm08.stdout:8/634: dread de/d1d/d21/f23 [0,4194304] 0 2026-03-09T19:27:34.599 INFO:tasks.workunit.client.1.vm08.stdout:8/635: read - de/d1d/fb0 zero size 2026-03-09T19:27:34.604 INFO:tasks.workunit.client.0.vm07.stdout:2/510: dread 
d3/dd/d16/d29/f42 [0,4194304] 0 2026-03-09T19:27:34.605 INFO:tasks.workunit.client.1.vm08.stdout:8/636: fdatasync f1 0 2026-03-09T19:27:34.606 INFO:tasks.workunit.client.1.vm08.stdout:8/637: write de/d1d/d21/f86 [5151057,16277] 0 2026-03-09T19:27:34.608 INFO:tasks.workunit.client.1.vm08.stdout:8/638: chown de/d25/d31/d82/d6d/d99/da5/db3/c3f 2298 1 2026-03-09T19:27:34.612 INFO:tasks.workunit.client.0.vm07.stdout:2/511: mkdir d3/dd/d16/d29/d3c/d5a/db3 0 2026-03-09T19:27:34.649 INFO:tasks.workunit.client.0.vm07.stdout:2/512: symlink d3/dd/d16/d29/d3c/da2/lb4 0 2026-03-09T19:27:34.649 INFO:tasks.workunit.client.0.vm07.stdout:2/513: creat d3/dd/d16/d29/d3c/d5a/d7a/d74/fb5 x:0 0 0 2026-03-09T19:27:34.649 INFO:tasks.workunit.client.0.vm07.stdout:2/514: readlink d3/dd/d16/d29/d2d/d45/d3b/l79 0 2026-03-09T19:27:34.649 INFO:tasks.workunit.client.0.vm07.stdout:2/515: creat d3/dd/d16/d30/da7/dad/fb6 x:0 0 0 2026-03-09T19:27:34.671 INFO:tasks.workunit.client.1.vm08.stdout:9/664: sync 2026-03-09T19:27:34.671 INFO:tasks.workunit.client.0.vm07.stdout:2/516: sync 2026-03-09T19:27:34.681 INFO:tasks.workunit.client.1.vm08.stdout:3/726: write d0/d6/de/d15/fa4 [627483,118391] 0 2026-03-09T19:27:34.685 INFO:tasks.workunit.client.1.vm08.stdout:7/714: dwrite d5/d14/d38/dad/fc1 [0,4194304] 0 2026-03-09T19:27:34.688 INFO:tasks.workunit.client.1.vm08.stdout:1/804: dwrite d9/da/d53/d67/fc4 [0,4194304] 0 2026-03-09T19:27:34.701 INFO:tasks.workunit.client.0.vm07.stdout:8/476: fsync d7/d9/d37/d34/f5a 0 2026-03-09T19:27:34.707 INFO:tasks.workunit.client.1.vm08.stdout:9/665: creat d0/d2/d80/fde x:0 0 0 2026-03-09T19:27:34.713 INFO:tasks.workunit.client.1.vm08.stdout:5/634: dwrite d16/d1e/f5a [0,4194304] 0 2026-03-09T19:27:34.713 INFO:tasks.workunit.client.1.vm08.stdout:5/635: readlink d16/d45/l48 0 2026-03-09T19:27:34.714 INFO:tasks.workunit.client.1.vm08.stdout:3/727: mkdir d0/d6/de/d6e/d51/d7f/de3 0 2026-03-09T19:27:34.720 INFO:tasks.workunit.client.0.vm07.stdout:8/477: creat 
d7/d9/d37/d45/d4f/fb0 x:0 0 0 2026-03-09T19:27:34.722 INFO:tasks.workunit.client.0.vm07.stdout:9/481: dwrite d0/d17/f5e [0,4194304] 0 2026-03-09T19:27:34.726 INFO:tasks.workunit.client.0.vm07.stdout:8/478: mkdir d7/d9/d37/d45/d4f/db1 0 2026-03-09T19:27:34.731 INFO:tasks.workunit.client.1.vm08.stdout:1/805: dread d9/da/d12/fac [0,4194304] 0 2026-03-09T19:27:34.743 INFO:tasks.workunit.client.1.vm08.stdout:5/636: creat d16/d45/d81/fce x:0 0 0 2026-03-09T19:27:34.745 INFO:tasks.workunit.client.1.vm08.stdout:4/650: dwrite da/fab [0,4194304] 0 2026-03-09T19:27:34.754 INFO:tasks.workunit.client.0.vm07.stdout:8/479: readlink d7/d9/da7/la8 0 2026-03-09T19:27:34.759 INFO:tasks.workunit.client.0.vm07.stdout:8/480: dread - d7/d9/f65 zero size 2026-03-09T19:27:34.762 INFO:tasks.workunit.client.1.vm08.stdout:0/678: write dd/d22/d24/d49/f4c [678397,62154] 0 2026-03-09T19:27:34.762 INFO:tasks.workunit.client.1.vm08.stdout:1/806: mkdir d9/da/d17/d60/df5 0 2026-03-09T19:27:34.762 INFO:tasks.workunit.client.1.vm08.stdout:1/807: write d9/d11/d7a/ff1 [528354,29424] 0 2026-03-09T19:27:34.766 INFO:tasks.workunit.client.0.vm07.stdout:9/482: rename d0/db/d29/d2c/d36/d7d/f8b to d0/db/fac 0 2026-03-09T19:27:34.766 INFO:tasks.workunit.client.0.vm07.stdout:1/487: write d1/db/d31/d4f/f8c [4248187,32313] 0 2026-03-09T19:27:34.767 INFO:tasks.workunit.client.1.vm08.stdout:5/637: unlink f5 0 2026-03-09T19:27:34.768 INFO:tasks.workunit.client.0.vm07.stdout:8/481: creat d7/d9/d57/fb2 x:0 0 0 2026-03-09T19:27:34.771 INFO:tasks.workunit.client.1.vm08.stdout:4/651: creat da/d10/d16/d28/d46/d52/d6e/d40/fc2 x:0 0 0 2026-03-09T19:27:34.780 INFO:tasks.workunit.client.1.vm08.stdout:2/591: getdents d3/d4/d23/d2c/d39/d5e/d14 0 2026-03-09T19:27:34.780 INFO:tasks.workunit.client.0.vm07.stdout:0/418: truncate d0/f3a 4014481 0 2026-03-09T19:27:34.780 INFO:tasks.workunit.client.0.vm07.stdout:0/419: chown d0/d6/c42 71 1 2026-03-09T19:27:34.780 INFO:tasks.workunit.client.0.vm07.stdout:1/488: unlink d1/cf 0 
2026-03-09T19:27:34.780 INFO:tasks.workunit.client.1.vm08.stdout:6/694: truncate d3/db/f30 621123 0 2026-03-09T19:27:34.781 INFO:tasks.workunit.client.0.vm07.stdout:7/463: dwrite d0/d4/d5/d26/d3c/d58/f71 [0,4194304] 0 2026-03-09T19:27:34.784 INFO:tasks.workunit.client.0.vm07.stdout:8/482: mknod d7/d50/da6/cb3 0 2026-03-09T19:27:34.796 INFO:tasks.workunit.client.1.vm08.stdout:9/666: rename d0/d1b/d4e to d0/d1b/d97/d48/d5d/ddf 0 2026-03-09T19:27:34.812 INFO:tasks.workunit.client.1.vm08.stdout:5/638: sync 2026-03-09T19:27:34.814 INFO:tasks.workunit.client.0.vm07.stdout:6/404: dwrite d0/d1/db/d24/f4d [0,4194304] 0 2026-03-09T19:27:34.821 INFO:tasks.workunit.client.1.vm08.stdout:6/695: mknod d3/d34/d5c/da2/c103 0 2026-03-09T19:27:34.821 INFO:tasks.workunit.client.0.vm07.stdout:9/483: mkdir d0/db/d29/d32/d5c/d80/dad 0 2026-03-09T19:27:34.825 INFO:tasks.workunit.client.0.vm07.stdout:4/449: dwrite d3/d11/d2b/f2c [0,4194304] 0 2026-03-09T19:27:34.826 INFO:tasks.workunit.client.0.vm07.stdout:3/511: dwrite d1/d1f/d5c/f7d [0,4194304] 0 2026-03-09T19:27:34.828 INFO:tasks.workunit.client.1.vm08.stdout:8/639: write de/d1d/d2e/d5f/fba [50177,20673] 0 2026-03-09T19:27:34.834 INFO:tasks.workunit.client.1.vm08.stdout:8/640: dread de/f16 [0,4194304] 0 2026-03-09T19:27:34.835 INFO:tasks.workunit.client.1.vm08.stdout:8/641: chown de/d1d/d2e 609 1 2026-03-09T19:27:34.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:34 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:27:34.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:34 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:27:34.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:34 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", 
"entity": "mgr.vm08.mxylvw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T19:27:34.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:34 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T19:27:34.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:34 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:27:34.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:34 vm08.local ceph-mon[57794]: Deploying daemon mgr.vm08.mxylvw on vm08 2026-03-09T19:27:34.847 INFO:tasks.workunit.client.0.vm07.stdout:2/517: truncate d3/f15 384562 0 2026-03-09T19:27:34.851 INFO:tasks.workunit.client.1.vm08.stdout:1/808: rename d9/da/d95/fc0 to d9/d11/db6/ff6 0 2026-03-09T19:27:34.852 INFO:tasks.workunit.client.1.vm08.stdout:9/667: truncate d0/d2/d8/dcd/fb2 1427857 0 2026-03-09T19:27:34.859 INFO:tasks.workunit.client.0.vm07.stdout:3/512: dread d1/d6/f21 [0,4194304] 0 2026-03-09T19:27:34.862 INFO:tasks.workunit.client.1.vm08.stdout:7/715: dwrite d5/d14/dae/d3a/f56 [0,4194304] 0 2026-03-09T19:27:34.867 INFO:tasks.workunit.client.0.vm07.stdout:7/464: symlink d0/d4/d5/d26/la1 0 2026-03-09T19:27:34.869 INFO:tasks.workunit.client.0.vm07.stdout:3/513: dread d1/d6/f37 [0,4194304] 0 2026-03-09T19:27:34.878 INFO:tasks.workunit.client.1.vm08.stdout:5/639: dread d16/d45/f65 [0,4194304] 0 2026-03-09T19:27:34.880 INFO:tasks.workunit.client.1.vm08.stdout:2/592: creat d3/d9/d4a/d9a/fc8 x:0 0 0 2026-03-09T19:27:34.881 INFO:tasks.workunit.client.1.vm08.stdout:2/593: write d3/d4/f91 [378637,53793] 0 2026-03-09T19:27:34.886 INFO:tasks.workunit.client.0.vm07.stdout:5/439: dwrite d3/d1a/fa [0,4194304] 0 2026-03-09T19:27:34.886 INFO:tasks.workunit.client.1.vm08.stdout:3/728: link d0/d6/de/d1b/d16/d17/c84 d0/d6/d93/ce4 0 
2026-03-09T19:27:34.890 INFO:tasks.workunit.client.0.vm07.stdout:4/450: mknod d3/d11/d16/d2f/d22/d70/d93/ca0 0 2026-03-09T19:27:34.891 INFO:tasks.workunit.client.1.vm08.stdout:2/594: dread d3/d4/f49 [0,4194304] 0 2026-03-09T19:27:34.891 INFO:tasks.workunit.client.1.vm08.stdout:2/595: chown d3/d9/f84 4041506 1 2026-03-09T19:27:34.894 INFO:tasks.workunit.client.1.vm08.stdout:8/642: readlink de/d25/d31/d82/d6d/d99/da0/lca 0 2026-03-09T19:27:34.907 INFO:tasks.workunit.client.0.vm07.stdout:2/518: rename d3/dd/d16/d30/d40/f4f to d3/dd/d16/d29/d3c/d5a/fb7 0 2026-03-09T19:27:34.908 INFO:tasks.workunit.client.0.vm07.stdout:2/519: dread - d3/dd/d16/d29/d2d/d45/d3b/d53/f9c zero size 2026-03-09T19:27:34.914 INFO:tasks.workunit.client.0.vm07.stdout:7/465: rmdir d0/d52/d54/d5a 39 2026-03-09T19:27:34.919 INFO:tasks.workunit.client.1.vm08.stdout:7/716: dread - d5/d14/dae/f7c zero size 2026-03-09T19:27:34.920 INFO:tasks.workunit.client.1.vm08.stdout:7/717: read - d5/d14/dae/d1c/d73/fbe zero size 2026-03-09T19:27:34.928 INFO:tasks.workunit.client.1.vm08.stdout:5/640: fsync f1 0 2026-03-09T19:27:34.931 INFO:tasks.workunit.client.0.vm07.stdout:0/420: dwrite d0/d6/d13/d17/d19/d57/d6a/f7a [0,4194304] 0 2026-03-09T19:27:34.931 INFO:tasks.workunit.client.1.vm08.stdout:0/679: write dd/d22/d27/d6c/fbf [932975,69450] 0 2026-03-09T19:27:34.931 INFO:tasks.workunit.client.1.vm08.stdout:0/680: write dd/f19 [319330,44409] 0 2026-03-09T19:27:34.931 INFO:tasks.workunit.client.0.vm07.stdout:0/421: chown d0/d6/d13/d1c/d11/d56/d78/d8a 1 1 2026-03-09T19:27:34.938 INFO:tasks.workunit.client.0.vm07.stdout:9/484: creat d0/db/d29/d68/d99/fae x:0 0 0 2026-03-09T19:27:34.943 INFO:tasks.workunit.client.0.vm07.stdout:1/489: dwrite d1/d11/d37/d5d/d50/f6b [0,4194304] 0 2026-03-09T19:27:34.945 INFO:tasks.workunit.client.0.vm07.stdout:8/483: creat d7/d9/d37/fb4 x:0 0 0 2026-03-09T19:27:34.965 INFO:tasks.workunit.client.0.vm07.stdout:8/484: sync 2026-03-09T19:27:34.967 
INFO:tasks.workunit.client.0.vm07.stdout:8/485: chown d7/d9/d37/d45/d56/d67/l9c 104716508 1 2026-03-09T19:27:34.967 INFO:tasks.workunit.client.0.vm07.stdout:4/451: dwrite d3/d11/d16/f77 [0,4194304] 0 2026-03-09T19:27:34.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:34 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:27:34.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:34 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:27:34.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:34 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.mxylvw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T19:27:34.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:34 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T19:27:34.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:34 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:27:34.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:34 vm07.local ceph-mon[48545]: Deploying daemon mgr.vm08.mxylvw on vm08 2026-03-09T19:27:34.988 INFO:tasks.workunit.client.0.vm07.stdout:2/520: truncate d3/d11/f2e 4003821 0 2026-03-09T19:27:34.988 INFO:tasks.workunit.client.1.vm08.stdout:4/652: write da/d10/d16/fbf [937794,72145] 0 2026-03-09T19:27:35.001 INFO:tasks.workunit.client.1.vm08.stdout:1/809: dwrite d9/da/dc/fb9 [0,4194304] 0 2026-03-09T19:27:35.014 INFO:tasks.workunit.client.0.vm07.stdout:0/422: mknod d0/d6/c8c 0 
2026-03-09T19:27:35.017 INFO:tasks.workunit.client.1.vm08.stdout:6/696: link d3/d68/d7e/fbb d3/d94/def/dc4/f104 0 2026-03-09T19:27:35.018 INFO:tasks.workunit.client.1.vm08.stdout:6/697: dread d3/d34/f35 [0,4194304] 0 2026-03-09T19:27:35.020 INFO:tasks.workunit.client.1.vm08.stdout:3/729: truncate d0/d6/de/d1a/f5a 538498 0 2026-03-09T19:27:35.022 INFO:tasks.workunit.client.0.vm07.stdout:3/514: dwrite d1/d74/f31 [0,4194304] 0 2026-03-09T19:27:35.027 INFO:tasks.workunit.client.0.vm07.stdout:6/405: dwrite d0/d1/db/d1d/f27 [0,4194304] 0 2026-03-09T19:27:35.040 INFO:tasks.workunit.client.1.vm08.stdout:2/596: write d3/d4/d23/d2c/f5b [150416,118125] 0 2026-03-09T19:27:35.043 INFO:tasks.workunit.client.0.vm07.stdout:5/440: dwrite d3/d1a/d28/d40/f46 [0,4194304] 0 2026-03-09T19:27:35.044 INFO:tasks.workunit.client.1.vm08.stdout:8/643: dwrite de/d1d/d2e/d5f/fbb [0,4194304] 0 2026-03-09T19:27:35.044 INFO:tasks.workunit.client.0.vm07.stdout:5/441: stat d3/d1a/d28/d36/l7b 0 2026-03-09T19:27:35.045 INFO:tasks.workunit.client.1.vm08.stdout:8/644: stat de/d25/d31/f8e 0 2026-03-09T19:27:35.057 INFO:tasks.workunit.client.0.vm07.stdout:8/486: mknod d7/d9/d10/d44/cb5 0 2026-03-09T19:27:35.062 INFO:tasks.workunit.client.1.vm08.stdout:7/718: dwrite d5/d14/d2b/d4b/fdd [0,4194304] 0 2026-03-09T19:27:35.064 INFO:tasks.workunit.client.1.vm08.stdout:9/668: creat d0/d2/d8/fe0 x:0 0 0 2026-03-09T19:27:35.067 INFO:tasks.workunit.client.0.vm07.stdout:2/521: truncate d3/dd/f73 2978642 0 2026-03-09T19:27:35.073 INFO:tasks.workunit.client.1.vm08.stdout:5/641: dwrite ff [0,4194304] 0 2026-03-09T19:27:35.079 INFO:tasks.workunit.client.0.vm07.stdout:2/522: read d3/fa [3018520,125599] 0 2026-03-09T19:27:35.091 INFO:tasks.workunit.client.0.vm07.stdout:9/485: creat d0/db/d9e/faf x:0 0 0 2026-03-09T19:27:35.100 INFO:tasks.workunit.client.1.vm08.stdout:0/681: creat dd/d22/d24/d49/d50/dd4/fd9 x:0 0 0 2026-03-09T19:27:35.111 INFO:tasks.workunit.client.1.vm08.stdout:3/730: chown d0/d6/de/d1b/d16/d17/f8c 
177991187 1 2026-03-09T19:27:35.118 INFO:tasks.workunit.client.0.vm07.stdout:6/406: mkdir d0/d13/d1e/d95/d31/d9e 0 2026-03-09T19:27:35.118 INFO:tasks.workunit.client.0.vm07.stdout:6/407: chown d0/d1/db/d1d/d77 177 1 2026-03-09T19:27:35.118 INFO:tasks.workunit.client.1.vm08.stdout:8/645: dread - de/d25/d31/d82/d6d/fc3 zero size 2026-03-09T19:27:35.118 INFO:tasks.workunit.client.1.vm08.stdout:8/646: write de/d1d/f1e [2830342,67837] 0 2026-03-09T19:27:35.118 INFO:tasks.workunit.client.1.vm08.stdout:8/647: dread - de/d91/fd1 zero size 2026-03-09T19:27:35.119 INFO:tasks.workunit.client.1.vm08.stdout:8/648: chown de/d47/faa 5 1 2026-03-09T19:27:35.120 INFO:tasks.workunit.client.1.vm08.stdout:8/649: read de/d1d/d21/d73/fa6 [3544465,37921] 0 2026-03-09T19:27:35.127 INFO:tasks.workunit.client.1.vm08.stdout:6/698: dread d3/db/d43/d69/da0/fb7 [0,4194304] 0 2026-03-09T19:27:35.130 INFO:tasks.workunit.client.1.vm08.stdout:4/653: mkdir da/d10/d26/d3a/d49/dc3 0 2026-03-09T19:27:35.131 INFO:tasks.workunit.client.1.vm08.stdout:4/654: stat da/d10/d16/d28/d46/d52/d6e/d6d/la7 0 2026-03-09T19:27:35.133 INFO:tasks.workunit.client.1.vm08.stdout:7/719: mkdir d5/d14/dae/d3a/d42/d85/da0/df5 0 2026-03-09T19:27:35.139 INFO:tasks.workunit.client.0.vm07.stdout:2/523: fsync d3/d11/f18 0 2026-03-09T19:27:35.141 INFO:tasks.workunit.client.0.vm07.stdout:6/408: truncate d0/d1/db/d1d/f47 165744 0 2026-03-09T19:27:35.152 INFO:tasks.workunit.client.1.vm08.stdout:1/810: mknod d9/cf7 0 2026-03-09T19:27:35.152 INFO:tasks.workunit.client.1.vm08.stdout:0/682: dread - dd/d22/d27/d2e/db0/fbc zero size 2026-03-09T19:27:35.162 INFO:tasks.workunit.client.1.vm08.stdout:8/650: chown de/f1f 16002 1 2026-03-09T19:27:35.165 INFO:tasks.workunit.client.0.vm07.stdout:5/442: unlink d3/dd/d26/l87 0 2026-03-09T19:27:35.165 INFO:tasks.workunit.client.0.vm07.stdout:5/443: stat d3/dd/f23 0 2026-03-09T19:27:35.169 INFO:tasks.workunit.client.1.vm08.stdout:6/699: read - d3/d34/d6f/f50 zero size 2026-03-09T19:27:35.172 
INFO:tasks.workunit.client.0.vm07.stdout:7/466: getdents d0/d4/d5/d26/d3c/d39 0 2026-03-09T19:27:35.173 INFO:tasks.workunit.client.0.vm07.stdout:7/467: rename d0/d52 to d0/d52/d54/d55/d7f/da2 22 2026-03-09T19:27:35.191 INFO:tasks.workunit.client.0.vm07.stdout:2/524: creat d3/dd/d16/d29/d2d/d45/d3b/d53/fb8 x:0 0 0 2026-03-09T19:27:35.192 INFO:tasks.workunit.client.0.vm07.stdout:3/515: mknod d1/d6/dd/d51/d8e/ca3 0 2026-03-09T19:27:35.193 INFO:tasks.workunit.client.0.vm07.stdout:6/409: truncate d0/d13/d1e/d95/f48 685625 0 2026-03-09T19:27:35.196 INFO:tasks.workunit.client.0.vm07.stdout:3/516: sync 2026-03-09T19:27:35.196 INFO:tasks.workunit.client.0.vm07.stdout:2/525: sync 2026-03-09T19:27:35.200 INFO:tasks.workunit.client.1.vm08.stdout:0/683: truncate dd/d22/d27/d2e/d37/f44 2865595 0 2026-03-09T19:27:35.201 INFO:tasks.workunit.client.0.vm07.stdout:8/487: link d7/d30/d32/c4b d7/d9/d37/d45/d56/d67/cb6 0 2026-03-09T19:27:35.202 INFO:tasks.workunit.client.0.vm07.stdout:4/452: getdents d3/d4f/d56 0 2026-03-09T19:27:35.203 INFO:tasks.workunit.client.0.vm07.stdout:1/490: write d1/db/f14 [1062830,71778] 0 2026-03-09T19:27:35.204 INFO:tasks.workunit.client.0.vm07.stdout:1/491: chown d1/db/d31/d56/f97 8 1 2026-03-09T19:27:35.207 INFO:tasks.workunit.client.0.vm07.stdout:7/468: creat d0/d4/d5/d8/fa3 x:0 0 0 2026-03-09T19:27:35.208 INFO:tasks.workunit.client.1.vm08.stdout:9/669: write d0/d2/dc8/fc9 [266511,11392] 0 2026-03-09T19:27:35.211 INFO:tasks.workunit.client.1.vm08.stdout:4/655: truncate da/d10/d26/d50/fbb 1509478 0 2026-03-09T19:27:35.219 INFO:tasks.workunit.client.1.vm08.stdout:3/731: dwrite d0/d8/d19/fc4 [0,4194304] 0 2026-03-09T19:27:35.222 INFO:tasks.workunit.client.1.vm08.stdout:2/597: dwrite d3/d4/d23/d2c/d39/f9b [0,4194304] 0 2026-03-09T19:27:35.226 INFO:tasks.workunit.client.1.vm08.stdout:5/642: truncate d16/d1e/d3b/f43 5713551 0 2026-03-09T19:27:35.239 INFO:tasks.workunit.client.1.vm08.stdout:1/811: write d9/d40/d49/d9e/fdd [957047,103333] 0 
2026-03-09T19:27:35.274 INFO:tasks.workunit.client.0.vm07.stdout:5/444: write d3/d1a/f1c [3499239,75630] 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.0.vm07.stdout:9/486: creat d0/db/fb0 x:0 0 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.0.vm07.stdout:9/487: chown d0/db/d29/d2c 97356 1 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.0.vm07.stdout:2/526: mknod d3/dd/d16/d29/d3c/cb9 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.0.vm07.stdout:8/488: read d7/d9/d10/d44/f48 [143633,81658] 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.0.vm07.stdout:4/453: creat d3/d11/d2b/d38/fa1 x:0 0 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.0.vm07.stdout:0/423: link d0/d6/d13/d1c/c47 d0/d6/d13/d17/c8d 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.1.vm08.stdout:8/651: write de/f1f [4255631,27996] 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.1.vm08.stdout:8/652: chown de/d1d/fb0 403693999 1 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.1.vm08.stdout:8/653: stat de/d25/d87 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.1.vm08.stdout:6/700: mknod d3/d34/d3b/df5/c105 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.1.vm08.stdout:4/656: unlink da/d10/d16/d28/d2f/d4f/d64/c82 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.1.vm08.stdout:3/732: creat d0/d4b/fe5 x:0 0 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.1.vm08.stdout:2/598: mknod d3/d4/d23/d2c/dc1/cc9 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.1.vm08.stdout:7/720: creat d5/d14/dae/ff6 x:0 0 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.1.vm08.stdout:7/721: chown d5/d14/d27/d78 5966 1 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.1.vm08.stdout:0/684: symlink dd/lda 0 2026-03-09T19:27:35.274 INFO:tasks.workunit.client.1.vm08.stdout:0/685: chown dd/d22/d24/d49/d50/d78/d86/laf 16852 1 2026-03-09T19:27:35.281 INFO:tasks.workunit.client.0.vm07.stdout:3/517: symlink d1/d6/dd/d51/d87/la4 0 2026-03-09T19:27:35.282 
INFO:tasks.workunit.client.1.vm08.stdout:4/657: rename da/d10/d16/d28/d46/l68 to da/d10/d26/d3a/db5/lc4 0 2026-03-09T19:27:35.282 INFO:tasks.workunit.client.0.vm07.stdout:1/492: dread d1/d11/d37/d3f/d6e/f9f [0,4194304] 0 2026-03-09T19:27:35.285 INFO:tasks.workunit.client.0.vm07.stdout:2/527: mknod d3/dd/d16/d29/d3c/cba 0 2026-03-09T19:27:35.286 INFO:tasks.workunit.client.0.vm07.stdout:2/528: chown d3/dd/d16/d29/d2d/d45/c69 330205 1 2026-03-09T19:27:35.290 INFO:tasks.workunit.client.1.vm08.stdout:1/812: symlink d9/da/d17/lf8 0 2026-03-09T19:27:35.301 INFO:tasks.workunit.client.1.vm08.stdout:9/670: sync 2026-03-09T19:27:35.302 INFO:tasks.workunit.client.0.vm07.stdout:0/424: sync 2026-03-09T19:27:35.304 INFO:tasks.workunit.client.0.vm07.stdout:0/425: sync 2026-03-09T19:27:35.304 INFO:tasks.workunit.client.1.vm08.stdout:0/686: creat dd/d22/d24/fdb x:0 0 0 2026-03-09T19:27:35.305 INFO:tasks.workunit.client.1.vm08.stdout:7/722: read d5/d14/d38/f3c [451373,18694] 0 2026-03-09T19:27:35.306 INFO:tasks.workunit.client.1.vm08.stdout:7/723: chown d5/d14/dae/d3a/f56 61867098 1 2026-03-09T19:27:35.307 INFO:tasks.workunit.client.1.vm08.stdout:0/687: write dd/d22/d27/d65/fcb [659142,23503] 0 2026-03-09T19:27:35.311 INFO:tasks.workunit.client.0.vm07.stdout:5/445: dread d3/d1a/d28/f3c [0,4194304] 0 2026-03-09T19:27:35.312 INFO:tasks.workunit.client.0.vm07.stdout:5/446: write d3/d1a/fa [4340159,95044] 0 2026-03-09T19:27:35.312 INFO:tasks.workunit.client.0.vm07.stdout:0/426: dread d0/d6/d13/d1c/f27 [0,4194304] 0 2026-03-09T19:27:35.330 INFO:tasks.workunit.client.1.vm08.stdout:3/733: symlink d0/d6/le6 0 2026-03-09T19:27:35.363 INFO:tasks.workunit.client.1.vm08.stdout:1/813: symlink d9/da/d17/d60/lf9 0 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.0.vm07.stdout:2/529: fsync d3/dd/f9a 0 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.0.vm07.stdout:6/410: link d0/d13/l6b d0/d1/db/d52/l9f 0 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.0.vm07.stdout:1/493: fsync 
d1/d11/d37/d3f/f82 0 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.0.vm07.stdout:1/494: truncate d1/db/f9b 658120 0 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.0.vm07.stdout:1/495: read d1/d11/d37/d3f/d6e/f9f [3814926,96373] 0 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.0.vm07.stdout:3/518: dread d1/f68 [0,4194304] 0 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.1.vm08.stdout:9/671: dwrite d0/d1b/d97/d48/d5e/f6e [0,4194304] 0 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.1.vm08.stdout:7/724: rmdir d5/d14/dae/d1c/d83/d9c/dcb/dd2 39 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.1.vm08.stdout:9/672: chown d0/d2/d14/d98/l60 2 1 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.1.vm08.stdout:7/725: chown d5/d14/dae/f45 9047443 1 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.1.vm08.stdout:7/726: write d5/d14/d27/d78/dc7/fd4 [1455521,49787] 0 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.1.vm08.stdout:7/727: chown d5/d14/f46 3 1 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.1.vm08.stdout:7/728: readlink d5/d14/dae/l26 0 2026-03-09T19:27:35.364 INFO:tasks.workunit.client.1.vm08.stdout:0/688: rename dd/d22/d24/f87 to dd/d22/d27/d2e/db0/fdc 0 2026-03-09T19:27:35.376 INFO:tasks.workunit.client.1.vm08.stdout:3/734: creat d0/d52/d6d/d77/d88/fe7 x:0 0 0 2026-03-09T19:27:35.425 INFO:tasks.workunit.client.1.vm08.stdout:5/643: write d16/d1e/d30/f3a [690145,76238] 0 2026-03-09T19:27:35.427 INFO:tasks.workunit.client.0.vm07.stdout:7/469: dwrite d0/d4/d5/d8/d41/d64/d74/d98/f18 [0,4194304] 0 2026-03-09T19:27:35.430 INFO:tasks.workunit.client.0.vm07.stdout:9/488: write d0/d6/f7b [53140,83446] 0 2026-03-09T19:27:35.431 INFO:tasks.workunit.client.0.vm07.stdout:7/470: truncate d0/d4/d5/d26/d3c/d39/f7a 426395 0 2026-03-09T19:27:35.435 INFO:tasks.workunit.client.1.vm08.stdout:6/701: dwrite d3/f12 [0,4194304] 0 2026-03-09T19:27:35.440 INFO:tasks.workunit.client.1.vm08.stdout:2/599: write d3/d9/d79/d46/d8c/fbb [484263,94160] 0 
2026-03-09T19:27:35.449 INFO:tasks.workunit.client.1.vm08.stdout:8/654: dwrite f6 [4194304,4194304] 0 2026-03-09T19:27:35.460 INFO:tasks.workunit.client.1.vm08.stdout:7/729: creat d5/d14/d2b/d5d/ff7 x:0 0 0 2026-03-09T19:27:35.478 INFO:tasks.workunit.client.1.vm08.stdout:5/644: chown d16/d45/l85 10 1 2026-03-09T19:27:35.487 INFO:tasks.workunit.client.1.vm08.stdout:3/735: getdents d0/d6/d93/dcb/dde 0 2026-03-09T19:27:35.490 INFO:tasks.workunit.client.1.vm08.stdout:4/658: link da/d10/d16/d28/d46/d52/d6e/l62 da/d10/d1b/d23/lc5 0 2026-03-09T19:27:35.498 INFO:tasks.workunit.client.1.vm08.stdout:1/814: truncate d9/da/d17/fb1 2055078 0 2026-03-09T19:27:35.499 INFO:tasks.workunit.client.1.vm08.stdout:9/673: truncate d0/d1b/d97/f3f 4252720 0 2026-03-09T19:27:35.502 INFO:tasks.workunit.client.1.vm08.stdout:5/645: truncate d16/f18 5477353 0 2026-03-09T19:27:35.506 INFO:tasks.workunit.client.0.vm07.stdout:0/427: fsync d0/d6/d13/d17/d19/d57/d6a/f74 0 2026-03-09T19:27:35.507 INFO:tasks.workunit.client.1.vm08.stdout:6/702: creat d3/dbc/deb/f106 x:0 0 0 2026-03-09T19:27:35.520 INFO:tasks.workunit.client.0.vm07.stdout:8/489: creat d7/d30/fb7 x:0 0 0 2026-03-09T19:27:35.526 INFO:tasks.workunit.client.1.vm08.stdout:2/600: mkdir d3/dca 0 2026-03-09T19:27:35.529 INFO:tasks.workunit.client.0.vm07.stdout:6/411: mkdir d0/d1/db/d17/d4c/d7b/da0 0 2026-03-09T19:27:35.529 INFO:tasks.workunit.client.0.vm07.stdout:6/412: chown d0/d1/f92 296084 1 2026-03-09T19:27:35.537 INFO:tasks.workunit.client.1.vm08.stdout:7/730: mkdir d5/d14/dae/d1c/db5/df8 0 2026-03-09T19:27:35.537 INFO:tasks.workunit.client.1.vm08.stdout:7/731: chown d5/dc4 600743524 1 2026-03-09T19:27:35.538 INFO:tasks.workunit.client.1.vm08.stdout:7/732: chown d5/d14/d2b/d5d/l6e 19147 1 2026-03-09T19:27:35.544 INFO:tasks.workunit.client.0.vm07.stdout:2/530: dwrite d3/dd/f34 [0,4194304] 0 2026-03-09T19:27:35.561 INFO:tasks.workunit.client.0.vm07.stdout:1/496: dread d1/db/d31/d4f/f77 [0,4194304] 0 2026-03-09T19:27:35.580 
INFO:tasks.workunit.client.0.vm07.stdout:9/489: rename d0/d6/ld to d0/db/d29/d68/d99/lb1 0 2026-03-09T19:27:35.581 INFO:tasks.workunit.client.1.vm08.stdout:3/736: dwrite d0/d52/d7c/f99 [0,4194304] 0 2026-03-09T19:27:35.589 INFO:tasks.workunit.client.1.vm08.stdout:1/815: fsync d9/d11/d7a/d89/d8d/da3/fab 0 2026-03-09T19:27:35.590 INFO:tasks.workunit.client.1.vm08.stdout:1/816: chown d9/d11/d7a/d89 30079 1 2026-03-09T19:27:35.593 INFO:tasks.workunit.client.1.vm08.stdout:9/674: rmdir d0/d1b/d97/dd3 39 2026-03-09T19:27:35.593 INFO:tasks.workunit.client.0.vm07.stdout:7/471: read - d0/d4/d5/d26/d3c/d39/f97 zero size 2026-03-09T19:27:35.594 INFO:tasks.workunit.client.1.vm08.stdout:9/675: truncate d0/d1b/d68/d7f/d8c/da2/fdd 76412 0 2026-03-09T19:27:35.599 INFO:tasks.workunit.client.0.vm07.stdout:8/490: truncate d7/d9/d37/f3b 577411 0 2026-03-09T19:27:35.599 INFO:tasks.workunit.client.0.vm07.stdout:4/454: link d3/d11/d16/d2f/f44 d3/fa2 0 2026-03-09T19:27:35.600 INFO:tasks.workunit.client.0.vm07.stdout:6/413: creat d0/d1/db/d52/fa1 x:0 0 0 2026-03-09T19:27:35.604 INFO:tasks.workunit.client.1.vm08.stdout:7/733: creat d5/d14/d2b/daa/ff9 x:0 0 0 2026-03-09T19:27:35.609 INFO:tasks.workunit.client.0.vm07.stdout:3/519: mknod d1/d1f/ca5 0 2026-03-09T19:27:35.610 INFO:tasks.workunit.client.1.vm08.stdout:0/689: getdents dd/d22/d24/d49/d50/d78/db4 0 2026-03-09T19:27:35.613 INFO:tasks.workunit.client.1.vm08.stdout:4/659: write da/d10/d26/f74 [4686286,77930] 0 2026-03-09T19:27:35.614 INFO:tasks.workunit.client.0.vm07.stdout:3/520: sync 2026-03-09T19:27:35.615 INFO:tasks.workunit.client.0.vm07.stdout:3/521: chown d1/d6/dd/d51/f49 118763580 1 2026-03-09T19:27:35.615 INFO:tasks.workunit.client.1.vm08.stdout:0/690: dread dd/f19 [0,4194304] 0 2026-03-09T19:27:35.620 INFO:tasks.workunit.client.0.vm07.stdout:0/428: rename d0/d6/d13/d33/c44 to d0/d6/d13/d1c/d61/d69/c8e 0 2026-03-09T19:27:35.636 INFO:tasks.workunit.client.0.vm07.stdout:1/497: dread d1/d3/f12 [0,4194304] 0 2026-03-09T19:27:35.656 
INFO:tasks.workunit.client.1.vm08.stdout:5/646: mkdir d16/d1e/dc9/dcf 0 2026-03-09T19:27:35.658 INFO:tasks.workunit.client.0.vm07.stdout:8/491: creat d7/d50/da6/fb8 x:0 0 0 2026-03-09T19:27:35.757 INFO:tasks.workunit.client.0.vm07.stdout:7/472: write d0/d4/d5/d26/d32/f7c [3592921,30528] 0 2026-03-09T19:27:35.760 INFO:tasks.workunit.client.0.vm07.stdout:6/414: creat d0/d13/d1e/fa2 x:0 0 0 2026-03-09T19:27:35.765 INFO:tasks.workunit.client.1.vm08.stdout:1/817: dwrite d9/da/d12/f98 [4194304,4194304] 0 2026-03-09T19:27:35.769 INFO:tasks.workunit.client.1.vm08.stdout:1/818: dread d9/d11/d7a/d89/fdb [0,4194304] 0 2026-03-09T19:27:35.769 INFO:tasks.workunit.client.1.vm08.stdout:1/819: chown d9/da 187957406 1 2026-03-09T19:27:35.775 INFO:tasks.workunit.client.1.vm08.stdout:8/655: link de/d25/l5a de/d1d/d2e/d5f/ldc 0 2026-03-09T19:27:35.782 INFO:tasks.workunit.client.0.vm07.stdout:2/531: creat d3/dd/d16/d29/d2d/d45/d3b/dae/fbb x:0 0 0 2026-03-09T19:27:35.783 INFO:tasks.workunit.client.1.vm08.stdout:7/734: truncate d5/d14/dae/d1c/d83/d9c/dcb/fcd 331421 0 2026-03-09T19:27:35.788 INFO:tasks.workunit.client.1.vm08.stdout:6/703: dwrite d3/d68/d7e/fbb [0,4194304] 0 2026-03-09T19:27:35.790 INFO:tasks.workunit.client.1.vm08.stdout:6/704: chown d3/f3e 7 1 2026-03-09T19:27:35.792 INFO:tasks.workunit.client.0.vm07.stdout:2/532: sync 2026-03-09T19:27:35.802 INFO:tasks.workunit.client.0.vm07.stdout:1/498: truncate d1/db/f1f 377569 0 2026-03-09T19:27:35.807 INFO:tasks.workunit.client.1.vm08.stdout:4/660: creat da/d10/d26/da0/fc6 x:0 0 0 2026-03-09T19:27:35.815 INFO:tasks.workunit.client.0.vm07.stdout:5/447: link d3/d1a/d5a/c7e d3/dd/d26/d2d/d60/c8d 0 2026-03-09T19:27:35.836 INFO:tasks.workunit.client.1.vm08.stdout:0/691: dread dd/d22/d63/d6e/f8a [0,4194304] 0 2026-03-09T19:27:35.844 INFO:tasks.workunit.client.0.vm07.stdout:2/533: mknod d3/dd/daa/cbc 0 2026-03-09T19:27:35.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:35 vm08.local ceph-mon[57794]: pgmap v172: 65 pgs: 65 
active+clean; 2.4 GiB data, 8.5 GiB used, 112 GiB / 120 GiB avail; 27 MiB/s rd, 81 MiB/s wr, 184 op/s 2026-03-09T19:27:35.848 INFO:tasks.workunit.client.0.vm07.stdout:5/448: stat d3/dd/d26/d3f/d47 0 2026-03-09T19:27:35.857 INFO:tasks.workunit.client.0.vm07.stdout:9/490: getdents d0/db 0 2026-03-09T19:27:35.858 INFO:tasks.workunit.client.1.vm08.stdout:3/737: creat d0/d6/de/d6e/d51/d7f/de3/fe8 x:0 0 0 2026-03-09T19:27:35.858 INFO:tasks.workunit.client.0.vm07.stdout:1/499: sync 2026-03-09T19:27:35.862 INFO:tasks.workunit.client.0.vm07.stdout:2/534: link d3/dd/l28 d3/dd/d16/d29/d2d/d45/d3b/d44/d96/lbd 0 2026-03-09T19:27:35.869 INFO:tasks.workunit.client.0.vm07.stdout:9/491: creat d0/db/d29/d2c/d36/d5a/fb2 x:0 0 0 2026-03-09T19:27:35.869 INFO:tasks.workunit.client.0.vm07.stdout:9/492: stat d0/d6/d3a/d7e/d7f 0 2026-03-09T19:27:35.869 INFO:tasks.workunit.client.1.vm08.stdout:8/656: sync 2026-03-09T19:27:35.873 INFO:tasks.workunit.client.1.vm08.stdout:8/657: dwrite de/f1f [0,4194304] 0 2026-03-09T19:27:35.875 INFO:tasks.workunit.client.1.vm08.stdout:8/658: dread - de/d91/fd1 zero size 2026-03-09T19:27:35.878 INFO:tasks.workunit.client.1.vm08.stdout:2/601: rename d3/d4/d23/d2c/d39/d5e/de/f7a to d3/d9/d79/d46/fcb 0 2026-03-09T19:27:35.901 INFO:tasks.workunit.client.0.vm07.stdout:2/535: dread - d3/dd/d16/d29/d3c/d4c/f9d zero size 2026-03-09T19:27:35.925 INFO:tasks.workunit.client.0.vm07.stdout:2/536: creat d3/dd/d16/d29/d3c/d5a/fbe x:0 0 0 2026-03-09T19:27:35.926 INFO:tasks.workunit.client.0.vm07.stdout:2/537: readlink d3/dd/d16/d29/d2d/d45/d3b/d53/l84 0 2026-03-09T19:27:35.927 INFO:tasks.workunit.client.0.vm07.stdout:2/538: chown d3/dd/d16/d29/d2d/d45/d3b/d44/c95 5742 1 2026-03-09T19:27:35.928 INFO:tasks.workunit.client.0.vm07.stdout:2/539: chown d3/dd/d16/d29/d3c/d4c/c6f 1950106 1 2026-03-09T19:27:35.929 INFO:tasks.workunit.client.1.vm08.stdout:6/705: mkdir d3/d34/d5c/da2/d107 0 2026-03-09T19:27:35.930 INFO:tasks.workunit.client.1.vm08.stdout:6/706: chown d3/d94/led 2674821 
1 2026-03-09T19:27:35.942 INFO:tasks.workunit.client.1.vm08.stdout:3/738: truncate d0/d6/de/d15/d96/fd9 773905 0 2026-03-09T19:27:35.969 INFO:tasks.workunit.client.0.vm07.stdout:1/500: creat d1/db/d31/fa8 x:0 0 0 2026-03-09T19:27:35.969 INFO:tasks.workunit.client.0.vm07.stdout:1/501: creat d1/d11/d37/d3f/d45/d87/fa9 x:0 0 0 2026-03-09T19:27:35.969 INFO:tasks.workunit.client.1.vm08.stdout:3/739: write d0/d6/de/d1b/d16/dd1/fe1 [408457,130416] 0 2026-03-09T19:27:35.969 INFO:tasks.workunit.client.1.vm08.stdout:4/661: dread da/d10/d16/fc1 [0,4194304] 0 2026-03-09T19:27:35.969 INFO:tasks.workunit.client.1.vm08.stdout:8/659: unlink de/d25/d87/lc4 0 2026-03-09T19:27:35.969 INFO:tasks.workunit.client.1.vm08.stdout:2/602: mknod d3/d4/d23/d2c/d39/d5e/de/d18/d1f/ccc 0 2026-03-09T19:27:35.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:35 vm07.local ceph-mon[48545]: pgmap v172: 65 pgs: 65 active+clean; 2.4 GiB data, 8.5 GiB used, 112 GiB / 120 GiB avail; 27 MiB/s rd, 81 MiB/s wr, 184 op/s 2026-03-09T19:27:35.983 INFO:tasks.workunit.client.0.vm07.stdout:1/502: dread d1/d11/d37/f2c [0,4194304] 0 2026-03-09T19:27:35.984 INFO:tasks.workunit.client.1.vm08.stdout:6/707: dread d3/d94/fb5 [0,4194304] 0 2026-03-09T19:27:35.989 INFO:tasks.workunit.client.1.vm08.stdout:3/740: mknod d0/d6/d25/ce9 0 2026-03-09T19:27:36.001 INFO:tasks.workunit.client.0.vm07.stdout:1/503: truncate d1/d11/d37/f40 156389 0 2026-03-09T19:27:36.002 INFO:tasks.workunit.client.1.vm08.stdout:8/660: creat de/d25/d31/d82/d6d/d99/da5/fdd x:0 0 0 2026-03-09T19:27:36.003 INFO:tasks.workunit.client.1.vm08.stdout:6/708: sync 2026-03-09T19:27:36.010 INFO:tasks.workunit.client.0.vm07.stdout:4/455: dwrite d3/d4f/d56/d5f/f72 [0,4194304] 0 2026-03-09T19:27:36.013 INFO:tasks.workunit.client.0.vm07.stdout:1/504: dread d1/d3/d21/f47 [0,4194304] 0 2026-03-09T19:27:36.013 INFO:tasks.workunit.client.1.vm08.stdout:8/661: mkdir de/d25/d31/d82/d6d/d99/dde 0 2026-03-09T19:27:36.014 
INFO:tasks.workunit.client.1.vm08.stdout:8/662: stat de/d1d/d2e/d5f/ldc 0 2026-03-09T19:27:36.017 INFO:tasks.workunit.client.1.vm08.stdout:6/709: creat d3/d34/da9/f108 x:0 0 0 2026-03-09T19:27:36.018 INFO:tasks.workunit.client.1.vm08.stdout:0/692: unlink dd/d22/d63/l5a 0 2026-03-09T19:27:36.020 INFO:tasks.workunit.client.1.vm08.stdout:5/647: rename d16/d8e/fa2 to d16/d1e/d8c/d99/da8/fd0 0 2026-03-09T19:27:36.021 INFO:tasks.workunit.client.1.vm08.stdout:5/648: chown d16/d1e/d3b/d61/l83 3033291 1 2026-03-09T19:27:36.023 INFO:tasks.workunit.client.1.vm08.stdout:6/710: fdatasync d3/db/d43/d69/da0/faf 0 2026-03-09T19:27:36.035 INFO:tasks.workunit.client.1.vm08.stdout:6/711: creat d3/f109 x:0 0 0 2026-03-09T19:27:36.037 INFO:tasks.workunit.client.1.vm08.stdout:0/693: sync 2026-03-09T19:27:36.037 INFO:tasks.workunit.client.1.vm08.stdout:6/712: mknod d3/d34/d5c/da2/c10a 0 2026-03-09T19:27:36.041 INFO:tasks.workunit.client.1.vm08.stdout:0/694: write dd/d22/d24/d49/d50/d78/d86/fd5 [448776,74264] 0 2026-03-09T19:27:36.041 INFO:tasks.workunit.client.1.vm08.stdout:6/713: truncate d3/db/d43/d69/f101 688039 0 2026-03-09T19:27:36.047 INFO:tasks.workunit.client.0.vm07.stdout:1/505: dread d1/d3/f4 [0,4194304] 0 2026-03-09T19:27:36.049 INFO:tasks.workunit.client.0.vm07.stdout:1/506: fdatasync d1/f6 0 2026-03-09T19:27:36.051 INFO:tasks.workunit.client.1.vm08.stdout:3/741: rename d0/d6/d93/ce4 to d0/d6/de/d1b/d16/d17/dac/dd2/dd3/cea 0 2026-03-09T19:27:36.053 INFO:tasks.workunit.client.0.vm07.stdout:1/507: fdatasync d1/db/d31/d56/f6a 0 2026-03-09T19:27:36.056 INFO:tasks.workunit.client.0.vm07.stdout:1/508: creat d1/d11/d37/d3f/d6e/d9c/faa x:0 0 0 2026-03-09T19:27:36.057 INFO:tasks.workunit.client.0.vm07.stdout:1/509: chown d1/d3/d21/f2e 157324753 1 2026-03-09T19:27:36.058 INFO:tasks.workunit.client.0.vm07.stdout:1/510: dread - d1/d3/d21/f2e zero size 2026-03-09T19:27:36.058 INFO:tasks.workunit.client.0.vm07.stdout:1/511: chown d1/f1d 3123 1 2026-03-09T19:27:36.059 
INFO:tasks.workunit.client.1.vm08.stdout:8/663: rename de/d47/d85/c90 to de/d47/cdf 0 2026-03-09T19:27:36.062 INFO:tasks.workunit.client.1.vm08.stdout:3/742: dwrite d0/d6/de/d6e/d51/d7f/de3/fe8 [0,4194304] 0 2026-03-09T19:27:36.062 INFO:tasks.workunit.client.1.vm08.stdout:8/664: chown de/d25/d33/fb6 39779 1 2026-03-09T19:27:36.063 INFO:tasks.workunit.client.1.vm08.stdout:8/665: dread - de/d25/d31/d82/fb5 zero size 2026-03-09T19:27:36.064 INFO:tasks.workunit.client.0.vm07.stdout:1/512: rmdir d1/db/d31/d56 39 2026-03-09T19:27:36.067 INFO:tasks.workunit.client.1.vm08.stdout:0/695: link dd/l90 dd/d22/d7b/ldd 0 2026-03-09T19:27:36.072 INFO:tasks.workunit.client.0.vm07.stdout:3/522: write d1/d6/d4c/f61 [358618,81962] 0 2026-03-09T19:27:36.072 INFO:tasks.workunit.client.1.vm08.stdout:8/666: truncate de/d7c/f95 389813 0 2026-03-09T19:27:36.072 INFO:tasks.workunit.client.1.vm08.stdout:9/676: dwrite d0/d1b/d97/d48/d5e/fa1 [0,4194304] 0 2026-03-09T19:27:36.073 INFO:tasks.workunit.client.1.vm08.stdout:0/696: creat dd/d7e/fde x:0 0 0 2026-03-09T19:27:36.074 INFO:tasks.workunit.client.0.vm07.stdout:3/523: chown d1/d1f/d5c 3 1 2026-03-09T19:27:36.083 INFO:tasks.workunit.client.1.vm08.stdout:8/667: symlink de/d47/dd4/le0 0 2026-03-09T19:27:36.085 INFO:tasks.workunit.client.0.vm07.stdout:3/524: dread d1/d1f/d5c/f7d [0,4194304] 0 2026-03-09T19:27:36.091 INFO:tasks.workunit.client.1.vm08.stdout:8/668: creat de/d7c/fe1 x:0 0 0 2026-03-09T19:27:36.091 INFO:tasks.workunit.client.1.vm08.stdout:9/677: creat d0/d2/d14/d98/dbb/fe1 x:0 0 0 2026-03-09T19:27:36.093 INFO:tasks.workunit.client.0.vm07.stdout:8/492: dwrite d7/d9/d10/d44/f48 [0,4194304] 0 2026-03-09T19:27:36.095 INFO:tasks.workunit.client.0.vm07.stdout:6/415: write d0/d1/db/d1d/f3e [1421810,111469] 0 2026-03-09T19:27:36.095 INFO:tasks.workunit.client.1.vm08.stdout:8/669: symlink de/d25/d31/d82/d6d/d99/le2 0 2026-03-09T19:27:36.095 INFO:tasks.workunit.client.0.vm07.stdout:7/473: dwrite d0/d4/f6f [0,4194304] 0 2026-03-09T19:27:36.099 
INFO:tasks.workunit.client.1.vm08.stdout:9/678: creat d0/d2/d80/d69/fe2 x:0 0 0 2026-03-09T19:27:36.099 INFO:tasks.workunit.client.1.vm08.stdout:8/670: rename de/d1d/c4d to de/d25/d87/ce3 0 2026-03-09T19:27:36.101 INFO:tasks.workunit.client.0.vm07.stdout:3/525: sync 2026-03-09T19:27:36.102 INFO:tasks.workunit.client.1.vm08.stdout:3/743: truncate d0/d52/d6d/d77/d88/fdd 241062 0 2026-03-09T19:27:36.104 INFO:tasks.workunit.client.1.vm08.stdout:8/671: unlink de/d1d/f97 0 2026-03-09T19:27:36.106 INFO:tasks.workunit.client.1.vm08.stdout:3/744: fdatasync d0/d6/d93/dcb/fce 0 2026-03-09T19:27:36.107 INFO:tasks.workunit.client.0.vm07.stdout:3/526: truncate d1/d74/f6e 657771 0 2026-03-09T19:27:36.107 INFO:tasks.workunit.client.0.vm07.stdout:5/449: dwrite d3/dd/d26/f7d [0,4194304] 0 2026-03-09T19:27:36.108 INFO:tasks.workunit.client.0.vm07.stdout:7/474: creat d0/d52/d54/fa4 x:0 0 0 2026-03-09T19:27:36.110 INFO:tasks.workunit.client.1.vm08.stdout:8/672: rename de/d1d/d2e/d5f/fd3 to de/d91/dc8/fe4 0 2026-03-09T19:27:36.117 INFO:tasks.workunit.client.0.vm07.stdout:3/527: truncate d1/d1f/f38 2159413 0 2026-03-09T19:27:36.117 INFO:tasks.workunit.client.1.vm08.stdout:7/735: write d5/d14/dae/d3a/d42/f65 [4757852,3806] 0 2026-03-09T19:27:36.121 INFO:tasks.workunit.client.0.vm07.stdout:5/450: dwrite d3/dd/f58 [4194304,4194304] 0 2026-03-09T19:27:36.131 INFO:tasks.workunit.client.0.vm07.stdout:6/416: getdents d0/d13/d1e 0 2026-03-09T19:27:36.131 INFO:tasks.workunit.client.0.vm07.stdout:6/417: readlink d0/d1/db/l71 0 2026-03-09T19:27:36.131 INFO:tasks.workunit.client.1.vm08.stdout:8/673: creat de/d25/d31/d82/d6d/d99/da5/db3/fe5 x:0 0 0 2026-03-09T19:27:36.131 INFO:tasks.workunit.client.1.vm08.stdout:8/674: readlink de/d1d/d2e/la2 0 2026-03-09T19:27:36.131 INFO:tasks.workunit.client.1.vm08.stdout:7/736: rmdir d5/d14/dae/d3a/d42/d85 39 2026-03-09T19:27:36.133 INFO:tasks.workunit.client.1.vm08.stdout:8/675: mkdir de/d47/de6 0 2026-03-09T19:27:36.139 
INFO:tasks.workunit.client.1.vm08.stdout:1/820: truncate d9/da/d53/d67/fc4 3600972 0 2026-03-09T19:27:36.141 INFO:tasks.workunit.client.1.vm08.stdout:8/676: creat de/d7c/fe7 x:0 0 0 2026-03-09T19:27:36.143 INFO:tasks.workunit.client.1.vm08.stdout:1/821: mknod d9/da/d2d/d4e/cfa 0 2026-03-09T19:27:36.151 INFO:tasks.workunit.client.0.vm07.stdout:6/418: dread d0/fe [0,4194304] 0 2026-03-09T19:27:36.155 INFO:tasks.workunit.client.0.vm07.stdout:6/419: dwrite d0/d1/db/d1d/f3e [0,4194304] 0 2026-03-09T19:27:36.174 INFO:tasks.workunit.client.1.vm08.stdout:4/662: dwrite da/d10/d16/d28/d46/d52/d6e/d40/f70 [0,4194304] 0 2026-03-09T19:27:36.174 INFO:tasks.workunit.client.1.vm08.stdout:8/677: rename de/d25/d31/d82/d6d/fc3 to de/d47/fe8 0 2026-03-09T19:27:36.182 INFO:tasks.workunit.client.0.vm07.stdout:1/513: link d1/d11/d37/d3f/d45/l2d d1/d11/d37/lab 0 2026-03-09T19:27:36.206 INFO:tasks.workunit.client.1.vm08.stdout:7/737: rmdir d5/d14/dae/d1c/d83 39 2026-03-09T19:27:36.207 INFO:tasks.workunit.client.1.vm08.stdout:1/822: fdatasync d9/d11/d7a/d89/d8d/da3/fde 0 2026-03-09T19:27:36.207 INFO:tasks.workunit.client.1.vm08.stdout:2/603: dwrite d3/f7 [0,4194304] 0 2026-03-09T19:27:36.207 INFO:tasks.workunit.client.1.vm08.stdout:8/678: mkdir de/d91/dc8/de9 0 2026-03-09T19:27:36.210 INFO:tasks.workunit.client.1.vm08.stdout:7/738: rename d5/d14/dae/d1c/f29 to d5/d14/dae/d1c/db5/df8/ffa 0 2026-03-09T19:27:36.210 INFO:tasks.workunit.client.1.vm08.stdout:7/739: readlink d5/l57 0 2026-03-09T19:27:36.213 INFO:tasks.workunit.client.0.vm07.stdout:3/528: getdents d1/d6/dd/d51 0 2026-03-09T19:27:36.223 INFO:tasks.workunit.client.1.vm08.stdout:2/604: creat d3/d4/d23/d2c/d39/d5e/d14/fcd x:0 0 0 2026-03-09T19:27:36.229 INFO:tasks.workunit.client.1.vm08.stdout:8/679: creat de/d1d/d69/fea x:0 0 0 2026-03-09T19:27:36.231 INFO:tasks.workunit.client.0.vm07.stdout:1/514: getdents d1/d11/d37/d5a/d6d 0 2026-03-09T19:27:36.236 INFO:tasks.workunit.client.1.vm08.stdout:8/680: read de/f10 [926555,100547] 0 
2026-03-09T19:27:36.246 INFO:tasks.workunit.client.1.vm08.stdout:4/663: mknod da/d10/d16/d28/d2f/d4f/d64/d81/cc7 0 2026-03-09T19:27:36.246 INFO:tasks.workunit.client.1.vm08.stdout:7/740: rename d5/d14/dae/d1c/d83 to d5/d14/d27/d54/dfb 0 2026-03-09T19:27:36.246 INFO:tasks.workunit.client.1.vm08.stdout:7/741: rename d5/d14 to d5/d14/d27/d54/dfb/dfc 22 2026-03-09T19:27:36.246 INFO:tasks.workunit.client.1.vm08.stdout:7/742: dread d5/d14/dae/d1c/f87 [0,4194304] 0 2026-03-09T19:27:36.254 INFO:tasks.workunit.client.1.vm08.stdout:2/605: dread d3/d9/f5d [0,4194304] 0 2026-03-09T19:27:36.257 INFO:tasks.workunit.client.1.vm08.stdout:2/606: symlink d3/dca/lce 0 2026-03-09T19:27:36.260 INFO:tasks.workunit.client.1.vm08.stdout:2/607: creat d3/d4/d23/d2c/d39/d5e/de/d18/da9/fcf x:0 0 0 2026-03-09T19:27:36.350 INFO:tasks.workunit.client.0.vm07.stdout:0/429: creat d0/d6/d13/d17/f8f x:0 0 0 2026-03-09T19:27:36.361 INFO:tasks.workunit.client.0.vm07.stdout:4/456: write d3/f7 [2167400,125978] 0 2026-03-09T19:27:36.365 INFO:tasks.workunit.client.0.vm07.stdout:4/457: chown d3/d11/d2b/f49 2167093 1 2026-03-09T19:27:36.379 INFO:tasks.workunit.client.0.vm07.stdout:0/430: dread d0/f3a [0,4194304] 0 2026-03-09T19:27:36.379 INFO:tasks.workunit.client.0.vm07.stdout:0/431: fdatasync d0/d6/d13/d17/d19/d57/f6f 0 2026-03-09T19:27:36.383 INFO:tasks.workunit.client.0.vm07.stdout:4/458: symlink d3/d11/la3 0 2026-03-09T19:27:36.384 INFO:tasks.workunit.client.0.vm07.stdout:0/432: chown d0/d6/d13/d1c/d11/c70 168291 1 2026-03-09T19:27:36.386 INFO:tasks.workunit.client.0.vm07.stdout:0/433: symlink d0/d6/d13/d33/l90 0 2026-03-09T19:27:36.388 INFO:tasks.workunit.client.0.vm07.stdout:0/434: chown d0/d6/d13/d1c/d11 78366 1 2026-03-09T19:27:36.403 INFO:tasks.workunit.client.0.vm07.stdout:0/435: dwrite d0/f3d [0,4194304] 0 2026-03-09T19:27:36.431 INFO:tasks.workunit.client.1.vm08.stdout:3/745: dread d0/d52/d6d/d77/f68 [0,4194304] 0 2026-03-09T19:27:36.436 INFO:tasks.workunit.client.1.vm08.stdout:3/746: fsync 
d0/d6/de/d1b/d16/d17/f8c 0 2026-03-09T19:27:36.437 INFO:tasks.workunit.client.1.vm08.stdout:3/747: readlink d0/l29 0 2026-03-09T19:27:36.443 INFO:tasks.workunit.client.1.vm08.stdout:9/679: dread d0/d1b/d97/d48/fb5 [0,4194304] 0 2026-03-09T19:27:36.453 INFO:tasks.workunit.client.1.vm08.stdout:9/680: dread d0/d1b/d97/d48/d5e/fa1 [0,4194304] 0 2026-03-09T19:27:36.456 INFO:tasks.workunit.client.1.vm08.stdout:9/681: creat d0/d1b/d68/d7f/fe3 x:0 0 0 2026-03-09T19:27:36.482 INFO:tasks.workunit.client.1.vm08.stdout:3/748: sync 2026-03-09T19:27:36.483 INFO:tasks.workunit.client.1.vm08.stdout:9/682: sync 2026-03-09T19:27:36.483 INFO:tasks.workunit.client.1.vm08.stdout:9/683: chown d0/d1b/d97/d48/d5d/ddf 1208 1 2026-03-09T19:27:36.484 INFO:tasks.workunit.client.1.vm08.stdout:9/684: chown d0/d1b/d68/d7f/d8c/da2/fdd 3964371 1 2026-03-09T19:27:36.491 INFO:tasks.workunit.client.1.vm08.stdout:1/823: unlink d9/d40/c45 0 2026-03-09T19:27:36.491 INFO:tasks.workunit.client.1.vm08.stdout:5/649: write d16/d1e/d8c/d99/da8/f8b [823913,48203] 0 2026-03-09T19:27:36.493 INFO:tasks.workunit.client.1.vm08.stdout:5/650: truncate d16/d1e/d30/fb4 630109 0 2026-03-09T19:27:36.496 INFO:tasks.workunit.client.1.vm08.stdout:5/651: sync 2026-03-09T19:27:36.498 INFO:tasks.workunit.client.0.vm07.stdout:2/540: rename d3/dd/d16/d29/d3c/d5a/d7a/d74/c9b to d3/dd/d16/d29/cbf 0 2026-03-09T19:27:36.500 INFO:tasks.workunit.client.1.vm08.stdout:6/714: write d3/d94/fb5 [1479410,8947] 0 2026-03-09T19:27:36.502 INFO:tasks.workunit.client.1.vm08.stdout:6/715: dread d3/db/d43/d69/f101 [0,4194304] 0 2026-03-09T19:27:36.512 INFO:tasks.workunit.client.1.vm08.stdout:0/697: dwrite dd/d31/fac [0,4194304] 0 2026-03-09T19:27:36.514 INFO:tasks.workunit.client.1.vm08.stdout:0/698: stat dd/d22/d27/d65 0 2026-03-09T19:27:36.517 INFO:tasks.workunit.client.1.vm08.stdout:0/699: chown dd/d22/d24/d49/d50/f9b 252 1 2026-03-09T19:27:36.523 INFO:tasks.workunit.client.1.vm08.stdout:0/700: dread dd/fe [0,4194304] 0 2026-03-09T19:27:36.523 
INFO:tasks.workunit.client.0.vm07.stdout:9/493: creat d0/db/d29/fb3 x:0 0 0 2026-03-09T19:27:36.525 INFO:tasks.workunit.client.0.vm07.stdout:5/451: rename d3/dd/d26/d2d/d60/f7f to d3/dd/d26/d2c/f8e 0 2026-03-09T19:27:36.526 INFO:tasks.workunit.client.0.vm07.stdout:2/541: symlink d3/dd/d16/d30/lc0 0 2026-03-09T19:27:36.526 INFO:tasks.workunit.client.0.vm07.stdout:2/542: stat d3/d11 0 2026-03-09T19:27:36.527 INFO:tasks.workunit.client.0.vm07.stdout:2/543: truncate d3/d49/faf 171071 0 2026-03-09T19:27:36.534 INFO:tasks.workunit.client.0.vm07.stdout:8/493: dwrite d7/d50/f8f [0,4194304] 0 2026-03-09T19:27:36.555 INFO:tasks.workunit.client.0.vm07.stdout:2/544: mknod d3/dd/cc1 0 2026-03-09T19:27:36.575 INFO:tasks.workunit.client.0.vm07.stdout:7/475: write d0/d4/d5/d26/d3c/f63 [419346,72528] 0 2026-03-09T19:27:36.591 INFO:tasks.workunit.client.1.vm08.stdout:0/701: fdatasync dd/d22/d27/d2e/db0/fb2 0 2026-03-09T19:27:36.606 INFO:tasks.workunit.client.0.vm07.stdout:8/494: rmdir d7/d9/d37/d45/d56 39 2026-03-09T19:27:36.609 INFO:tasks.workunit.client.1.vm08.stdout:6/716: rename d3/d55/c96 to d3/d34/d5c/da2/d107/c10b 0 2026-03-09T19:27:36.611 INFO:tasks.workunit.client.0.vm07.stdout:7/476: symlink d0/d4/la5 0 2026-03-09T19:27:36.617 INFO:tasks.workunit.client.0.vm07.stdout:3/529: write d1/d6/f60 [900364,13212] 0 2026-03-09T19:27:36.623 INFO:tasks.workunit.client.1.vm08.stdout:1/824: creat d9/da/d53/ffb x:0 0 0 2026-03-09T19:27:36.624 INFO:tasks.workunit.client.0.vm07.stdout:1/515: truncate d1/d11/d37/d3f/f4a 3729968 0 2026-03-09T19:27:36.645 INFO:tasks.workunit.client.1.vm08.stdout:8/681: write de/d1d/d2e/d5f/f57 [2437042,74109] 0 2026-03-09T19:27:36.656 INFO:tasks.workunit.client.0.vm07.stdout:7/477: truncate d0/d4/d5/d8/f37 807996 0 2026-03-09T19:27:36.657 INFO:tasks.workunit.client.0.vm07.stdout:7/478: chown d0/d4/d5/d8/d1a/c72 1 1 2026-03-09T19:27:36.662 INFO:tasks.workunit.client.0.vm07.stdout:7/479: dwrite d0/d4/d5/d26/d32/f7c [0,4194304] 0 2026-03-09T19:27:36.663 
INFO:tasks.workunit.client.1.vm08.stdout:8/682: read de/d1d/d2e/d5f/f57 [1156926,21766] 0 2026-03-09T19:27:36.671 INFO:tasks.workunit.client.1.vm08.stdout:7/743: dwrite d5/d14/d38/f3c [0,4194304] 0 2026-03-09T19:27:36.673 INFO:tasks.workunit.client.1.vm08.stdout:7/744: readlink d5/d14/d27/d54/lcc 0 2026-03-09T19:27:36.678 INFO:tasks.workunit.client.1.vm08.stdout:7/745: read d5/d14/dae/d3a/d42/f71 [857710,83941] 0 2026-03-09T19:27:36.690 INFO:tasks.workunit.client.1.vm08.stdout:4/664: truncate da/d10/d16/d28/d46/d52/d6e/d2c/f36 727445 0 2026-03-09T19:27:36.692 INFO:tasks.workunit.client.1.vm08.stdout:9/685: link d0/d2/d14/fbf d0/d2/d14/d98/dbb/fe4 0 2026-03-09T19:27:36.702 INFO:tasks.workunit.client.0.vm07.stdout:6/420: rename d0/d1/f8 to d0/fa3 0 2026-03-09T19:27:36.703 INFO:tasks.workunit.client.0.vm07.stdout:2/545: rename d3/dd/d16/d29 to d3/dd/d16/d29/d2d/d45/d3b/d44/d97/dc2 22 2026-03-09T19:27:36.711 INFO:tasks.workunit.client.0.vm07.stdout:6/421: sync 2026-03-09T19:27:36.713 INFO:tasks.workunit.client.1.vm08.stdout:2/608: dwrite d3/d4/d23/d2c/d39/d5e/d14/f78 [0,4194304] 0 2026-03-09T19:27:36.718 INFO:tasks.workunit.client.0.vm07.stdout:3/530: unlink d1/d6/d45/l9a 0 2026-03-09T19:27:36.718 INFO:tasks.workunit.client.0.vm07.stdout:3/531: truncate d1/d1f/f9c 39163 0 2026-03-09T19:27:36.719 INFO:tasks.workunit.client.0.vm07.stdout:3/532: chown d1/d6/dd/d51/f49 1311 1 2026-03-09T19:27:36.721 INFO:tasks.workunit.client.1.vm08.stdout:6/717: chown d3/d15/f64 170 1 2026-03-09T19:27:36.742 INFO:tasks.workunit.client.0.vm07.stdout:1/516: creat d1/d11/d37/d5a/d6d/fac x:0 0 0 2026-03-09T19:27:36.753 INFO:tasks.workunit.client.0.vm07.stdout:4/459: dwrite d3/d11/f79 [0,4194304] 0 2026-03-09T19:27:36.755 INFO:tasks.workunit.client.0.vm07.stdout:0/436: dwrite d0/d6/d13/d17/d19/d58/f77 [0,4194304] 0 2026-03-09T19:27:36.777 INFO:tasks.workunit.client.0.vm07.stdout:7/480: chown d0/d4/d5/c9d 2195 1 2026-03-09T19:27:36.786 INFO:tasks.workunit.client.1.vm08.stdout:1/825: dread 
d9/da/d2d/f41 [4194304,4194304] 0 2026-03-09T19:27:36.800 INFO:tasks.workunit.client.0.vm07.stdout:6/422: mkdir d0/d1/db/d24/da4 0 2026-03-09T19:27:36.811 INFO:tasks.workunit.client.1.vm08.stdout:5/652: write d16/d45/f54 [621383,123696] 0 2026-03-09T19:27:36.818 INFO:tasks.workunit.client.0.vm07.stdout:9/494: dwrite d0/d6/f4c [4194304,4194304] 0 2026-03-09T19:27:36.842 INFO:tasks.workunit.client.0.vm07.stdout:5/452: dwrite d3/d1a/d28/d36/f61 [0,4194304] 0 2026-03-09T19:27:36.843 INFO:tasks.workunit.client.1.vm08.stdout:5/653: dwrite d16/d45/daf/fc5 [0,4194304] 0 2026-03-09T19:27:36.843 INFO:tasks.workunit.client.1.vm08.stdout:3/749: dwrite d0/d52/d6d/d77/d88/fdd [0,4194304] 0 2026-03-09T19:27:36.875 INFO:tasks.workunit.client.1.vm08.stdout:8/683: symlink de/d25/d31/leb 0 2026-03-09T19:27:36.895 INFO:tasks.workunit.client.1.vm08.stdout:0/702: dwrite dd/d22/d7b/d82/fc7 [0,4194304] 0 2026-03-09T19:27:36.898 INFO:tasks.workunit.client.1.vm08.stdout:4/665: symlink da/d10/d26/d27/d32/lc8 0 2026-03-09T19:27:36.899 INFO:tasks.workunit.client.1.vm08.stdout:9/686: readlink d0/d1b/d97/d48/d5e/lab 0 2026-03-09T19:27:36.908 INFO:tasks.workunit.client.1.vm08.stdout:6/718: read d3/db/d43/d69/f101 [599357,74690] 0 2026-03-09T19:27:36.919 INFO:tasks.workunit.client.1.vm08.stdout:1/826: rmdir d9/d11/db6 39 2026-03-09T19:27:36.925 INFO:tasks.workunit.client.0.vm07.stdout:1/517: dread d1/f51 [0,4194304] 0 2026-03-09T19:27:36.931 INFO:tasks.workunit.client.0.vm07.stdout:2/546: rename d3/dd/d16/d29/d2d/d45/d3b/d53 to d3/dd/d16/d29/d2d/d45/dc3 0 2026-03-09T19:27:36.937 INFO:tasks.workunit.client.0.vm07.stdout:6/423: symlink d0/d1/db/d52/d94/d81/la5 0 2026-03-09T19:27:36.937 INFO:tasks.workunit.client.0.vm07.stdout:1/518: sync 2026-03-09T19:27:36.937 INFO:tasks.workunit.client.0.vm07.stdout:5/453: getdents d3/dd/d26/d2d/d79 0 2026-03-09T19:27:36.940 INFO:tasks.workunit.client.0.vm07.stdout:9/495: truncate d0/d6/d3a/f89 1058634 0 2026-03-09T19:27:36.940 
INFO:tasks.workunit.client.0.vm07.stdout:9/496: fsync d0/d6/d73/fa6 0 2026-03-09T19:27:36.942 INFO:tasks.workunit.client.1.vm08.stdout:8/684: creat de/d1d/d2e/d5f/fec x:0 0 0 2026-03-09T19:27:36.942 INFO:tasks.workunit.client.1.vm08.stdout:9/687: sync 2026-03-09T19:27:36.943 INFO:tasks.workunit.client.1.vm08.stdout:8/685: dread - de/d25/d31/d82/d6d/d99/da5/db3/fe5 zero size 2026-03-09T19:27:36.945 INFO:tasks.workunit.client.0.vm07.stdout:0/437: link d0/d6/d13/d1c/d11/l28 d0/d6/d13/d33/l91 0 2026-03-09T19:27:36.948 INFO:tasks.workunit.client.0.vm07.stdout:1/519: mkdir d1/d11/d37/d3f/d7e/dad 0 2026-03-09T19:27:36.951 INFO:tasks.workunit.client.0.vm07.stdout:5/454: truncate d3/d1a/d28/d48/f69 4387042 0 2026-03-09T19:27:36.952 INFO:tasks.workunit.client.0.vm07.stdout:6/424: truncate d0/d13/d1e/d95/d31/f3c 8668045 0 2026-03-09T19:27:36.953 INFO:tasks.workunit.client.1.vm08.stdout:4/666: read - da/d10/d16/d28/d2f/d4f/d64/d81/fb2 zero size 2026-03-09T19:27:36.962 INFO:tasks.workunit.client.0.vm07.stdout:9/497: rename d0/d6/d3a/ca2 to d0/db/d29/d32/d5c/d80/cb4 0 2026-03-09T19:27:36.965 INFO:tasks.workunit.client.0.vm07.stdout:0/438: fsync d0/d6/f79 0 2026-03-09T19:27:36.965 INFO:tasks.workunit.client.1.vm08.stdout:6/719: symlink d3/d34/d3b/df5/l10c 0 2026-03-09T19:27:36.970 INFO:tasks.workunit.client.1.vm08.stdout:9/688: rename d0/d1b/d68/d7f/d8c to d0/d2/d80/de5 0 2026-03-09T19:27:36.974 INFO:tasks.workunit.client.0.vm07.stdout:6/425: rmdir d0/d1/db/d52/d94/d87 39 2026-03-09T19:27:36.976 INFO:tasks.workunit.client.0.vm07.stdout:1/520: dwrite d1/db/d31/f64 [0,4194304] 0 2026-03-09T19:27:36.985 INFO:tasks.workunit.client.0.vm07.stdout:0/439: mkdir d0/d6/d13/d1c/d50/d92 0 2026-03-09T19:27:36.987 INFO:tasks.workunit.client.0.vm07.stdout:0/440: chown d0/d6/d13/d1c/d50 1 1 2026-03-09T19:27:36.994 INFO:tasks.workunit.client.0.vm07.stdout:8/495: write d7/d9/d10/f1b [2573416,103964] 0 2026-03-09T19:27:37.003 INFO:tasks.workunit.client.0.vm07.stdout:6/426: creat 
d0/d13/d1e/d95/d31/fa6 x:0 0 0 2026-03-09T19:27:37.005 INFO:tasks.workunit.client.1.vm08.stdout:1/827: symlink d9/da/d53/lfc 0 2026-03-09T19:27:37.013 INFO:tasks.workunit.client.1.vm08.stdout:3/750: creat d0/d6/de/feb x:0 0 0 2026-03-09T19:27:37.021 INFO:tasks.workunit.client.0.vm07.stdout:1/521: mkdir d1/d3e/dae 0 2026-03-09T19:27:37.025 INFO:tasks.workunit.client.1.vm08.stdout:9/689: mkdir d0/d1b/d68/d7f/de6 0 2026-03-09T19:27:37.026 INFO:tasks.workunit.client.0.vm07.stdout:3/533: write d1/d74/f77 [292183,80554] 0 2026-03-09T19:27:37.028 INFO:tasks.workunit.client.0.vm07.stdout:3/534: read d1/d6/fb [5799053,120858] 0 2026-03-09T19:27:37.031 INFO:tasks.workunit.client.1.vm08.stdout:2/609: write d3/d9/f1e [3359743,104900] 0 2026-03-09T19:27:37.037 INFO:tasks.workunit.client.0.vm07.stdout:4/460: dwrite d3/d11/f7d [0,4194304] 0 2026-03-09T19:27:37.037 INFO:tasks.workunit.client.0.vm07.stdout:7/481: dwrite d0/d4/d5/d8/f35 [0,4194304] 0 2026-03-09T19:27:37.037 INFO:tasks.workunit.client.1.vm08.stdout:8/686: mknod de/d91/dc8/de9/ced 0 2026-03-09T19:27:37.037 INFO:tasks.workunit.client.1.vm08.stdout:2/610: dwrite d3/d4/d3e/d9d/fc5 [0,4194304] 0 2026-03-09T19:27:37.039 INFO:tasks.workunit.client.1.vm08.stdout:9/690: sync 2026-03-09T19:27:37.039 INFO:tasks.workunit.client.1.vm08.stdout:9/691: chown d0/d1b/d68 267442255 1 2026-03-09T19:27:37.044 INFO:tasks.workunit.client.1.vm08.stdout:7/746: getdents d5/d14/d27/d54/dfb/d9c/dcb 0 2026-03-09T19:27:37.045 INFO:tasks.workunit.client.0.vm07.stdout:2/547: getdents d3/dd/d16/d29/d3c/d5a/d7a 0 2026-03-09T19:27:37.053 INFO:tasks.workunit.client.1.vm08.stdout:5/654: dwrite d16/d8e/fb2 [0,4194304] 0 2026-03-09T19:27:37.060 INFO:tasks.workunit.client.0.vm07.stdout:3/535: dread d1/d1f/d5c/f75 [0,4194304] 0 2026-03-09T19:27:37.060 INFO:tasks.workunit.client.0.vm07.stdout:3/536: chown d1/d1f/d16/d28/d7c/c8f 17 1 2026-03-09T19:27:37.061 INFO:tasks.workunit.client.0.vm07.stdout:3/537: dread d1/d74/f52 [0,4194304] 0 2026-03-09T19:27:37.068 
INFO:tasks.workunit.client.0.vm07.stdout:6/427: rename d0/d2d/f5b to d0/d1/db/d17/d4c/d7b/d7d/fa7 0 2026-03-09T19:27:37.075 INFO:tasks.workunit.client.1.vm08.stdout:4/667: unlink da/d10/d26/d3a/d69/d75/f7a 0 2026-03-09T19:27:37.084 INFO:tasks.workunit.client.1.vm08.stdout:1/828: symlink d9/da/d95/dcd/lfd 0 2026-03-09T19:27:37.090 INFO:tasks.workunit.client.1.vm08.stdout:3/751: mkdir d0/d6/de/d15/dec 0 2026-03-09T19:27:37.098 INFO:tasks.workunit.client.0.vm07.stdout:4/461: truncate d3/d4f/f7c 5063562 0 2026-03-09T19:27:37.098 INFO:tasks.workunit.client.0.vm07.stdout:5/455: write d3/f25 [2134889,106363] 0 2026-03-09T19:27:37.099 INFO:tasks.workunit.client.1.vm08.stdout:3/752: chown d0/d6/de/d1b/d16/f7b 101424679 1 2026-03-09T19:27:37.099 INFO:tasks.workunit.client.1.vm08.stdout:4/668: sync 2026-03-09T19:27:37.104 INFO:tasks.workunit.client.1.vm08.stdout:3/753: dread d0/d6/de/d15/d96/fa0 [0,4194304] 0 2026-03-09T19:27:37.110 INFO:tasks.workunit.client.1.vm08.stdout:3/754: write d0/d52/d6d/d77/d88/fe7 [463622,28235] 0 2026-03-09T19:27:37.110 INFO:tasks.workunit.client.1.vm08.stdout:6/720: dwrite d3/d94/def/dc4/fe5 [0,4194304] 0 2026-03-09T19:27:37.128 INFO:tasks.workunit.client.0.vm07.stdout:7/482: dread d0/d4/d5/d26/d3c/d58/f70 [0,4194304] 0 2026-03-09T19:27:37.128 INFO:tasks.workunit.client.1.vm08.stdout:2/611: mknod d3/d9/d26/cd0 0 2026-03-09T19:27:37.131 INFO:tasks.workunit.client.0.vm07.stdout:9/498: write d0/db/d29/d2c/d36/f71 [330150,129246] 0 2026-03-09T19:27:37.132 INFO:tasks.workunit.client.1.vm08.stdout:9/692: fdatasync d0/d2/f2a 0 2026-03-09T19:27:37.135 INFO:tasks.workunit.client.1.vm08.stdout:7/747: fdatasync d5/d14/d2b/d4b/fd6 0 2026-03-09T19:27:37.143 INFO:tasks.workunit.client.1.vm08.stdout:0/703: rmdir dd/d6a 0 2026-03-09T19:27:37.145 INFO:tasks.workunit.client.1.vm08.stdout:0/704: dread dd/d22/f28 [4194304,4194304] 0 2026-03-09T19:27:37.146 INFO:tasks.workunit.client.1.vm08.stdout:0/705: chown dd/d22/d63/d6e/d72 10 1 2026-03-09T19:27:37.146 
INFO:tasks.workunit.client.1.vm08.stdout:0/706: write dd/d22/f28 [5589358,420] 0 2026-03-09T19:27:37.154 INFO:tasks.workunit.client.1.vm08.stdout:1/829: mknod d9/da/d2d/d62/cfe 0 2026-03-09T19:27:37.174 INFO:tasks.workunit.client.0.vm07.stdout:1/522: rename d1/d11/d37/d5d/f59 to d1/d11/d37/d3f/d45/d87/faf 0 2026-03-09T19:27:37.176 INFO:tasks.workunit.client.0.vm07.stdout:6/428: unlink d0/d1/db/d1d/d77/f9a 0 2026-03-09T19:27:37.176 INFO:tasks.workunit.client.1.vm08.stdout:8/687: truncate de/d1d/f27 717015 0 2026-03-09T19:27:37.176 INFO:tasks.workunit.client.0.vm07.stdout:6/429: readlink d0/d1/db/d24/l25 0 2026-03-09T19:27:37.177 INFO:tasks.workunit.client.0.vm07.stdout:6/430: write d0/d1/db/d52/fa1 [69102,23311] 0 2026-03-09T19:27:37.185 INFO:tasks.workunit.client.1.vm08.stdout:4/669: mkdir da/d10/d26/d27/da6/dc9 0 2026-03-09T19:27:37.187 INFO:tasks.workunit.client.1.vm08.stdout:4/670: chown da/d10/d26/d27/d32/f45 446534 1 2026-03-09T19:27:37.188 INFO:tasks.workunit.client.0.vm07.stdout:0/441: creat d0/f93 x:0 0 0 2026-03-09T19:27:37.189 INFO:tasks.workunit.client.1.vm08.stdout:3/755: symlink d0/d6/de/d6e/d51/led 0 2026-03-09T19:27:37.196 INFO:tasks.workunit.client.0.vm07.stdout:2/548: dwrite f0 [0,4194304] 0 2026-03-09T19:27:37.198 INFO:tasks.workunit.client.0.vm07.stdout:2/549: stat d3/dd/d16/d29/d2d/d45/d85/fa5 0 2026-03-09T19:27:37.201 INFO:tasks.workunit.client.0.vm07.stdout:4/462: dwrite d3/d11/d2b/f49 [0,4194304] 0 2026-03-09T19:27:37.205 INFO:tasks.workunit.client.0.vm07.stdout:5/456: unlink d3/dd/l2a 0 2026-03-09T19:27:37.206 INFO:tasks.workunit.client.1.vm08.stdout:6/721: mkdir d3/d34/d5c/de8/d10d 0 2026-03-09T19:27:37.228 INFO:tasks.workunit.client.0.vm07.stdout:7/483: creat d0/d4/d5/d26/d32/fa6 x:0 0 0 2026-03-09T19:27:37.244 INFO:tasks.workunit.client.0.vm07.stdout:9/499: mknod d0/d17/cb5 0 2026-03-09T19:27:37.246 INFO:tasks.workunit.client.1.vm08.stdout:9/693: fsync d0/d1b/d97/d48/d5d/ddf/f7d 0 2026-03-09T19:27:37.258 
INFO:tasks.workunit.client.0.vm07.stdout:3/538: mknod d1/d6/ca6 0 2026-03-09T19:27:37.259 INFO:tasks.workunit.client.1.vm08.stdout:9/694: write d0/d1b/d97/d48/d5e/f6e [2229600,28533] 0 2026-03-09T19:27:37.259 INFO:tasks.workunit.client.1.vm08.stdout:7/748: read - d5/d14/d27/d78/dc7/fcf zero size 2026-03-09T19:27:37.261 INFO:tasks.workunit.client.1.vm08.stdout:5/655: creat d16/d1e/d8c/d99/dcc/fd1 x:0 0 0 2026-03-09T19:27:37.261 INFO:tasks.workunit.client.1.vm08.stdout:9/695: write d0/d2/d80/fde [941436,19678] 0 2026-03-09T19:27:37.263 INFO:tasks.workunit.client.1.vm08.stdout:5/656: write d16/d1e/d8c/d99/da8/fbc [3774187,41853] 0 2026-03-09T19:27:37.272 INFO:tasks.workunit.client.0.vm07.stdout:1/523: fsync d1/d3/f23 0 2026-03-09T19:27:37.272 INFO:tasks.workunit.client.0.vm07.stdout:1/524: chown d1/f76 47 1 2026-03-09T19:27:37.277 INFO:tasks.workunit.client.1.vm08.stdout:0/707: mkdir dd/d22/d27/d65/ddf 0 2026-03-09T19:27:37.285 INFO:tasks.workunit.client.0.vm07.stdout:7/484: dread d0/d52/d54/d55/f67 [0,4194304] 0 2026-03-09T19:27:37.294 INFO:tasks.workunit.client.0.vm07.stdout:6/431: mkdir d0/d1/d28/da8 0 2026-03-09T19:27:37.295 INFO:tasks.workunit.client.0.vm07.stdout:6/432: chown d0/d13/d1e/d95/d31/d9e 50892967 1 2026-03-09T19:27:37.296 INFO:tasks.workunit.client.0.vm07.stdout:6/433: chown d0/d13/c33 41202177 1 2026-03-09T19:27:37.296 INFO:tasks.workunit.client.1.vm08.stdout:1/830: dread d9/da/d53/d67/fc4 [0,4194304] 0 2026-03-09T19:27:37.336 INFO:tasks.workunit.client.1.vm08.stdout:8/688: rename de/d25/d31/d82/fb5 to de/d1d/d21/d73/fee 0 2026-03-09T19:27:37.344 INFO:tasks.workunit.client.1.vm08.stdout:3/756: creat d0/d8/d24/fee x:0 0 0 2026-03-09T19:27:37.353 INFO:tasks.workunit.client.0.vm07.stdout:4/463: mknod d3/d11/d29/d34/d50/ca4 0 2026-03-09T19:27:37.355 INFO:tasks.workunit.client.1.vm08.stdout:6/722: truncate d3/d15/fcb 985060 0 2026-03-09T19:27:37.357 INFO:tasks.workunit.client.0.vm07.stdout:2/550: dread d3/f5 [0,4194304] 0 2026-03-09T19:27:37.358 
INFO:tasks.workunit.client.0.vm07.stdout:2/551: chown d3/dd/d16/d29/d3c/d5a/d7a/d74 1061 1 2026-03-09T19:27:37.362 INFO:tasks.workunit.client.0.vm07.stdout:8/496: getdents d7/d9/d10/d44 0 2026-03-09T19:27:37.362 INFO:tasks.workunit.client.0.vm07.stdout:8/497: readlink d7/d9/l8e 0 2026-03-09T19:27:37.364 INFO:tasks.workunit.client.0.vm07.stdout:3/539: fdatasync d1/d6/dd/d51/f6b 0 2026-03-09T19:27:37.376 INFO:tasks.workunit.client.1.vm08.stdout:7/749: fsync d5/d14/d27/d54/dfb/d9c/f9d 0 2026-03-09T19:27:37.376 INFO:tasks.workunit.client.1.vm08.stdout:6/723: dread d3/f3e [0,4194304] 0 2026-03-09T19:27:37.377 INFO:tasks.workunit.client.1.vm08.stdout:7/750: chown d5/d14/d27/d78/dc7 234 1 2026-03-09T19:27:37.378 INFO:tasks.workunit.client.1.vm08.stdout:6/724: write d3/d94/def/dc4/f104 [5214643,101042] 0 2026-03-09T19:27:37.387 INFO:tasks.workunit.client.0.vm07.stdout:1/525: write d1/d11/d37/d3f/d45/d87/faf [26694,107986] 0 2026-03-09T19:27:37.389 INFO:tasks.workunit.client.0.vm07.stdout:5/457: dread d3/f4d [0,4194304] 0 2026-03-09T19:27:37.396 INFO:tasks.workunit.client.0.vm07.stdout:1/526: sync 2026-03-09T19:27:37.400 INFO:tasks.workunit.client.0.vm07.stdout:1/527: dwrite d1/db/d31/fa8 [0,4194304] 0 2026-03-09T19:27:37.402 INFO:tasks.workunit.client.1.vm08.stdout:9/696: rmdir d0/d2/d80 39 2026-03-09T19:27:37.409 INFO:tasks.workunit.client.1.vm08.stdout:9/697: chown d0/d1b/d97/d48/d6f/la5 14745531 1 2026-03-09T19:27:37.413 INFO:tasks.workunit.client.0.vm07.stdout:7/485: truncate d0/f1 1597100 0 2026-03-09T19:27:37.419 INFO:tasks.workunit.client.1.vm08.stdout:5/657: truncate d16/d1e/f5c 4203921 0 2026-03-09T19:27:37.424 INFO:tasks.workunit.client.0.vm07.stdout:6/434: dread d0/d1/db/f4b [0,4194304] 0 2026-03-09T19:27:37.424 INFO:tasks.workunit.client.1.vm08.stdout:5/658: stat d16/d1e/d6e/lc4 0 2026-03-09T19:27:37.425 INFO:tasks.workunit.client.1.vm08.stdout:0/708: creat dd/d22/d27/d2e/fe0 x:0 0 0 2026-03-09T19:27:37.452 INFO:tasks.workunit.client.1.vm08.stdout:3/757: read 
d0/f7a [3877434,76159] 0 2026-03-09T19:27:37.455 INFO:tasks.workunit.client.1.vm08.stdout:8/689: dread de/d1d/d69/f9a [0,4194304] 0 2026-03-09T19:27:37.458 INFO:tasks.workunit.client.1.vm08.stdout:2/612: creat d3/d4/d23/fd1 x:0 0 0 2026-03-09T19:27:37.465 INFO:tasks.workunit.client.0.vm07.stdout:0/442: creat d0/d6/d13/d1c/d50/d92/f94 x:0 0 0 2026-03-09T19:27:37.468 INFO:tasks.workunit.client.1.vm08.stdout:3/758: dread d0/d52/d6d/d77/f68 [0,4194304] 0 2026-03-09T19:27:37.470 INFO:tasks.workunit.client.0.vm07.stdout:4/464: creat d3/d11/d29/d34/fa5 x:0 0 0 2026-03-09T19:27:37.472 INFO:tasks.workunit.client.0.vm07.stdout:2/552: truncate d3/dd/f1e 9710 0 2026-03-09T19:27:37.494 INFO:tasks.workunit.client.1.vm08.stdout:7/751: dread d5/d14/dae/f6b [0,4194304] 0 2026-03-09T19:27:37.498 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:37 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:27:37.498 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:37 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:27:37.498 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:37 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:27:37.514 INFO:tasks.workunit.client.1.vm08.stdout:9/698: read - d0/d1b/d97/d48/d5d/fd1 zero size 2026-03-09T19:27:37.515 INFO:tasks.workunit.client.0.vm07.stdout:5/458: fdatasync d3/d1a/d28/d6c/f7a 0 2026-03-09T19:27:37.516 INFO:tasks.workunit.client.1.vm08.stdout:9/699: write d0/d2/d8/f61 [8497463,35680] 0 2026-03-09T19:27:37.521 INFO:tasks.workunit.client.1.vm08.stdout:5/659: mknod d16/d1e/db3/cd2 0 2026-03-09T19:27:37.521 INFO:tasks.workunit.client.1.vm08.stdout:0/709: truncate dd/d22/f41 1011071 0 2026-03-09T19:27:37.527 INFO:tasks.workunit.client.0.vm07.stdout:7/486: unlink d0/d4/l4b 0 
2026-03-09T19:27:37.527 INFO:tasks.workunit.client.0.vm07.stdout:1/528: creat d1/d11/d37/d5a/d6d/fb0 x:0 0 0 2026-03-09T19:27:37.527 INFO:tasks.workunit.client.0.vm07.stdout:1/529: chown d1/d11/d37/f2c 0 1 2026-03-09T19:27:37.534 INFO:tasks.workunit.client.0.vm07.stdout:0/443: mknod d0/d6/d13/d1c/d50/d92/c95 0 2026-03-09T19:27:37.538 INFO:tasks.workunit.client.0.vm07.stdout:4/465: dwrite d3/d11/d51/f8e [0,4194304] 0 2026-03-09T19:27:37.543 INFO:tasks.workunit.client.0.vm07.stdout:2/553: fsync d3/d11/f39 0 2026-03-09T19:27:37.599 INFO:tasks.workunit.client.1.vm08.stdout:0/710: unlink dd/d22/d63/d93/la9 0 2026-03-09T19:27:37.607 INFO:tasks.workunit.client.1.vm08.stdout:5/660: truncate d16/d1e/f2e 3416239 0 2026-03-09T19:27:37.609 INFO:tasks.workunit.client.0.vm07.stdout:5/459: dwrite d3/dd/d26/d2d/f54 [0,4194304] 0 2026-03-09T19:27:37.613 INFO:tasks.workunit.client.1.vm08.stdout:4/671: getdents da/d10/d16/d28/d46 0 2026-03-09T19:27:37.615 INFO:tasks.workunit.client.0.vm07.stdout:5/460: truncate d3/d1a/d28/d40/f46 5201188 0 2026-03-09T19:27:37.619 INFO:tasks.workunit.client.1.vm08.stdout:4/672: chown da/d10/d26/d50 654 1 2026-03-09T19:27:37.636 INFO:tasks.workunit.client.0.vm07.stdout:9/500: getdents d0 0 2026-03-09T19:27:37.637 INFO:tasks.workunit.client.1.vm08.stdout:1/831: truncate d9/d11/db6/ff6 1453395 0 2026-03-09T19:27:37.640 INFO:tasks.workunit.client.1.vm08.stdout:6/725: creat d3/d34/f10e x:0 0 0 2026-03-09T19:27:37.641 INFO:tasks.workunit.client.1.vm08.stdout:7/752: creat d5/d14/d27/d54/dfb/de9/ffd x:0 0 0 2026-03-09T19:27:37.641 INFO:tasks.workunit.client.1.vm08.stdout:7/753: chown d5/d14/d2b/l70 7278 1 2026-03-09T19:27:37.648 INFO:tasks.workunit.client.0.vm07.stdout:8/498: rename d7/f40 to d7/d9/d10/fb9 0 2026-03-09T19:27:37.651 INFO:tasks.workunit.client.1.vm08.stdout:9/700: mknod d0/d1b/d97/d48/d5d/ce7 0 2026-03-09T19:27:37.652 INFO:tasks.workunit.client.1.vm08.stdout:9/701: chown d0/d2/d14/d98/c71 1838943818 1 2026-03-09T19:27:37.677 
INFO:tasks.workunit.client.0.vm07.stdout:1/530: dwrite d1/d11/d37/d3f/d45/f16 [0,4194304] 0 2026-03-09T19:27:37.686 INFO:tasks.workunit.client.1.vm08.stdout:5/661: fdatasync d16/d1e/d3b/d61/f7a 0 2026-03-09T19:27:37.687 INFO:tasks.workunit.client.1.vm08.stdout:5/662: chown d16/d1e/d6e/c80 3403 1 2026-03-09T19:27:37.707 INFO:tasks.workunit.client.0.vm07.stdout:6/435: dwrite d0/d1/d28/d76/f97 [0,4194304] 0 2026-03-09T19:27:37.710 INFO:tasks.workunit.client.1.vm08.stdout:4/673: mkdir da/d10/d16/d28/d46/d52/d6e/d73/dca 0 2026-03-09T19:27:37.718 INFO:tasks.workunit.client.0.vm07.stdout:5/461: fdatasync d3/d1a/f86 0 2026-03-09T19:27:37.724 INFO:tasks.workunit.client.0.vm07.stdout:0/444: dwrite d0/d6/d13/f6c [4194304,4194304] 0 2026-03-09T19:27:37.734 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:37 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:27:37.735 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:37 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:27:37.735 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:37 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:27:37.735 INFO:tasks.workunit.client.0.vm07.stdout:5/462: dwrite d3/d1a/d5d/f5f [0,4194304] 0 2026-03-09T19:27:37.754 INFO:tasks.workunit.client.0.vm07.stdout:8/499: fdatasync d7/f2e 0 2026-03-09T19:27:37.755 INFO:tasks.workunit.client.0.vm07.stdout:8/500: write d7/d9/d57/fb2 [79444,70914] 0 2026-03-09T19:27:37.756 INFO:tasks.workunit.client.0.vm07.stdout:4/466: rename d3/d11/d16/c59 to d3/d11/d16/d2f/ca6 0 2026-03-09T19:27:37.760 INFO:tasks.workunit.client.0.vm07.stdout:3/540: getdents d1/d6/d45/d54 0 2026-03-09T19:27:37.763 INFO:tasks.workunit.client.0.vm07.stdout:3/541: dread d1/d1f/d5c/f7d [0,4194304] 0 2026-03-09T19:27:37.772 
INFO:tasks.workunit.client.0.vm07.stdout:2/554: unlink d3/dd/f1e 0 2026-03-09T19:27:37.772 INFO:tasks.workunit.client.0.vm07.stdout:1/531: mknod d1/d11/d37/d3f/d45/d87/cb1 0 2026-03-09T19:27:37.772 INFO:tasks.workunit.client.0.vm07.stdout:6/436: mkdir d0/d1/d28/da9 0 2026-03-09T19:27:37.785 INFO:tasks.workunit.client.0.vm07.stdout:7/487: creat d0/d52/d54/fa7 x:0 0 0 2026-03-09T19:27:37.785 INFO:tasks.workunit.client.0.vm07.stdout:7/488: dread - d0/d4/d5/d8/fa3 zero size 2026-03-09T19:27:37.789 INFO:tasks.workunit.client.1.vm08.stdout:2/613: creat d3/d9/fd2 x:0 0 0 2026-03-09T19:27:37.804 INFO:tasks.workunit.client.1.vm08.stdout:8/690: symlink de/d25/d31/d82/d6d/d99/da5/db3/lef 0 2026-03-09T19:27:37.808 INFO:tasks.workunit.client.1.vm08.stdout:8/691: dwrite de/d1d/d2e/d5f/fec [0,4194304] 0 2026-03-09T19:27:37.823 INFO:tasks.workunit.client.0.vm07.stdout:0/445: mknod d0/d6/d13/d1c/d50/c96 0 2026-03-09T19:27:37.828 INFO:tasks.workunit.client.1.vm08.stdout:1/832: dread d9/da/d12/d91/dc5/fd7 [0,4194304] 0 2026-03-09T19:27:37.834 INFO:tasks.workunit.client.1.vm08.stdout:7/754: creat d5/d14/d2b/d4b/ffe x:0 0 0 2026-03-09T19:27:37.843 INFO:tasks.workunit.client.1.vm08.stdout:7/755: dread d5/d14/d27/d54/dfb/d9c/dcb/fda [0,4194304] 0 2026-03-09T19:27:37.844 INFO:tasks.workunit.client.1.vm08.stdout:7/756: chown d5/d14/d27/d54/lcc 800163463 1 2026-03-09T19:27:37.859 INFO:tasks.workunit.client.0.vm07.stdout:4/467: rmdir d3/d11/d16/d2f/d22/d86 39 2026-03-09T19:27:37.869 INFO:tasks.workunit.client.1.vm08.stdout:5/663: creat d16/d1e/d9f/fd3 x:0 0 0 2026-03-09T19:27:37.869 INFO:tasks.workunit.client.1.vm08.stdout:5/664: chown d16/d1e/d6e/l77 1474982 1 2026-03-09T19:27:37.885 INFO:tasks.workunit.client.1.vm08.stdout:0/711: dwrite dd/f1e [0,4194304] 0 2026-03-09T19:27:37.893 INFO:tasks.workunit.client.0.vm07.stdout:1/532: dread d1/f38 [0,4194304] 0 2026-03-09T19:27:37.897 INFO:tasks.workunit.client.0.vm07.stdout:2/555: write d3/dd/d16/d29/d3c/d5a/d7a/f6e [50858,43894] 0 
2026-03-09T19:27:37.897 INFO:tasks.workunit.client.0.vm07.stdout:0/446: readlink d0/d6/d13/d17/l49 0 2026-03-09T19:27:37.897 INFO:tasks.workunit.client.1.vm08.stdout:4/674: write da/d10/d16/d28/fa3 [575288,8696] 0 2026-03-09T19:27:37.905 INFO:tasks.workunit.client.0.vm07.stdout:5/463: mkdir d3/d1a/d28/d6c/d72/d8f 0 2026-03-09T19:27:37.905 INFO:tasks.workunit.client.1.vm08.stdout:4/675: read f2 [779505,126152] 0 2026-03-09T19:27:37.907 INFO:tasks.workunit.client.1.vm08.stdout:3/759: link d0/d52/d7c/d7e/cd6 d0/d6/de/d6e/d51/d92/cef 0 2026-03-09T19:27:37.914 INFO:tasks.workunit.client.0.vm07.stdout:0/447: dread d0/d6/d13/d17/d19/f1f [0,4194304] 0 2026-03-09T19:27:37.914 INFO:tasks.workunit.client.0.vm07.stdout:9/501: creat d0/db/d29/d2c/fb6 x:0 0 0 2026-03-09T19:27:37.914 INFO:tasks.workunit.client.0.vm07.stdout:0/448: write d0/d6/d13/f6c [2587224,102835] 0 2026-03-09T19:27:37.915 INFO:tasks.workunit.client.0.vm07.stdout:4/468: chown d3/d11/d29/f42 10702797 1 2026-03-09T19:27:37.915 INFO:tasks.workunit.client.0.vm07.stdout:4/469: write d3/f8d [2707227,124570] 0 2026-03-09T19:27:37.922 INFO:tasks.workunit.client.1.vm08.stdout:7/757: dread d5/d14/dae/d1c/d73/fac [0,4194304] 0 2026-03-09T19:27:37.930 INFO:tasks.workunit.client.1.vm08.stdout:9/702: rename d0/d2/d8 to d0/d2/d80/de5/da2/da8/de8 0 2026-03-09T19:27:37.931 INFO:tasks.workunit.client.1.vm08.stdout:9/703: truncate d0/d1b/d97/d48/d6f/fdb 315079 0 2026-03-09T19:27:37.933 INFO:tasks.workunit.client.1.vm08.stdout:9/704: write d0/d2/d80/de5/da2/da8/de8/dcd/fda [142040,96398] 0 2026-03-09T19:27:37.934 INFO:tasks.workunit.client.1.vm08.stdout:9/705: truncate d0/d1b/d68/d7f/fe3 857291 0 2026-03-09T19:27:37.938 INFO:tasks.workunit.client.1.vm08.stdout:8/692: dwrite de/d1d/d69/f84 [0,4194304] 0 2026-03-09T19:27:37.943 INFO:tasks.workunit.client.0.vm07.stdout:1/533: truncate d1/f38 1621769 0 2026-03-09T19:27:37.964 INFO:tasks.workunit.client.0.vm07.stdout:5/464: dread d3/f19 [0,4194304] 0 2026-03-09T19:27:37.968 
INFO:tasks.workunit.client.0.vm07.stdout:9/502: dread d0/d17/f1f [0,4194304] 0 2026-03-09T19:27:37.974 INFO:tasks.workunit.client.0.vm07.stdout:2/556: write d3/fa [1290921,70941] 0 2026-03-09T19:27:37.978 INFO:tasks.workunit.client.1.vm08.stdout:0/712: write dd/d22/d24/f71 [544933,9589] 0 2026-03-09T19:27:37.982 INFO:tasks.workunit.client.1.vm08.stdout:0/713: dwrite dd/d22/d27/d4f/fd7 [0,4194304] 0 2026-03-09T19:27:37.984 INFO:tasks.workunit.client.1.vm08.stdout:0/714: chown dd/d22/d27/fc8 1646250178 1 2026-03-09T19:27:38.002 INFO:tasks.workunit.client.0.vm07.stdout:4/470: symlink d3/d11/d29/la7 0 2026-03-09T19:27:38.006 INFO:tasks.workunit.client.1.vm08.stdout:3/760: stat d0/d52/d6d/d77/d88/c97 0 2026-03-09T19:27:38.008 INFO:tasks.workunit.client.0.vm07.stdout:3/542: rename d1/l46 to d1/d1f/la7 0 2026-03-09T19:27:38.011 INFO:tasks.workunit.client.0.vm07.stdout:6/437: link d0/d1/db/d24/c36 d0/d1/db/d1d/caa 0 2026-03-09T19:27:38.025 INFO:tasks.workunit.client.0.vm07.stdout:1/534: dread d1/db/d31/d56/f6a [0,4194304] 0 2026-03-09T19:27:38.040 INFO:tasks.workunit.client.1.vm08.stdout:9/706: fsync d0/d2/f1d 0 2026-03-09T19:27:38.064 INFO:tasks.workunit.client.0.vm07.stdout:9/503: rmdir d0/db/d29/d32 39 2026-03-09T19:27:38.066 INFO:tasks.workunit.client.1.vm08.stdout:8/693: symlink de/d25/d31/d82/d6d/d99/da0/lf0 0 2026-03-09T19:27:38.069 INFO:tasks.workunit.client.0.vm07.stdout:4/471: symlink d3/d4f/la8 0 2026-03-09T19:27:38.069 INFO:tasks.workunit.client.0.vm07.stdout:4/472: readlink d3/d4f/d56/d5f/l81 0 2026-03-09T19:27:38.071 INFO:tasks.workunit.client.1.vm08.stdout:0/715: rmdir dd/d22/d7b/d82 39 2026-03-09T19:27:38.071 INFO:tasks.workunit.client.1.vm08.stdout:2/614: rmdir d3/d4/dab 0 2026-03-09T19:27:38.073 INFO:tasks.workunit.client.1.vm08.stdout:1/833: dwrite d9/d11/db6/ff6 [0,4194304] 0 2026-03-09T19:27:38.080 INFO:tasks.workunit.client.1.vm08.stdout:1/834: chown d9/da/d95/dcd/fee 622099 1 2026-03-09T19:27:38.087 INFO:tasks.workunit.client.0.vm07.stdout:8/501: 
rename d7/d9/d37/d45/d56/d62/f89 to d7/d30/d32/fba 0 2026-03-09T19:27:38.092 INFO:tasks.workunit.client.1.vm08.stdout:2/615: dread d3/d4/d23/d2c/d39/d5e/d14/f78 [0,4194304] 0 2026-03-09T19:27:38.092 INFO:tasks.workunit.client.0.vm07.stdout:6/438: mknod d0/d1/db/d1d/d77/cab 0 2026-03-09T19:27:38.092 INFO:tasks.workunit.client.0.vm07.stdout:6/439: readlink d0/d13/l5c 0 2026-03-09T19:27:38.092 INFO:tasks.workunit.client.0.vm07.stdout:7/489: getdents d0/d4/d5/d8 0 2026-03-09T19:27:38.092 INFO:tasks.workunit.client.0.vm07.stdout:8/502: dwrite d7/d9/d57/fb2 [0,4194304] 0 2026-03-09T19:27:38.109 INFO:tasks.workunit.client.0.vm07.stdout:8/503: sync 2026-03-09T19:27:38.110 INFO:tasks.workunit.client.1.vm08.stdout:2/616: dread d3/d4/d23/d2c/d39/d5e/de/f1c [0,4194304] 0 2026-03-09T19:27:38.112 INFO:tasks.workunit.client.1.vm08.stdout:2/617: write d3/d9/d4a/f59 [1098983,14722] 0 2026-03-09T19:27:38.113 INFO:tasks.workunit.client.1.vm08.stdout:3/761: truncate d0/d52/d6d/d77/d88/faf 61978 0 2026-03-09T19:27:38.114 INFO:tasks.workunit.client.0.vm07.stdout:1/535: creat d1/db/fb2 x:0 0 0 2026-03-09T19:27:38.116 INFO:tasks.workunit.client.0.vm07.stdout:5/465: fsync d3/dd/d26/d3f/d47/d56/f59 0 2026-03-09T19:27:38.116 INFO:tasks.workunit.client.1.vm08.stdout:3/762: rename d0/d6/de/d1b to d0/d6/de/d1b/d16/d17/dac/dd2/dd3/df0 22 2026-03-09T19:27:38.117 INFO:tasks.workunit.client.1.vm08.stdout:6/726: link d3/d34/da9/da4/cee d3/d15/c10f 0 2026-03-09T19:27:38.155 INFO:tasks.workunit.client.1.vm08.stdout:5/665: rmdir d16/d1e/db3/dc8 0 2026-03-09T19:27:38.163 INFO:tasks.workunit.client.1.vm08.stdout:4/676: truncate da/d10/d16/d28/d46/d52/d6e/d40/f70 494063 0 2026-03-09T19:27:38.168 INFO:tasks.workunit.client.0.vm07.stdout:9/504: dwrite d0/db/d29/d2c/f54 [0,4194304] 0 2026-03-09T19:27:38.179 INFO:tasks.workunit.client.1.vm08.stdout:7/758: dwrite d5/fc [0,4194304] 0 2026-03-09T19:27:38.182 INFO:tasks.workunit.client.0.vm07.stdout:6/440: truncate d0/d1/db/d17/d4c/f7c 412553 0 
2026-03-09T19:27:38.197 INFO:tasks.workunit.client.0.vm07.stdout:1/536: rmdir d1/db/d31/d56 39 2026-03-09T19:27:38.202 INFO:tasks.workunit.client.1.vm08.stdout:2/618: fdatasync d3/d4/d23/d2c/d39/d5e/de/f17 0 2026-03-09T19:27:38.205 INFO:tasks.workunit.client.1.vm08.stdout:6/727: rename d3/f109 to d3/d94/def/f110 0 2026-03-09T19:27:38.205 INFO:tasks.workunit.client.1.vm08.stdout:2/619: chown d3/d4/d23/d2c/d39/d5e/de/d8b/f7e 284839 1 2026-03-09T19:27:38.206 INFO:tasks.workunit.client.1.vm08.stdout:6/728: write d3/d68/d7e/fbb [517971,1750] 0 2026-03-09T19:27:38.227 INFO:tasks.workunit.client.0.vm07.stdout:4/473: write d3/d11/d16/d2f/f44 [640657,126616] 0 2026-03-09T19:27:38.227 INFO:tasks.workunit.client.1.vm08.stdout:8/694: write de/f54 [998549,56201] 0 2026-03-09T19:27:38.240 INFO:tasks.workunit.client.1.vm08.stdout:0/716: write dd/d22/f2b [479309,13583] 0 2026-03-09T19:27:38.242 INFO:tasks.workunit.client.1.vm08.stdout:0/717: chown dd/d22/d27/d2e/db0/ca0 7099 1 2026-03-09T19:27:38.243 INFO:tasks.workunit.client.0.vm07.stdout:5/466: write d3/d1a/d28/d36/f8c [714977,107202] 0 2026-03-09T19:27:38.258 INFO:tasks.workunit.client.0.vm07.stdout:0/449: getdents d0/d6/d13/d17/d19 0 2026-03-09T19:27:38.259 INFO:tasks.workunit.client.0.vm07.stdout:0/450: read - d0/d6/d13/d1c/f3e zero size 2026-03-09T19:27:38.259 INFO:tasks.workunit.client.1.vm08.stdout:9/707: fsync d0/d2/d14/d5c/fc4 0 2026-03-09T19:27:38.261 INFO:tasks.workunit.client.1.vm08.stdout:5/666: readlink d16/d1e/l2d 0 2026-03-09T19:27:38.262 INFO:tasks.workunit.client.1.vm08.stdout:5/667: chown d16/d1e/d6e/l77 6 1 2026-03-09T19:27:38.268 INFO:tasks.workunit.client.0.vm07.stdout:3/543: dwrite d1/d6/d45/f5d [0,4194304] 0 2026-03-09T19:27:38.302 INFO:tasks.workunit.client.0.vm07.stdout:6/441: creat d0/d1/db/d17/d4c/d7b/fac x:0 0 0 2026-03-09T19:27:38.303 INFO:tasks.workunit.client.0.vm07.stdout:7/490: mknod d0/d52/d54/ca8 0 2026-03-09T19:27:38.307 INFO:tasks.workunit.client.0.vm07.stdout:8/504: creat d7/d9/da7/fbb x:0 
0 0 2026-03-09T19:27:38.313 INFO:tasks.workunit.client.0.vm07.stdout:1/537: rename d1/d11/d37/d5a to d1/d3e/db3 0 2026-03-09T19:27:38.314 INFO:tasks.workunit.client.0.vm07.stdout:2/557: getdents d3/dd/d16/d30/da7/dad 0 2026-03-09T19:27:38.315 INFO:tasks.workunit.client.1.vm08.stdout:3/763: rename d0/d6/de/f9d to d0/d52/d6d/d77/ddf/ff1 0 2026-03-09T19:27:38.325 INFO:tasks.workunit.client.0.vm07.stdout:4/474: symlink d3/d11/d16/d2f/d22/d70/la9 0 2026-03-09T19:27:38.327 INFO:tasks.workunit.client.1.vm08.stdout:6/729: mkdir d3/d15/d111 0 2026-03-09T19:27:38.330 INFO:tasks.workunit.client.0.vm07.stdout:5/467: creat d3/d1a/d5d/f90 x:0 0 0 2026-03-09T19:27:38.331 INFO:tasks.workunit.client.0.vm07.stdout:9/505: dwrite d0/db/d29/d4d/f95 [0,4194304] 0 2026-03-09T19:27:38.332 INFO:tasks.workunit.client.0.vm07.stdout:5/468: chown d3/d1a/d28/d40/f49 7500 1 2026-03-09T19:27:38.332 INFO:tasks.workunit.client.1.vm08.stdout:8/695: mknod de/d1d/d4f/cf1 0 2026-03-09T19:27:38.335 INFO:tasks.workunit.client.0.vm07.stdout:9/506: dwrite d0/d6/f4c [4194304,4194304] 0 2026-03-09T19:27:38.337 INFO:tasks.workunit.client.0.vm07.stdout:0/451: dread - d0/d6/d13/d17/f64 zero size 2026-03-09T19:27:38.364 INFO:tasks.workunit.client.0.vm07.stdout:7/491: dread - d0/d4/d5/d26/f4a zero size 2026-03-09T19:27:38.365 INFO:tasks.workunit.client.1.vm08.stdout:5/668: symlink d16/d1e/db3/ld4 0 2026-03-09T19:27:38.369 INFO:tasks.workunit.client.0.vm07.stdout:6/442: read d0/d1/db/f15 [6973444,91586] 0 2026-03-09T19:27:38.369 INFO:tasks.workunit.client.0.vm07.stdout:7/492: dread d0/d4/d5/d8/d1a/f4d [0,4194304] 0 2026-03-09T19:27:38.370 INFO:tasks.workunit.client.0.vm07.stdout:7/493: chown d0/d4/d5/d26/f8e 23 1 2026-03-09T19:27:38.372 INFO:tasks.workunit.client.1.vm08.stdout:1/835: creat d9/da/d2d/fff x:0 0 0 2026-03-09T19:27:38.374 INFO:tasks.workunit.client.1.vm08.stdout:1/836: dread d9/da/d12/fac [0,4194304] 0 2026-03-09T19:27:38.384 INFO:tasks.workunit.client.1.vm08.stdout:6/730: unlink d3/d68/l9e 0 
2026-03-09T19:27:38.384 INFO:tasks.workunit.client.1.vm08.stdout:1/837: dread d9/da/d12/d91/dc5/fd7 [0,4194304] 0 2026-03-09T19:27:38.384 INFO:tasks.workunit.client.1.vm08.stdout:0/718: mkdir dd/d22/de1 0 2026-03-09T19:27:38.390 INFO:tasks.workunit.client.1.vm08.stdout:5/669: mkdir d16/d8e/dd5 0 2026-03-09T19:27:38.391 INFO:tasks.workunit.client.1.vm08.stdout:7/759: link d5/fe1 d5/d14/d27/d54/dfb/d9c/dcb/fff 0 2026-03-09T19:27:38.402 INFO:tasks.workunit.client.1.vm08.stdout:0/719: rename dd/d22/d24/d49/d50/db3/fc4 to dd/d22/de1/fe2 0 2026-03-09T19:27:38.403 INFO:tasks.workunit.client.1.vm08.stdout:2/620: link d3/d9/d79/d46/d8c/d92/ca1 d3/cd3 0 2026-03-09T19:27:38.403 INFO:tasks.workunit.client.1.vm08.stdout:0/720: dread - dd/d22/d27/d2e/db0/fb2 zero size 2026-03-09T19:27:38.412 INFO:tasks.workunit.client.1.vm08.stdout:4/677: link l7 da/lcb 0 2026-03-09T19:27:38.421 INFO:tasks.workunit.client.1.vm08.stdout:7/760: dread - d5/fe1 zero size 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.1.vm08.stdout:0/721: mkdir dd/d22/d24/d49/d50/de3 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.1.vm08.stdout:2/621: fsync d3/d4/d23/d2c/d39/d5e/d14/f73 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.1.vm08.stdout:6/731: creat d3/db/d43/d69/f112 x:0 0 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.1.vm08.stdout:4/678: rmdir da/d10/d16/d28/d2f/d4f/d56 39 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.1.vm08.stdout:6/732: mkdir d3/d94/d113 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.1.vm08.stdout:0/722: mkdir dd/de4 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.1.vm08.stdout:0/723: chown dd/d22/d24/d49/d50/d78/db4/l8b 5 1 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.1.vm08.stdout:7/761: dwrite d5/d14/dae/ff6 [0,4194304] 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.0.vm07.stdout:5/469: dread d3/d1a/d28/d6c/f7a [0,4194304] 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.0.vm07.stdout:3/544: mknod d1/d3d/ca8 0 
2026-03-09T19:27:38.449 INFO:tasks.workunit.client.0.vm07.stdout:8/505: mkdir d7/d9/d37/d45/d97/dbc 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.0.vm07.stdout:6/443: mkdir d0/d1/d28/d76/dad 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.0.vm07.stdout:5/470: read d3/fe [1338332,26059] 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.0.vm07.stdout:6/444: write d0/d1/d28/d76/f97 [4213004,55077] 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.0.vm07.stdout:8/506: mknod d7/d1d/d83/cbd 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.0.vm07.stdout:0/452: dread d0/f1e [0,4194304] 0 2026-03-09T19:27:38.449 INFO:tasks.workunit.client.0.vm07.stdout:7/494: getdents d0/d4/d5/d99 0 2026-03-09T19:27:38.451 INFO:tasks.workunit.client.0.vm07.stdout:3/545: rename d1/d1f/d5c to d1/d6/dd/d51/d8e/da9 0 2026-03-09T19:27:38.453 INFO:tasks.workunit.client.1.vm08.stdout:6/733: creat d3/d34/dce/de3/f114 x:0 0 0 2026-03-09T19:27:38.453 INFO:tasks.workunit.client.0.vm07.stdout:7/495: mknod d0/d4/d5/d8/d41/d64/ca9 0 2026-03-09T19:27:38.455 INFO:tasks.workunit.client.0.vm07.stdout:0/453: read d0/d6/d13/d33/f39 [1088760,110272] 0 2026-03-09T19:27:38.464 INFO:tasks.workunit.client.0.vm07.stdout:3/546: creat d1/d3d/faa x:0 0 0 2026-03-09T19:27:38.464 INFO:tasks.workunit.client.0.vm07.stdout:3/547: stat d1/d1f/d16/d28/d7c/c80 0 2026-03-09T19:27:38.464 INFO:tasks.workunit.client.0.vm07.stdout:3/548: write d1/d6/d4c/f61 [812162,92465] 0 2026-03-09T19:27:38.464 INFO:tasks.workunit.client.0.vm07.stdout:8/507: rename d7/d9/d57/l6a to d7/d9/da7/lbe 0 2026-03-09T19:27:38.464 INFO:tasks.workunit.client.0.vm07.stdout:8/508: chown d7/d16/d1e/dab 346287718 1 2026-03-09T19:27:38.465 INFO:tasks.workunit.client.0.vm07.stdout:7/496: symlink d0/d4/d5/d26/d3c/d39/laa 0 2026-03-09T19:27:38.466 INFO:tasks.workunit.client.1.vm08.stdout:6/734: creat d3/d34/d5c/da2/d107/f115 x:0 0 0 2026-03-09T19:27:38.469 INFO:tasks.workunit.client.0.vm07.stdout:4/475: dread d3/d11/f1e [0,4194304] 0 
2026-03-09T19:27:38.470 INFO:tasks.workunit.client.1.vm08.stdout:6/735: chown d3/d34/da9/da4/fe0 84 1
2026-03-09T19:27:38.471 INFO:tasks.workunit.client.1.vm08.stdout:6/736: dread - d3/d34/d6f/f50 zero size
2026-03-09T19:27:38.480 INFO:tasks.workunit.client.0.vm07.stdout:7/497: truncate d0/d4/d5/d8/d41/d64/d74/f82 451568 0
2026-03-09T19:27:38.510 INFO:tasks.workunit.client.1.vm08.stdout:6/737: unlink d3/l82 0
2026-03-09T19:27:38.511 INFO:tasks.workunit.client.1.vm08.stdout:6/738: chown d3/d34/d5c/c62 5109 1
2026-03-09T19:27:38.511 INFO:tasks.workunit.client.0.vm07.stdout:3/549: fsync d1/d6/dd/f67 0
2026-03-09T19:27:38.511 INFO:tasks.workunit.client.0.vm07.stdout:0/454: rename d0/d6/d13/d17/c4e to d0/d6/d13/d1c/d61/d69/c97 0
2026-03-09T19:27:38.511 INFO:tasks.workunit.client.0.vm07.stdout:0/455: dread d0/d6/d13/d17/d19/f7c [0,4194304] 0
2026-03-09T19:27:38.511 INFO:tasks.workunit.client.0.vm07.stdout:3/550: mknod d1/d1f/d16/cab 0
2026-03-09T19:27:38.511 INFO:tasks.workunit.client.0.vm07.stdout:0/456: dread - d0/f68 zero size
2026-03-09T19:27:38.512 INFO:tasks.workunit.client.0.vm07.stdout:3/551: mkdir d1/d6/d45/dac 0
2026-03-09T19:27:38.521 INFO:tasks.workunit.client.1.vm08.stdout:3/764: sync
2026-03-09T19:27:38.521 INFO:tasks.workunit.client.1.vm08.stdout:5/670: sync
2026-03-09T19:27:38.522 INFO:tasks.workunit.client.1.vm08.stdout:0/724: sync
2026-03-09T19:27:38.522 INFO:tasks.workunit.client.1.vm08.stdout:5/671: chown d16/d8e 5 1
2026-03-09T19:27:38.525 INFO:tasks.workunit.client.0.vm07.stdout:0/457: rename d0/d6/c89 to d0/d6/d13/d1c/d50/d92/c98 0
2026-03-09T19:27:38.525 INFO:tasks.workunit.client.0.vm07.stdout:3/552: read - d1/d6/dd/d51/f6b zero size
2026-03-09T19:27:38.533 INFO:tasks.workunit.client.0.vm07.stdout:7/498: dread d0/d4/d5/d8/d41/d64/d74/d98/f18 [0,4194304] 0
2026-03-09T19:27:38.540 INFO:tasks.workunit.client.0.vm07.stdout:4/476: getdents d3/d11/d16/d2f/d22/d70/d93 0
2026-03-09T19:27:38.540 INFO:tasks.workunit.client.0.vm07.stdout:4/477: fsync d3/f8d 0
2026-03-09T19:27:38.554 INFO:tasks.workunit.client.0.vm07.stdout:3/553: mknod d1/d1f/cad 0
2026-03-09T19:27:38.588 INFO:tasks.workunit.client.0.vm07.stdout:1/538: write d1/d3/f12 [1072805,106142] 0
2026-03-09T19:27:38.588 INFO:tasks.workunit.client.0.vm07.stdout:2/558: write d3/d11/f19 [2043528,96779] 0
2026-03-09T19:27:38.588 INFO:tasks.workunit.client.0.vm07.stdout:9/507: write d0/d6/f48 [3990696,61593] 0
2026-03-09T19:27:38.588 INFO:tasks.workunit.client.1.vm08.stdout:8/696: write de/d25/d31/f36 [1134985,20493] 0
2026-03-09T19:27:38.588 INFO:tasks.workunit.client.1.vm08.stdout:9/708: dwrite d0/d1b/d97/d48/f53 [0,4194304] 0
2026-03-09T19:27:38.588 INFO:tasks.workunit.client.1.vm08.stdout:1/838: write d9/da/d95/dcd/fee [71472,97045] 0
2026-03-09T19:27:38.588 INFO:tasks.workunit.client.1.vm08.stdout:7/762: chown d5/d14/d27/d54/dfb/d9c/dcb/fda 351 1
2026-03-09T19:27:38.588 INFO:tasks.workunit.client.1.vm08.stdout:8/697: mkdir de/d47/d85/df2 0
2026-03-09T19:27:38.592 INFO:tasks.workunit.client.1.vm08.stdout:2/622: dwrite d3/d9/f5d [0,4194304] 0
2026-03-09T19:27:38.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:38 vm08.local ceph-mon[57794]: pgmap v173: 65 pgs: 65 active+clean; 2.4 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 26 MiB/s rd, 74 MiB/s wr, 176 op/s
2026-03-09T19:27:38.600 INFO:tasks.workunit.client.1.vm08.stdout:4/679: write da/d10/d16/d28/d4d/fa9 [801470,96970] 0
2026-03-09T19:27:38.609 INFO:tasks.workunit.client.1.vm08.stdout:7/763: mkdir d5/d14/d27/d78/d100 0
2026-03-09T19:27:38.612 INFO:tasks.workunit.client.0.vm07.stdout:6/445: write d0/d13/d1e/d95/d31/f6c [665260,30468] 0
2026-03-09T19:27:38.612 INFO:tasks.workunit.client.0.vm07.stdout:2/559: mknod d3/d49/cc4 0
2026-03-09T19:27:38.612 INFO:tasks.workunit.client.1.vm08.stdout:8/698: symlink de/d25/d31/lf3 0
2026-03-09T19:27:38.614 INFO:tasks.workunit.client.1.vm08.stdout:9/709: mkdir d0/d1b/de9 0
2026-03-09T19:27:38.614 INFO:tasks.workunit.client.0.vm07.stdout:3/554: dread d1/d6/dd/f33 [0,4194304] 0
2026-03-09T19:27:38.616 INFO:tasks.workunit.client.0.vm07.stdout:5/471: dwrite d3/d1a/d28/d40/f49 [0,4194304] 0
2026-03-09T19:27:38.616 INFO:tasks.workunit.client.0.vm07.stdout:5/472: stat d3/dd/d26/d2d 0
2026-03-09T19:27:38.626 INFO:tasks.workunit.client.0.vm07.stdout:8/509: write d7/d9/d37/d45/d56/d67/fa3 [977436,82110] 0
2026-03-09T19:27:38.627 INFO:tasks.workunit.client.1.vm08.stdout:2/623: mkdir d3/d4/d23/d2c/d39/d5e/de/d18/d99/dd4 0
2026-03-09T19:27:38.631 INFO:tasks.workunit.client.1.vm08.stdout:4/680: readlink da/d10/d16/d28/d46/d52/d6e/l19 0
2026-03-09T19:27:38.632 INFO:tasks.workunit.client.0.vm07.stdout:2/560: unlink d3/dd/d16/d29/d3c/cba 0
2026-03-09T19:27:38.634 INFO:tasks.workunit.client.1.vm08.stdout:8/699: mknod de/d25/d31/d82/d6d/d99/da0/cf4 0
2026-03-09T19:27:38.638 INFO:tasks.workunit.client.1.vm08.stdout:6/739: dwrite d3/db/d43/f56 [0,4194304] 0
2026-03-09T19:27:38.642 INFO:tasks.workunit.client.0.vm07.stdout:0/458: getdents d0/d6 0
2026-03-09T19:27:38.646 INFO:tasks.workunit.client.1.vm08.stdout:9/710: mkdir d0/d2/d14/d98/d99/dea 0
2026-03-09T19:27:38.650 INFO:tasks.workunit.client.1.vm08.stdout:1/839: creat d9/da/d53/f100 x:0 0 0
2026-03-09T19:27:38.652 INFO:tasks.workunit.client.0.vm07.stdout:4/478: getdents d3/d11/d16/d2f 0
2026-03-09T19:27:38.652 INFO:tasks.workunit.client.1.vm08.stdout:4/681: stat da/c76 0
2026-03-09T19:27:38.653 INFO:tasks.workunit.client.1.vm08.stdout:4/682: write da/d10/d16/fbf [342295,8953] 0
2026-03-09T19:27:38.656 INFO:tasks.workunit.client.1.vm08.stdout:7/764: rename d5/d14/dae/d3a/d42/d85/c80 to d5/d14/d27/d78/dc7/c101 0
2026-03-09T19:27:38.668 INFO:tasks.workunit.client.0.vm07.stdout:2/561: fsync d3/dd/d16/d29/d3c/d4c/f9d 0
2026-03-09T19:27:38.669 INFO:tasks.workunit.client.0.vm07.stdout:2/562: chown d3/dd/d16/d29/d2d/d45/d85/l88 116 1
2026-03-09T19:27:38.669 INFO:tasks.workunit.client.0.vm07.stdout:3/555: rename d1/d6/c91 to d1/d3d/cae 0
2026-03-09T19:27:38.669 INFO:tasks.workunit.client.1.vm08.stdout:8/700: mknod de/d25/d31/d82/d6d/d99/da5/db3/cf5 0
2026-03-09T19:27:38.669 INFO:tasks.workunit.client.1.vm08.stdout:8/701: read - de/d1d/d4f/fd9 zero size
2026-03-09T19:27:38.669 INFO:tasks.workunit.client.1.vm08.stdout:8/702: chown de/d25/d31/d82/d6d/d99/da5/fdd 2587301 1
2026-03-09T19:27:38.677 INFO:tasks.workunit.client.1.vm08.stdout:8/703: read de/d91/dc8/fe4 [70820,84133] 0
2026-03-09T19:27:38.683 INFO:tasks.workunit.client.0.vm07.stdout:7/499: write d0/f6c [150117,3691] 0
2026-03-09T19:27:38.687 INFO:tasks.workunit.client.1.vm08.stdout:0/725: dwrite dd/d22/d27/d4f/f97 [0,4194304] 0
2026-03-09T19:27:38.697 INFO:tasks.workunit.client.1.vm08.stdout:3/765: dwrite d0/d52/fa8 [0,4194304] 0
2026-03-09T19:27:38.697 INFO:tasks.workunit.client.1.vm08.stdout:5/672: dwrite d16/d1e/d30/d6f/fbb [0,4194304] 0
2026-03-09T19:27:38.724 INFO:tasks.workunit.client.0.vm07.stdout:2/563: stat d3/dd/d16/d29/d2d/d45/f62 0
2026-03-09T19:27:38.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:38 vm07.local ceph-mon[48545]: pgmap v173: 65 pgs: 65 active+clean; 2.4 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 26 MiB/s rd, 74 MiB/s wr, 176 op/s
2026-03-09T19:27:38.734 INFO:tasks.workunit.client.0.vm07.stdout:3/556: creat d1/d6/dd/d51/faf x:0 0 0
2026-03-09T19:27:38.734 INFO:tasks.workunit.client.0.vm07.stdout:7/500: stat d0/d52/d54/f5e 0
2026-03-09T19:27:38.735 INFO:tasks.workunit.client.0.vm07.stdout:1/539: write d1/d11/d37/d3f/d7e/f7f [335750,112591] 0
2026-03-09T19:27:38.736 INFO:tasks.workunit.client.0.vm07.stdout:7/501: write d0/d4/d5/d26/d3c/d58/f71 [340953,34887] 0
2026-03-09T19:27:38.744 INFO:tasks.workunit.client.0.vm07.stdout:6/446: getdents d0/d1/db/d17 0
2026-03-09T19:27:38.755 INFO:tasks.workunit.client.0.vm07.stdout:9/508: link d0/db/d29/d32/l63 d0/d6/d3a/d81/lb7 0
2026-03-09T19:27:38.756 INFO:tasks.workunit.client.0.vm07.stdout:9/509: read - d0/db/d9e/faf zero size
2026-03-09T19:27:38.758 INFO:tasks.workunit.client.0.vm07.stdout:8/510: dwrite d7/d30/d75/f88 [0,4194304] 0
2026-03-09T19:27:38.759 INFO:tasks.workunit.client.0.vm07.stdout:5/473: truncate d3/d1a/d28/d36/f61 1644020 0
2026-03-09T19:27:38.759 INFO:tasks.workunit.client.0.vm07.stdout:5/474: readlink d3/dd/l13 0
2026-03-09T19:27:38.759 INFO:tasks.workunit.client.0.vm07.stdout:0/459: dwrite d0/d6/d13/d1c/d11/f5f [4194304,4194304] 0
2026-03-09T19:27:38.760 INFO:tasks.workunit.client.0.vm07.stdout:8/511: read - d7/d50/da6/faf zero size
2026-03-09T19:27:38.763 INFO:tasks.workunit.client.0.vm07.stdout:8/512: dread d7/d9/d10/d44/f6c [0,4194304] 0
2026-03-09T19:27:38.767 INFO:tasks.workunit.client.0.vm07.stdout:3/557: sync
2026-03-09T19:27:38.784 INFO:tasks.workunit.client.0.vm07.stdout:6/447: mkdir d0/d4e/dae 0
2026-03-09T19:27:38.787 INFO:tasks.workunit.client.0.vm07.stdout:3/558: dread d1/d6/d4c/f61 [0,4194304] 0
2026-03-09T19:27:38.794 INFO:tasks.workunit.client.1.vm08.stdout:1/840: truncate d9/da/d53/fae 908742 0
2026-03-09T19:27:38.795 INFO:tasks.workunit.client.0.vm07.stdout:9/510: rename d0/d6f/c75 to d0/d6/cb8 0
2026-03-09T19:27:38.800 INFO:tasks.workunit.client.0.vm07.stdout:1/540: dread d1/d3e/f49 [0,4194304] 0
2026-03-09T19:27:38.808 INFO:tasks.workunit.client.0.vm07.stdout:4/479: getdents d3/d11/d2b 0
2026-03-09T19:27:38.818 INFO:tasks.workunit.client.1.vm08.stdout:7/765: mknod d5/d14/d27/d54/c102 0
2026-03-09T19:27:38.846 INFO:tasks.workunit.client.0.vm07.stdout:2/564: write d3/dd/f24 [2645874,55025] 0
2026-03-09T19:27:38.851 INFO:tasks.workunit.client.1.vm08.stdout:9/711: dwrite d0/d2/d14/d5c/fb0 [0,4194304] 0
2026-03-09T19:27:38.853 INFO:tasks.workunit.client.0.vm07.stdout:2/565: dwrite d3/dd/d16/d29/d3c/d5a/fbe [0,4194304] 0
2026-03-09T19:27:38.858 INFO:tasks.workunit.client.0.vm07.stdout:2/566: stat d3/dd/d16/d29/d2d/d45/d3b/d44/f5d 0
2026-03-09T19:27:38.865 INFO:tasks.workunit.client.0.vm07.stdout:5/475: readlink d3/dd/d26/d3f/d47/l67 0
2026-03-09T19:27:38.886 INFO:tasks.workunit.client.1.vm08.stdout:5/673: creat d16/d1e/d9f/fd6 x:0 0 0
2026-03-09T19:27:38.894 INFO:tasks.workunit.client.1.vm08.stdout:5/674: chown d16/d1e/d30/d6f/c97 0 1
2026-03-09T19:27:38.909 INFO:tasks.workunit.client.0.vm07.stdout:3/559: creat d1/d6/dd/fb0 x:0 0 0
2026-03-09T19:27:38.912 INFO:tasks.workunit.client.1.vm08.stdout:2/624: truncate d3/d4/d3e/d9d/fc5 2481510 0
2026-03-09T19:27:38.916 INFO:tasks.workunit.client.1.vm08.stdout:0/726: write dd/d22/d24/d49/d50/f9b [447460,89877] 0
2026-03-09T19:27:38.917 INFO:tasks.workunit.client.0.vm07.stdout:0/460: dwrite d0/f3a [0,4194304] 0
2026-03-09T19:27:38.919 INFO:tasks.workunit.client.1.vm08.stdout:4/683: dwrite da/d10/d26/d3a/d91/fb4 [0,4194304] 0
2026-03-09T19:27:38.930 INFO:tasks.workunit.client.1.vm08.stdout:1/841: fsync d9/d40/f57 0
2026-03-09T19:27:38.932 INFO:tasks.workunit.client.0.vm07.stdout:1/541: rmdir d1/d11/d37/d3f/d7e 39
2026-03-09T19:27:38.939 INFO:tasks.workunit.client.0.vm07.stdout:4/480: creat d3/d11/d51/faa x:0 0 0
2026-03-09T19:27:38.944 INFO:tasks.workunit.client.0.vm07.stdout:2/567: symlink d3/dd/d16/d30/da7/dad/lc5 0
2026-03-09T19:27:38.953 INFO:tasks.workunit.client.1.vm08.stdout:5/675: mknod d16/d1e/d9b/cd7 0
2026-03-09T19:27:38.967 INFO:tasks.workunit.client.0.vm07.stdout:6/448: mkdir d0/d4e/dae/daf 0
2026-03-09T19:27:38.968 INFO:tasks.workunit.client.0.vm07.stdout:3/560: creat d1/d6/d4c/fb1 x:0 0 0
2026-03-09T19:27:38.968 INFO:tasks.workunit.client.0.vm07.stdout:0/461: write d0/d6/d13/d17/f20 [1154226,48970] 0
2026-03-09T19:27:38.969 INFO:tasks.workunit.client.0.vm07.stdout:3/561: stat d1/d6/dd/d51/d8e/ca3 0
2026-03-09T19:27:38.970 INFO:tasks.workunit.client.0.vm07.stdout:1/542: symlink d1/d11/lb4 0
2026-03-09T19:27:38.972 INFO:tasks.workunit.client.0.vm07.stdout:6/449: sync
2026-03-09T19:27:38.972 INFO:tasks.workunit.client.0.vm07.stdout:4/481: rmdir d3/d11/d16/d2f/d91 39
2026-03-09T19:27:38.977 INFO:tasks.workunit.client.1.vm08.stdout:1/842: mkdir d9/d11/d7a/d89/de7/d101 0
2026-03-09T19:27:38.980 INFO:tasks.workunit.client.0.vm07.stdout:2/568: mknod d3/dd/d16/d29/d2d/d45/d85/d8a/cc6 0
2026-03-09T19:27:38.982 INFO:tasks.workunit.client.1.vm08.stdout:8/704: rmdir de/d47/de6 0
2026-03-09T19:27:38.987 INFO:tasks.workunit.client.0.vm07.stdout:7/502: getdents d0/d52/d54 0
2026-03-09T19:27:39.003 INFO:tasks.workunit.client.1.vm08.stdout:3/766: rmdir d0/d6/d93/dcb/dde 0
2026-03-09T19:27:39.019 INFO:tasks.workunit.client.0.vm07.stdout:9/511: creat d0/db/d29/d32/fb9 x:0 0 0
2026-03-09T19:27:39.022 INFO:tasks.workunit.client.1.vm08.stdout:5/676: chown d16/d45/fb1 1 1
2026-03-09T19:27:39.025 INFO:tasks.workunit.client.1.vm08.stdout:6/740: getdents d3/d34/d5c/da2/dd6 0
2026-03-09T19:27:39.030 INFO:tasks.workunit.client.0.vm07.stdout:0/462: mkdir d0/d6/d13/d1c/d50/d92/d99 0
2026-03-09T19:27:39.048 INFO:tasks.workunit.client.0.vm07.stdout:1/543: dread d1/f38 [0,4194304] 0
2026-03-09T19:27:39.050 INFO:tasks.workunit.client.1.vm08.stdout:9/712: write d0/d1b/d97/d48/d6f/f79 [388863,58376] 0
2026-03-09T19:27:39.053 INFO:tasks.workunit.client.1.vm08.stdout:4/684: write da/d10/f3d [4328849,80886] 0
2026-03-09T19:27:39.054 INFO:tasks.workunit.client.0.vm07.stdout:5/476: dwrite d3/f68 [4194304,4194304] 0
2026-03-09T19:27:39.076 INFO:tasks.workunit.client.1.vm08.stdout:2/625: dwrite d3/d9/d26/f6a [0,4194304] 0
2026-03-09T19:27:39.081 INFO:tasks.workunit.client.1.vm08.stdout:2/626: chown d3/d4/d23/d2c/d39/d5e/de/d18/f93 241 1
2026-03-09T19:27:39.092 INFO:tasks.workunit.client.0.vm07.stdout:4/482: creat d3/d11/d16/fab x:0 0 0
2026-03-09T19:27:39.093 INFO:tasks.workunit.client.0.vm07.stdout:2/569: mkdir d3/dd/daa/dc7 0
2026-03-09T19:27:39.093 INFO:tasks.workunit.client.1.vm08.stdout:0/727: symlink dd/d22/le5 0
2026-03-09T19:27:39.093 INFO:tasks.workunit.client.0.vm07.stdout:8/513: getdents d7/d9/d37/d45/d56/d62 0
2026-03-09T19:27:39.093 INFO:tasks.workunit.client.1.vm08.stdout:1/843: creat d9/d11/d7a/d89/d8d/daa/f102 x:0 0 0
2026-03-09T19:27:39.094 INFO:tasks.workunit.client.0.vm07.stdout:8/514: chown d7/d9/d37/d45/d4f/c78 854 1
2026-03-09T19:27:39.120 INFO:tasks.workunit.client.1.vm08.stdout:5/677: unlink d16/d45/d81/lad 0
2026-03-09T19:27:39.121 INFO:tasks.workunit.client.0.vm07.stdout:9/512: mkdir d0/db/d29/d2c/d36/d7d/dba 0
2026-03-09T19:27:39.124 INFO:tasks.workunit.client.1.vm08.stdout:5/678: dwrite d16/d1e/d30/fb4 [0,4194304] 0
2026-03-09T19:27:39.130 INFO:tasks.workunit.client.1.vm08.stdout:6/741: symlink d3/d94/def/l116 0
2026-03-09T19:27:39.135 INFO:tasks.workunit.client.0.vm07.stdout:3/562: symlink d1/d1f/lb2 0
2026-03-09T19:27:39.136 INFO:tasks.workunit.client.0.vm07.stdout:7/503: write d0/d4/d5/d26/d3c/d58/f70 [2123792,125652] 0
2026-03-09T19:27:39.137 INFO:tasks.workunit.client.0.vm07.stdout:0/463: mknod d0/d6/d13/d33/c9a 0
2026-03-09T19:27:39.140 INFO:tasks.workunit.client.0.vm07.stdout:7/504: truncate d0/d4/d5/d26/f75 1164279 0
2026-03-09T19:27:39.143 INFO:tasks.workunit.client.1.vm08.stdout:8/705: dwrite de/d1d/d2e/d5f/f57 [0,4194304] 0
2026-03-09T19:27:39.148 INFO:tasks.workunit.client.1.vm08.stdout:3/767: dwrite d0/d8/d24/f2d [0,4194304] 0
2026-03-09T19:27:39.148 INFO:tasks.workunit.client.1.vm08.stdout:8/706: write de/d91/fd1 [415528,126181] 0
2026-03-09T19:27:39.148 INFO:tasks.workunit.client.1.vm08.stdout:3/768: chown d0/d6/de/d6e/d51/d7f/de3/fe8 11506683 1
2026-03-09T19:27:39.155 INFO:tasks.workunit.client.1.vm08.stdout:8/707: write de/d1d/d69/f84 [1059450,86811] 0
2026-03-09T19:27:39.160 INFO:tasks.workunit.client.0.vm07.stdout:0/464: dread d0/d6/d13/d1c/f27 [0,4194304] 0
2026-03-09T19:27:39.161 INFO:tasks.workunit.client.0.vm07.stdout:0/465: chown d0/f3a 110935736 1
2026-03-09T19:27:39.168 INFO:tasks.workunit.client.0.vm07.stdout:1/544: fdatasync d1/d11/d37/d3f/d45/f98 0
2026-03-09T19:27:39.170 INFO:tasks.workunit.client.1.vm08.stdout:2/627: dread - d3/d4/d23/fc0 zero size
2026-03-09T19:27:39.171 INFO:tasks.workunit.client.1.vm08.stdout:7/766: rename d5/d14/d27/d54/dfb/d9c/dcb/fcd to d5/d14/dae/f103 0
2026-03-09T19:27:39.173 INFO:tasks.workunit.client.0.vm07.stdout:5/477: write d3/f19 [8972523,91515] 0
2026-03-09T19:27:39.174 INFO:tasks.workunit.client.1.vm08.stdout:1/844: unlink d9/d11/db6/ff6 0
2026-03-09T19:27:39.174 INFO:tasks.workunit.client.1.vm08.stdout:7/767: write d5/d14/d27/d78/dc7/fd4 [980142,36308] 0
2026-03-09T19:27:39.175 INFO:tasks.workunit.client.1.vm08.stdout:7/768: stat d5/d14/dae/d3a/d42/ff2 0
2026-03-09T19:27:39.179 INFO:tasks.workunit.client.0.vm07.stdout:8/515: read - d7/d30/d32/fa9 zero size
2026-03-09T19:27:39.180 INFO:tasks.workunit.client.1.vm08.stdout:5/679: creat d16/d1e/dc9/fd8 x:0 0 0
2026-03-09T19:27:39.180 INFO:tasks.workunit.client.0.vm07.stdout:9/513: chown d0/db/d29/d32/l92 4 1
2026-03-09T19:27:39.182 INFO:tasks.workunit.client.1.vm08.stdout:6/742: write d3/d15/fc9 [9123907,97988] 0
2026-03-09T19:27:39.195 INFO:tasks.workunit.client.1.vm08.stdout:3/769: dread d0/d6/f39 [4194304,4194304] 0
2026-03-09T19:27:39.200 INFO:tasks.workunit.client.1.vm08.stdout:0/728: rename dd/d22/d24/d49/d50/d78/d86/laf to dd/d22/d27/d65/le6 0
2026-03-09T19:27:39.206 INFO:tasks.workunit.client.0.vm07.stdout:5/478: unlink d3/d1a/d5d/f5f 0
2026-03-09T19:27:39.207 INFO:tasks.workunit.client.0.vm07.stdout:8/516: creat d7/d30/d75/fbf x:0 0 0
2026-03-09T19:27:39.214 INFO:tasks.workunit.client.0.vm07.stdout:3/563: dread d1/d1f/d16/d28/f34 [0,4194304] 0
2026-03-09T19:27:39.226 INFO:tasks.workunit.client.0.vm07.stdout:1/545: mkdir d1/d11/d37/d3f/db5 0
2026-03-09T19:27:39.227 INFO:tasks.workunit.client.0.vm07.stdout:1/546: chown d1/d11/d37/d3f/d45/d87/fa9 29 1
2026-03-09T19:27:39.230 INFO:tasks.workunit.client.0.vm07.stdout:1/547: sync
2026-03-09T19:27:39.232 INFO:tasks.workunit.client.1.vm08.stdout:6/743: dread - d3/d68/fec zero size
2026-03-09T19:27:39.234 INFO:tasks.workunit.client.0.vm07.stdout:6/450: getdents d0/d13/d1e/d95 0
2026-03-09T19:27:39.235 INFO:tasks.workunit.client.0.vm07.stdout:2/570: creat d3/fc8 x:0 0 0
2026-03-09T19:27:39.235 INFO:tasks.workunit.client.1.vm08.stdout:3/770: mknod d0/d6/de/d1a/cf2 0
2026-03-09T19:27:39.238 INFO:tasks.workunit.client.1.vm08.stdout:4/685: read da/d10/d16/d28/d2f/d4f/d64/d84/d8a/fb9 [174913,122455] 0
2026-03-09T19:27:39.254 INFO:tasks.workunit.client.0.vm07.stdout:8/517: creat d7/d9/d37/d45/d56/fc0 x:0 0 0
2026-03-09T19:27:39.258 INFO:tasks.workunit.client.0.vm07.stdout:5/479: dread d3/f4e [0,4194304] 0
2026-03-09T19:27:39.269 INFO:tasks.workunit.client.1.vm08.stdout:9/713: write d0/d2/d14/f19 [1139649,53665] 0
2026-03-09T19:27:39.270 INFO:tasks.workunit.client.0.vm07.stdout:3/564: rmdir d1/d1f/d16/d28/d7c 39
2026-03-09T19:27:39.285 INFO:tasks.workunit.client.1.vm08.stdout:6/744: dread - d3/d15/ffd zero size
2026-03-09T19:27:39.298 INFO:tasks.workunit.client.0.vm07.stdout:4/483: write d3/f13 [664,78064] 0
2026-03-09T19:27:39.298 INFO:tasks.workunit.client.0.vm07.stdout:4/484: stat d3/l90 0
2026-03-09T19:27:39.304 INFO:tasks.workunit.client.0.vm07.stdout:2/571: mkdir d3/dd/d16/d29/d3c/d5a/d7a/d74/dc9 0
2026-03-09T19:27:39.305 INFO:tasks.workunit.client.0.vm07.stdout:2/572: write d3/fc8 [997149,20444] 0
2026-03-09T19:27:39.306 INFO:tasks.workunit.client.0.vm07.stdout:2/573: chown d3/dd/d16/d30/da7 3 1
2026-03-09T19:27:39.314 INFO:tasks.workunit.client.0.vm07.stdout:0/466: write d0/f65 [485186,71336] 0
2026-03-09T19:27:39.315 INFO:tasks.workunit.client.0.vm07.stdout:8/518: creat d7/d9/d37/d45/d56/fc1 x:0 0 0
2026-03-09T19:27:39.315 INFO:tasks.workunit.client.0.vm07.stdout:0/467: write d0/f93 [502035,51185] 0
2026-03-09T19:27:39.323 INFO:tasks.workunit.client.1.vm08.stdout:9/714: dread - d0/d2/fc1 zero size
2026-03-09T19:27:39.328 INFO:tasks.workunit.client.1.vm08.stdout:9/715: dwrite d0/d2/f1a [8388608,4194304] 0
2026-03-09T19:27:39.337 INFO:tasks.workunit.client.0.vm07.stdout:5/480: rename d3/d1a/f17 to d3/d1a/d28/d6c/d72/d8f/f91 0
2026-03-09T19:27:39.337 INFO:tasks.workunit.client.0.vm07.stdout:5/481: stat d3/f19 0
2026-03-09T19:27:39.344 INFO:tasks.workunit.client.1.vm08.stdout:8/708: dwrite de/d25/d31/d82/d6d/f88 [0,4194304] 0
2026-03-09T19:27:39.353 INFO:tasks.workunit.client.0.vm07.stdout:5/482: sync
2026-03-09T19:27:39.353 INFO:tasks.workunit.client.0.vm07.stdout:7/505: getdents d0/d4/d5/d8/d41/d64/d74 0
2026-03-09T19:27:39.354 INFO:tasks.workunit.client.0.vm07.stdout:7/506: sync
2026-03-09T19:27:39.355 INFO:tasks.workunit.client.0.vm07.stdout:0/468: dread d0/f41 [0,4194304] 0
2026-03-09T19:27:39.367 INFO:tasks.workunit.client.0.vm07.stdout:7/507: dread d0/d4/d5/d8/d41/d64/d74/f88 [0,4194304] 0
2026-03-09T19:27:39.371 INFO:tasks.workunit.client.1.vm08.stdout:2/628: dwrite d3/d4/f55 [0,4194304] 0
2026-03-09T19:27:39.384 INFO:tasks.workunit.client.1.vm08.stdout:1/845: write d9/d11/d7a/d89/d8d/da3/fde [528821,71831] 0
2026-03-09T19:27:39.390 INFO:tasks.workunit.client.1.vm08.stdout:9/716: creat d0/d1b/d97/d48/d5e/feb x:0 0 0
2026-03-09T19:27:39.396 INFO:tasks.workunit.client.1.vm08.stdout:7/769: getdents d5/d14/d27/d78 0
2026-03-09T19:27:39.400 INFO:tasks.workunit.client.1.vm08.stdout:0/729: dwrite dd/d22/f3e [4194304,4194304] 0
2026-03-09T19:27:39.420 INFO:tasks.workunit.client.0.vm07.stdout:9/514: truncate d0/db/d29/d2c/f34 4131352 0
2026-03-09T19:27:39.421 INFO:tasks.workunit.client.0.vm07.stdout:1/548: write d1/f38 [1949170,104570] 0
2026-03-09T19:27:39.421 INFO:tasks.workunit.client.0.vm07.stdout:9/515: write d0/db/d29/fb3 [985747,123054] 0
2026-03-09T19:27:39.424 INFO:tasks.workunit.client.0.vm07.stdout:6/451: dwrite d0/d1/d28/f64 [0,4194304] 0
2026-03-09T19:27:39.431 INFO:tasks.workunit.client.1.vm08.stdout:4/686: dwrite da/d10/d16/d28/d2f/d4f/d64/d81/faa [0,4194304] 0
2026-03-09T19:27:39.432 INFO:tasks.workunit.client.1.vm08.stdout:4/687: chown da/d10/d16/d28/d2f/d4f/c58 114046 1
2026-03-09T19:27:39.435 INFO:tasks.workunit.client.1.vm08.stdout:5/680: dwrite d16/d1e/d8c/d99/da8/d9a/fac [0,4194304] 0
2026-03-09T19:27:39.438 INFO:tasks.workunit.client.1.vm08.stdout:3/771: dwrite d0/d6/de/d6e/d51/f70 [0,4194304] 0
2026-03-09T19:27:39.472 INFO:tasks.workunit.client.1.vm08.stdout:7/770: symlink d5/d14/d27/d78/l104 0
2026-03-09T19:27:39.490 INFO:tasks.workunit.client.1.vm08.stdout:0/730: dread dd/d22/d7b/d82/fc7 [0,4194304] 0
2026-03-09T19:27:39.493 INFO:tasks.workunit.client.0.vm07.stdout:2/574: write d3/d49/f82 [1584064,117055] 0
2026-03-09T19:27:39.495 INFO:tasks.workunit.client.1.vm08.stdout:6/745: dwrite d3/d55/fe1 [0,4194304] 0
2026-03-09T19:27:39.501 INFO:tasks.workunit.client.1.vm08.stdout:6/746: dwrite d3/db/d43/d69/f112 [0,4194304] 0
2026-03-09T19:27:39.504 INFO:tasks.workunit.client.1.vm08.stdout:6/747: truncate d3/d94/def/f110 417159 0
2026-03-09T19:27:39.522 INFO:tasks.workunit.client.1.vm08.stdout:4/688: chown da/lcb 2528884 1
2026-03-09T19:27:39.522 INFO:tasks.workunit.client.1.vm08.stdout:4/689: stat da/d10/lbc 0
2026-03-09T19:27:39.530 INFO:tasks.workunit.client.1.vm08.stdout:3/772: symlink d0/d6/de/d1b/d16/d17/dac/dd2/dd3/lf3 0
2026-03-09T19:27:39.530 INFO:tasks.workunit.client.1.vm08.stdout:3/773: chown d0/d52 134807501 1
2026-03-09T19:27:39.532 INFO:tasks.workunit.client.1.vm08.stdout:9/717: link d0/d2/d14/f19 d0/d2/d80/fec 0
2026-03-09T19:27:39.535 INFO:tasks.workunit.client.1.vm08.stdout:6/748: rename d3/d34/d5c/de8 to d3/d34/da9/da4/d117 0
2026-03-09T19:27:39.539 INFO:tasks.workunit.client.1.vm08.stdout:2/629: creat d3/d4/d3e/fd5 x:0 0 0
2026-03-09T19:27:39.543 INFO:tasks.workunit.client.1.vm08.stdout:6/749: truncate d3/d15/ffd 767232 0
2026-03-09T19:27:39.545 INFO:tasks.workunit.client.0.vm07.stdout:3/565: rename d1/d6/dd/d51 to d1/d3d/d47/db3 0
2026-03-09T19:27:39.553 INFO:tasks.workunit.client.0.vm07.stdout:5/483: read d3/d1a/d28/d48/f69 [3182553,87843] 0
2026-03-09T19:27:39.555 INFO:tasks.workunit.client.1.vm08.stdout:7/771: sync
2026-03-09T19:27:39.559 INFO:tasks.workunit.client.0.vm07.stdout:0/469: mknod d0/d6/d13/d17/d19/d57/d6a/c9b 0
2026-03-09T19:27:39.560 INFO:tasks.workunit.client.0.vm07.stdout:0/470: read d0/d6/f43 [1775063,33826] 0
2026-03-09T19:27:39.560 INFO:tasks.workunit.client.0.vm07.stdout:4/485: symlink d3/d11/lac 0
2026-03-09T19:27:39.561 INFO:tasks.workunit.client.1.vm08.stdout:9/718: rmdir d0/d1b/d97/d48/d5d/d74 39
2026-03-09T19:27:39.565 INFO:tasks.workunit.client.1.vm08.stdout:0/731: creat dd/d22/fe7 x:0 0 0
2026-03-09T19:27:39.568 INFO:tasks.workunit.client.0.vm07.stdout:1/549: rmdir d1/db 39
2026-03-09T19:27:39.569 INFO:tasks.workunit.client.0.vm07.stdout:1/550: dread - d1/d3e/db3/d6d/fac zero size
2026-03-09T19:27:39.570 INFO:tasks.workunit.client.0.vm07.stdout:9/516: truncate d0/db/d29/d2c/d36/fa1 239977 0
2026-03-09T19:27:39.594 INFO:tasks.workunit.client.1.vm08.stdout:5/681: link d16/l1a d16/d1e/d8c/d99/da8/d9a/ld9 0
2026-03-09T19:27:39.594 INFO:tasks.workunit.client.1.vm08.stdout:4/690: creat da/d10/d26/d3a/fcc x:0 0 0
2026-03-09T19:27:39.607 INFO:tasks.workunit.client.0.vm07.stdout:2/575: unlink d3/fc8 0
2026-03-09T19:27:39.626 INFO:tasks.workunit.client.1.vm08.stdout:7/772: creat d5/d14/d2b/daa/f105 x:0 0 0
2026-03-09T19:27:39.642 INFO:tasks.workunit.client.1.vm08.stdout:8/709: write de/d47/faa [331141,69223] 0
2026-03-09T19:27:39.643 INFO:tasks.workunit.client.1.vm08.stdout:1/846: write d9/d11/f29 [2757988,125184] 0
2026-03-09T19:27:39.643 INFO:tasks.workunit.client.1.vm08.stdout:8/710: chown de/d1d/d2e/d5f/l9b 1870426547 1
2026-03-09T19:27:39.643 INFO:tasks.workunit.client.1.vm08.stdout:8/711: chown de/fb2 56 1
2026-03-09T19:27:39.674 INFO:tasks.workunit.client.1.vm08.stdout:3/774: write d0/d8/f66 [2046182,92875] 0
2026-03-09T19:27:39.688 INFO:tasks.workunit.client.1.vm08.stdout:2/630: dwrite d3/d4/f6 [4194304,4194304] 0
2026-03-09T19:27:39.702 INFO:tasks.workunit.client.1.vm08.stdout:0/732: rmdir dd/d22/d27/d6c 39
2026-03-09T19:27:39.713 INFO:tasks.workunit.client.0.vm07.stdout:3/566: chown d1/d1f/c17 199881 1
2026-03-09T19:27:39.716 INFO:tasks.workunit.client.0.vm07.stdout:0/471: creat d0/d6/d13/d1c/d61/d69/f9c x:0 0 0
2026-03-09T19:27:39.718 INFO:tasks.workunit.client.1.vm08.stdout:4/691: chown da/d10/d16/d28/d46/d52/d6e/d2c/l5e 6978 1
2026-03-09T19:27:39.718 INFO:tasks.workunit.client.1.vm08.stdout:5/682: truncate d16/d1e/d30/d8a/fba 58678 0
2026-03-09T19:27:39.725 INFO:tasks.workunit.client.0.vm07.stdout:4/486: symlink d3/d4f/d56/d5f/lad 0
2026-03-09T19:27:39.726 INFO:tasks.workunit.client.0.vm07.stdout:1/551: mkdir d1/d11/d37/d3f/d6e/d9c/db6 0
2026-03-09T19:27:39.727 INFO:tasks.workunit.client.0.vm07.stdout:9/517: symlink d0/db/lbb 0
2026-03-09T19:27:39.728 INFO:tasks.workunit.client.0.vm07.stdout:6/452: mkdir d0/d1/d28/d76/dad/db0 0
2026-03-09T19:27:39.737 INFO:tasks.workunit.client.0.vm07.stdout:6/453: dread d0/d1/db/d24/f4d [0,4194304] 0
2026-03-09T19:27:39.738 INFO:tasks.workunit.client.0.vm07.stdout:2/576: rmdir d3/dd/d16/d29/d2d/d45/d3b/d44/d97 39
2026-03-09T19:27:39.748 INFO:tasks.workunit.client.0.vm07.stdout:8/519: link d7/d9/f87 d7/d9/d37/d45/d56/d62/fc2 0
2026-03-09T19:27:39.756 INFO:tasks.workunit.client.0.vm07.stdout:3/567: fdatasync d1/d6/dd/f33 0
2026-03-09T19:27:39.757 INFO:tasks.workunit.client.0.vm07.stdout:4/487: creat d3/d11/d16/fae x:0 0 0
2026-03-09T19:27:39.761 INFO:tasks.workunit.client.0.vm07.stdout:3/568: sync
2026-03-09T19:27:39.765 INFO:tasks.workunit.client.0.vm07.stdout:1/552: dread - d1/d11/d37/d3f/d45/d87/d88/fa5 zero size
2026-03-09T19:27:39.780 INFO:tasks.workunit.client.0.vm07.stdout:1/553: dread d1/d11/d37/f2c [0,4194304] 0
2026-03-09T19:27:39.807 INFO:tasks.workunit.client.0.vm07.stdout:2/577: symlink d3/dd/d16/d29/d2d/d45/d3b/lca 0
2026-03-09T19:27:39.819 INFO:tasks.workunit.client.0.vm07.stdout:7/508: rename d0/d4/d5/d8/l8d to d0/lab 0
2026-03-09T19:27:39.825 INFO:tasks.workunit.client.0.vm07.stdout:0/472: symlink d0/l9d 0
2026-03-09T19:27:39.825 INFO:tasks.workunit.client.0.vm07.stdout:8/520: dwrite d7/d9/d37/d34/f55 [0,4194304] 0
2026-03-09T19:27:39.827 INFO:tasks.workunit.client.0.vm07.stdout:7/509: dwrite d0/d4/d5/d26/d3c/d58/f70 [0,4194304] 0
2026-03-09T19:27:39.836 INFO:tasks.workunit.client.0.vm07.stdout:4/488: creat d3/d11/d2b/d37/faf x:0 0 0
2026-03-09T19:27:39.840 INFO:tasks.workunit.client.0.vm07.stdout:1/554: dread - d1/db/fb2 zero size
2026-03-09T19:27:39.843 INFO:tasks.workunit.client.1.vm08.stdout:8/712: mknod de/d47/dd4/cf6 0
2026-03-09T19:27:39.852 INFO:tasks.workunit.client.1.vm08.stdout:9/719: rename d0/d1b/d97/d48/d5e to d0/d1b/d97/d48/d5d/d74/ded 0
2026-03-09T19:27:39.857 INFO:tasks.workunit.client.0.vm07.stdout:6/454: symlink d0/d1/lb1 0
2026-03-09T19:27:39.860 INFO:tasks.workunit.client.0.vm07.stdout:6/455: dwrite d0/d1/db/d1d/f3e [0,4194304] 0
2026-03-09T19:27:39.865 INFO:tasks.workunit.client.0.vm07.stdout:6/456: read d0/d1/db/d17/f38 [2960992,90232] 0
2026-03-09T19:27:39.869 INFO:tasks.workunit.client.0.vm07.stdout:6/457: chown d0/ff 15 1
2026-03-09T19:27:39.885 INFO:tasks.workunit.client.1.vm08.stdout:2/631: symlink d3/d4/d23/d2c/d39/d5e/de/d18/d1f/ld6 0
2026-03-09T19:27:39.890 INFO:tasks.workunit.client.0.vm07.stdout:2/578: mknod d3/dd/d16/d29/d3c/da2/ccb 0
2026-03-09T19:27:39.895 INFO:tasks.workunit.client.1.vm08.stdout:0/733: truncate dd/d22/d27/d2e/d37/f40 994722 0
2026-03-09T19:27:39.899 INFO:tasks.workunit.client.1.vm08.stdout:3/775: write d0/d6/de/d1b/d16/f7b [316360,28437] 0
2026-03-09T19:27:39.899 INFO:tasks.workunit.client.1.vm08.stdout:6/750: read d3/d15/ffd [456922,66858] 0
2026-03-09T19:27:39.905 INFO:tasks.workunit.client.1.vm08.stdout:5/683: fsync d16/d45/f55 0
2026-03-09T19:27:39.911 INFO:tasks.workunit.client.0.vm07.stdout:5/484: getdents d3/dd/d26/d2d/d60/d83 0
2026-03-09T19:27:39.917 INFO:tasks.workunit.client.1.vm08.stdout:7/773: creat d5/d14/d27/d78/d100/f106 x:0 0 0
2026-03-09T19:27:39.918 INFO:tasks.workunit.client.1.vm08.stdout:7/774: chown d5/d14/d27/d78/lb6 229812 1
2026-03-09T19:27:39.927 INFO:tasks.workunit.client.1.vm08.stdout:4/692: dwrite da/d10/d16/f4b [0,4194304] 0
2026-03-09T19:27:39.931 INFO:tasks.workunit.client.0.vm07.stdout:8/521: creat d7/d9/d37/d45/d56/d62/fc3 x:0 0 0
2026-03-09T19:27:39.938 INFO:tasks.workunit.client.0.vm07.stdout:7/510: dwrite d0/d52/f62 [0,4194304] 0
2026-03-09T19:27:39.950 INFO:tasks.workunit.client.0.vm07.stdout:0/473: mkdir d0/d6/d13/d17/d19/d57/d9e 0
2026-03-09T19:27:39.960 INFO:tasks.workunit.client.1.vm08.stdout:9/720: fsync d0/d1b/f82 0
2026-03-09T19:27:39.961 INFO:tasks.workunit.client.0.vm07.stdout:4/489: dread d3/d4f/f5b [0,4194304] 0
2026-03-09T19:27:39.964 INFO:tasks.workunit.client.1.vm08.stdout:9/721: dread - d0/d2/d80/d69/fe2 zero size
2026-03-09T19:27:39.968 INFO:tasks.workunit.client.0.vm07.stdout:1/555: mkdir d1/d11/db7 0
2026-03-09T19:27:39.978 INFO:tasks.workunit.client.0.vm07.stdout:9/518: rmdir d0/d6/d3a/d7e/d7f 0
2026-03-09T19:27:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:39 vm07.local ceph-mon[48545]: pgmap v174: 65 pgs: 65 active+clean; 2.5 GiB data, 8.8 GiB used, 111 GiB / 120 GiB avail; 32 MiB/s rd, 82 MiB/s wr, 240 op/s
2026-03-09T19:27:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:39 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:27:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:39 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:27:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:39 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:27:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:39 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:27:39.979 INFO:tasks.workunit.client.1.vm08.stdout:0/734: symlink dd/d22/d27/d65/le8 0
2026-03-09T19:27:40.002 INFO:tasks.workunit.client.0.vm07.stdout:6/458: truncate d0/d13/d1e/d95/d31/f3c 8884999 0
2026-03-09T19:27:40.008 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:39 vm08.local ceph-mon[57794]: pgmap v174: 65 pgs: 65 active+clean; 2.5 GiB data, 8.8 GiB used, 111 GiB / 120 GiB avail; 32 MiB/s rd, 82 MiB/s wr, 240 op/s
2026-03-09T19:27:40.008 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:39 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:27:40.008 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:39 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:27:40.008 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:39 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:27:40.008 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:39 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym'
2026-03-09T19:27:40.009 INFO:tasks.workunit.client.1.vm08.stdout:1/847: dwrite d9/d11/f73 [0,4194304] 0
2026-03-09T19:27:40.012 INFO:tasks.workunit.client.0.vm07.stdout:3/569: write d1/d3d/d47/db3/d8e/da9/f93 [1019020,92200] 0
2026-03-09T19:27:40.025 INFO:tasks.workunit.client.1.vm08.stdout:7/775: mkdir d5/d14/d27/d54/d107 0
2026-03-09T19:27:40.025 INFO:tasks.workunit.client.1.vm08.stdout:5/684: dread d16/d1e/f7d [0,4194304] 0
2026-03-09T19:27:40.026 INFO:tasks.workunit.client.0.vm07.stdout:2/579: dread d3/f1a [0,4194304] 0
2026-03-09T19:27:40.048 INFO:tasks.workunit.client.0.vm07.stdout:7/511: rmdir d0/d4/d5/d8/d41/d64/d74 39
2026-03-09T19:27:40.052 INFO:tasks.workunit.client.0.vm07.stdout:8/522: write d7/d9/d37/d45/f4e [1675120,35424] 0
2026-03-09T19:27:40.053 INFO:tasks.workunit.client.1.vm08.stdout:4/693: truncate da/d10/d16/f9f 635705 0
2026-03-09T19:27:40.065 INFO:tasks.workunit.client.1.vm08.stdout:8/713: symlink de/d25/d87/dc9/dd8/lf7 0
2026-03-09T19:27:40.067 INFO:tasks.workunit.client.0.vm07.stdout:0/474: dread d0/d6/d13/f31 [0,4194304] 0
2026-03-09T19:27:40.068 INFO:tasks.workunit.client.0.vm07.stdout:0/475: truncate d0/d6/d13/d17/f8f 1023623 0
2026-03-09T19:27:40.073 INFO:tasks.workunit.client.0.vm07.stdout:1/556: symlink d1/d11/d37/d3f/d45/d87/d88/lb8 0
2026-03-09T19:27:40.083 INFO:tasks.workunit.client.0.vm07.stdout:9/519: rename d0/db/c55 to d0/d17/cbc 0
2026-03-09T19:27:40.109 INFO:tasks.workunit.client.1.vm08.stdout:2/632: dwrite d3/d4/d23/d2c/f31 [4194304,4194304] 0
2026-03-09T19:27:40.109 INFO:tasks.workunit.client.0.vm07.stdout:6/459: write d0/d2d/f88 [805030,62742] 0
2026-03-09T19:27:40.111 INFO:tasks.workunit.client.1.vm08.stdout:0/735: dwrite dd/d22/d7b/d82/fb1 [0,4194304] 0
2026-03-09T19:27:40.130 INFO:tasks.workunit.client.0.vm07.stdout:3/570: dread d1/d3d/f5e [4194304,4194304] 0
2026-03-09T19:27:40.139 INFO:tasks.workunit.client.0.vm07.stdout:5/485: truncate d3/dd/f23 160365 0
2026-03-09T19:27:40.139 INFO:tasks.workunit.client.0.vm07.stdout:2/580: readlink d3/d11/d38/l8e 0
2026-03-09T19:27:40.139 INFO:tasks.workunit.client.0.vm07.stdout:7/512: dread - d0/d4/d5/f85 zero size
2026-03-09T19:27:40.140 INFO:tasks.workunit.client.0.vm07.stdout:2/581: write d3/d49/faf [781353,110573] 0
2026-03-09T19:27:40.140 INFO:tasks.workunit.client.0.vm07.stdout:7/513: chown d0/d4/d5/d26/d32/f7c 3443 1
2026-03-09T19:27:40.142 INFO:tasks.workunit.client.0.vm07.stdout:2/582: read - d3/dd/d16/d29/d2d/d45/d3b/d44/f5d zero size
2026-03-09T19:27:40.143 INFO:tasks.workunit.client.1.vm08.stdout:8/714: creat de/d25/d31/d82/d6d/d99/da5/ff8 x:0 0 0
2026-03-09T19:27:40.143 INFO:tasks.workunit.client.1.vm08.stdout:5/685: fdatasync d16/d1e/d3b/d61/f7a 0
2026-03-09T19:27:40.169 INFO:tasks.workunit.client.1.vm08.stdout:8/715: dread f6 [4194304,4194304] 0
2026-03-09T19:27:40.169 INFO:tasks.workunit.client.1.vm08.stdout:8/716: write de/d7c/fe1 [904409,23101] 0
2026-03-09T19:27:40.170 INFO:tasks.workunit.client.1.vm08.stdout:8/717: readlink de/d1d/d69/l7f 0
2026-03-09T19:27:40.172 INFO:tasks.workunit.client.0.vm07.stdout:9/520: truncate d0/db/d29/d32/d5c/f78 2561143 0
2026-03-09T19:27:40.173 INFO:tasks.workunit.client.0.vm07.stdout:6/460: rename d0/d2d/l61 to d0/d1/db/d52/d94/d81/lb2 0
2026-03-09T19:27:40.179 INFO:tasks.workunit.client.0.vm07.stdout:3/571: rmdir d1/d6 39
2026-03-09T19:27:40.182 INFO:tasks.workunit.client.0.vm07.stdout:9/521: dread d0/f4 [0,4194304] 0
2026-03-09T19:27:40.188 INFO:tasks.workunit.client.0.vm07.stdout:9/522: dread d0/db/d29/d2c/d36/f3c [0,4194304] 0
2026-03-09T19:27:40.190 INFO:tasks.workunit.client.0.vm07.stdout:5/486: unlink d3/dd/d26/f7d 0
2026-03-09T19:27:40.191 INFO:tasks.workunit.client.0.vm07.stdout:5/487: read d3/fe [2085566,120308] 0
2026-03-09T19:27:40.192 INFO:tasks.workunit.client.1.vm08.stdout:6/751: creat d3/d68/f118 x:0 0 0
2026-03-09T19:27:40.203 INFO:tasks.workunit.client.0.vm07.stdout:4/490: creat d3/d11/d16/d2f/fb0 x:0 0 0
2026-03-09T19:27:40.204 INFO:tasks.workunit.client.1.vm08.stdout:2/633: creat d3/d9/d4a/d9a/fd7 x:0 0 0
2026-03-09T19:27:40.204 INFO:tasks.workunit.client.0.vm07.stdout:4/491: write d3/d11/f7d [5019196,65826] 0
2026-03-09T19:27:40.204 INFO:tasks.workunit.client.0.vm07.stdout:4/492: readlink d3/d11/l2a 0
2026-03-09T19:27:40.205 INFO:tasks.workunit.client.0.vm07.stdout:4/493: readlink d3/d11/d29/d34/l43 0
2026-03-09T19:27:40.214 INFO:tasks.workunit.client.0.vm07.stdout:7/514: mkdir d0/d4/d5/d26/d3c/dac 0
2026-03-09T19:27:40.222 INFO:tasks.workunit.client.1.vm08.stdout:1/848: write d9/d11/f56 [3747827,120821] 0
2026-03-09T19:27:40.223 INFO:tasks.workunit.client.1.vm08.stdout:7/776: stat d5/d14/d27/d54/dfb/d9c/dcb/fda 0
2026-03-09T19:27:40.227 INFO:tasks.workunit.client.0.vm07.stdout:8/523: write d7/d30/f61 [352977,75399] 0
2026-03-09T19:27:40.243 INFO:tasks.workunit.client.0.vm07.stdout:6/461: creat d0/d1/d28/fb3 x:0 0 0
2026-03-09T19:27:40.243 INFO:tasks.workunit.client.1.vm08.stdout:4/694: write da/d10/d16/d28/d46/d52/d6e/d2c/f7c [4310918,18856] 0
2026-03-09T19:27:40.267 INFO:tasks.workunit.client.1.vm08.stdout:8/718: symlink de/d47/lf9 0
2026-03-09T19:27:40.268 INFO:tasks.workunit.client.0.vm07.stdout:9/523: rename d0/d6f/c85 to d0/db/d29/d2c/d36/cbd 0
2026-03-09T19:27:40.268 INFO:tasks.workunit.client.0.vm07.stdout:9/524: chown d0/db/d29/d68/f8e 5086067 1
2026-03-09T19:27:40.274 INFO:tasks.workunit.client.0.vm07.stdout:9/525: dread d0/db/d29/fb3 [0,4194304] 0
2026-03-09T19:27:40.274 INFO:tasks.workunit.client.1.vm08.stdout:6/752: fsync d3/d34/d6f/f2f 0
2026-03-09T19:27:40.276 INFO:tasks.workunit.client.0.vm07.stdout:0/476: creat d0/f9f x:0 0 0
2026-03-09T19:27:40.335 INFO:tasks.workunit.client.0.vm07.stdout:2/583: write d3/dd/f1d [483937,27932] 0
2026-03-09T19:27:40.336 INFO:tasks.workunit.client.0.vm07.stdout:4/494: read d3/d11/d29/d34/f5c [474745,14330] 0
2026-03-09T19:27:40.339 INFO:tasks.workunit.client.0.vm07.stdout:7/515: chown d0/d52/d54/d5a/d87/d92/fa0 492806602 1
2026-03-09T19:27:40.347 INFO:tasks.workunit.client.0.vm07.stdout:1/557: link d1/d11/d37/d3f/d45/d87/d88/lb8 d1/d11/d37/d3f/d45/d87/lb9 0
2026-03-09T19:27:40.349 INFO:tasks.workunit.client.1.vm08.stdout:4/695: truncate da/d10/f77 5871420 0
2026-03-09T19:27:40.351 INFO:tasks.workunit.client.0.vm07.stdout:6/462: read d0/d1/db/d17/d4c/f60 [1362561,90674] 0
2026-03-09T19:27:40.352 INFO:tasks.workunit.client.1.vm08.stdout:2/634: write d3/d4/fa7 [3354673,121268] 0
2026-03-09T19:27:40.354 INFO:tasks.workunit.client.1.vm08.stdout:2/635: write d3/d4/d23/d2c/f31 [9158396,73480] 0
2026-03-09T19:27:40.359 INFO:tasks.workunit.client.1.vm08.stdout:1/849: write d9/da/d53/d67/d6c/d76/f99 [2058931,128617] 0
2026-03-09T19:27:40.368 INFO:tasks.workunit.client.0.vm07.stdout:3/572: creat d1/d6/d71/fb4 x:0 0 0
2026-03-09T19:27:40.375 INFO:tasks.workunit.client.0.vm07.stdout:5/488: rename d3/dd/d26/d2c to d3/d1a/d28/d40/d92 0
2026-03-09T19:27:40.378 INFO:tasks.workunit.client.1.vm08.stdout:9/722: getdents d0/d1b/d97/d48 0
2026-03-09T19:27:40.379 INFO:tasks.workunit.client.1.vm08.stdout:9/723: chown d0/d1b/d97/d48/d5d/d74/ded/feb 114463495 1
2026-03-09T19:27:40.385 INFO:tasks.workunit.client.1.vm08.stdout:8/719: unlink de/d1d/d4f/cf1 0
2026-03-09T19:27:40.386 INFO:tasks.workunit.client.1.vm08.stdout:8/720: read de/d25/d31/f8e [475314,6346] 0
2026-03-09T19:27:40.389 INFO:tasks.workunit.client.0.vm07.stdout:9/526: mkdir d0/d6/d73/dbe 0
2026-03-09T19:27:40.392 INFO:tasks.workunit.client.1.vm08.stdout:6/753: dread - d3/db/d43/d69/f89 zero size
2026-03-09T19:27:40.396 INFO:tasks.workunit.client.1.vm08.stdout:3/776: link d0/d6/de/d1b/d16/d17/dac/dd2/dd3/cea d0/d52/d6d/d77/cf4 0
2026-03-09T19:27:40.396 INFO:tasks.workunit.client.1.vm08.stdout:3/777: fdatasync d0/d6/de/d6e/d51/d7f/de3/fe8 0
2026-03-09T19:27:40.398 INFO:tasks.workunit.client.0.vm07.stdout:9/527: sync
2026-03-09T19:27:40.401 INFO:tasks.workunit.client.0.vm07.stdout:0/477: write d0/f68 [895589,105803] 0
2026-03-09T19:27:40.402 INFO:tasks.workunit.client.0.vm07.stdout:0/478: sync
2026-03-09T19:27:40.403 INFO:tasks.workunit.client.0.vm07.stdout:0/479: stat d0/d6/d13/d33/l90 0
2026-03-09T19:27:40.403 INFO:tasks.workunit.client.0.vm07.stdout:0/480: read d0/f41 [33898,44821] 0
2026-03-09T19:27:40.406 INFO:tasks.workunit.client.1.vm08.stdout:7/777: mknod d5/d14/dae/d3a/d42/c108 0
2026-03-09T19:27:40.416 INFO:tasks.workunit.client.1.vm08.stdout:4/696: dread da/f18 [0,4194304] 0
2026-03-09T19:27:40.418 INFO:tasks.workunit.client.1.vm08.stdout:4/697: chown da/d10/d26/d3a/d49/c7f 9991786 1
2026-03-09T19:27:40.418 INFO:tasks.workunit.client.1.vm08.stdout:2/636: dwrite d3/d9/d79/d46/fcb [0,4194304] 0
2026-03-09T19:27:40.419 INFO:tasks.workunit.client.1.vm08.stdout:2/637: stat d3/d9/f84 0
2026-03-09T19:27:40.441 INFO:tasks.workunit.client.1.vm08.stdout:9/724: mknod d0/d2/dc8/cee 0
2026-03-09T19:27:40.442 INFO:tasks.workunit.client.1.vm08.stdout:9/725: chown d0/d1b/d97/d48/d5d/d74/ded/fb7 3609691 1
2026-03-09T19:27:40.446 INFO:tasks.workunit.client.1.vm08.stdout:8/721: mknod de/d25/d87/dc9/cfa 0
2026-03-09T19:27:40.456 INFO:tasks.workunit.client.1.vm08.stdout:6/754: symlink d3/d34/d5c/da2/l119 0
2026-03-09T19:27:40.463 INFO:tasks.workunit.client.1.vm08.stdout:3/778: mkdir d0/d6/de/d15/d96/df5 0
2026-03-09T19:27:40.472 INFO:tasks.workunit.client.1.vm08.stdout:0/736: getdents dd/d22/d24/d49/d92 0
2026-03-09T19:27:40.482 INFO:tasks.workunit.client.1.vm08.stdout:4/698: rmdir da 39
2026-03-09T19:27:40.484 INFO:tasks.workunit.client.1.vm08.stdout:7/778: dwrite d5/d14/d27/d54/dfb/d9c/fef [0,4194304] 0
2026-03-09T19:27:40.485 INFO:tasks.workunit.client.1.vm08.stdout:7/779: dread - d5/d14/d27/fc9 zero size
2026-03-09T19:27:40.486 INFO:tasks.workunit.client.1.vm08.stdout:7/780: readlink d5/d14/dae/d1c/lde 0
2026-03-09T19:27:40.500 INFO:tasks.workunit.client.1.vm08.stdout:5/686: link d16/d8e/ca4 d16/d1e/d6e/cda 0
2026-03-09T19:27:40.507 INFO:tasks.workunit.client.0.vm07.stdout:2/584: mknod d3/dd/d16/d29/d3c/d4c/ccc 0
2026-03-09T19:27:40.511 INFO:tasks.workunit.client.1.vm08.stdout:3/779: creat d0/d4b/ff6 x:0 0 0
2026-03-09T19:27:40.511 INFO:tasks.workunit.client.1.vm08.stdout:3/780: chown d0/d6/de/c10 708 1
2026-03-09T19:27:40.511 INFO:tasks.workunit.client.0.vm07.stdout:2/585: chown d3/dd/d16/d29/d3c/d5a 23312383 1
2026-03-09T19:27:40.511 INFO:tasks.workunit.client.0.vm07.stdout:2/586: readlink d3/dd/d16/d29/d2d/d45/d85/l88 0
2026-03-09T19:27:40.511 INFO:tasks.workunit.client.0.vm07.stdout:4/495: fdatasync d3/d11/d16/d2f/d22/f58 0
2026-03-09T19:27:40.512 INFO:tasks.workunit.client.0.vm07.stdout:4/496: chown d3/d11/d16/d2f/d22/d70/d93/ca0 88055 1
2026-03-09T19:27:40.512 INFO:tasks.workunit.client.0.vm07.stdout:4/497: chown d3/d11/d2b/f71 38958675 1
2026-03-09T19:27:40.527 INFO:tasks.workunit.client.1.vm08.stdout:4/699: dread da/d10/d16/d28/f44 [0,4194304] 0
2026-03-09T19:27:40.530 INFO:tasks.workunit.client.0.vm07.stdout:8/524: creat d7/d9/d37/d45/d97/dbc/fc4 x:0 0 0
2026-03-09T19:27:40.533 INFO:tasks.workunit.client.1.vm08.stdout:2/638: symlink d3/d4/d23/d2c/ld8 0
2026-03-09T19:27:40.544 INFO:tasks.workunit.client.1.vm08.stdout:1/850: link d9/d40/d49/f7c d9/da/d2c/f103 0
2026-03-09T19:27:40.546 INFO:tasks.workunit.client.1.vm08.stdout:1/851: read d9/da/f8e [444259,106840] 0
2026-03-09T19:27:40.553 INFO:tasks.workunit.client.1.vm08.stdout:7/781: dwrite d5/d14/f1e [0,4194304] 0
2026-03-09T19:27:40.553 INFO:tasks.workunit.client.1.vm08.stdout:7/782: truncate d5/d14/d2b/d5d/f84 4362074 0
2026-03-09T19:27:40.567 INFO:tasks.workunit.client.0.vm07.stdout:6/463: creat d0/d1/db/d17/fb4 x:0 0 0
2026-03-09T19:27:40.567 INFO:tasks.workunit.client.0.vm07.stdout:9/528: unlink d0/db/f6c 0
2026-03-09T19:27:40.568 INFO:tasks.workunit.client.0.vm07.stdout:6/464: chown d0/d13/l96 21557 1
2026-03-09T19:27:40.568 INFO:tasks.workunit.client.0.vm07.stdout:2/587: mknod d3/dd/d16/d29/d3c/da2/ccd 0
2026-03-09T19:27:40.569 INFO:tasks.workunit.client.0.vm07.stdout:7/516: creat d0/d52/d54/d55/d7f/fad x:0 0 0
2026-03-09T19:27:40.570 INFO:tasks.workunit.client.0.vm07.stdout:8/525: dread - d7/d9/d37/fb4 zero size
2026-03-09T19:27:40.571 INFO:tasks.workunit.client.1.vm08.stdout:5/687: mkdir d16/d8e/ddb 0
2026-03-09T19:27:40.571 INFO:tasks.workunit.client.0.vm07.stdout:9/529: truncate d0/f56 5074016 0
2026-03-09T19:27:40.571 INFO:tasks.workunit.client.1.vm08.stdout:9/726: creat d0/d1b/d97/dd3/fef x:0 0 0
2026-03-09T19:27:40.571 INFO:tasks.workunit.client.1.vm08.stdout:5/688: write d16/d1e/f5a [2585580,40905] 0
2026-03-09T19:27:40.573 INFO:tasks.workunit.client.1.vm08.stdout:3/781: fsync d0/d52/d7c/f8f 0
2026-03-09T19:27:40.574 INFO:tasks.workunit.client.1.vm08.stdout:4/700: symlink da/d10/d16/d28/d46/d52/d6e/d73/lcd 0
2026-03-09T19:27:40.574 INFO:tasks.workunit.client.1.vm08.stdout:4/701: fsync da/d10/d26/da0/fc6 0
2026-03-09T19:27:40.578 INFO:tasks.workunit.client.0.vm07.stdout:5/489: rename d3/d1a/d5d/f90 to d3/f93 0
2026-03-09T19:27:40.579 INFO:tasks.workunit.client.0.vm07.stdout:2/588: creat d3/dd/d16/d29/d2d/d45/d85/d8a/fce x:0 0 0
2026-03-09T19:27:40.579 INFO:tasks.workunit.client.0.vm07.stdout:0/481: link d0/l88 d0/d6/d13/d1c/d11/la0 0
2026-03-09T19:27:40.581 INFO:tasks.workunit.client.1.vm08.stdout:9/727: symlink d0/d1b/d68/d7f/lf0 0
2026-03-09T19:27:40.582 INFO:tasks.workunit.client.1.vm08.stdout:9/728: readlink d0/d2/l67 0
2026-03-09T19:27:40.583 INFO:tasks.workunit.client.1.vm08.stdout:9/729: chown d0/d2/d80/d69/f93 4821590 1
2026-03-09T19:27:40.584 INFO:tasks.workunit.client.0.vm07.stdout:8/526: truncate d7/f1c 289960 0
2026-03-09T19:27:40.587 INFO:tasks.workunit.client.0.vm07.stdout:9/530: mknod d0/d6/d3a/d7e/cbf 0
2026-03-09T19:27:40.599 INFO:tasks.workunit.client.1.vm08.stdout:3/782: dread - d0/d6/fb8 zero size
2026-03-09T19:27:40.600 INFO:tasks.workunit.client.0.vm07.stdout:3/573: getdents d1 0
2026-03-09T19:27:40.600 INFO:tasks.workunit.client.0.vm07.stdout:3/574: fdatasync d1/d1f/f9c 0
2026-03-09T19:27:40.600 INFO:tasks.workunit.client.0.vm07.stdout:9/531: dread d0/db/f39 [0,4194304] 0
2026-03-09T19:27:40.601 INFO:tasks.workunit.client.1.vm08.stdout:2/639: sync
2026-03-09T19:27:40.601 INFO:tasks.workunit.client.1.vm08.stdout:1/852: sync
2026-03-09T19:27:40.601 INFO:tasks.workunit.client.0.vm07.stdout:7/517: sync
2026-03-09T19:27:40.608 INFO:tasks.workunit.client.1.vm08.stdout:7/783: getdents d5/d14/dae/dd1 0
2026-03-09T19:27:40.608 INFO:tasks.workunit.client.1.vm08.stdout:7/784: chown d5/d14/dae/d3a/d42/d6a/d8f/ld7 14054256 1
2026-03-09T19:27:40.610 INFO:tasks.workunit.client.1.vm08.stdout:5/689: mknod d16/d1e/d8c/d99/da8/cdc 0
2026-03-09T19:27:40.612 INFO:tasks.workunit.client.1.vm08.stdout:8/722: getdents de/d91/dc8/de9 0
2026-03-09T19:27:40.623 INFO:tasks.workunit.client.0.vm07.stdout:2/589: unlink d3/dd/d16/d29/d3c/d5a/d7a/l92 0
2026-03-09T19:27:40.626 INFO:tasks.workunit.client.0.vm07.stdout:8/527: truncate d7/d1d/f3f 1334650 0
2026-03-09T19:27:40.628 INFO:tasks.workunit.client.1.vm08.stdout:0/737: dwrite dd/d22/f41 [0,4194304] 0
2026-03-09T19:27:40.634 INFO:tasks.workunit.client.0.vm07.stdout:8/528: dwrite d7/d9/d37/d45/f4e [0,4194304] 0
2026-03-09T19:27:40.635 INFO:tasks.workunit.client.0.vm07.stdout:8/529: stat d7/d16/d1e/dab 0
2026-03-09T19:27:40.637 INFO:tasks.workunit.client.0.vm07.stdout:4/498: write d3/d11/d2b/d38/f8a [305710,48973] 0
2026-03-09T19:27:40.646 INFO:tasks.workunit.client.0.vm07.stdout:6/465: dwrite d0/d1/f92 [0,4194304] 0
2026-03-09T19:27:40.653 INFO:tasks.workunit.client.0.vm07.stdout:1/558: dwrite d1/d11/d37/f40 [0,4194304] 0
2026-03-09T19:27:40.658 INFO:tasks.workunit.client.1.vm08.stdout:6/755: getdents d3 0
2026-03-09T19:27:40.661 INFO:tasks.workunit.client.0.vm07.stdout:8/530: dwrite d7/d30/fb7 [0,4194304] 0
2026-03-09T19:27:40.663 INFO:tasks.workunit.client.0.vm07.stdout:0/482: dwrite d0/d6/d13/d1c/d11/d56/f67 [0,4194304] 0
2026-03-09T19:27:40.664 INFO:tasks.workunit.client.1.vm08.stdout:3/783: mkdir d0/d52/d6d/d77/d88/df7 0
2026-03-09T19:27:40.666 INFO:tasks.workunit.client.0.vm07.stdout:4/499: dread d3/d11/f6c [0,4194304] 0
2026-03-09T19:27:40.666 INFO:tasks.workunit.client.0.vm07.stdout:3/575: stat d1/d74/f6e 0
2026-03-09T19:27:40.775 INFO:tasks.workunit.client.0.vm07.stdout:7/518: rename d0/d52/d54/d55/d57 to d0/d4/d5/d26/d32/dae 0
2026-03-09T19:27:40.785 INFO:tasks.workunit.client.0.vm07.stdout:6/466: mknod d0/d1/db/d17/d4c/cb5 0
2026-03-09T19:27:40.787 INFO:tasks.workunit.client.0.vm07.stdout:8/531: truncate d7/d9/d37/d45/d56/d67/f7b 1031022 0
2026-03-09T19:27:40.796 INFO:tasks.workunit.client.0.vm07.stdout:8/532: sync
2026-03-09T19:27:40.808 INFO:tasks.workunit.client.0.vm07.stdout:0/483: mkdir d0/d6/d13/da1 0
2026-03-09T19:27:40.809 INFO:tasks.workunit.client.0.vm07.stdout:8/533: mkdir d7/d50/da6/dc5 0
2026-03-09T19:27:40.818 INFO:tasks.workunit.client.1.vm08.stdout:9/730: write d0/d2/f2a [4602435,93717] 0
2026-03-09T19:27:40.819 INFO:tasks.workunit.client.0.vm07.stdout:0/484: rename d0/d6/d13/f4c to d0/d6/d13/d17/d19/d58/fa2 0
2026-03-09T19:27:40.833 INFO:tasks.workunit.client.0.vm07.stdout:8/534: creat d7/d1d/d83/d9f/fc6 x:0 0 0
2026-03-09T19:27:40.838 INFO:tasks.workunit.client.0.vm07.stdout:8/535: sync
2026-03-09T19:27:40.841 INFO:tasks.workunit.client.0.vm07.stdout:8/536: truncate d7/d9/d37/d45/d56/f7a 5029051 0
2026-03-09T19:27:40.841 INFO:tasks.workunit.client.0.vm07.stdout:0/485: getdents d0/d6/d13/d1c/d61 0
2026-03-09T19:27:40.842 INFO:tasks.workunit.client.0.vm07.stdout:0/486: stat d0/d6/d13/d33/c9a 0
2026-03-09T19:27:40.842 INFO:tasks.workunit.client.0.vm07.stdout:8/537: symlink d7/d30/d32/lc7 0
2026-03-09T19:27:40.849 INFO:tasks.workunit.client.0.vm07.stdout:8/538: creat d7/d9/d37/d45/d56/d62/fc8 x:0 0 0
2026-03-09T19:27:40.861 INFO:tasks.workunit.client.1.vm08.stdout:0/738: symlink dd/de4/le9 0
2026-03-09T19:27:40.868 INFO:tasks.workunit.client.1.vm08.stdout:9/731: fsync d0/d2/d80/de5/da2/da8/de8/f2d 0
2026-03-09T19:27:40.874 INFO:tasks.workunit.client.1.vm08.stdout:3/784: mkdir d0/d6/de/d15/d96/df5/df8 0
2026-03-09T19:27:40.884 INFO:tasks.workunit.client.1.vm08.stdout:9/732: read d0/d1b/f9f [2765088,125387] 0
2026-03-09T19:27:40.887 INFO:tasks.workunit.client.1.vm08.stdout:3/785: creat d0/d6/de/d6e/d51/d7f/de3/ff9 x:0 0 0
2026-03-09T19:27:40.889 INFO:tasks.workunit.client.1.vm08.stdout:2/640: getdents d3/d4/d23/d2c/d39/d5e/de/d18/da9 0
2026-03-09T19:27:40.906 INFO:tasks.workunit.client.1.vm08.stdout:2/641: read d3/d9/d26/f35 [2092849,21725] 0
2026-03-09T19:27:40.912 INFO:tasks.workunit.client.1.vm08.stdout:3/786: dread d0/d6/de/d1b/d16/d17/fbc [0,4194304] 0
2026-03-09T19:27:40.916 INFO:tasks.workunit.client.1.vm08.stdout:9/733: link d0/d2/d80/de5/da2/da8/de8/f8e d0/d1b/d97/d48/ff1 0
2026-03-09T19:27:40.925 INFO:tasks.workunit.client.1.vm08.stdout:9/734: dread d0/d1b/f82 [0,4194304] 0
2026-03-09T19:27:40.926 INFO:tasks.workunit.client.1.vm08.stdout:3/787: creat d0/d52/d6d/d77/ddf/ffa x:0 0 0
2026-03-09T19:27:40.929 INFO:tasks.workunit.client.1.vm08.stdout:9/735: creat d0/d2/d80/d69/ff2 x:0 0 0
2026-03-09T19:27:40.931 INFO:tasks.workunit.client.1.vm08.stdout:3/788: dread - d0/d8/d19/fcc zero size
2026-03-09T19:27:40.932 INFO:tasks.workunit.client.1.vm08.stdout:3/789: chown d0/d4b/f74 1719 1
2026-03-09T19:27:40.945 INFO:tasks.workunit.client.1.vm08.stdout:3/790: mkdir d0/d8/d24/dfb 0
2026-03-09T19:27:40.946 INFO:tasks.workunit.client.1.vm08.stdout:3/791: stat d0/d6/de/d1b/d16/d17 0
2026-03-09T19:27:40.949 INFO:tasks.workunit.client.0.vm07.stdout:1/559: creat d1/d3e/db3/fba x:0 0 0
2026-03-09T19:27:40.952 INFO:tasks.workunit.client.1.vm08.stdout:3/792: mknod d0/d6/de/d1a/cfc 0
2026-03-09T19:27:40.962 INFO:tasks.workunit.client.1.vm08.stdout:3/793: mknod d0/d6/de/d1b/cfd 0
2026-03-09T19:27:40.962 INFO:tasks.workunit.client.1.vm08.stdout:9/736: sync
2026-03-09T19:27:40.965 INFO:tasks.workunit.client.1.vm08.stdout:9/737: creat d0/d2/d80/d69/ff3 x:0 0 0
2026-03-09T19:27:40.966 INFO:tasks.workunit.client.1.vm08.stdout:3/794: dread d0/d52/d6d/d77/d88/fe7 [0,4194304] 0
2026-03-09T19:27:40.968 INFO:tasks.workunit.client.0.vm07.stdout:8/539: mknod d7/d16/cc9 0
2026-03-09T19:27:40.969 INFO:tasks.workunit.client.1.vm08.stdout:3/795: fdatasync d0/d6/f39 0
2026-03-09T19:27:40.979 INFO:tasks.workunit.client.0.vm07.stdout:3/576: creat d1/d3d/d47/db3/fb5 x:0 0 0
2026-03-09T19:27:40.982 INFO:tasks.workunit.client.0.vm07.stdout:5/490: dwrite d3/d1a/d28/d48/f50 [0,4194304] 0
2026-03-09T19:27:40.992 INFO:tasks.workunit.client.0.vm07.stdout:3/577: truncate d1/d6/dd/f3b 4671263 0
2026-03-09T19:27:41.006 INFO:tasks.workunit.client.1.vm08.stdout:6/756: rmdir d3/d94/d113 0
2026-03-09T19:27:41.016 INFO:tasks.workunit.client.1.vm08.stdout:7/785: rename d5/d14/dae/d3a/d42/d6a to d5/d14/dae/dd1/d109 0
2026-03-09T19:27:41.019 INFO:tasks.workunit.client.1.vm08.stdout:6/757: dread d3/d15/f40 [0,4194304] 0
2026-03-09T19:27:41.025 INFO:tasks.workunit.client.1.vm08.stdout:7/786: creat d5/d14/d27/d78/dc7/f10a x:0 0 0
2026-03-09T19:27:41.027 INFO:tasks.workunit.client.1.vm08.stdout:6/758: symlink d3/d15/l11a 0
2026-03-09T19:27:41.033 INFO:tasks.workunit.client.1.vm08.stdout:6/759: read d3/db/d43/f51 [312639,3355] 0
2026-03-09T19:27:41.039 INFO:tasks.workunit.client.1.vm08.stdout:7/787: dread d5/d14/dae/d3a/d42/d85/f19 [0,4194304] 0
2026-03-09T19:27:41.048 INFO:tasks.workunit.client.1.vm08.stdout:7/788: link d5/d14/d27/d54/dfb/d90/l93 d5/d14/d27/d54/d107/l10b 0
2026-03-09T19:27:41.050 INFO:tasks.workunit.client.1.vm08.stdout:4/702: dwrite da/d10/d16/d28/d2f/d4f/d64/f6f [0,4194304] 0
2026-03-09T19:27:41.051 INFO:tasks.workunit.client.1.vm08.stdout:4/703: dread - da/d10/d16/d28/d2f/d4f/d64/d81/fb2 zero size
2026-03-09T19:27:41.052 INFO:tasks.workunit.client.1.vm08.stdout:7/789: creat d5/dc4/f10c x:0 0 0
2026-03-09T19:27:41.056 INFO:tasks.workunit.client.0.vm07.stdout:2/590: write d3/dd/d16/d30/da7/dad/fb6 [110950,6154] 0
2026-03-09T19:27:41.057 INFO:tasks.workunit.client.1.vm08.stdout:1/853: dwrite d9/da/d12/f72 [0,4194304] 0
2026-03-09T19:27:41.068 INFO:tasks.workunit.client.0.vm07.stdout:2/591: truncate d3/dd/d16/d29/f58 2497583 0
2026-03-09T19:27:41.068 INFO:tasks.workunit.client.0.vm07.stdout:4/500: write d3/d11/d16/d2f/d22/f7a [915611,17002] 0
2026-03-09T19:27:41.069 INFO:tasks.workunit.client.0.vm07.stdout:7/519: write d0/d4/d5/d8/f15 [3229980,94308] 0
2026-03-09T19:27:41.070 INFO:tasks.workunit.client.0.vm07.stdout:7/520: read - d0/d4/d5/d26/d3c/d39/f97 zero size
2026-03-09T19:27:41.078 INFO:tasks.workunit.client.0.vm07.stdout:2/592: mkdir d3/dd/d16/d29/d3c/dcf 0
2026-03-09T19:27:41.078 INFO:tasks.workunit.client.0.vm07.stdout:4/501: unlink d3/d11/f1e 0
2026-03-09T19:27:41.079 INFO:tasks.workunit.client.0.vm07.stdout:2/593: write d3/d49/f82 [4321000,9041] 0
2026-03-09T19:27:41.080 INFO:tasks.workunit.client.0.vm07.stdout:4/502: write d3/d4f/d56/d5f/f72 [238553,51168] 0
2026-03-09T19:27:41.082 INFO:tasks.workunit.client.0.vm07.stdout:6/467: dwrite d0/d1/db/d1d/f2e [0,4194304] 0
2026-03-09T19:27:41.084 INFO:tasks.workunit.client.0.vm07.stdout:6/468: chown d0/d2d/c5d 109621 1
2026-03-09T19:27:41.089 INFO:tasks.workunit.client.0.vm07.stdout:7/521: creat d0/d4/d5/d8/d1a/d2a/faf x:0 0 0
2026-03-09T19:27:41.108 INFO:tasks.workunit.client.0.vm07.stdout:4/503: creat d3/d11/d29/d34/d50/fb1 x:0 0 0
2026-03-09T19:27:41.112 INFO:tasks.workunit.client.0.vm07.stdout:7/522: truncate d0/d52/d54/f9e 1008211 0
2026-03-09T19:27:41.113 INFO:tasks.workunit.client.1.vm08.stdout:1/854: link d9/l59 d9/d11/db6/l104 0
2026-03-09T19:27:41.113 INFO:tasks.workunit.client.0.vm07.stdout:6/469: sync
2026-03-09T19:27:41.121 INFO:tasks.workunit.client.0.vm07.stdout:2/594: rename d3/dd/d16/d29/d2d/d45/d85/l88 to d3/dd/d16/d29/d2d/d45/d3b/d44/d97/ld0 0
2026-03-09T19:27:41.121 INFO:tasks.workunit.client.0.vm07.stdout:2/595: stat d3/dd/d16/d29/fa3 0
2026-03-09T19:27:41.125 INFO:tasks.workunit.client.0.vm07.stdout:7/523: creat d0/d4/d5/d8/d41/fb0 x:0 0 0
2026-03-09T19:27:41.138 INFO:tasks.workunit.client.0.vm07.stdout:7/524: readlink d0/l8a 0
2026-03-09T19:27:41.139 INFO:tasks.workunit.client.0.vm07.stdout:5/491: mknod d3/c94 0
2026-03-09T19:27:41.142 INFO:tasks.workunit.client.0.vm07.stdout:2/596: creat d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4/fd1 x:0 0 0
2026-03-09T19:27:41.147 INFO:tasks.workunit.client.0.vm07.stdout:2/597: fdatasync d3/f27 0
2026-03-09T19:27:41.152 INFO:tasks.workunit.client.0.vm07.stdout:7/525: mkdir d0/d80/db1 0
2026-03-09T19:27:41.159 INFO:tasks.workunit.client.1.vm08.stdout:5/690: creat d16/d45/fdd x:0 0 0
2026-03-09T19:27:41.159 INFO:tasks.workunit.client.0.vm07.stdout:8/540: fsync d7/f1c 0
2026-03-09T19:27:41.166 INFO:tasks.workunit.client.1.vm08.stdout:5/691: chown c8 13 1
2026-03-09T19:27:41.167 INFO:tasks.workunit.client.0.vm07.stdout:9/532: rmdir d0 39
2026-03-09T19:27:41.168 INFO:tasks.workunit.client.0.vm07.stdout:7/526: chown d0/d4/d5/d8/d41/d64/d74/d98/l19 16104189 1
2026-03-09T19:27:41.169 INFO:tasks.workunit.client.0.vm07.stdout:5/492: mkdir d3/dd/d95 0
2026-03-09T19:27:41.169 INFO:tasks.workunit.client.0.vm07.stdout:7/527: dread - d0/d4/d5/d26/d3c/d39/f97 zero size
2026-03-09T19:27:41.169 INFO:tasks.workunit.client.1.vm08.stdout:5/692: creat d16/d45/d81/fde x:0 0 0
2026-03-09T19:27:41.169 INFO:tasks.workunit.client.0.vm07.stdout:7/528: stat d0/d52/d54/l5b 0
2026-03-09T19:27:41.177 INFO:tasks.workunit.client.1.vm08.stdout:8/723: rename de/d1d/f59 to de/d25/d31/d82/d6d/d99/dde/ffb 0
2026-03-09T19:27:41.182 INFO:tasks.workunit.client.0.vm07.stdout:7/529: creat d0/d4/d5/d8/d1a/d2a/fb2 x:0 0 0
2026-03-09T19:27:41.183 INFO:tasks.workunit.client.0.vm07.stdout:7/530: chown d0/d4/d5/d8/d41/f73 2324 1
2026-03-09T19:27:41.184 INFO:tasks.workunit.client.0.vm07.stdout:0/487: dwrite d0/f1e [0,4194304] 0
2026-03-09T19:27:41.193 INFO:tasks.workunit.client.1.vm08.stdout:8/724: mkdir de/d25/d87/dc9/dfc 0
2026-03-09T19:27:41.196 INFO:tasks.workunit.client.1.vm08.stdout:9/738: rename d0/d2/d14/d98/faf to d0/d2/d14/d98/d99/dea/ff4 0
2026-03-09T19:27:41.201 INFO:tasks.workunit.client.1.vm08.stdout:3/796: rename d0/d6/de/d6e/d51/cda to d0/d52/d7c/d7e/cfe 0
2026-03-09T19:27:41.202 INFO:tasks.workunit.client.0.vm07.stdout:5/493: truncate d3/f4e 3133702 0
2026-03-09T19:27:41.203 INFO:tasks.workunit.client.1.vm08.stdout:9/739: fsync d0/d2/d80/d69/f93 0
2026-03-09T19:27:41.207 INFO:tasks.workunit.client.1.vm08.stdout:9/740: dwrite d0/d2/d80/de5/da2/da8/de8/f61 [4194304,4194304] 0
2026-03-09T19:27:41.230 INFO:tasks.workunit.client.1.vm08.stdout:7/790: rename d5/d14/dae/dd1/d109/f62 to d5/d14/d2b/daa/f10d 0
2026-03-09T19:27:41.238 INFO:tasks.workunit.client.0.vm07.stdout:0/488: rmdir d0/d6/d13/d1c/d11/d56 39
2026-03-09T19:27:41.251 INFO:tasks.workunit.client.0.vm07.stdout:9/533: unlink d0/db/d29/d32/l63 0
2026-03-09T19:27:41.267 INFO:tasks.workunit.client.0.vm07.stdout:9/534: truncate d0/db/fb0 471860 0
2026-03-09T19:27:41.267 INFO:tasks.workunit.client.0.vm07.stdout:9/535: chown d0/db/d29/d2c/d36/d5a 58226705 1
2026-03-09T19:27:41.269 INFO:tasks.workunit.client.1.vm08.stdout:7/791: mknod d5/d14/dae/dd1/d109/d8f/c10e 0
2026-03-09T19:27:41.279 INFO:tasks.workunit.client.1.vm08.stdout:0/739: write dd/d22/d7b/f83 [3851950,72121] 0
2026-03-09T19:27:41.279 INFO:tasks.workunit.client.0.vm07.stdout:9/536: dread d0/db/f1d [0,4194304] 0
2026-03-09T19:27:41.280 INFO:tasks.workunit.client.1.vm08.stdout:0/740: chown dd/d22/d7b/d82/cb5 56433327 1
2026-03-09T19:27:41.303 INFO:tasks.workunit.client.1.vm08.stdout:2/642: dwrite f2 [0,4194304] 0
2026-03-09T19:27:41.304 INFO:tasks.workunit.client.1.vm08.stdout:2/643: fdatasync d3/d4/f8 0
2026-03-09T19:27:41.315 INFO:tasks.workunit.client.0.vm07.stdout:0/489: rmdir d0/d6/d13/d1c/d11/d56/d78/d8a 0
2026-03-09T19:27:41.319 INFO:tasks.workunit.client.0.vm07.stdout:9/537: dread d0/db/d29/d32/d5c/d69/f83 [0,4194304] 0
2026-03-09T19:27:41.319 INFO:tasks.workunit.client.0.vm07.stdout:9/538: read - d0/db/f91 zero size
2026-03-09T19:27:41.329 INFO:tasks.workunit.client.1.vm08.stdout:1/855: mknod d9/da/c105 0
2026-03-09T19:27:41.338 INFO:tasks.workunit.client.0.vm07.stdout:1/560: write d1/db/d31/d56/f97 [770079,32869] 0
2026-03-09T19:27:41.346 INFO:tasks.workunit.client.1.vm08.stdout:2/644: dread d3/d9/d79/d46/f72 [0,4194304] 0
2026-03-09T19:27:41.349 INFO:tasks.workunit.client.1.vm08.stdout:1/856: symlink d9/da/d12/d39/l106 0
2026-03-09T19:27:41.372 INFO:tasks.workunit.client.0.vm07.stdout:3/578: dwrite d1/d6/dd/f57 [0,4194304] 0
2026-03-09T19:27:41.398 INFO:tasks.workunit.client.1.vm08.stdout:1/857: creat d9/d11/d7a/d89/d8d/daa/f107 x:0 0 0
2026-03-09T19:27:41.408 INFO:tasks.workunit.client.0.vm07.stdout:4/504: mkdir d3/d11/d16/d2f/db2 0
2026-03-09T19:27:41.411 INFO:tasks.workunit.client.1.vm08.stdout:0/741: link dd/d22/d63/d93/caa dd/d22/d7b/cea 0
2026-03-09T19:27:41.416 INFO:tasks.workunit.client.1.vm08.stdout:6/760: dwrite d3/ff1 [0,4194304] 0
2026-03-09T19:27:41.441 INFO:tasks.workunit.client.1.vm08.stdout:1/858: unlink d9/da/d2d/d62/fcc 0
2026-03-09T19:27:41.453 INFO:tasks.workunit.client.1.vm08.stdout:4/704: dwrite da/f18 [0,4194304] 0
2026-03-09T19:27:41.460 INFO:tasks.workunit.client.1.vm08.stdout:4/705: dwrite da/d10/d26/d3a/d91/fb4 [0,4194304] 0
2026-03-09T19:27:41.469 INFO:tasks.workunit.client.1.vm08.stdout:0/742: dread dd/d22/d24/d49/f4c [0,4194304] 0
2026-03-09T19:27:41.480 INFO:tasks.workunit.client.1.vm08.stdout:1/859: symlink d9/da/d2d/d4e/l108 0
2026-03-09T19:27:41.492 INFO:tasks.workunit.client.0.vm07.stdout:2/598: rmdir d3/dd 39
2026-03-09T19:27:41.493 INFO:tasks.workunit.client.0.vm07.stdout:3/579: creat d1/d74/fb6 x:0 0 0
2026-03-09T19:27:41.495 INFO:tasks.workunit.client.0.vm07.stdout:4/505: rmdir d3/d11/d16 39
2026-03-09T19:27:41.496 INFO:tasks.workunit.client.0.vm07.stdout:1/561: truncate d1/d11/d37/d3f/f4a 641115 0
2026-03-09T19:27:41.497 INFO:tasks.workunit.client.0.vm07.stdout:1/562: readlink d1/db/l67 0
2026-03-09T19:27:41.499 INFO:tasks.workunit.client.0.vm07.stdout:3/580: creat d1/d3d/fb7 x:0 0 0
2026-03-09T19:27:41.501 INFO:tasks.workunit.client.0.vm07.stdout:6/470: link d0/d13/d1e/d95/f48 d0/fb6 0
2026-03-09T19:27:41.503 INFO:tasks.workunit.client.0.vm07.stdout:1/563: unlink d1/d11/d37/d3f/d6e/la6 0
2026-03-09T19:27:41.503 INFO:tasks.workunit.client.0.vm07.stdout:8/541: write d7/d9/d10/f41 [1299973,127758] 0
2026-03-09T19:27:41.505 INFO:tasks.workunit.client.1.vm08.stdout:5/693: write d16/d45/f65 [646819,4209] 0
2026-03-09T19:27:41.506 INFO:tasks.workunit.client.1.vm08.stdout:5/694: write d16/d1e/d8c/d99/da8/f8b [641087,66371] 0
2026-03-09T19:27:41.508 INFO:tasks.workunit.client.0.vm07.stdout:3/581: getdents d1/d6/d45/dac 0
2026-03-09T19:27:41.512 INFO:tasks.workunit.client.0.vm07.stdout:4/506: dread d3/f1a [0,4194304] 0
2026-03-09T19:27:41.515 INFO:tasks.workunit.client.1.vm08.stdout:8/725: rename de/d25/d31/d82/d6d to de/d47/dfd 0
2026-03-09T19:27:41.515 INFO:tasks.workunit.client.0.vm07.stdout:4/507: write d3/d11/f6c [3068224,725] 0
2026-03-09T19:27:41.521 INFO:tasks.workunit.client.1.vm08.stdout:5/695: symlink d16/d45/daf/ldf 0
2026-03-09T19:27:41.526 INFO:tasks.workunit.client.1.vm08.stdout:8/726: mknod de/d47/dfd/d99/dde/cfe 0
2026-03-09T19:27:41.529 INFO:tasks.workunit.client.1.vm08.stdout:5/696: rmdir d16/d45/d81 39
2026-03-09T19:27:41.530 INFO:tasks.workunit.client.0.vm07.stdout:6/471: sync
2026-03-09T19:27:41.531 INFO:tasks.workunit.client.1.vm08.stdout:1/860: sync
2026-03-09T19:27:41.531 INFO:tasks.workunit.client.0.vm07.stdout:4/508: sync
2026-03-09T19:27:41.547 INFO:tasks.workunit.client.1.vm08.stdout:8/727: creat de/d91/dc8/de9/fff x:0 0 0
2026-03-09T19:27:41.552 INFO:tasks.workunit.client.0.vm07.stdout:3/582: symlink d1/d89/lb8 0
2026-03-09T19:27:41.553 INFO:tasks.workunit.client.0.vm07.stdout:3/583: chown d1/d1f/d16/cab 819259588 1
2026-03-09T19:27:41.555 INFO:tasks.workunit.client.0.vm07.stdout:7/531: write d0/d4/d5/f85 [897397,46566] 0
2026-03-09T19:27:41.564 INFO:tasks.workunit.client.1.vm08.stdout:1/861: dread d9/d11/d7a/ff1 [0,4194304] 0
2026-03-09T19:27:41.572 INFO:tasks.workunit.client.1.vm08.stdout:5/697: dread d16/d45/f5d [0,4194304] 0
2026-03-09T19:27:41.572 INFO:tasks.workunit.client.1.vm08.stdout:5/698: write d16/d1e/d9f/fd6 [904170,23877] 0
2026-03-09T19:27:41.580 INFO:tasks.workunit.client.0.vm07.stdout:6/472: fdatasync d0/d4e/f78 0
2026-03-09T19:27:41.580 INFO:tasks.workunit.client.1.vm08.stdout:5/699: read d16/d45/daf/fc5 [689710,89125] 0
2026-03-09T19:27:41.580 INFO:tasks.workunit.client.1.vm08.stdout:3/797: write d0/d52/f8a [144496,27256] 0
2026-03-09T19:27:41.582 INFO:tasks.workunit.client.1.vm08.stdout:5/700: dread - d16/d1e/d8c/d99/dcc/fd1 zero size
2026-03-09T19:27:41.590 INFO:tasks.workunit.client.1.vm08.stdout:5/701: sync
2026-03-09T19:27:41.601 INFO:tasks.workunit.client.0.vm07.stdout:5/494: dwrite d3/d1a/fb [0,4194304] 0
2026-03-09T19:27:41.602 INFO:tasks.workunit.client.1.vm08.stdout:8/728: rename de/d47/cdf to de/d25/d87/dc9/dd8/c100 0
2026-03-09T19:27:41.620 INFO:tasks.workunit.client.0.vm07.stdout:7/532: truncate d0/d4/d5/d26/f8e 592042 0
2026-03-09T19:27:41.622 INFO:tasks.workunit.client.1.vm08.stdout:7/792: write d5/d14/d27/d78/dc7/fcf [167201,40716] 0
2026-03-09T19:27:41.627 INFO:tasks.workunit.client.0.vm07.stdout:0/490: dwrite d0/d6/d13/d17/d19/d57/f6f [0,4194304] 0
2026-03-09T19:27:41.627 INFO:tasks.workunit.client.1.vm08.stdout:3/798: unlink d0/d6/de/d15/l48 0
2026-03-09T19:27:41.633 INFO:tasks.workunit.client.1.vm08.stdout:9/741: write d0/d1b/d97/d48/d6f/f84 [242536,47326] 0
2026-03-09T19:27:41.634 INFO:tasks.workunit.client.1.vm08.stdout:9/742: chown d0/d2/d14/d5c/fb0 3454 1
2026-03-09T19:27:41.639 INFO:tasks.workunit.client.1.vm08.stdout:9/743: sync
2026-03-09T19:27:41.647 INFO:tasks.workunit.client.1.vm08.stdout:2/645: dwrite d3/d9/d4a/fa4 [0,4194304] 0
2026-03-09T19:27:41.651 INFO:tasks.workunit.client.1.vm08.stdout:8/729: chown de/d47/dfd/d99/dde/ffb 5817 1
2026-03-09T19:27:41.655 INFO:tasks.workunit.client.1.vm08.stdout:6/761: dwrite d3/db/d43/d69/da0/fb7 [0,4194304] 0
2026-03-09T19:27:41.657 INFO:tasks.workunit.client.1.vm08.stdout:3/799: creat d0/d8/d19/fff x:0 0 0
2026-03-09T19:27:41.657 INFO:tasks.workunit.client.1.vm08.stdout:7/793: fsync d5/d14/dae/d1c/f87 0
2026-03-09T19:27:41.660 INFO:tasks.workunit.client.0.vm07.stdout:9/539: dwrite d0/db/d29/fb3 [0,4194304] 0
2026-03-09T19:27:41.660 INFO:tasks.workunit.client.1.vm08.stdout:0/743: write dd/d22/d24/d49/f4c [2447654,44853] 0
2026-03-09T19:27:41.661 INFO:tasks.workunit.client.0.vm07.stdout:9/540: fsync d0/db/d29/d32/fb9 0
2026-03-09T19:27:41.665 INFO:tasks.workunit.client.1.vm08.stdout:4/706: dwrite da/d10/f77 [0,4194304] 0
2026-03-09T19:27:41.685 INFO:tasks.workunit.client.0.vm07.stdout:5/495: truncate d3/dd/f8a 169999 0
2026-03-09T19:27:41.691 INFO:tasks.workunit.client.1.vm08.stdout:5/702: fdatasync d16/d1e/d30/d8a/f98 0
2026-03-09T19:27:41.692 INFO:tasks.workunit.client.0.vm07.stdout:7/533: creat d0/d4/d5/d8/d1a/d2a/fb3 x:0 0 0 2026-03-09T19:27:41.696 INFO:tasks.workunit.client.0.vm07.stdout:2/599: write d3/fc [3282659,109292] 0 2026-03-09T19:27:41.705 INFO:tasks.workunit.client.1.vm08.stdout:9/744: truncate d0/d1b/f82 44635 0 2026-03-09T19:27:41.705 INFO:tasks.workunit.client.0.vm07.stdout:7/534: dread d0/d52/d54/f6a [0,4194304] 0 2026-03-09T19:27:41.705 INFO:tasks.workunit.client.0.vm07.stdout:8/542: write d7/d9/fd [5198902,95919] 0 2026-03-09T19:27:41.705 INFO:tasks.workunit.client.0.vm07.stdout:0/491: stat d0/d6/d13/d1c/d61/d69/c8e 0 2026-03-09T19:27:41.705 INFO:tasks.workunit.client.0.vm07.stdout:1/564: truncate d1/db/d31/d56/f97 130510 0 2026-03-09T19:27:41.708 INFO:tasks.workunit.client.1.vm08.stdout:1/862: write d9/da/d2c/f8a [206800,56923] 0 2026-03-09T19:27:41.713 INFO:tasks.workunit.client.0.vm07.stdout:0/492: dwrite d0/d6/d13/d17/d19/d57/f75 [0,4194304] 0 2026-03-09T19:27:41.720 INFO:tasks.workunit.client.0.vm07.stdout:3/584: write d1/d3d/d47/db3/f6b [914761,9040] 0 2026-03-09T19:27:41.723 INFO:tasks.workunit.client.0.vm07.stdout:4/509: rename d3/d11/f87 to d3/d11/d16/d2f/d91/fb3 0 2026-03-09T19:27:41.732 INFO:tasks.workunit.client.0.vm07.stdout:9/541: read d0/db/d29/d32/d5c/f78 [1732010,32957] 0 2026-03-09T19:27:41.735 INFO:tasks.workunit.client.0.vm07.stdout:7/535: mkdir d0/d52/db4 0 2026-03-09T19:27:41.736 INFO:tasks.workunit.client.0.vm07.stdout:8/543: dread - d7/d50/fa0 zero size 2026-03-09T19:27:41.736 INFO:tasks.workunit.client.0.vm07.stdout:8/544: chown d7/d30/fb7 752 1 2026-03-09T19:27:41.742 INFO:tasks.workunit.client.0.vm07.stdout:8/545: dwrite d7/d9/d37/d45/d56/d62/fc8 [0,4194304] 0 2026-03-09T19:27:41.745 INFO:tasks.workunit.client.0.vm07.stdout:0/493: rmdir d0/d6/d13/d1c/d11/d56 39 2026-03-09T19:27:41.750 INFO:tasks.workunit.client.0.vm07.stdout:8/546: dread - d7/d50/fa0 zero size 2026-03-09T19:27:41.759 INFO:tasks.workunit.client.0.vm07.stdout:1/565: 
dread d1/d11/d37/d5d/d50/f63 [0,4194304] 0 2026-03-09T19:27:41.762 INFO:tasks.workunit.client.0.vm07.stdout:9/542: creat d0/db/d29/d2c/d36/d7d/fc0 x:0 0 0 2026-03-09T19:27:41.763 INFO:tasks.workunit.client.0.vm07.stdout:2/600: creat d3/dd/d16/d29/d2d/d45/d85/d8a/fd2 x:0 0 0 2026-03-09T19:27:41.765 INFO:tasks.workunit.client.0.vm07.stdout:9/543: write d0/db/d29/d2c/d36/d7d/fc0 [11265,110050] 0 2026-03-09T19:27:41.775 INFO:tasks.workunit.client.0.vm07.stdout:4/510: dread d3/fc [0,4194304] 0 2026-03-09T19:27:41.783 INFO:tasks.workunit.client.0.vm07.stdout:6/473: write d0/fe [1788682,129908] 0 2026-03-09T19:27:41.788 INFO:tasks.workunit.client.1.vm08.stdout:8/730: dread - de/d25/d31/d82/f96 zero size 2026-03-09T19:27:41.813 INFO:tasks.workunit.client.0.vm07.stdout:8/547: truncate d7/d9/d37/d45/d56/f5f 4635214 0 2026-03-09T19:27:41.814 INFO:tasks.workunit.client.0.vm07.stdout:5/496: write d3/d1a/d5d/f80 [291512,85583] 0 2026-03-09T19:27:41.816 INFO:tasks.workunit.client.0.vm07.stdout:5/497: chown d3/dd/d26/c6f 14 1 2026-03-09T19:27:41.816 INFO:tasks.workunit.client.0.vm07.stdout:9/544: dread d0/db/fb0 [0,4194304] 0 2026-03-09T19:27:41.817 INFO:tasks.workunit.client.0.vm07.stdout:9/545: stat d0/d6/c15 0 2026-03-09T19:27:41.818 INFO:tasks.workunit.client.1.vm08.stdout:0/744: dread dd/d31/f54 [0,4194304] 0 2026-03-09T19:27:41.821 INFO:tasks.workunit.client.0.vm07.stdout:9/546: dread d0/db/d29/d32/d5c/d69/f83 [0,4194304] 0 2026-03-09T19:27:41.824 INFO:tasks.workunit.client.0.vm07.stdout:6/474: mknod d0/d13/d1e/d95/d31/cb7 0 2026-03-09T19:27:41.825 INFO:tasks.workunit.client.0.vm07.stdout:3/585: write d1/d6/d4c/f61 [1132227,25512] 0 2026-03-09T19:27:41.829 INFO:tasks.workunit.client.0.vm07.stdout:0/494: creat d0/d6/d13/d17/d19/d57/d9e/fa3 x:0 0 0 2026-03-09T19:27:41.830 INFO:tasks.workunit.client.0.vm07.stdout:0/495: chown d0/d6/d13/d1c/d11/f5f 3 1 2026-03-09T19:27:41.831 INFO:tasks.workunit.client.1.vm08.stdout:1/863: dread d9/da/d12/f98 [8388608,4194304] 0 
2026-03-09T19:27:41.839 INFO:tasks.workunit.client.1.vm08.stdout:1/864: dread d9/da/f8e [0,4194304] 0 2026-03-09T19:27:41.842 INFO:tasks.workunit.client.0.vm07.stdout:1/566: dwrite d1/d11/f48 [0,4194304] 0 2026-03-09T19:27:41.842 INFO:tasks.workunit.client.1.vm08.stdout:9/745: write d0/d1b/d97/fca [167843,40871] 0 2026-03-09T19:27:41.844 INFO:tasks.workunit.client.1.vm08.stdout:7/794: symlink d5/d14/dae/d1c/l10f 0 2026-03-09T19:27:41.845 INFO:tasks.workunit.client.0.vm07.stdout:1/567: chown d1/d11/d37/d3f/d45/d87/cb1 1911127 1 2026-03-09T19:27:41.849 INFO:tasks.workunit.client.0.vm07.stdout:1/568: chown d1/d3/d21 32648649 1 2026-03-09T19:27:41.854 INFO:tasks.workunit.client.0.vm07.stdout:8/548: dread d7/d9/d10/d44/d9a/fa1 [0,4194304] 0 2026-03-09T19:27:41.855 INFO:tasks.workunit.client.0.vm07.stdout:8/549: chown d7/d1d/d83 1485101415 1 2026-03-09T19:27:41.855 INFO:tasks.workunit.client.1.vm08.stdout:4/707: mknod da/d10/d26/d50/db0/cce 0 2026-03-09T19:27:41.859 INFO:tasks.workunit.client.0.vm07.stdout:5/498: dread d3/d1a/d28/d36/f61 [0,4194304] 0 2026-03-09T19:27:41.862 INFO:tasks.workunit.client.0.vm07.stdout:9/547: read d0/d6/d3a/f89 [767201,74217] 0 2026-03-09T19:27:41.868 INFO:tasks.workunit.client.1.vm08.stdout:3/800: write d0/d52/f5c [3359863,18641] 0 2026-03-09T19:27:41.869 INFO:tasks.workunit.client.1.vm08.stdout:3/801: dread - d0/d6/de/d6e/d51/fb5 zero size 2026-03-09T19:27:41.874 INFO:tasks.workunit.client.0.vm07.stdout:6/475: mknod d0/d44/cb8 0 2026-03-09T19:27:41.878 INFO:tasks.workunit.client.1.vm08.stdout:2/646: creat d3/d4/d23/d2c/d39/d5e/fd9 x:0 0 0 2026-03-09T19:27:41.878 INFO:tasks.workunit.client.1.vm08.stdout:2/647: chown d3 22411 1 2026-03-09T19:27:41.879 INFO:tasks.workunit.client.1.vm08.stdout:2/648: write d3/d9/d26/f6a [1432689,100143] 0 2026-03-09T19:27:41.885 INFO:tasks.workunit.client.1.vm08.stdout:5/703: write d16/d1e/d3b/f43 [5840121,82738] 0 2026-03-09T19:27:41.885 INFO:tasks.workunit.client.0.vm07.stdout:0/496: dread 
d0/d6/d13/d17/d19/d57/d6a/f74 [0,4194304] 0 2026-03-09T19:27:41.887 INFO:tasks.workunit.client.1.vm08.stdout:5/704: read f1 [1981203,5730] 0 2026-03-09T19:27:41.887 INFO:tasks.workunit.client.0.vm07.stdout:3/586: dwrite d1/d3d/f5e [0,4194304] 0 2026-03-09T19:27:41.920 INFO:tasks.workunit.client.1.vm08.stdout:9/746: mkdir d0/d1b/d97/d48/d6f/df5 0 2026-03-09T19:27:41.922 INFO:tasks.workunit.client.1.vm08.stdout:5/705: sync 2026-03-09T19:27:41.926 INFO:tasks.workunit.client.1.vm08.stdout:7/795: creat d5/d14/d27/d78/dc7/f110 x:0 0 0 2026-03-09T19:27:41.930 INFO:tasks.workunit.client.0.vm07.stdout:4/511: creat d3/d11/d16/d2f/fb4 x:0 0 0 2026-03-09T19:27:41.937 INFO:tasks.workunit.client.1.vm08.stdout:0/745: mknod dd/d9d/dcc/ceb 0 2026-03-09T19:27:41.943 INFO:tasks.workunit.client.0.vm07.stdout:9/548: rename d0/d6/d3a/d7e to d0/dc1 0 2026-03-09T19:27:41.945 INFO:tasks.workunit.client.1.vm08.stdout:3/802: unlink d0/d6/de/d6e/d51/c9b 0 2026-03-09T19:27:41.946 INFO:tasks.workunit.client.0.vm07.stdout:1/569: write d1/d3e/db3/d6d/f85 [309499,120496] 0 2026-03-09T19:27:41.951 INFO:tasks.workunit.client.1.vm08.stdout:2/649: fdatasync d3/d9/f84 0 2026-03-09T19:27:41.953 INFO:tasks.workunit.client.0.vm07.stdout:0/497: rmdir d0 39 2026-03-09T19:27:41.954 INFO:tasks.workunit.client.0.vm07.stdout:5/499: dread d3/d1a/d28/f2e [0,4194304] 0 2026-03-09T19:27:41.956 INFO:tasks.workunit.client.0.vm07.stdout:3/587: rmdir d1/d1f/d16/d28 39 2026-03-09T19:27:41.957 INFO:tasks.workunit.client.1.vm08.stdout:6/762: link d3/d34/f100 d3/d34/f11b 0 2026-03-09T19:27:41.958 INFO:tasks.workunit.client.1.vm08.stdout:1/865: symlink d9/d11/d7a/d89/de7/d101/l109 0 2026-03-09T19:27:41.961 INFO:tasks.workunit.client.0.vm07.stdout:2/601: getdents d3/dd/d16/d29/d2d/d45/d3b/dae 0 2026-03-09T19:27:41.963 INFO:tasks.workunit.client.1.vm08.stdout:5/706: fsync d16/d45/d81/fde 0 2026-03-09T19:27:41.963 INFO:tasks.workunit.client.1.vm08.stdout:5/707: dread - d16/d1e/d30/fc2 zero size 2026-03-09T19:27:41.968 
INFO:tasks.workunit.client.0.vm07.stdout:7/536: link d0/d4/l59 d0/d4/d5/lb5 0 2026-03-09T19:27:41.969 INFO:tasks.workunit.client.1.vm08.stdout:0/746: truncate dd/d31/fac 1576070 0 2026-03-09T19:27:41.970 INFO:tasks.workunit.client.0.vm07.stdout:1/570: readlink d1/d11/d37/d3f/d45/l2d 0 2026-03-09T19:27:41.972 INFO:tasks.workunit.client.1.vm08.stdout:3/803: creat d0/d6/de/d1b/d16/f100 x:0 0 0 2026-03-09T19:27:41.973 INFO:tasks.workunit.client.0.vm07.stdout:8/550: write d7/d9/f36 [78784,2903] 0 2026-03-09T19:27:41.973 INFO:tasks.workunit.client.0.vm07.stdout:8/551: chown d7/d50/f6f 922772473 1 2026-03-09T19:27:41.974 INFO:tasks.workunit.client.0.vm07.stdout:8/552: dread - d7/d9/d37/d45/d56/d62/fc3 zero size 2026-03-09T19:27:41.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:41 vm07.local ceph-mon[48545]: pgmap v175: 65 pgs: 65 active+clean; 2.6 GiB data, 9.0 GiB used, 111 GiB / 120 GiB avail; 32 MiB/s rd, 73 MiB/s wr, 190 op/s 2026-03-09T19:27:41.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:41 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:27:41.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:41 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:27:41.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:41 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:27:41.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:41 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:27:41.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:41 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 
2026-03-09T19:27:41.982 INFO:tasks.workunit.client.1.vm08.stdout:4/708: dwrite da/d10/d1b/f37 [0,4194304] 0 2026-03-09T19:27:41.987 INFO:tasks.workunit.client.0.vm07.stdout:4/512: dread d3/d4f/f7c [0,4194304] 0 2026-03-09T19:27:41.997 INFO:tasks.workunit.client.0.vm07.stdout:4/513: dread d3/d11/d29/f3c [0,4194304] 0 2026-03-09T19:27:41.998 INFO:tasks.workunit.client.0.vm07.stdout:6/476: symlink d0/d1/d28/da9/lb9 0 2026-03-09T19:27:41.998 INFO:tasks.workunit.client.0.vm07.stdout:6/477: stat d0/d1/db/d52/d94 0 2026-03-09T19:27:41.999 INFO:tasks.workunit.client.1.vm08.stdout:2/650: chown d3/cd3 6767 1 2026-03-09T19:27:42.000 INFO:tasks.workunit.client.1.vm08.stdout:8/731: link de/l13 de/d47/dfd/d99/da5/db3/l101 0 2026-03-09T19:27:42.001 INFO:tasks.workunit.client.0.vm07.stdout:3/588: mknod d1/d6/d71/cb9 0 2026-03-09T19:27:42.001 INFO:tasks.workunit.client.1.vm08.stdout:6/763: symlink d3/d94/l11c 0 2026-03-09T19:27:42.004 INFO:tasks.workunit.client.1.vm08.stdout:1/866: read - d9/da/d12/d39/fbb zero size 2026-03-09T19:27:42.006 INFO:tasks.workunit.client.1.vm08.stdout:9/747: symlink d0/d1b/d97/d48/d6f/df5/lf6 0 2026-03-09T19:27:42.008 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:41 vm08.local ceph-mon[57794]: pgmap v175: 65 pgs: 65 active+clean; 2.6 GiB data, 9.0 GiB used, 111 GiB / 120 GiB avail; 32 MiB/s rd, 73 MiB/s wr, 190 op/s 2026-03-09T19:27:42.008 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:41 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:27:42.008 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:41 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:27:42.008 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:41 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:27:42.008 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:41 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:27:42.008 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:41 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' 2026-03-09T19:27:42.016 INFO:tasks.workunit.client.1.vm08.stdout:5/708: rename d16/d45/f55 to d16/d1e/d8c/d99/da8/d9a/fe0 0 2026-03-09T19:27:42.021 INFO:tasks.workunit.client.1.vm08.stdout:7/796: creat d5/d14/dae/d3a/d42/d85/da0/df5/f111 x:0 0 0 2026-03-09T19:27:42.024 INFO:tasks.workunit.client.1.vm08.stdout:0/747: symlink dd/d31/lec 0 2026-03-09T19:27:42.026 INFO:tasks.workunit.client.1.vm08.stdout:3/804: fsync d0/d4b/fdc 0 2026-03-09T19:27:42.028 INFO:tasks.workunit.client.0.vm07.stdout:6/478: rename d0/d4e/d75/c93 to d0/d13/cba 0 2026-03-09T19:27:42.031 INFO:tasks.workunit.client.1.vm08.stdout:5/709: sync 2026-03-09T19:27:42.032 INFO:tasks.workunit.client.1.vm08.stdout:7/797: sync 2026-03-09T19:27:42.032 INFO:tasks.workunit.client.1.vm08.stdout:0/748: sync 2026-03-09T19:27:42.037 INFO:tasks.workunit.client.0.vm07.stdout:1/571: write d1/f76 [1623155,117382] 0 2026-03-09T19:27:42.041 INFO:tasks.workunit.client.0.vm07.stdout:0/498: chown d0/d6/d13/d1c/d50/d92/c98 121 1 2026-03-09T19:27:42.041 INFO:tasks.workunit.client.0.vm07.stdout:2/602: dwrite d3/dd/d16/d30/f3a [4194304,4194304] 0 2026-03-09T19:27:42.057 INFO:tasks.workunit.client.0.vm07.stdout:8/553: write d7/d9/d10/d44/f6c [334476,50223] 0 2026-03-09T19:27:42.057 INFO:tasks.workunit.client.0.vm07.stdout:4/514: write d3/d4f/d56/f7f [983834,89782] 0 2026-03-09T19:27:42.061 INFO:tasks.workunit.client.0.vm07.stdout:4/515: dwrite d3/f13 [0,4194304] 0 2026-03-09T19:27:42.071 INFO:tasks.workunit.client.0.vm07.stdout:4/516: chown d3/d11/d16/d2f/d22/d70/d99 6 1 2026-03-09T19:27:42.071 
INFO:tasks.workunit.client.0.vm07.stdout:3/589: symlink d1/d3d/d47/db3/lba 0 2026-03-09T19:27:42.072 INFO:tasks.workunit.client.0.vm07.stdout:0/499: sync 2026-03-09T19:27:42.076 INFO:tasks.workunit.client.0.vm07.stdout:0/500: sync 2026-03-09T19:27:42.077 INFO:tasks.workunit.client.0.vm07.stdout:0/501: write d0/d6/d13/f6c [5501597,53776] 0 2026-03-09T19:27:42.085 INFO:tasks.workunit.client.0.vm07.stdout:5/500: write d3/dd/f8a [980844,37787] 0 2026-03-09T19:27:42.086 INFO:tasks.workunit.client.0.vm07.stdout:5/501: readlink d3/d1a/d28/d40/l55 0 2026-03-09T19:27:42.088 INFO:tasks.workunit.client.0.vm07.stdout:3/590: dread d1/d6/fb [4194304,4194304] 0 2026-03-09T19:27:42.089 INFO:tasks.workunit.client.1.vm08.stdout:1/867: rename d9/d11/d7a/d89/d8d/daa/cc7 to d9/da/d53/d67/d6c/d76/c10a 0 2026-03-09T19:27:42.097 INFO:tasks.workunit.client.1.vm08.stdout:4/709: mknod da/d10/d16/d28/d2f/d4f/d64/d84/d8a/da2/ccf 0 2026-03-09T19:27:42.100 INFO:tasks.workunit.client.1.vm08.stdout:4/710: chown da/d10/d16/d28/d2f/d4f/d64/d81/f86 67 1 2026-03-09T19:27:42.100 INFO:tasks.workunit.client.1.vm08.stdout:4/711: chown da/d10/d26/d50/db0/cce 3 1 2026-03-09T19:27:42.101 INFO:tasks.workunit.client.1.vm08.stdout:5/710: creat d16/d1e/d3b/fe1 x:0 0 0 2026-03-09T19:27:42.103 INFO:tasks.workunit.client.0.vm07.stdout:6/479: mknod d0/d4e/d75/cbb 0 2026-03-09T19:27:42.105 INFO:tasks.workunit.client.1.vm08.stdout:7/798: truncate d5/d14/dae/d3a/d42/ff2 570278 0 2026-03-09T19:27:42.110 INFO:tasks.workunit.client.1.vm08.stdout:8/732: write de/d1d/d2e/d5f/f80 [79447,125051] 0 2026-03-09T19:27:42.110 INFO:tasks.workunit.client.1.vm08.stdout:3/805: write d0/d8/fc2 [241576,10624] 0 2026-03-09T19:27:42.112 INFO:tasks.workunit.client.0.vm07.stdout:9/549: dwrite d0/d17/f5e [0,4194304] 0 2026-03-09T19:27:42.112 INFO:tasks.workunit.client.1.vm08.stdout:8/733: chown de/d47/dfd/d99/da0/fda 377023246 1 2026-03-09T19:27:42.112 INFO:tasks.workunit.client.1.vm08.stdout:2/651: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/f93 
[0,4194304] 0 2026-03-09T19:27:42.135 INFO:tasks.workunit.client.0.vm07.stdout:4/517: creat d3/d11/d2b/fb5 x:0 0 0 2026-03-09T19:27:42.141 INFO:tasks.workunit.client.0.vm07.stdout:7/537: rmdir d0/d4/d5/d26/d3c/dac 0 2026-03-09T19:27:42.155 INFO:tasks.workunit.client.0.vm07.stdout:0/502: dwrite d0/d6/d13/d1c/d11/d56/f67 [0,4194304] 0 2026-03-09T19:27:42.155 INFO:tasks.workunit.client.0.vm07.stdout:5/502: truncate d3/d1a/d28/f3c 1647842 0 2026-03-09T19:27:42.165 INFO:tasks.workunit.client.1.vm08.stdout:0/749: creat dd/d22/d27/d65/ddf/fed x:0 0 0 2026-03-09T19:27:42.169 INFO:tasks.workunit.client.0.vm07.stdout:6/480: readlink d0/d1/db/d52/l9f 0 2026-03-09T19:27:42.171 INFO:tasks.workunit.client.1.vm08.stdout:7/799: creat d5/dc4/f112 x:0 0 0 2026-03-09T19:27:42.177 INFO:tasks.workunit.client.0.vm07.stdout:2/603: write d3/dd/d16/d29/d2d/d45/d3b/dae/fbb [627736,68117] 0 2026-03-09T19:27:42.180 INFO:tasks.workunit.client.1.vm08.stdout:3/806: creat d0/d6/de/d6e/d51/d92/f101 x:0 0 0 2026-03-09T19:27:42.180 INFO:tasks.workunit.client.0.vm07.stdout:3/591: dwrite d1/d6/dd/f33 [0,4194304] 0 2026-03-09T19:27:42.190 INFO:tasks.workunit.client.1.vm08.stdout:2/652: write d3/d4/d23/d2c/d39/d5e/d14/f78 [2738863,11080] 0 2026-03-09T19:27:42.191 INFO:tasks.workunit.client.0.vm07.stdout:8/554: write d7/d30/d32/fba [547427,4446] 0 2026-03-09T19:27:42.191 INFO:tasks.workunit.client.1.vm08.stdout:2/653: chown d3/d4/d23/d2c/d39/d5e/de/c56 110 1 2026-03-09T19:27:42.195 INFO:tasks.workunit.client.1.vm08.stdout:8/734: fsync de/d47/dfd/d99/da5/db3/f2d 0 2026-03-09T19:27:42.203 INFO:tasks.workunit.client.1.vm08.stdout:1/868: mkdir d9/d11/d10b 0 2026-03-09T19:27:42.204 INFO:tasks.workunit.client.1.vm08.stdout:1/869: chown d9/da/d12/d39/fa7 22 1 2026-03-09T19:27:42.208 INFO:tasks.workunit.client.0.vm07.stdout:0/503: creat d0/d6/fa4 x:0 0 0 2026-03-09T19:27:42.214 INFO:tasks.workunit.client.0.vm07.stdout:4/518: dread d3/d11/d16/f77 [0,4194304] 0 2026-03-09T19:27:42.214 
INFO:tasks.workunit.client.1.vm08.stdout:4/712: rename da/d10/d26/da0 to da/d10/d16/d28/d2f/d4f/d56/dd0 0 2026-03-09T19:27:42.218 INFO:tasks.workunit.client.1.vm08.stdout:0/750: mknod dd/d22/d27/d2e/cee 0 2026-03-09T19:27:42.220 INFO:tasks.workunit.client.0.vm07.stdout:6/481: rmdir d0/d1/d28/d76 39 2026-03-09T19:27:42.221 INFO:tasks.workunit.client.1.vm08.stdout:7/800: truncate d5/f1a 1091294 0 2026-03-09T19:27:42.223 INFO:tasks.workunit.client.0.vm07.stdout:1/572: rename d1/d3/l33 to d1/d11/d37/d3f/d45/lbb 0 2026-03-09T19:27:42.227 INFO:tasks.workunit.client.1.vm08.stdout:3/807: dread d0/d8/f5b [0,4194304] 0 2026-03-09T19:27:42.228 INFO:tasks.workunit.client.1.vm08.stdout:2/654: chown d3/d4/d23/d2c/d39/d5e/de/d18/d1f/lc7 126123 1 2026-03-09T19:27:42.230 INFO:tasks.workunit.client.0.vm07.stdout:9/550: chown d0/db/d29/d4d/c8c 627612 1 2026-03-09T19:27:42.231 INFO:tasks.workunit.client.0.vm07.stdout:9/551: readlink d0/db/d29/d32/l92 0 2026-03-09T19:27:42.231 INFO:tasks.workunit.client.1.vm08.stdout:8/735: creat de/d1d/d21/d73/f102 x:0 0 0 2026-03-09T19:27:42.236 INFO:tasks.workunit.client.0.vm07.stdout:7/538: mkdir d0/d4/d5/d26/db6 0 2026-03-09T19:27:42.237 INFO:tasks.workunit.client.1.vm08.stdout:9/748: getdents d0/d1b/d97 0 2026-03-09T19:27:42.242 INFO:tasks.workunit.client.0.vm07.stdout:8/555: write d7/d9/d37/f85 [18914,99882] 0 2026-03-09T19:27:42.252 INFO:tasks.workunit.client.0.vm07.stdout:4/519: rename d3/d11/d29/d34/d50 to d3/d11/d2b/d37/db6 0 2026-03-09T19:27:42.253 INFO:tasks.workunit.client.0.vm07.stdout:1/573: write d1/d11/d37/d3f/d45/f26 [4739438,128733] 0 2026-03-09T19:27:42.254 INFO:tasks.workunit.client.0.vm07.stdout:1/574: chown d1/d3e/db3/d9a 0 1 2026-03-09T19:27:42.259 INFO:tasks.workunit.client.0.vm07.stdout:2/604: creat d3/dd/d16/d29/d3c/d5a/d7a/d74/dc9/fd3 x:0 0 0 2026-03-09T19:27:42.263 INFO:tasks.workunit.client.1.vm08.stdout:3/808: rename d0/d6/d25/cc8 to d0/d6/d93/dcb/c102 0 2026-03-09T19:27:42.268 
INFO:tasks.workunit.client.0.vm07.stdout:7/539: read d0/d4/d5/d8/d41/d64/d74/d98/f1f [152657,40575] 0 2026-03-09T19:27:42.273 INFO:tasks.workunit.client.1.vm08.stdout:2/655: write d3/d4/d23/fc0 [365670,81963] 0 2026-03-09T19:27:42.273 INFO:tasks.workunit.client.0.vm07.stdout:8/556: dwrite d7/d50/f6d [0,4194304] 0 2026-03-09T19:27:42.274 INFO:tasks.workunit.client.1.vm08.stdout:2/656: fdatasync d3/d4/d3e/fd5 0 2026-03-09T19:27:42.275 INFO:tasks.workunit.client.1.vm08.stdout:1/870: mknod d9/d11/d7a/c10c 0 2026-03-09T19:27:42.275 INFO:tasks.workunit.client.0.vm07.stdout:5/503: link d3/dd/d26/c6f d3/d1a/d5a/c96 0 2026-03-09T19:27:42.276 INFO:tasks.workunit.client.1.vm08.stdout:1/871: chown d9/da/d53/db3/le3 7240 1 2026-03-09T19:27:42.282 INFO:tasks.workunit.client.0.vm07.stdout:6/482: creat d0/d4e/d7f/fbc x:0 0 0 2026-03-09T19:27:42.282 INFO:tasks.workunit.client.0.vm07.stdout:6/483: dread - d0/d13/d1e/d95/d31/fa6 zero size 2026-03-09T19:27:42.285 INFO:tasks.workunit.client.1.vm08.stdout:5/711: getdents d16/d1e/d9b 0 2026-03-09T19:27:42.290 INFO:tasks.workunit.client.1.vm08.stdout:0/751: symlink dd/d22/d7b/lef 0 2026-03-09T19:27:42.292 INFO:tasks.workunit.client.0.vm07.stdout:1/575: creat d1/d11/fbc x:0 0 0 2026-03-09T19:27:42.292 INFO:tasks.workunit.client.0.vm07.stdout:1/576: stat d1/db/fb2 0 2026-03-09T19:27:42.294 INFO:tasks.workunit.client.1.vm08.stdout:7/801: mkdir d5/d14/dae/d1c/d113 0 2026-03-09T19:27:42.297 INFO:tasks.workunit.client.0.vm07.stdout:3/592: symlink d1/d1f/d16/d28/d7c/lbb 0 2026-03-09T19:27:42.303 INFO:tasks.workunit.client.1.vm08.stdout:3/809: readlink d0/d52/d6d/d77/l42 0 2026-03-09T19:27:42.304 INFO:tasks.workunit.client.0.vm07.stdout:7/540: rmdir d0/d4/d5/d8/d41/d64/d74/d98 39 2026-03-09T19:27:42.306 INFO:tasks.workunit.client.0.vm07.stdout:0/504: creat d0/d6/d13/d1c/fa5 x:0 0 0 2026-03-09T19:27:42.308 INFO:tasks.workunit.client.0.vm07.stdout:8/557: rmdir d7 39 2026-03-09T19:27:42.311 INFO:tasks.workunit.client.0.vm07.stdout:0/505: dwrite 
d0/d6/d13/d1c/d11/f2e [0,4194304] 0 2026-03-09T19:27:42.311 INFO:tasks.workunit.client.0.vm07.stdout:5/504: symlink d3/dd/d26/d3f/l97 0 2026-03-09T19:27:42.322 INFO:tasks.workunit.client.1.vm08.stdout:4/713: creat da/d10/d16/d28/d46/d52/d6e/fd1 x:0 0 0 2026-03-09T19:27:42.329 INFO:tasks.workunit.client.0.vm07.stdout:3/593: mknod d1/d6/d4c/d97/cbc 0 2026-03-09T19:27:42.336 INFO:tasks.workunit.client.1.vm08.stdout:8/736: rename de/d25/f64 to de/d47/f103 0 2026-03-09T19:27:42.339 INFO:tasks.workunit.client.0.vm07.stdout:3/594: dread d1/d6/d45/f5d [0,4194304] 0 2026-03-09T19:27:42.339 INFO:tasks.workunit.client.0.vm07.stdout:3/595: chown d1/f20 69 1 2026-03-09T19:27:42.340 INFO:tasks.workunit.client.0.vm07.stdout:9/552: write d0/d6/d57/f59 [1413621,86510] 0 2026-03-09T19:27:42.343 INFO:tasks.workunit.client.1.vm08.stdout:2/657: dwrite d3/d9/d4a/d9a/fc8 [0,4194304] 0 2026-03-09T19:27:42.343 INFO:tasks.workunit.client.1.vm08.stdout:9/749: dwrite d0/d1b/d97/d48/fb5 [0,4194304] 0 2026-03-09T19:27:42.360 INFO:tasks.workunit.client.0.vm07.stdout:7/541: fdatasync d0/d52/d54/fa4 0 2026-03-09T19:27:42.366 INFO:tasks.workunit.client.0.vm07.stdout:8/558: read d7/d9/d10/f41 [1604628,74484] 0 2026-03-09T19:27:42.371 INFO:tasks.workunit.client.1.vm08.stdout:1/872: creat d9/da/d95/dcd/de9/f10d x:0 0 0 2026-03-09T19:27:42.375 INFO:tasks.workunit.client.0.vm07.stdout:6/484: rename d0/d1/db/c2c to d0/d1/db/d1d/d77/cbd 0 2026-03-09T19:27:42.375 INFO:tasks.workunit.client.0.vm07.stdout:6/485: readlink d0/d1/db/d52/l9f 0 2026-03-09T19:27:42.375 INFO:tasks.workunit.client.1.vm08.stdout:7/802: rename d5/d14/dae/d3a/d42/d85/f19 to d5/d14/d27/d54/dfb/d90/f114 0 2026-03-09T19:27:42.375 INFO:tasks.workunit.client.1.vm08.stdout:8/737: mknod de/d47/dfd/d99/da0/c104 0 2026-03-09T19:27:42.377 INFO:tasks.workunit.client.1.vm08.stdout:2/658: mknod d3/d4/d23/d2c/d39/d5e/de/d18/cda 0 2026-03-09T19:27:42.378 INFO:tasks.workunit.client.1.vm08.stdout:2/659: fsync d3/d4/f55 0 2026-03-09T19:27:42.381 
INFO:tasks.workunit.client.1.vm08.stdout:4/714: mknod da/d10/d16/d28/cd2 0 2026-03-09T19:27:42.382 INFO:tasks.workunit.client.1.vm08.stdout:5/712: creat d16/d1e/d8c/d99/da8/fe2 x:0 0 0 2026-03-09T19:27:42.383 INFO:tasks.workunit.client.1.vm08.stdout:7/803: mknod d5/d14/d2b/d5d/c115 0 2026-03-09T19:27:42.384 INFO:tasks.workunit.client.1.vm08.stdout:7/804: chown d5/d14/d38/fbb 161980417 1 2026-03-09T19:27:42.384 INFO:tasks.workunit.client.1.vm08.stdout:7/805: rename d5/d14 to d5/d14/dae/d1c/db5/d116 22 2026-03-09T19:27:42.384 INFO:tasks.workunit.client.1.vm08.stdout:9/750: sync 2026-03-09T19:27:42.387 INFO:tasks.workunit.client.1.vm08.stdout:6/764: getdents d3 0 2026-03-09T19:27:42.387 INFO:tasks.workunit.client.1.vm08.stdout:4/715: truncate da/d10/f13 2048935 0 2026-03-09T19:27:42.389 INFO:tasks.workunit.client.0.vm07.stdout:4/520: link d3/d11/d16/d2f/l3a d3/d11/d16/d2f/d22/d70/lb7 0 2026-03-09T19:27:42.389 INFO:tasks.workunit.client.1.vm08.stdout:0/752: getdents dd/d22/d63/d6e/d72 0 2026-03-09T19:27:42.390 INFO:tasks.workunit.client.1.vm08.stdout:7/806: fdatasync d5/d14/d27/f35 0 2026-03-09T19:27:42.391 INFO:tasks.workunit.client.1.vm08.stdout:9/751: mknod d0/d2/d14/d98/d99/cf7 0 2026-03-09T19:27:42.395 INFO:tasks.workunit.client.0.vm07.stdout:9/553: symlink d0/d6/d57/d5d/lc2 0 2026-03-09T19:27:42.398 INFO:tasks.workunit.client.1.vm08.stdout:6/765: mkdir d3/db/d43/d11d 0 2026-03-09T19:27:42.399 INFO:tasks.workunit.client.1.vm08.stdout:4/716: dread - da/d10/d26/d27/d32/f9e zero size 2026-03-09T19:27:42.401 INFO:tasks.workunit.client.0.vm07.stdout:5/505: mkdir d3/dd/d26/d3f/d47/d71/d76/d98 0 2026-03-09T19:27:42.401 INFO:tasks.workunit.client.1.vm08.stdout:7/807: dread d5/d14/dae/ff6 [0,4194304] 0 2026-03-09T19:27:42.402 INFO:tasks.workunit.client.0.vm07.stdout:7/542: sync 2026-03-09T19:27:42.403 INFO:tasks.workunit.client.1.vm08.stdout:5/713: mkdir d16/d8e/ddb/de3 0 2026-03-09T19:27:42.404 INFO:tasks.workunit.client.0.vm07.stdout:8/559: creat 
d7/d9/d37/d45/d97/dbc/fca x:0 0 0 2026-03-09T19:27:42.406 INFO:tasks.workunit.client.0.vm07.stdout:3/596: rename d1/d6/d71/fb4 to d1/d1f/d16/d28/d7c/fbd 0 2026-03-09T19:27:42.407 INFO:tasks.workunit.client.0.vm07.stdout:6/486: mkdir d0/d4e/d7f/dbe 0 2026-03-09T19:27:42.410 INFO:tasks.workunit.client.0.vm07.stdout:1/577: creat d1/d11/d37/d3f/d7e/dad/fbd x:0 0 0 2026-03-09T19:27:42.412 INFO:tasks.workunit.client.1.vm08.stdout:2/660: creat d3/d4/d23/d2c/fdb x:0 0 0 2026-03-09T19:27:42.414 INFO:tasks.workunit.client.1.vm08.stdout:5/714: sync 2026-03-09T19:27:42.416 INFO:tasks.workunit.client.0.vm07.stdout:2/605: getdents d3/dd/d16/d29/d3c/d4c 0 2026-03-09T19:27:42.419 INFO:tasks.workunit.client.1.vm08.stdout:3/810: write d0/d52/d6d/d77/ddf/ff1 [3491,106770] 0 2026-03-09T19:27:42.423 INFO:tasks.workunit.client.1.vm08.stdout:1/873: dwrite d9/da/d12/d39/fa7 [0,4194304] 0 2026-03-09T19:27:42.429 INFO:tasks.workunit.client.1.vm08.stdout:3/811: dread d0/d52/f5c [0,4194304] 0 2026-03-09T19:27:42.441 INFO:tasks.workunit.client.1.vm08.stdout:6/766: truncate d3/d15/f2b 1772760 0 2026-03-09T19:27:42.443 INFO:tasks.workunit.client.1.vm08.stdout:8/738: write de/d25/d33/f41 [700553,51677] 0 2026-03-09T19:27:42.453 INFO:tasks.workunit.client.0.vm07.stdout:7/543: chown d0/lab 763 1 2026-03-09T19:27:42.454 INFO:tasks.workunit.client.1.vm08.stdout:8/739: fdatasync de/d1d/d2e/d5f/fec 0 2026-03-09T19:27:42.454 INFO:tasks.workunit.client.1.vm08.stdout:4/717: mkdir da/d10/d1b/dd3 0 2026-03-09T19:27:42.454 INFO:tasks.workunit.client.1.vm08.stdout:9/752: fdatasync d0/d1b/d97/d48/d5d/ddf/fd9 0 2026-03-09T19:27:42.459 INFO:tasks.workunit.client.0.vm07.stdout:3/597: write d1/d6/d45/f5d [2204983,5905] 0 2026-03-09T19:27:42.460 INFO:tasks.workunit.client.0.vm07.stdout:3/598: write d1/d6/d45/f5d [1205602,121493] 0 2026-03-09T19:27:42.468 INFO:tasks.workunit.client.1.vm08.stdout:7/808: write d5/d14/d2b/d5d/fb2 [2936691,8757] 0 2026-03-09T19:27:42.475 INFO:tasks.workunit.client.1.vm08.stdout:0/753: 
dwrite dd/d22/fba [0,4194304] 0 2026-03-09T19:27:42.476 INFO:tasks.workunit.client.0.vm07.stdout:8/560: dwrite d7/d50/f6f [0,4194304] 0 2026-03-09T19:27:42.476 INFO:tasks.workunit.client.0.vm07.stdout:1/578: mknod d1/db/d31/d56/cbe 0 2026-03-09T19:27:42.477 INFO:tasks.workunit.client.0.vm07.stdout:4/521: mknod d3/d11/cb8 0 2026-03-09T19:27:42.478 INFO:tasks.workunit.client.1.vm08.stdout:5/715: dwrite d16/d1e/d30/f3f [0,4194304] 0 2026-03-09T19:27:42.478 INFO:tasks.workunit.client.0.vm07.stdout:2/606: mkdir d3/dd/d16/d30/da7/dad/dd4 0 2026-03-09T19:27:42.479 INFO:tasks.workunit.client.1.vm08.stdout:5/716: readlink d16/d1e/d30/lb5 0 2026-03-09T19:27:42.482 INFO:tasks.workunit.client.1.vm08.stdout:3/812: chown d0/d52/d6d/f8b 332 1 2026-03-09T19:27:42.490 INFO:tasks.workunit.client.1.vm08.stdout:5/717: dwrite d16/d1e/d8c/d99/dcc/fd1 [0,4194304] 0 2026-03-09T19:27:42.499 INFO:tasks.workunit.client.0.vm07.stdout:9/554: mkdir d0/d6f/dc3 0 2026-03-09T19:27:42.503 INFO:tasks.workunit.client.1.vm08.stdout:8/740: mknod de/d47/dfd/d99/da5/db3/c105 0 2026-03-09T19:27:42.506 INFO:tasks.workunit.client.1.vm08.stdout:8/741: chown de/d47/dd4/cf6 20095 1 2026-03-09T19:27:42.516 INFO:tasks.workunit.client.0.vm07.stdout:0/506: link d0/d6/d13/d17/c21 d0/d6/d13/d1c/d52/d81/ca6 0 2026-03-09T19:27:42.526 INFO:tasks.workunit.client.1.vm08.stdout:2/661: creat d3/d4/d23/d2c/d39/d5e/de/d18/d99/dd4/fdc x:0 0 0 2026-03-09T19:27:42.529 INFO:tasks.workunit.client.0.vm07.stdout:1/579: unlink d1/d11/f83 0 2026-03-09T19:27:42.529 INFO:tasks.workunit.client.0.vm07.stdout:1/580: stat d1/d11/d37/d5d/la4 0 2026-03-09T19:27:42.534 INFO:tasks.workunit.client.0.vm07.stdout:0/507: sync 2026-03-09T19:27:42.535 INFO:tasks.workunit.client.0.vm07.stdout:0/508: write d0/d6/d13/d1c/d50/d92/f94 [571063,80055] 0 2026-03-09T19:27:42.536 INFO:tasks.workunit.client.0.vm07.stdout:0/509: truncate d0/d6/fa4 270459 0 2026-03-09T19:27:42.536 INFO:tasks.workunit.client.0.vm07.stdout:0/510: dread - d0/f9f zero size 
2026-03-09T19:27:42.540 INFO:tasks.workunit.client.0.vm07.stdout:0/511: sync 2026-03-09T19:27:42.548 INFO:tasks.workunit.client.0.vm07.stdout:4/522: readlink d3/l6 0 2026-03-09T19:27:42.554 INFO:tasks.workunit.client.0.vm07.stdout:5/506: creat d3/f99 x:0 0 0 2026-03-09T19:27:42.555 INFO:tasks.workunit.client.0.vm07.stdout:5/507: chown d3/dd/c31 845636 1 2026-03-09T19:27:42.559 INFO:tasks.workunit.client.0.vm07.stdout:5/508: dread d3/d1a/d28/d6c/f7a [0,4194304] 0 2026-03-09T19:27:42.564 INFO:tasks.workunit.client.0.vm07.stdout:6/487: rename d0/d13/d1e to d0/dbf 0 2026-03-09T19:27:42.565 INFO:tasks.workunit.client.1.vm08.stdout:1/874: write d9/da/dc/f68 [648977,15162] 0 2026-03-09T19:27:42.569 INFO:tasks.workunit.client.0.vm07.stdout:2/607: dread d3/d11/f31 [0,4194304] 0 2026-03-09T19:27:42.569 INFO:tasks.workunit.client.1.vm08.stdout:6/767: dwrite d3/d94/fb5 [0,4194304] 0 2026-03-09T19:27:42.583 INFO:tasks.workunit.client.0.vm07.stdout:3/599: write d1/d1f/f13 [2270731,17037] 0 2026-03-09T19:27:42.583 INFO:tasks.workunit.client.1.vm08.stdout:4/718: write da/fab [3572410,8000] 0 2026-03-09T19:27:42.583 INFO:tasks.workunit.client.1.vm08.stdout:9/753: write d0/d2/d80/f6a [2013496,98330] 0 2026-03-09T19:27:42.587 INFO:tasks.workunit.client.1.vm08.stdout:7/809: dwrite d5/d14/dae/d3a/d42/ff3 [0,4194304] 0 2026-03-09T19:27:42.591 INFO:tasks.workunit.client.0.vm07.stdout:7/544: write d0/d4/d5/d8/d41/d64/d74/d98/f47 [1103081,15185] 0 2026-03-09T19:27:42.597 INFO:tasks.workunit.client.0.vm07.stdout:0/512: chown d0/d6/d13/d33/l48 3 1 2026-03-09T19:27:42.598 INFO:tasks.workunit.client.1.vm08.stdout:5/718: mknod d16/d1e/d8c/d99/da8/d9a/ce4 0 2026-03-09T19:27:42.601 INFO:tasks.workunit.client.0.vm07.stdout:9/555: creat d0/d6f/dc3/fc4 x:0 0 0 2026-03-09T19:27:42.604 INFO:tasks.workunit.client.0.vm07.stdout:5/509: creat d3/d1a/d28/d6c/d72/f9a x:0 0 0 2026-03-09T19:27:42.604 INFO:tasks.workunit.client.0.vm07.stdout:5/510: read - d3/d1a/d28/d6c/d72/f9a zero size 
2026-03-09T19:27:42.607 INFO:tasks.workunit.client.0.vm07.stdout:4/523: rename d3/d11/d16/d2f to d3/d11/d29/db9 0 2026-03-09T19:27:42.613 INFO:tasks.workunit.client.0.vm07.stdout:6/488: symlink d0/d1/db/d52/d94/d81/lc0 0 2026-03-09T19:27:42.613 INFO:tasks.workunit.client.0.vm07.stdout:6/489: dread - d0/dbf/d95/f74 zero size 2026-03-09T19:27:42.613 INFO:tasks.workunit.client.0.vm07.stdout:1/581: dread d1/f1d [0,4194304] 0 2026-03-09T19:27:42.613 INFO:tasks.workunit.client.0.vm07.stdout:1/582: stat d1/d11/d37/d3f/d45/d87/lb9 0 2026-03-09T19:27:42.613 INFO:tasks.workunit.client.0.vm07.stdout:3/600: creat d1/d6/d45/fbe x:0 0 0 2026-03-09T19:27:42.615 INFO:tasks.workunit.client.1.vm08.stdout:6/768: symlink d3/d34/da9/da4/d117/l11e 0 2026-03-09T19:27:42.615 INFO:tasks.workunit.client.1.vm08.stdout:6/769: truncate d3/d68/f118 228483 0 2026-03-09T19:27:42.631 INFO:tasks.workunit.client.0.vm07.stdout:5/511: mknod d3/dd/d26/d3f/d47/d71/d76/c9b 0 2026-03-09T19:27:42.632 INFO:tasks.workunit.client.0.vm07.stdout:4/524: creat d3/d11/d2b/fba x:0 0 0 2026-03-09T19:27:42.636 INFO:tasks.workunit.client.0.vm07.stdout:6/490: mknod d0/d4e/d7f/cc1 0 2026-03-09T19:27:42.638 INFO:tasks.workunit.client.1.vm08.stdout:0/754: creat dd/d22/d24/d49/d50/d78/ff0 x:0 0 0 2026-03-09T19:27:42.640 INFO:tasks.workunit.client.1.vm08.stdout:8/742: write de/d7c/fe7 [720055,62281] 0 2026-03-09T19:27:42.642 INFO:tasks.workunit.client.1.vm08.stdout:3/813: dwrite d0/d6/d25/f56 [0,4194304] 0 2026-03-09T19:27:42.642 INFO:tasks.workunit.client.0.vm07.stdout:7/545: write d0/d4/d5/d8/f37 [1513004,83572] 0 2026-03-09T19:27:42.643 INFO:tasks.workunit.client.0.vm07.stdout:0/513: write d0/d6/d13/d1c/d50/f60 [1326331,42330] 0 2026-03-09T19:27:42.643 INFO:tasks.workunit.client.0.vm07.stdout:8/561: dwrite d7/d9/d10/f20 [0,4194304] 0 2026-03-09T19:27:42.645 INFO:tasks.workunit.client.1.vm08.stdout:5/719: write d16/d1e/d30/f53 [336219,56273] 0 2026-03-09T19:27:42.652 INFO:tasks.workunit.client.1.vm08.stdout:5/720: dread 
d16/d1e/d30/f3f [0,4194304] 0 2026-03-09T19:27:42.657 INFO:tasks.workunit.client.0.vm07.stdout:2/608: fsync d3/d11/f2e 0 2026-03-09T19:27:42.664 INFO:tasks.workunit.client.1.vm08.stdout:2/662: creat d3/d9/fdd x:0 0 0 2026-03-09T19:27:42.665 INFO:tasks.workunit.client.0.vm07.stdout:4/525: unlink d3/d11/d2b/d37/l5a 0 2026-03-09T19:27:42.667 INFO:tasks.workunit.client.1.vm08.stdout:0/755: symlink dd/d22/d63/lf1 0 2026-03-09T19:27:42.672 INFO:tasks.workunit.client.0.vm07.stdout:3/601: dwrite d1/d6/d71/f69 [0,4194304] 0 2026-03-09T19:27:42.674 INFO:tasks.workunit.client.0.vm07.stdout:7/546: mkdir d0/d4/d5/d26/d3c/d58/db7 0 2026-03-09T19:27:42.680 INFO:tasks.workunit.client.0.vm07.stdout:3/602: dwrite d1/d6/d4c/fb1 [0,4194304] 0 2026-03-09T19:27:42.688 INFO:tasks.workunit.client.0.vm07.stdout:1/583: mkdir d1/d11/d37/d3f/d45/dbf 0 2026-03-09T19:27:42.689 INFO:tasks.workunit.client.0.vm07.stdout:1/584: write d1/d11/d37/d3f/d45/d87/faf [3838666,102148] 0 2026-03-09T19:27:42.690 INFO:tasks.workunit.client.0.vm07.stdout:1/585: fdatasync d1/d11/d37/d3f/d7e/dad/fbd 0 2026-03-09T19:27:42.691 INFO:tasks.workunit.client.0.vm07.stdout:1/586: chown d1/d11/d37/d3f/d6e/d9c/faa 53269724 1 2026-03-09T19:27:42.702 INFO:tasks.workunit.client.0.vm07.stdout:9/556: creat d0/db/d29/fc5 x:0 0 0 2026-03-09T19:27:42.705 INFO:tasks.workunit.client.0.vm07.stdout:5/512: fdatasync d3/f18 0 2026-03-09T19:27:42.710 INFO:tasks.workunit.client.0.vm07.stdout:4/526: unlink d3/d11/d16/c36 0 2026-03-09T19:27:42.713 INFO:tasks.workunit.client.0.vm07.stdout:0/514: creat d0/d6/d13/d1c/d50/d92/d99/fa7 x:0 0 0 2026-03-09T19:27:42.723 INFO:tasks.workunit.client.0.vm07.stdout:3/603: mkdir d1/d6/dd/dbf 0 2026-03-09T19:27:42.729 INFO:tasks.workunit.client.1.vm08.stdout:5/721: creat d16/d8e/fe5 x:0 0 0 2026-03-09T19:27:42.731 INFO:tasks.workunit.client.0.vm07.stdout:9/557: chown d0/l14 12 1 2026-03-09T19:27:42.731 INFO:tasks.workunit.client.1.vm08.stdout:1/875: creat d9/d11/f10e x:0 0 0 2026-03-09T19:27:42.733 
INFO:tasks.workunit.client.1.vm08.stdout:6/770: symlink d3/d34/da9/dfe/l11f 0 2026-03-09T19:27:42.735 INFO:tasks.workunit.client.0.vm07.stdout:5/513: write d3/f68 [4985177,21161] 0 2026-03-09T19:27:42.738 INFO:tasks.workunit.client.1.vm08.stdout:4/719: creat da/d10/d16/d28/d2f/fd4 x:0 0 0 2026-03-09T19:27:42.743 INFO:tasks.workunit.client.1.vm08.stdout:9/754: link d0/d1b/l52 d0/d1b/d97/d48/d5d/ddf/da7/lf8 0 2026-03-09T19:27:42.743 INFO:tasks.workunit.client.0.vm07.stdout:6/491: dwrite d0/dbf/d95/d31/f89 [0,4194304] 0 2026-03-09T19:27:42.743 INFO:tasks.workunit.client.0.vm07.stdout:6/492: chown d0/d1/db/d52/l9f 203 1 2026-03-09T19:27:42.744 INFO:tasks.workunit.client.0.vm07.stdout:4/527: sync 2026-03-09T19:27:42.744 INFO:tasks.workunit.client.0.vm07.stdout:8/562: link d7/d9/d37/fb4 d7/d1d/d83/d9f/fcb 0 2026-03-09T19:27:42.746 INFO:tasks.workunit.client.0.vm07.stdout:6/493: dread d0/d1/db/d1d/f2e [0,4194304] 0 2026-03-09T19:27:42.748 INFO:tasks.workunit.client.0.vm07.stdout:6/494: chown d0/d1/db/d1d/d77/c90 22 1 2026-03-09T19:27:42.751 INFO:tasks.workunit.client.1.vm08.stdout:7/810: creat d5/d14/dae/d3a/f117 x:0 0 0 2026-03-09T19:27:42.757 INFO:tasks.workunit.client.0.vm07.stdout:7/547: symlink d0/d80/db1/lb8 0 2026-03-09T19:27:42.757 INFO:tasks.workunit.client.0.vm07.stdout:3/604: dread - d1/d1f/f6d zero size 2026-03-09T19:27:42.759 INFO:tasks.workunit.client.0.vm07.stdout:7/548: dread d0/d4/d5/d26/d3c/d39/f7a [0,4194304] 0 2026-03-09T19:27:42.763 INFO:tasks.workunit.client.1.vm08.stdout:2/663: dread d3/d4/d23/d2c/d39/d5e/fb7 [0,4194304] 0 2026-03-09T19:27:42.766 INFO:tasks.workunit.client.0.vm07.stdout:2/609: dwrite d3/dd/d16/d29/f42 [0,4194304] 0 2026-03-09T19:27:42.775 INFO:tasks.workunit.client.1.vm08.stdout:6/771: creat d3/d34/d3b/df5/f120 x:0 0 0 2026-03-09T19:27:42.776 INFO:tasks.workunit.client.0.vm07.stdout:8/563: rmdir d7/d9/d10/d44 39 2026-03-09T19:27:42.776 INFO:tasks.workunit.client.0.vm07.stdout:8/564: chown d7/d9/d37/d45/d4f/fb0 0 1 
2026-03-09T19:27:42.777 INFO:tasks.workunit.client.1.vm08.stdout:4/720: creat da/d10/d16/fd5 x:0 0 0 2026-03-09T19:27:42.778 INFO:tasks.workunit.client.1.vm08.stdout:4/721: chown da/d10/d16/d28/f44 1635 1 2026-03-09T19:27:42.780 INFO:tasks.workunit.client.1.vm08.stdout:9/755: fsync d0/d2/d80/de5/da2/da8/de8/dcd/fc6 0 2026-03-09T19:27:42.782 INFO:tasks.workunit.client.0.vm07.stdout:6/495: fsync d0/d4e/f99 0 2026-03-09T19:27:42.785 INFO:tasks.workunit.client.0.vm07.stdout:6/496: dread d0/d1/db/d17/d4c/f60 [0,4194304] 0 2026-03-09T19:27:42.785 INFO:tasks.workunit.client.1.vm08.stdout:7/811: truncate d5/d14/d27/d54/dfb/d9c/f9d 1030848 0 2026-03-09T19:27:42.789 INFO:tasks.workunit.client.0.vm07.stdout:0/515: write d0/d6/d13/d17/d19/d57/d6a/f7a [2536813,35872] 0 2026-03-09T19:27:42.790 INFO:tasks.workunit.client.0.vm07.stdout:1/587: dwrite d1/f96 [0,4194304] 0 2026-03-09T19:27:42.801 INFO:tasks.workunit.client.1.vm08.stdout:0/756: write dd/d22/d24/d49/d50/dd4/fd9 [831337,110427] 0 2026-03-09T19:27:42.802 INFO:tasks.workunit.client.1.vm08.stdout:1/876: write d9/da/d12/fac [1411056,25990] 0 2026-03-09T19:27:42.806 INFO:tasks.workunit.client.0.vm07.stdout:3/605: creat d1/d6/d4c/d97/fc0 x:0 0 0 2026-03-09T19:27:42.817 INFO:tasks.workunit.client.0.vm07.stdout:2/610: rename d3/dd/daa/cbc to d3/dd/d16/d29/d2d/d45/d3b/cd5 0 2026-03-09T19:27:42.817 INFO:tasks.workunit.client.0.vm07.stdout:6/497: creat d0/d1/db/d24/fc2 x:0 0 0 2026-03-09T19:27:42.817 INFO:tasks.workunit.client.0.vm07.stdout:1/588: creat d1/d11/d37/d3f/d6e/d9c/fc0 x:0 0 0 2026-03-09T19:27:42.817 INFO:tasks.workunit.client.1.vm08.stdout:4/722: mknod da/d10/d26/cd6 0 2026-03-09T19:27:42.817 INFO:tasks.workunit.client.1.vm08.stdout:7/812: truncate d5/d14/dae/f1f 4848997 0 2026-03-09T19:27:42.817 INFO:tasks.workunit.client.1.vm08.stdout:8/743: getdents de/d47/dfd/d99 0 2026-03-09T19:27:42.817 INFO:tasks.workunit.client.1.vm08.stdout:0/757: creat dd/d22/d24/d49/d50/dd4/ff2 x:0 0 0 2026-03-09T19:27:42.818 
INFO:tasks.workunit.client.1.vm08.stdout:3/814: getdents d0/d6/de/d6e/d51/d92 0 2026-03-09T19:27:42.821 INFO:tasks.workunit.client.0.vm07.stdout:7/549: unlink d0/f1 0 2026-03-09T19:27:42.821 INFO:tasks.workunit.client.0.vm07.stdout:7/550: read - d0/d52/d54/fa7 zero size 2026-03-09T19:27:42.825 INFO:tasks.workunit.client.0.vm07.stdout:4/528: dwrite d3/d11/d29/f42 [0,4194304] 0 2026-03-09T19:27:42.829 INFO:tasks.workunit.client.0.vm07.stdout:8/565: rename d7/d9/da7 to d7/d30/d75/dcc 0 2026-03-09T19:27:42.839 INFO:tasks.workunit.client.0.vm07.stdout:5/514: truncate d3/dd/d26/d2d/f54 1440101 0 2026-03-09T19:27:42.841 INFO:tasks.workunit.client.1.vm08.stdout:7/813: symlink d5/d14/dae/dd1/l118 0 2026-03-09T19:27:42.845 INFO:tasks.workunit.client.1.vm08.stdout:0/758: symlink dd/d22/d27/d2e/db0/lf3 0 2026-03-09T19:27:42.846 INFO:tasks.workunit.client.1.vm08.stdout:3/815: rmdir d0/d52/d6d/d77/d88 39 2026-03-09T19:27:42.851 INFO:tasks.workunit.client.1.vm08.stdout:5/722: link d16/d1e/d3b/c58 d16/d1e/ce6 0 2026-03-09T19:27:42.852 INFO:tasks.workunit.client.1.vm08.stdout:5/723: chown d16/d1e/d8c/d99/da8/cdc 1 1 2026-03-09T19:27:42.852 INFO:tasks.workunit.client.1.vm08.stdout:2/664: link d3/l4c d3/d4/d23/d2c/d39/d5e/de/d18/d99/dd4/lde 0 2026-03-09T19:27:42.852 INFO:tasks.workunit.client.0.vm07.stdout:1/589: dread d1/d11/d37/d5d/d50/f62 [0,4194304] 0 2026-03-09T19:27:42.852 INFO:tasks.workunit.client.1.vm08.stdout:2/665: read d3/d9/d4a/d9a/fc8 [3917455,4835] 0 2026-03-09T19:27:42.852 INFO:tasks.workunit.client.0.vm07.stdout:1/590: read d1/d3/d21/f2e [1964564,49977] 0 2026-03-09T19:27:42.853 INFO:tasks.workunit.client.1.vm08.stdout:2/666: chown d3/d4/d3e/d4e/d88/db0 1912438 1 2026-03-09T19:27:42.853 INFO:tasks.workunit.client.0.vm07.stdout:5/515: symlink d3/d1a/d5a/l9c 0 2026-03-09T19:27:42.854 INFO:tasks.workunit.client.0.vm07.stdout:5/516: write d3/d1a/fb [4382784,42312] 0 2026-03-09T19:27:42.858 INFO:tasks.workunit.client.0.vm07.stdout:0/516: creat d0/fa8 x:0 0 0 
2026-03-09T19:27:42.860 INFO:tasks.workunit.client.1.vm08.stdout:1/877: write d9/da/f2f [5119152,7728] 0 2026-03-09T19:27:42.861 INFO:tasks.workunit.client.0.vm07.stdout:3/606: write d1/d1f/d16/d28/f34 [582856,70307] 0 2026-03-09T19:27:42.861 INFO:tasks.workunit.client.0.vm07.stdout:3/607: fdatasync d1/d1f/f9c 0 2026-03-09T19:27:42.864 INFO:tasks.workunit.client.0.vm07.stdout:3/608: dread d1/d6/d45/f5d [0,4194304] 0 2026-03-09T19:27:42.864 INFO:tasks.workunit.client.0.vm07.stdout:3/609: chown d1/f68 170 1 2026-03-09T19:27:42.869 INFO:tasks.workunit.client.0.vm07.stdout:2/611: write d3/d11/f31 [1950789,102530] 0 2026-03-09T19:27:42.869 INFO:tasks.workunit.client.1.vm08.stdout:6/772: write d3/d34/d5c/fac [768230,97543] 0 2026-03-09T19:27:42.879 INFO:tasks.workunit.client.0.vm07.stdout:8/566: dwrite d7/d16/d1e/f6e [0,4194304] 0 2026-03-09T19:27:42.885 INFO:tasks.workunit.client.0.vm07.stdout:8/567: dwrite d7/d9/d37/d45/d56/d62/fc3 [0,4194304] 0 2026-03-09T19:27:42.886 INFO:tasks.workunit.client.0.vm07.stdout:8/568: read d7/d16/f69 [441199,66057] 0 2026-03-09T19:27:42.887 INFO:tasks.workunit.client.0.vm07.stdout:9/558: getdents d0/dc1 0 2026-03-09T19:27:42.893 INFO:tasks.workunit.client.1.vm08.stdout:8/744: creat de/d25/d87/dc9/dfc/f106 x:0 0 0 2026-03-09T19:27:42.900 INFO:tasks.workunit.client.1.vm08.stdout:0/759: mknod dd/d22/d27/d2e/cf4 0 2026-03-09T19:27:42.900 INFO:tasks.workunit.client.1.vm08.stdout:0/760: fdatasync dd/d22/d24/d49/d50/d78/ff0 0 2026-03-09T19:27:42.900 INFO:tasks.workunit.client.1.vm08.stdout:3/816: mkdir d0/d6/de/d54/d103 0 2026-03-09T19:27:42.900 INFO:tasks.workunit.client.1.vm08.stdout:2/667: truncate d3/d4/f91 1040496 0 2026-03-09T19:27:42.901 INFO:tasks.workunit.client.1.vm08.stdout:0/761: read dd/d22/d27/d4f/f97 [812085,98775] 0 2026-03-09T19:27:42.901 INFO:tasks.workunit.client.1.vm08.stdout:0/762: stat dd/de4 0 2026-03-09T19:27:42.902 INFO:tasks.workunit.client.0.vm07.stdout:0/517: mknod d0/d6/d13/d1c/d11/d56/d78/ca9 0 
2026-03-09T19:27:42.907 INFO:tasks.workunit.client.0.vm07.stdout:3/610: symlink d1/d6/d4c/lc1 0 2026-03-09T19:27:42.908 INFO:tasks.workunit.client.1.vm08.stdout:9/756: link d0/d2/d80/de5/da2/da8/de8/fe d0/d2/d14/d98/d99/ff9 0 2026-03-09T19:27:42.909 INFO:tasks.workunit.client.1.vm08.stdout:9/757: chown d0/d1b/d97/d48/d6f/la5 0 1 2026-03-09T19:27:42.914 INFO:tasks.workunit.client.0.vm07.stdout:6/498: link d0/dbf/d95/f35 d0/d1/db/d24/da4/fc3 0 2026-03-09T19:27:42.929 INFO:tasks.workunit.client.1.vm08.stdout:2/668: rmdir d3/d4/d3e 39 2026-03-09T19:27:42.929 INFO:tasks.workunit.client.0.vm07.stdout:8/569: mkdir d7/d30/d75/dcd 0 2026-03-09T19:27:42.929 INFO:tasks.workunit.client.0.vm07.stdout:9/559: rmdir d0/db/d29/d2c/d36 39 2026-03-09T19:27:42.929 INFO:tasks.workunit.client.0.vm07.stdout:7/551: rename d0/d4/d5/d26/d32/dae to d0/d4/d5/d26/db9 0 2026-03-09T19:27:42.929 INFO:tasks.workunit.client.0.vm07.stdout:7/552: stat d0/d4/d5/d8/d1a/d2a/l53 0 2026-03-09T19:27:42.929 INFO:tasks.workunit.client.1.vm08.stdout:4/723: getdents da/d10/d16 0 2026-03-09T19:27:42.930 INFO:tasks.workunit.client.1.vm08.stdout:1/878: sync 2026-03-09T19:27:42.934 INFO:tasks.workunit.client.1.vm08.stdout:0/763: rmdir dd/d22/d27/d65/ddf 39 2026-03-09T19:27:42.938 INFO:tasks.workunit.client.0.vm07.stdout:0/518: mkdir d0/d6/d13/d17/d19/daa 0 2026-03-09T19:27:42.944 INFO:tasks.workunit.client.0.vm07.stdout:9/560: sync 2026-03-09T19:27:42.945 INFO:tasks.workunit.client.0.vm07.stdout:6/499: fsync d0/d1/db/d52/fa1 0 2026-03-09T19:27:42.946 INFO:tasks.workunit.client.0.vm07.stdout:9/561: sync 2026-03-09T19:27:42.947 INFO:tasks.workunit.client.0.vm07.stdout:8/570: truncate d7/d30/d32/fa9 638046 0 2026-03-09T19:27:42.948 INFO:tasks.workunit.client.0.vm07.stdout:8/571: chown d7/d9/d37/d45/d56/d67/l68 0 1 2026-03-09T19:27:42.950 INFO:tasks.workunit.client.1.vm08.stdout:3/817: rename d0/d52/d6d/d77/d88/cc7 to d0/d4b/c104 0 2026-03-09T19:27:42.954 INFO:tasks.workunit.client.0.vm07.stdout:1/591: write d1/d3/f23 
[4138346,64960] 0 2026-03-09T19:27:42.956 INFO:tasks.workunit.client.1.vm08.stdout:7/814: dwrite d5/d14/f46 [0,4194304] 0 2026-03-09T19:27:42.965 INFO:tasks.workunit.client.0.vm07.stdout:2/612: write d3/d49/fa1 [583639,27961] 0 2026-03-09T19:27:42.966 INFO:tasks.workunit.client.1.vm08.stdout:8/745: write de/d91/f9d [303545,31836] 0 2026-03-09T19:27:42.967 INFO:tasks.workunit.client.1.vm08.stdout:6/773: dwrite d3/d15/f40 [0,4194304] 0 2026-03-09T19:27:42.982 INFO:tasks.workunit.client.1.vm08.stdout:5/724: truncate d16/d1e/d8c/d99/dcc/fd1 3524306 0 2026-03-09T19:27:42.989 INFO:tasks.workunit.client.1.vm08.stdout:9/758: dwrite d0/d1b/d97/d48/d5d/ddf/fd9 [0,4194304] 0 2026-03-09T19:27:42.994 INFO:tasks.workunit.client.1.vm08.stdout:2/669: fsync d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f7f 0 2026-03-09T19:27:42.995 INFO:tasks.workunit.client.0.vm07.stdout:7/553: creat d0/d52/d54/d5a/d87/d92/fba x:0 0 0 2026-03-09T19:27:42.998 INFO:tasks.workunit.client.1.vm08.stdout:1/879: creat d9/da/d95/f10f x:0 0 0 2026-03-09T19:27:42.999 INFO:tasks.workunit.client.0.vm07.stdout:0/519: truncate d0/d6/d13/d17/f64 780769 0 2026-03-09T19:27:43.000 INFO:tasks.workunit.client.1.vm08.stdout:1/880: read d9/da/f2f [2871581,54824] 0 2026-03-09T19:27:43.004 INFO:tasks.workunit.client.1.vm08.stdout:4/724: truncate da/d10/d16/d28/d46/d52/d6e/d73/fae 328387 0 2026-03-09T19:27:43.007 INFO:tasks.workunit.client.0.vm07.stdout:6/500: truncate d0/ff 5075710 0 2026-03-09T19:27:43.024 INFO:tasks.workunit.client.1.vm08.stdout:3/818: creat d0/d8/d24/f105 x:0 0 0 2026-03-09T19:27:43.024 INFO:tasks.workunit.client.1.vm08.stdout:7/815: mkdir d5/d14/dae/d1c/d73/d119 0 2026-03-09T19:27:43.024 INFO:tasks.workunit.client.0.vm07.stdout:6/501: dread - d0/d4e/f99 zero size 2026-03-09T19:27:43.024 INFO:tasks.workunit.client.0.vm07.stdout:1/592: write d1/d11/d37/f2c [4100850,95057] 0 2026-03-09T19:27:43.024 INFO:tasks.workunit.client.0.vm07.stdout:4/529: rename d3/d4f/d56/d5f/d88/f92 to d3/d11/d29/db9/d91/fbb 0 
2026-03-09T19:27:43.025 INFO:tasks.workunit.client.0.vm07.stdout:4/530: stat d3/l78 0 2026-03-09T19:27:43.033 INFO:tasks.workunit.client.0.vm07.stdout:8/572: write d7/f9d [57708,66783] 0 2026-03-09T19:27:43.035 INFO:tasks.workunit.client.1.vm08.stdout:0/764: dwrite dd/d7e/f8e [4194304,4194304] 0 2026-03-09T19:27:43.038 INFO:tasks.workunit.client.0.vm07.stdout:8/573: dread d7/d9/d10/f20 [0,4194304] 0 2026-03-09T19:27:43.042 INFO:tasks.workunit.client.1.vm08.stdout:0/765: dwrite dd/d22/fe7 [0,4194304] 0 2026-03-09T19:27:43.049 INFO:tasks.workunit.client.1.vm08.stdout:6/774: fdatasync d3/db/f8f 0 2026-03-09T19:27:43.049 INFO:tasks.workunit.client.0.vm07.stdout:0/520: creat d0/d6/d13/d17/d19/d58/fab x:0 0 0 2026-03-09T19:27:43.050 INFO:tasks.workunit.client.1.vm08.stdout:5/725: rename d16/d1e/d30/d8a/lb6 to d16/d1e/d3b/d61/le7 0 2026-03-09T19:27:43.051 INFO:tasks.workunit.client.1.vm08.stdout:5/726: truncate d16/d1e/d9f/fd3 436397 0 2026-03-09T19:27:43.065 INFO:tasks.workunit.client.1.vm08.stdout:8/746: dwrite de/d47/dfd/d99/da5/db3/fe5 [0,4194304] 0 2026-03-09T19:27:43.070 INFO:tasks.workunit.client.1.vm08.stdout:9/759: mknod d0/d2/d80/d69/cfa 0 2026-03-09T19:27:43.078 INFO:tasks.workunit.client.0.vm07.stdout:1/593: fdatasync d1/d11/d37/d3f/d6e/f9f 0 2026-03-09T19:27:43.084 INFO:tasks.workunit.client.0.vm07.stdout:3/611: rename d1/d1f/d16 to d1/d3d/d47/db3/dc2 0 2026-03-09T19:27:43.092 INFO:tasks.workunit.client.1.vm08.stdout:7/816: creat d5/d14/d38/f11a x:0 0 0 2026-03-09T19:27:43.092 INFO:tasks.workunit.client.1.vm08.stdout:7/817: read d5/d14/d27/d78/dc7/fcf [195175,76321] 0 2026-03-09T19:27:43.092 INFO:tasks.workunit.client.0.vm07.stdout:3/612: chown d1/d3d/d47/db3/dc2/d28/f3c 7928 1 2026-03-09T19:27:43.092 INFO:tasks.workunit.client.0.vm07.stdout:4/531: creat d3/d11/d29/d34/fbc x:0 0 0 2026-03-09T19:27:43.094 INFO:tasks.workunit.client.1.vm08.stdout:0/766: mkdir dd/d22/d63/d6e/df5 0 2026-03-09T19:27:43.094 INFO:tasks.workunit.client.1.vm08.stdout:0/767: chown 
dd/d7e 274 1 2026-03-09T19:27:43.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:42 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:27:43.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:42 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr fail", "who": "vm07.xacuym"}]: dispatch 2026-03-09T19:27:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:42 vm08.local ceph-mon[57794]: Activating manager daemon vm08.mxylvw 2026-03-09T19:27:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:42 vm08.local ceph-mon[57794]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "mgr fail", "who": "vm07.xacuym"}]': finished 2026-03-09T19:27:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:42 vm08.local ceph-mon[57794]: osdmap e42: 6 total, 6 up, 6 in 2026-03-09T19:27:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:42 vm08.local ceph-mon[57794]: mgrmap e20: vm08.mxylvw(active, starting, since 0.0646493s) 2026-03-09T19:27:43.098 INFO:tasks.workunit.client.1.vm08.stdout:6/775: mkdir d3/d34/d5c/da2/d121 0 2026-03-09T19:27:43.099 INFO:tasks.workunit.client.1.vm08.stdout:0/768: dread dd/d22/d24/d49/d50/db3/fce [0,4194304] 0 2026-03-09T19:27:43.099 INFO:tasks.workunit.client.0.vm07.stdout:5/517: write d3/dd/d26/d2d/f54 [1964763,100771] 0 2026-03-09T19:27:43.099 INFO:tasks.workunit.client.0.vm07.stdout:7/554: write d0/d4/d5/d26/d3c/d39/f97 [877771,43906] 0 2026-03-09T19:27:43.106 INFO:tasks.workunit.client.1.vm08.stdout:5/727: symlink d16/d1e/db3/le8 0 2026-03-09T19:27:43.118 INFO:tasks.workunit.client.0.vm07.stdout:1/594: dread d1/db/d31/d56/f97 [0,4194304] 0 2026-03-09T19:27:43.119 INFO:tasks.workunit.client.0.vm07.stdout:9/562: dwrite d0/db/d29/d2c/d36/f71 [0,4194304] 0 
2026-03-09T19:27:43.122 INFO:tasks.workunit.client.1.vm08.stdout:1/881: dwrite d9/d40/f92 [0,4194304] 0 2026-03-09T19:27:43.126 INFO:tasks.workunit.client.1.vm08.stdout:4/725: dwrite da/d10/d26/f74 [0,4194304] 0 2026-03-09T19:27:43.128 INFO:tasks.workunit.client.1.vm08.stdout:1/882: read - d9/d11/d7a/d89/d8d/daa/f107 zero size 2026-03-09T19:27:43.134 INFO:tasks.workunit.client.1.vm08.stdout:9/760: dread d0/d2/d14/f28 [0,4194304] 0 2026-03-09T19:27:43.144 INFO:tasks.workunit.client.0.vm07.stdout:6/502: write d0/d1/db/d52/d94/d87/f8e [2894707,68433] 0 2026-03-09T19:27:43.147 INFO:tasks.workunit.client.0.vm07.stdout:0/521: dwrite d0/d6/d13/d17/d19/d57/d6a/f74 [0,4194304] 0 2026-03-09T19:27:43.149 INFO:tasks.workunit.client.0.vm07.stdout:2/613: rename d3/dd/d16/d29/d3c/d5a/d7a/c61 to d3/dd/d16/d29/d3c/dcf/cd6 0 2026-03-09T19:27:43.150 INFO:tasks.workunit.client.1.vm08.stdout:2/670: mknod d3/d4/d23/d2c/d39/d5e/cdf 0 2026-03-09T19:27:43.162 INFO:tasks.workunit.client.0.vm07.stdout:8/574: creat d7/d9/d37/d45/d4f/db1/fce x:0 0 0 2026-03-09T19:27:43.173 INFO:tasks.workunit.client.0.vm07.stdout:1/595: mkdir d1/d11/d37/d5d/dc1 0 2026-03-09T19:27:43.177 INFO:tasks.workunit.client.1.vm08.stdout:0/769: unlink dd/d22/f41 0 2026-03-09T19:27:43.182 INFO:tasks.workunit.client.1.vm08.stdout:0/770: dwrite dd/d22/d24/d49/d50/dd4/fd9 [0,4194304] 0 2026-03-09T19:27:43.184 INFO:tasks.workunit.client.1.vm08.stdout:8/747: creat de/d91/dd5/f107 x:0 0 0 2026-03-09T19:27:43.185 INFO:tasks.workunit.client.0.vm07.stdout:0/522: creat d0/d6/d13/d1c/d50/d92/d99/fac x:0 0 0 2026-03-09T19:27:43.185 INFO:tasks.workunit.client.1.vm08.stdout:8/748: fdatasync de/d47/dfd/d99/da5/db3/fe5 0 2026-03-09T19:27:43.192 INFO:tasks.workunit.client.1.vm08.stdout:0/771: dwrite dd/d22/d24/d49/d50/d78/ff0 [0,4194304] 0 2026-03-09T19:27:43.206 INFO:tasks.workunit.client.0.vm07.stdout:3/613: rename d1/d1f/f6d to d1/d3d/d47/fc3 0 2026-03-09T19:27:43.210 INFO:tasks.workunit.client.1.vm08.stdout:2/671: rmdir 
d3/d4/d23/d2c/d39/d5e/de/d8b 39 2026-03-09T19:27:43.213 INFO:tasks.workunit.client.1.vm08.stdout:5/728: dread d16/f34 [0,4194304] 0 2026-03-09T19:27:43.217 INFO:tasks.workunit.client.0.vm07.stdout:8/575: mkdir d7/d16/dcf 0 2026-03-09T19:27:43.219 INFO:tasks.workunit.client.1.vm08.stdout:0/772: dread dd/d22/d24/d49/d92/fcd [0,4194304] 0 2026-03-09T19:27:43.220 INFO:tasks.workunit.client.1.vm08.stdout:6/776: mknod d3/d34/d6f/c122 0 2026-03-09T19:27:43.221 INFO:tasks.workunit.client.1.vm08.stdout:6/777: readlink d3/d68/d7e/la6 0 2026-03-09T19:27:43.225 INFO:tasks.workunit.client.0.vm07.stdout:1/596: symlink d1/d11/d37/d3f/d7e/lc2 0 2026-03-09T19:27:43.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:42 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:27:43.230 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:42 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr fail", "who": "vm07.xacuym"}]: dispatch 2026-03-09T19:27:43.230 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:42 vm07.local ceph-mon[48545]: Activating manager daemon vm08.mxylvw 2026-03-09T19:27:43.230 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:42 vm07.local ceph-mon[48545]: from='mgr.14227 192.168.123.107:0/4204686808' entity='mgr.vm07.xacuym' cmd='[{"prefix": "mgr fail", "who": "vm07.xacuym"}]': finished 2026-03-09T19:27:43.230 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:42 vm07.local ceph-mon[48545]: osdmap e42: 6 total, 6 up, 6 in 2026-03-09T19:27:43.230 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:42 vm07.local ceph-mon[48545]: mgrmap e20: vm08.mxylvw(active, starting, since 0.0646493s) 2026-03-09T19:27:43.230 INFO:tasks.workunit.client.1.vm08.stdout:3/819: rename d0/d52/f6a to d0/f106 0 2026-03-09T19:27:43.234 INFO:tasks.workunit.client.1.vm08.stdout:8/749: 
chown de/d25/f71 6 1 2026-03-09T19:27:43.237 INFO:tasks.workunit.client.0.vm07.stdout:4/532: rename d3/d11/d2b/l33 to d3/d11/d29/db9/d91/lbd 0 2026-03-09T19:27:43.238 INFO:tasks.workunit.client.0.vm07.stdout:4/533: chown d3/d11/d2b/f69 3591324 1 2026-03-09T19:27:43.238 INFO:tasks.workunit.client.0.vm07.stdout:4/534: stat d3/d11/d29/db9/d22/d70/d93 0 2026-03-09T19:27:43.239 INFO:tasks.workunit.client.0.vm07.stdout:4/535: fsync d3/f8d 0 2026-03-09T19:27:43.241 INFO:tasks.workunit.client.1.vm08.stdout:2/672: mknod d3/d4/d23/d2c/d39/d5e/de/d18/d99/dd4/ce0 0 2026-03-09T19:27:43.242 INFO:tasks.workunit.client.0.vm07.stdout:4/536: dwrite d3/fa2 [0,4194304] 0 2026-03-09T19:27:43.252 INFO:tasks.workunit.client.1.vm08.stdout:5/729: creat d16/d1e/d9f/fe9 x:0 0 0 2026-03-09T19:27:43.255 INFO:tasks.workunit.client.1.vm08.stdout:0/773: creat dd/d9d/dcc/ff6 x:0 0 0 2026-03-09T19:27:43.263 INFO:tasks.workunit.client.1.vm08.stdout:3/820: mkdir d0/d6/de/d1b/d16/d17/d107 0 2026-03-09T19:27:43.267 INFO:tasks.workunit.client.1.vm08.stdout:3/821: dwrite d0/d6/de/d6e/d51/d7f/de3/fe8 [0,4194304] 0 2026-03-09T19:27:43.278 INFO:tasks.workunit.client.0.vm07.stdout:7/555: write d0/d4/d5/d8/d41/d64/d74/f82 [230029,94767] 0 2026-03-09T19:27:43.279 INFO:tasks.workunit.client.1.vm08.stdout:8/750: sync 2026-03-09T19:27:43.281 INFO:tasks.workunit.client.0.vm07.stdout:5/518: dwrite d3/d1a/d28/f2e [0,4194304] 0 2026-03-09T19:27:43.281 INFO:tasks.workunit.client.1.vm08.stdout:7/818: dwrite d5/d14/dae/d1c/d73/fac [0,4194304] 0 2026-03-09T19:27:43.283 INFO:tasks.workunit.client.1.vm08.stdout:7/819: chown d5/d14/dae/dd1/d109/d8f/fe0 11026891 1 2026-03-09T19:27:43.285 INFO:tasks.workunit.client.1.vm08.stdout:9/761: write d0/d2/d14/d98/dbb/fe1 [812498,26659] 0 2026-03-09T19:27:43.286 INFO:tasks.workunit.client.0.vm07.stdout:3/614: mkdir d1/d3d/d47/db3/dc2/d28/dc4 0 2026-03-09T19:27:43.288 INFO:tasks.workunit.client.0.vm07.stdout:8/576: mkdir d7/d16/dcf/dd0 0 2026-03-09T19:27:43.299 
INFO:tasks.workunit.client.1.vm08.stdout:5/730: dread d16/d1e/d3b/f68 [0,4194304] 0 2026-03-09T19:27:43.314 INFO:tasks.workunit.client.0.vm07.stdout:9/563: dwrite d0/d6/d3a/d81/fa3 [0,4194304] 0 2026-03-09T19:27:43.316 INFO:tasks.workunit.client.0.vm07.stdout:0/523: dwrite d0/f3c [0,4194304] 0 2026-03-09T19:27:43.317 INFO:tasks.workunit.client.1.vm08.stdout:2/673: dwrite d3/d9/d79/f98 [0,4194304] 0 2026-03-09T19:27:43.327 INFO:tasks.workunit.client.1.vm08.stdout:6/778: mkdir d3/d34/d6f/d123 0 2026-03-09T19:27:43.328 INFO:tasks.workunit.client.0.vm07.stdout:1/597: getdents d1/d11/d37/d3f/d6e/d9c/db6 0 2026-03-09T19:27:43.332 INFO:tasks.workunit.client.1.vm08.stdout:4/726: rename c8 to da/d10/d16/d28/d46/d52/cd7 0 2026-03-09T19:27:43.342 INFO:tasks.workunit.client.1.vm08.stdout:4/727: dread - da/d10/d26/d27/f96 zero size 2026-03-09T19:27:43.342 INFO:tasks.workunit.client.1.vm08.stdout:4/728: stat da/d10/d16/d28/d46/d52/d6e/d73 0 2026-03-09T19:27:43.342 INFO:tasks.workunit.client.1.vm08.stdout:0/774: write dd/f15 [7971148,121248] 0 2026-03-09T19:27:43.342 INFO:tasks.workunit.client.1.vm08.stdout:3/822: rmdir d0/d6/dad 39 2026-03-09T19:27:43.343 INFO:tasks.workunit.client.0.vm07.stdout:6/503: getdents d0/d1/db 0 2026-03-09T19:27:43.344 INFO:tasks.workunit.client.0.vm07.stdout:6/504: chown d0/dbf/d95/d31/c50 257 1 2026-03-09T19:27:43.344 INFO:tasks.workunit.client.1.vm08.stdout:8/751: creat de/d91/f108 x:0 0 0 2026-03-09T19:27:43.344 INFO:tasks.workunit.client.1.vm08.stdout:8/752: stat de/d1d/d4f/fd9 0 2026-03-09T19:27:43.346 INFO:tasks.workunit.client.0.vm07.stdout:7/556: unlink d0/d4/d5/d26/f91 0 2026-03-09T19:27:43.347 INFO:tasks.workunit.client.1.vm08.stdout:7/820: mkdir d5/d14/dae/d3a/d42/d85/da0/df5/d11b 0 2026-03-09T19:27:43.355 INFO:tasks.workunit.client.0.vm07.stdout:5/519: creat d3/d1a/d28/d6c/d72/f9d x:0 0 0 2026-03-09T19:27:43.356 INFO:tasks.workunit.client.1.vm08.stdout:8/753: dread de/d91/dc8/fe4 [0,4194304] 0 2026-03-09T19:27:43.359 
INFO:tasks.workunit.client.1.vm08.stdout:6/779: creat d3/d34/d5c/da2/f124 x:0 0 0 2026-03-09T19:27:43.373 INFO:tasks.workunit.client.1.vm08.stdout:0/775: truncate dd/d22/d7b/d82/fc7 1163899 0 2026-03-09T19:27:43.373 INFO:tasks.workunit.client.0.vm07.stdout:3/615: dread d1/f20 [0,4194304] 0 2026-03-09T19:27:43.375 INFO:tasks.workunit.client.0.vm07.stdout:0/524: dread d0/d6/d13/d17/d19/d58/f77 [0,4194304] 0 2026-03-09T19:27:43.375 INFO:tasks.workunit.client.1.vm08.stdout:7/821: chown d5/d14/dae/d3a/d42/fa9 1350805 1 2026-03-09T19:27:43.378 INFO:tasks.workunit.client.1.vm08.stdout:9/762: symlink d0/d1b/d97/d48/d5d/lfb 0 2026-03-09T19:27:43.390 INFO:tasks.workunit.client.1.vm08.stdout:1/883: rename d9/f36 to d9/d40/d49/d9e/f110 0 2026-03-09T19:27:43.401 INFO:tasks.workunit.client.1.vm08.stdout:8/754: mknod de/d47/d85/df2/c109 0 2026-03-09T19:27:43.404 INFO:tasks.workunit.client.1.vm08.stdout:2/674: write d3/d4/d23/d2c/d39/f9b [687516,126230] 0 2026-03-09T19:27:43.410 INFO:tasks.workunit.client.0.vm07.stdout:9/564: write d0/d6/d3a/f89 [414363,19747] 0 2026-03-09T19:27:43.410 INFO:tasks.workunit.client.0.vm07.stdout:9/565: read - d0/db/fac zero size 2026-03-09T19:27:43.410 INFO:tasks.workunit.client.1.vm08.stdout:3/823: rename d0/d52/f5c to d0/d6/de/d1b/d16/d17/dac/dd2/f108 0 2026-03-09T19:27:43.418 INFO:tasks.workunit.client.1.vm08.stdout:1/884: mknod d9/da/d95/c111 0 2026-03-09T19:27:43.422 INFO:tasks.workunit.client.1.vm08.stdout:2/675: dread d3/d9/d26/f6a [0,4194304] 0 2026-03-09T19:27:43.423 INFO:tasks.workunit.client.1.vm08.stdout:4/729: write da/f21 [6539671,86138] 0 2026-03-09T19:27:43.423 INFO:tasks.workunit.client.1.vm08.stdout:2/676: chown d3/d4/d23/d2c/d39/d5e/d14/c1d 7 1 2026-03-09T19:27:43.424 INFO:tasks.workunit.client.1.vm08.stdout:4/730: write da/d10/d16/f4b [853959,130518] 0 2026-03-09T19:27:43.424 INFO:tasks.workunit.client.1.vm08.stdout:2/677: readlink d3/d4/d23/d2c/d39/d5e/de/d18/d1f/ld6 0 2026-03-09T19:27:43.429 
INFO:tasks.workunit.client.1.vm08.stdout:5/731: dwrite d16/f17 [0,4194304] 0 2026-03-09T19:27:43.435 INFO:tasks.workunit.client.1.vm08.stdout:6/780: write d3/d34/da9/f97 [430050,106260] 0 2026-03-09T19:27:43.440 INFO:tasks.workunit.client.0.vm07.stdout:4/537: mkdir d3/dbe 0 2026-03-09T19:27:43.440 INFO:tasks.workunit.client.1.vm08.stdout:0/776: mkdir dd/d22/df7 0 2026-03-09T19:27:43.441 INFO:tasks.workunit.client.1.vm08.stdout:7/822: mkdir d5/d14/d27/d54/d107/d11c 0 2026-03-09T19:27:43.443 INFO:tasks.workunit.client.0.vm07.stdout:6/505: rmdir d0/d4e/dae 39 2026-03-09T19:27:43.446 INFO:tasks.workunit.client.0.vm07.stdout:7/557: fdatasync d0/d52/d54/fa4 0 2026-03-09T19:27:43.447 INFO:tasks.workunit.client.1.vm08.stdout:5/732: sync 2026-03-09T19:27:43.449 INFO:tasks.workunit.client.0.vm07.stdout:5/520: mkdir d3/dd/d26/d2d/d9e 0 2026-03-09T19:27:43.461 INFO:tasks.workunit.client.0.vm07.stdout:8/577: mknod d7/d9/d10/d44/d9a/cd1 0 2026-03-09T19:27:43.462 INFO:tasks.workunit.client.1.vm08.stdout:3/824: rmdir d0/d6/de/d1b/d16/d17/dac/dd2/dd3 39 2026-03-09T19:27:43.464 INFO:tasks.workunit.client.1.vm08.stdout:1/885: symlink d9/da/d2d/d4e/l112 0 2026-03-09T19:27:43.467 INFO:tasks.workunit.client.0.vm07.stdout:3/616: rmdir d1/d1f 39 2026-03-09T19:27:43.472 INFO:tasks.workunit.client.1.vm08.stdout:2/678: read d3/d4/d23/d2c/d39/d5e/d14/f2b [3165951,109840] 0 2026-03-09T19:27:43.475 INFO:tasks.workunit.client.0.vm07.stdout:9/566: unlink d0/d6/c15 0 2026-03-09T19:27:43.477 INFO:tasks.workunit.client.1.vm08.stdout:6/781: creat d3/d94/def/dc4/f125 x:0 0 0 2026-03-09T19:27:43.481 INFO:tasks.workunit.client.0.vm07.stdout:8/578: sync 2026-03-09T19:27:43.485 INFO:tasks.workunit.client.0.vm07.stdout:4/538: mknod d3/d11/d29/db9/d22/d70/cbf 0 2026-03-09T19:27:43.487 INFO:tasks.workunit.client.1.vm08.stdout:6/782: dread d3/db/d43/f56 [0,4194304] 0 2026-03-09T19:27:43.494 INFO:tasks.workunit.client.1.vm08.stdout:4/731: dwrite da/d10/d16/f9f [0,4194304] 0 2026-03-09T19:27:43.497 
INFO:tasks.workunit.client.0.vm07.stdout:0/525: dwrite d0/d6/d13/d1c/d11/f80 [0,4194304] 0 2026-03-09T19:27:43.502 INFO:tasks.workunit.client.0.vm07.stdout:2/614: link d3/dd/d16/d29/d3c/dcf/cd6 d3/dd/d16/d29/d3c/cd7 0 2026-03-09T19:27:43.505 INFO:tasks.workunit.client.1.vm08.stdout:7/823: truncate d5/d14/d2b/d4b/fe2 4209099 0 2026-03-09T19:27:43.517 INFO:tasks.workunit.client.0.vm07.stdout:6/506: rename d0/d1/db/d17/d4c to d0/d1/db/d17/dc4 0 2026-03-09T19:27:43.518 INFO:tasks.workunit.client.1.vm08.stdout:9/763: creat d0/ffc x:0 0 0 2026-03-09T19:27:43.527 INFO:tasks.workunit.client.1.vm08.stdout:5/733: dread d16/d1e/d30/f3f [0,4194304] 0 2026-03-09T19:27:43.533 INFO:tasks.workunit.client.1.vm08.stdout:5/734: dwrite d16/d1e/d9f/fd3 [0,4194304] 0 2026-03-09T19:27:43.535 INFO:tasks.workunit.client.1.vm08.stdout:3/825: rename d0/d8/d19 to d0/d6/de/d1b/d16/d17/dac/d109 0 2026-03-09T19:27:43.542 INFO:tasks.workunit.client.1.vm08.stdout:5/735: dread d16/fbe [0,4194304] 0 2026-03-09T19:27:43.547 INFO:tasks.workunit.client.0.vm07.stdout:8/579: mkdir d7/d1d/d83/d9f/dd2 0 2026-03-09T19:27:43.549 INFO:tasks.workunit.client.1.vm08.stdout:2/679: fsync d3/d4/d23/d2c/f94 0 2026-03-09T19:27:43.549 INFO:tasks.workunit.client.1.vm08.stdout:2/680: stat d3/d9/d4a/f89 0 2026-03-09T19:27:43.555 INFO:tasks.workunit.client.1.vm08.stdout:6/783: unlink d3/db/ff7 0 2026-03-09T19:27:43.557 INFO:tasks.workunit.client.1.vm08.stdout:8/755: write de/d1d/f27 [1137802,130010] 0 2026-03-09T19:27:43.560 INFO:tasks.workunit.client.0.vm07.stdout:0/526: creat d0/d6/d13/d1c/d61/d69/fad x:0 0 0 2026-03-09T19:27:43.565 INFO:tasks.workunit.client.1.vm08.stdout:4/732: unlink da/d10/d16/d28/d46/d52/d6e/d40/d6c/lb3 0 2026-03-09T19:27:43.566 INFO:tasks.workunit.client.1.vm08.stdout:4/733: chown da/d10/l3c 2 1 2026-03-09T19:27:43.567 INFO:tasks.workunit.client.1.vm08.stdout:4/734: chown da/d10/d16/d28/d2f/f80 147809344 1 2026-03-09T19:27:43.568 INFO:tasks.workunit.client.1.vm08.stdout:4/735: write 
da/d10/d16/d28/d46/d52/d6e/fd1 [426007,128390] 0 2026-03-09T19:27:43.569 INFO:tasks.workunit.client.0.vm07.stdout:9/567: write d0/d6/f8 [449671,2985] 0 2026-03-09T19:27:43.569 INFO:tasks.workunit.client.1.vm08.stdout:4/736: chown da/d10/d16/d28/d2f/d4f/d64/d84/d8a/da2 223927 1 2026-03-09T19:27:43.576 INFO:tasks.workunit.client.0.vm07.stdout:2/615: fdatasync f0 0 2026-03-09T19:27:43.576 INFO:tasks.workunit.client.0.vm07.stdout:4/539: dwrite d3/d11/d16/f6e [0,4194304] 0 2026-03-09T19:27:43.577 INFO:tasks.workunit.client.0.vm07.stdout:4/540: write d3/d11/d2b/d37/db6/fb1 [218194,129306] 0 2026-03-09T19:27:43.595 INFO:tasks.workunit.client.1.vm08.stdout:0/777: dwrite dd/d22/d27/f3f [0,4194304] 0 2026-03-09T19:27:43.600 INFO:tasks.workunit.client.0.vm07.stdout:5/521: mkdir d3/dd/d26/d2d/d79/d9f 0 2026-03-09T19:27:43.613 INFO:tasks.workunit.client.1.vm08.stdout:3/826: truncate d0/d6/de/d1b/d16/d17/f94 1232576 0 2026-03-09T19:27:43.616 INFO:tasks.workunit.client.1.vm08.stdout:1/886: symlink d9/d11/l113 0 2026-03-09T19:27:43.621 INFO:tasks.workunit.client.0.vm07.stdout:1/598: getdents d1/d11/d37/d5d 0 2026-03-09T19:27:43.623 INFO:tasks.workunit.client.0.vm07.stdout:9/568: read - d0/d6/d73/fa6 zero size 2026-03-09T19:27:43.627 INFO:tasks.workunit.client.1.vm08.stdout:6/784: rmdir d3/d15/dc2 39 2026-03-09T19:27:43.631 INFO:tasks.workunit.client.0.vm07.stdout:2/616: creat d3/dd/d16/d30/da7/fd8 x:0 0 0 2026-03-09T19:27:43.633 INFO:tasks.workunit.client.0.vm07.stdout:2/617: dread - d3/dd/d16/d29/d3c/d5a/d7a/d74/dc9/fd3 zero size 2026-03-09T19:27:43.633 INFO:tasks.workunit.client.0.vm07.stdout:0/527: dwrite d0/d6/d13/d17/d19/f1f [0,4194304] 0 2026-03-09T19:27:43.640 INFO:tasks.workunit.client.0.vm07.stdout:0/528: sync 2026-03-09T19:27:43.640 INFO:tasks.workunit.client.0.vm07.stdout:0/529: chown d0/d6/d13/d17/l49 3 1 2026-03-09T19:27:43.642 INFO:tasks.workunit.client.0.vm07.stdout:0/530: sync 2026-03-09T19:27:43.643 INFO:tasks.workunit.client.0.vm07.stdout:0/531: read 
d0/d6/d13/d17/d19/d58/f77 [1554836,109306] 0 2026-03-09T19:27:43.649 INFO:tasks.workunit.client.1.vm08.stdout:4/737: read da/d10/d1b/f85 [225287,52384] 0 2026-03-09T19:27:43.650 INFO:tasks.workunit.client.1.vm08.stdout:8/756: write de/d47/dfd/d99/da5/db3/f2d [774914,18651] 0 2026-03-09T19:27:43.654 INFO:tasks.workunit.client.1.vm08.stdout:8/757: chown de/d25/d31/leb 3 1 2026-03-09T19:27:43.654 INFO:tasks.workunit.client.1.vm08.stdout:8/758: chown de/d25/d87/dc9/cfa 7376 1 2026-03-09T19:27:43.655 INFO:tasks.workunit.client.0.vm07.stdout:7/558: creat d0/d4/d5/d8/d41/d64/fbb x:0 0 0 2026-03-09T19:27:43.657 INFO:tasks.workunit.client.1.vm08.stdout:7/824: mknod d5/d14/d27/d54/d107/d11c/c11d 0 2026-03-09T19:27:43.659 INFO:tasks.workunit.client.0.vm07.stdout:5/522: unlink d3/d1a/d28/d48/f50 0 2026-03-09T19:27:43.663 INFO:tasks.workunit.client.0.vm07.stdout:8/580: mknod d7/cd3 0 2026-03-09T19:27:43.665 INFO:tasks.workunit.client.1.vm08.stdout:0/778: creat dd/d9d/dcc/ff8 x:0 0 0 2026-03-09T19:27:43.671 INFO:tasks.workunit.client.0.vm07.stdout:9/569: creat d0/d6/d3a/fc6 x:0 0 0 2026-03-09T19:27:43.671 INFO:tasks.workunit.client.1.vm08.stdout:1/887: readlink d9/da/dc/ld 0 2026-03-09T19:27:43.672 INFO:tasks.workunit.client.1.vm08.stdout:5/736: write d16/d1e/d8c/d99/da8/d9a/fe0 [2408937,81658] 0 2026-03-09T19:27:43.676 INFO:tasks.workunit.client.1.vm08.stdout:2/681: mknod d3/d4/d3e/d4e/ce1 0 2026-03-09T19:27:43.679 INFO:tasks.workunit.client.1.vm08.stdout:6/785: creat d3/dbc/f126 x:0 0 0 2026-03-09T19:27:43.681 INFO:tasks.workunit.client.0.vm07.stdout:6/507: creat d0/d1/d28/fc5 x:0 0 0 2026-03-09T19:27:43.682 INFO:tasks.workunit.client.0.vm07.stdout:7/559: stat d0/d4/d5/d26/f8e 0 2026-03-09T19:27:43.685 INFO:tasks.workunit.client.0.vm07.stdout:3/617: rename d1/d6/d4c/d97/ca0 to d1/d6/d4c/cc5 0 2026-03-09T19:27:43.690 INFO:tasks.workunit.client.0.vm07.stdout:1/599: creat d1/d3e/dae/fc3 x:0 0 0 2026-03-09T19:27:43.692 INFO:tasks.workunit.client.0.vm07.stdout:1/600: dread 
d1/d11/d37/d3f/d7e/f7f [0,4194304] 0 2026-03-09T19:27:43.695 INFO:tasks.workunit.client.0.vm07.stdout:8/581: write f5 [2072886,107199] 0 2026-03-09T19:27:43.696 INFO:tasks.workunit.client.0.vm07.stdout:4/541: creat d3/d11/d29/db9/d22/fc0 x:0 0 0 2026-03-09T19:27:43.697 INFO:tasks.workunit.client.0.vm07.stdout:5/523: dwrite d3/dd/d26/d3f/d47/d56/f65 [0,4194304] 0 2026-03-09T19:27:43.699 INFO:tasks.workunit.client.0.vm07.stdout:4/542: chown d3/d11/d29/f42 64488 1 2026-03-09T19:27:43.699 INFO:tasks.workunit.client.0.vm07.stdout:5/524: write d3/f99 [275422,20106] 0 2026-03-09T19:27:43.700 INFO:tasks.workunit.client.0.vm07.stdout:0/532: mknod d0/d6/d13/d17/cae 0 2026-03-09T19:27:43.710 INFO:tasks.workunit.client.0.vm07.stdout:6/508: chown d0/d1/db/d24/da4/fc3 20 1 2026-03-09T19:27:43.716 INFO:tasks.workunit.client.0.vm07.stdout:7/560: creat d0/d4/d5/d8/d41/d64/d79/fbc x:0 0 0 2026-03-09T19:27:43.724 INFO:tasks.workunit.client.0.vm07.stdout:2/618: creat d3/dd/d16/d29/d2d/d45/fd9 x:0 0 0 2026-03-09T19:27:43.727 INFO:tasks.workunit.client.0.vm07.stdout:4/543: dread d3/d11/f74 [0,4194304] 0 2026-03-09T19:27:43.728 INFO:tasks.workunit.client.0.vm07.stdout:0/533: unlink d0/d6/d13/d17/d19/d57/d9e/fa3 0 2026-03-09T19:27:43.731 INFO:tasks.workunit.client.0.vm07.stdout:5/525: truncate d3/d1a/d28/d36/f61 935811 0 2026-03-09T19:27:43.737 INFO:tasks.workunit.client.0.vm07.stdout:9/570: rename d0/c9c to d0/d6/d57/d8f/cc7 0 2026-03-09T19:27:43.739 INFO:tasks.workunit.client.0.vm07.stdout:3/618: read d1/d74/f31 [330961,19068] 0 2026-03-09T19:27:43.743 INFO:tasks.workunit.client.1.vm08.stdout:4/738: rename da/d10/d26/d38 to da/d10/d26/dd8 0 2026-03-09T19:27:43.750 INFO:tasks.workunit.client.0.vm07.stdout:4/544: dread - d3/d11/d29/db9/d22/d70/d93/f9c zero size 2026-03-09T19:27:43.751 INFO:tasks.workunit.client.1.vm08.stdout:9/764: mkdir d0/d1b/de9/dfd 0 2026-03-09T19:27:43.753 INFO:tasks.workunit.client.0.vm07.stdout:0/534: truncate d0/d6/d13/d1c/d11/f29 1044691 0 2026-03-09T19:27:43.754 
INFO:tasks.workunit.client.1.vm08.stdout:0/779: creat dd/d22/d7b/d82/ff9 x:0 0 0 2026-03-09T19:27:43.757 INFO:tasks.workunit.client.1.vm08.stdout:3/827: mkdir d0/d6/d10a 0 2026-03-09T19:27:43.764 INFO:tasks.workunit.client.0.vm07.stdout:7/561: write d0/d4/d5/d8/d41/d64/d74/d98/f1f [902625,9849] 0 2026-03-09T19:27:43.769 INFO:tasks.workunit.client.1.vm08.stdout:1/888: mknod d9/d40/d49/c114 0 2026-03-09T19:27:43.769 INFO:tasks.workunit.client.0.vm07.stdout:6/509: dwrite d0/d1/db/d17/f1a [0,4194304] 0 2026-03-09T19:27:43.773 INFO:tasks.workunit.client.0.vm07.stdout:8/582: dwrite d7/d9/d37/d45/d56/f5f [4194304,4194304] 0 2026-03-09T19:27:43.779 INFO:tasks.workunit.client.0.vm07.stdout:1/601: rename d1/d11/d37/d3f/d45/d87/d88/fa5 to d1/db/d31/d4f/fc4 0 2026-03-09T19:27:43.790 INFO:tasks.workunit.client.0.vm07.stdout:7/562: dread d0/d52/d54/d55/f67 [0,4194304] 0 2026-03-09T19:27:43.794 INFO:tasks.workunit.client.1.vm08.stdout:8/759: dwrite de/d47/dfd/d99/fd0 [0,4194304] 0 2026-03-09T19:27:43.803 INFO:tasks.workunit.client.1.vm08.stdout:6/786: rmdir d3/dbc 39 2026-03-09T19:27:43.807 INFO:tasks.workunit.client.1.vm08.stdout:4/739: symlink da/d10/d16/d28/d2f/d4f/d64/d84/d8a/ld9 0 2026-03-09T19:27:43.813 INFO:tasks.workunit.client.0.vm07.stdout:0/535: rmdir d0/d6/d13/d1c/d50/d92 39 2026-03-09T19:27:43.816 INFO:tasks.workunit.client.1.vm08.stdout:0/780: stat dd/d22/d27/l79 0 2026-03-09T19:27:43.819 INFO:tasks.workunit.client.0.vm07.stdout:0/536: dwrite d0/d6/d13/d1c/d61/d69/fad [0,4194304] 0 2026-03-09T19:27:43.819 INFO:tasks.workunit.client.0.vm07.stdout:0/537: read d0/d6/d13/d17/d19/d58/f77 [3646077,108787] 0 2026-03-09T19:27:43.821 INFO:tasks.workunit.client.0.vm07.stdout:0/538: write d0/d6/d13/d1c/d11/d56/f67 [1582006,4317] 0 2026-03-09T19:27:43.825 INFO:tasks.workunit.client.0.vm07.stdout:0/539: dread d0/d6/d13/d1c/d11/f2e [0,4194304] 0 2026-03-09T19:27:43.830 INFO:tasks.workunit.client.0.vm07.stdout:6/510: unlink d0/d13/f1b 0 2026-03-09T19:27:43.830 
INFO:tasks.workunit.client.1.vm08.stdout:5/737: write d16/f34 [270680,63217] 0 2026-03-09T19:27:43.843 INFO:tasks.workunit.client.0.vm07.stdout:3/619: symlink d1/d3d/d47/db3/dc2/d28/d7c/lc6 0 2026-03-09T19:27:43.846 INFO:tasks.workunit.client.1.vm08.stdout:7/825: rename d5/d14/dae/d3a/d42/fb7 to d5/d14/dae/d1c/f11e 0 2026-03-09T19:27:43.848 INFO:tasks.workunit.client.1.vm08.stdout:3/828: symlink d0/d52/d7c/l10b 0 2026-03-09T19:27:43.870 INFO:tasks.workunit.client.1.vm08.stdout:6/787: fdatasync d3/d15/fcb 0 2026-03-09T19:27:43.870 INFO:tasks.workunit.client.1.vm08.stdout:9/765: rename d0/d1b/d97/d48/d5d/ddf/da7 to d0/d1b/d68/dfe 0 2026-03-09T19:27:43.870 INFO:tasks.workunit.client.1.vm08.stdout:4/740: link da/d10/d16/d28/f44 da/d10/d16/d28/d2f/d4f/d56/fda 0 2026-03-09T19:27:43.870 INFO:tasks.workunit.client.0.vm07.stdout:0/540: rmdir d0/d6/d13/d33 39 2026-03-09T19:27:43.870 INFO:tasks.workunit.client.0.vm07.stdout:6/511: fdatasync d0/d1/db/d52/d94/f84 0 2026-03-09T19:27:43.870 INFO:tasks.workunit.client.0.vm07.stdout:6/512: dread - d0/d1/db/f9d zero size 2026-03-09T19:27:43.870 INFO:tasks.workunit.client.0.vm07.stdout:6/513: stat d0/d1/d28/da9 0 2026-03-09T19:27:43.873 INFO:tasks.workunit.client.1.vm08.stdout:9/766: fdatasync d0/d2/d14/d98/dbb/fe1 0 2026-03-09T19:27:43.875 INFO:tasks.workunit.client.0.vm07.stdout:2/619: rename d3/f22 to d3/dd/d16/d29/d2d/d45/d3b/dae/fda 0 2026-03-09T19:27:43.876 INFO:tasks.workunit.client.0.vm07.stdout:2/620: read - d3/dd/d16/d29/d3c/d5a/d7a/d74/dc9/fd3 zero size 2026-03-09T19:27:43.877 INFO:tasks.workunit.client.1.vm08.stdout:5/738: sync 2026-03-09T19:27:43.879 INFO:tasks.workunit.client.1.vm08.stdout:8/760: rename de/f5d to de/d1d/d21/f10a 0 2026-03-09T19:27:43.884 INFO:tasks.workunit.client.1.vm08.stdout:1/889: write d9/da/d2d/f50 [55350,35945] 0 2026-03-09T19:27:43.886 INFO:tasks.workunit.client.1.vm08.stdout:2/682: dwrite d3/d4/d23/d2c/d39/d5e/d14/f73 [0,4194304] 0 2026-03-09T19:27:43.888 
INFO:tasks.workunit.client.0.vm07.stdout:1/602: dwrite d1/d3e/db3/d6d/fb0 [0,4194304] 0 2026-03-09T19:27:43.889 INFO:tasks.workunit.client.0.vm07.stdout:4/545: dwrite d3/d11/d2b/f98 [0,4194304] 0 2026-03-09T19:27:43.897 INFO:tasks.workunit.client.0.vm07.stdout:5/526: dwrite d3/d1a/d28/d6c/f7a [0,4194304] 0 2026-03-09T19:27:43.916 INFO:tasks.workunit.client.1.vm08.stdout:7/826: dwrite d5/d14/d27/d54/dfb/f9b [0,4194304] 0 2026-03-09T19:27:43.918 INFO:tasks.workunit.client.1.vm08.stdout:6/788: write d3/db/fb0 [4529664,127843] 0 2026-03-09T19:27:43.918 INFO:tasks.workunit.client.0.vm07.stdout:9/571: write d0/d6/ff [3014063,105502] 0 2026-03-09T19:27:43.932 INFO:tasks.workunit.client.1.vm08.stdout:4/741: mkdir da/d10/d26/d3a/db5/ddb 0 2026-03-09T19:27:43.937 INFO:tasks.workunit.client.1.vm08.stdout:9/767: creat d0/d2/d80/de5/da2/fff x:0 0 0 2026-03-09T19:27:43.940 INFO:tasks.workunit.client.1.vm08.stdout:5/739: truncate d16/f56 119565 0 2026-03-09T19:27:43.945 INFO:tasks.workunit.client.0.vm07.stdout:8/583: rmdir d7/d16/d1e/dab 0 2026-03-09T19:27:43.945 INFO:tasks.workunit.client.1.vm08.stdout:0/781: rename dd/d22/d27/l79 to dd/d31/dca/lfa 0 2026-03-09T19:27:43.949 INFO:tasks.workunit.client.1.vm08.stdout:8/761: creat de/d47/dfd/d99/da5/db3/f10b x:0 0 0 2026-03-09T19:27:43.954 INFO:tasks.workunit.client.0.vm07.stdout:7/563: rename d0/d52/d54/d5a/d87/d92 to d0/d4/d5/d26/d32/dbd 0 2026-03-09T19:27:43.956 INFO:tasks.workunit.client.1.vm08.stdout:4/742: sync 2026-03-09T19:27:43.957 INFO:tasks.workunit.client.1.vm08.stdout:4/743: chown da/d10/d26/d3a/db5/ddb 77 1 2026-03-09T19:27:43.958 INFO:tasks.workunit.client.1.vm08.stdout:4/744: chown da/d10/d16/d28/d2f/d4f/d64/d84/d8a/ld9 123271 1 2026-03-09T19:27:43.961 INFO:tasks.workunit.client.1.vm08.stdout:2/683: creat d3/d4/d23/d2c/dc1/fe2 x:0 0 0 2026-03-09T19:27:43.966 INFO:tasks.workunit.client.0.vm07.stdout:1/603: symlink d1/db/d31/d4f/lc5 0 2026-03-09T19:27:43.969 INFO:tasks.workunit.client.0.vm07.stdout:4/546: rmdir 
d3/d4f/d56/d5f 39 2026-03-09T19:27:43.976 INFO:tasks.workunit.client.0.vm07.stdout:5/527: creat d3/d1a/d28/d40/d92/fa0 x:0 0 0 2026-03-09T19:27:43.981 INFO:tasks.workunit.client.1.vm08.stdout:5/740: rename d16/d1e/d9f/fe9 to d16/d1e/d8c/d99/fea 0 2026-03-09T19:27:43.982 INFO:tasks.workunit.client.1.vm08.stdout:8/762: sync 2026-03-09T19:27:43.987 INFO:tasks.workunit.client.0.vm07.stdout:6/514: symlink d0/lc6 0 2026-03-09T19:27:43.989 INFO:tasks.workunit.client.0.vm07.stdout:8/584: creat d7/d9/d37/d45/d4f/fd4 x:0 0 0 2026-03-09T19:27:43.990 INFO:tasks.workunit.client.0.vm07.stdout:1/604: sync 2026-03-09T19:27:43.993 INFO:tasks.workunit.client.0.vm07.stdout:8/585: dwrite d7/d16/d1e/f6e [4194304,4194304] 0 2026-03-09T19:27:44.001 INFO:tasks.workunit.client.0.vm07.stdout:2/621: rename d3/c78 to d3/dd/d16/d29/d3c/d4c/cdb 0 2026-03-09T19:27:44.004 INFO:tasks.workunit.client.0.vm07.stdout:2/622: sync 2026-03-09T19:27:44.007 INFO:tasks.workunit.client.1.vm08.stdout:3/829: write d0/d6/dad/fb6 [136314,22245] 0 2026-03-09T19:27:44.009 INFO:tasks.workunit.client.0.vm07.stdout:3/620: dwrite d1/d6/dd/f4d [0,4194304] 0 2026-03-09T19:27:44.011 INFO:tasks.workunit.client.1.vm08.stdout:6/789: write d3/db/d43/d69/fb1 [120130,73904] 0 2026-03-09T19:27:44.011 INFO:tasks.workunit.client.1.vm08.stdout:6/790: chown d3/d55/c80 165785745 1 2026-03-09T19:27:44.012 INFO:tasks.workunit.client.1.vm08.stdout:6/791: stat d3/d34/d3b/cde 0 2026-03-09T19:27:44.012 INFO:tasks.workunit.client.1.vm08.stdout:9/768: write d0/d1b/d97/d48/d5d/d74/ded/f6e [4251594,115398] 0 2026-03-09T19:27:44.017 INFO:tasks.workunit.client.1.vm08.stdout:0/782: dwrite fc [0,4194304] 0 2026-03-09T19:27:44.028 INFO:tasks.workunit.client.1.vm08.stdout:2/684: write d3/d9/f84 [4813483,76467] 0 2026-03-09T19:27:44.032 INFO:tasks.workunit.client.0.vm07.stdout:9/572: chown d0/db/d29/d2c/d36/d5a/ca4 3 1 2026-03-09T19:27:44.032 INFO:tasks.workunit.client.0.vm07.stdout:0/541: getdents d0/d6/d13/d1c/d61 0 2026-03-09T19:27:44.033 
INFO:tasks.workunit.client.0.vm07.stdout:6/515: mkdir d0/d1/db/d17/dc4/dc7 0 2026-03-09T19:27:44.034 INFO:tasks.workunit.client.1.vm08.stdout:8/763: creat de/d47/dfd/f10c x:0 0 0 2026-03-09T19:27:44.036 INFO:tasks.workunit.client.1.vm08.stdout:1/890: link d9/da/d12/d39/l6e d9/d40/d49/l115 0 2026-03-09T19:27:44.037 INFO:tasks.workunit.client.0.vm07.stdout:8/586: unlink d7/d16/cc9 0 2026-03-09T19:27:44.038 INFO:tasks.workunit.client.0.vm07.stdout:8/587: write d7/d30/f61 [449712,84465] 0 2026-03-09T19:27:44.039 INFO:tasks.workunit.client.0.vm07.stdout:2/623: mkdir d3/d49/ddc 0 2026-03-09T19:27:44.044 INFO:tasks.workunit.client.1.vm08.stdout:7/827: creat d5/d14/dae/f11f x:0 0 0 2026-03-09T19:27:44.059 INFO:tasks.workunit.client.0.vm07.stdout:3/621: truncate d1/d3d/f95 229879 0 2026-03-09T19:27:44.059 INFO:tasks.workunit.client.0.vm07.stdout:6/516: unlink d0/lc6 0 2026-03-09T19:27:44.059 INFO:tasks.workunit.client.0.vm07.stdout:1/605: symlink d1/d11/d37/d5d/dc1/lc6 0 2026-03-09T19:27:44.059 INFO:tasks.workunit.client.1.vm08.stdout:6/792: creat d3/d34/da9/f127 x:0 0 0 2026-03-09T19:27:44.059 INFO:tasks.workunit.client.1.vm08.stdout:5/741: symlink d16/d8e/dd5/leb 0 2026-03-09T19:27:44.059 INFO:tasks.workunit.client.1.vm08.stdout:8/764: mkdir de/d47/dfd/d99/da0/d10d 0 2026-03-09T19:27:44.060 INFO:tasks.workunit.client.1.vm08.stdout:8/765: truncate de/d91/f9d 719281 0 2026-03-09T19:27:44.060 INFO:tasks.workunit.client.1.vm08.stdout:7/828: dread d5/d14/d27/d78/dc7/fd9 [0,4194304] 0 2026-03-09T19:27:44.063 INFO:tasks.workunit.client.0.vm07.stdout:8/588: fsync d7/d9/d37/d34/faa 0 2026-03-09T19:27:44.066 INFO:tasks.workunit.client.1.vm08.stdout:4/745: link da/d10/d16/d28/d46/d52/d6e/d6d/c7d da/d10/d1b/dd3/cdc 0 2026-03-09T19:27:44.066 INFO:tasks.workunit.client.1.vm08.stdout:1/891: sync 2026-03-09T19:27:44.077 INFO:tasks.workunit.client.0.vm07.stdout:5/528: write d3/dd/f52 [4599282,12017] 0 2026-03-09T19:27:44.078 INFO:tasks.workunit.client.0.vm07.stdout:9/573: mknod d0/d17/cc8 
0 2026-03-09T19:27:44.078 INFO:tasks.workunit.client.0.vm07.stdout:8/589: dread d7/d9/fd [0,4194304] 0 2026-03-09T19:27:44.084 INFO:tasks.workunit.client.0.vm07.stdout:0/542: mknod d0/d6/d13/caf 0 2026-03-09T19:27:44.087 INFO:tasks.workunit.client.1.vm08.stdout:9/769: write d0/d1b/f49 [1307648,111928] 0 2026-03-09T19:27:44.092 INFO:tasks.workunit.client.1.vm08.stdout:3/830: write d0/d6/de/d1b/d16/d17/f94 [531837,82511] 0 2026-03-09T19:27:44.096 INFO:tasks.workunit.client.1.vm08.stdout:0/783: fsync dd/f18 0 2026-03-09T19:27:44.098 INFO:tasks.workunit.client.1.vm08.stdout:2/685: fdatasync d3/d4/d3e/d9d/fc5 0 2026-03-09T19:27:44.100 INFO:tasks.workunit.client.0.vm07.stdout:1/606: unlink d1/d11/d37/d3f/d6e/d9c/l9e 0 2026-03-09T19:27:44.101 INFO:tasks.workunit.client.0.vm07.stdout:1/607: write d1/d11/d37/d3f/d45/f26 [3564589,4632] 0 2026-03-09T19:27:44.102 INFO:tasks.workunit.client.0.vm07.stdout:1/608: write d1/d11/d37/d3f/d45/d87/faf [1059387,107811] 0 2026-03-09T19:27:44.103 INFO:tasks.workunit.client.1.vm08.stdout:8/766: creat de/d47/dfd/d99/dde/f10e x:0 0 0 2026-03-09T19:27:44.109 INFO:tasks.workunit.client.0.vm07.stdout:7/564: rename d0/d4/d5/d8/d1a/d2a/faf to d0/d4/d5/d26/fbe 0 2026-03-09T19:27:44.111 INFO:tasks.workunit.client.1.vm08.stdout:7/829: mkdir d5/d14/d27/d78/dc7/dce/d120 0 2026-03-09T19:27:44.112 INFO:tasks.workunit.client.1.vm08.stdout:7/830: stat d5/d14/d38/c76 0 2026-03-09T19:27:44.115 INFO:tasks.workunit.client.0.vm07.stdout:3/622: creat d1/d89/fc7 x:0 0 0 2026-03-09T19:27:44.116 INFO:tasks.workunit.client.0.vm07.stdout:4/547: link d3/d11/d29/db9/d91/l9f d3/d11/d2b/d37/lc1 0 2026-03-09T19:27:44.118 INFO:tasks.workunit.client.0.vm07.stdout:4/548: dread d3/d11/d2b/d37/db6/fb1 [0,4194304] 0 2026-03-09T19:27:44.120 INFO:tasks.workunit.client.0.vm07.stdout:9/574: dread - d0/db/d29/d2c/fb6 zero size 2026-03-09T19:27:44.122 INFO:tasks.workunit.client.1.vm08.stdout:9/770: truncate d0/d1b/f7c 2339858 0 2026-03-09T19:27:44.123 
INFO:tasks.workunit.client.1.vm08.stdout:9/771: chown d0/d2/d80/d69 53 1 2026-03-09T19:27:44.127 INFO:tasks.workunit.client.1.vm08.stdout:9/772: dwrite d0/d1b/d97/d48/d6f/f84 [4194304,4194304] 0 2026-03-09T19:27:44.129 INFO:tasks.workunit.client.1.vm08.stdout:1/892: write d9/da/d53/d67/fc4 [2868007,72563] 0 2026-03-09T19:27:44.131 INFO:tasks.workunit.client.0.vm07.stdout:8/590: read - d7/d30/d75/dcc/fbb zero size 2026-03-09T19:27:44.132 INFO:tasks.workunit.client.1.vm08.stdout:3/831: symlink d0/d6/de/d1b/d16/d17/dac/l10c 0 2026-03-09T19:27:44.133 INFO:tasks.workunit.client.0.vm07.stdout:0/543: truncate d0/d6/d13/d1c/d11/d56/f7f 4050615 0 2026-03-09T19:27:44.134 INFO:tasks.workunit.client.1.vm08.stdout:3/832: chown d0/d6/de/d1b/d16/d17/ca7 2970 1 2026-03-09T19:27:44.138 INFO:tasks.workunit.client.1.vm08.stdout:9/773: sync 2026-03-09T19:27:44.140 INFO:tasks.workunit.client.1.vm08.stdout:1/893: dwrite d9/d11/d7a/d89/d8d/da3/fde [0,4194304] 0 2026-03-09T19:27:44.171 INFO:tasks.workunit.client.1.vm08.stdout:0/784: write dd/d22/d27/d2e/db0/fbc [56532,32999] 0 2026-03-09T19:27:44.171 INFO:tasks.workunit.client.1.vm08.stdout:1/894: dread d9/d40/f57 [0,4194304] 0 2026-03-09T19:27:44.175 INFO:tasks.workunit.client.0.vm07.stdout:6/517: truncate d0/fa3 4538385 0 2026-03-09T19:27:44.185 INFO:tasks.workunit.client.1.vm08.stdout:2/686: mknod d3/d4/d23/d2c/d39/d5e/de/d18/d99/ce3 0 2026-03-09T19:27:44.190 INFO:tasks.workunit.client.1.vm08.stdout:5/742: rename d16/c59 to d16/d1e/d3b/cec 0 2026-03-09T19:27:44.195 INFO:tasks.workunit.client.1.vm08.stdout:8/767: fsync de/d47/dfd/d99/dde/ffb 0 2026-03-09T19:27:44.196 INFO:tasks.workunit.client.1.vm08.stdout:8/768: truncate de/d47/dfd/d99/da5/db3/fe5 4648516 0 2026-03-09T19:27:44.199 INFO:tasks.workunit.client.0.vm07.stdout:1/609: symlink d1/d91/lc7 0 2026-03-09T19:27:44.201 INFO:tasks.workunit.client.1.vm08.stdout:4/746: mknod da/d10/d16/d28/cdd 0 2026-03-09T19:27:44.202 INFO:tasks.workunit.client.0.vm07.stdout:2/624: rename d3/d49/ddc 
to d3/dd/d16/d30/da7/dad/ddd 0 2026-03-09T19:27:44.203 INFO:tasks.workunit.client.0.vm07.stdout:7/565: mknod d0/d4/d5/d8/d41/d64/d79/cbf 0 2026-03-09T19:27:44.204 INFO:tasks.workunit.client.1.vm08.stdout:3/833: creat d0/d4b/f10d x:0 0 0 2026-03-09T19:27:44.206 INFO:tasks.workunit.client.0.vm07.stdout:5/529: mknod d3/d1a/d28/d6c/d72/d8f/ca1 0 2026-03-09T19:27:44.206 INFO:tasks.workunit.client.0.vm07.stdout:9/575: creat d0/db/d29/d32/d5c/d69/fc9 x:0 0 0 2026-03-09T19:27:44.211 INFO:tasks.workunit.client.0.vm07.stdout:8/591: creat d7/d16/dcf/fd5 x:0 0 0 2026-03-09T19:27:44.211 INFO:tasks.workunit.client.1.vm08.stdout:1/895: mknod d9/d11/d7a/d89/d8d/daa/c116 0 2026-03-09T19:27:44.214 INFO:tasks.workunit.client.0.vm07.stdout:1/610: fdatasync d1/d3/d21/f5f 0 2026-03-09T19:27:44.215 INFO:tasks.workunit.client.1.vm08.stdout:2/687: truncate d3/d4/d23/d2c/d39/d5e/de/f17 2449758 0 2026-03-09T19:27:44.217 INFO:tasks.workunit.client.1.vm08.stdout:6/793: rename d3/d34/d3b/cde to d3/dbc/c128 0 2026-03-09T19:27:44.227 INFO:tasks.workunit.client.1.vm08.stdout:0/785: dwrite dd/d31/f54 [0,4194304] 0 2026-03-09T19:27:44.245 INFO:tasks.workunit.client.1.vm08.stdout:5/743: dwrite d16/d1e/d6e/f72 [4194304,4194304] 0 2026-03-09T19:27:44.246 INFO:tasks.workunit.client.0.vm07.stdout:0/544: rename d0/d6/d13/d17/d19/f53 to d0/d6/d13/d1c/d52/d81/fb0 0 2026-03-09T19:27:44.254 INFO:tasks.workunit.client.1.vm08.stdout:5/744: read d16/f4d [402353,96049] 0 2026-03-09T19:27:44.255 INFO:tasks.workunit.client.0.vm07.stdout:3/623: link d1/d89/fc7 d1/d6/d4c/fc8 0 2026-03-09T19:27:44.255 INFO:tasks.workunit.client.1.vm08.stdout:8/769: mkdir de/d47/d85/d10f 0 2026-03-09T19:27:44.256 INFO:tasks.workunit.client.0.vm07.stdout:7/566: chown d0/d4/d5/lb5 27 1 2026-03-09T19:27:44.256 INFO:tasks.workunit.client.1.vm08.stdout:7/831: symlink d5/d14/d27/d78/dc7/dce/d120/l121 0 2026-03-09T19:27:44.258 INFO:tasks.workunit.client.0.vm07.stdout:4/549: fdatasync d3/d11/d29/db9/d22/d86/f97 0 2026-03-09T19:27:44.259 
INFO:tasks.workunit.client.1.vm08.stdout:3/834: symlink d0/d52/l10e 0 2026-03-09T19:27:44.260 INFO:tasks.workunit.client.1.vm08.stdout:9/774: mknod d0/d2/d14/d98/d99/dd8/c100 0 2026-03-09T19:27:44.260 INFO:tasks.workunit.client.1.vm08.stdout:9/775: chown d0/d1b/d97/d48/d5d/ddf/fd9 129348011 1 2026-03-09T19:27:44.261 INFO:tasks.workunit.client.1.vm08.stdout:9/776: chown d0/d2/d80/de5/lbe 4370 1 2026-03-09T19:27:44.262 INFO:tasks.workunit.client.0.vm07.stdout:8/592: creat d7/d9/d37/d45/d4f/db1/fd6 x:0 0 0 2026-03-09T19:27:44.262 INFO:tasks.workunit.client.1.vm08.stdout:6/794: mknod d3/db/d43/c129 0 2026-03-09T19:27:44.292 INFO:tasks.workunit.client.0.vm07.stdout:3/624: mknod d1/d3d/d47/cc9 0 2026-03-09T19:27:44.292 INFO:tasks.workunit.client.0.vm07.stdout:4/550: fdatasync d3/d11/d2b/d37/db6/fb1 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.0.vm07.stdout:5/530: mkdir d3/dd/d95/da2 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.0.vm07.stdout:5/531: write d3/d1a/d28/d40/f49 [3527191,83683] 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.0.vm07.stdout:4/551: dread d3/d11/d29/f3c [0,4194304] 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.0.vm07.stdout:6/518: mknod d0/d4e/dae/daf/cc8 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.0.vm07.stdout:7/567: symlink d0/d4/d5/d26/lc0 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.0.vm07.stdout:7/568: read d0/f13 [6038417,127870] 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.0.vm07.stdout:7/569: stat d0/d4 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.0.vm07.stdout:7/570: write d0/d4/d5/d8/d41/d64/fbb [654017,67775] 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.1.vm08.stdout:1/896: rename d9/d11/c27 to d9/d40/d49/c117 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.1.vm08.stdout:0/786: symlink dd/d22/d24/d49/d50/d78/db4/lfb 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.1.vm08.stdout:8/770: read de/d1d/d21/d73/fa7 [282872,121033] 0 2026-03-09T19:27:44.293 
INFO:tasks.workunit.client.1.vm08.stdout:4/747: fdatasync da/d10/f13 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.1.vm08.stdout:2/688: symlink d3/d4/d23/d2c/d39/d5e/db8/le4 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.1.vm08.stdout:1/897: chown d9/l59 4019 1 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.1.vm08.stdout:0/787: dread - dd/d22/d24/fdb zero size 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.1.vm08.stdout:0/788: chown dd/f6d 236 1 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.1.vm08.stdout:0/789: write dd/d7e/f8e [4274044,9742] 0 2026-03-09T19:27:44.293 INFO:tasks.workunit.client.1.vm08.stdout:8/771: chown de/d1d/c2b 104813 1 2026-03-09T19:27:44.295 INFO:tasks.workunit.client.0.vm07.stdout:0/545: dread d0/d6/d13/d1c/d50/f85 [0,4194304] 0 2026-03-09T19:27:44.302 INFO:tasks.workunit.client.1.vm08.stdout:2/689: rename d3/d9/d26/f69 to d3/d4/d3e/d4e/d88/fe5 0 2026-03-09T19:27:44.302 INFO:tasks.workunit.client.1.vm08.stdout:2/690: write d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f3a [1776758,74836] 0 2026-03-09T19:27:44.302 INFO:tasks.workunit.client.0.vm07.stdout:2/625: rename d3/dd/d16/d29/d2d/d45/d3b/c68 to d3/dd/d16/d29/d2d/d45/cde 0 2026-03-09T19:27:44.303 INFO:tasks.workunit.client.0.vm07.stdout:3/625: symlink d1/d3d/d47/db3/dc2/lca 0 2026-03-09T19:27:44.303 INFO:tasks.workunit.client.0.vm07.stdout:4/552: creat d3/d4f/d56/d5f/fc2 x:0 0 0 2026-03-09T19:27:44.303 INFO:tasks.workunit.client.0.vm07.stdout:3/626: fdatasync d1/d6/d4c/d97/fc0 0 2026-03-09T19:27:44.303 INFO:tasks.workunit.client.0.vm07.stdout:2/626: dread d3/dd/d16/d29/d2d/d45/d3b/dae/fbb [0,4194304] 0 2026-03-09T19:27:44.304 INFO:tasks.workunit.client.1.vm08.stdout:8/772: dread de/d1d/d69/f8f [0,4194304] 0 2026-03-09T19:27:44.305 INFO:tasks.workunit.client.0.vm07.stdout:0/546: creat d0/d6/d13/d17/d19/d57/fb1 x:0 0 0 2026-03-09T19:27:44.305 INFO:tasks.workunit.client.1.vm08.stdout:6/795: mkdir d3/db/d12a 0 2026-03-09T19:27:44.306 
INFO:tasks.workunit.client.1.vm08.stdout:1/898: fdatasync d9/da/d12/d91/dc5/fd7 0 2026-03-09T19:27:44.310 INFO:tasks.workunit.client.0.vm07.stdout:6/519: link d0/dbf/c51 d0/d1/db/d17/dc4/cc9 0 2026-03-09T19:27:44.310 INFO:tasks.workunit.client.0.vm07.stdout:4/553: rename d3/d11/d2b/d37/db6/fb1 to d3/d4f/d56/d5f/fc3 0 2026-03-09T19:27:44.310 INFO:tasks.workunit.client.1.vm08.stdout:1/899: chown d9/d11/d7a/d89/lb0 4706 1 2026-03-09T19:27:44.310 INFO:tasks.workunit.client.1.vm08.stdout:0/790: creat dd/d22/d24/d49/d50/db3/ffc x:0 0 0 2026-03-09T19:27:44.311 INFO:tasks.workunit.client.0.vm07.stdout:4/554: write d3/f8d [3332466,99961] 0 2026-03-09T19:27:44.312 INFO:tasks.workunit.client.0.vm07.stdout:4/555: chown d3/d11/d2b 500641196 1 2026-03-09T19:27:44.312 INFO:tasks.workunit.client.0.vm07.stdout:8/593: sync 2026-03-09T19:27:44.313 INFO:tasks.workunit.client.0.vm07.stdout:5/532: sync 2026-03-09T19:27:44.316 INFO:tasks.workunit.client.0.vm07.stdout:2/627: rmdir d3/dd/d16/d29/d3c/d4c 39 2026-03-09T19:27:44.318 INFO:tasks.workunit.client.0.vm07.stdout:6/520: fdatasync d0/d1/db/d17/dc4/d7b/fac 0 2026-03-09T19:27:44.322 INFO:tasks.workunit.client.0.vm07.stdout:7/571: link d0/d4/d5/d8/f37 d0/d4/d5/d8/d41/d64/fc1 0 2026-03-09T19:27:44.322 INFO:tasks.workunit.client.0.vm07.stdout:4/556: fsync d3/f1a 0 2026-03-09T19:27:44.323 INFO:tasks.workunit.client.0.vm07.stdout:4/557: chown d3/d11/d29/db9/l3a 7 1 2026-03-09T19:27:44.324 INFO:tasks.workunit.client.1.vm08.stdout:1/900: dwrite d9/da/dc/f2e [0,4194304] 0 2026-03-09T19:27:44.328 INFO:tasks.workunit.client.0.vm07.stdout:8/594: mknod d7/d9/d10/d44/cd7 0 2026-03-09T19:27:44.329 INFO:tasks.workunit.client.1.vm08.stdout:5/745: getdents d16/d1e/d30/d8a 0 2026-03-09T19:27:44.329 INFO:tasks.workunit.client.1.vm08.stdout:5/746: stat d16/d1e/dc9 0 2026-03-09T19:27:44.329 INFO:tasks.workunit.client.1.vm08.stdout:5/747: chown d16/f4e 11 1 2026-03-09T19:27:44.338 INFO:tasks.workunit.client.0.vm07.stdout:5/533: symlink d3/dd/d95/la3 0 
2026-03-09T19:27:44.339 INFO:tasks.workunit.client.0.vm07.stdout:2/628: stat d3/dd/d16/d29/d3c/d4c/c6f 0 2026-03-09T19:27:44.340 INFO:tasks.workunit.client.1.vm08.stdout:8/773: mkdir de/d25/d110 0 2026-03-09T19:27:44.341 INFO:tasks.workunit.client.0.vm07.stdout:6/521: truncate d0/f8b 698769 0 2026-03-09T19:27:44.343 INFO:tasks.workunit.client.1.vm08.stdout:4/748: getdents da/d10/d16/d28/d2f/d4f/d56 0 2026-03-09T19:27:44.344 INFO:tasks.workunit.client.1.vm08.stdout:2/691: link d3/d9/f84 d3/d4/d23/d2c/fe6 0 2026-03-09T19:27:44.348 INFO:tasks.workunit.client.1.vm08.stdout:5/748: rename d16/d1e/c38 to d16/d1e/d6e/dcd/ced 0 2026-03-09T19:27:44.350 INFO:tasks.workunit.client.1.vm08.stdout:4/749: fdatasync da/d10/d26/d27/f96 0 2026-03-09T19:27:44.352 INFO:tasks.workunit.client.1.vm08.stdout:5/749: dwrite d16/d1e/f5b [4194304,4194304] 0 2026-03-09T19:27:44.354 INFO:tasks.workunit.client.1.vm08.stdout:0/791: getdents dd/d22/de1 0 2026-03-09T19:27:44.364 INFO:tasks.workunit.client.1.vm08.stdout:1/901: sync 2026-03-09T19:27:44.368 INFO:tasks.workunit.client.1.vm08.stdout:5/750: stat d16/l23 0 2026-03-09T19:27:44.382 INFO:tasks.workunit.client.1.vm08.stdout:0/792: dread dd/d22/d27/d4f/f97 [0,4194304] 0 2026-03-09T19:27:44.385 INFO:tasks.workunit.client.0.vm07.stdout:9/576: dwrite d0/db/d29/d2c/f43 [0,4194304] 0 2026-03-09T19:27:44.386 INFO:tasks.workunit.client.1.vm08.stdout:7/832: dwrite d5/fc [4194304,4194304] 0 2026-03-09T19:27:44.386 INFO:tasks.workunit.client.0.vm07.stdout:9/577: fdatasync d0/db/d29/d32/d5c/d69/fc9 0 2026-03-09T19:27:44.387 INFO:tasks.workunit.client.0.vm07.stdout:9/578: chown d0/d6/d73/l77 0 1 2026-03-09T19:27:44.388 INFO:tasks.workunit.client.0.vm07.stdout:9/579: chown d0/db/d29/d32/d5c/d69/f8d 232530 1 2026-03-09T19:27:44.388 INFO:tasks.workunit.client.1.vm08.stdout:3/835: dwrite d0/d6/de/d15/fa4 [0,4194304] 0 2026-03-09T19:27:44.389 INFO:tasks.workunit.client.1.vm08.stdout:3/836: stat d0/d52/d6d/d77/d88/df7 0 2026-03-09T19:27:44.394 
INFO:tasks.workunit.client.0.vm07.stdout:9/580: dwrite d0/db/d29/d4d/fa5 [0,4194304] 0 2026-03-09T19:27:44.405 INFO:tasks.workunit.client.1.vm08.stdout:9/777: dwrite d0/d2/d14/d98/d99/fd4 [0,4194304] 0 2026-03-09T19:27:44.407 INFO:tasks.workunit.client.1.vm08.stdout:9/778: write d0/d2/f2a [958884,19153] 0 2026-03-09T19:27:44.407 INFO:tasks.workunit.client.1.vm08.stdout:9/779: chown d0/d1b/d97/fca 5 1 2026-03-09T19:27:44.409 INFO:tasks.workunit.client.1.vm08.stdout:9/780: read d0/d2/d14/d98/dbb/fe1 [490965,5008] 0 2026-03-09T19:27:44.417 INFO:tasks.workunit.client.0.vm07.stdout:1/611: dwrite d1/d3e/d5c/fa7 [0,4194304] 0 2026-03-09T19:27:44.429 INFO:tasks.workunit.client.1.vm08.stdout:0/793: symlink dd/d9d/lfd 0 2026-03-09T19:27:44.432 INFO:tasks.workunit.client.1.vm08.stdout:7/833: fsync d5/d14/d38/dad/fc1 0 2026-03-09T19:27:44.432 INFO:tasks.workunit.client.0.vm07.stdout:3/627: dwrite d1/d6/dd/f2b [0,4194304] 0 2026-03-09T19:27:44.433 INFO:tasks.workunit.client.0.vm07.stdout:3/628: chown d1/d74/f6e 400259 1 2026-03-09T19:27:44.446 INFO:tasks.workunit.client.0.vm07.stdout:0/547: dwrite d0/d6/d13/d1c/d11/f2e [0,4194304] 0 2026-03-09T19:27:44.454 INFO:tasks.workunit.client.1.vm08.stdout:3/837: symlink d0/d6/de/d6e/d51/d92/l10f 0 2026-03-09T19:27:44.468 INFO:tasks.workunit.client.1.vm08.stdout:6/796: dwrite d3/d34/d3b/f58 [0,4194304] 0 2026-03-09T19:27:44.468 INFO:tasks.workunit.client.1.vm08.stdout:6/797: dwrite d3/fe4 [0,4194304] 0 2026-03-09T19:27:44.468 INFO:tasks.workunit.client.1.vm08.stdout:6/798: chown d3/d34/d5c/c9f 842 1 2026-03-09T19:27:44.468 INFO:tasks.workunit.client.1.vm08.stdout:6/799: stat d3/d94/def/f110 0 2026-03-09T19:27:44.479 INFO:tasks.workunit.client.1.vm08.stdout:5/751: mknod d16/d1e/d30/cee 0 2026-03-09T19:27:44.480 INFO:tasks.workunit.client.1.vm08.stdout:0/794: truncate dd/d22/d27/d2e/db0/fb2 689226 0 2026-03-09T19:27:44.485 INFO:tasks.workunit.client.0.vm07.stdout:7/572: dread d0/f6c [0,4194304] 0 2026-03-09T19:27:44.490 
INFO:tasks.workunit.client.0.vm07.stdout:5/534: fsync d3/d1a/d28/d40/f46 0 2026-03-09T19:27:44.501 INFO:tasks.workunit.client.1.vm08.stdout:5/752: mkdir d16/d1e/d6e/dcd/def 0 2026-03-09T19:27:44.501 INFO:tasks.workunit.client.0.vm07.stdout:4/558: dread d3/d11/d16/f82 [0,4194304] 0 2026-03-09T19:27:44.501 INFO:tasks.workunit.client.0.vm07.stdout:1/612: fdatasync d1/d3/f4 0 2026-03-09T19:27:44.502 INFO:tasks.workunit.client.0.vm07.stdout:8/595: dread d7/d9/d37/d45/d56/d62/f64 [0,4194304] 0 2026-03-09T19:27:44.504 INFO:tasks.workunit.client.1.vm08.stdout:5/753: dwrite d16/f17 [0,4194304] 0 2026-03-09T19:27:44.512 INFO:tasks.workunit.client.0.vm07.stdout:3/629: dread - d1/d6/dd/f8a zero size 2026-03-09T19:27:44.519 INFO:tasks.workunit.client.1.vm08.stdout:3/838: symlink d0/d6/l110 0 2026-03-09T19:27:44.519 INFO:tasks.workunit.client.1.vm08.stdout:1/902: getdents d9/d11/d7a/d89/de7 0 2026-03-09T19:27:44.519 INFO:tasks.workunit.client.1.vm08.stdout:0/795: mknod dd/d22/d27/cfe 0 2026-03-09T19:27:44.519 INFO:tasks.workunit.client.0.vm07.stdout:0/548: unlink d0/f93 0 2026-03-09T19:27:44.519 INFO:tasks.workunit.client.0.vm07.stdout:7/573: dread d0/d52/d54/f5e [0,4194304] 0 2026-03-09T19:27:44.519 INFO:tasks.workunit.client.0.vm07.stdout:5/535: dread d3/d1a/d28/f3c [0,4194304] 0 2026-03-09T19:27:44.519 INFO:tasks.workunit.client.0.vm07.stdout:2/629: fsync d3/dd/d16/d29/d2d/d45/d3b/dae/fda 0 2026-03-09T19:27:44.521 INFO:tasks.workunit.client.0.vm07.stdout:1/613: dread d1/d11/d37/d5d/d50/f63 [0,4194304] 0 2026-03-09T19:27:44.522 INFO:tasks.workunit.client.0.vm07.stdout:1/614: fsync d1/d3e/db3/d6d/fac 0 2026-03-09T19:27:44.524 INFO:tasks.workunit.client.1.vm08.stdout:5/754: creat d16/d45/daf/ff0 x:0 0 0 2026-03-09T19:27:44.525 INFO:tasks.workunit.client.0.vm07.stdout:4/559: fsync d3/d4f/d56/d5f/f7b 0 2026-03-09T19:27:44.526 INFO:tasks.workunit.client.0.vm07.stdout:9/581: truncate d0/f56 1607941 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.1.vm08.stdout:6/800: creat 
d3/d34/d6f/f12b x:0 0 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.1.vm08.stdout:5/755: mknod d16/d1e/d9f/cf1 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.1.vm08.stdout:3/839: unlink d0/d6/de/d1b/d16/d17/f3f 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.1.vm08.stdout:6/801: unlink d3/d55/c78 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.1.vm08.stdout:5/756: rename d16/d1e/d3b/c4f to d16/d1e/d3b/cf2 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.1.vm08.stdout:3/840: chown d0/d6/de/d15/ca2 265076 1 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.1.vm08.stdout:3/841: creat d0/d6/de/d15/d96/df5/f111 x:0 0 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.0.vm07.stdout:5/536: symlink d3/dd/d26/d2d/d60/la4 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.0.vm07.stdout:5/537: readlink d3/dd/d26/d3f/l97 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.0.vm07.stdout:1/615: mkdir d1/d3e/dc8 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.0.vm07.stdout:4/560: creat d3/d11/d29/fc4 x:0 0 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.0.vm07.stdout:9/582: stat d0/db/d29/d2c/f4a 0 2026-03-09T19:27:44.543 INFO:tasks.workunit.client.0.vm07.stdout:8/596: mkdir d7/d9/d10/dd8 0 2026-03-09T19:27:44.544 INFO:tasks.workunit.client.0.vm07.stdout:3/630: truncate d1/d1f/f36 617628 0 2026-03-09T19:27:44.544 INFO:tasks.workunit.client.0.vm07.stdout:3/631: write d1/d6/dd/f4d [2382117,37047] 0 2026-03-09T19:27:44.544 INFO:tasks.workunit.client.0.vm07.stdout:5/538: creat d3/d1a/d28/d40/fa5 x:0 0 0 2026-03-09T19:27:44.544 INFO:tasks.workunit.client.0.vm07.stdout:3/632: stat d1/d6/d45/d54 0 2026-03-09T19:27:44.547 INFO:tasks.workunit.client.0.vm07.stdout:4/561: mknod d3/d11/cc5 0 2026-03-09T19:27:44.547 INFO:tasks.workunit.client.0.vm07.stdout:0/549: getdents d0/d6/d13/d1c/d61/d69 0 2026-03-09T19:27:44.547 INFO:tasks.workunit.client.0.vm07.stdout:7/574: getdents d0/d80/db1 0 2026-03-09T19:27:44.548 
INFO:tasks.workunit.client.0.vm07.stdout:9/583: mknod d0/d6f/dc3/cca 0 2026-03-09T19:27:44.554 INFO:tasks.workunit.client.0.vm07.stdout:3/633: dwrite d1/d3d/d47/db3/dc2/d28/f34 [0,4194304] 0 2026-03-09T19:27:44.560 INFO:tasks.workunit.client.0.vm07.stdout:0/550: dwrite d0/d6/d13/d17/d19/d57/f6f [0,4194304] 0 2026-03-09T19:27:44.561 INFO:tasks.workunit.client.1.vm08.stdout:0/796: sync 2026-03-09T19:27:44.561 INFO:tasks.workunit.client.1.vm08.stdout:1/903: dread d9/da/d53/d67/f77 [0,4194304] 0 2026-03-09T19:27:44.564 INFO:tasks.workunit.client.1.vm08.stdout:0/797: creat dd/d31/fff x:0 0 0 2026-03-09T19:27:44.570 INFO:tasks.workunit.client.0.vm07.stdout:5/539: dwrite d3/f93 [0,4194304] 0 2026-03-09T19:27:44.571 INFO:tasks.workunit.client.0.vm07.stdout:4/562: creat d3/d11/d29/db9/d22/d86/fc6 x:0 0 0 2026-03-09T19:27:44.571 INFO:tasks.workunit.client.1.vm08.stdout:5/757: dread ff [0,4194304] 0 2026-03-09T19:27:44.572 INFO:tasks.workunit.client.0.vm07.stdout:1/616: link d1/db/d31/d56/c73 d1/db/d31/d4f/d7a/cc9 0 2026-03-09T19:27:44.575 INFO:tasks.workunit.client.0.vm07.stdout:2/630: read d3/d11/f2e [770711,4344] 0 2026-03-09T19:27:44.582 INFO:tasks.workunit.client.1.vm08.stdout:1/904: fdatasync d9/da/d2c/fd6 0 2026-03-09T19:27:44.585 INFO:tasks.workunit.client.0.vm07.stdout:8/597: dread d7/d9/d37/d45/d56/f5f [0,4194304] 0 2026-03-09T19:27:44.589 INFO:tasks.workunit.client.1.vm08.stdout:5/758: symlink d16/d1e/d9b/lf3 0 2026-03-09T19:27:44.591 INFO:tasks.workunit.client.1.vm08.stdout:1/905: rename d9/da/d53/lfc to d9/da/d12/l118 0 2026-03-09T19:27:44.593 INFO:tasks.workunit.client.1.vm08.stdout:0/798: mkdir dd/d22/d100 0 2026-03-09T19:27:44.594 INFO:tasks.workunit.client.0.vm07.stdout:9/584: link d0/db/d29/d32/fb9 d0/db/d29/d68/fcb 0 2026-03-09T19:27:44.595 INFO:tasks.workunit.client.1.vm08.stdout:3/842: dread d0/d52/d6d/d77/d88/fe0 [0,4194304] 0 2026-03-09T19:27:44.596 INFO:tasks.workunit.client.1.vm08.stdout:1/906: stat d9/c19 0 2026-03-09T19:27:44.598 
INFO:tasks.workunit.client.0.vm07.stdout:5/540: truncate d3/d1a/f12 3234161 0 2026-03-09T19:27:44.598 INFO:tasks.workunit.client.1.vm08.stdout:0/799: creat dd/d22/d63/d93/f101 x:0 0 0 2026-03-09T19:27:44.599 INFO:tasks.workunit.client.1.vm08.stdout:0/800: write dd/d7e/f8e [8108148,62520] 0 2026-03-09T19:27:44.611 INFO:tasks.workunit.client.1.vm08.stdout:5/759: sync 2026-03-09T19:27:44.614 INFO:tasks.workunit.client.1.vm08.stdout:8/774: dwrite de/d1d/d21/f45 [0,4194304] 0 2026-03-09T19:27:44.623 INFO:tasks.workunit.client.1.vm08.stdout:4/750: dwrite da/d10/d16/d28/d2f/d4f/f65 [0,4194304] 0 2026-03-09T19:27:44.626 INFO:tasks.workunit.client.1.vm08.stdout:2/692: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f7f [0,4194304] 0 2026-03-09T19:27:44.626 INFO:tasks.workunit.client.0.vm07.stdout:2/631: chown d3/dd/d16/d29/d2d/d45/d3b/cd5 1 1 2026-03-09T19:27:44.641 INFO:tasks.workunit.client.0.vm07.stdout:8/598: dread d7/d50/f80 [0,4194304] 0 2026-03-09T19:27:44.645 INFO:tasks.workunit.client.1.vm08.stdout:0/801: creat dd/d9d/dcc/f102 x:0 0 0 2026-03-09T19:27:44.646 INFO:tasks.workunit.client.0.vm07.stdout:0/551: symlink d0/d6/d13/da1/lb2 0 2026-03-09T19:27:44.648 INFO:tasks.workunit.client.1.vm08.stdout:5/760: fsync d16/d1e/d30/d8a/fba 0 2026-03-09T19:27:44.649 INFO:tasks.workunit.client.0.vm07.stdout:0/552: dwrite d0/d6/d13/d17/d19/d57/fb1 [0,4194304] 0 2026-03-09T19:27:44.665 INFO:tasks.workunit.client.1.vm08.stdout:8/775: dread de/d1d/f1e [0,4194304] 0 2026-03-09T19:27:44.672 INFO:tasks.workunit.client.1.vm08.stdout:1/907: creat d9/da/d2d/d4e/df4/f119 x:0 0 0 2026-03-09T19:27:44.681 INFO:tasks.workunit.client.0.vm07.stdout:4/563: link d3/d11/d2b/d38/f8a d3/d11/d29/fc7 0 2026-03-09T19:27:44.695 INFO:tasks.workunit.client.0.vm07.stdout:7/575: truncate d0/d4/d5/d26/f75 1919352 0 2026-03-09T19:27:44.699 INFO:tasks.workunit.client.1.vm08.stdout:9/781: dwrite d0/d2/d14/d5c/fc4 [0,4194304] 0 2026-03-09T19:27:44.713 INFO:tasks.workunit.client.0.vm07.stdout:6/522: dwrite d0/dbf/fa2 
[0,4194304] 0 2026-03-09T19:27:44.713 INFO:tasks.workunit.client.1.vm08.stdout:7/834: dwrite d5/d14/dae/f1f [0,4194304] 0 2026-03-09T19:27:44.726 INFO:tasks.workunit.client.1.vm08.stdout:2/693: mkdir d3/d4/d23/d2c/d39/d5e/de/d18/d1f/de7 0 2026-03-09T19:27:44.729 INFO:tasks.workunit.client.0.vm07.stdout:8/599: symlink d7/d1d/ld9 0 2026-03-09T19:27:44.730 INFO:tasks.workunit.client.1.vm08.stdout:6/802: dwrite d3/d94/f102 [0,4194304] 0 2026-03-09T19:27:44.731 INFO:tasks.workunit.client.1.vm08.stdout:9/782: sync 2026-03-09T19:27:44.731 INFO:tasks.workunit.client.1.vm08.stdout:0/802: creat dd/d22/d24/d49/d50/dd4/f103 x:0 0 0 2026-03-09T19:27:44.763 INFO:tasks.workunit.client.0.vm07.stdout:1/617: rename d1/d3e/d5c to d1/db/d31/dca 0 2026-03-09T19:27:44.766 INFO:tasks.workunit.client.0.vm07.stdout:5/541: symlink d3/d1a/d28/d40/d92/d89/la6 0 2026-03-09T19:27:44.771 INFO:tasks.workunit.client.0.vm07.stdout:7/576: mkdir d0/d4/d5/d26/db9/dc2 0 2026-03-09T19:27:44.774 INFO:tasks.workunit.client.0.vm07.stdout:8/600: symlink d7/d30/d75/lda 0 2026-03-09T19:27:44.777 INFO:tasks.workunit.client.0.vm07.stdout:3/634: truncate d1/d3d/d47/db3/d8e/da9/f93 3830683 0 2026-03-09T19:27:44.783 INFO:tasks.workunit.client.0.vm07.stdout:2/632: write d3/f27 [530309,43288] 0 2026-03-09T19:27:44.784 INFO:tasks.workunit.client.0.vm07.stdout:2/633: read - d3/f63 zero size 2026-03-09T19:27:44.784 INFO:tasks.workunit.client.0.vm07.stdout:2/634: write d3/fc [2302963,108005] 0 2026-03-09T19:27:44.786 INFO:tasks.workunit.client.0.vm07.stdout:4/564: creat d3/d11/d29/db9/db2/fc8 x:0 0 0 2026-03-09T19:27:44.792 INFO:tasks.workunit.client.0.vm07.stdout:9/585: link d0/d6/l49 d0/d6f/lcc 0 2026-03-09T19:27:44.794 INFO:tasks.workunit.client.0.vm07.stdout:0/553: write d0/f3a [3403027,55858] 0 2026-03-09T19:27:44.807 INFO:tasks.workunit.client.0.vm07.stdout:5/542: dread d3/f99 [0,4194304] 0 2026-03-09T19:27:44.808 INFO:tasks.workunit.client.0.vm07.stdout:5/543: truncate d3/d1a/d28/d40/f49 4442387 0 
2026-03-09T19:27:44.809 INFO:tasks.workunit.client.0.vm07.stdout:6/523: dread d0/dbf/d95/d31/f6c [0,4194304] 0 2026-03-09T19:27:44.811 INFO:tasks.workunit.client.0.vm07.stdout:1/618: write d1/d11/d37/d5d/d50/f6b [2044,130390] 0 2026-03-09T19:27:44.811 INFO:tasks.workunit.client.0.vm07.stdout:1/619: stat d1/d11/d37/d3f/d45/f16 0 2026-03-09T19:27:44.812 INFO:tasks.workunit.client.0.vm07.stdout:7/577: symlink d0/d4/d5/d8/d1a/lc3 0 2026-03-09T19:27:44.815 INFO:tasks.workunit.client.0.vm07.stdout:4/565: rmdir d3/d11/d29 39 2026-03-09T19:27:44.824 INFO:tasks.workunit.client.0.vm07.stdout:8/601: mknod d7/d9/d10/dd8/cdb 0 2026-03-09T19:27:44.829 INFO:tasks.workunit.client.1.vm08.stdout:5/761: rename d16/d1e/dc9/dcf to d16/d1e/d8c/d99/dcc/df4 0 2026-03-09T19:27:44.830 INFO:tasks.workunit.client.1.vm08.stdout:5/762: write d16/d1e/d6e/f72 [8995051,86664] 0 2026-03-09T19:27:44.831 INFO:tasks.workunit.client.1.vm08.stdout:5/763: read d16/d1e/d3b/f68 [2685168,55437] 0 2026-03-09T19:27:44.836 INFO:tasks.workunit.client.1.vm08.stdout:3/843: creat d0/d52/d7c/f112 x:0 0 0 2026-03-09T19:27:44.836 INFO:tasks.workunit.client.0.vm07.stdout:6/524: creat d0/d4e/d7f/fca x:0 0 0 2026-03-09T19:27:44.837 INFO:tasks.workunit.client.1.vm08.stdout:1/908: creat d9/da/d12/f11a x:0 0 0 2026-03-09T19:27:44.837 INFO:tasks.workunit.client.0.vm07.stdout:6/525: dread - d0/d4e/f78 zero size 2026-03-09T19:27:44.837 INFO:tasks.workunit.client.0.vm07.stdout:6/526: stat d0/d2d/f4a 0 2026-03-09T19:27:44.838 INFO:tasks.workunit.client.0.vm07.stdout:2/635: mkdir d3/dd/d16/d29/d2d/d45/d3b/d44/d96/ddf 0 2026-03-09T19:27:44.839 INFO:tasks.workunit.client.0.vm07.stdout:1/620: mkdir d1/d11/d37/dcb 0 2026-03-09T19:27:44.840 INFO:tasks.workunit.client.0.vm07.stdout:7/578: mkdir d0/d52/d54/dc4 0 2026-03-09T19:27:44.840 INFO:tasks.workunit.client.0.vm07.stdout:4/566: readlink d3/d11/l2a 0 2026-03-09T19:27:44.845 INFO:tasks.workunit.client.0.vm07.stdout:8/602: creat d7/d9/d10/d44/fdc x:0 0 0 2026-03-09T19:27:44.845 
INFO:tasks.workunit.client.1.vm08.stdout:2/694: dread d3/d9/d26/f6a [0,4194304] 0 2026-03-09T19:27:44.853 INFO:tasks.workunit.client.0.vm07.stdout:8/603: dread d7/f2e [0,4194304] 0 2026-03-09T19:27:44.855 INFO:tasks.workunit.client.1.vm08.stdout:7/835: dwrite d5/d14/d2b/d5d/ff7 [0,4194304] 0 2026-03-09T19:27:44.856 INFO:tasks.workunit.client.1.vm08.stdout:7/836: stat d5/d14/d27/d78/dc7/fcf 0 2026-03-09T19:27:44.868 INFO:tasks.workunit.client.0.vm07.stdout:3/635: link d1/d6/dd/fb0 d1/d3d/d47/db3/d87/fcb 0 2026-03-09T19:27:44.869 INFO:tasks.workunit.client.0.vm07.stdout:9/586: rename d0/db/d29/d32/fb9 to d0/dc1/fcd 0 2026-03-09T19:27:44.871 INFO:tasks.workunit.client.0.vm07.stdout:0/554: unlink d0/d6/d13/d1c/d11/d56/d62/c6e 0 2026-03-09T19:27:44.872 INFO:tasks.workunit.client.0.vm07.stdout:5/544: mknod d3/dd/d26/d2d/d79/d9f/ca7 0 2026-03-09T19:27:44.872 INFO:tasks.workunit.client.0.vm07.stdout:0/555: write d0/d6/d13/d17/d19/d57/d6a/f7a [4325790,32839] 0 2026-03-09T19:27:44.874 INFO:tasks.workunit.client.0.vm07.stdout:6/527: creat d0/d1/db/d1d/fcb x:0 0 0 2026-03-09T19:27:44.879 INFO:tasks.workunit.client.1.vm08.stdout:5/764: rename d16/d1e/d30 to d16/d45/daf/df5 0 2026-03-09T19:27:44.879 INFO:tasks.workunit.client.1.vm08.stdout:8/776: unlink de/d25/c60 0 2026-03-09T19:27:44.879 INFO:tasks.workunit.client.0.vm07.stdout:2/636: unlink d3/dd/d16/d29/d2d/d45/d8b/c65 0 2026-03-09T19:27:44.879 INFO:tasks.workunit.client.0.vm07.stdout:1/621: read d1/d3/d21/f5f [251045,71591] 0 2026-03-09T19:27:44.887 INFO:tasks.workunit.client.0.vm07.stdout:7/579: dread d0/d4/d5/d26/d3c/f60 [0,4194304] 0 2026-03-09T19:27:44.887 INFO:tasks.workunit.client.1.vm08.stdout:3/844: symlink d0/d6/d93/l113 0 2026-03-09T19:27:44.889 INFO:tasks.workunit.client.0.vm07.stdout:8/604: fsync d7/d9/d37/d45/d56/f5f 0 2026-03-09T19:27:44.889 INFO:tasks.workunit.client.0.vm07.stdout:3/636: unlink d1/d3d/d47/c83 0 2026-03-09T19:27:44.890 INFO:tasks.workunit.client.1.vm08.stdout:4/751: creat da/d10/d16/d28/fde 
x:0 0 0 2026-03-09T19:27:44.891 INFO:tasks.workunit.client.0.vm07.stdout:9/587: mknod d0/db/d29/da8/cce 0 2026-03-09T19:27:44.894 INFO:tasks.workunit.client.1.vm08.stdout:7/837: mkdir d5/d14/d27/d78/dc7/dce/d120/d122 0 2026-03-09T19:27:44.894 INFO:tasks.workunit.client.0.vm07.stdout:0/556: chown d0/d6/d13/d1c/d50/d92/c98 2 1 2026-03-09T19:27:44.894 INFO:tasks.workunit.client.1.vm08.stdout:7/838: dread - d5/d14/dae/d1c/d73/fbe zero size 2026-03-09T19:27:44.895 INFO:tasks.workunit.client.0.vm07.stdout:5/545: read d3/d1a/f12 [174098,41670] 0 2026-03-09T19:27:44.897 INFO:tasks.workunit.client.0.vm07.stdout:1/622: symlink d1/d3e/lcc 0 2026-03-09T19:27:44.897 INFO:tasks.workunit.client.1.vm08.stdout:9/783: truncate d0/d1b/f7c 1336821 0 2026-03-09T19:27:44.899 INFO:tasks.workunit.client.1.vm08.stdout:9/784: read d0/d1b/d97/f63 [1375712,51376] 0 2026-03-09T19:27:44.900 INFO:tasks.workunit.client.1.vm08.stdout:9/785: stat d0/d1b/d97/d48/d6f/fdb 0 2026-03-09T19:27:44.900 INFO:tasks.workunit.client.1.vm08.stdout:9/786: stat d0/d2/d80/d69/cfa 0 2026-03-09T19:27:44.901 INFO:tasks.workunit.client.1.vm08.stdout:9/787: chown d0/d1b/d68/d7f/fa0 2575 1 2026-03-09T19:27:44.903 INFO:tasks.workunit.client.0.vm07.stdout:4/567: creat d3/d11/d29/db9/d22/d86/fc9 x:0 0 0 2026-03-09T19:27:44.906 INFO:tasks.workunit.client.0.vm07.stdout:2/637: dread d3/dd/d16/d29/d3c/d5a/fbe [0,4194304] 0 2026-03-09T19:27:44.907 INFO:tasks.workunit.client.1.vm08.stdout:6/803: truncate d3/db/fb0 4314123 0 2026-03-09T19:27:44.908 INFO:tasks.workunit.client.0.vm07.stdout:7/580: fdatasync d0/d4/d5/d26/d32/f7c 0 2026-03-09T19:27:44.910 INFO:tasks.workunit.client.0.vm07.stdout:3/637: mknod d1/d89/ccc 0 2026-03-09T19:27:44.910 INFO:tasks.workunit.client.1.vm08.stdout:8/777: rename de/d25/d31/f36 to de/d1d/d2e/f111 0 2026-03-09T19:27:44.914 INFO:tasks.workunit.client.0.vm07.stdout:9/588: rename d0/d17/l27 to d0/db/d29/d32/d5c/d80/dad/lcf 0 2026-03-09T19:27:44.915 INFO:tasks.workunit.client.0.vm07.stdout:5/546: 
symlink d3/dd/d26/d2d/la8 0 2026-03-09T19:27:44.916 INFO:tasks.workunit.client.0.vm07.stdout:1/623: unlink d1/db/d31/d56/f97 0 2026-03-09T19:27:44.916 INFO:tasks.workunit.client.0.vm07.stdout:1/624: readlink d1/d3/d21/l4b 0 2026-03-09T19:27:44.916 INFO:tasks.workunit.client.1.vm08.stdout:4/752: readlink da/d10/d16/d28/d2f/d4f/d64/l8e 0 2026-03-09T19:27:44.921 INFO:tasks.workunit.client.0.vm07.stdout:2/638: mknod d3/dd/d16/d29/d2d/d45/d3b/dae/ce0 0 2026-03-09T19:27:44.924 INFO:tasks.workunit.client.1.vm08.stdout:0/803: rmdir dd/dc6 0 2026-03-09T19:27:44.924 INFO:tasks.workunit.client.1.vm08.stdout:0/804: fsync dd/d22/d63/d93/f101 0 2026-03-09T19:27:44.929 INFO:tasks.workunit.client.0.vm07.stdout:0/557: rename d0/d6/f79 to d0/d6/d13/d1c/d11/d56/d78/fb3 0 2026-03-09T19:27:44.931 INFO:tasks.workunit.client.0.vm07.stdout:9/589: rmdir d0/db/d29/d68/d99 39 2026-03-09T19:27:44.931 INFO:tasks.workunit.client.1.vm08.stdout:3/845: creat d0/d6/de/d1b/d16/d17/d107/f114 x:0 0 0 2026-03-09T19:27:44.931 INFO:tasks.workunit.client.1.vm08.stdout:1/909: creat d9/d11/d7a/d89/d8d/f11b x:0 0 0 2026-03-09T19:27:44.932 INFO:tasks.workunit.client.1.vm08.stdout:6/804: sync 2026-03-09T19:27:44.934 INFO:tasks.workunit.client.0.vm07.stdout:5/547: creat d3/d1a/d28/d40/d92/fa9 x:0 0 0 2026-03-09T19:27:44.934 INFO:tasks.workunit.client.1.vm08.stdout:4/753: truncate da/d10/d16/d28/d46/fbe 744913 0 2026-03-09T19:27:44.934 INFO:tasks.workunit.client.0.vm07.stdout:5/548: read - d3/d1a/d28/d6c/d72/f9d zero size 2026-03-09T19:27:44.935 INFO:tasks.workunit.client.0.vm07.stdout:3/638: dread d1/d3d/d47/f62 [0,4194304] 0 2026-03-09T19:27:44.937 INFO:tasks.workunit.client.1.vm08.stdout:0/805: truncate dd/d22/d27/d4f/fd7 3392810 0 2026-03-09T19:27:44.938 INFO:tasks.workunit.client.1.vm08.stdout:0/806: stat dd/d22/d27/fc8 0 2026-03-09T19:27:44.941 INFO:tasks.workunit.client.0.vm07.stdout:4/568: write d3/f1a [3149283,71759] 0 2026-03-09T19:27:44.943 INFO:tasks.workunit.client.1.vm08.stdout:2/695: dwrite 
d3/d4/f91 [0,4194304] 0 2026-03-09T19:27:44.954 INFO:tasks.workunit.client.1.vm08.stdout:2/696: dread d3/d4/d23/d2c/d39/d5e/d14/f78 [0,4194304] 0 2026-03-09T19:27:44.955 INFO:tasks.workunit.client.1.vm08.stdout:7/839: dwrite d5/d14/d27/d78/dc7/fcf [0,4194304] 0 2026-03-09T19:27:44.969 INFO:tasks.workunit.client.1.vm08.stdout:5/765: dwrite d16/d1e/f2e [0,4194304] 0 2026-03-09T19:27:44.969 INFO:tasks.workunit.client.0.vm07.stdout:8/605: dwrite d7/d1d/f3f [0,4194304] 0 2026-03-09T19:27:44.970 INFO:tasks.workunit.client.1.vm08.stdout:8/778: write de/d47/dfd/d99/da5/db3/f9e [848336,126032] 0 2026-03-09T19:27:44.974 INFO:tasks.workunit.client.0.vm07.stdout:8/606: dwrite d7/d9/d37/d45/d4f/db1/fd6 [0,4194304] 0 2026-03-09T19:27:44.974 INFO:tasks.workunit.client.0.vm07.stdout:8/607: chown d7/d50/c53 88337 1 2026-03-09T19:27:44.986 INFO:tasks.workunit.client.1.vm08.stdout:1/910: creat d9/da/d2d/d4e/df4/f11c x:0 0 0 2026-03-09T19:27:44.987 INFO:tasks.workunit.client.0.vm07.stdout:2/639: write d3/dd/d16/d29/d2d/d45/dc3/fb8 [440732,59799] 0 2026-03-09T19:27:44.987 INFO:tasks.workunit.client.0.vm07.stdout:6/528: rename d0/dbf/c8c to d0/d1/d28/d76/ccc 0 2026-03-09T19:27:44.987 INFO:tasks.workunit.client.0.vm07.stdout:2/640: write d3/dd/f73 [2453629,111441] 0 2026-03-09T19:27:44.987 INFO:tasks.workunit.client.0.vm07.stdout:0/558: unlink d0/d6/d13/d17/d19/d57/fb1 0 2026-03-09T19:27:44.987 INFO:tasks.workunit.client.0.vm07.stdout:0/559: read d0/d6/d13/d17/d19/d57/f6f [3121246,59652] 0 2026-03-09T19:27:44.987 INFO:tasks.workunit.client.0.vm07.stdout:9/590: chown d0/db/d29/d32/d5c/d80/dad/lcf 25 1 2026-03-09T19:27:44.993 INFO:tasks.workunit.client.1.vm08.stdout:6/805: unlink d3/d15/f6a 0 2026-03-09T19:27:44.994 INFO:tasks.workunit.client.0.vm07.stdout:9/591: dwrite d0/db/d29/fb3 [0,4194304] 0 2026-03-09T19:27:44.995 INFO:tasks.workunit.client.0.vm07.stdout:9/592: write d0/d6/d3a/f89 [838858,82314] 0 2026-03-09T19:27:45.004 INFO:tasks.workunit.client.1.vm08.stdout:9/788: creat 
d0/d2/d80/de5/da2/da8/de8/f101 x:0 0 0 2026-03-09T19:27:45.004 INFO:tasks.workunit.client.1.vm08.stdout:0/807: unlink dd/d22/d24/d49/d50/f95 0 2026-03-09T19:27:45.004 INFO:tasks.workunit.client.0.vm07.stdout:9/593: chown d0/d6/d73/dbe 33 1 2026-03-09T19:27:45.004 INFO:tasks.workunit.client.0.vm07.stdout:9/594: chown d0/d6f/dc3 442614 1 2026-03-09T19:27:45.004 INFO:tasks.workunit.client.0.vm07.stdout:9/595: chown d0/db/d29/d4d/c66 233 1 2026-03-09T19:27:45.004 INFO:tasks.workunit.client.0.vm07.stdout:5/549: read d3/f4e [906183,114672] 0 2026-03-09T19:27:45.004 INFO:tasks.workunit.client.0.vm07.stdout:4/569: fdatasync d3/d11/d29/db9/d22/f24 0 2026-03-09T19:27:45.005 INFO:tasks.workunit.client.1.vm08.stdout:1/911: sync 2026-03-09T19:27:45.005 INFO:tasks.workunit.client.1.vm08.stdout:1/912: readlink d9/da/dc/l28 0 2026-03-09T19:27:45.008 INFO:tasks.workunit.client.0.vm07.stdout:5/550: dwrite d3/d1a/d28/d6c/d72/f9a [0,4194304] 0 2026-03-09T19:27:45.018 INFO:tasks.workunit.client.0.vm07.stdout:7/581: rename d0/d4/d5/d26/db6 to d0/d4/d5/d8/d1a/d2a/dc5 0 2026-03-09T19:27:45.018 INFO:tasks.workunit.client.0.vm07.stdout:7/582: chown d0/d4/d5/d8/fa3 20569616 1 2026-03-09T19:27:45.024 INFO:tasks.workunit.client.0.vm07.stdout:6/529: symlink d0/d1/db/d17/dc4/d7b/lcd 0 2026-03-09T19:27:45.024 INFO:tasks.workunit.client.1.vm08.stdout:8/779: dread - de/d47/fe8 zero size 2026-03-09T19:27:45.026 INFO:tasks.workunit.client.1.vm08.stdout:8/780: dread de/f19 [0,4194304] 0 2026-03-09T19:27:45.031 INFO:tasks.workunit.client.0.vm07.stdout:2/641: unlink d3/dd/d16/d29/d2d/d45/d3b/l77 0 2026-03-09T19:27:45.032 INFO:tasks.workunit.client.1.vm08.stdout:3/846: symlink d0/d6/de/d6e/l115 0 2026-03-09T19:27:45.036 INFO:tasks.workunit.client.1.vm08.stdout:6/806: symlink d3/dbc/deb/l12c 0 2026-03-09T19:27:45.040 INFO:tasks.workunit.client.1.vm08.stdout:4/754: mkdir da/d10/d16/d28/d46/d52/d6e/ddf 0 2026-03-09T19:27:45.041 INFO:tasks.workunit.client.1.vm08.stdout:4/755: write da/f21 [1543583,71837] 0 
2026-03-09T19:27:45.043 INFO:tasks.workunit.client.0.vm07.stdout:9/596: symlink d0/db/d29/d32/d5c/d80/dad/ld0 0 2026-03-09T19:27:45.044 INFO:tasks.workunit.client.1.vm08.stdout:0/808: mkdir dd/d22/de1/d104 0 2026-03-09T19:27:45.049 INFO:tasks.workunit.client.1.vm08.stdout:1/913: dread d9/da/d2c/f8a [0,4194304] 0 2026-03-09T19:27:45.054 INFO:tasks.workunit.client.0.vm07.stdout:5/551: creat d3/dd/d26/d2d/faa x:0 0 0 2026-03-09T19:27:45.054 INFO:tasks.workunit.client.0.vm07.stdout:3/639: rename d1/d3d/d47/db3/d8e/ca3 to d1/d6/d45/ccd 0 2026-03-09T19:27:45.054 INFO:tasks.workunit.client.0.vm07.stdout:2/642: symlink d3/dd/d16/d29/d2d/le1 0 2026-03-09T19:27:45.055 INFO:tasks.workunit.client.0.vm07.stdout:6/530: read d0/d1/db/f15 [8305270,67326] 0 2026-03-09T19:27:45.057 INFO:tasks.workunit.client.1.vm08.stdout:9/789: mknod d0/d2/d14/c102 0 2026-03-09T19:27:45.060 INFO:tasks.workunit.client.0.vm07.stdout:7/583: dread d0/d4/d5/d8/f15 [0,4194304] 0 2026-03-09T19:27:45.060 INFO:tasks.workunit.client.1.vm08.stdout:9/790: dwrite d0/d1b/d97/d48/d6f/f84 [4194304,4194304] 0 2026-03-09T19:27:45.072 INFO:tasks.workunit.client.1.vm08.stdout:0/809: rename dd/d22/d7b to dd/d22/d24/d49/d50/d105 0 2026-03-09T19:27:45.072 INFO:tasks.workunit.client.1.vm08.stdout:0/810: chown fc 0 1 2026-03-09T19:27:45.076 INFO:tasks.workunit.client.1.vm08.stdout:1/914: mkdir d9/da/d12/d91/dc5/d11d 0 2026-03-09T19:27:45.076 INFO:tasks.workunit.client.0.vm07.stdout:4/570: read d3/d4f/f5e [178922,5128] 0 2026-03-09T19:27:45.078 INFO:tasks.workunit.client.0.vm07.stdout:3/640: mknod d1/d6/d45/cce 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.1.vm08.stdout:4/756: fdatasync da/d10/d26/d3a/f51 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.1.vm08.stdout:4/757: write da/f21 [1637174,40510] 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.1.vm08.stdout:9/791: symlink d0/d2/d80/de5/da2/l103 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.1.vm08.stdout:6/807: creat d3/db/f12d x:0 0 0 
2026-03-09T19:27:45.098 INFO:tasks.workunit.client.1.vm08.stdout:4/758: read da/d10/d16/d28/d2f/d4f/f83 [2820176,89049] 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.1.vm08.stdout:4/759: dwrite da/d10/d16/d28/d46/d52/d6e/fd1 [0,4194304] 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.1.vm08.stdout:3/847: link d0/d52/d6d/d77/lab d0/d6/de/d1b/d16/l116 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.0.vm07.stdout:2/643: symlink d3/dd/d16/d29/d3c/d5a/le2 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.0.vm07.stdout:7/584: rename d0/d4/la5 to d0/d52/d54/d5a/lc6 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.0.vm07.stdout:2/644: dwrite d3/dd/d16/d30/f67 [0,4194304] 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.0.vm07.stdout:9/597: creat d0/d6f/d86/fd1 x:0 0 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.0.vm07.stdout:6/531: creat d0/d1/db/d17/dc4/dc7/fce x:0 0 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.0.vm07.stdout:6/532: readlink d0/d1/d28/da9/lb9 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.0.vm07.stdout:2/645: rename d3/dd/d16/d29/d2d/d45/d3b/dae/fbb to d3/dd/d16/d29/d2d/d45/d3b/d44/d96/ddf/fe3 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.0.vm07.stdout:5/552: creat d3/dd/fab x:0 0 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.0.vm07.stdout:9/598: creat d0/db/d29/d2c/d36/d7d/fd2 x:0 0 0 2026-03-09T19:27:45.098 INFO:tasks.workunit.client.0.vm07.stdout:3/641: creat d1/d6/dd/dbf/fcf x:0 0 0 2026-03-09T19:27:45.099 INFO:tasks.workunit.client.0.vm07.stdout:6/533: unlink d0/d1/db/l6d 0 2026-03-09T19:27:45.099 INFO:tasks.workunit.client.1.vm08.stdout:3/848: chown d0/d6/de/d1b 21 1 2026-03-09T19:27:45.103 INFO:tasks.workunit.client.0.vm07.stdout:9/599: fsync d0/d6/f20 0 2026-03-09T19:27:45.105 INFO:tasks.workunit.client.1.vm08.stdout:6/808: readlink d3/d55/l87 0 2026-03-09T19:27:45.106 INFO:tasks.workunit.client.1.vm08.stdout:0/811: sync 2026-03-09T19:27:45.110 
INFO:tasks.workunit.client.0.vm07.stdout:5/553: rename d3/dd/d26/d3f/l6d to d3/dd/d26/d2d/d79/lac 0 2026-03-09T19:27:45.124 INFO:tasks.workunit.client.1.vm08.stdout:9/792: mknod d0/d2/d80/de5/da2/da8/de8/c104 0 2026-03-09T19:27:45.124 INFO:tasks.workunit.client.1.vm08.stdout:9/793: creat d0/d2/d80/de5/f105 x:0 0 0 2026-03-09T19:27:45.124 INFO:tasks.workunit.client.1.vm08.stdout:1/915: getdents d9/da 0 2026-03-09T19:27:45.124 INFO:tasks.workunit.client.0.vm07.stdout:6/534: unlink d0/d13/f3f 0 2026-03-09T19:27:45.124 INFO:tasks.workunit.client.0.vm07.stdout:1/625: dwrite d1/d11/d37/d3f/f4a [0,4194304] 0 2026-03-09T19:27:45.124 INFO:tasks.workunit.client.0.vm07.stdout:1/626: chown d1/d11/f42 696580 1 2026-03-09T19:27:45.124 INFO:tasks.workunit.client.0.vm07.stdout:9/600: mkdir d0/d6/d3a/dd3 0 2026-03-09T19:27:45.124 INFO:tasks.workunit.client.1.vm08.stdout:1/916: write d9/d11/f10e [631143,39989] 0 2026-03-09T19:27:45.129 INFO:tasks.workunit.client.0.vm07.stdout:6/535: chown d0/d1/c5 4478507 1 2026-03-09T19:27:45.130 INFO:tasks.workunit.client.0.vm07.stdout:2/646: creat d3/dd/d16/d29/d3c/fe4 x:0 0 0 2026-03-09T19:27:45.155 INFO:tasks.workunit.client.1.vm08.stdout:9/794: mkdir d0/d1b/d97/d106 0 2026-03-09T19:27:45.155 INFO:tasks.workunit.client.1.vm08.stdout:4/760: dread da/f1d [4194304,4194304] 0 2026-03-09T19:27:45.155 INFO:tasks.workunit.client.1.vm08.stdout:1/917: mknod d9/da/d53/db3/c11e 0 2026-03-09T19:27:45.155 INFO:tasks.workunit.client.1.vm08.stdout:9/795: symlink d0/d2/d80/de5/da2/da8/l107 0 2026-03-09T19:27:45.155 INFO:tasks.workunit.client.1.vm08.stdout:1/918: fdatasync d9/da/d2d/f41 0 2026-03-09T19:27:45.155 INFO:tasks.workunit.client.1.vm08.stdout:4/761: rename da/d10/d26/c7b to da/d10/d1b/ce0 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.1.vm08.stdout:4/762: readlink da/d10/d26/d3a/l4e 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.1.vm08.stdout:1/919: symlink d9/da/d2d/l11f 0 2026-03-09T19:27:45.156 
INFO:tasks.workunit.client.1.vm08.stdout:1/920: rename d9/d11/d7a/ff1 to d9/d11/f120 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.1.vm08.stdout:1/921: creat d9/da/d53/d67/d6c/f121 x:0 0 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:9/601: symlink d0/db/d29/d2c/d36/d7d/ld4 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:3/642: creat d1/d1f/fd0 x:0 0 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:2/647: unlink d3/dd/daa/fb1 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:9/602: rmdir d0/db/d9e 39 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:9/603: readlink d0/d17/l44 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:9/604: dread - d0/db/d29/d2c/fb6 zero size 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:9/605: fdatasync d0/d6/ff 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:3/643: mkdir d1/d6/d45/d54/dd1 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:6/536: creat d0/d1/db/d17/dc4/d7b/da0/fcf x:0 0 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:2/648: creat d3/dd/d16/d29/d2d/d45/d3b/fe5 x:0 0 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:2/649: fdatasync d3/dd/d16/d29/d2d/d45/d3b/fe5 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:3/644: write d1/d3d/d47/db3/dc2/d28/d7c/fbd [313112,67573] 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:2/650: truncate d3/f7c 349680 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:6/537: mknod d0/d1/cd0 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:9/606: link d0/db/d29/d4d/c4e d0/db/d29/d2c/d36/d5a/cd5 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:9/607: mknod d0/d6f/dc3/cd6 0 2026-03-09T19:27:45.156 INFO:tasks.workunit.client.0.vm07.stdout:2/651: getdents d3/dd/d16/d30/da7/dad 0 2026-03-09T19:27:45.157 
INFO:tasks.workunit.client.0.vm07.stdout:4/571: sync 2026-03-09T19:27:45.157 INFO:tasks.workunit.client.0.vm07.stdout:7/585: sync 2026-03-09T19:27:45.158 INFO:tasks.workunit.client.0.vm07.stdout:4/572: fdatasync d3/d4f/d56/d5f/fc2 0 2026-03-09T19:27:45.163 INFO:tasks.workunit.client.0.vm07.stdout:9/608: dread d0/d17/f1f [0,4194304] 0 2026-03-09T19:27:45.163 INFO:tasks.workunit.client.0.vm07.stdout:1/627: read d1/db/d31/d4f/f79 [659023,68572] 0 2026-03-09T19:27:45.165 INFO:tasks.workunit.client.1.vm08.stdout:0/812: dread dd/d22/d27/d2e/f39 [0,4194304] 0 2026-03-09T19:27:45.169 INFO:tasks.workunit.client.0.vm07.stdout:3/645: dread d1/d74/f31 [0,4194304] 0 2026-03-09T19:27:45.180 INFO:tasks.workunit.client.1.vm08.stdout:5/766: write d16/d1e/d9f/fc1 [554637,40469] 0 2026-03-09T19:27:45.180 INFO:tasks.workunit.client.1.vm08.stdout:2/697: dwrite d3/d9/d26/f52 [0,4194304] 0 2026-03-09T19:27:45.180 INFO:tasks.workunit.client.1.vm08.stdout:7/840: dwrite d5/d14/dae/d1c/d73/fbe [0,4194304] 0 2026-03-09T19:27:45.180 INFO:tasks.workunit.client.0.vm07.stdout:3/646: fdatasync d1/d6/dd/f57 0 2026-03-09T19:27:45.180 INFO:tasks.workunit.client.0.vm07.stdout:3/647: dwrite d1/d6/dd/f57 [4194304,4194304] 0 2026-03-09T19:27:45.180 INFO:tasks.workunit.client.0.vm07.stdout:8/608: dwrite d7/d30/d75/dcc/fbb [0,4194304] 0 2026-03-09T19:27:45.195 INFO:tasks.workunit.client.0.vm07.stdout:0/560: write d0/f3d [5174935,125454] 0 2026-03-09T19:27:45.212 INFO:tasks.workunit.client.1.vm08.stdout:9/796: getdents d0/d2/d14 0 2026-03-09T19:27:45.213 INFO:tasks.workunit.client.1.vm08.stdout:7/841: sync 2026-03-09T19:27:45.220 INFO:tasks.workunit.client.0.vm07.stdout:4/573: dwrite d3/d11/f6c [0,4194304] 0 2026-03-09T19:27:45.220 INFO:tasks.workunit.client.1.vm08.stdout:8/781: write de/d25/d31/d82/fa9 [2139266,65405] 0 2026-03-09T19:27:45.221 INFO:tasks.workunit.client.1.vm08.stdout:8/782: dread - de/d1d/fb0 zero size 2026-03-09T19:27:45.228 INFO:tasks.workunit.client.0.vm07.stdout:2/652: dread 
d3/dd/d16/d29/f91 [0,4194304] 0 2026-03-09T19:27:45.236 INFO:tasks.workunit.client.0.vm07.stdout:1/628: creat d1/d11/d37/d3f/d7e/dad/fcd x:0 0 0 2026-03-09T19:27:45.237 INFO:tasks.workunit.client.0.vm07.stdout:3/648: dread d1/d3d/f5e [0,4194304] 0 2026-03-09T19:27:45.239 INFO:tasks.workunit.client.1.vm08.stdout:8/783: read de/d1d/d21/f45 [4179491,46569] 0 2026-03-09T19:27:45.240 INFO:tasks.workunit.client.1.vm08.stdout:2/698: mknod d3/d4/ce8 0 2026-03-09T19:27:45.247 INFO:tasks.workunit.client.0.vm07.stdout:4/574: rmdir d3/d11/d29/db9/d22 39 2026-03-09T19:27:45.247 INFO:tasks.workunit.client.0.vm07.stdout:4/575: dread - d3/d11/d16/fae zero size 2026-03-09T19:27:45.247 INFO:tasks.workunit.client.0.vm07.stdout:9/609: rename d0/db/d29/d2c/c6a to d0/d6f/dc3/cd7 0 2026-03-09T19:27:45.249 INFO:tasks.workunit.client.0.vm07.stdout:9/610: dread - d0/db/d29/fc5 zero size 2026-03-09T19:27:45.253 INFO:tasks.workunit.client.0.vm07.stdout:4/576: dwrite d3/d11/d2b/f98 [0,4194304] 0 2026-03-09T19:27:45.262 INFO:tasks.workunit.client.1.vm08.stdout:9/797: creat d0/d2/d80/de5/da2/da8/de8/f108 x:0 0 0 2026-03-09T19:27:45.270 INFO:tasks.workunit.client.1.vm08.stdout:7/842: getdents d5/d14/dae/d3a/d42/d85 0 2026-03-09T19:27:45.275 INFO:tasks.workunit.client.0.vm07.stdout:2/653: dread d3/dd/d16/d30/f3a [4194304,4194304] 0 2026-03-09T19:27:45.278 INFO:tasks.workunit.client.1.vm08.stdout:6/809: write d3/d94/def/dc4/f104 [4047732,24721] 0 2026-03-09T19:27:45.278 INFO:tasks.workunit.client.1.vm08.stdout:6/810: chown d3/db/d43/d69/da0/faf 1934333737 1 2026-03-09T19:27:45.282 INFO:tasks.workunit.client.1.vm08.stdout:2/699: creat d3/d4/d23/d2c/d39/d5e/de/fe9 x:0 0 0 2026-03-09T19:27:45.282 INFO:tasks.workunit.client.1.vm08.stdout:2/700: chown d3/d9/d79/d46/d8c/d92/lb1 217673 1 2026-03-09T19:27:45.283 INFO:tasks.workunit.client.0.vm07.stdout:5/554: write f2 [4711973,114426] 0 2026-03-09T19:27:45.283 INFO:tasks.workunit.client.1.vm08.stdout:2/701: fdatasync d3/d4/fa7 0 2026-03-09T19:27:45.283 
INFO:tasks.workunit.client.1.vm08.stdout:2/702: fdatasync d3/d9/fdd 0 2026-03-09T19:27:45.285 INFO:tasks.workunit.client.1.vm08.stdout:9/798: mkdir d0/d1b/d68/d7f/de6/d109 0 2026-03-09T19:27:45.286 INFO:tasks.workunit.client.1.vm08.stdout:3/849: write d0/d52/d6d/d77/d88/faf [1018884,54758] 0 2026-03-09T19:27:45.287 INFO:tasks.workunit.client.0.vm07.stdout:4/577: rmdir d3/d11/d29/db9/db2 39 2026-03-09T19:27:45.294 INFO:tasks.workunit.client.0.vm07.stdout:9/611: truncate d0/db/d29/d68/f6b 1077322 0 2026-03-09T19:27:45.296 INFO:tasks.workunit.client.0.vm07.stdout:2/654: creat d3/dd/d16/d29/d2d/d45/d3b/d44/d97/fe6 x:0 0 0 2026-03-09T19:27:45.296 INFO:tasks.workunit.client.0.vm07.stdout:2/655: chown d3/dd/c10 23169002 1 2026-03-09T19:27:45.299 INFO:tasks.workunit.client.1.vm08.stdout:4/763: dwrite da/d10/d26/f87 [0,4194304] 0 2026-03-09T19:27:45.301 INFO:tasks.workunit.client.0.vm07.stdout:5/555: fsync d3/f18 0 2026-03-09T19:27:45.304 INFO:tasks.workunit.client.0.vm07.stdout:0/561: dread d0/d6/d13/f6c [4194304,4194304] 0 2026-03-09T19:27:45.304 INFO:tasks.workunit.client.1.vm08.stdout:2/703: symlink d3/d4/d23/d2c/d39/d5e/d14/lea 0 2026-03-09T19:27:45.307 INFO:tasks.workunit.client.0.vm07.stdout:6/538: write d0/d13/f57 [429616,28560] 0 2026-03-09T19:27:45.307 INFO:tasks.workunit.client.0.vm07.stdout:6/539: chown d0/d1/db/d52 77 1 2026-03-09T19:27:45.307 INFO:tasks.workunit.client.1.vm08.stdout:9/799: mknod d0/d1b/d68/dfe/c10a 0 2026-03-09T19:27:45.308 INFO:tasks.workunit.client.0.vm07.stdout:1/629: link d1/db/d31/d4f/f79 d1/d11/d37/d5d/dc1/fce 0 2026-03-09T19:27:45.309 INFO:tasks.workunit.client.1.vm08.stdout:3/850: sync 2026-03-09T19:27:45.313 INFO:tasks.workunit.client.0.vm07.stdout:4/578: mkdir d3/d11/d16/dca 0 2026-03-09T19:27:45.313 INFO:tasks.workunit.client.1.vm08.stdout:7/843: mknod d5/c123 0 2026-03-09T19:27:45.314 INFO:tasks.workunit.client.1.vm08.stdout:0/813: write dd/d22/d24/f60 [1040753,83188] 0 2026-03-09T19:27:45.317 
INFO:tasks.workunit.client.1.vm08.stdout:6/811: symlink d3/d15/dc2/l12e 0 2026-03-09T19:27:45.319 INFO:tasks.workunit.client.0.vm07.stdout:9/612: rename d0/d6/f10 to d0/db/d29/d2c/d36/d7d/fd8 0 2026-03-09T19:27:45.325 INFO:tasks.workunit.client.1.vm08.stdout:2/704: symlink d3/d4/d23/d2c/d39/d5e/de/d18/d1f/leb 0 2026-03-09T19:27:45.325 INFO:tasks.workunit.client.1.vm08.stdout:3/851: mknod d0/d6/dad/c117 0 2026-03-09T19:27:45.325 INFO:tasks.workunit.client.0.vm07.stdout:2/656: mknod d3/dd/d16/d30/da7/ce7 0 2026-03-09T19:27:45.325 INFO:tasks.workunit.client.0.vm07.stdout:0/562: creat d0/d6/d13/d1c/d50/d92/fb4 x:0 0 0 2026-03-09T19:27:45.326 INFO:tasks.workunit.client.1.vm08.stdout:7/844: mknod d5/d14/d27/d54/dfb/c124 0 2026-03-09T19:27:45.328 INFO:tasks.workunit.client.1.vm08.stdout:6/812: mkdir d3/d15/dc2/d12f 0 2026-03-09T19:27:45.328 INFO:tasks.workunit.client.0.vm07.stdout:4/579: creat d3/d11/d51/fcb x:0 0 0 2026-03-09T19:27:45.329 INFO:tasks.workunit.client.0.vm07.stdout:4/580: chown d3/d11/d51/fcb 1881614765 1 2026-03-09T19:27:45.329 INFO:tasks.workunit.client.1.vm08.stdout:2/705: mkdir d3/d4/d23/d2c/d39/d5e/d14/dec 0 2026-03-09T19:27:45.330 INFO:tasks.workunit.client.0.vm07.stdout:6/540: rename d0/d1/db/f15 to d0/d1/d28/da8/fd1 0 2026-03-09T19:27:45.330 INFO:tasks.workunit.client.1.vm08.stdout:2/706: fdatasync d3/d4/f8 0 2026-03-09T19:27:45.330 INFO:tasks.workunit.client.0.vm07.stdout:4/581: dread d3/d4f/d56/d5f/fc3 [0,4194304] 0 2026-03-09T19:27:45.331 INFO:tasks.workunit.client.0.vm07.stdout:9/613: unlink d0/db/d29/d2c/d36/d7d/l87 0 2026-03-09T19:27:45.332 INFO:tasks.workunit.client.0.vm07.stdout:9/614: chown d0/d6/f48 408866 1 2026-03-09T19:27:45.332 INFO:tasks.workunit.client.0.vm07.stdout:9/615: readlink d0/d17/l44 0 2026-03-09T19:27:45.334 INFO:tasks.workunit.client.1.vm08.stdout:3/852: creat d0/d6/de/d15/d96/f118 x:0 0 0 2026-03-09T19:27:45.336 INFO:tasks.workunit.client.1.vm08.stdout:6/813: sync 2026-03-09T19:27:45.338 
INFO:tasks.workunit.client.1.vm08.stdout:1/922: fdatasync d9/d11/f10e 0 2026-03-09T19:27:45.338 INFO:tasks.workunit.client.1.vm08.stdout:3/853: dwrite d0/d6/de/d6e/d51/d7f/de3/fe8 [4194304,4194304] 0 2026-03-09T19:27:45.343 INFO:tasks.workunit.client.1.vm08.stdout:1/923: readlink d9/da/d53/d67/d6c/lad 0 2026-03-09T19:27:45.352 INFO:tasks.workunit.client.1.vm08.stdout:7/845: creat d5/d14/dae/d3a/d42/d85/da0/f125 x:0 0 0 2026-03-09T19:27:45.352 INFO:tasks.workunit.client.1.vm08.stdout:2/707: dread d3/d9/d79/d46/d8c/f90 [0,4194304] 0 2026-03-09T19:27:45.352 INFO:tasks.workunit.client.1.vm08.stdout:2/708: readlink d3/d4/l34 0 2026-03-09T19:27:45.361 INFO:tasks.workunit.client.1.vm08.stdout:0/814: link dd/f6d dd/d22/d24/d49/d92/f106 0 2026-03-09T19:27:45.371 INFO:tasks.workunit.client.0.vm07.stdout:7/586: dwrite d0/d4/d5/d8/d41/f89 [0,4194304] 0 2026-03-09T19:27:45.371 INFO:tasks.workunit.client.0.vm07.stdout:8/609: dwrite d7/d50/da6/fb8 [0,4194304] 0 2026-03-09T19:27:45.372 INFO:tasks.workunit.client.1.vm08.stdout:8/784: write de/d1d/d21/f72 [339253,62881] 0 2026-03-09T19:27:45.372 INFO:tasks.workunit.client.1.vm08.stdout:5/767: dwrite d16/f56 [0,4194304] 0 2026-03-09T19:27:45.372 INFO:tasks.workunit.client.1.vm08.stdout:8/785: dread de/d1d/d21/f45 [0,4194304] 0 2026-03-09T19:27:45.397 INFO:tasks.workunit.client.0.vm07.stdout:3/649: write d1/d6/dd/f3b [3674702,129799] 0 2026-03-09T19:27:45.397 INFO:tasks.workunit.client.1.vm08.stdout:1/924: fdatasync d9/da/d2c/fd6 0 2026-03-09T19:27:45.400 INFO:tasks.workunit.client.1.vm08.stdout:7/846: mknod d5/d14/d27/d78/dc7/dce/d120/c126 0 2026-03-09T19:27:45.402 INFO:tasks.workunit.client.1.vm08.stdout:2/709: unlink d3/d4/d23/d2c/d39/d5e/de/d18/cda 0 2026-03-09T19:27:45.404 INFO:tasks.workunit.client.1.vm08.stdout:0/815: creat dd/d22/d24/d49/d50/db3/f107 x:0 0 0 2026-03-09T19:27:45.405 INFO:tasks.workunit.client.1.vm08.stdout:5/768: truncate d16/d45/daf/df5/f3a 217514 0 2026-03-09T19:27:45.405 
INFO:tasks.workunit.client.1.vm08.stdout:5/769: chown c8 139 1 2026-03-09T19:27:45.410 INFO:tasks.workunit.client.1.vm08.stdout:4/764: dwrite da/d10/f13 [0,4194304] 0 2026-03-09T19:27:45.421 INFO:tasks.workunit.client.1.vm08.stdout:9/800: write d0/d2/d80/de5/da2/da8/de8/dcd/fb2 [1378701,95023] 0 2026-03-09T19:27:45.422 INFO:tasks.workunit.client.1.vm08.stdout:8/786: symlink de/d47/dfd/d99/l112 0 2026-03-09T19:27:45.423 INFO:tasks.workunit.client.1.vm08.stdout:8/787: chown de/d47/dfd/cd7 860309 1 2026-03-09T19:27:45.424 INFO:tasks.workunit.client.1.vm08.stdout:8/788: write de/d47/dfd/d99/da5/db3/f9e [961001,538] 0 2026-03-09T19:27:45.426 INFO:tasks.workunit.client.1.vm08.stdout:3/854: creat d0/d6/de/d15/d96/df5/df8/f119 x:0 0 0 2026-03-09T19:27:45.426 INFO:tasks.workunit.client.1.vm08.stdout:3/855: chown d0/d6/de/d1b/d16/d17/c47 127 1 2026-03-09T19:27:45.428 INFO:tasks.workunit.client.1.vm08.stdout:1/925: symlink d9/da/d53/d67/d6c/d76/l122 0 2026-03-09T19:27:45.431 INFO:tasks.workunit.client.1.vm08.stdout:2/710: unlink d3/d9/d79/d46/d8c/d92/lc4 0 2026-03-09T19:27:45.435 INFO:tasks.workunit.client.0.vm07.stdout:0/563: creat d0/d6/d13/d17/d19/d57/d6a/fb5 x:0 0 0 2026-03-09T19:27:45.435 INFO:tasks.workunit.client.0.vm07.stdout:1/630: symlink d1/db/d31/d4f/d7a/lcf 0 2026-03-09T19:27:45.441 INFO:tasks.workunit.client.0.vm07.stdout:8/610: creat d7/d50/da6/fdd x:0 0 0 2026-03-09T19:27:45.443 INFO:tasks.workunit.client.1.vm08.stdout:7/847: mknod d5/d14/dae/c127 0 2026-03-09T19:27:45.444 INFO:tasks.workunit.client.1.vm08.stdout:7/848: dread d5/d14/d2b/d4b/f66 [0,4194304] 0 2026-03-09T19:27:45.446 INFO:tasks.workunit.client.1.vm08.stdout:2/711: mkdir d3/d9/d26/ded 0 2026-03-09T19:27:45.446 INFO:tasks.workunit.client.1.vm08.stdout:3/856: dread d0/d4b/f74 [0,4194304] 0 2026-03-09T19:27:45.447 INFO:tasks.workunit.client.0.vm07.stdout:2/657: write d3/dd/d16/d29/d2d/d45/d85/d8a/f9e [3472283,73146] 0 2026-03-09T19:27:45.448 INFO:tasks.workunit.client.0.vm07.stdout:2/658: read - 
d3/dd/d16/d29/d2d/d45/d3b/d44/f5d zero size 2026-03-09T19:27:45.451 INFO:tasks.workunit.client.1.vm08.stdout:6/814: dwrite d3/dbc/deb/f106 [0,4194304] 0 2026-03-09T19:27:45.461 INFO:tasks.workunit.client.1.vm08.stdout:0/816: creat dd/d22/d27/d65/ddf/f108 x:0 0 0 2026-03-09T19:27:45.461 INFO:tasks.workunit.client.1.vm08.stdout:8/789: mkdir de/d113 0 2026-03-09T19:27:45.467 INFO:tasks.workunit.client.1.vm08.stdout:1/926: rename d9/da/d17/d60/la4 to d9/da/d95/l123 0 2026-03-09T19:27:45.473 INFO:tasks.workunit.client.0.vm07.stdout:5/556: write d3/dd/f23 [339440,82355] 0 2026-03-09T19:27:45.478 INFO:tasks.workunit.client.1.vm08.stdout:6/815: mkdir d3/d94/def/dc4/d130 0 2026-03-09T19:27:45.482 INFO:tasks.workunit.client.0.vm07.stdout:3/650: mknod d1/cd2 0 2026-03-09T19:27:45.482 INFO:tasks.workunit.client.0.vm07.stdout:4/582: rmdir d3/d11/d16/dca 0 2026-03-09T19:27:45.482 INFO:tasks.workunit.client.1.vm08.stdout:0/817: readlink dd/l1a 0 2026-03-09T19:27:45.482 INFO:tasks.workunit.client.1.vm08.stdout:3/857: rename d0/d6/de/d1b/c59 to d0/d6/d10a/c11a 0 2026-03-09T19:27:45.482 INFO:tasks.workunit.client.1.vm08.stdout:8/790: rename de/d47 to de/d47/d85/d10f/d114 22 2026-03-09T19:27:45.485 INFO:tasks.workunit.client.1.vm08.stdout:0/818: fdatasync dd/d22/f2b 0 2026-03-09T19:27:45.486 INFO:tasks.workunit.client.0.vm07.stdout:1/631: sync 2026-03-09T19:27:45.488 INFO:tasks.workunit.client.1.vm08.stdout:5/770: getdents d16 0 2026-03-09T19:27:45.491 INFO:tasks.workunit.client.0.vm07.stdout:5/557: dread d3/dd/d26/d3f/d47/d56/f75 [0,4194304] 0 2026-03-09T19:27:45.493 INFO:tasks.workunit.client.0.vm07.stdout:5/558: read d3/dd/f52 [1118105,75729] 0 2026-03-09T19:27:45.494 INFO:tasks.workunit.client.0.vm07.stdout:5/559: chown d3/d1a/d5a/c96 8986 1 2026-03-09T19:27:45.495 INFO:tasks.workunit.client.0.vm07.stdout:2/659: sync 2026-03-09T19:27:45.496 INFO:tasks.workunit.client.0.vm07.stdout:2/660: truncate d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4/fd1 949246 0 2026-03-09T19:27:45.496 
INFO:tasks.workunit.client.1.vm08.stdout:4/765: dread da/d10/d26/d27/d32/f45 [0,4194304] 0 2026-03-09T19:27:45.497 INFO:tasks.workunit.client.0.vm07.stdout:2/661: chown d3/dd/d16/d29/d2d/d45/d3b/d44/d96/ddf 314 1 2026-03-09T19:27:45.497 INFO:tasks.workunit.client.0.vm07.stdout:8/611: rmdir d7/d16/dcf/dd0 0 2026-03-09T19:27:45.497 INFO:tasks.workunit.client.0.vm07.stdout:4/583: fsync d3/d11/d51/f9a 0 2026-03-09T19:27:45.497 INFO:tasks.workunit.client.1.vm08.stdout:8/791: symlink de/d47/dfd/d99/dde/l115 0 2026-03-09T19:27:45.498 INFO:tasks.workunit.client.1.vm08.stdout:1/927: sync 2026-03-09T19:27:45.500 INFO:tasks.workunit.client.0.vm07.stdout:3/651: truncate d1/d1f/f36 480632 0 2026-03-09T19:27:45.500 INFO:tasks.workunit.client.0.vm07.stdout:5/560: symlink d3/d1a/d5d/lad 0 2026-03-09T19:27:45.504 INFO:tasks.workunit.client.0.vm07.stdout:2/662: creat d3/dd/fe8 x:0 0 0 2026-03-09T19:27:45.510 INFO:tasks.workunit.client.1.vm08.stdout:0/819: unlink dd/d22/d24/d49/d50/f9b 0 2026-03-09T19:27:45.514 INFO:tasks.workunit.client.1.vm08.stdout:9/801: dwrite d0/d2/d80/d69/f7a [0,4194304] 0 2026-03-09T19:27:45.515 INFO:tasks.workunit.client.0.vm07.stdout:2/663: dread d3/d11/f18 [4194304,4194304] 0 2026-03-09T19:27:45.521 INFO:tasks.workunit.client.0.vm07.stdout:7/587: dwrite d0/d4/d5/d8/d41/d64/d74/d98/f83 [0,4194304] 0 2026-03-09T19:27:45.524 INFO:tasks.workunit.client.1.vm08.stdout:2/712: dwrite d3/d4/d23/d2c/d39/d5e/d14/f78 [0,4194304] 0 2026-03-09T19:27:45.526 INFO:tasks.workunit.client.0.vm07.stdout:0/564: dwrite d0/d6/d13/d17/d19/f34 [0,4194304] 0 2026-03-09T19:27:45.526 INFO:tasks.workunit.client.0.vm07.stdout:9/616: truncate d0/dc1/f7c 2400586 0 2026-03-09T19:27:45.526 INFO:tasks.workunit.client.0.vm07.stdout:6/541: dwrite d0/d1/db/d1d/f47 [0,4194304] 0 2026-03-09T19:27:45.527 INFO:tasks.workunit.client.1.vm08.stdout:7/849: write d5/d14/dae/d3a/d42/ff2 [643661,118535] 0 2026-03-09T19:27:45.532 INFO:tasks.workunit.client.1.vm08.stdout:3/858: dwrite d0/d52/d6d/d77/d88/fe7 
[0,4194304] 0 2026-03-09T19:27:45.538 INFO:tasks.workunit.client.1.vm08.stdout:3/859: dwrite d0/d6/de/d15/d96/df5/df8/f119 [0,4194304] 0 2026-03-09T19:27:45.552 INFO:tasks.workunit.client.1.vm08.stdout:5/771: creat d16/d1e/d8c/d99/dcc/ff6 x:0 0 0 2026-03-09T19:27:45.553 INFO:tasks.workunit.client.0.vm07.stdout:1/632: mkdir d1/d11/d37/d3f/dd0 0 2026-03-09T19:27:45.554 INFO:tasks.workunit.client.0.vm07.stdout:8/612: fdatasync d7/d9/d37/d34/f91 0 2026-03-09T19:27:45.555 INFO:tasks.workunit.client.0.vm07.stdout:1/633: stat d1/d11/d37/l8d 0 2026-03-09T19:27:45.556 INFO:tasks.workunit.client.1.vm08.stdout:4/766: unlink da/d10/d26/d27/c59 0 2026-03-09T19:27:45.562 INFO:tasks.workunit.client.1.vm08.stdout:8/792: symlink de/d47/d85/l116 0 2026-03-09T19:27:45.572 INFO:tasks.workunit.client.0.vm07.stdout:3/652: mknod d1/d6/d4c/d97/cd3 0 2026-03-09T19:27:45.577 INFO:tasks.workunit.client.0.vm07.stdout:3/653: write d1/d6/d4c/f61 [1123212,806] 0 2026-03-09T19:27:45.577 INFO:tasks.workunit.client.0.vm07.stdout:5/561: creat d3/dd/d26/d2d/fae x:0 0 0 2026-03-09T19:27:45.577 INFO:tasks.workunit.client.1.vm08.stdout:6/816: creat d3/d34/f131 x:0 0 0 2026-03-09T19:27:45.577 INFO:tasks.workunit.client.1.vm08.stdout:6/817: dread d3/d94/f102 [0,4194304] 0 2026-03-09T19:27:45.577 INFO:tasks.workunit.client.1.vm08.stdout:6/818: write d3/db/f12d [300779,128829] 0 2026-03-09T19:27:45.579 INFO:tasks.workunit.client.1.vm08.stdout:0/820: truncate dd/d22/d27/f3d 60266 0 2026-03-09T19:27:45.581 INFO:tasks.workunit.client.0.vm07.stdout:2/664: unlink d3/dd/f1d 0 2026-03-09T19:27:45.583 INFO:tasks.workunit.client.1.vm08.stdout:9/802: unlink d0/ffc 0 2026-03-09T19:27:45.592 INFO:tasks.workunit.client.0.vm07.stdout:7/588: unlink d0/d4/d5/d26/d3c/d39/laa 0 2026-03-09T19:27:45.593 INFO:tasks.workunit.client.1.vm08.stdout:2/713: creat d3/d4/d23/d2c/d39/d5e/db8/fee x:0 0 0 2026-03-09T19:27:45.597 INFO:tasks.workunit.client.0.vm07.stdout:6/542: creat d0/d1/d28/da8/fd2 x:0 0 0 2026-03-09T19:27:45.598 
INFO:tasks.workunit.client.0.vm07.stdout:6/543: stat d0/dbf/f34 0 2026-03-09T19:27:45.601 INFO:tasks.workunit.client.0.vm07.stdout:0/565: rename d0/d6/d13/d1c/d11/f5f to d0/d6/d13/d1c/d11/d56/d78/fb6 0 2026-03-09T19:27:45.603 INFO:tasks.workunit.client.1.vm08.stdout:3/860: fsync d0/d52/d7c/f8f 0 2026-03-09T19:27:45.603 INFO:tasks.workunit.client.0.vm07.stdout:3/654: sync 2026-03-09T19:27:45.604 INFO:tasks.workunit.client.0.vm07.stdout:3/655: chown d1/d3d/d47/db3/dc2/d28/dc4 424477 1 2026-03-09T19:27:45.605 INFO:tasks.workunit.client.1.vm08.stdout:5/772: mknod d16/d1e/d9f/cf7 0 2026-03-09T19:27:45.611 INFO:tasks.workunit.client.0.vm07.stdout:8/613: readlink d7/d9/d10/l52 0 2026-03-09T19:27:45.614 INFO:tasks.workunit.client.0.vm07.stdout:8/614: read - d7/d1d/d83/d9f/fc6 zero size 2026-03-09T19:27:45.614 INFO:tasks.workunit.client.0.vm07.stdout:4/584: mknod d3/d11/d2b/d38/d8f/ccc 0 2026-03-09T19:27:45.614 INFO:tasks.workunit.client.0.vm07.stdout:4/585: stat d3/d11/d2b/l31 0 2026-03-09T19:27:45.614 INFO:tasks.workunit.client.0.vm07.stdout:4/586: chown d3/f13 1 1 2026-03-09T19:27:45.619 INFO:tasks.workunit.client.1.vm08.stdout:0/821: readlink dd/d22/l25 0 2026-03-09T19:27:45.623 INFO:tasks.workunit.client.1.vm08.stdout:2/714: rmdir d3/d9/d4a 39 2026-03-09T19:27:45.628 INFO:tasks.workunit.client.1.vm08.stdout:3/861: truncate d0/f7a 43879 0 2026-03-09T19:27:45.629 INFO:tasks.workunit.client.1.vm08.stdout:3/862: stat d0/d6/de/d1b/d16/d17/dac 0 2026-03-09T19:27:45.635 INFO:tasks.workunit.client.1.vm08.stdout:6/819: dread d3/d34/f37 [0,4194304] 0 2026-03-09T19:27:45.635 INFO:tasks.workunit.client.1.vm08.stdout:6/820: dread - d3/d68/fec zero size 2026-03-09T19:27:45.643 INFO:tasks.workunit.client.0.vm07.stdout:0/566: mknod d0/d6/d13/d1c/d61/cb7 0 2026-03-09T19:27:45.644 INFO:tasks.workunit.client.1.vm08.stdout:4/767: dread da/d10/d16/fbf [0,4194304] 0 2026-03-09T19:27:45.653 INFO:tasks.workunit.client.0.vm07.stdout:8/615: rename d7/d9/d10/d44/d9a/fa1 to d7/d50/da6/fde 0 
2026-03-09T19:27:45.653 INFO:tasks.workunit.client.0.vm07.stdout:8/616: chown d7/f1c 1 1 2026-03-09T19:27:45.657 INFO:tasks.workunit.client.1.vm08.stdout:6/821: truncate d3/d15/ffd 379486 0 2026-03-09T19:27:45.663 INFO:tasks.workunit.client.1.vm08.stdout:0/822: read dd/d22/d24/d49/d50/d105/f83 [78504,11026] 0 2026-03-09T19:27:45.670 INFO:tasks.workunit.client.1.vm08.stdout:4/768: rmdir da/d10/d16/d28/d46 39 2026-03-09T19:27:45.715 INFO:tasks.workunit.client.0.vm07.stdout:4/587: symlink d3/d11/d29/db9/d22/d70/d93/lcd 0 2026-03-09T19:27:45.721 INFO:tasks.workunit.client.1.vm08.stdout:8/793: rename de/d47/d85 to de/d117 0 2026-03-09T19:27:45.723 INFO:tasks.workunit.client.0.vm07.stdout:8/617: mkdir d7/d9/ddf 0 2026-03-09T19:27:45.723 INFO:tasks.workunit.client.0.vm07.stdout:8/618: stat d7/d50/da6/dc5 0 2026-03-09T19:27:45.726 INFO:tasks.workunit.client.1.vm08.stdout:1/928: dwrite d9/d40/f57 [4194304,4194304] 0 2026-03-09T19:27:45.727 INFO:tasks.workunit.client.0.vm07.stdout:4/588: chown d3/d11/d29/db9/d91/fbb 16279 1 2026-03-09T19:27:45.739 INFO:tasks.workunit.client.0.vm07.stdout:5/562: getdents d3/d1a/d28 0 2026-03-09T19:27:45.741 INFO:tasks.workunit.client.0.vm07.stdout:2/665: getdents d3/dd/d16/d30 0 2026-03-09T19:27:45.742 INFO:tasks.workunit.client.0.vm07.stdout:6/544: getdents d0/d1/db/d52/d94/d81 0 2026-03-09T19:27:45.744 INFO:tasks.workunit.client.0.vm07.stdout:8/619: readlink d7/d9/l8e 0 2026-03-09T19:27:45.745 INFO:tasks.workunit.client.1.vm08.stdout:7/850: write d5/d14/d2b/d4b/fd6 [396313,91596] 0 2026-03-09T19:27:45.748 INFO:tasks.workunit.client.0.vm07.stdout:9/617: dwrite d0/d6/fa [0,4194304] 0 2026-03-09T19:27:45.765 INFO:tasks.workunit.client.0.vm07.stdout:1/634: write d1/f51 [3714590,1358] 0 2026-03-09T19:27:45.769 INFO:tasks.workunit.client.0.vm07.stdout:6/545: read d0/d1/db/f14 [504377,58274] 0 2026-03-09T19:27:45.769 INFO:tasks.workunit.client.0.vm07.stdout:8/620: truncate d7/d9/d37/d34/f79 540963 0 2026-03-09T19:27:45.770 
INFO:tasks.workunit.client.0.vm07.stdout:6/546: truncate d0/d1/db/d17/dc4/d7b/da0/fcf 271612 0 2026-03-09T19:27:45.771 INFO:tasks.workunit.client.0.vm07.stdout:7/589: write d0/d4/d5/d26/d32/fa6 [585225,93093] 0 2026-03-09T19:27:45.773 INFO:tasks.workunit.client.1.vm08.stdout:7/851: sync 2026-03-09T19:27:45.775 INFO:tasks.workunit.client.1.vm08.stdout:5/773: write d16/d45/d81/fce [931382,58752] 0 2026-03-09T19:27:45.776 INFO:tasks.workunit.client.1.vm08.stdout:5/774: write d16/d1e/fa5 [7819481,27426] 0 2026-03-09T19:27:45.784 INFO:tasks.workunit.client.0.vm07.stdout:3/656: dwrite d1/d6/f37 [0,4194304] 0 2026-03-09T19:27:45.787 INFO:tasks.workunit.client.0.vm07.stdout:7/590: fdatasync d0/d4/d5/d26/f4a 0 2026-03-09T19:27:45.791 INFO:tasks.workunit.client.0.vm07.stdout:3/657: dwrite d1/d1f/fd0 [0,4194304] 0 2026-03-09T19:27:45.791 INFO:tasks.workunit.client.1.vm08.stdout:6/822: unlink d3/d34/d5c/f7f 0 2026-03-09T19:27:45.791 INFO:tasks.workunit.client.1.vm08.stdout:6/823: stat d3/d34/dce 0 2026-03-09T19:27:45.791 INFO:tasks.workunit.client.0.vm07.stdout:3/658: truncate d1/d6/d4c/d97/fc0 220003 0 2026-03-09T19:27:45.792 INFO:tasks.workunit.client.1.vm08.stdout:0/823: creat dd/d22/d27/d4f/f109 x:0 0 0 2026-03-09T19:27:45.803 INFO:tasks.workunit.client.0.vm07.stdout:7/591: truncate d0/d52/f62 4726292 0 2026-03-09T19:27:45.803 INFO:tasks.workunit.client.0.vm07.stdout:7/592: fsync d0/d4/d5/d8/d41/fb0 0 2026-03-09T19:27:45.804 INFO:tasks.workunit.client.0.vm07.stdout:7/593: write d0/d4/d5/d8/d41/d64/d79/fbc [27363,81045] 0 2026-03-09T19:27:45.805 INFO:tasks.workunit.client.0.vm07.stdout:6/547: dread d0/d1/d28/da8/fd1 [0,4194304] 0 2026-03-09T19:27:45.805 INFO:tasks.workunit.client.1.vm08.stdout:1/929: unlink d9/da/d53/d67/f77 0 2026-03-09T19:27:45.806 INFO:tasks.workunit.client.1.vm08.stdout:1/930: write d9/d40/f92 [4573105,130321] 0 2026-03-09T19:27:45.815 INFO:tasks.workunit.client.0.vm07.stdout:6/548: unlink d0/d1/db/d17/f1a 0 2026-03-09T19:27:45.819 
INFO:tasks.workunit.client.1.vm08.stdout:5/775: dwrite d16/d45/f65 [4194304,4194304] 0 2026-03-09T19:27:45.827 INFO:tasks.workunit.client.1.vm08.stdout:9/803: rename d0/d2/d80/de5/da2/da8/de8/fe0 to d0/d2/d14/d98/f10b 0 2026-03-09T19:27:45.827 INFO:tasks.workunit.client.1.vm08.stdout:3/863: rename d0/d6/de to d0/d6/de/d15/d96/d11b 22 2026-03-09T19:27:45.827 INFO:tasks.workunit.client.1.vm08.stdout:9/804: stat d0/d1b/f8d 0 2026-03-09T19:27:45.833 INFO:tasks.workunit.client.1.vm08.stdout:6/824: truncate d3/f6e 995988 0 2026-03-09T19:27:45.835 INFO:tasks.workunit.client.0.vm07.stdout:6/549: mkdir d0/d44/dd3 0 2026-03-09T19:27:45.839 INFO:tasks.workunit.client.0.vm07.stdout:6/550: truncate d0/dbf/d95/f35 2178178 0 2026-03-09T19:27:45.849 INFO:tasks.workunit.client.1.vm08.stdout:7/852: mkdir d5/d14/d2b/d128 0 2026-03-09T19:27:45.849 INFO:tasks.workunit.client.0.vm07.stdout:7/594: link d0/d52/d54/l5b d0/d4/d5/d26/lc7 0 2026-03-09T19:27:45.857 INFO:tasks.workunit.client.1.vm08.stdout:1/931: dread d9/da/d53/d67/d6c/d76/f99 [0,4194304] 0 2026-03-09T19:27:45.858 INFO:tasks.workunit.client.1.vm08.stdout:1/932: chown d9/da/d2d/d4e/cfa 0 1 2026-03-09T19:27:45.863 INFO:tasks.workunit.client.0.vm07.stdout:7/595: dread - d0/d4/d5/d8/fa3 zero size 2026-03-09T19:27:45.865 INFO:tasks.workunit.client.0.vm07.stdout:6/551: creat d0/d1/fd4 x:0 0 0 2026-03-09T19:27:45.883 INFO:tasks.workunit.client.1.vm08.stdout:4/769: getdents da/d10/d16/d28/d4d 0 2026-03-09T19:27:45.883 INFO:tasks.workunit.client.1.vm08.stdout:4/770: readlink da/d10/d16/d28/d2f/d4f/d64/d84/d8a/ld9 0 2026-03-09T19:27:45.887 INFO:tasks.workunit.client.1.vm08.stdout:7/853: creat d5/d14/d27/d54/dfb/d9c/dcb/dd2/f129 x:0 0 0 2026-03-09T19:27:45.890 INFO:tasks.workunit.client.1.vm08.stdout:6/825: link d3/db/d43/c81 d3/db/d12a/c132 0 2026-03-09T19:27:45.897 INFO:tasks.workunit.client.1.vm08.stdout:7/854: dread d5/d14/d27/d54/dfb/d90/f114 [4194304,4194304] 0 2026-03-09T19:27:45.899 INFO:tasks.workunit.client.1.vm08.stdout:5/776: 
unlink d16/d1e/d3b/f3c 0 2026-03-09T19:27:45.901 INFO:tasks.workunit.client.1.vm08.stdout:7/855: write d5/d14/f1e [2273199,129270] 0 2026-03-09T19:27:45.909 INFO:tasks.workunit.client.1.vm08.stdout:5/777: creat d16/d8e/ff8 x:0 0 0 2026-03-09T19:27:45.910 INFO:tasks.workunit.client.1.vm08.stdout:4/771: link da/d10/d16/d28/d2f/d4f/d64/d84/d8a/da2/ccf da/d10/d1b/ce1 0 2026-03-09T19:27:45.911 INFO:tasks.workunit.client.1.vm08.stdout:7/856: creat d5/d14/dae/d1c/d73/f12a x:0 0 0 2026-03-09T19:27:45.912 INFO:tasks.workunit.client.1.vm08.stdout:7/857: dread - d5/d14/dae/d1c/fab zero size 2026-03-09T19:27:45.913 INFO:tasks.workunit.client.0.vm07.stdout:5/563: dread d3/f68 [0,4194304] 0 2026-03-09T19:27:45.921 INFO:tasks.workunit.client.0.vm07.stdout:5/564: symlink d3/d1a/d28/d40/d92/laf 0 2026-03-09T19:27:45.924 INFO:tasks.workunit.client.1.vm08.stdout:7/858: fsync d5/d14/dae/d1c/f87 0 2026-03-09T19:27:45.930 INFO:tasks.workunit.client.0.vm07.stdout:5/565: dread d3/d1a/d28/d40/f49 [0,4194304] 0 2026-03-09T19:27:45.931 INFO:tasks.workunit.client.1.vm08.stdout:4/772: sync 2026-03-09T19:27:45.937 INFO:tasks.workunit.client.1.vm08.stdout:7/859: unlink d5/d14/dae/d3a/f117 0 2026-03-09T19:27:45.941 INFO:tasks.workunit.client.0.vm07.stdout:5/566: mknod d3/dd/d26/d2d/cb0 0 2026-03-09T19:27:45.948 INFO:tasks.workunit.client.0.vm07.stdout:5/567: symlink d3/d1a/d28/d40/lb1 0 2026-03-09T19:27:45.949 INFO:tasks.workunit.client.0.vm07.stdout:5/568: truncate d3/d1a/d28/d6c/d72/f9d 315768 0 2026-03-09T19:27:45.949 INFO:tasks.workunit.client.1.vm08.stdout:7/860: creat d5/dc4/f12b x:0 0 0 2026-03-09T19:27:45.950 INFO:tasks.workunit.client.0.vm07.stdout:5/569: chown d3/d1a/d5d/f78 1 1 2026-03-09T19:27:45.950 INFO:tasks.workunit.client.1.vm08.stdout:7/861: dread - d5/d14/d27/d78/dc7/f10a zero size 2026-03-09T19:27:45.953 INFO:tasks.workunit.client.1.vm08.stdout:5/778: link d16/d1e/d8c/d99/da8/d9a/ld9 d16/d45/daf/df5/d6f/lf9 0 2026-03-09T19:27:45.954 
INFO:tasks.workunit.client.0.vm07.stdout:2/666: dwrite d3/fa [0,4194304] 0 2026-03-09T19:27:45.954 INFO:tasks.workunit.client.0.vm07.stdout:2/667: chown d3/dd/d16/d29/f91 2 1 2026-03-09T19:27:45.957 INFO:tasks.workunit.client.1.vm08.stdout:5/779: dwrite d16/d1e/d8c/d99/da8/d9a/fe0 [0,4194304] 0 2026-03-09T19:27:45.959 INFO:tasks.workunit.client.0.vm07.stdout:2/668: dwrite d3/dd/d16/d30/f67 [4194304,4194304] 0 2026-03-09T19:27:45.985 INFO:tasks.workunit.client.1.vm08.stdout:5/780: mkdir d16/d8e/dd5/dfa 0 2026-03-09T19:27:45.990 INFO:tasks.workunit.client.0.vm07.stdout:5/570: symlink d3/d1a/lb2 0 2026-03-09T19:27:45.991 INFO:tasks.workunit.client.1.vm08.stdout:4/773: creat da/d10/d16/d28/d46/d52/d6e/d73/dca/fe2 x:0 0 0 2026-03-09T19:27:45.992 INFO:tasks.workunit.client.0.vm07.stdout:5/571: dread d3/f93 [0,4194304] 0 2026-03-09T19:27:45.993 INFO:tasks.workunit.client.0.vm07.stdout:5/572: write d3/d1a/d28/d40/fa5 [35905,20889] 0 2026-03-09T19:27:45.994 INFO:tasks.workunit.client.1.vm08.stdout:2/715: symlink d3/d4/d23/lef 0 2026-03-09T19:27:46.003 INFO:tasks.workunit.client.0.vm07.stdout:5/573: creat d3/dd/d26/d3f/fb3 x:0 0 0 2026-03-09T19:27:46.010 INFO:tasks.workunit.client.1.vm08.stdout:2/716: chown d3/d4/d23/d2c/d39/d5e/de/d8b/f7e 1108746195 1 2026-03-09T19:27:46.010 INFO:tasks.workunit.client.1.vm08.stdout:2/717: read - d3/d4/d23/d2c/d39/d5e/de/fe9 zero size 2026-03-09T19:27:46.012 INFO:tasks.workunit.client.0.vm07.stdout:5/574: creat d3/dd/d26/d3f/d47/d71/fb4 x:0 0 0 2026-03-09T19:27:46.013 INFO:tasks.workunit.client.1.vm08.stdout:7/862: link d5/d14/d38/l79 d5/d14/d27/d54/l12c 0 2026-03-09T19:27:46.017 INFO:tasks.workunit.client.1.vm08.stdout:5/781: getdents d16/d1e/d8c/d99/dcc/df4 0 2026-03-09T19:27:46.019 INFO:tasks.workunit.client.0.vm07.stdout:5/575: unlink d3/dd/d26/d2d/l37 0 2026-03-09T19:27:46.020 INFO:tasks.workunit.client.1.vm08.stdout:0/824: write dd/d22/d63/d6e/f8a [4814124,40918] 0 2026-03-09T19:27:46.023 INFO:tasks.workunit.client.1.vm08.stdout:8/794: 
rename de/d47/dfd/f88 to de/d25/d31/f118 0 2026-03-09T19:27:46.027 INFO:tasks.workunit.client.0.vm07.stdout:5/576: truncate d3/dd/f52 3649147 0 2026-03-09T19:27:46.028 INFO:tasks.workunit.client.0.vm07.stdout:5/577: write d3/d1a/d28/d6c/d72/f9a [134454,33609] 0 2026-03-09T19:27:46.034 INFO:tasks.workunit.client.0.vm07.stdout:1/635: rmdir d1/d11/d37/d3f/d45/dbf 0 2026-03-09T19:27:46.040 INFO:tasks.workunit.client.1.vm08.stdout:9/805: rename d0/d1b to d0/d1b/d97/d48/d5d/d10c 22 2026-03-09T19:27:46.043 INFO:tasks.workunit.client.0.vm07.stdout:4/589: rmdir d3/d11/d29/db9 39 2026-03-09T19:27:46.046 INFO:tasks.workunit.client.0.vm07.stdout:4/590: symlink d3/d4f/lce 0 2026-03-09T19:27:46.046 INFO:tasks.workunit.client.0.vm07.stdout:7/596: write d0/d4/d5/d26/f8e [1565180,205] 0 2026-03-09T19:27:46.047 INFO:tasks.workunit.client.0.vm07.stdout:6/552: write d0/d1/db/d17/dc4/f8f [406762,55045] 0 2026-03-09T19:27:46.053 INFO:tasks.workunit.client.0.vm07.stdout:1/636: dread d1/db/d31/f64 [0,4194304] 0 2026-03-09T19:27:46.053 INFO:tasks.workunit.client.0.vm07.stdout:3/659: rmdir d1/d3d/d47/db3 39 2026-03-09T19:27:46.054 INFO:tasks.workunit.client.1.vm08.stdout:1/933: write d9/da/d2c/d6a/f6b [1673723,114253] 0 2026-03-09T19:27:46.061 INFO:tasks.workunit.client.1.vm08.stdout:5/782: rename d16/d1e/d8c/d99/la3 to d16/d1e/d8c/d99/da8/d9a/lfb 0 2026-03-09T19:27:46.063 INFO:tasks.workunit.client.1.vm08.stdout:6/826: getdents d3/db/d12a 0 2026-03-09T19:27:46.065 INFO:tasks.workunit.client.0.vm07.stdout:7/597: unlink d0/d4/l59 0 2026-03-09T19:27:46.066 INFO:tasks.workunit.client.0.vm07.stdout:6/553: mkdir d0/d2d/dd5 0 2026-03-09T19:27:46.067 INFO:tasks.workunit.client.1.vm08.stdout:3/864: mknod d0/d6/de/d6e/d51/c11c 0 2026-03-09T19:27:46.068 INFO:tasks.workunit.client.1.vm08.stdout:3/865: dread - d0/d6/d93/dcb/fce zero size 2026-03-09T19:27:46.069 INFO:tasks.workunit.client.0.vm07.stdout:7/598: dwrite d0/d4/d5/d8/d1a/d2a/fb2 [0,4194304] 0 2026-03-09T19:27:46.069 
INFO:tasks.workunit.client.1.vm08.stdout:3/866: write d0/d52/d6d/d77/d88/fe7 [2830460,106066] 0 2026-03-09T19:27:46.069 INFO:tasks.workunit.client.1.vm08.stdout:3/867: chown d0/d6/de/d1b/d16/d17 1 1 2026-03-09T19:27:46.070 INFO:tasks.workunit.client.1.vm08.stdout:3/868: chown d0/d52/d7c/c9f 30591860 1 2026-03-09T19:27:46.079 INFO:tasks.workunit.client.1.vm08.stdout:2/718: creat d3/d4/d23/d2c/ff0 x:0 0 0 2026-03-09T19:27:46.081 INFO:tasks.workunit.client.0.vm07.stdout:1/637: unlink d1/d3/d21/f47 0 2026-03-09T19:27:46.082 INFO:tasks.workunit.client.1.vm08.stdout:1/934: truncate d9/da/f1e 4253397 0 2026-03-09T19:27:46.093 INFO:tasks.workunit.client.1.vm08.stdout:1/935: write d9/da/d53/d67/d6c/f121 [855668,125796] 0 2026-03-09T19:27:46.093 INFO:tasks.workunit.client.1.vm08.stdout:5/783: mkdir d16/d45/daf/df5/d8a/dfc 0 2026-03-09T19:27:46.093 INFO:tasks.workunit.client.1.vm08.stdout:6/827: creat d3/d94/def/f133 x:0 0 0 2026-03-09T19:27:46.094 INFO:tasks.workunit.client.1.vm08.stdout:9/806: creat d0/d1b/de9/dfd/f10d x:0 0 0 2026-03-09T19:27:46.097 INFO:tasks.workunit.client.0.vm07.stdout:3/660: truncate d1/d3d/d47/db3/d8e/da9/f75 493953 0 2026-03-09T19:27:46.107 INFO:tasks.workunit.client.1.vm08.stdout:0/825: getdents dd 0 2026-03-09T19:27:46.110 INFO:tasks.workunit.client.0.vm07.stdout:3/661: fdatasync d1/d6/dd/fb0 0 2026-03-09T19:27:46.111 INFO:tasks.workunit.client.0.vm07.stdout:2/669: write d3/dd/d16/d29/f58 [364298,6093] 0 2026-03-09T19:27:46.111 INFO:tasks.workunit.client.1.vm08.stdout:1/936: symlink d9/da/d2c/l124 0 2026-03-09T19:27:46.115 INFO:tasks.workunit.client.1.vm08.stdout:5/784: mkdir d16/d1e/d8c/d99/dcc/df4/dfd 0 2026-03-09T19:27:46.121 INFO:tasks.workunit.client.1.vm08.stdout:4/774: truncate da/d10/f77 332072 0 2026-03-09T19:27:46.123 INFO:tasks.workunit.client.0.vm07.stdout:3/662: dread - d1/d3d/d47/db3/d8e/da9/f82 zero size 2026-03-09T19:27:46.123 INFO:tasks.workunit.client.0.vm07.stdout:3/663: chown d1/d6/dd/dbf 0 1 2026-03-09T19:27:46.124 
INFO:tasks.workunit.client.1.vm08.stdout:9/807: read d0/d1b/d97/fca [138809,22926] 0 2026-03-09T19:27:46.124 INFO:tasks.workunit.client.0.vm07.stdout:3/664: read d1/d3d/d47/db3/f6b [425321,27143] 0 2026-03-09T19:27:46.130 INFO:tasks.workunit.client.0.vm07.stdout:6/554: getdents d0/d1/db/d24/da4 0 2026-03-09T19:27:46.133 INFO:tasks.workunit.client.1.vm08.stdout:7/863: dwrite d5/d14/dae/f7c [0,4194304] 0 2026-03-09T19:27:46.140 INFO:tasks.workunit.client.1.vm08.stdout:0/826: creat dd/d22/d24/d49/d50/d78/d86/f10a x:0 0 0 2026-03-09T19:27:46.145 INFO:tasks.workunit.client.0.vm07.stdout:0/567: rename d0/d6/d13/d17/c21 to d0/d6/d13/d17/d19/d57/d6a/cb8 0 2026-03-09T19:27:46.147 INFO:tasks.workunit.client.1.vm08.stdout:1/937: symlink d9/da/l125 0 2026-03-09T19:27:46.149 INFO:tasks.workunit.client.0.vm07.stdout:5/578: write d3/d1a/d5d/f78 [386876,376] 0 2026-03-09T19:27:46.150 INFO:tasks.workunit.client.1.vm08.stdout:8/795: write de/d91/fbd [1313608,16867] 0 2026-03-09T19:27:46.151 INFO:tasks.workunit.client.0.vm07.stdout:5/579: read d3/f4e [826205,66782] 0 2026-03-09T19:27:46.152 INFO:tasks.workunit.client.0.vm07.stdout:4/591: write d3/d11/d29/f9b [180464,4592] 0 2026-03-09T19:27:46.156 INFO:tasks.workunit.client.1.vm08.stdout:4/775: rename da/d10/d16/d28/d46/d52/d6e/ddf to da/d10/de3 0 2026-03-09T19:27:46.172 INFO:tasks.workunit.client.1.vm08.stdout:9/808: mkdir d0/d2/d80/de5/da2/d10e 0 2026-03-09T19:27:46.172 INFO:tasks.workunit.client.1.vm08.stdout:9/809: dwrite d0/d1b/f49 [8388608,4194304] 0 2026-03-09T19:27:46.172 INFO:tasks.workunit.client.0.vm07.stdout:8/621: rename d7/d9/d57/c5b to d7/d16/d1e/ce0 0 2026-03-09T19:27:46.172 INFO:tasks.workunit.client.0.vm07.stdout:9/618: write d0/dc1/f7c [2946789,45385] 0 2026-03-09T19:27:46.172 INFO:tasks.workunit.client.0.vm07.stdout:5/580: rmdir d3/dd/d26/d2d/d79 39 2026-03-09T19:27:46.173 INFO:tasks.workunit.client.0.vm07.stdout:6/555: sync 2026-03-09T19:27:46.180 INFO:tasks.workunit.client.1.vm08.stdout:7/864: creat 
d5/d14/d38/dad/f12d x:0 0 0 2026-03-09T19:27:46.185 INFO:tasks.workunit.client.1.vm08.stdout:3/869: dwrite d0/d6/de/d1b/d16/d17/f8c [0,4194304] 0 2026-03-09T19:27:46.193 INFO:tasks.workunit.client.1.vm08.stdout:2/719: creat d3/d4/d23/d2c/d39/d5e/ff1 x:0 0 0 2026-03-09T19:27:46.195 INFO:tasks.workunit.client.0.vm07.stdout:1/638: truncate d1/f76 544149 0 2026-03-09T19:27:46.196 INFO:tasks.workunit.client.0.vm07.stdout:1/639: stat d1/d11/d37/dcb 0 2026-03-09T19:27:46.197 INFO:tasks.workunit.client.1.vm08.stdout:6/828: write d3/d68/ff4 [461797,99767] 0 2026-03-09T19:27:46.199 INFO:tasks.workunit.client.0.vm07.stdout:2/670: getdents d3/dd/d16 0 2026-03-09T19:27:46.202 INFO:tasks.workunit.client.0.vm07.stdout:1/640: dwrite d1/f96 [0,4194304] 0 2026-03-09T19:27:46.206 INFO:tasks.workunit.client.0.vm07.stdout:7/599: rename d0/d4/d5/d26/d3c/d39/f7a to d0/d4/d5/d8/d1a/d2a/fc8 0 2026-03-09T19:27:46.207 INFO:tasks.workunit.client.0.vm07.stdout:2/671: dread d3/dd/d16/d30/f7e [0,4194304] 0 2026-03-09T19:27:46.209 INFO:tasks.workunit.client.0.vm07.stdout:2/672: truncate d3/dd/d16/d29/d2d/d45/d85/d8a/fd2 213048 0 2026-03-09T19:27:46.209 INFO:tasks.workunit.client.0.vm07.stdout:2/673: write d3/f27 [3429272,117055] 0 2026-03-09T19:27:46.229 INFO:tasks.workunit.client.0.vm07.stdout:9/619: mknod d0/db/d29/d2c/d36/d5a/cd9 0 2026-03-09T19:27:46.232 INFO:tasks.workunit.client.0.vm07.stdout:9/620: dwrite d0/d6/d3a/f89 [0,4194304] 0 2026-03-09T19:27:46.235 INFO:tasks.workunit.client.1.vm08.stdout:8/796: dread de/d25/d33/f55 [0,4194304] 0 2026-03-09T19:27:46.244 INFO:tasks.workunit.client.0.vm07.stdout:3/665: write d1/d3d/d47/db3/f49 [455585,62066] 0 2026-03-09T19:27:46.251 INFO:tasks.workunit.client.0.vm07.stdout:5/581: mkdir d3/d1a/d28/d6c/d72/db5 0 2026-03-09T19:27:46.251 INFO:tasks.workunit.client.0.vm07.stdout:5/582: fsync d3/dd/d26/d3f/d47/d56/f65 0 2026-03-09T19:27:46.251 INFO:tasks.workunit.client.0.vm07.stdout:6/556: truncate d0/f3b 4956492 0 2026-03-09T19:27:46.258 
INFO:tasks.workunit.client.1.vm08.stdout:9/810: dread d0/d2/dc8/fc9 [0,4194304] 0 2026-03-09T19:27:46.262 INFO:tasks.workunit.client.0.vm07.stdout:6/557: sync 2026-03-09T19:27:46.263 INFO:tasks.workunit.client.0.vm07.stdout:6/558: truncate d0/d1/d28/da8/fd2 241440 0 2026-03-09T19:27:46.268 INFO:tasks.workunit.client.1.vm08.stdout:7/865: dread d5/d14/d2b/daa/f10d [0,4194304] 0 2026-03-09T19:27:46.279 INFO:tasks.workunit.client.0.vm07.stdout:2/674: creat d3/dd/d16/d29/d2d/d45/d85/d8a/fe9 x:0 0 0 2026-03-09T19:27:46.283 INFO:tasks.workunit.client.1.vm08.stdout:0/827: fdatasync dd/d22/d27/d6c/fbf 0 2026-03-09T19:27:46.292 INFO:tasks.workunit.client.1.vm08.stdout:4/776: creat da/d10/d16/d28/d2f/d4f/d56/dd0/dc0/fe4 x:0 0 0 2026-03-09T19:27:46.298 INFO:tasks.workunit.client.0.vm07.stdout:4/592: creat d3/d4f/d56/fcf x:0 0 0 2026-03-09T19:27:46.302 INFO:tasks.workunit.client.1.vm08.stdout:9/811: sync 2026-03-09T19:27:46.306 INFO:tasks.workunit.client.0.vm07.stdout:0/568: rename d0/d6/d13/d17/f64 to d0/d6/d13/d1c/d61/d69/fb9 0 2026-03-09T19:27:46.312 INFO:tasks.workunit.client.0.vm07.stdout:8/622: dwrite d7/d9/d37/d34/f5a [0,4194304] 0 2026-03-09T19:27:46.314 INFO:tasks.workunit.client.0.vm07.stdout:0/569: dwrite d0/d6/d13/d1c/d61/d69/fad [4194304,4194304] 0 2026-03-09T19:27:46.319 INFO:tasks.workunit.client.1.vm08.stdout:3/870: dwrite d0/d52/d6d/f8b [0,4194304] 0 2026-03-09T19:27:46.319 INFO:tasks.workunit.client.0.vm07.stdout:6/559: mknod d0/d2d/cd6 0 2026-03-09T19:27:46.322 INFO:tasks.workunit.client.0.vm07.stdout:6/560: sync 2026-03-09T19:27:46.323 INFO:tasks.workunit.client.0.vm07.stdout:6/561: sync 2026-03-09T19:27:46.327 INFO:tasks.workunit.client.0.vm07.stdout:6/562: write d0/dbf/d95/d31/f89 [2732144,42012] 0 2026-03-09T19:27:46.342 INFO:tasks.workunit.client.1.vm08.stdout:6/829: dwrite d3/d94/def/f110 [0,4194304] 0 2026-03-09T19:27:46.350 INFO:tasks.workunit.client.1.vm08.stdout:2/720: mkdir d3/d4/d3e/df2 0 2026-03-09T19:27:46.352 
INFO:tasks.workunit.client.1.vm08.stdout:2/721: dread d3/d9/d26/f52 [0,4194304] 0 2026-03-09T19:27:46.365 INFO:tasks.workunit.client.1.vm08.stdout:0/828: creat dd/d31/dca/f10b x:0 0 0 2026-03-09T19:27:46.368 INFO:tasks.workunit.client.1.vm08.stdout:5/785: getdents d16/d1e/d8c/d99/dcc/df4 0 2026-03-09T19:27:46.378 INFO:tasks.workunit.client.0.vm07.stdout:4/593: mkdir d3/d4f/d56/d5f/d88/dd0 0 2026-03-09T19:27:46.378 INFO:tasks.workunit.client.0.vm07.stdout:1/641: rename d1/db/d31/d56/c74 to d1/db/d31/d4f/d7a/cd1 0 2026-03-09T19:27:46.390 INFO:tasks.workunit.client.0.vm07.stdout:8/623: write d7/d30/d32/f74 [919326,127672] 0 2026-03-09T19:27:46.393 INFO:tasks.workunit.client.1.vm08.stdout:3/871: mkdir d0/d52/d11d 0 2026-03-09T19:27:46.398 INFO:tasks.workunit.client.1.vm08.stdout:6/830: unlink d3/d34/d3b/c75 0 2026-03-09T19:27:46.401 INFO:tasks.workunit.client.1.vm08.stdout:6/831: dwrite d3/fe4 [0,4194304] 0 2026-03-09T19:27:46.410 INFO:tasks.workunit.client.0.vm07.stdout:0/570: dread d0/d6/d13/d17/d19/d58/fa2 [0,4194304] 0 2026-03-09T19:27:46.414 INFO:tasks.workunit.client.1.vm08.stdout:7/866: dwrite d5/d14/d2b/d4b/fdd [0,4194304] 0 2026-03-09T19:27:46.437 INFO:tasks.workunit.client.1.vm08.stdout:0/829: rename dd/d22/d63/d6e/d72/fbd to dd/d22/d63/d93/f10c 0 2026-03-09T19:27:46.438 INFO:tasks.workunit.client.0.vm07.stdout:0/571: dread d0/d6/d13/d1c/d11/d56/d78/fb6 [0,4194304] 0 2026-03-09T19:27:46.450 INFO:tasks.workunit.client.1.vm08.stdout:8/797: write de/d1d/d21/f10a [734620,46694] 0 2026-03-09T19:27:46.450 INFO:tasks.workunit.client.1.vm08.stdout:8/798: stat de/d47/faa 0 2026-03-09T19:27:46.452 INFO:tasks.workunit.client.0.vm07.stdout:2/675: dwrite d3/dd/d16/d29/d2d/d45/dc3/f9c [0,4194304] 0 2026-03-09T19:27:46.454 INFO:tasks.workunit.client.0.vm07.stdout:9/621: truncate d0/db/d29/d2c/f43 486885 0 2026-03-09T19:27:46.464 INFO:tasks.workunit.client.1.vm08.stdout:9/812: truncate d0/d2/d14/d98/dbb/fe4 5296195 0 2026-03-09T19:27:46.465 
INFO:tasks.workunit.client.1.vm08.stdout:3/872: truncate d0/d6/de/feb 1004093 0 2026-03-09T19:27:46.469 INFO:tasks.workunit.client.1.vm08.stdout:3/873: dwrite d0/d8/d24/f105 [0,4194304] 0 2026-03-09T19:27:46.479 INFO:tasks.workunit.client.1.vm08.stdout:7/867: creat d5/d14/dae/dd1/d109/d8f/f12e x:0 0 0 2026-03-09T19:27:46.483 INFO:tasks.workunit.client.1.vm08.stdout:0/830: mknod dd/d22/d24/d49/d50/d105/d82/c10d 0 2026-03-09T19:27:46.484 INFO:tasks.workunit.client.0.vm07.stdout:6/563: read d0/d2d/f88 [59486,95596] 0 2026-03-09T19:27:46.485 INFO:tasks.workunit.client.0.vm07.stdout:3/666: creat d1/d3d/d47/db3/dc2/d28/fd4 x:0 0 0 2026-03-09T19:27:46.489 INFO:tasks.workunit.client.0.vm07.stdout:3/667: dwrite d1/d6/d4c/f61 [0,4194304] 0 2026-03-09T19:27:46.490 INFO:tasks.workunit.client.0.vm07.stdout:4/594: truncate d3/d11/d2b/d37/faf 286763 0 2026-03-09T19:27:46.498 INFO:tasks.workunit.client.0.vm07.stdout:1/642: fdatasync d1/d11/d37/d3f/d45/f98 0 2026-03-09T19:27:46.501 INFO:tasks.workunit.client.1.vm08.stdout:3/874: stat d0/d6/de/d1b/d16/d17/dac/d109/fdb 0 2026-03-09T19:27:46.505 INFO:tasks.workunit.client.0.vm07.stdout:8/624: fdatasync d7/d9/d10/f20 0 2026-03-09T19:27:46.514 INFO:tasks.workunit.client.1.vm08.stdout:6/832: rename d3/d94/lc5 to d3/d34/d5c/da2/dd6/l134 0 2026-03-09T19:27:46.514 INFO:tasks.workunit.client.0.vm07.stdout:0/572: creat d0/d6/d13/d1c/d50/d92/fba x:0 0 0 2026-03-09T19:27:46.517 INFO:tasks.workunit.client.0.vm07.stdout:2/676: symlink d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4/lea 0 2026-03-09T19:27:46.518 INFO:tasks.workunit.client.1.vm08.stdout:3/875: mknod d0/d6/de/d15/d96/c11e 0 2026-03-09T19:27:46.521 INFO:tasks.workunit.client.0.vm07.stdout:6/564: creat d0/d1/db/d17/dc4/fd7 x:0 0 0 2026-03-09T19:27:46.522 INFO:tasks.workunit.client.1.vm08.stdout:2/722: rename d3/d9/d4a/f89 to d3/d4/d3e/d4e/d88/db0/ff3 0 2026-03-09T19:27:46.527 INFO:tasks.workunit.client.1.vm08.stdout:5/786: write d16/d1e/d8c/d99/da8/d9a/fac [152054,113330] 0 
2026-03-09T19:27:46.527 INFO:tasks.workunit.client.1.vm08.stdout:4/777: truncate da/d10/d16/d28/d2f/f80 1786647 0 2026-03-09T19:27:46.527 INFO:tasks.workunit.client.1.vm08.stdout:1/938: write d9/da/d17/fe1 [4694588,73179] 0 2026-03-09T19:27:46.536 INFO:tasks.workunit.client.1.vm08.stdout:8/799: dwrite de/d1d/d21/f62 [0,4194304] 0 2026-03-09T19:27:46.541 INFO:tasks.workunit.client.1.vm08.stdout:7/868: dwrite d5/d14/dae/d3a/fca [0,4194304] 0 2026-03-09T19:27:46.542 INFO:tasks.workunit.client.1.vm08.stdout:7/869: stat d5/d14/dae/d3a/d42/cec 0 2026-03-09T19:27:46.555 INFO:tasks.workunit.client.1.vm08.stdout:9/813: write d0/d2/d80/de5/da2/da8/de8/dcd/fb6 [508831,14345] 0 2026-03-09T19:27:46.558 INFO:tasks.workunit.client.1.vm08.stdout:6/833: dread d3/d15/ffd [0,4194304] 0 2026-03-09T19:27:46.559 INFO:tasks.workunit.client.1.vm08.stdout:6/834: read - d3/d68/fec zero size 2026-03-09T19:27:46.561 INFO:tasks.workunit.client.1.vm08.stdout:0/831: link dd/d22/d24/d49/d50/d105/f83 dd/d22/d63/d6e/f10e 0 2026-03-09T19:27:46.562 INFO:tasks.workunit.client.0.vm07.stdout:3/668: fdatasync d1/d3d/d47/db3/faf 0 2026-03-09T19:27:46.564 INFO:tasks.workunit.client.1.vm08.stdout:3/876: creat d0/d6/de/d6e/d51/f11f x:0 0 0 2026-03-09T19:27:46.564 INFO:tasks.workunit.client.0.vm07.stdout:5/583: getdents d3/dd/d26/d3f 0 2026-03-09T19:27:46.566 INFO:tasks.workunit.client.1.vm08.stdout:2/723: rename d3/d4/d23/d2c/d39/d5e/de/d18/d99/cb5 to d3/d9/d4a/cf4 0 2026-03-09T19:27:46.579 INFO:tasks.workunit.client.1.vm08.stdout:2/724: chown d3/d4/d23/d2c/d39/d5e/d14/c1d 120187 1 2026-03-09T19:27:46.579 INFO:tasks.workunit.client.1.vm08.stdout:2/725: readlink d3/d4/d23/d2c/d39/d5e/de/d18/d1f/ld6 0 2026-03-09T19:27:46.579 INFO:tasks.workunit.client.1.vm08.stdout:1/939: creat d9/d11/db6/f126 x:0 0 0 2026-03-09T19:27:46.579 INFO:tasks.workunit.client.1.vm08.stdout:5/787: mkdir d16/d1e/d8c/d99/dcc/dfe 0 2026-03-09T19:27:46.579 INFO:tasks.workunit.client.0.vm07.stdout:7/600: mkdir d0/d4/d5/d26/dc9 0 
2026-03-09T19:27:46.579 INFO:tasks.workunit.client.0.vm07.stdout:1/643: mkdir d1/db/d31/d4f/d7a/dd2 0 2026-03-09T19:27:46.579 INFO:tasks.workunit.client.0.vm07.stdout:8/625: mknod d7/d9/d10/dd8/ce1 0 2026-03-09T19:27:46.579 INFO:tasks.workunit.client.0.vm07.stdout:8/626: stat d7/d30/d75/dcc 0 2026-03-09T19:27:46.579 INFO:tasks.workunit.client.0.vm07.stdout:8/627: fdatasync d7/d9/d10/d44/fdc 0 2026-03-09T19:27:46.579 INFO:tasks.workunit.client.0.vm07.stdout:2/677: symlink d3/dd/d16/leb 0 2026-03-09T19:27:46.581 INFO:tasks.workunit.client.1.vm08.stdout:8/800: symlink de/d47/dfd/d99/dde/l119 0 2026-03-09T19:27:46.584 INFO:tasks.workunit.client.0.vm07.stdout:6/565: rename d0/d1/f92 to d0/dbf/fd8 0 2026-03-09T19:27:46.593 INFO:tasks.workunit.client.0.vm07.stdout:4/595: dread - d3/d11/d29/db9/db2/fc8 zero size 2026-03-09T19:27:46.603 INFO:tasks.workunit.client.1.vm08.stdout:5/788: sync 2026-03-09T19:27:46.608 INFO:tasks.workunit.client.1.vm08.stdout:9/814: write d0/d1b/d97/d48/d5d/ddf/f7d [778071,71710] 0 2026-03-09T19:27:46.610 INFO:tasks.workunit.client.1.vm08.stdout:3/877: write d0/d6/de/d1b/f7d [4240074,57690] 0 2026-03-09T19:27:46.613 INFO:tasks.workunit.client.0.vm07.stdout:7/601: write d0/d52/d54/fa4 [764859,22029] 0 2026-03-09T19:27:46.614 INFO:tasks.workunit.client.1.vm08.stdout:2/726: dwrite d3/d4/f49 [0,4194304] 0 2026-03-09T19:27:46.621 INFO:tasks.workunit.client.1.vm08.stdout:2/727: dwrite d3/d4/d23/d2c/d39/d5e/ff1 [0,4194304] 0 2026-03-09T19:27:46.622 INFO:tasks.workunit.client.1.vm08.stdout:2/728: chown d3/d4/fa7 678 1 2026-03-09T19:27:46.627 INFO:tasks.workunit.client.1.vm08.stdout:7/870: mkdir d5/d14/dae/d12f 0 2026-03-09T19:27:46.627 INFO:tasks.workunit.client.0.vm07.stdout:0/573: mknod d0/cbb 0 2026-03-09T19:27:46.637 INFO:tasks.workunit.client.0.vm07.stdout:9/622: creat d0/db/fda x:0 0 0 2026-03-09T19:27:46.640 INFO:tasks.workunit.client.1.vm08.stdout:4/778: dwrite da/d10/f77 [0,4194304] 0 2026-03-09T19:27:46.642 
INFO:tasks.workunit.client.1.vm08.stdout:0/832: mknod dd/d22/d100/c10f 0 2026-03-09T19:27:46.646 INFO:tasks.workunit.client.0.vm07.stdout:6/566: mkdir d0/dbf/dd9 0 2026-03-09T19:27:46.655 INFO:tasks.workunit.client.1.vm08.stdout:5/789: unlink d16/d45/daf/df5/d6f/c7f 0 2026-03-09T19:27:46.657 INFO:tasks.workunit.client.0.vm07.stdout:7/602: fdatasync d0/d80/f81 0 2026-03-09T19:27:46.659 INFO:tasks.workunit.client.1.vm08.stdout:3/878: mknod d0/d6/de/d6e/d51/c120 0 2026-03-09T19:27:46.660 INFO:tasks.workunit.client.1.vm08.stdout:3/879: dread - d0/d6/de/d1b/d16/d17/dac/d109/fff zero size 2026-03-09T19:27:46.662 INFO:tasks.workunit.client.1.vm08.stdout:1/940: dread d9/da/dc/f2e [0,4194304] 0 2026-03-09T19:27:46.674 INFO:tasks.workunit.client.0.vm07.stdout:1/644: dwrite d1/d11/d37/d3f/d6e/f9f [0,4194304] 0 2026-03-09T19:27:46.675 INFO:tasks.workunit.client.1.vm08.stdout:7/871: fdatasync d5/d14/d2b/f30 0 2026-03-09T19:27:46.681 INFO:tasks.workunit.client.1.vm08.stdout:2/729: dread d3/d4/d23/d2c/d39/d5e/de/f17 [0,4194304] 0 2026-03-09T19:27:46.691 INFO:tasks.workunit.client.0.vm07.stdout:3/669: creat d1/d3d/d47/db3/dc2/fd5 x:0 0 0 2026-03-09T19:27:46.695 INFO:tasks.workunit.client.1.vm08.stdout:4/779: fsync da/d10/d16/d28/d2f/d4f/d56/dd0/fc6 0 2026-03-09T19:27:46.695 INFO:tasks.workunit.client.1.vm08.stdout:4/780: chown da/d10/d16/d28/d46/d52/d6e/d40/d6c 1 1 2026-03-09T19:27:46.696 INFO:tasks.workunit.client.0.vm07.stdout:3/670: stat d1/d1f/f13 0 2026-03-09T19:27:46.696 INFO:tasks.workunit.client.0.vm07.stdout:3/671: dread d1/d1f/fd0 [0,4194304] 0 2026-03-09T19:27:46.696 INFO:tasks.workunit.client.0.vm07.stdout:3/672: stat d1/d6/dd/f3b 0 2026-03-09T19:27:46.696 INFO:tasks.workunit.client.0.vm07.stdout:3/673: truncate d1/d6/d4c/d97/fc0 874821 0 2026-03-09T19:27:46.696 INFO:tasks.workunit.client.0.vm07.stdout:3/674: readlink d1/d3d/d47/db3/l41 0 2026-03-09T19:27:46.698 INFO:tasks.workunit.client.0.vm07.stdout:7/603: symlink d0/d52/d54/lca 0 2026-03-09T19:27:46.700 
INFO:tasks.workunit.client.0.vm07.stdout:4/596: write d3/d11/d51/f8e [5088521,65250] 0 2026-03-09T19:27:46.702 INFO:tasks.workunit.client.0.vm07.stdout:8/628: dwrite d7/f1c [0,4194304] 0 2026-03-09T19:27:46.702 INFO:tasks.workunit.client.0.vm07.stdout:0/574: chown d0/d6/d13/d33/f35 21936 1 2026-03-09T19:27:46.705 INFO:tasks.workunit.client.0.vm07.stdout:2/678: dwrite d3/d49/faf [0,4194304] 0 2026-03-09T19:27:46.708 INFO:tasks.workunit.client.0.vm07.stdout:5/584: rename d3/d1a/d28/d40/f49 to d3/dd/d26/d3f/d47/fb6 0 2026-03-09T19:27:46.711 INFO:tasks.workunit.client.0.vm07.stdout:0/575: dread d0/d6/d13/d1c/d11/d56/d78/fb6 [4194304,4194304] 0 2026-03-09T19:27:46.716 INFO:tasks.workunit.client.0.vm07.stdout:5/585: dwrite d3/d1a/d28/d6c/f7a [4194304,4194304] 0 2026-03-09T19:27:46.741 INFO:tasks.workunit.client.0.vm07.stdout:0/576: truncate d0/d6/d13/d33/f35 3035375 0 2026-03-09T19:27:46.745 INFO:tasks.workunit.client.1.vm08.stdout:5/790: rename d16/d1e/d9b to d16/d8e/dff 0 2026-03-09T19:27:46.748 INFO:tasks.workunit.client.1.vm08.stdout:9/815: mknod d0/d2/c10f 0 2026-03-09T19:27:46.753 INFO:tasks.workunit.client.1.vm08.stdout:3/880: rmdir d0/d6/de/d15/d96 39 2026-03-09T19:27:46.760 INFO:tasks.workunit.client.0.vm07.stdout:9/623: write d0/db/d29/d2c/d36/d5a/fb2 [155739,24671] 0 2026-03-09T19:27:46.764 INFO:tasks.workunit.client.0.vm07.stdout:6/567: dwrite d0/dbf/d95/d31/f3c [0,4194304] 0 2026-03-09T19:27:46.768 INFO:tasks.workunit.client.1.vm08.stdout:0/833: dwrite dd/f1e [0,4194304] 0 2026-03-09T19:27:46.780 INFO:tasks.workunit.client.0.vm07.stdout:3/675: write d1/d74/f52 [532125,6349] 0 2026-03-09T19:27:46.784 INFO:tasks.workunit.client.0.vm07.stdout:4/597: write d3/d4f/f7c [2325658,20093] 0 2026-03-09T19:27:46.784 INFO:tasks.workunit.client.0.vm07.stdout:7/604: dwrite d0/d52/d54/f5e [0,4194304] 0 2026-03-09T19:27:46.786 INFO:tasks.workunit.client.0.vm07.stdout:1/645: rename d1/d11/c61 to d1/db/d31/d56/cd3 0 2026-03-09T19:27:46.790 
INFO:tasks.workunit.client.1.vm08.stdout:7/872: fsync d5/d14/dae/f45 0 2026-03-09T19:27:46.790 INFO:tasks.workunit.client.1.vm08.stdout:7/873: dread - d5/d14/d27/d78/dc7/f110 zero size 2026-03-09T19:27:46.792 INFO:tasks.workunit.client.0.vm07.stdout:4/598: dwrite d3/d11/d2b/f98 [0,4194304] 0 2026-03-09T19:27:46.794 INFO:tasks.workunit.client.0.vm07.stdout:9/624: truncate d0/db/f1d 563387 0 2026-03-09T19:27:46.800 INFO:tasks.workunit.client.1.vm08.stdout:6/835: getdents d3/d34/dce 0 2026-03-09T19:27:46.808 INFO:tasks.workunit.client.0.vm07.stdout:6/568: mkdir d0/d1/db/d24/da4/dda 0 2026-03-09T19:27:46.808 INFO:tasks.workunit.client.0.vm07.stdout:6/569: stat d0/d1/d28/d76/dad 0 2026-03-09T19:27:46.811 INFO:tasks.workunit.client.1.vm08.stdout:4/781: mknod da/d10/d16/d28/d2f/d4f/d64/d84/d8a/da2/ce5 0 2026-03-09T19:27:46.828 INFO:tasks.workunit.client.0.vm07.stdout:2/679: creat d3/dd/d16/d29/d2d/d45/d3b/d44/d97/fec x:0 0 0 2026-03-09T19:27:46.828 INFO:tasks.workunit.client.0.vm07.stdout:2/680: dwrite d3/dd/d16/d29/d2d/d45/d3b/d44/d97/fe6 [0,4194304] 0 2026-03-09T19:27:46.829 INFO:tasks.workunit.client.1.vm08.stdout:5/791: symlink d16/d1e/d6e/dcd/l100 0 2026-03-09T19:27:46.829 INFO:tasks.workunit.client.1.vm08.stdout:5/792: write d16/d8e/fe5 [626699,116552] 0 2026-03-09T19:27:46.829 INFO:tasks.workunit.client.0.vm07.stdout:7/605: dread d0/d4/d5/d8/d41/d64/d74/d98/f47 [0,4194304] 0 2026-03-09T19:27:46.835 INFO:tasks.workunit.client.0.vm07.stdout:7/606: dwrite d0/d4/d5/d8/d41/d64/d74/f82 [0,4194304] 0 2026-03-09T19:27:46.835 INFO:tasks.workunit.client.0.vm07.stdout:7/607: chown d0/f6c 20 1 2026-03-09T19:27:46.844 INFO:tasks.workunit.client.0.vm07.stdout:6/570: sync 2026-03-09T19:27:46.846 INFO:tasks.workunit.client.1.vm08.stdout:9/816: mknod d0/d2/d80/de5/da2/da8/c110 0 2026-03-09T19:27:46.850 INFO:tasks.workunit.client.0.vm07.stdout:6/571: dwrite d0/d1/d28/fc5 [0,4194304] 0 2026-03-09T19:27:46.855 INFO:tasks.workunit.client.0.vm07.stdout:7/608: dread d0/d4/d5/d8/f37 
[0,4194304] 0 2026-03-09T19:27:46.861 INFO:tasks.workunit.client.1.vm08.stdout:3/881: mknod d0/d8/d24/c121 0 2026-03-09T19:27:46.867 INFO:tasks.workunit.client.0.vm07.stdout:0/577: rename d0/d6/d13/d17/d19/daa to d0/d6/d13/d1c/d11/d56/d78/dbc 0 2026-03-09T19:27:46.868 INFO:tasks.workunit.client.0.vm07.stdout:0/578: write d0/f3a [3689238,119985] 0 2026-03-09T19:27:46.869 INFO:tasks.workunit.client.0.vm07.stdout:0/579: fsync d0/d6/d13/d1c/d50/d92/d99/fac 0 2026-03-09T19:27:46.870 INFO:tasks.workunit.client.1.vm08.stdout:1/941: mknod d9/da/d12/d39/c127 0 2026-03-09T19:27:46.874 INFO:tasks.workunit.client.1.vm08.stdout:0/834: creat dd/d22/d27/d2e/db0/f110 x:0 0 0 2026-03-09T19:27:46.877 INFO:tasks.workunit.client.1.vm08.stdout:8/801: getdents de/d25/d31 0 2026-03-09T19:27:46.878 INFO:tasks.workunit.client.0.vm07.stdout:5/586: rmdir d3/dd/d95/da2 0 2026-03-09T19:27:46.878 INFO:tasks.workunit.client.0.vm07.stdout:5/587: write d3/dd/f8a [791142,112719] 0 2026-03-09T19:27:46.879 INFO:tasks.workunit.client.0.vm07.stdout:5/588: stat d3/d1a/d28/d6c/d72/d8f/f91 0 2026-03-09T19:27:46.885 INFO:tasks.workunit.client.1.vm08.stdout:7/874: fdatasync d5/fb 0 2026-03-09T19:27:46.886 INFO:tasks.workunit.client.1.vm08.stdout:7/875: write d5/d14/dae/d3a/d42/d85/da0/f125 [516564,78065] 0 2026-03-09T19:27:46.886 INFO:tasks.workunit.client.1.vm08.stdout:7/876: read - d5/d14/d27/f8a zero size 2026-03-09T19:27:46.887 INFO:tasks.workunit.client.1.vm08.stdout:7/877: chown d5/d14/d27/d54/dfb/d9c/dcb/dd2/f129 21571678 1 2026-03-09T19:27:46.889 INFO:tasks.workunit.client.1.vm08.stdout:7/878: dread d5/d14/dae/d1c/f87 [0,4194304] 0 2026-03-09T19:27:46.889 INFO:tasks.workunit.client.1.vm08.stdout:7/879: chown d5/lbf 197933665 1 2026-03-09T19:27:46.893 INFO:tasks.workunit.client.1.vm08.stdout:7/880: dwrite d5/d14/d27/d78/dc7/f10a [0,4194304] 0 2026-03-09T19:27:46.904 INFO:tasks.workunit.client.1.vm08.stdout:4/782: rename da/d10/d26/d3a/d69/d75 to da/d10/d26/d3a/db5/de6 0 2026-03-09T19:27:46.911 
INFO:tasks.workunit.client.1.vm08.stdout:9/817: mkdir d0/d1b/d97/d48/d5d/ddf/d111 0 2026-03-09T19:27:46.913 INFO:tasks.workunit.client.1.vm08.stdout:3/882: dread - d0/d52/fd0 zero size 2026-03-09T19:27:46.916 INFO:tasks.workunit.client.1.vm08.stdout:1/942: chown d9/da/d2d/d4e/c64 5878 1 2026-03-09T19:27:46.917 INFO:tasks.workunit.client.1.vm08.stdout:1/943: chown d9/da/d12/d39/f52 3146099 1 2026-03-09T19:27:46.917 INFO:tasks.workunit.client.1.vm08.stdout:1/944: readlink d9/da/d17/d60/lda 0 2026-03-09T19:27:46.924 INFO:tasks.workunit.client.1.vm08.stdout:7/881: sync 2026-03-09T19:27:46.924 INFO:tasks.workunit.client.1.vm08.stdout:4/783: sync 2026-03-09T19:27:46.925 INFO:tasks.workunit.client.1.vm08.stdout:4/784: readlink da/d10/d26/d27/d32/lc8 0 2026-03-09T19:27:46.926 INFO:tasks.workunit.client.1.vm08.stdout:7/882: write d5/d14/dae/d1c/d73/fbe [1863644,2890] 0 2026-03-09T19:27:46.930 INFO:tasks.workunit.client.1.vm08.stdout:8/802: dwrite de/d91/dc8/fe4 [0,4194304] 0 2026-03-09T19:27:46.934 INFO:tasks.workunit.client.1.vm08.stdout:7/883: dwrite d5/d14/d38/dad/f12d [0,4194304] 0 2026-03-09T19:27:46.942 INFO:tasks.workunit.client.0.vm07.stdout:7/609: rename d0/d4/d5/d26/d3c to d0/d4/d5/d8/d41/d64/d74/d98/dcb 0 2026-03-09T19:27:46.973 INFO:tasks.workunit.client.1.vm08.stdout:6/836: creat d3/d34/d6f/dd2/f135 x:0 0 0 2026-03-09T19:27:47.002 INFO:tasks.workunit.client.1.vm08.stdout:2/730: creat d3/d4/d23/d2c/d39/d5e/de/ff5 x:0 0 0 2026-03-09T19:27:47.007 INFO:tasks.workunit.client.0.vm07.stdout:5/589: mkdir d3/dd/d26/d3f/d47/d71/db7 0 2026-03-09T19:27:47.034 INFO:tasks.workunit.client.1.vm08.stdout:4/785: rmdir da/d10/d26/d3a/d49 39 2026-03-09T19:27:47.034 INFO:tasks.workunit.client.1.vm08.stdout:8/803: rmdir de/d1d/d21 39 2026-03-09T19:27:47.034 INFO:tasks.workunit.client.0.vm07.stdout:2/681: mkdir d3/dd/d16/d30/da7/dad/ddd/ded 0 2026-03-09T19:27:47.034 INFO:tasks.workunit.client.0.vm07.stdout:7/610: mknod d0/d52/ccc 0 2026-03-09T19:27:47.038 
INFO:tasks.workunit.client.0.vm07.stdout:5/590: mkdir d3/d1a/d5a/db8 0 2026-03-09T19:27:47.038 INFO:tasks.workunit.client.1.vm08.stdout:7/884: unlink d5/d14/d2b/f9f 0 2026-03-09T19:27:47.040 INFO:tasks.workunit.client.1.vm08.stdout:2/731: read d3/f7 [851215,98967] 0 2026-03-09T19:27:47.040 INFO:tasks.workunit.client.1.vm08.stdout:2/732: chown d3/d9/f5d 1 1 2026-03-09T19:27:47.041 INFO:tasks.workunit.client.1.vm08.stdout:2/733: chown d3/d4/d23/d2c/d39/d5e/d14/c1d 129714679 1 2026-03-09T19:27:47.042 INFO:tasks.workunit.client.0.vm07.stdout:5/591: dwrite d3/d1a/d28/d40/d92/fa9 [0,4194304] 0 2026-03-09T19:27:47.044 INFO:tasks.workunit.client.1.vm08.stdout:3/883: mknod d0/d52/d11d/c122 0 2026-03-09T19:27:47.045 INFO:tasks.workunit.client.0.vm07.stdout:6/572: creat d0/d1/fdb x:0 0 0 2026-03-09T19:27:47.047 INFO:tasks.workunit.client.1.vm08.stdout:3/884: dread d0/d8/d24/f105 [0,4194304] 0 2026-03-09T19:27:47.049 INFO:tasks.workunit.client.1.vm08.stdout:3/885: chown d0/d6/de/d1b/d16/d17/dac/d109/l40 12 1 2026-03-09T19:27:47.053 INFO:tasks.workunit.client.1.vm08.stdout:8/804: rename de/d47/dfd/d99/le2 to de/d1d/d4f/l11a 0 2026-03-09T19:27:47.053 INFO:tasks.workunit.client.1.vm08.stdout:7/885: rmdir d5/d14/dae/d3a/d42 39 2026-03-09T19:27:47.068 INFO:tasks.workunit.client.0.vm07.stdout:8/629: dwrite d7/d16/f69 [0,4194304] 0 2026-03-09T19:27:47.072 INFO:tasks.workunit.client.0.vm07.stdout:5/592: mknod d3/dd/d26/d2d/cb9 0 2026-03-09T19:27:47.080 INFO:tasks.workunit.client.1.vm08.stdout:2/734: mkdir d3/d4/d23/d2c/d39/db9/df6 0 2026-03-09T19:27:47.088 INFO:tasks.workunit.client.0.vm07.stdout:6/573: mknod d0/d4e/cdc 0 2026-03-09T19:27:47.097 INFO:tasks.workunit.client.1.vm08.stdout:7/886: truncate d5/d14/dae/f49 1466242 0 2026-03-09T19:27:47.100 INFO:tasks.workunit.client.0.vm07.stdout:8/630: mkdir d7/d9/d37/d45/d97/dbc/de2 0 2026-03-09T19:27:47.104 INFO:tasks.workunit.client.0.vm07.stdout:1/646: write d1/d3/d21/f2e [730950,3042] 0 2026-03-09T19:27:47.105 
INFO:tasks.workunit.client.0.vm07.stdout:4/599: write d3/d11/d2b/f69 [371429,75366] 0 2026-03-09T19:27:47.106 INFO:tasks.workunit.client.0.vm07.stdout:3/676: write d1/d6/f9 [260988,13967] 0 2026-03-09T19:27:47.106 INFO:tasks.workunit.client.0.vm07.stdout:9/625: write d0/db/d29/da8/fab [458544,31246] 0 2026-03-09T19:27:47.107 INFO:tasks.workunit.client.0.vm07.stdout:4/600: fdatasync d3/d11/d29/db9/fb4 0 2026-03-09T19:27:47.108 INFO:tasks.workunit.client.1.vm08.stdout:5/793: write d16/d1e/d3b/f68 [4067399,122131] 0 2026-03-09T19:27:47.114 INFO:tasks.workunit.client.0.vm07.stdout:3/677: read d1/d6/d71/f69 [2192122,28126] 0 2026-03-09T19:27:47.120 INFO:tasks.workunit.client.0.vm07.stdout:4/601: dread d3/d11/d29/db9/d22/f24 [0,4194304] 0 2026-03-09T19:27:47.124 INFO:tasks.workunit.client.0.vm07.stdout:5/593: dread d3/dd/d26/d3f/d47/fb6 [0,4194304] 0 2026-03-09T19:27:47.125 INFO:tasks.workunit.client.0.vm07.stdout:5/594: fdatasync d3/dd/d26/d3f/fb3 0 2026-03-09T19:27:47.127 INFO:tasks.workunit.client.0.vm07.stdout:2/682: getdents d3/dd/d16/d29/d3c/d5a/d7a 0 2026-03-09T19:27:47.133 INFO:tasks.workunit.client.0.vm07.stdout:6/574: creat d0/d1/db/d52/d94/d87/fdd x:0 0 0 2026-03-09T19:27:47.133 INFO:tasks.workunit.client.1.vm08.stdout:0/835: dread dd/f6d [0,4194304] 0 2026-03-09T19:27:47.139 INFO:tasks.workunit.client.1.vm08.stdout:9/818: write d0/d1b/d97/d48/f53 [521954,77152] 0 2026-03-09T19:27:47.143 INFO:tasks.workunit.client.1.vm08.stdout:9/819: dread d0/d2/d80/fec [0,4194304] 0 2026-03-09T19:27:47.145 INFO:tasks.workunit.client.1.vm08.stdout:1/945: getdents d9/da/d2c/d6a 0 2026-03-09T19:27:47.147 INFO:tasks.workunit.client.0.vm07.stdout:7/611: write d0/d4/d5/d8/d41/d64/d74/d98/dcb/f63 [35477,3147] 0 2026-03-09T19:27:47.149 INFO:tasks.workunit.client.1.vm08.stdout:6/837: dread d3/db/f20 [0,4194304] 0 2026-03-09T19:27:47.150 INFO:tasks.workunit.client.1.vm08.stdout:6/838: chown d3/dbc/deb/l12c 26679 1 2026-03-09T19:27:47.151 INFO:tasks.workunit.client.0.vm07.stdout:1/647: 
creat d1/db/d31/d4f/fd4 x:0 0 0 2026-03-09T19:27:47.157 INFO:tasks.workunit.client.1.vm08.stdout:7/887: creat d5/d14/d27/d54/dfb/d9c/dcb/dd2/f130 x:0 0 0 2026-03-09T19:27:47.158 INFO:tasks.workunit.client.1.vm08.stdout:7/888: stat d5/d14/d38/dad/fe5 0 2026-03-09T19:27:47.161 INFO:tasks.workunit.client.1.vm08.stdout:3/886: dwrite d0/d6/f39 [0,4194304] 0 2026-03-09T19:27:47.173 INFO:tasks.workunit.client.0.vm07.stdout:3/678: fdatasync d1/d6/dd/fb0 0 2026-03-09T19:27:47.173 INFO:tasks.workunit.client.0.vm07.stdout:3/679: readlink d1/d89/lb8 0 2026-03-09T19:27:47.174 INFO:tasks.workunit.client.0.vm07.stdout:4/602: creat d3/d11/d29/db9/d22/d70/d93/fd1 x:0 0 0 2026-03-09T19:27:47.175 INFO:tasks.workunit.client.1.vm08.stdout:0/836: rename dd/d22/d27/d2e/d37/ca1 to dd/d22/d63/d6e/df5/c111 0 2026-03-09T19:27:47.176 INFO:tasks.workunit.client.0.vm07.stdout:4/603: fdatasync d3/d11/d29/db9/d22/d86/fc9 0 2026-03-09T19:27:47.181 INFO:tasks.workunit.client.0.vm07.stdout:8/631: write d7/d9/d57/fb2 [3285495,92919] 0 2026-03-09T19:27:47.181 INFO:tasks.workunit.client.0.vm07.stdout:5/595: creat d3/dd/d95/fba x:0 0 0 2026-03-09T19:27:47.183 INFO:tasks.workunit.client.0.vm07.stdout:3/680: dwrite d1/d3d/d47/db3/dc2/d28/d7c/fbd [0,4194304] 0 2026-03-09T19:27:47.184 INFO:tasks.workunit.client.1.vm08.stdout:4/786: dwrite da/d10/d16/d28/d2f/d4f/d64/f6f [0,4194304] 0 2026-03-09T19:27:47.196 INFO:tasks.workunit.client.1.vm08.stdout:8/805: write de/d25/d31/d82/fa1 [4702578,39714] 0 2026-03-09T19:27:47.202 INFO:tasks.workunit.client.1.vm08.stdout:4/787: read da/d10/d16/fc1 [3703917,24494] 0 2026-03-09T19:27:47.206 INFO:tasks.workunit.client.1.vm08.stdout:4/788: dwrite da/d10/f25 [4194304,4194304] 0 2026-03-09T19:27:47.207 INFO:tasks.workunit.client.1.vm08.stdout:0/837: sync 2026-03-09T19:27:47.221 INFO:tasks.workunit.client.0.vm07.stdout:6/575: creat d0/d1/db/d24/fde x:0 0 0 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: Active 
manager daemon vm08.mxylvw restarted 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: Activating manager daemon vm08.mxylvw 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/crt"}]: dispatch 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: osdmap e43: 6 total, 6 up, 6 in 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: mgrmap e21: vm08.mxylvw(active, starting, since 0.0233987s) 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.zkmcyw"}]: dispatch 
2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.zcaqju"}]: dispatch 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mgr metadata", "who": "vm08.mxylvw", "id": "vm08.mxylvw"}]: dispatch 2026-03-09T19:27:47.226 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:47 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:27:47.227 INFO:tasks.workunit.client.1.vm08.stdout:1/946: read d9/da/d17/fb1 [1015244,91608] 0 2026-03-09T19:27:47.238 INFO:tasks.workunit.client.1.vm08.stdout:6/839: mknod d3/d34/da9/da4/c136 0 2026-03-09T19:27:47.243 INFO:tasks.workunit.client.0.vm07.stdout:1/648: unlink d1/d11/d37/d3f/d45/f26 0 2026-03-09T19:27:47.244 INFO:tasks.workunit.client.0.vm07.stdout:1/649: chown d1/d3 0 1 2026-03-09T19:27:47.244 INFO:tasks.workunit.client.1.vm08.stdout:7/889: dread - d5/d14/d27/d54/dfb/d9c/fd8 zero size 2026-03-09T19:27:47.244 INFO:tasks.workunit.client.1.vm08.stdout:3/887: dread - d0/d6/de/d6e/d51/f79 zero size 2026-03-09T19:27:47.246 INFO:tasks.workunit.client.1.vm08.stdout:2/735: creat d3/d4/d23/d2c/d39/d5e/ff7 x:0 0 0 
2026-03-09T19:27:47.250 INFO:tasks.workunit.client.0.vm07.stdout:3/681: read - d1/d3d/d47/db3/d8e/da9/f92 zero size 2026-03-09T19:27:47.264 INFO:tasks.workunit.client.1.vm08.stdout:5/794: write d16/d1e/f7d [1310867,62891] 0 2026-03-09T19:27:47.265 INFO:tasks.workunit.client.1.vm08.stdout:5/795: write d16/d1e/d8c/d99/da8/fe2 [261690,125672] 0 2026-03-09T19:27:47.277 INFO:tasks.workunit.client.1.vm08.stdout:1/947: truncate d9/d40/d49/f70 4666264 0 2026-03-09T19:27:47.279 INFO:tasks.workunit.client.0.vm07.stdout:9/626: link d0/db/d29/f67 d0/d6f/fdb 0 2026-03-09T19:27:47.281 INFO:tasks.workunit.client.1.vm08.stdout:6/840: mknod d3/d34/da9/c137 0 2026-03-09T19:27:47.281 INFO:tasks.workunit.client.1.vm08.stdout:6/841: chown d3/d94/fb5 211462796 1 2026-03-09T19:27:47.285 INFO:tasks.workunit.client.0.vm07.stdout:1/650: creat d1/d11/d37/d3f/d45/d87/d88/fd5 x:0 0 0 2026-03-09T19:27:47.286 INFO:tasks.workunit.client.0.vm07.stdout:1/651: chown d1/d11/d37/d3f/d7e/dad/fbd 13490 1 2026-03-09T19:27:47.288 INFO:tasks.workunit.client.1.vm08.stdout:7/890: mkdir d5/d14/d2b/daa/d131 0 2026-03-09T19:27:47.300 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: Active manager daemon vm08.mxylvw restarted 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: Activating manager daemon vm08.mxylvw 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: from='mgr.? 
192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/crt"}]: dispatch 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: osdmap e43: 6 total, 6 up, 6 in 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: mgrmap e21: vm08.mxylvw(active, starting, since 0.0233987s) 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: from='mgr.? 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.zkmcyw"}]: dispatch 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:27:47.301 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.zcaqju"}]: dispatch 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mgr metadata", "who": "vm08.mxylvw", "id": "vm08.mxylvw"}]: dispatch 2026-03-09T19:27:47.301 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:47 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:27:47.301 INFO:tasks.workunit.client.1.vm08.stdout:8/806: write de/fb2 [3418008,43811] 0 2026-03-09T19:27:47.302 INFO:tasks.workunit.client.1.vm08.stdout:9/820: write d0/d1b/d97/d48/d5d/f92 [170183,59051] 0 2026-03-09T19:27:47.309 INFO:tasks.workunit.client.1.vm08.stdout:0/838: dwrite dd/d22/d27/d2e/fe0 [0,4194304] 0 2026-03-09T19:27:47.315 INFO:tasks.workunit.client.1.vm08.stdout:4/789: dwrite da/d10/d26/d3a/d91/fb4 [0,4194304] 0 2026-03-09T19:27:47.315 INFO:tasks.workunit.client.1.vm08.stdout:4/790: dread - da/d10/d26/d27/f96 zero size 2026-03-09T19:27:47.328 INFO:tasks.workunit.client.1.vm08.stdout:2/736: creat d3/d4/d23/d2c/d39/d5e/d87/ff8 x:0 0 0 2026-03-09T19:27:47.329 INFO:tasks.workunit.client.0.vm07.stdout:6/576: write d0/d1/db/f4b [961203,31936] 0 2026-03-09T19:27:47.330 INFO:tasks.workunit.client.0.vm07.stdout:6/577: write d0/d1/db/d17/dc4/fd7 [81391,60536] 0 2026-03-09T19:27:47.334 INFO:tasks.workunit.client.0.vm07.stdout:8/632: dwrite d7/d1d/f3d [0,4194304] 0 
2026-03-09T19:27:47.358 INFO:tasks.workunit.client.0.vm07.stdout:1/652: creat d1/d11/d37/d5d/dc1/fd6 x:0 0 0 2026-03-09T19:27:47.364 INFO:tasks.workunit.client.1.vm08.stdout:1/948: mkdir d9/da/d2d/d4e/df4/d128 0 2026-03-09T19:27:47.370 INFO:tasks.workunit.client.0.vm07.stdout:0/580: truncate d0/d6/d13/d1c/d11/d56/f7f 2973413 0 2026-03-09T19:27:47.382 INFO:tasks.workunit.client.0.vm07.stdout:0/581: dwrite d0/d6/d13/d1c/d50/d92/d99/fa7 [0,4194304] 0 2026-03-09T19:27:47.397 INFO:tasks.workunit.client.1.vm08.stdout:8/807: rmdir de/d47/dfd 39 2026-03-09T19:27:47.405 INFO:tasks.workunit.client.0.vm07.stdout:3/682: dwrite d1/d6/f60 [0,4194304] 0 2026-03-09T19:27:47.408 INFO:tasks.workunit.client.1.vm08.stdout:5/796: dwrite d16/d1e/f27 [4194304,4194304] 0 2026-03-09T19:27:47.408 INFO:tasks.workunit.client.1.vm08.stdout:3/888: dwrite d0/d6/de/d1b/d16/d17/fbc [0,4194304] 0 2026-03-09T19:27:47.408 INFO:tasks.workunit.client.0.vm07.stdout:3/683: read - d1/d6/d4c/fc8 zero size 2026-03-09T19:27:47.408 INFO:tasks.workunit.client.0.vm07.stdout:3/684: chown d1/d89/lb8 140621 1 2026-03-09T19:27:47.408 INFO:tasks.workunit.client.0.vm07.stdout:3/685: fsync d1/d6/d45/fbe 0 2026-03-09T19:27:47.420 INFO:tasks.workunit.client.1.vm08.stdout:9/821: symlink d0/d1b/d68/d7f/l112 0 2026-03-09T19:27:47.443 INFO:tasks.workunit.client.1.vm08.stdout:4/791: symlink da/d10/d16/d28/d2f/d4f/d64/d81/le7 0 2026-03-09T19:27:47.457 INFO:tasks.workunit.client.1.vm08.stdout:7/891: mkdir d5/d14/dae/d3a/d42/d85/da0/d132 0 2026-03-09T19:27:47.461 INFO:tasks.workunit.client.1.vm08.stdout:8/808: symlink de/d1d/d2e/db4/l11b 0 2026-03-09T19:27:47.465 INFO:tasks.workunit.client.0.vm07.stdout:9/627: write d0/d17/f33 [793883,11576] 0 2026-03-09T19:27:47.466 INFO:tasks.workunit.client.0.vm07.stdout:9/628: truncate d0/d6/fa 5169148 0 2026-03-09T19:27:47.472 INFO:tasks.workunit.client.1.vm08.stdout:3/889: rename d0/d6/de/d1b/lc9 to d0/d6/de/d1b/d16/d17/dac/d109/l123 0 2026-03-09T19:27:47.477 
INFO:tasks.workunit.client.1.vm08.stdout:9/822: creat d0/d2/d80/de5/da2/da8/f113 x:0 0 0 2026-03-09T19:27:47.487 INFO:tasks.workunit.client.1.vm08.stdout:2/737: truncate d3/d9/d79/f6b 1106674 0 2026-03-09T19:27:47.491 INFO:tasks.workunit.client.1.vm08.stdout:6/842: write d3/db/d43/fd3 [3393129,74032] 0 2026-03-09T19:27:47.491 INFO:tasks.workunit.client.1.vm08.stdout:6/843: write d3/db/fdb [845540,42231] 0 2026-03-09T19:27:47.498 INFO:tasks.workunit.client.1.vm08.stdout:1/949: mkdir d9/d11/d129 0 2026-03-09T19:27:47.501 INFO:tasks.workunit.client.1.vm08.stdout:4/792: dread da/d10/d16/d28/d46/fb1 [0,4194304] 0 2026-03-09T19:27:47.502 INFO:tasks.workunit.client.1.vm08.stdout:4/793: write da/f21 [3662117,61792] 0 2026-03-09T19:27:47.505 INFO:tasks.workunit.client.1.vm08.stdout:8/809: creat de/d25/d31/f11c x:0 0 0 2026-03-09T19:27:47.519 INFO:tasks.workunit.client.1.vm08.stdout:7/892: rename d5/d14/dae/d3a/d42/ff3 to d5/d14/d27/d54/dfb/d90/daf/f133 0 2026-03-09T19:27:47.520 INFO:tasks.workunit.client.1.vm08.stdout:8/810: dread de/d25/d31/d82/fa1 [0,4194304] 0 2026-03-09T19:27:47.525 INFO:tasks.workunit.client.0.vm07.stdout:4/604: rename d3/d11/d29/db9/d22/d86/fc6 to d3/fd2 0 2026-03-09T19:27:47.528 INFO:tasks.workunit.client.1.vm08.stdout:5/797: write d16/f34 [325169,130460] 0 2026-03-09T19:27:47.532 INFO:tasks.workunit.client.1.vm08.stdout:5/798: dwrite d16/d1e/f5b [4194304,4194304] 0 2026-03-09T19:27:47.538 INFO:tasks.workunit.client.0.vm07.stdout:2/683: getdents d3/dd 0 2026-03-09T19:27:47.538 INFO:tasks.workunit.client.0.vm07.stdout:6/578: creat d0/d4e/d75/fdf x:0 0 0 2026-03-09T19:27:47.538 INFO:tasks.workunit.client.1.vm08.stdout:0/839: link dd/c11 dd/d22/d24/d49/d50/db3/c112 0 2026-03-09T19:27:47.538 INFO:tasks.workunit.client.1.vm08.stdout:0/840: stat dd/d22/de1 0 2026-03-09T19:27:47.539 INFO:tasks.workunit.client.0.vm07.stdout:6/579: chown d0/d1/db/d17/fb4 0 1 2026-03-09T19:27:47.544 INFO:tasks.workunit.client.0.vm07.stdout:7/612: getdents 
d0/d4/d5/d8/d41/d64/d74 0 2026-03-09T19:27:47.548 INFO:tasks.workunit.client.1.vm08.stdout:2/738: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/f1a [4194304,4194304] 0 2026-03-09T19:27:47.575 INFO:tasks.workunit.client.0.vm07.stdout:0/582: creat d0/d6/d13/da1/fbd x:0 0 0 2026-03-09T19:27:47.575 INFO:tasks.workunit.client.0.vm07.stdout:0/583: chown d0/d6/d13/d1c/d11/d56/d78/f7d 30 1 2026-03-09T19:27:47.577 INFO:tasks.workunit.client.1.vm08.stdout:4/794: mknod da/d10/ce8 0 2026-03-09T19:27:47.582 INFO:tasks.workunit.client.0.vm07.stdout:5/596: rename d3/d1a/d28/d6c/d72/d8f/ca1 to d3/d1a/d5a/db8/cbb 0 2026-03-09T19:27:47.591 INFO:tasks.workunit.client.1.vm08.stdout:8/811: fdatasync de/d25/d33/fb6 0 2026-03-09T19:27:47.591 INFO:tasks.workunit.client.1.vm08.stdout:3/890: dread d0/d6/de/d15/d96/fd9 [0,4194304] 0 2026-03-09T19:27:47.591 INFO:tasks.workunit.client.1.vm08.stdout:9/823: symlink d0/d1b/d97/d48/l114 0 2026-03-09T19:27:47.591 INFO:tasks.workunit.client.0.vm07.stdout:4/605: stat d3/d11/d2b/d37/c1d 0 2026-03-09T19:27:47.591 INFO:tasks.workunit.client.0.vm07.stdout:5/597: dread d3/d1a/d28/d40/d92/fa9 [0,4194304] 0 2026-03-09T19:27:47.591 INFO:tasks.workunit.client.0.vm07.stdout:7/613: mkdir d0/d4/d5/d8/dcd 0 2026-03-09T19:27:47.592 INFO:tasks.workunit.client.1.vm08.stdout:5/799: dread d16/f4d [0,4194304] 0 2026-03-09T19:27:47.593 INFO:tasks.workunit.client.1.vm08.stdout:5/800: chown d16/d1e/d6e/lc4 251717070 1 2026-03-09T19:27:47.594 INFO:tasks.workunit.client.0.vm07.stdout:0/584: read d0/d6/d13/d1c/d11/d56/d78/fb3 [3317222,66362] 0 2026-03-09T19:27:47.598 INFO:tasks.workunit.client.1.vm08.stdout:6/844: mknod d3/d15/dc2/d12f/c138 0 2026-03-09T19:27:47.599 INFO:tasks.workunit.client.1.vm08.stdout:1/950: mkdir d9/da/d95/d12a 0 2026-03-09T19:27:47.602 INFO:tasks.workunit.client.1.vm08.stdout:2/739: sync 2026-03-09T19:27:47.602 INFO:tasks.workunit.client.1.vm08.stdout:3/891: sync 2026-03-09T19:27:47.603 INFO:tasks.workunit.client.1.vm08.stdout:3/892: stat 
d0/d6/de/d1b/d16/d17/dac/d109/l72 0 2026-03-09T19:27:47.608 INFO:tasks.workunit.client.1.vm08.stdout:2/740: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/d99/dd4/fdc [0,4194304] 0 2026-03-09T19:27:47.638 INFO:tasks.workunit.client.0.vm07.stdout:4/606: mknod d3/d11/d2b/d37/cd3 0 2026-03-09T19:27:47.662 INFO:tasks.workunit.client.1.vm08.stdout:8/812: dread de/d7c/f95 [0,4194304] 0 2026-03-09T19:27:47.672 INFO:tasks.workunit.client.0.vm07.stdout:5/598: read d3/fe [1536418,17892] 0 2026-03-09T19:27:47.673 INFO:tasks.workunit.client.0.vm07.stdout:5/599: chown d3/d1a/d28/c30 443125805 1 2026-03-09T19:27:47.674 INFO:tasks.workunit.client.0.vm07.stdout:5/600: truncate d3/dd/d26/d2d/fae 211953 0 2026-03-09T19:27:47.674 INFO:tasks.workunit.client.1.vm08.stdout:9/824: chown d0/d1b/d97/f34 43731 1 2026-03-09T19:27:47.674 INFO:tasks.workunit.client.0.vm07.stdout:5/601: fsync d3/d1a/fb 0 2026-03-09T19:27:47.677 INFO:tasks.workunit.client.0.vm07.stdout:7/614: creat d0/d52/d54/d55/fce x:0 0 0 2026-03-09T19:27:47.679 INFO:tasks.workunit.client.1.vm08.stdout:5/801: creat d16/d1e/d8c/f101 x:0 0 0 2026-03-09T19:27:47.689 INFO:tasks.workunit.client.1.vm08.stdout:0/841: mkdir dd/d22/d27/d113 0 2026-03-09T19:27:47.691 INFO:tasks.workunit.client.0.vm07.stdout:9/629: write d0/db/d29/d2c/d36/d7d/fc0 [340673,73614] 0 2026-03-09T19:27:47.695 INFO:tasks.workunit.client.1.vm08.stdout:7/893: dwrite d5/d14/d2b/d4b/ffe [0,4194304] 0 2026-03-09T19:27:47.697 INFO:tasks.workunit.client.1.vm08.stdout:7/894: write d5/d14/f46 [779137,56358] 0 2026-03-09T19:27:47.726 INFO:tasks.workunit.client.0.vm07.stdout:2/684: dwrite d3/f5 [0,4194304] 0 2026-03-09T19:27:47.728 INFO:tasks.workunit.client.0.vm07.stdout:6/580: link d0/dbf/d95/d31/cb7 d0/dbf/ce0 0 2026-03-09T19:27:47.734 INFO:tasks.workunit.client.1.vm08.stdout:2/741: mkdir d3/d9/d79/df9 0 2026-03-09T19:27:47.737 INFO:tasks.workunit.client.1.vm08.stdout:2/742: dread d3/d4/f49 [0,4194304] 0 2026-03-09T19:27:47.744 INFO:tasks.workunit.client.0.vm07.stdout:3/686: 
dwrite d1/f98 [0,4194304] 0 2026-03-09T19:27:47.750 INFO:tasks.workunit.client.0.vm07.stdout:7/615: truncate d0/d52/d54/f7d 4897497 0 2026-03-09T19:27:47.751 INFO:tasks.workunit.client.0.vm07.stdout:9/630: symlink d0/d17/ldc 0 2026-03-09T19:27:47.757 INFO:tasks.workunit.client.0.vm07.stdout:8/633: rename d7/d30/d75/dcc/fbb to d7/fe3 0 2026-03-09T19:27:47.758 INFO:tasks.workunit.client.0.vm07.stdout:8/634: write d7/d9/d37/d45/d4f/db1/fce [107059,9051] 0 2026-03-09T19:27:47.758 INFO:tasks.workunit.client.0.vm07.stdout:8/635: write d7/d1d/f3d [4791422,76940] 0 2026-03-09T19:27:47.759 INFO:tasks.workunit.client.0.vm07.stdout:8/636: read - d7/d9/d37/d45/d4f/fd4 zero size 2026-03-09T19:27:47.760 INFO:tasks.workunit.client.0.vm07.stdout:8/637: truncate d7/d9/d10/d44/fdc 376378 0 2026-03-09T19:27:47.762 INFO:tasks.workunit.client.0.vm07.stdout:2/685: fdatasync d3/dd/d16/d29/f91 0 2026-03-09T19:27:47.763 INFO:tasks.workunit.client.0.vm07.stdout:6/581: creat d0/d1/d28/d76/dad/fe1 x:0 0 0 2026-03-09T19:27:47.768 INFO:tasks.workunit.client.0.vm07.stdout:5/602: creat d3/dd/d26/d3f/d47/d71/db7/fbc x:0 0 0 2026-03-09T19:27:47.769 INFO:tasks.workunit.client.0.vm07.stdout:0/585: creat d0/d6/d13/fbe x:0 0 0 2026-03-09T19:27:47.770 INFO:tasks.workunit.client.0.vm07.stdout:7/616: write d0/d4/d5/d8/f15 [2241554,69483] 0 2026-03-09T19:27:47.771 INFO:tasks.workunit.client.0.vm07.stdout:7/617: read d0/d4/d5/d8/d41/d64/d74/d98/f83 [2648718,5008] 0 2026-03-09T19:27:47.781 INFO:tasks.workunit.client.0.vm07.stdout:9/631: fdatasync d0/db/d29/d2c/f61 0 2026-03-09T19:27:47.781 INFO:tasks.workunit.client.0.vm07.stdout:9/632: stat d0/d6/l24 0 2026-03-09T19:27:47.782 INFO:tasks.workunit.client.0.vm07.stdout:9/633: write d0/d6f/dc3/fc4 [385886,101469] 0 2026-03-09T19:27:47.784 INFO:tasks.workunit.client.0.vm07.stdout:8/638: rmdir d7/d9/d37/d45/d4f 39 2026-03-09T19:27:47.786 INFO:tasks.workunit.client.0.vm07.stdout:9/634: dwrite d0/d6/d3a/d81/fa3 [0,4194304] 0 2026-03-09T19:27:47.789 
INFO:tasks.workunit.client.0.vm07.stdout:9/635: chown d0/dc1 233 1 2026-03-09T19:27:47.808 INFO:tasks.workunit.client.0.vm07.stdout:2/686: mkdir d3/dd/d16/d29/d2d/d45/d8b/d98/dee 0 2026-03-09T19:27:47.810 INFO:tasks.workunit.client.0.vm07.stdout:6/582: creat d0/d44/fe2 x:0 0 0 2026-03-09T19:27:47.810 INFO:tasks.workunit.client.0.vm07.stdout:2/687: write d3/dd/d16/d29/d2d/d45/d3b/d44/d97/fe6 [642454,101238] 0 2026-03-09T19:27:47.821 INFO:tasks.workunit.client.0.vm07.stdout:5/603: creat d3/d1a/d28/d6c/d72/d8f/fbd x:0 0 0 2026-03-09T19:27:47.835 INFO:tasks.workunit.client.0.vm07.stdout:0/586: unlink d0/d6/d13/d17/d19/d57/f75 0 2026-03-09T19:27:47.839 INFO:tasks.workunit.client.0.vm07.stdout:4/607: dwrite d3/d11/d29/d34/fa5 [0,4194304] 0 2026-03-09T19:27:47.892 INFO:tasks.workunit.client.0.vm07.stdout:9/636: readlink d0/l2e 0 2026-03-09T19:27:47.907 INFO:tasks.workunit.client.0.vm07.stdout:3/687: creat d1/d6/fd6 x:0 0 0 2026-03-09T19:27:47.909 INFO:tasks.workunit.client.0.vm07.stdout:0/587: readlink d0/d6/d13/l4a 0 2026-03-09T19:27:47.911 INFO:tasks.workunit.client.0.vm07.stdout:8/639: dwrite d7/d30/f3e [0,4194304] 0 2026-03-09T19:27:47.915 INFO:tasks.workunit.client.0.vm07.stdout:0/588: dwrite d0/f3c [0,4194304] 0 2026-03-09T19:27:47.916 INFO:tasks.workunit.client.1.vm08.stdout:0/842: creat dd/d22/d24/f114 x:0 0 0 2026-03-09T19:27:47.934 INFO:tasks.workunit.client.1.vm08.stdout:5/802: dread d16/d1e/d8c/d99/dcc/fd1 [0,4194304] 0 2026-03-09T19:27:47.942 INFO:tasks.workunit.client.1.vm08.stdout:6/845: creat d3/d94/def/dc4/d130/f139 x:0 0 0 2026-03-09T19:27:47.950 INFO:tasks.workunit.client.0.vm07.stdout:1/653: rename d1/d11/d37/d3f/d45/d87/fa2 to d1/d11/d37/d3f/fd7 0 2026-03-09T19:27:47.951 INFO:tasks.workunit.client.1.vm08.stdout:4/795: rename da/d10/d26/d50 to da/d10/d16/d28/d2f/de9 0 2026-03-09T19:27:47.969 INFO:tasks.workunit.client.1.vm08.stdout:6/846: sync 2026-03-09T19:27:47.974 INFO:tasks.workunit.client.1.vm08.stdout:9/825: write 
d0/d2/d80/de5/da2/da8/de8/dcd/fc6 [121846,129338] 0 2026-03-09T19:27:47.975 INFO:tasks.workunit.client.0.vm07.stdout:7/618: write d0/d4/d5/f20 [2054166,109983] 0 2026-03-09T19:27:47.980 INFO:tasks.workunit.client.1.vm08.stdout:7/895: chown d5/d14/d27/d78/dc7/c101 1924 1 2026-03-09T19:27:47.982 INFO:tasks.workunit.client.0.vm07.stdout:5/604: mkdir d3/dd/dbe 0 2026-03-09T19:27:47.983 INFO:tasks.workunit.client.0.vm07.stdout:5/605: chown d3/d1a/d28/c44 16 1 2026-03-09T19:27:47.983 INFO:tasks.workunit.client.0.vm07.stdout:9/637: write d0/db/d29/d32/d5c/d69/f83 [1957777,7234] 0 2026-03-09T19:27:47.987 INFO:tasks.workunit.client.0.vm07.stdout:3/688: symlink d1/d3d/ld7 0 2026-03-09T19:27:47.988 INFO:tasks.workunit.client.0.vm07.stdout:3/689: chown d1/d3d/d47/db3/dc2/c1d 80520110 1 2026-03-09T19:27:47.993 INFO:tasks.workunit.client.1.vm08.stdout:2/743: truncate d3/d9/d4a/d9a/fd7 946925 0 2026-03-09T19:27:48.011 INFO:tasks.workunit.client.1.vm08.stdout:5/803: mkdir d16/d1e/dc9/d102 0 2026-03-09T19:27:48.013 INFO:tasks.workunit.client.0.vm07.stdout:8/640: dwrite d7/d50/f8f [4194304,4194304] 0 2026-03-09T19:27:48.026 INFO:tasks.workunit.client.0.vm07.stdout:4/608: mknod d3/d11/d29/db9/cd4 0 2026-03-09T19:27:48.029 INFO:tasks.workunit.client.1.vm08.stdout:3/893: rename d0/d6/de/d1b/d16/d17/dac/d109/ld4 to d0/d6/de/d15/d96/l124 0 2026-03-09T19:27:48.031 INFO:tasks.workunit.client.0.vm07.stdout:8/641: dread d7/d50/f8f [0,4194304] 0 2026-03-09T19:27:48.034 INFO:tasks.workunit.client.0.vm07.stdout:6/583: rename d0/d1/db/d52/d94/f84 to d0/d1/db/fe3 0 2026-03-09T19:27:48.065 INFO:tasks.workunit.client.0.vm07.stdout:2/688: link d3/f4 d3/dd/d16/d30/da7/dad/fef 0 2026-03-09T19:27:48.065 INFO:tasks.workunit.client.1.vm08.stdout:6/847: truncate d3/d94/fb5 2086250 0 2026-03-09T19:27:48.075 INFO:tasks.workunit.client.0.vm07.stdout:1/654: write d1/f6 [2470849,87343] 0 2026-03-09T19:27:48.078 INFO:tasks.workunit.client.1.vm08.stdout:4/796: dwrite da/d10/d16/d28/d46/d52/d6e/d2c/f7c 
[0,4194304] 0 2026-03-09T19:27:48.085 INFO:tasks.workunit.client.1.vm08.stdout:9/826: write d0/d2/d14/d98/f38 [112208,33892] 0 2026-03-09T19:27:48.085 INFO:tasks.workunit.client.1.vm08.stdout:9/827: stat d0/d2/d80/d69/ff3 0 2026-03-09T19:27:48.096 INFO:tasks.workunit.client.0.vm07.stdout:5/606: rmdir d3 39 2026-03-09T19:27:48.103 INFO:tasks.workunit.client.0.vm07.stdout:1/655: sync 2026-03-09T19:27:48.104 INFO:tasks.workunit.client.1.vm08.stdout:1/951: link d9/da/dc/f10 d9/da/d12/f12b 0 2026-03-09T19:27:48.113 INFO:tasks.workunit.client.0.vm07.stdout:9/638: symlink d0/d6/d57/ldd 0 2026-03-09T19:27:48.123 INFO:tasks.workunit.client.0.vm07.stdout:3/690: mknod d1/d6/dd/dbf/cd8 0 2026-03-09T19:27:48.127 INFO:tasks.workunit.client.1.vm08.stdout:8/813: link de/d1d/d2e/d5f/f57 de/d1d/d4f/f11d 0 2026-03-09T19:27:48.133 INFO:tasks.workunit.client.0.vm07.stdout:2/689: dread d3/dd/d16/d29/d2d/d45/d3b/fb0 [4194304,4194304] 0 2026-03-09T19:27:48.153 INFO:tasks.workunit.client.1.vm08.stdout:3/894: rmdir d0/d6/de 39 2026-03-09T19:27:48.155 INFO:tasks.workunit.client.0.vm07.stdout:4/609: dread d3/d11/d2b/d37/f30 [0,4194304] 0 2026-03-09T19:27:48.155 INFO:tasks.workunit.client.0.vm07.stdout:8/642: mknod d7/d1d/d83/d9f/ce4 0 2026-03-09T19:27:48.158 INFO:tasks.workunit.client.0.vm07.stdout:7/619: rename d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/f5d to d0/d52/d54/d95/fcf 0 2026-03-09T19:27:48.162 INFO:tasks.workunit.client.0.vm07.stdout:8/643: dwrite d7/d1d/f3f [0,4194304] 0 2026-03-09T19:27:48.184 INFO:tasks.workunit.client.0.vm07.stdout:9/639: mkdir d0/d6/d57/d5d/dde 0 2026-03-09T19:27:48.184 INFO:tasks.workunit.client.0.vm07.stdout:3/691: symlink d1/d3d/ld9 0 2026-03-09T19:27:48.186 INFO:tasks.workunit.client.0.vm07.stdout:3/692: write d1/d6/d4c/fb1 [4042483,6473] 0 2026-03-09T19:27:48.188 INFO:tasks.workunit.client.1.vm08.stdout:9/828: readlink d0/d1b/d97/lce 0 2026-03-09T19:27:48.193 INFO:tasks.workunit.client.1.vm08.stdout:1/952: rename d9/d40/d49/le0 to d9/da/d17/d60/df5/l12c 0 
2026-03-09T19:27:48.198 INFO:tasks.workunit.client.0.vm07.stdout:0/589: creat d0/d6/d13/fbf x:0 0 0 2026-03-09T19:27:48.198 INFO:tasks.workunit.client.1.vm08.stdout:2/744: mknod d3/d9/d79/df9/cfa 0 2026-03-09T19:27:48.201 INFO:tasks.workunit.client.0.vm07.stdout:2/690: creat d3/dd/d16/d29/d2d/d45/d3b/d44/d96/ff0 x:0 0 0 2026-03-09T19:27:48.203 INFO:tasks.workunit.client.0.vm07.stdout:0/590: read d0/d6/d13/d1c/d11/d56/d78/fb3 [2923856,107731] 0 2026-03-09T19:27:48.208 INFO:tasks.workunit.client.1.vm08.stdout:6/848: creat d3/d34/d6f/d123/f13a x:0 0 0 2026-03-09T19:27:48.212 INFO:tasks.workunit.client.0.vm07.stdout:8/644: creat d7/d50/fe5 x:0 0 0 2026-03-09T19:27:48.212 INFO:tasks.workunit.client.0.vm07.stdout:8/645: fsync d7/d50/da6/fde 0 2026-03-09T19:27:48.229 INFO:tasks.workunit.client.0.vm07.stdout:5/607: creat d3/dd/d26/d3f/fbf x:0 0 0 2026-03-09T19:27:48.230 INFO:tasks.workunit.client.1.vm08.stdout:1/953: symlink d9/d11/d7a/d89/d8d/daa/l12d 0 2026-03-09T19:27:48.231 INFO:tasks.workunit.client.1.vm08.stdout:1/954: dread - d9/d11/faf zero size 2026-03-09T19:27:48.233 INFO:tasks.workunit.client.0.vm07.stdout:9/640: mkdir d0/db/d29/d32/d5c/d80/ddf 0 2026-03-09T19:27:48.234 INFO:tasks.workunit.client.0.vm07.stdout:6/584: dwrite d0/d1/db/f43 [0,4194304] 0 2026-03-09T19:27:48.235 INFO:tasks.workunit.client.1.vm08.stdout:2/745: mknod d3/d9/d79/d46/d8c/d92/cfb 0 2026-03-09T19:27:48.252 INFO:tasks.workunit.client.1.vm08.stdout:0/843: getdents dd/d22/d27/d6c 0 2026-03-09T19:27:48.253 INFO:tasks.workunit.client.1.vm08.stdout:0/844: fdatasync dd/d22/d24/d49/d50/dd4/ff2 0 2026-03-09T19:27:48.254 INFO:tasks.workunit.client.1.vm08.stdout:0/845: chown dd/d22/d63/d6e 1963564189 1 2026-03-09T19:27:48.258 INFO:tasks.workunit.client.1.vm08.stdout:5/804: creat d16/d45/daf/df5/f103 x:0 0 0 2026-03-09T19:27:48.261 INFO:tasks.workunit.client.0.vm07.stdout:4/610: mknod d3/cd5 0 2026-03-09T19:27:48.261 INFO:tasks.workunit.client.1.vm08.stdout:8/814: dwrite de/d1d/d2e/d5f/f57 [0,4194304] 
0 2026-03-09T19:27:48.270 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:27:48.270 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:27:48.270 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' 
entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/key"}]: dispatch 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: Manager daemon vm08.mxylvw is now available 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:27:48.271 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:48 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:27:48.276 INFO:tasks.workunit.client.0.vm07.stdout:0/591: dread d0/d6/d13/d1c/d61/d69/fb9 [0,4194304] 0 2026-03-09T19:27:48.282 INFO:tasks.workunit.client.1.vm08.stdout:4/797: truncate da/d10/d26/d27/f35 1130842 0 
2026-03-09T19:27:48.282 INFO:tasks.workunit.client.0.vm07.stdout:7/620: write d0/d4/d5/d8/d41/d64/d74/d98/f18 [5123697,124298] 0 2026-03-09T19:27:48.282 INFO:tasks.workunit.client.0.vm07.stdout:0/592: truncate d0/d6/d13/d1c/d50/d92/fb4 142426 0 2026-03-09T19:27:48.283 INFO:tasks.workunit.client.0.vm07.stdout:0/593: dwrite d0/f1e [0,4194304] 0 2026-03-09T19:27:48.285 INFO:tasks.workunit.client.0.vm07.stdout:0/594: truncate d0/d6/d13/f6c 8669132 0 2026-03-09T19:27:48.286 INFO:tasks.workunit.client.0.vm07.stdout:0/595: readlink d0/d6/d13/l4a 0 2026-03-09T19:27:48.316 INFO:tasks.workunit.client.1.vm08.stdout:7/896: getdents d5/d14/d2b 0 2026-03-09T19:27:48.321 INFO:tasks.workunit.client.0.vm07.stdout:1/656: truncate d1/d11/f48 2957988 0 2026-03-09T19:27:48.327 INFO:tasks.workunit.client.1.vm08.stdout:9/829: dwrite d0/d2/d80/de5/da2/da8/de8/dcd/fb9 [0,4194304] 0 2026-03-09T19:27:48.342 INFO:tasks.workunit.client.0.vm07.stdout:8/646: dwrite d7/d9/d37/d45/d4f/db1/fce [0,4194304] 0 2026-03-09T19:27:48.345 INFO:tasks.workunit.client.0.vm07.stdout:5/608: mknod d3/d1a/d28/d40/d92/d89/cc0 0 2026-03-09T19:27:48.361 INFO:tasks.workunit.client.1.vm08.stdout:2/746: mknod d3/d9/d4a/cfc 0 2026-03-09T19:27:48.368 INFO:tasks.workunit.client.0.vm07.stdout:6/585: mknod d0/d4e/dae/ce4 0 2026-03-09T19:27:48.373 INFO:tasks.workunit.client.1.vm08.stdout:0/846: creat dd/f115 x:0 0 0 2026-03-09T19:27:48.378 INFO:tasks.workunit.client.1.vm08.stdout:5/805: rename d16/d45/daf/df5/d6f/fbb to d16/d1e/d8c/f104 0 2026-03-09T19:27:48.379 INFO:tasks.workunit.client.1.vm08.stdout:3/895: dwrite d0/d6/de/d1b/d16/d17/dac/d109/fb7 [0,4194304] 0 2026-03-09T19:27:48.383 INFO:tasks.workunit.client.0.vm07.stdout:2/691: mkdir d3/dd/d16/d29/d3c/df1 0 2026-03-09T19:27:48.387 INFO:tasks.workunit.client.0.vm07.stdout:4/611: unlink d3/d11/d2b/f7e 0 2026-03-09T19:27:48.388 INFO:tasks.workunit.client.0.vm07.stdout:1/657: dread d1/d11/d37/d5d/d50/f6b [0,4194304] 0 2026-03-09T19:27:48.388 
INFO:tasks.workunit.client.1.vm08.stdout:3/896: dwrite d0/d4b/f10d [0,4194304] 0 2026-03-09T19:27:48.388 INFO:tasks.workunit.client.0.vm07.stdout:1/658: stat d1/d11/d37/d3f/d6e 0 2026-03-09T19:27:48.398 INFO:tasks.workunit.client.1.vm08.stdout:2/747: sync 2026-03-09T19:27:48.400 INFO:tasks.workunit.client.1.vm08.stdout:5/806: dread d16/d45/f5d [0,4194304] 0 2026-03-09T19:27:48.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:27:48.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T19:27:48.403 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/key"}]: dispatch 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: Manager daemon vm08.mxylvw is now available 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:27:48.403 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:48 vm07.local ceph-mon[48545]: 
from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:27:48.458 INFO:tasks.workunit.client.0.vm07.stdout:7/621: unlink d0/d52/d54/d55/fce 0 2026-03-09T19:27:48.478 INFO:tasks.workunit.client.1.vm08.stdout:1/955: dwrite d9/d40/d49/d9e/fce [0,4194304] 0 2026-03-09T19:27:48.487 INFO:tasks.workunit.client.1.vm08.stdout:7/897: read - d5/d14/d2b/daa/f105 zero size 2026-03-09T19:27:48.555 INFO:tasks.workunit.client.1.vm08.stdout:5/807: creat d16/d1e/d8c/d99/f105 x:0 0 0 2026-03-09T19:27:48.568 INFO:tasks.workunit.client.1.vm08.stdout:6/849: link d3/db/c46 d3/d94/def/c13b 0 2026-03-09T19:27:48.569 INFO:tasks.workunit.client.1.vm08.stdout:6/850: chown d3/d34/da9/da4/d117 77467 1 2026-03-09T19:27:48.596 INFO:tasks.workunit.client.1.vm08.stdout:1/956: readlink d9/da/d53/db3/le3 0 2026-03-09T19:27:48.622 INFO:tasks.workunit.client.1.vm08.stdout:5/808: creat d16/d1e/d8c/d99/da8/d9a/f106 x:0 0 0 2026-03-09T19:27:48.627 INFO:tasks.workunit.client.1.vm08.stdout:5/809: dwrite d16/d1e/f27 [0,4194304] 0 2026-03-09T19:27:48.635 INFO:tasks.workunit.client.1.vm08.stdout:0/847: dread - dd/d22/d24/d49/d50/dd4/ff2 zero size 2026-03-09T19:27:48.658 INFO:tasks.workunit.client.1.vm08.stdout:9/830: write d0/d1b/d97/fca [877862,9386] 0 2026-03-09T19:27:48.679 INFO:tasks.workunit.client.1.vm08.stdout:1/957: truncate d9/d40/fef 66381 0 2026-03-09T19:27:48.691 INFO:tasks.workunit.client.1.vm08.stdout:8/815: dwrite de/d47/dfd/d99/da5/fdd [0,4194304] 0 2026-03-09T19:27:48.718 INFO:tasks.workunit.client.1.vm08.stdout:5/810: rename d16/d45/d81 to d16/d8e/ddb/d107 0 2026-03-09T19:27:48.722 INFO:tasks.workunit.client.1.vm08.stdout:7/898: dwrite d5/d14/d38/fbb [0,4194304] 0 2026-03-09T19:27:48.727 INFO:tasks.workunit.client.1.vm08.stdout:3/897: write d0/f106 [714366,39163] 0 2026-03-09T19:27:48.749 INFO:tasks.workunit.client.0.vm07.stdout:8/647: fsync d7/d9/f87 0 2026-03-09T19:27:48.750 
INFO:tasks.workunit.client.0.vm07.stdout:8/648: write d7/d9/d57/fb2 [2138126,74952] 0 2026-03-09T19:27:48.757 INFO:tasks.workunit.client.0.vm07.stdout:5/609: creat d3/dd/d95/fc1 x:0 0 0 2026-03-09T19:27:48.759 INFO:tasks.workunit.client.1.vm08.stdout:0/848: write dd/d22/d27/d2e/f39 [138125,96904] 0 2026-03-09T19:27:48.770 INFO:tasks.workunit.client.0.vm07.stdout:4/612: mkdir d3/d11/d29/db9/d91/dd6 0 2026-03-09T19:27:48.773 INFO:tasks.workunit.client.0.vm07.stdout:4/613: dread d3/d11/d29/d34/f5c [0,4194304] 0 2026-03-09T19:27:48.776 INFO:tasks.workunit.client.0.vm07.stdout:7/622: rmdir d0/d4/d5/d8 39 2026-03-09T19:27:48.777 INFO:tasks.workunit.client.0.vm07.stdout:0/596: creat d0/d6/d13/d1c/d11/d8b/fc0 x:0 0 0 2026-03-09T19:27:48.790 INFO:tasks.workunit.client.0.vm07.stdout:1/659: dwrite d1/d11/f42 [0,4194304] 0 2026-03-09T19:27:48.791 INFO:tasks.workunit.client.0.vm07.stdout:1/660: readlink d1/db/d31/d4f/l5b 0 2026-03-09T19:27:48.835 INFO:tasks.workunit.client.0.vm07.stdout:3/693: getdents d1/d3d/d47/db3/d8e 0 2026-03-09T19:27:48.835 INFO:tasks.workunit.client.0.vm07.stdout:3/694: readlink d1/d3d/ld7 0 2026-03-09T19:27:48.842 INFO:tasks.workunit.client.0.vm07.stdout:2/692: creat d3/dd/d16/d30/da7/dad/ddd/ded/ff2 x:0 0 0 2026-03-09T19:27:48.842 INFO:tasks.workunit.client.0.vm07.stdout:0/597: creat d0/d6/d13/da1/fc1 x:0 0 0 2026-03-09T19:27:48.869 INFO:tasks.workunit.client.0.vm07.stdout:1/661: symlink d1/d11/d37/d3f/d45/d87/ld8 0 2026-03-09T19:27:48.872 INFO:tasks.workunit.client.1.vm08.stdout:4/798: getdents da/d10/d16/d28/d2f/d4f/d64 0 2026-03-09T19:27:48.872 INFO:tasks.workunit.client.1.vm08.stdout:4/799: stat da/d10/d16/d28/d2f/d4f/d56/dd0/lad 0 2026-03-09T19:27:48.875 INFO:tasks.workunit.client.0.vm07.stdout:8/649: mknod d7/d50/da6/dc5/ce6 0 2026-03-09T19:27:48.875 INFO:tasks.workunit.client.0.vm07.stdout:8/650: write f5 [3902717,88103] 0 2026-03-09T19:27:48.876 INFO:tasks.workunit.client.1.vm08.stdout:2/748: link d3/d4/d23/d2c/d39/d5e/c3b d3/d4/d3e/d4e/d88/cfd 
0 2026-03-09T19:27:48.885 INFO:tasks.workunit.client.1.vm08.stdout:8/816: read de/d1d/d2e/d5f/fbb [1258887,7365] 0 2026-03-09T19:27:48.885 INFO:tasks.workunit.client.0.vm07.stdout:9/641: link d0/db/d29/d2c/f4a d0/db/fe0 0 2026-03-09T19:27:48.890 INFO:tasks.workunit.client.1.vm08.stdout:9/831: rename d0/d1b/f65 to d0/d1b/d97/d48/d5d/d74/f115 0 2026-03-09T19:27:48.890 INFO:tasks.workunit.client.1.vm08.stdout:9/832: fdatasync d0/d1b/d97/fca 0 2026-03-09T19:27:48.893 INFO:tasks.workunit.client.0.vm07.stdout:7/623: chown d0/d4/d5/d8/d1a/c72 137871 1 2026-03-09T19:27:48.894 INFO:tasks.workunit.client.0.vm07.stdout:2/693: mkdir d3/dd/d16/d30/da7/dad/ddd/ded/df3 0 2026-03-09T19:27:48.897 INFO:tasks.workunit.client.0.vm07.stdout:0/598: symlink d0/d6/d13/d1c/d11/d56/d78/lc2 0 2026-03-09T19:27:48.900 INFO:tasks.workunit.client.0.vm07.stdout:4/614: write d3/d11/d2b/d37/faf [118197,115151] 0 2026-03-09T19:27:48.903 INFO:tasks.workunit.client.1.vm08.stdout:3/898: dwrite d0/d6/de/d1b/d16/d17/dac/d109/fcc [0,4194304] 0 2026-03-09T19:27:48.904 INFO:tasks.workunit.client.0.vm07.stdout:1/662: creat d1/d3e/db3/d9a/fd9 x:0 0 0 2026-03-09T19:27:48.904 INFO:tasks.workunit.client.1.vm08.stdout:0/849: symlink dd/d22/d27/d2e/d37/l116 0 2026-03-09T19:27:48.905 INFO:tasks.workunit.client.1.vm08.stdout:3/899: write d0/d6/de/d1b/d16/d17/fbc [2343289,56039] 0 2026-03-09T19:27:48.909 INFO:tasks.workunit.client.0.vm07.stdout:8/651: mkdir d7/d9/d37/d45/d56/d67/de7 0 2026-03-09T19:27:48.910 INFO:tasks.workunit.client.0.vm07.stdout:0/599: sync 2026-03-09T19:27:48.912 INFO:tasks.workunit.client.1.vm08.stdout:4/800: dread da/d10/d16/d28/d46/d52/d6e/d73/fae [0,4194304] 0 2026-03-09T19:27:48.915 INFO:tasks.workunit.client.0.vm07.stdout:6/586: link d0/d1/db/d17/dc4/d7b/d7d/fa7 d0/d1/db/d17/fe5 0 2026-03-09T19:27:48.921 INFO:tasks.workunit.client.0.vm07.stdout:9/642: fdatasync d0/db/d29/d2c/d36/d7d/f82 0 2026-03-09T19:27:48.928 INFO:tasks.workunit.client.1.vm08.stdout:8/817: dread de/d1d/d69/f8f 
[0,4194304] 0 2026-03-09T19:27:48.930 INFO:tasks.workunit.client.0.vm07.stdout:3/695: mknod d1/d6/cda 0 2026-03-09T19:27:48.934 INFO:tasks.workunit.client.0.vm07.stdout:3/696: chown d1/d3d/ld9 0 1 2026-03-09T19:27:48.934 INFO:tasks.workunit.client.0.vm07.stdout:3/697: truncate d1/d6/f19 1704007 0 2026-03-09T19:27:48.934 INFO:tasks.workunit.client.0.vm07.stdout:7/624: read - d0/d4/d5/d8/d1a/d2a/fb3 zero size 2026-03-09T19:27:48.934 INFO:tasks.workunit.client.1.vm08.stdout:5/811: link d16/d8e/ff8 d16/d1e/d8c/d99/dcc/df4/dfd/f108 0 2026-03-09T19:27:48.936 INFO:tasks.workunit.client.0.vm07.stdout:7/625: read d0/d4/d5/d8/d41/d64/d74/d98/f18 [4068856,46253] 0 2026-03-09T19:27:48.938 INFO:tasks.workunit.client.0.vm07.stdout:2/694: rename d3/l26 to d3/d49/lf4 0 2026-03-09T19:27:48.943 INFO:tasks.workunit.client.1.vm08.stdout:9/833: dwrite d0/d1b/f8a [0,4194304] 0 2026-03-09T19:27:48.954 INFO:tasks.workunit.client.0.vm07.stdout:8/652: rmdir d7 39 2026-03-09T19:27:48.961 INFO:tasks.workunit.client.0.vm07.stdout:0/600: unlink d0/f68 0 2026-03-09T19:27:48.961 INFO:tasks.workunit.client.1.vm08.stdout:0/850: truncate dd/d22/d27/f9e 3894555 0 2026-03-09T19:27:48.962 INFO:tasks.workunit.client.1.vm08.stdout:3/900: creat d0/d6/de/d1b/d16/dd1/f125 x:0 0 0 2026-03-09T19:27:48.962 INFO:tasks.workunit.client.0.vm07.stdout:5/610: getdents d3/d1a/d28/d40/d92 0 2026-03-09T19:27:48.962 INFO:tasks.workunit.client.1.vm08.stdout:6/851: getdents d3/d68/d7e 0 2026-03-09T19:27:48.965 INFO:tasks.workunit.client.0.vm07.stdout:5/611: sync 2026-03-09T19:27:48.966 INFO:tasks.workunit.client.0.vm07.stdout:5/612: truncate d3/dd/d95/fba 344301 0 2026-03-09T19:27:48.968 INFO:tasks.workunit.client.1.vm08.stdout:2/749: dread d3/d4/d3e/d9d/fc5 [0,4194304] 0 2026-03-09T19:27:48.972 INFO:tasks.workunit.client.1.vm08.stdout:1/958: rename d9/d11/f120 to d9/da/d2d/d4e/df4/f12e 0 2026-03-09T19:27:48.977 INFO:tasks.workunit.client.0.vm07.stdout:2/695: fsync d3/dd/d16/d29/d2d/d45/d3b/dae/fda 0 
2026-03-09T19:27:48.978 INFO:tasks.workunit.client.0.vm07.stdout:2/696: fdatasync d3/d49/faf 0 2026-03-09T19:27:48.981 INFO:tasks.workunit.client.1.vm08.stdout:5/812: dwrite d16/d1e/d8c/d99/fea [0,4194304] 0 2026-03-09T19:27:48.991 INFO:tasks.workunit.client.0.vm07.stdout:1/663: dread d1/d11/d37/d3f/f82 [0,4194304] 0 2026-03-09T19:27:48.994 INFO:tasks.workunit.client.0.vm07.stdout:8/653: readlink d7/d16/l28 0 2026-03-09T19:27:48.994 INFO:tasks.workunit.client.0.vm07.stdout:8/654: fsync d7/d50/da6/fb8 0 2026-03-09T19:27:48.999 INFO:tasks.workunit.client.0.vm07.stdout:6/587: creat d0/d4e/d7f/dbe/fe6 x:0 0 0 2026-03-09T19:27:49.000 INFO:tasks.workunit.client.1.vm08.stdout:7/899: link d5/d14/dae/d3a/d42/d85/l8e d5/d14/d27/d54/dfb/l134 0 2026-03-09T19:27:49.000 INFO:tasks.workunit.client.0.vm07.stdout:4/615: write d3/d11/d16/f77 [269371,45573] 0 2026-03-09T19:27:49.004 INFO:tasks.workunit.client.0.vm07.stdout:0/601: read - d0/d6/d13/d1c/d11/d56/d78/f7d zero size 2026-03-09T19:27:49.011 INFO:tasks.workunit.client.1.vm08.stdout:0/851: unlink dd/d22/d63/d93/lc5 0 2026-03-09T19:27:49.011 INFO:tasks.workunit.client.0.vm07.stdout:0/602: write d0/d6/d13/d1c/d50/d92/fba [806920,98531] 0 2026-03-09T19:27:49.011 INFO:tasks.workunit.client.0.vm07.stdout:0/603: dread - d0/d6/d13/fbf zero size 2026-03-09T19:27:49.011 INFO:tasks.workunit.client.0.vm07.stdout:5/613: fdatasync d3/d1a/d28/d40/d92/fa9 0 2026-03-09T19:27:49.012 INFO:tasks.workunit.client.0.vm07.stdout:9/643: write d0/d6/d57/d8f/f9f [244154,1236] 0 2026-03-09T19:27:49.018 INFO:tasks.workunit.client.1.vm08.stdout:1/959: truncate d9/da/d12/d39/fa7 348446 0 2026-03-09T19:27:49.018 INFO:tasks.workunit.client.1.vm08.stdout:8/818: symlink de/d113/l11e 0 2026-03-09T19:27:49.021 INFO:tasks.workunit.client.0.vm07.stdout:2/697: unlink d3/d11/c5b 0 2026-03-09T19:27:49.026 INFO:tasks.workunit.client.1.vm08.stdout:9/834: fdatasync d0/d1b/f9f 0 2026-03-09T19:27:49.026 INFO:tasks.workunit.client.0.vm07.stdout:1/664: creat d1/d3e/db3/fda 
x:0 0 0 2026-03-09T19:27:49.027 INFO:tasks.workunit.client.1.vm08.stdout:4/801: dwrite da/d10/d16/d28/d2f/d4f/d64/d81/faa [0,4194304] 0 2026-03-09T19:27:49.030 INFO:tasks.workunit.client.1.vm08.stdout:4/802: read - da/d10/d16/d28/d2f/d4f/d64/d81/fb2 zero size 2026-03-09T19:27:49.038 INFO:tasks.workunit.client.1.vm08.stdout:6/852: truncate d3/db/f42 2090955 0 2026-03-09T19:27:49.041 INFO:tasks.workunit.client.0.vm07.stdout:8/655: unlink d7/d50/f84 0 2026-03-09T19:27:49.044 INFO:tasks.workunit.client.1.vm08.stdout:3/901: mknod d0/d6/de/d15/dec/c126 0 2026-03-09T19:27:49.050 INFO:tasks.workunit.client.1.vm08.stdout:3/902: chown d0/d6/de 11064002 1 2026-03-09T19:27:49.050 INFO:tasks.workunit.client.1.vm08.stdout:3/903: dread d0/d52/d6d/f8b [0,4194304] 0 2026-03-09T19:27:49.051 INFO:tasks.workunit.client.0.vm07.stdout:6/588: mknod d0/d4e/d7f/dbe/ce7 0 2026-03-09T19:27:49.057 INFO:tasks.workunit.client.1.vm08.stdout:8/819: symlink de/d1d/d69/l11f 0 2026-03-09T19:27:49.058 INFO:tasks.workunit.client.0.vm07.stdout:0/604: dread d0/d6/d13/d1c/d50/d92/f94 [0,4194304] 0 2026-03-09T19:27:49.062 INFO:tasks.workunit.client.1.vm08.stdout:0/852: sync 2026-03-09T19:27:49.063 INFO:tasks.workunit.client.1.vm08.stdout:0/853: chown dd/d22/d27/d6c/f7f 331 1 2026-03-09T19:27:49.070 INFO:tasks.workunit.client.1.vm08.stdout:2/750: write d3/d9/f1e [2885372,29985] 0 2026-03-09T19:27:49.078 INFO:tasks.workunit.client.1.vm08.stdout:9/835: mkdir d0/d1b/d68/d7f/d116 0 2026-03-09T19:27:49.079 INFO:tasks.workunit.client.1.vm08.stdout:9/836: chown d0/d1b/d97/fca 12703576 1 2026-03-09T19:27:49.080 INFO:tasks.workunit.client.0.vm07.stdout:3/698: link d1/d74/f31 d1/d6/d71/fdb 0 2026-03-09T19:27:49.088 INFO:tasks.workunit.client.0.vm07.stdout:7/626: creat d0/d4/d5/fd0 x:0 0 0 2026-03-09T19:27:49.090 INFO:tasks.workunit.client.1.vm08.stdout:4/803: symlink da/d10/d1b/d23/lea 0 2026-03-09T19:27:49.096 INFO:tasks.workunit.client.0.vm07.stdout:2/698: dread d3/dd/f24 [0,4194304] 0 2026-03-09T19:27:49.096 
INFO:tasks.workunit.client.1.vm08.stdout:7/900: write d5/d14/dae/d3a/d42/d85/da0/fe4 [865694,40980] 0 2026-03-09T19:27:49.100 INFO:tasks.workunit.client.1.vm08.stdout:6/853: fsync d3/db/d43/d69/f89 0 2026-03-09T19:27:49.105 INFO:tasks.workunit.client.1.vm08.stdout:3/904: creat d0/d52/d7c/d7e/f127 x:0 0 0 2026-03-09T19:27:49.114 INFO:tasks.workunit.client.1.vm08.stdout:8/820: creat de/d7c/f120 x:0 0 0 2026-03-09T19:27:49.114 INFO:tasks.workunit.client.1.vm08.stdout:8/821: chown de/d1d/d2e/d5f/l6c 131096079 1 2026-03-09T19:27:49.119 INFO:tasks.workunit.client.1.vm08.stdout:1/960: dwrite d9/da/d17/d60/fea [0,4194304] 0 2026-03-09T19:27:49.120 INFO:tasks.workunit.client.0.vm07.stdout:6/589: dwrite d0/dbf/f34 [0,4194304] 0 2026-03-09T19:27:49.121 INFO:tasks.workunit.client.0.vm07.stdout:6/590: chown d0/d1/db/d24 1753 1 2026-03-09T19:27:49.122 INFO:tasks.workunit.client.0.vm07.stdout:6/591: truncate d0/d1/db/d17/dc4/fd7 923026 0 2026-03-09T19:27:49.144 INFO:tasks.workunit.client.1.vm08.stdout:5/813: link d16/d1e/d8c/d99/da8/cb7 d16/d8e/dff/c109 0 2026-03-09T19:27:49.146 INFO:tasks.workunit.client.1.vm08.stdout:0/854: readlink dd/de4/le9 0 2026-03-09T19:27:49.149 INFO:tasks.workunit.client.0.vm07.stdout:5/614: mknod d3/dd/d26/d2d/d79/d9f/cc2 0 2026-03-09T19:27:49.149 INFO:tasks.workunit.client.0.vm07.stdout:5/615: chown d3/d1a/fb 558893 1 2026-03-09T19:27:49.160 INFO:tasks.workunit.client.1.vm08.stdout:2/751: mknod d3/d4/d23/d2c/d39/d5e/db8/cfe 0 2026-03-09T19:27:49.162 INFO:tasks.workunit.client.0.vm07.stdout:3/699: mkdir d1/d6/dd/dbf/ddc 0 2026-03-09T19:27:49.168 INFO:tasks.workunit.client.0.vm07.stdout:5/616: dread d3/f18 [0,4194304] 0 2026-03-09T19:27:49.172 INFO:tasks.workunit.client.1.vm08.stdout:2/752: dread d3/d4/f6 [4194304,4194304] 0 2026-03-09T19:27:49.173 INFO:tasks.workunit.client.1.vm08.stdout:2/753: write d3/d4/fa7 [2391814,110935] 0 2026-03-09T19:27:49.178 INFO:tasks.workunit.client.0.vm07.stdout:2/699: unlink d3/dd/d16/d29/d3c/d5a/d7a/f6e 0 
2026-03-09T19:27:49.179 INFO:tasks.workunit.client.0.vm07.stdout:7/627: write d0/d4/d5/d8/d1a/d2a/fb3 [4447,8923] 0 2026-03-09T19:27:49.181 INFO:tasks.workunit.client.0.vm07.stdout:1/665: fdatasync d1/db/f1f 0 2026-03-09T19:27:49.182 INFO:tasks.workunit.client.0.vm07.stdout:1/666: readlink d1/la1 0 2026-03-09T19:27:49.182 INFO:tasks.workunit.client.0.vm07.stdout:9/644: dwrite d0/db/d29/d2c/d36/d7d/fd8 [8388608,4194304] 0 2026-03-09T19:27:49.183 INFO:tasks.workunit.client.1.vm08.stdout:7/901: creat d5/d14/d27/d54/dfb/d90/daf/f135 x:0 0 0 2026-03-09T19:27:49.196 INFO:tasks.workunit.client.0.vm07.stdout:4/616: creat d3/fd7 x:0 0 0 2026-03-09T19:27:49.197 INFO:tasks.workunit.client.1.vm08.stdout:6/854: rename d3/db/d43/f56 to d3/d34/da9/da4/d117/f13c 0 2026-03-09T19:27:49.202 INFO:tasks.workunit.client.1.vm08.stdout:3/905: read d0/f7a [5671,35614] 0 2026-03-09T19:27:49.203 INFO:tasks.workunit.client.1.vm08.stdout:3/906: chown d0/l29 6296 1 2026-03-09T19:27:49.208 INFO:tasks.workunit.client.0.vm07.stdout:3/700: symlink d1/d6/d4c/d97/ldd 0 2026-03-09T19:27:49.212 INFO:tasks.workunit.client.1.vm08.stdout:1/961: creat d9/da/d53/d67/d6c/d76/f12f x:0 0 0 2026-03-09T19:27:49.212 INFO:tasks.workunit.client.0.vm07.stdout:5/617: symlink d3/dd/d26/d3f/d47/d71/lc3 0 2026-03-09T19:27:49.212 INFO:tasks.workunit.client.0.vm07.stdout:5/618: truncate d3/dd/d26/d3f/fb3 985899 0 2026-03-09T19:27:49.217 INFO:tasks.workunit.client.1.vm08.stdout:5/814: unlink d16/f18 0 2026-03-09T19:27:49.226 INFO:tasks.workunit.client.1.vm08.stdout:5/815: dread ff [4194304,4194304] 0 2026-03-09T19:27:49.228 INFO:tasks.workunit.client.1.vm08.stdout:8/822: write de/d47/faa [466776,90835] 0 2026-03-09T19:27:49.232 INFO:tasks.workunit.client.1.vm08.stdout:8/823: dwrite de/d91/f9d [0,4194304] 0 2026-03-09T19:27:49.239 INFO:tasks.workunit.client.1.vm08.stdout:0/855: write dd/d22/d27/fc8 [128540,110387] 0 2026-03-09T19:27:49.241 INFO:tasks.workunit.client.0.vm07.stdout:2/700: dwrite d3/d11/f31 [0,4194304] 0 
2026-03-09T19:27:49.241 INFO:tasks.workunit.client.1.vm08.stdout:2/754: mkdir d3/d4/d23/d2c/d39/d5e/db8/dff 0 2026-03-09T19:27:49.242 INFO:tasks.workunit.client.1.vm08.stdout:2/755: fsync d3/d4/fa7 0 2026-03-09T19:27:49.243 INFO:tasks.workunit.client.0.vm07.stdout:9/645: mknod d0/d6f/d86/ce1 0 2026-03-09T19:27:49.244 INFO:tasks.workunit.client.0.vm07.stdout:2/701: truncate d3/dd/d16/d29/d2d/d45/dc3/f9c 5073095 0 2026-03-09T19:27:49.245 INFO:tasks.workunit.client.0.vm07.stdout:2/702: dread - d3/dd/fe8 zero size 2026-03-09T19:27:49.246 INFO:tasks.workunit.client.0.vm07.stdout:8/656: rmdir d7/d30/d75/dcd 0 2026-03-09T19:27:49.247 INFO:tasks.workunit.client.0.vm07.stdout:4/617: dread - d3/d11/d51/faa zero size 2026-03-09T19:27:49.248 INFO:tasks.workunit.client.1.vm08.stdout:6/855: rmdir d3 39 2026-03-09T19:27:49.251 INFO:tasks.workunit.client.0.vm07.stdout:6/592: fdatasync d0/d13/f18 0 2026-03-09T19:27:49.251 INFO:tasks.workunit.client.0.vm07.stdout:6/593: write d0/dbf/d95/d31/f3c [3719472,43322] 0 2026-03-09T19:27:49.271 INFO:tasks.workunit.client.0.vm07.stdout:0/605: rename d0/d6/d13/d1c/d11/d56/d78 to d0/d6/d13/d17/dc3 0 2026-03-09T19:27:49.271 INFO:tasks.workunit.client.0.vm07.stdout:0/606: truncate d0/d6/d13/fbf 551199 0 2026-03-09T19:27:49.272 INFO:tasks.workunit.client.0.vm07.stdout:5/619: creat d3/dd/d26/d3f/fc4 x:0 0 0 2026-03-09T19:27:49.273 INFO:tasks.workunit.client.0.vm07.stdout:5/620: write d3/d1a/d28/d40/d92/fa0 [903038,32825] 0 2026-03-09T19:27:49.277 INFO:tasks.workunit.client.0.vm07.stdout:1/667: symlink d1/d11/d37/d3f/ldb 0 2026-03-09T19:27:49.278 INFO:tasks.workunit.client.1.vm08.stdout:8/824: rename de/d25/d87/l92 to de/d1d/d4f/l121 0 2026-03-09T19:27:49.283 INFO:tasks.workunit.client.0.vm07.stdout:9/646: creat d0/db/d29/d32/d5c/d80/fe2 x:0 0 0 2026-03-09T19:27:49.286 INFO:tasks.workunit.client.0.vm07.stdout:2/703: mknod d3/dd/d16/d30/da7/dad/cf5 0 2026-03-09T19:27:49.286 INFO:tasks.workunit.client.1.vm08.stdout:0/856: creat dd/d22/d63/d6e/df5/f117 
x:0 0 0 2026-03-09T19:27:49.287 INFO:tasks.workunit.client.0.vm07.stdout:7/628: dread d0/d4/d5/d26/d32/f7c [0,4194304] 0 2026-03-09T19:27:49.287 INFO:tasks.workunit.client.0.vm07.stdout:7/629: chown d0/c28 29 1 2026-03-09T19:27:49.288 INFO:tasks.workunit.client.0.vm07.stdout:7/630: chown d0/d80/f81 6700709 1 2026-03-09T19:27:49.296 INFO:tasks.workunit.client.1.vm08.stdout:3/907: dwrite d0/d6/de/d6e/d51/d92/f101 [0,4194304] 0 2026-03-09T19:27:49.300 INFO:tasks.workunit.client.0.vm07.stdout:8/657: write d7/d30/d32/fba [1200424,43914] 0 2026-03-09T19:27:49.311 INFO:tasks.workunit.client.1.vm08.stdout:7/902: creat d5/d14/d2b/daa/d131/f136 x:0 0 0 2026-03-09T19:27:49.311 INFO:tasks.workunit.client.1.vm08.stdout:7/903: chown d5/d14/dae 2227 1 2026-03-09T19:27:49.316 INFO:tasks.workunit.client.0.vm07.stdout:6/594: symlink d0/d1/d28/da8/le8 0 2026-03-09T19:27:49.317 INFO:tasks.workunit.client.0.vm07.stdout:3/701: mknod d1/d6/d45/dac/cde 0 2026-03-09T19:27:49.318 INFO:tasks.workunit.client.1.vm08.stdout:6/856: stat d3/db/d43/d11d 0 2026-03-09T19:27:49.319 INFO:tasks.workunit.client.0.vm07.stdout:4/618: rename d3/d11/d51/l5d to d3/d11/d2b/d38/ld8 0 2026-03-09T19:27:49.320 INFO:tasks.workunit.client.1.vm08.stdout:1/962: mkdir d9/d11/d7a/d130 0 2026-03-09T19:27:49.321 INFO:tasks.workunit.client.0.vm07.stdout:5/621: truncate d3/dd/f24 3092688 0 2026-03-09T19:27:49.324 INFO:tasks.workunit.client.1.vm08.stdout:5/816: symlink d16/d1e/dc9/d102/l10a 0 2026-03-09T19:27:49.325 INFO:tasks.workunit.client.1.vm08.stdout:5/817: readlink d16/d45/daf/ldf 0 2026-03-09T19:27:49.327 INFO:tasks.workunit.client.0.vm07.stdout:9/647: creat d0/d6/fe3 x:0 0 0 2026-03-09T19:27:49.330 INFO:tasks.workunit.client.0.vm07.stdout:9/648: dwrite d0/d6/ff [0,4194304] 0 2026-03-09T19:27:49.335 INFO:tasks.workunit.client.1.vm08.stdout:6/857: dread d3/f12 [0,4194304] 0 2026-03-09T19:27:49.344 INFO:tasks.workunit.client.0.vm07.stdout:2/704: unlink d3/dd/d16/d30/f3a 0 2026-03-09T19:27:49.344 
INFO:tasks.workunit.client.0.vm07.stdout:9/649: dread d0/db/d29/d2c/d36/d7d/fd8 [0,4194304] 0 2026-03-09T19:27:49.345 INFO:tasks.workunit.client.0.vm07.stdout:2/705: dread - d3/dd/d16/d29/d2d/d45/d3b/fe5 zero size 2026-03-09T19:27:49.352 INFO:tasks.workunit.client.1.vm08.stdout:9/837: link d0/d2/d14/d98/f40 d0/d1b/d97/f117 0 2026-03-09T19:27:49.356 INFO:tasks.workunit.client.1.vm08.stdout:0/857: dread - dd/d31/dca/f10b zero size 2026-03-09T19:27:49.359 INFO:tasks.workunit.client.1.vm08.stdout:4/804: getdents da/d10/d16/d28/d2f/de9 0 2026-03-09T19:27:49.364 INFO:tasks.workunit.client.1.vm08.stdout:3/908: symlink d0/d52/d6d/d77/d88/l128 0 2026-03-09T19:27:49.368 INFO:tasks.workunit.client.1.vm08.stdout:7/904: rename d5/d14/c31 to d5/d14/d2b/d5d/c137 0 2026-03-09T19:27:49.369 INFO:tasks.workunit.client.1.vm08.stdout:7/905: chown d5/d14/dae/d1c/d73/fbe 811071 1 2026-03-09T19:27:49.371 INFO:tasks.workunit.client.0.vm07.stdout:0/607: mknod d0/cc4 0 2026-03-09T19:27:49.371 INFO:tasks.workunit.client.1.vm08.stdout:2/756: mknod d3/c100 0 2026-03-09T19:27:49.372 INFO:tasks.workunit.client.0.vm07.stdout:4/619: fsync d3/d4f/d56/d5f/f72 0 2026-03-09T19:27:49.373 INFO:tasks.workunit.client.0.vm07.stdout:4/620: stat d3/f1a 0 2026-03-09T19:27:49.373 INFO:tasks.workunit.client.0.vm07.stdout:4/621: write d3/f8d [3905886,43079] 0 2026-03-09T19:27:49.375 INFO:tasks.workunit.client.1.vm08.stdout:8/825: dwrite de/f16 [0,4194304] 0 2026-03-09T19:27:49.377 INFO:tasks.workunit.client.1.vm08.stdout:1/963: fsync d9/da/d2c/fd6 0 2026-03-09T19:27:49.387 INFO:tasks.workunit.client.0.vm07.stdout:5/622: rename d3/dd/f58 to d3/dd/d26/d2d/d60/fc5 0 2026-03-09T19:27:49.393 INFO:tasks.workunit.client.0.vm07.stdout:5/623: dread d3/dd/f23 [0,4194304] 0 2026-03-09T19:27:49.398 INFO:tasks.workunit.client.1.vm08.stdout:4/805: mknod da/d10/d16/ceb 0 2026-03-09T19:27:49.400 INFO:tasks.workunit.client.1.vm08.stdout:3/909: fsync d0/d4b/fb0 0 2026-03-09T19:27:49.401 
INFO:tasks.workunit.client.1.vm08.stdout:3/910: write d0/d52/d7c/d7e/f127 [263778,9133] 0 2026-03-09T19:27:49.401 INFO:tasks.workunit.client.1.vm08.stdout:7/906: fsync d5/d14/d38/f47 0 2026-03-09T19:27:49.402 INFO:tasks.workunit.client.1.vm08.stdout:3/911: chown d0/d6/de/d15/l78 203107662 1 2026-03-09T19:27:49.403 INFO:tasks.workunit.client.1.vm08.stdout:3/912: readlink d0/d6/de/d6e/d51/d92/l10f 0 2026-03-09T19:27:49.406 INFO:tasks.workunit.client.1.vm08.stdout:7/907: dwrite d5/d14/dae/d3a/fca [0,4194304] 0 2026-03-09T19:27:49.406 INFO:tasks.workunit.client.1.vm08.stdout:3/913: truncate d0/d6/de/d1b/d16/dd1/f125 824935 0 2026-03-09T19:27:49.410 INFO:tasks.workunit.client.1.vm08.stdout:2/757: creat d3/d4/d23/d2c/dc1/f101 x:0 0 0 2026-03-09T19:27:49.411 INFO:tasks.workunit.client.1.vm08.stdout:2/758: readlink d3/d4/d23/d2c/d39/d5e/de/d18/d1f/l22 0 2026-03-09T19:27:49.412 INFO:tasks.workunit.client.1.vm08.stdout:2/759: dread - d3/d4/d23/d2c/dc1/f101 zero size 2026-03-09T19:27:49.424 INFO:tasks.workunit.client.1.vm08.stdout:8/826: creat de/d47/dd4/f122 x:0 0 0 2026-03-09T19:27:49.424 INFO:tasks.workunit.client.1.vm08.stdout:8/827: readlink de/d25/d31/ld2 0 2026-03-09T19:27:49.425 INFO:tasks.workunit.client.1.vm08.stdout:1/964: truncate d9/da/d95/dcd/fee 355933 0 2026-03-09T19:27:49.426 INFO:tasks.workunit.client.1.vm08.stdout:1/965: chown d9/d11/d7a/c9f 61 1 2026-03-09T19:27:49.429 INFO:tasks.workunit.client.1.vm08.stdout:9/838: truncate d0/d2/d14/d98/dbb/fe4 3648979 0 2026-03-09T19:27:49.435 INFO:tasks.workunit.client.1.vm08.stdout:3/914: mkdir d0/d6/d93/dcb/d129 0 2026-03-09T19:27:49.435 INFO:tasks.workunit.client.1.vm08.stdout:3/915: dread - d0/d6/de/d15/d96/df5/f111 zero size 2026-03-09T19:27:49.440 INFO:tasks.workunit.client.1.vm08.stdout:7/908: mknod d5/d14/dae/d1c/db5/df8/c138 0 2026-03-09T19:27:49.445 INFO:tasks.workunit.client.1.vm08.stdout:2/760: readlink d3/l4c 0 2026-03-09T19:27:49.446 INFO:tasks.workunit.client.1.vm08.stdout:2/761: dread - 
d3/d4/d23/d2c/ff0 zero size 2026-03-09T19:27:49.446 INFO:tasks.workunit.client.1.vm08.stdout:2/762: readlink d3/d4/d3e/d4e/d88/l8a 0 2026-03-09T19:27:49.447 INFO:tasks.workunit.client.1.vm08.stdout:2/763: chown d3/d4/d23/d2c/d39/d5e/db8/dff 5918 1 2026-03-09T19:27:49.452 INFO:tasks.workunit.client.1.vm08.stdout:0/858: dwrite dd/d22/d24/f71 [0,4194304] 0 2026-03-09T19:27:49.456 INFO:tasks.workunit.client.1.vm08.stdout:8/828: dread - de/d47/dfd/d99/da0/fda zero size 2026-03-09T19:27:49.456 INFO:tasks.workunit.client.1.vm08.stdout:0/859: chown dd/d22/d24/d49/d50/d78/fbb 931702345 1 2026-03-09T19:27:49.457 INFO:tasks.workunit.client.1.vm08.stdout:0/860: truncate dd/f115 269589 0 2026-03-09T19:27:49.464 INFO:tasks.workunit.client.0.vm07.stdout:3/702: symlink d1/d6/ldf 0 2026-03-09T19:27:49.464 INFO:tasks.workunit.client.1.vm08.stdout:5/818: link d16/d1e/d6e/l77 d16/d1e/d9f/l10b 0 2026-03-09T19:27:49.469 INFO:tasks.workunit.client.1.vm08.stdout:5/819: dwrite d16/d1e/d8c/d99/da8/d9a/f106 [0,4194304] 0 2026-03-09T19:27:49.473 INFO:tasks.workunit.client.1.vm08.stdout:9/839: rename d0/d1b/d97/d48/d5d/d74/ded/feb to d0/d2/d14/d5c/f118 0 2026-03-09T19:27:49.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: Migrating agent root cert to cert store 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: Migrating agent root key to cert store 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: Checking for cert/key for grafana.vm07 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: Migrating grafana.vm07 cert to cert store 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: Migrating grafana.vm07 key to cert store 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local 
ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.mxylvw/mirror_snapshot_schedule"}]: dispatch 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: mgrmap e22: vm08.mxylvw(active, since 1.12073s) 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.mxylvw/mirror_snapshot_schedule"}]: dispatch 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: pgmap v3: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.mxylvw/trash_purge_schedule"}]: dispatch 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.mxylvw/trash_purge_schedule"}]: dispatch 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: Standby manager daemon vm07.xacuym started 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: from='mgr.? 192.168.123.107:0/219465446' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.xacuym/crt"}]: dispatch 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: from='mgr.? 
192.168.123.107:0/219465446' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: from='mgr.? 192.168.123.107:0/219465446' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.xacuym/key"}]: dispatch 2026-03-09T19:27:49.490 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:49 vm07.local ceph-mon[48545]: from='mgr.? 192.168.123.107:0/219465446' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T19:27:49.491 INFO:tasks.workunit.client.1.vm08.stdout:6/858: dwrite d3/db/fb0 [0,4194304] 0 2026-03-09T19:27:49.492 INFO:tasks.workunit.client.1.vm08.stdout:7/909: sync 2026-03-09T19:27:49.503 INFO:tasks.workunit.client.1.vm08.stdout:4/806: dwrite da/d10/d26/dd8/f57 [0,4194304] 0 2026-03-09T19:27:49.506 INFO:tasks.workunit.client.1.vm08.stdout:8/829: rename de/d1d/d2e/d5f/f57 to de/d25/d87/dc9/dd8/f123 0 2026-03-09T19:27:49.519 INFO:tasks.workunit.client.0.vm07.stdout:2/706: mkdir d3/dd/d16/d29/d2d/d45/df6 0 2026-03-09T19:27:49.523 INFO:tasks.workunit.client.1.vm08.stdout:9/840: truncate d0/d2/d80/d69/ff2 476934 0 2026-03-09T19:27:49.526 INFO:tasks.workunit.client.0.vm07.stdout:7/631: link d0/d4/d5/d26/d32/dbd/fba d0/d4/d5/d26/db9/dc2/fd1 0 2026-03-09T19:27:49.527 INFO:tasks.workunit.client.1.vm08.stdout:3/916: symlink d0/d6/l12a 0 2026-03-09T19:27:49.530 INFO:tasks.workunit.client.0.vm07.stdout:8/658: creat d7/d9/d37/fe8 x:0 0 0 2026-03-09T19:27:49.531 INFO:tasks.workunit.client.1.vm08.stdout:2/764: symlink d3/l102 0 2026-03-09T19:27:49.534 INFO:tasks.workunit.client.0.vm07.stdout:6/595: link d0/d4e/d7f/fbc d0/d4e/d7f/fe9 0 2026-03-09T19:27:49.537 INFO:tasks.workunit.client.1.vm08.stdout:7/910: symlink d5/d14/d27/d54/dfb/d90/l139 0 2026-03-09T19:27:49.543 INFO:tasks.workunit.client.0.vm07.stdout:9/650: dwrite d0/db/d29/d2c/f61 
[0,4194304] 0 2026-03-09T19:27:49.544 INFO:tasks.workunit.client.0.vm07.stdout:9/651: chown d0/d6f/d86 260676654 1 2026-03-09T19:27:49.544 INFO:tasks.workunit.client.1.vm08.stdout:1/966: dwrite d9/da/d12/f5c [0,4194304] 0 2026-03-09T19:27:49.550 INFO:tasks.workunit.client.1.vm08.stdout:4/807: dread - da/d10/d16/d28/d46/d52/d6e/d40/fc2 zero size 2026-03-09T19:27:49.550 INFO:tasks.workunit.client.0.vm07.stdout:1/668: getdents d1/d11/d37/d3f/d45/d87/d88 0 2026-03-09T19:27:49.556 INFO:tasks.workunit.client.1.vm08.stdout:7/911: sync 2026-03-09T19:27:49.558 INFO:tasks.workunit.client.1.vm08.stdout:7/912: read d5/d14/dae/d1c/f5a [1777936,2820] 0 2026-03-09T19:27:49.566 INFO:tasks.workunit.client.0.vm07.stdout:2/707: mknod d3/dd/d16/d30/da7/dad/cf7 0 2026-03-09T19:27:49.568 INFO:tasks.workunit.client.1.vm08.stdout:5/820: rename d16/d45/daf/df5/d8a to d16/d1e/dc9/d10c 0 2026-03-09T19:27:49.569 INFO:tasks.workunit.client.0.vm07.stdout:7/632: creat d0/d52/fd2 x:0 0 0 2026-03-09T19:27:49.569 INFO:tasks.workunit.client.1.vm08.stdout:9/841: symlink d0/d1b/d97/d48/d5d/d74/ded/l119 0 2026-03-09T19:27:49.570 INFO:tasks.workunit.client.0.vm07.stdout:8/659: mkdir d7/d30/d32/de9 0 2026-03-09T19:27:49.576 INFO:tasks.workunit.client.0.vm07.stdout:3/703: rename d1/d1f/l25 to d1/d6/d45/d54/dd1/le0 0 2026-03-09T19:27:49.579 INFO:tasks.workunit.client.0.vm07.stdout:6/596: mknod d0/d1/d28/d76/dad/cea 0 2026-03-09T19:27:49.585 INFO:tasks.workunit.client.1.vm08.stdout:4/808: dread da/d10/d16/d28/d46/fbe [0,4194304] 0 2026-03-09T19:27:49.585 INFO:tasks.workunit.client.1.vm08.stdout:7/913: unlink d5/d14/d27/d78/dc7/dce/cd5 0 2026-03-09T19:27:49.585 INFO:tasks.workunit.client.1.vm08.stdout:1/967: rename d9/da/d12/d39 to d9/da/d17/d60/d131 0 2026-03-09T19:27:49.585 INFO:tasks.workunit.client.0.vm07.stdout:0/608: creat d0/d6/d13/d17/fc5 x:0 0 0 2026-03-09T19:27:49.585 INFO:tasks.workunit.client.0.vm07.stdout:9/652: stat d0/db/d29/d32/c37 0 2026-03-09T19:27:49.585 
INFO:tasks.workunit.client.0.vm07.stdout:0/609: write d0/d6/d13/fbf [115468,123940] 0 2026-03-09T19:27:49.585 INFO:tasks.workunit.client.0.vm07.stdout:4/622: creat d3/d11/d29/db9/d22/fd9 x:0 0 0 2026-03-09T19:27:49.588 INFO:tasks.workunit.client.1.vm08.stdout:7/914: sync 2026-03-09T19:27:49.592 INFO:tasks.workunit.client.0.vm07.stdout:8/660: unlink d7/d50/f80 0 2026-03-09T19:27:49.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: Migrating agent root cert to cert store 2026-03-09T19:27:49.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: Migrating agent root key to cert store 2026-03-09T19:27:49.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: Checking for cert/key for grafana.vm07 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: Migrating grafana.vm07 cert to cert store 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: Migrating grafana.vm07 key to cert store 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.mxylvw/mirror_snapshot_schedule"}]: dispatch 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: mgrmap e22: vm08.mxylvw(active, since 1.12073s) 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.mxylvw/mirror_snapshot_schedule"}]: dispatch 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: pgmap v3: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB 
used, 109 GiB / 120 GiB avail 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.mxylvw/trash_purge_schedule"}]: dispatch 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.mxylvw/trash_purge_schedule"}]: dispatch 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: Standby manager daemon vm07.xacuym started 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.107:0/219465446' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.xacuym/crt"}]: dispatch 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.107:0/219465446' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.107:0/219465446' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.xacuym/key"}]: dispatch 2026-03-09T19:27:49.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:49 vm08.local ceph-mon[57794]: from='mgr.? 
192.168.123.107:0/219465446' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T19:27:49.600 INFO:tasks.workunit.client.1.vm08.stdout:0/861: dwrite dd/d22/d24/f77 [0,4194304] 0 2026-03-09T19:27:49.609 INFO:tasks.workunit.client.0.vm07.stdout:6/597: creat d0/dbf/d95/feb x:0 0 0 2026-03-09T19:27:49.610 INFO:tasks.workunit.client.0.vm07.stdout:6/598: write d0/d44/fe2 [966428,97677] 0 2026-03-09T19:27:49.612 INFO:tasks.workunit.client.1.vm08.stdout:6/859: dwrite d3/db/d43/d69/f89 [0,4194304] 0 2026-03-09T19:27:49.623 INFO:tasks.workunit.client.1.vm08.stdout:7/915: truncate d5/d14/dae/f45 415 0 2026-03-09T19:27:49.625 INFO:tasks.workunit.client.0.vm07.stdout:9/653: creat d0/db/d29/d32/d5c/d80/fe4 x:0 0 0 2026-03-09T19:27:49.628 INFO:tasks.workunit.client.1.vm08.stdout:9/842: creat d0/d1b/d97/dd3/f11a x:0 0 0 2026-03-09T19:27:49.631 INFO:tasks.workunit.client.1.vm08.stdout:8/830: dwrite de/f10 [0,4194304] 0 2026-03-09T19:27:49.643 INFO:tasks.workunit.client.0.vm07.stdout:1/669: creat d1/d11/d37/d3f/dd0/fdc x:0 0 0 2026-03-09T19:27:49.643 INFO:tasks.workunit.client.1.vm08.stdout:3/917: link d0/cbf d0/d52/c12b 0 2026-03-09T19:27:49.646 INFO:tasks.workunit.client.0.vm07.stdout:4/623: truncate d3/d11/d29/d34/f5c 958513 0 2026-03-09T19:27:49.647 INFO:tasks.workunit.client.0.vm07.stdout:5/624: getdents d3/dd 0 2026-03-09T19:27:49.647 INFO:tasks.workunit.client.1.vm08.stdout:2/765: write d3/d4/d23/d2c/f94 [2125464,119611] 0 2026-03-09T19:27:49.648 INFO:tasks.workunit.client.0.vm07.stdout:5/625: write d3/dd/d26/d3f/d47/d71/fb4 [955231,39832] 0 2026-03-09T19:27:49.652 INFO:tasks.workunit.client.0.vm07.stdout:2/708: link d3/dd/d16/d29/d2d/d45/d85/d8a/f9e d3/dd/d16/d30/da7/dad/ddd/ff8 0 2026-03-09T19:27:49.658 INFO:tasks.workunit.client.1.vm08.stdout:1/968: mkdir d9/d11/d7a/d130/d132 0 2026-03-09T19:27:49.667 INFO:tasks.workunit.client.1.vm08.stdout:4/809: write da/d10/d1b/f85 [1535796,116355] 0 2026-03-09T19:27:49.668 
INFO:tasks.workunit.client.1.vm08.stdout:4/810: write da/d10/d16/d28/d2f/d4f/d64/f6f [1805459,31462] 0
2026-03-09T19:27:49.672 INFO:tasks.workunit.client.1.vm08.stdout:7/916: dread - d5/dc4/f10c zero size
2026-03-09T19:27:49.678 INFO:tasks.workunit.client.1.vm08.stdout:8/831: read de/d1d/d4f/fae [72418,92680] 0
2026-03-09T19:27:49.682 INFO:tasks.workunit.client.0.vm07.stdout:4/624: unlink d3/d11/d2b/d37/l32 0
2026-03-09T19:27:49.683 INFO:tasks.workunit.client.1.vm08.stdout:0/862: symlink dd/d22/d24/d49/d50/de3/l118 0
2026-03-09T19:27:49.684 INFO:tasks.workunit.client.0.vm07.stdout:5/626: fsync d3/d1a/d28/d36/f8c 0
2026-03-09T19:27:49.687 INFO:tasks.workunit.client.1.vm08.stdout:5/821: rename d16/d1e/d9f/cf1 to d16/d1e/d8c/d99/da8/c10d 0
2026-03-09T19:27:49.689 INFO:tasks.workunit.client.0.vm07.stdout:9/654: dwrite d0/db/fb0 [0,4194304] 0
2026-03-09T19:27:49.698 INFO:tasks.workunit.client.1.vm08.stdout:2/766: dread d3/d4/d23/d2c/f80 [0,4194304] 0
2026-03-09T19:27:49.701 INFO:tasks.workunit.client.0.vm07.stdout:8/661: truncate d7/d16/f71 1741529 0
2026-03-09T19:27:49.702 INFO:tasks.workunit.client.0.vm07.stdout:3/704: creat d1/d3d/d47/db3/dc2/fe1 x:0 0 0
2026-03-09T19:27:49.704 INFO:tasks.workunit.client.0.vm07.stdout:2/709: dwrite d3/dd/d16/d29/fa3 [0,4194304] 0
2026-03-09T19:27:49.705 INFO:tasks.workunit.client.0.vm07.stdout:0/610: creat d0/d6/d13/d1c/d50/fc6 x:0 0 0
2026-03-09T19:27:49.706 INFO:tasks.workunit.client.0.vm07.stdout:1/670: mknod d1/db/d31/cdd 0
2026-03-09T19:27:49.710 INFO:tasks.workunit.client.0.vm07.stdout:5/627: unlink d3/d1a/c8b 0
2026-03-09T19:27:49.715 INFO:tasks.workunit.client.0.vm07.stdout:9/655: fsync d0/d6/f8 0
2026-03-09T19:27:49.717 INFO:tasks.workunit.client.0.vm07.stdout:7/633: rename d0/d4/d5/c9d to d0/cd3 0
2026-03-09T19:27:49.717 INFO:tasks.workunit.client.0.vm07.stdout:3/705: rmdir d1/d6/dd 39
2026-03-09T19:27:49.720 INFO:tasks.workunit.client.0.vm07.stdout:5/628: fsync d3/f93 0
2026-03-09T19:27:49.721 INFO:tasks.workunit.client.0.vm07.stdout:9/656: dread - d0/d6/d3a/d94/fa0 zero size
2026-03-09T19:27:49.722 INFO:tasks.workunit.client.0.vm07.stdout:9/657: write d0/d6/d3a/f89 [11249,89615] 0
2026-03-09T19:27:49.722 INFO:tasks.workunit.client.0.vm07.stdout:9/658: read - d0/db/fac zero size
2026-03-09T19:27:49.724 INFO:tasks.workunit.client.0.vm07.stdout:7/634: rename d0/d52/d54/ca8 to d0/d4/d5/d26/db9/cd4 0
2026-03-09T19:27:49.725 INFO:tasks.workunit.client.0.vm07.stdout:7/635: stat d0/d4/d5/d8/d41/d64/d74/d98/f18 0
2026-03-09T19:27:49.726 INFO:tasks.workunit.client.0.vm07.stdout:7/636: stat d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58 0
2026-03-09T19:27:49.728 INFO:tasks.workunit.client.1.vm08.stdout:6/860: symlink d3/d34/da9/da4/d117/d10d/l13d 0
2026-03-09T19:27:49.730 INFO:tasks.workunit.client.0.vm07.stdout:4/625: creat d3/fda x:0 0 0
2026-03-09T19:27:49.731 INFO:tasks.workunit.client.1.vm08.stdout:7/917: fsync d5/d14/d2b/d5d/fb2 0
2026-03-09T19:27:49.731 INFO:tasks.workunit.client.1.vm08.stdout:7/918: stat d5/d14/d2b/d5d/f84 0
2026-03-09T19:27:49.734 INFO:tasks.workunit.client.0.vm07.stdout:0/611: creat d0/d6/d13/d1c/d50/fc7 x:0 0 0
2026-03-09T19:27:49.737 INFO:tasks.workunit.client.0.vm07.stdout:4/626: dread d3/d11/d2b/d37/f30 [0,4194304] 0
2026-03-09T19:27:49.744 INFO:tasks.workunit.client.0.vm07.stdout:1/671: link d1/d11/lb4 d1/d3e/dae/lde 0
2026-03-09T19:27:49.745 INFO:tasks.workunit.client.1.vm08.stdout:4/811: dread da/d10/d1b/f37 [0,4194304] 0
2026-03-09T19:27:49.750 INFO:tasks.workunit.client.1.vm08.stdout:3/918: mkdir d0/d52/d7c/d12c 0
2026-03-09T19:27:49.751 INFO:tasks.workunit.client.1.vm08.stdout:3/919: chown d0/d52/d7c/d7e/cd6 4 1
2026-03-09T19:27:49.752 INFO:tasks.workunit.client.0.vm07.stdout:4/627: dread d3/d4f/f5e [0,4194304] 0
2026-03-09T19:27:49.760 INFO:tasks.workunit.client.1.vm08.stdout:4/812: dread da/d10/f77 [0,4194304] 0
2026-03-09T19:27:49.762 INFO:tasks.workunit.client.1.vm08.stdout:4/813: read da/d10/d26/f87 [808327,88757] 0
2026-03-09T19:27:49.763 INFO:tasks.workunit.client.1.vm08.stdout:4/814: write da/d10/d16/d28/d2f/d4f/d64/d81/faa [853237,44377] 0
2026-03-09T19:27:49.765 INFO:tasks.workunit.client.0.vm07.stdout:0/612: rename d0/d6/d13/d1c/d50/d92 to d0/d6/dc8 0
2026-03-09T19:27:49.765 INFO:tasks.workunit.client.0.vm07.stdout:1/672: sync
2026-03-09T19:27:49.767 INFO:tasks.workunit.client.0.vm07.stdout:1/673: sync
2026-03-09T19:27:49.769 INFO:tasks.workunit.client.1.vm08.stdout:2/767: symlink d3/d4/d23/d2c/d39/d5e/de/d18/d99/l103 0
2026-03-09T19:27:49.771 INFO:tasks.workunit.client.1.vm08.stdout:2/768: read d3/f7c [176171,110415] 0
2026-03-09T19:27:49.774 INFO:tasks.workunit.client.1.vm08.stdout:1/969: fdatasync d9/da/d17/fa9 0
2026-03-09T19:27:49.777 INFO:tasks.workunit.client.0.vm07.stdout:0/613: dread d0/d6/d13/f31 [0,4194304] 0
2026-03-09T19:27:49.779 INFO:tasks.workunit.client.0.vm07.stdout:1/674: dread d1/d3/f4 [0,4194304] 0
2026-03-09T19:27:49.780 INFO:tasks.workunit.client.1.vm08.stdout:7/919: chown d5/d14/d2b/d5d/cf4 37424 1
2026-03-09T19:27:49.784 INFO:tasks.workunit.client.0.vm07.stdout:4/628: mknod d3/d11/d29/db9/d22/cdb 0
2026-03-09T19:27:49.793 INFO:tasks.workunit.client.1.vm08.stdout:5/822: getdents d16/d8e/ddb/de3 0
2026-03-09T19:27:49.796 INFO:tasks.workunit.client.0.vm07.stdout:2/710: write d3/dd/f9a [651254,59008] 0
2026-03-09T19:27:49.798 INFO:tasks.workunit.client.0.vm07.stdout:6/599: write d0/dbf/d95/f35 [3065373,50735] 0
2026-03-09T19:27:49.803 INFO:tasks.workunit.client.0.vm07.stdout:2/711: unlink d3/dd/d16/d29/c6c 0
2026-03-09T19:27:49.805 INFO:tasks.workunit.client.0.vm07.stdout:6/600: creat d0/d1/d28/d76/fec x:0 0 0
2026-03-09T19:27:49.807 INFO:tasks.workunit.client.0.vm07.stdout:8/662: write d7/d9/d37/d45/d56/f7a [2916074,58632] 0
2026-03-09T19:27:49.811 INFO:tasks.workunit.client.0.vm07.stdout:2/712: fsync d3/dd/d16/d29/d2d/d45/d3b/dae/fda 0
2026-03-09T19:27:49.815 INFO:tasks.workunit.client.0.vm07.stdout:1/675: getdents d1/d91 0
2026-03-09T19:27:49.817 INFO:tasks.workunit.client.1.vm08.stdout:9/843: getdents d0/d1b/d97 0
2026-03-09T19:27:49.824 INFO:tasks.workunit.client.0.vm07.stdout:5/629: dwrite d3/d1a/d28/d40/d92/fa9 [0,4194304] 0
2026-03-09T19:27:49.830 INFO:tasks.workunit.client.0.vm07.stdout:9/659: dwrite d0/d6/d73/fa6 [0,4194304] 0
2026-03-09T19:27:49.830 INFO:tasks.workunit.client.1.vm08.stdout:5/823: fdatasync d16/d45/fb1 0
2026-03-09T19:27:49.832 INFO:tasks.workunit.client.1.vm08.stdout:5/824: truncate d16/d1e/d8c/d99/f105 897538 0
2026-03-09T19:27:49.836 INFO:tasks.workunit.client.0.vm07.stdout:2/713: mknod d3/dd/d16/d29/d2d/d45/d8b/d98/dee/cf9 0
2026-03-09T19:27:49.836 INFO:tasks.workunit.client.0.vm07.stdout:2/714: chown d3/dd/d16/d29/d2d/d45/d85/d8a 229 1
2026-03-09T19:27:49.837 INFO:tasks.workunit.client.0.vm07.stdout:3/706: write d1/d3d/d47/db3/d8e/da9/f93 [3159060,130228] 0
2026-03-09T19:27:49.839 INFO:tasks.workunit.client.0.vm07.stdout:2/715: sync
2026-03-09T19:27:49.840 INFO:tasks.workunit.client.0.vm07.stdout:2/716: sync
2026-03-09T19:27:49.840 INFO:tasks.workunit.client.1.vm08.stdout:4/815: creat da/d10/d26/d27/d9b/fec x:0 0 0
2026-03-09T19:27:49.842 INFO:tasks.workunit.client.1.vm08.stdout:4/816: dread da/d10/d16/d28/d46/d52/d6e/d73/fae [0,4194304] 0
2026-03-09T19:27:49.849 INFO:tasks.workunit.client.0.vm07.stdout:6/601: link d0/d1/d28/fc5 d0/dbf/d95/d31/fed 0
2026-03-09T19:27:49.853 INFO:tasks.workunit.client.1.vm08.stdout:2/769: mkdir d3/d9/d26/ded/d104 0
2026-03-09T19:27:49.854 INFO:tasks.workunit.client.1.vm08.stdout:2/770: write d3/d4/f8 [5128803,117183] 0
2026-03-09T19:27:49.854 INFO:tasks.workunit.client.1.vm08.stdout:2/771: chown d3/d4/d23/d2c/d39/db9/df6 0 1
2026-03-09T19:27:49.857 INFO:tasks.workunit.client.0.vm07.stdout:7/637: dwrite d0/d4/d5/d8/d41/d64/d74/d98/f18 [0,4194304] 0
2026-03-09T19:27:49.863 INFO:tasks.workunit.client.0.vm07.stdout:5/630: unlink d3/d1a/d28/d40/d92/fa0 0
2026-03-09T19:27:49.864 INFO:tasks.workunit.client.0.vm07.stdout:5/631: chown d3/d1a/d5a/db8 15350826 1
2026-03-09T19:27:49.869 INFO:tasks.workunit.client.1.vm08.stdout:6/861: rename d3/db/d43/d69/da0/faf to d3/d34/f13e 0
2026-03-09T19:27:49.873 INFO:tasks.workunit.client.1.vm08.stdout:7/920: creat d5/d14/dae/d12f/f13a x:0 0 0
2026-03-09T19:27:49.880 INFO:tasks.workunit.client.0.vm07.stdout:8/663: link d7/d9/d37/d45/d56/d62/fc8 d7/d9/d37/d45/d56/d67/de7/fea 0
2026-03-09T19:27:49.885 INFO:tasks.workunit.client.1.vm08.stdout:8/832: getdents de/d91/dc8 0
2026-03-09T19:27:49.887 INFO:tasks.workunit.client.0.vm07.stdout:4/629: rename d3/d11/d29/db9 to d3/d11/d2b/d38/ddc 0
2026-03-09T19:27:49.888 INFO:tasks.workunit.client.0.vm07.stdout:3/707: mknod d1/d3d/ce2 0
2026-03-09T19:27:49.891 INFO:tasks.workunit.client.1.vm08.stdout:0/863: getdents dd/d22/d24/d49/d50 0
2026-03-09T19:27:49.892 INFO:tasks.workunit.client.0.vm07.stdout:8/664: sync
2026-03-09T19:27:49.893 INFO:tasks.workunit.client.1.vm08.stdout:5/825: creat d16/f10e x:0 0 0
2026-03-09T19:27:49.893 INFO:tasks.workunit.client.0.vm07.stdout:0/614: write d0/f65 [1017465,54906] 0
2026-03-09T19:27:49.894 INFO:tasks.workunit.client.1.vm08.stdout:5/826: chown d16/d45/l85 177684650 1
2026-03-09T19:27:49.895 INFO:tasks.workunit.client.0.vm07.stdout:8/665: dwrite d7/d9/d37/d45/d4f/db1/fd6 [0,4194304] 0
2026-03-09T19:27:49.897 INFO:tasks.workunit.client.0.vm07.stdout:7/638: truncate d0/d80/f81 2534101 0
2026-03-09T19:27:49.901 INFO:tasks.workunit.client.0.vm07.stdout:9/660: mkdir d0/db/d29/d2c/de5 0
2026-03-09T19:27:49.902 INFO:tasks.workunit.client.1.vm08.stdout:1/970: write d9/d11/d7a/d89/d8d/da3/fc2 [359995,45105] 0
2026-03-09T19:27:49.904 INFO:tasks.workunit.client.0.vm07.stdout:4/630: creat d3/d11/d2b/d37/fdd x:0 0 0
2026-03-09T19:27:49.904 INFO:tasks.workunit.client.0.vm07.stdout:1/676: write d1/d3e/dae/fc3 [297507,125109] 0
2026-03-09T19:27:49.906 INFO:tasks.workunit.client.0.vm07.stdout:3/708: symlink d1/d6/d4c/le3 0
2026-03-09T19:27:49.907 INFO:tasks.workunit.client.0.vm07.stdout:3/709: dread - d1/d89/fc7 zero size
2026-03-09T19:27:49.921 INFO:tasks.workunit.client.0.vm07.stdout:2/717: link d3/dd/d16/d29/d2d/d45/d85/d8a/fe9 d3/dd/ffa 0
2026-03-09T19:27:49.923 INFO:tasks.workunit.client.0.vm07.stdout:6/602: write d0/d1/db/d52/fa1 [538183,111581] 0
2026-03-09T19:27:49.927 INFO:tasks.workunit.client.0.vm07.stdout:6/603: dwrite d0/dbf/d95/d31/f89 [4194304,4194304] 0
2026-03-09T19:27:49.931 INFO:tasks.workunit.client.1.vm08.stdout:8/833: truncate de/d7c/f95 974525 0
2026-03-09T19:27:49.931 INFO:tasks.workunit.client.1.vm08.stdout:8/834: chown de/d47/faa 107 1
2026-03-09T19:27:49.932 INFO:tasks.workunit.client.1.vm08.stdout:8/835: read - de/d7c/f120 zero size
2026-03-09T19:27:49.932 INFO:tasks.workunit.client.0.vm07.stdout:5/632: dwrite d3/dd/f52 [0,4194304] 0
2026-03-09T19:27:49.934 INFO:tasks.workunit.client.0.vm07.stdout:5/633: stat d3/dd/d26/d3f/d47/d71/fb4 0
2026-03-09T19:27:49.934 INFO:tasks.workunit.client.0.vm07.stdout:5/634: stat d3/d1a/d28/d40/d92/d89/cc0 0
2026-03-09T19:27:49.936 INFO:tasks.workunit.client.0.vm07.stdout:0/615: creat d0/d6/fc9 x:0 0 0
2026-03-09T19:27:49.959 INFO:tasks.workunit.client.0.vm07.stdout:8/666: rmdir d7/d1d 39
2026-03-09T19:27:49.961 INFO:tasks.workunit.client.1.vm08.stdout:5/827: fsync d16/d45/fb1 0
2026-03-09T19:27:49.974 INFO:tasks.workunit.client.1.vm08.stdout:7/921: write d5/d14/dae/d1c/f5a [2259205,112320] 0
2026-03-09T19:27:49.974 INFO:tasks.workunit.client.1.vm08.stdout:4/817: mkdir da/d10/d16/d28/d46/d52/d6e/ded 0
2026-03-09T19:27:49.977 INFO:tasks.workunit.client.1.vm08.stdout:9/844: dwrite d0/d2/d14/d98/d99/ff9 [0,4194304] 0
2026-03-09T19:27:49.984 INFO:tasks.workunit.client.0.vm07.stdout:9/661: rename d0/db/d29/d2c/f61 to d0/d6/d73/fe6 0
2026-03-09T19:27:49.986 INFO:tasks.workunit.client.1.vm08.stdout:1/971: mkdir d9/da/d17/d60/d131/d133 0
2026-03-09T19:27:49.989 INFO:tasks.workunit.client.0.vm07.stdout:4/631: unlink d3/ld 0
2026-03-09T19:27:49.993 INFO:tasks.workunit.client.0.vm07.stdout:4/632: dwrite d3/d4f/f7c [4194304,4194304] 0
2026-03-09T19:27:49.999 INFO:tasks.workunit.client.1.vm08.stdout:6/862: symlink d3/d15/d111/l13f 0
2026-03-09T19:27:50.000 INFO:tasks.workunit.client.1.vm08.stdout:6/863: fsync d3/d15/f40 0
2026-03-09T19:27:50.003 INFO:tasks.workunit.client.0.vm07.stdout:3/710: dread d1/d1f/f9c [0,4194304] 0
2026-03-09T19:27:50.006 INFO:tasks.workunit.client.0.vm07.stdout:2/718: chown d3/dd/d16/d29/d2d/d45/d3b/d44/d96/ddf/fe3 15916221 1
2026-03-09T19:27:50.014 INFO:tasks.workunit.client.1.vm08.stdout:8/836: fdatasync de/d1d/d4f/fd9 0
2026-03-09T19:27:50.014 INFO:tasks.workunit.client.1.vm08.stdout:0/864: fdatasync dd/d22/d27/d2e/d37/f44 0
2026-03-09T19:27:50.016 INFO:tasks.workunit.client.0.vm07.stdout:6/604: creat d0/d4e/d7f/dbe/fee x:0 0 0
2026-03-09T19:27:50.018 INFO:tasks.workunit.client.0.vm07.stdout:5/635: creat d3/d1a/d5a/fc6 x:0 0 0
2026-03-09T19:27:50.022 INFO:tasks.workunit.client.0.vm07.stdout:0/616: creat d0/d6/dc8/fca x:0 0 0
2026-03-09T19:27:50.023 INFO:tasks.workunit.client.1.vm08.stdout:5/828: truncate d16/d1e/d3b/f5e 1571068 0
2026-03-09T19:27:50.023 INFO:tasks.workunit.client.1.vm08.stdout:5/829: readlink d16/d1e/d6e/dcd/l100 0
2026-03-09T19:27:50.025 INFO:tasks.workunit.client.1.vm08.stdout:2/772: link d3/d4/d23/d2c/d39/d5e/de/d18/f93 d3/d9/d79/df9/f105 0
2026-03-09T19:27:50.027 INFO:tasks.workunit.client.0.vm07.stdout:8/667: mknod d7/d50/da6/dc5/ceb 0
2026-03-09T19:27:50.032 INFO:tasks.workunit.client.0.vm07.stdout:1/677: dwrite d1/f38 [0,4194304] 0
2026-03-09T19:27:50.034 INFO:tasks.workunit.client.0.vm07.stdout:7/639: mkdir d0/d4/d5/d8/d41/d64/dd5 0
2026-03-09T19:27:50.036 INFO:tasks.workunit.client.1.vm08.stdout:9/845: dread - d0/d2/d14/d98/d99/dea/ff4 zero size
2026-03-09T19:27:50.040 INFO:tasks.workunit.client.1.vm08.stdout:3/920: rename d0/d6/c8d to d0/d52/d6d/c12d 0
2026-03-09T19:27:50.051 INFO:tasks.workunit.client.0.vm07.stdout:2/719: chown d3/d49/lf4 771 1
2026-03-09T19:27:50.052 INFO:tasks.workunit.client.0.vm07.stdout:2/720: truncate d3/dd/d16/d29/f58 2691579 0
2026-03-09T19:27:50.055 INFO:tasks.workunit.client.1.vm08.stdout:6/864: rmdir d3/d34/da9 39
2026-03-09T19:27:50.059 INFO:tasks.workunit.client.1.vm08.stdout:6/865: dwrite d3/d34/d6f/d123/f13a [0,4194304] 0
2026-03-09T19:27:50.066 INFO:tasks.workunit.client.0.vm07.stdout:6/605: rename d0/d1/db/d52/d94/d87/fdd to d0/d44/dd3/fef 0
2026-03-09T19:27:50.072 INFO:tasks.workunit.client.1.vm08.stdout:4/818: dwrite da/d10/d16/d28/d2f/d4f/d64/d81/f86 [0,4194304] 0
2026-03-09T19:27:50.075 INFO:tasks.workunit.client.0.vm07.stdout:4/633: write d3/d4f/f5b [1196936,21205] 0
2026-03-09T19:27:50.078 INFO:tasks.workunit.client.0.vm07.stdout:5/636: creat d3/dd/d95/fc7 x:0 0 0
2026-03-09T19:27:50.079 INFO:tasks.workunit.client.1.vm08.stdout:8/837: mkdir de/d47/dfd/d124 0
2026-03-09T19:27:50.081 INFO:tasks.workunit.client.0.vm07.stdout:0/617: creat d0/d6/d13/d1c/d52/d81/fcb x:0 0 0
2026-03-09T19:27:50.084 INFO:tasks.workunit.client.0.vm07.stdout:3/711: dwrite d1/d3d/d47/db3/dc2/f39 [0,4194304] 0
2026-03-09T19:27:50.086 INFO:tasks.workunit.client.0.vm07.stdout:4/634: dwrite d3/fda [0,4194304] 0
2026-03-09T19:27:50.089 INFO:tasks.workunit.client.1.vm08.stdout:0/865: rmdir dd 39
2026-03-09T19:27:50.089 INFO:tasks.workunit.client.0.vm07.stdout:0/618: dwrite d0/d6/dc8/d99/fac [0,4194304] 0
2026-03-09T19:27:50.090 INFO:tasks.workunit.client.0.vm07.stdout:8/668: fdatasync d7/d9/d37/d45/f4e 0
2026-03-09T19:27:50.101 INFO:tasks.workunit.client.0.vm07.stdout:0/619: dwrite d0/d6/d13/da1/fc1 [0,4194304] 0
2026-03-09T19:27:50.104 INFO:tasks.workunit.client.1.vm08.stdout:7/922: fdatasync d5/d14/dae/d1c/f11e 0
2026-03-09T19:27:50.113 INFO:tasks.workunit.client.1.vm08.stdout:5/830: dread d16/d1e/d9f/fd3 [0,4194304] 0
2026-03-09T19:27:50.121 INFO:tasks.workunit.client.1.vm08.stdout:1/972: mknod d9/da/d12/d91/dc5/d11d/c134 0
2026-03-09T19:27:50.131 INFO:tasks.workunit.client.0.vm07.stdout:7/640: dread d0/d4/d5/d8/f35 [0,4194304] 0
2026-03-09T19:27:50.142 INFO:tasks.workunit.client.1.vm08.stdout:8/838: chown de/d1d/d21/f62 0 1
2026-03-09T19:27:50.142 INFO:tasks.workunit.client.1.vm08.stdout:8/839: chown de/d25/d33 49491870 1
2026-03-09T19:27:50.146 INFO:tasks.workunit.client.0.vm07.stdout:4/635: readlink d3/d11/d2b/d38/ddc/l3a 0
2026-03-09T19:27:50.152 INFO:tasks.workunit.client.1.vm08.stdout:2/773: symlink d3/d4/d23/d2c/d39/d5e/l106 0
2026-03-09T19:27:50.153 INFO:tasks.workunit.client.1.vm08.stdout:2/774: dread - d3/d4/d23/d2c/dc1/f101 zero size
2026-03-09T19:27:50.156 INFO:tasks.workunit.client.0.vm07.stdout:8/669: dread d7/d9/d10/f1b [0,4194304] 0
2026-03-09T19:27:50.156 INFO:tasks.workunit.client.0.vm07.stdout:1/678: symlink d1/d3e/dc8/ldf 0
2026-03-09T19:27:50.159 INFO:tasks.workunit.client.1.vm08.stdout:7/923: fdatasync d5/d14/dae/dd1/d109/d8f/fe0 0
2026-03-09T19:27:50.166 INFO:tasks.workunit.client.1.vm08.stdout:5/831: fsync d16/d45/daf/df5/f70 0
2026-03-09T19:27:50.183 INFO:tasks.workunit.client.0.vm07.stdout:7/641: mknod d0/d4/d5/d8/cd6 0
2026-03-09T19:27:50.183 INFO:tasks.workunit.client.1.vm08.stdout:6/866: creat d3/d34/d5c/da2/d121/f140 x:0 0 0
2026-03-09T19:27:50.193 INFO:tasks.workunit.client.1.vm08.stdout:4/819: write da/fab [112258,86085] 0
2026-03-09T19:27:50.195 INFO:tasks.workunit.client.1.vm08.stdout:9/846: dwrite d0/d1b/d97/f34 [0,4194304] 0
2026-03-09T19:27:50.198 INFO:tasks.workunit.client.0.vm07.stdout:5/637: dwrite d3/d1a/f12 [0,4194304] 0
2026-03-09T19:27:50.198 INFO:tasks.workunit.client.0.vm07.stdout:2/721: dwrite d3/dd/d16/d29/d2d/d45/d3b/d44/d96/ddf/fe3 [0,4194304] 0
2026-03-09T19:27:50.216 INFO:tasks.workunit.client.1.vm08.stdout:1/973: dwrite d9/da/d17/d60/d131/f51 [0,4194304] 0
2026-03-09T19:27:50.220 INFO:tasks.workunit.client.1.vm08.stdout:3/921: dwrite d0/d6/de/d1a/f5a [0,4194304] 0
2026-03-09T19:27:50.229 INFO:tasks.workunit.client.0.vm07.stdout:3/712: write d1/d1f/f38 [3086428,1616] 0
2026-03-09T19:27:50.237 INFO:tasks.workunit.client.1.vm08.stdout:8/840: symlink de/d113/l125 0
2026-03-09T19:27:50.240 INFO:tasks.workunit.client.1.vm08.stdout:2/775: fdatasync d3/d9/fd2 0
2026-03-09T19:27:50.242 INFO:tasks.workunit.client.0.vm07.stdout:1/679: mknod d1/db/d31/d4f/d7a/ce0 0
2026-03-09T19:27:50.242 INFO:tasks.workunit.client.0.vm07.stdout:9/662: link d0/db/d29/d2c/l40 d0/db/d29/le7 0
2026-03-09T19:27:50.243 INFO:tasks.workunit.client.0.vm07.stdout:4/636: dwrite d3/d4f/d56/d5f/fc3 [0,4194304] 0
2026-03-09T19:27:50.245 INFO:tasks.workunit.client.1.vm08.stdout:7/924: rename d5/d14/d27/d78 to d5/d14/dae/d1c/db5/df8/d13b 0
2026-03-09T19:27:50.248 INFO:tasks.workunit.client.1.vm08.stdout:7/925: dread d5/d14/d2b/d4b/ffe [0,4194304] 0
2026-03-09T19:27:50.261 INFO:tasks.workunit.client.0.vm07.stdout:6/606: link d0/d1/db/f9d d0/d1/db/d1d/d77/ff0 0
2026-03-09T19:27:50.270 INFO:tasks.workunit.client.1.vm08.stdout:0/866: dread dd/d22/d24/d49/d50/d78/d86/fd5 [0,4194304] 0
2026-03-09T19:27:50.271 INFO:tasks.workunit.client.1.vm08.stdout:0/867: write dd/d22/d24/f114 [153713,40] 0
2026-03-09T19:27:50.280 INFO:tasks.workunit.client.0.vm07.stdout:2/722: creat d3/dd/d16/d29/d3c/d4c/ffb x:0 0 0
2026-03-09T19:27:50.282 INFO:tasks.workunit.client.1.vm08.stdout:6/867: fsync d3/d34/f37 0
2026-03-09T19:27:50.287 INFO:tasks.workunit.client.0.vm07.stdout:5/638: rename d3/d1a/d5a/c96 to d3/dd/dbe/cc8 0
2026-03-09T19:27:50.294 INFO:tasks.workunit.client.0.vm07.stdout:3/713: fdatasync d1/d1f/f1a 0
2026-03-09T19:27:50.295 INFO:tasks.workunit.client.0.vm07.stdout:3/714: chown d1/d1f/lb2 2478399 1
2026-03-09T19:27:50.299 INFO:tasks.workunit.client.0.vm07.stdout:3/715: dwrite d1/d3d/d47/db3/dc2/d28/d7c/fbd [0,4194304] 0
2026-03-09T19:27:50.307 INFO:tasks.workunit.client.0.vm07.stdout:8/670: getdents d7/d9/ddf 0
2026-03-09T19:27:50.315 INFO:tasks.workunit.client.0.vm07.stdout:4/637: read - d3/d4f/f9d zero size
2026-03-09T19:27:50.320 INFO:tasks.workunit.client.0.vm07.stdout:9/663: mknod d0/d6/d57/d8f/ce8 0
2026-03-09T19:27:50.326 INFO:tasks.workunit.client.1.vm08.stdout:3/922: dwrite d0/d52/d6d/f8b [0,4194304] 0
2026-03-09T19:27:50.329 INFO:tasks.workunit.client.1.vm08.stdout:8/841: creat de/d117/f126 x:0 0 0
2026-03-09T19:27:50.342 INFO:tasks.workunit.client.0.vm07.stdout:7/642: creat d0/d52/db4/fd7 x:0 0 0
2026-03-09T19:27:50.350 INFO:tasks.workunit.client.1.vm08.stdout:2/776: dread d3/d9/d4a/f59 [0,4194304] 0
2026-03-09T19:27:50.352 INFO:tasks.workunit.client.0.vm07.stdout:2/723: symlink d3/dd/d16/d29/d2d/d45/d3b/dae/lfc 0
2026-03-09T19:27:50.353 INFO:tasks.workunit.client.0.vm07.stdout:2/724: write d3/dd/d16/d29/d2d/d45/d3b/d44/d97/fec [779999,25640] 0
2026-03-09T19:27:50.356 INFO:tasks.workunit.client.1.vm08.stdout:5/832: write d16/d1e/d9f/fd6 [504765,130479] 0
2026-03-09T19:27:50.356 INFO:tasks.workunit.client.0.vm07.stdout:9/664: sync
2026-03-09T19:27:50.357 INFO:tasks.workunit.client.1.vm08.stdout:5/833: stat d16/d1e/dc9/d102/l10a 0
2026-03-09T19:27:50.357 INFO:tasks.workunit.client.1.vm08.stdout:5/834: chown ff 23 1
2026-03-09T19:27:50.362 INFO:tasks.workunit.client.1.vm08.stdout:4/820: dwrite da/d10/d26/d3a/f51 [4194304,4194304] 0
2026-03-09T19:27:50.374 INFO:tasks.workunit.client.0.vm07.stdout:1/680: rename d1/d3/l84 to d1/d3e/db3/d6d/le1 0
2026-03-09T19:27:50.376 INFO:tasks.workunit.client.0.vm07.stdout:5/639: symlink d3/d1a/d5a/lc9 0
2026-03-09T19:27:50.380 INFO:tasks.workunit.client.0.vm07.stdout:3/716: rmdir d1/d3d/d47/db3/dc2/d28/d7c 39
2026-03-09T19:27:50.382 INFO:tasks.workunit.client.0.vm07.stdout:4/638: unlink d3/d11/d29/d34/fbc 0
2026-03-09T19:27:50.383 INFO:tasks.workunit.client.0.vm07.stdout:4/639: chown d3/d4f/d56/d5f/f72 116572 1
2026-03-09T19:27:50.383 INFO:tasks.workunit.client.0.vm07.stdout:4/640: chown d3/d11/d2b/d38/ddc/fb0 356572 1
2026-03-09T19:27:50.384 INFO:tasks.workunit.client.0.vm07.stdout:0/620: getdents d0/d6/d13/d1c/d11 0
2026-03-09T19:27:50.384 INFO:tasks.workunit.client.1.vm08.stdout:9/847: fdatasync d0/d1b/d97/d48/d5d/d74/f115 0
2026-03-09T19:27:50.385 INFO:tasks.workunit.client.0.vm07.stdout:0/621: chown d0/c4b 0 1
2026-03-09T19:27:50.390 INFO:tasks.workunit.client.1.vm08.stdout:1/974: creat d9/d11/d129/f135 x:0 0 0
2026-03-09T19:27:50.397 INFO:tasks.workunit.client.1.vm08.stdout:3/923: mkdir d0/d6/d10a/d12e 0
2026-03-09T19:27:50.397 INFO:tasks.workunit.client.1.vm08.stdout:8/842: mkdir de/d25/d33/d127 0
2026-03-09T19:27:50.397 INFO:tasks.workunit.client.1.vm08.stdout:8/843: chown de/d91/dc8 3557 1
2026-03-09T19:27:50.397 INFO:tasks.workunit.client.1.vm08.stdout:2/777: mknod d3/dca/c107 0
2026-03-09T19:27:50.401 INFO:tasks.workunit.client.1.vm08.stdout:5/835: creat d16/d8e/ddb/d107/f10f x:0 0 0
2026-03-09T19:27:50.411 INFO:tasks.workunit.client.1.vm08.stdout:0/868: write dd/d22/d24/d49/d50/d105/d82/fb1 [5580,36780] 0
2026-03-09T19:27:50.414 INFO:tasks.workunit.client.1.vm08.stdout:6/868: dwrite d3/db/f14 [4194304,4194304] 0
2026-03-09T19:27:50.417 INFO:tasks.workunit.client.1.vm08.stdout:7/926: symlink d5/d14/dae/d3a/l13c 0
2026-03-09T19:27:50.432 INFO:tasks.workunit.client.1.vm08.stdout:3/924: creat d0/d52/d7c/d7e/f12f x:0 0 0
2026-03-09T19:27:50.432 INFO:tasks.workunit.client.1.vm08.stdout:8/844: mkdir de/d47/dfd/d99/da5/db3/d128 0
2026-03-09T19:27:50.432 INFO:tasks.workunit.client.1.vm08.stdout:1/975: dread - d9/da/d53/db3/fd2 zero size
2026-03-09T19:27:50.433 INFO:tasks.workunit.client.1.vm08.stdout:1/976: fdatasync d9/da/d2d/d4e/df4/f11c 0
2026-03-09T19:27:50.436 INFO:tasks.workunit.client.1.vm08.stdout:8/845: dwrite de/fb2 [0,4194304] 0
2026-03-09T19:27:50.447 INFO:tasks.workunit.client.1.vm08.stdout:7/927: dread d5/d14/dae/f1f [0,4194304] 0
2026-03-09T19:27:50.453 INFO:tasks.workunit.client.1.vm08.stdout:5/836: fdatasync d16/d8e/dff/fb0 0
2026-03-09T19:27:50.456 INFO:tasks.workunit.client.1.vm08.stdout:7/928: dread d5/d14/d2b/d5d/fb2 [0,4194304] 0
2026-03-09T19:27:50.459 INFO:tasks.workunit.client.1.vm08.stdout:5/837: dread d16/d1e/f7d [0,4194304] 0
2026-03-09T19:27:50.464 INFO:tasks.workunit.client.1.vm08.stdout:7/929: dread d5/d14/dae/d3a/d42/d85/da0/fba [0,4194304] 0
2026-03-09T19:27:50.474 INFO:tasks.workunit.client.1.vm08.stdout:6/869: creat d3/d94/f141 x:0 0 0
2026-03-09T19:27:50.474 INFO:tasks.workunit.client.1.vm08.stdout:4/821: rename da/d10/d16/d28/d4d/c94 to da/d10/d16/d28/d2f/d4f/d56/d90/cee 0
2026-03-09T19:27:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:50 vm07.local ceph-mon[48545]: Deploying cephadm binary to vm07
2026-03-09T19:27:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:50 vm07.local ceph-mon[48545]: Deploying cephadm binary to vm08
2026-03-09T19:27:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:50 vm07.local ceph-mon[48545]: pgmap v4: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail
2026-03-09T19:27:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:50 vm07.local ceph-mon[48545]: mgrmap e23: vm08.mxylvw(active, since 2s), standbys: vm07.xacuym
2026-03-09T19:27:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:50 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mgr metadata", "who": "vm07.xacuym", "id": "vm07.xacuym"}]: dispatch
2026-03-09T19:27:50.486 INFO:tasks.workunit.client.1.vm08.stdout:2/778: symlink d3/l108 0
2026-03-09T19:27:50.486 INFO:tasks.workunit.client.1.vm08.stdout:1/977: creat d9/da/d2c/d6a/f136 x:0 0 0
2026-03-09T19:27:50.498 INFO:tasks.workunit.client.1.vm08.stdout:0/869: write dd/f19 [1386786,39928] 0
2026-03-09T19:27:50.499 INFO:tasks.workunit.client.1.vm08.stdout:0/870: chown dd/d22/d24/d49/d50/d105/ld6 35872344 1
2026-03-09T19:27:50.500 INFO:tasks.workunit.client.1.vm08.stdout:0/871: chown dd/d22/d27/d4f/f97 737169852 1
2026-03-09T19:27:50.501 INFO:tasks.workunit.client.1.vm08.stdout:7/930: dwrite d5/d14/d2b/d4b/f66 [0,4194304] 0
2026-03-09T19:27:50.511 INFO:tasks.workunit.client.1.vm08.stdout:9/848: rename d0/d1b/d97/d48/d5d/d74/la4 to d0/d2/d14/d98/d99/l11b 0
2026-03-09T19:27:50.513 INFO:tasks.workunit.client.1.vm08.stdout:9/849: chown d0/d2/d80/de5/da2/l103 2088916 1
2026-03-09T19:27:50.513 INFO:tasks.workunit.client.0.vm07.stdout:6/607: fsync d0/f3b 0
2026-03-09T19:27:50.517 INFO:tasks.workunit.client.1.vm08.stdout:9/850: dwrite d0/d1b/d97/fca [0,4194304] 0
2026-03-09T19:27:50.528 INFO:tasks.workunit.client.1.vm08.stdout:3/925: mknod d0/d6/d93/dcb/d129/c130 0
2026-03-09T19:27:50.528 INFO:tasks.workunit.client.1.vm08.stdout:1/978: stat d9/da/d17/l7b 0
2026-03-09T19:27:50.528 INFO:tasks.workunit.client.1.vm08.stdout:1/979: truncate d9/d11/f10e 829844 0
2026-03-09T19:27:50.528 INFO:tasks.workunit.client.0.vm07.stdout:2/725: read - d3/dd/d16/d29/d3c/d5a/d7a/d74/dc9/fd3 zero size
2026-03-09T19:27:50.528 INFO:tasks.workunit.client.0.vm07.stdout:2/726: fdatasync d3/dd/d16/d29/d2d/d45/d3b/fe5 0
2026-03-09T19:27:50.528 INFO:tasks.workunit.client.0.vm07.stdout:1/681: creat d1/d11/d37/d3f/d7e/fe2 x:0 0 0
2026-03-09T19:27:50.528 INFO:tasks.workunit.client.0.vm07.stdout:5/640: dread - d3/dd/fab zero size
2026-03-09T19:27:50.530 INFO:tasks.workunit.client.1.vm08.stdout:2/779: dread d3/d9/d4a/d9a/fd7 [0,4194304] 0
2026-03-09T19:27:50.531 INFO:tasks.workunit.client.0.vm07.stdout:3/717: creat d1/d74/fe4 x:0 0 0
2026-03-09T19:27:50.532 INFO:tasks.workunit.client.0.vm07.stdout:8/671: symlink d7/d9/d37/d45/lec 0
2026-03-09T19:27:50.533 INFO:tasks.workunit.client.0.vm07.stdout:4/641: creat d3/d11/d2b/d38/fde x:0 0 0
2026-03-09T19:27:50.534 INFO:tasks.workunit.client.0.vm07.stdout:0/622: creat d0/d6/d13/d17/dc3/fcc x:0 0 0
2026-03-09T19:27:50.536 INFO:tasks.workunit.client.1.vm08.stdout:0/872: rmdir dd/d22/d24/d49/d50 39
2026-03-09T19:27:50.538 INFO:tasks.workunit.client.0.vm07.stdout:6/608: creat d0/d1/db/d52/ff1 x:0 0 0
2026-03-09T19:27:50.539 INFO:tasks.workunit.client.0.vm07.stdout:2/727: fsync d3/dd/d16/d29/d2d/d45/fd9 0
2026-03-09T19:27:50.540 INFO:tasks.workunit.client.1.vm08.stdout:9/851: fdatasync d0/d2/d14/d98/dbb/fe1 0
2026-03-09T19:27:50.541 INFO:tasks.workunit.client.0.vm07.stdout:1/682: unlink d1/d3e/c57 0
2026-03-09T19:27:50.542 INFO:tasks.workunit.client.1.vm08.stdout:1/980: symlink d9/d11/d129/l137 0
2026-03-09T19:27:50.542 INFO:tasks.workunit.client.1.vm08.stdout:1/981: fsync d9/da/d17/d60/fea 0
2026-03-09T19:27:50.543 INFO:tasks.workunit.client.0.vm07.stdout:5/641: creat d3/d1a/d28/fca x:0 0 0
2026-03-09T19:27:50.545 INFO:tasks.workunit.client.0.vm07.stdout:4/642: creat d3/d11/d2b/d38/fdf x:0 0 0
2026-03-09T19:27:50.546 INFO:tasks.workunit.client.1.vm08.stdout:6/870: creat d3/db/d43/d69/f142 x:0 0 0
2026-03-09T19:27:50.548 INFO:tasks.workunit.client.0.vm07.stdout:4/643: dread d3/d4f/d56/d5f/fc3 [0,4194304] 0
2026-03-09T19:27:50.549 INFO:tasks.workunit.client.0.vm07.stdout:6/609: creat d0/d4e/d7f/dbe/ff2 x:0 0 0
2026-03-09T19:27:50.549 INFO:tasks.workunit.client.0.vm07.stdout:7/643: rmdir d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/db7 0
2026-03-09T19:27:50.549 INFO:tasks.workunit.client.0.vm07.stdout:6/610: readlink d0/d4e/d75/l9c 0
2026-03-09T19:27:50.552 INFO:tasks.workunit.client.0.vm07.stdout:1/683: rename d1/d11/d37/d3f/d6e/d9c/fc0 to d1/d3e/fe3 0
2026-03-09T19:27:50.554 INFO:tasks.workunit.client.1.vm08.stdout:0/873: mknod dd/d9d/dcc/c119 0
2026-03-09T19:27:50.557 INFO:tasks.workunit.client.1.vm08.stdout:8/846: rename de/d47/l48 to de/d25/l129 0
2026-03-09T19:27:50.558 INFO:tasks.workunit.client.1.vm08.stdout:3/926: link d0/d6/d93/l113 d0/d6/de/d6e/d51/d7f/l131 0
2026-03-09T19:27:50.558 INFO:tasks.workunit.client.0.vm07.stdout:7/644: creat d0/d4/d5/d8/d41/d64/d74/d98/fd8 x:0 0 0
2026-03-09T19:27:50.559 INFO:tasks.workunit.client.0.vm07.stdout:6/611: creat d0/d1/db/d24/da4/ff3 x:0 0 0
2026-03-09T19:27:50.560 INFO:tasks.workunit.client.0.vm07.stdout:2/728: link d3/dd/d16/d29/d3c/d4c/ffb d3/dd/d16/d29/d2d/d45/d3b/ffd 0
2026-03-09T19:27:50.560 INFO:tasks.workunit.client.0.vm07.stdout:6/612: chown d0/d4e/l56 2 1
2026-03-09T19:27:50.568 INFO:tasks.workunit.client.1.vm08.stdout:0/874: read dd/d22/d24/d49/d50/d105/f83 [434185,77433] 0
2026-03-09T19:27:50.571 INFO:tasks.workunit.client.1.vm08.stdout:6/871: mkdir d3/d34/da9/da4/d117/d10d/d143 0
2026-03-09T19:27:50.573 INFO:tasks.workunit.client.1.vm08.stdout:8/847: dread de/d1d/d21/d73/fa6 [0,4194304] 0
2026-03-09T19:27:50.573 INFO:tasks.workunit.client.1.vm08.stdout:8/848: write de/d91/f9d [567351,66351] 0
2026-03-09T19:27:50.574 INFO:tasks.workunit.client.1.vm08.stdout:8/849: chown de/d25/d31/f118 614032 1
2026-03-09T19:27:50.575 INFO:tasks.workunit.client.0.vm07.stdout:4/644: sync
2026-03-09T19:27:50.582 INFO:tasks.workunit.client.1.vm08.stdout:3/927: rename d0/f106 to d0/d8/d24/dfb/f132 0
2026-03-09T19:27:50.587 INFO:tasks.workunit.client.0.vm07.stdout:7/645: creat d0/d4/d5/d8/d41/d64/d74/d98/fd9 x:0 0 0
2026-03-09T19:27:50.587 INFO:tasks.workunit.client.1.vm08.stdout:6/872: chown d3/d55/c70 970685 1
2026-03-09T19:27:50.588 INFO:tasks.workunit.client.0.vm07.stdout:6/613: creat d0/d1/db/d52/ff4 x:0 0 0
2026-03-09T19:27:50.589 INFO:tasks.workunit.client.0.vm07.stdout:3/718: getdents d1/d3d/d47/db3 0
2026-03-09T19:27:50.589 INFO:tasks.workunit.client.0.vm07.stdout:8/672: getdents d7/d9/d10/d44/d9a 0
2026-03-09T19:27:50.589 INFO:tasks.workunit.client.1.vm08.stdout:8/850: truncate de/d25/d31/f8e 1117862 0
2026-03-09T19:27:50.591 INFO:tasks.workunit.client.0.vm07.stdout:4/645: truncate d3/d11/d2b/d38/ddc/d22/f24 4919179 0
2026-03-09T19:27:50.592 INFO:tasks.workunit.client.0.vm07.stdout:0/623: getdents d0/d6/d13/d17/d19/d57/d9e 0
2026-03-09T19:27:50.594 INFO:tasks.workunit.client.0.vm07.stdout:5/642: dread d3/dd/d26/d3f/d47/d56/f75 [0,4194304] 0
2026-03-09T19:27:50.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:50 vm08.local ceph-mon[57794]: Deploying cephadm binary to vm07
2026-03-09T19:27:50.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:50 vm08.local ceph-mon[57794]: Deploying cephadm binary to vm08
2026-03-09T19:27:50.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:50 vm08.local ceph-mon[57794]: pgmap v4: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail
2026-03-09T19:27:50.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:50 vm08.local ceph-mon[57794]: mgrmap e23: vm08.mxylvw(active, since 2s), standbys: vm07.xacuym
2026-03-09T19:27:50.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:50 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mgr metadata", "who": "vm07.xacuym", "id": "vm07.xacuym"}]: dispatch
2026-03-09T19:27:50.608 INFO:tasks.workunit.client.1.vm08.stdout:3/928: dread d0/d52/fa8 [0,4194304] 0
2026-03-09T19:27:50.608 INFO:tasks.workunit.client.1.vm08.stdout:8/851: link de/d1d/d4f/fd9 de/d1d/d2e/f12a 0
2026-03-09T19:27:50.608 INFO:tasks.workunit.client.1.vm08.stdout:8/852: chown de/d25/l63 0 1
2026-03-09T19:27:50.608 INFO:tasks.workunit.client.0.vm07.stdout:3/719: truncate d1/d6/f1b 3255464 0
2026-03-09T19:27:50.608 INFO:tasks.workunit.client.0.vm07.stdout:4/646: fdatasync d3/d11/f7d 0
2026-03-09T19:27:50.608 INFO:tasks.workunit.client.0.vm07.stdout:0/624: truncate d0/d6/d13/f31 3914431 0
2026-03-09T19:27:50.608 INFO:tasks.workunit.client.0.vm07.stdout:5/643: mknod d3/dd/d26/d2d/d79/d9f/ccb 0
2026-03-09T19:27:50.608 INFO:tasks.workunit.client.0.vm07.stdout:0/625: stat d0/d6/d13/d17/d19/d57/f5a 0
2026-03-09T19:27:50.609 INFO:tasks.workunit.client.0.vm07.stdout:5/644: readlink d3/dd/d26/d2d/la8 0
2026-03-09T19:27:50.610 INFO:tasks.workunit.client.1.vm08.stdout:7/931: sync
2026-03-09T19:27:50.610 INFO:tasks.workunit.client.1.vm08.stdout:0/875: sync
2026-03-09T19:27:50.611 INFO:tasks.workunit.client.1.vm08.stdout:8/853: unlink de/d7c/f120 0
2026-03-09T19:27:50.613 INFO:tasks.workunit.client.0.vm07.stdout:2/729: dread d3/dd/f34 [0,4194304] 0
2026-03-09T19:27:50.613 INFO:tasks.workunit.client.1.vm08.stdout:0/876: symlink dd/d22/d27/d2e/db0/l11a 0
2026-03-09T19:27:50.613 INFO:tasks.workunit.client.0.vm07.stdout:6/614: mkdir d0/d2d/dd5/df5 0
2026-03-09T19:27:50.613 INFO:tasks.workunit.client.1.vm08.stdout:0/877: chown dd/d22/d24/d49/d50/ca3 2280 1
2026-03-09T19:27:50.614 INFO:tasks.workunit.client.1.vm08.stdout:0/878: chown dd/d22/d27/f42 0 1
2026-03-09T19:27:50.619 INFO:tasks.workunit.client.1.vm08.stdout:4/822: write da/d10/d1b/f79 [4472717,97289] 0
2026-03-09T19:27:50.622 INFO:tasks.workunit.client.0.vm07.stdout:9/665: write d0/db/f39 [1132130,110106] 0
2026-03-09T19:27:50.627 INFO:tasks.workunit.client.1.vm08.stdout:5/838: dwrite d16/d1e/d8c/f104 [0,4194304] 0
2026-03-09T19:27:50.638 INFO:tasks.workunit.client.1.vm08.stdout:2/780: write d3/d4/d23/d2c/d39/d5e/d14/f58 [1434849,128151] 0
2026-03-09T19:27:50.644 INFO:tasks.workunit.client.0.vm07.stdout:3/720: dread - d1/d74/fb6 zero size
2026-03-09T19:27:50.652 INFO:tasks.workunit.client.1.vm08.stdout:9/852: dwrite d0/d1b/d97/d48/d5d/d74/ded/fb7 [0,4194304] 0
2026-03-09T19:27:50.652 INFO:tasks.workunit.client.0.vm07.stdout:3/721: dread - d1/d3d/d47/db3/faf zero size
2026-03-09T19:27:50.659 INFO:tasks.workunit.client.1.vm08.stdout:7/932: rmdir d5/d14/dae/d3a/d42/d85/da0/df5 39
2026-03-09T19:27:50.660 INFO:tasks.workunit.client.1.vm08.stdout:1/982: write d9/da/d12/f98 [4163650,85394] 0
2026-03-09T19:27:50.660 INFO:tasks.workunit.client.1.vm08.stdout:1/983: readlink d9/da/d53/lc9 0
2026-03-09T19:27:50.665 INFO:tasks.workunit.client.1.vm08.stdout:8/854: creat de/d1d/d2e/db4/f12b x:0 0 0
2026-03-09T19:27:50.671 INFO:tasks.workunit.client.0.vm07.stdout:4/647: fdatasync d3/d11/d2b/d38/ddc/fb0 0
2026-03-09T19:27:50.671 INFO:tasks.workunit.client.0.vm07.stdout:1/684: write d1/db/f1f [639006,109462] 0
2026-03-09T19:27:50.674 INFO:tasks.workunit.client.1.vm08.stdout:0/879: dread dd/d22/f3e [0,4194304] 0
2026-03-09T19:27:50.677 INFO:tasks.workunit.client.0.vm07.stdout:4/648: sync
2026-03-09T19:27:50.735 INFO:tasks.workunit.client.0.vm07.stdout:5/645: dread d3/f19 [0,4194304] 0
2026-03-09T19:27:50.774 INFO:tasks.workunit.client.1.vm08.stdout:6/873: dwrite d3/db/d43/d69/da0/fdf [0,4194304] 0
2026-03-09T19:27:50.775 INFO:tasks.workunit.client.0.vm07.stdout:8/673: dwrite d7/d9/d37/d45/d97/dbc/fca [0,4194304] 0
2026-03-09T19:27:50.776 INFO:tasks.workunit.client.1.vm08.stdout:3/929: dwrite d0/f7a [0,4194304] 0
2026-03-09T19:27:50.800 INFO:tasks.workunit.client.1.vm08.stdout:5/839: creat d16/d45/daf/df5/d6f/f110 x:0 0 0
2026-03-09T19:27:50.804 INFO:tasks.workunit.client.1.vm08.stdout:5/840: dwrite d16/d1e/d8c/d99/da8/d9a/f106 [4194304,4194304] 0
2026-03-09T19:27:50.806 INFO:tasks.workunit.client.1.vm08.stdout:5/841: write d16/d1e/d8c/d99/f105 [914378,111007] 0
2026-03-09T19:27:50.809 INFO:tasks.workunit.client.0.vm07.stdout:6/615: symlink d0/d2d/dd5/lf6 0
2026-03-09T19:27:50.809 INFO:tasks.workunit.client.1.vm08.stdout:2/781: read d3/d4/d23/d2c/d39/d5e/de/d18/f50 [335045,12141] 0
2026-03-09T19:27:50.813 INFO:tasks.workunit.client.0.vm07.stdout:3/722: mkdir d1/d6/d45/d54/de5 0
2026-03-09T19:27:50.828 INFO:tasks.workunit.client.0.vm07.stdout:0/626: fdatasync d0/d6/d13/d33/f35 0
2026-03-09T19:27:50.829 INFO:tasks.workunit.client.0.vm07.stdout:0/627: chown d0/d6/dc8/c95 39840037 1
2026-03-09T19:27:50.829 INFO:tasks.workunit.client.0.vm07.stdout:4/649: dread - d3/d11/d29/fc4 zero size
2026-03-09T19:27:50.838 INFO:tasks.workunit.client.0.vm07.stdout:7/646: link d0/d52/d54/d55/f6d d0/fda 0
2026-03-09T19:27:50.841 INFO:tasks.workunit.client.0.vm07.stdout:2/730: truncate d3/dd/d16/d30/da7/dad/ddd/ff8 2319573 0
2026-03-09T19:27:50.842 INFO:tasks.workunit.client.0.vm07.stdout:7/647: dread d0/d4/d5/d8/d41/d64/d74/d98/dcb/f63 [0,4194304] 0
2026-03-09T19:27:50.850 INFO:tasks.workunit.client.0.vm07.stdout:7/648: fsync d0/d4/d5/d8/d41/fb0 0
2026-03-09T19:27:50.852 INFO:tasks.workunit.client.0.vm07.stdout:0/628: dread d0/d6/d13/d1c/d50/f60 [0,4194304] 0
2026-03-09T19:27:50.852 INFO:tasks.workunit.client.0.vm07.stdout:8/674: mkdir d7/d1d/d83/d9f/ded 0
2026-03-09T19:27:50.853 INFO:tasks.workunit.client.0.vm07.stdout:0/629: chown d0/d6/d13/d1c/d50/fc7 63 1
2026-03-09T19:27:50.856 INFO:tasks.workunit.client.0.vm07.stdout:9/666: creat d0/db/fe9 x:0 0 0
2026-03-09T19:27:50.860 INFO:tasks.workunit.client.0.vm07.stdout:9/667: chown d0/d6/d57/ldd 35062 1
2026-03-09T19:27:50.860 INFO:tasks.workunit.client.0.vm07.stdout:7/649: dwrite d0/d4/d5/d8/d1a/d2a/fc8 [0,4194304] 0
2026-03-09T19:27:50.862 INFO:tasks.workunit.client.0.vm07.stdout:4/650: sync
2026-03-09T19:27:50.862 INFO:tasks.workunit.client.0.vm07.stdout:9/668: read d0/d17/f1f [8055,43928] 0
2026-03-09T19:27:50.867 INFO:tasks.workunit.client.0.vm07.stdout:4/651: mkdir d3/d11/d29/d34/de0 0
2026-03-09T19:27:50.868 INFO:tasks.workunit.client.0.vm07.stdout:7/650: fdatasync d0/d4/d5/d26/f4a 0
2026-03-09T19:27:50.871 INFO:tasks.workunit.client.0.vm07.stdout:8/675: mknod d7/d9/d37/d45/d97/dbc/de2/cee 0
2026-03-09T19:27:50.885 INFO:tasks.workunit.client.0.vm07.stdout:8/676: mkdir d7/d1d/d83/d9f/dd2/def 0
2026-03-09T19:27:50.894 INFO:tasks.workunit.client.0.vm07.stdout:6/616: dread d0/dbf/fd8 [0,4194304] 0
2026-03-09T19:27:50.913 INFO:tasks.workunit.client.1.vm08.stdout:5/842: mkdir d16/d1e/d8c/d99/da8/d9a/d111 0
2026-03-09T19:27:50.913 INFO:tasks.workunit.client.1.vm08.stdout:5/843: chown d16/d45/daf/df5/fb9 5108 1
2026-03-09T19:27:50.914 INFO:tasks.workunit.client.1.vm08.stdout:5/844: fdatasync d16/d45/daf/df5/f3f 0
2026-03-09T19:27:50.915 INFO:tasks.workunit.client.1.vm08.stdout:5/845: truncate d16/d1e/d8c/f101 869191 0
2026-03-09T19:27:50.916 INFO:tasks.workunit.client.1.vm08.stdout:2/782: dread - d3/d4/d23/d2c/fdb zero size
2026-03-09T19:27:50.916 INFO:tasks.workunit.client.1.vm08.stdout:5/846: chown d16/d45/daf/df5/c41 424 1
2026-03-09T19:27:50.920 INFO:tasks.workunit.client.1.vm08.stdout:4/823: creat da/d10/d16/d28/d2f/d4f/fef x:0 0 0
2026-03-09T19:27:50.921 INFO:tasks.workunit.client.1.vm08.stdout:0/880: truncate dd/fe 7900671 0
2026-03-09T19:27:50.924 INFO:tasks.workunit.client.1.vm08.stdout:0/881: dwrite dd/d22/d24/d49/d50/d78/d86/f10a [0,4194304] 0
2026-03-09T19:27:50.928 INFO:tasks.workunit.client.1.vm08.stdout:7/933: dread d5/d14/dae/dd1/d109/d8f/fe0 [0,4194304] 0
2026-03-09T19:27:50.929 INFO:tasks.workunit.client.1.vm08.stdout:7/934: chown d5/d14/d27/d54/c102 37905 1
2026-03-09T19:27:50.936 INFO:tasks.workunit.client.1.vm08.stdout:2/783: dread - d3/d4/d23/fd1 zero size
2026-03-09T19:27:50.937 INFO:tasks.workunit.client.1.vm08.stdout:5/847: mkdir d16/d1e/dc9/d10c/d112 0
2026-03-09T19:27:50.939 INFO:tasks.workunit.client.1.vm08.stdout:4/824: creat da/d10/d26/d3a/d91/ff0 x:0 0 0
2026-03-09T19:27:50.939 INFO:tasks.workunit.client.1.vm08.stdout:4/825: readlink da/le 0
2026-03-09T19:27:50.940 INFO:tasks.workunit.client.1.vm08.stdout:0/882: dread - dd/d22/d63/d93/f101 zero size
2026-03-09T19:27:50.943 INFO:tasks.workunit.client.1.vm08.stdout:7/935: dread d5/d14/dae/f103 [0,4194304] 0
2026-03-09T19:27:50.945 INFO:tasks.workunit.client.1.vm08.stdout:2/784: creat d3/d9/d79/d46/d8c/d92/f109 x:0 0 0
2026-03-09T19:27:50.947 INFO:tasks.workunit.client.1.vm08.stdout:1/984: link d9/da/f30 d9/d11/d7a/d89/d8d/da3/f138 0
2026-03-09T19:27:50.950 INFO:tasks.workunit.client.1.vm08.stdout:4/826: read da/d10/d16/f9f [1343281,19444] 0
2026-03-09T19:27:50.950 INFO:tasks.workunit.client.1.vm08.stdout:4/827: readlink da/d10/d16/d28/d46/d52/d6e/d73/lcd 0
2026-03-09T19:27:50.952 INFO:tasks.workunit.client.1.vm08.stdout:0/883: creat dd/d22/d24/d49/d50/d105/d82/f11b x:0 0 0
2026-03-09T19:27:50.953 INFO:tasks.workunit.client.1.vm08.stdout:0/884: dread - dd/d22/d24/fdb zero size
2026-03-09T19:27:50.956 INFO:tasks.workunit.client.1.vm08.stdout:0/885: dread dd/d22/d27/d6c/fbf [0,4194304] 0
2026-03-09T19:27:50.957 INFO:tasks.workunit.client.1.vm08.stdout:0/886: dread dd/d22/d27/fc8 [0,4194304] 0
2026-03-09T19:27:50.958
INFO:tasks.workunit.client.1.vm08.stdout:7/936: dread - d5/d14/dae/f11f zero size 2026-03-09T19:27:50.967 INFO:tasks.workunit.client.0.vm07.stdout:9/669: symlink d0/db/d29/d32/d5c/d80/lea 0 2026-03-09T19:27:50.967 INFO:tasks.workunit.client.1.vm08.stdout:6/874: rename d3/d15/l5d to d3/d15/dc2/d12f/l144 0 2026-03-09T19:27:50.973 INFO:tasks.workunit.client.1.vm08.stdout:4/828: dread da/d10/d16/d28/d2f/d4f/f65 [0,4194304] 0 2026-03-09T19:27:50.981 INFO:tasks.workunit.client.1.vm08.stdout:6/875: mkdir d3/d94/d145 0 2026-03-09T19:27:50.985 INFO:tasks.workunit.client.1.vm08.stdout:7/937: mkdir d5/d14/dae/d3a/d42/d85/da0/d13d 0 2026-03-09T19:27:50.986 INFO:tasks.workunit.client.1.vm08.stdout:2/785: creat d3/d4/d3e/f10a x:0 0 0 2026-03-09T19:27:50.989 INFO:tasks.workunit.client.1.vm08.stdout:6/876: creat d3/d34/dce/de3/f146 x:0 0 0 2026-03-09T19:27:50.990 INFO:tasks.workunit.client.1.vm08.stdout:0/887: creat dd/f11c x:0 0 0 2026-03-09T19:27:50.991 INFO:tasks.workunit.client.1.vm08.stdout:7/938: chown d5/d14/dae/f49 185283132 1 2026-03-09T19:27:50.993 INFO:tasks.workunit.client.1.vm08.stdout:2/786: rename d3/d9/l51 to d3/d4/d3e/d9d/l10b 0 2026-03-09T19:27:50.994 INFO:tasks.workunit.client.1.vm08.stdout:2/787: write d3/d4/d23/d2c/ff0 [546009,53258] 0 2026-03-09T19:27:50.994 INFO:tasks.workunit.client.1.vm08.stdout:1/985: link d9/d11/d7a/d89/d8d/da3/lec d9/d11/d7a/l139 0 2026-03-09T19:27:51.000 INFO:tasks.workunit.client.1.vm08.stdout:6/877: mkdir d3/db/d12a/d147 0 2026-03-09T19:27:51.000 INFO:tasks.workunit.client.1.vm08.stdout:6/878: dread - d3/db/d43/fab zero size 2026-03-09T19:27:51.001 INFO:tasks.workunit.client.0.vm07.stdout:1/685: dwrite d1/d3/d21/f5f [0,4194304] 0 2026-03-09T19:27:51.003 INFO:tasks.workunit.client.1.vm08.stdout:7/939: creat d5/d14/d27/d54/dfb/d90/f13e x:0 0 0 2026-03-09T19:27:51.011 INFO:tasks.workunit.client.1.vm08.stdout:2/788: creat d3/d4/d23/d2c/d39/d5e/db8/f10c x:0 0 0 2026-03-09T19:27:51.012 INFO:tasks.workunit.client.1.vm08.stdout:2/789: read 
d3/d9/d4a/d9a/fd7 [185417,87001] 0 2026-03-09T19:27:51.017 INFO:tasks.workunit.client.1.vm08.stdout:1/986: mkdir d9/da/d17/d60/d13a 0 2026-03-09T19:27:51.021 INFO:tasks.workunit.client.1.vm08.stdout:7/940: mkdir d5/d14/dae/d1c/d13f 0 2026-03-09T19:27:51.021 INFO:tasks.workunit.client.1.vm08.stdout:7/941: readlink d5/l57 0 2026-03-09T19:27:51.056 INFO:tasks.workunit.client.0.vm07.stdout:3/723: write d1/f73 [266941,92750] 0 2026-03-09T19:27:51.060 INFO:tasks.workunit.client.0.vm07.stdout:3/724: dread d1/d3d/f95 [0,4194304] 0 2026-03-09T19:27:51.061 INFO:tasks.workunit.client.0.vm07.stdout:3/725: creat d1/d3d/d47/db3/d8e/da9/fe6 x:0 0 0 2026-03-09T19:27:51.065 INFO:tasks.workunit.client.0.vm07.stdout:5/646: mknod d3/d1a/d28/d6c/ccc 0 2026-03-09T19:27:51.068 INFO:tasks.workunit.client.0.vm07.stdout:5/647: dread d3/d1a/f12 [0,4194304] 0 2026-03-09T19:27:51.068 INFO:tasks.workunit.client.0.vm07.stdout:5/648: dread - d3/dd/d95/fc7 zero size 2026-03-09T19:27:51.070 INFO:tasks.workunit.client.0.vm07.stdout:0/630: truncate d0/d6/dc8/d99/fa7 3656986 0 2026-03-09T19:27:51.075 INFO:tasks.workunit.client.0.vm07.stdout:4/652: write d3/d11/d2b/f71 [1155476,30985] 0 2026-03-09T19:27:51.076 INFO:tasks.workunit.client.0.vm07.stdout:3/726: dread d1/d6/dd/f33 [0,4194304] 0 2026-03-09T19:27:51.078 INFO:tasks.workunit.client.0.vm07.stdout:4/653: unlink d3/d11/d16/f77 0 2026-03-09T19:27:51.078 INFO:tasks.workunit.client.0.vm07.stdout:7/651: write d0/f6c [512174,11741] 0 2026-03-09T19:27:51.083 INFO:tasks.workunit.client.0.vm07.stdout:4/654: mkdir d3/d11/d16/de1 0 2026-03-09T19:27:51.088 INFO:tasks.workunit.client.0.vm07.stdout:7/652: readlink d0/d4/d5/l6e 0 2026-03-09T19:27:51.089 INFO:tasks.workunit.client.0.vm07.stdout:8/677: symlink d7/d1d/d83/lf0 0 2026-03-09T19:27:51.089 INFO:tasks.workunit.client.0.vm07.stdout:8/678: creat d7/d9/d37/d34/ff1 x:0 0 0 2026-03-09T19:27:51.089 INFO:tasks.workunit.client.0.vm07.stdout:2/731: rmdir d3/dd/d16/d29/d2d/d45/d3b/d44/d96 39 
2026-03-09T19:27:51.089 INFO:tasks.workunit.client.0.vm07.stdout:2/732: dread - d3/dd/ffa zero size 2026-03-09T19:27:51.089 INFO:tasks.workunit.client.0.vm07.stdout:8/679: fsync d7/d9/d37/d45/d4f/db1/fce 0 2026-03-09T19:27:51.090 INFO:tasks.workunit.client.0.vm07.stdout:5/649: sync 2026-03-09T19:27:51.090 INFO:tasks.workunit.client.0.vm07.stdout:3/727: sync 2026-03-09T19:27:51.091 INFO:tasks.workunit.client.0.vm07.stdout:7/653: link d0/d4/d5/d8/d41/d64/d74/d98/f83 d0/d4/d5/d8/dcd/fdb 0 2026-03-09T19:27:51.092 INFO:tasks.workunit.client.0.vm07.stdout:8/680: mkdir d7/d30/d75/dcc/df2 0 2026-03-09T19:27:51.093 INFO:tasks.workunit.client.0.vm07.stdout:8/681: write d7/d1d/d83/d9f/fc6 [750008,90385] 0 2026-03-09T19:27:51.094 INFO:tasks.workunit.client.0.vm07.stdout:5/650: getdents d3/dd/d26/d2d/d9e 0 2026-03-09T19:27:51.101 INFO:tasks.workunit.client.0.vm07.stdout:2/733: creat d3/dd/d16/ffe x:0 0 0 2026-03-09T19:27:51.102 INFO:tasks.workunit.client.0.vm07.stdout:3/728: creat d1/d3d/d47/fe7 x:0 0 0 2026-03-09T19:27:51.102 INFO:tasks.workunit.client.0.vm07.stdout:5/651: creat d3/dd/dbe/fcd x:0 0 0 2026-03-09T19:27:51.104 INFO:tasks.workunit.client.0.vm07.stdout:3/729: creat d1/d6/d45/dac/fe8 x:0 0 0 2026-03-09T19:27:51.105 INFO:tasks.workunit.client.0.vm07.stdout:5/652: truncate d3/dd/f23 756900 0 2026-03-09T19:27:51.106 INFO:tasks.workunit.client.0.vm07.stdout:5/653: creat d3/dd/dbe/fce x:0 0 0 2026-03-09T19:27:51.108 INFO:tasks.workunit.client.1.vm08.stdout:8/855: write de/d1d/d2e/d5f/fbb [2394066,130970] 0 2026-03-09T19:27:51.109 INFO:tasks.workunit.client.0.vm07.stdout:5/654: mkdir d3/dd/d26/d2d/d60/dcf 0 2026-03-09T19:27:51.109 INFO:tasks.workunit.client.1.vm08.stdout:8/856: mknod de/d25/d33/c12c 0 2026-03-09T19:27:51.142 INFO:tasks.workunit.client.0.vm07.stdout:6/617: write d0/d4e/d7f/fe9 [62506,122061] 0 2026-03-09T19:27:51.143 INFO:tasks.workunit.client.1.vm08.stdout:5/848: unlink d16/l1a 0 2026-03-09T19:27:51.143 INFO:tasks.workunit.client.1.vm08.stdout:5/849: 
chown d16/d1e/d6e/cda 54416 1 2026-03-09T19:27:51.147 INFO:tasks.workunit.client.1.vm08.stdout:5/850: dread - d16/d8e/dff/fb0 zero size 2026-03-09T19:27:51.147 INFO:tasks.workunit.client.1.vm08.stdout:5/851: stat d16/d1e/fa5 0 2026-03-09T19:27:51.156 INFO:tasks.workunit.client.0.vm07.stdout:6/618: symlink d0/d1/db/d24/da4/dda/lf7 0 2026-03-09T19:27:51.161 INFO:tasks.workunit.client.1.vm08.stdout:3/930: dwrite d0/d6/d25/f56 [0,4194304] 0 2026-03-09T19:27:51.164 INFO:tasks.workunit.client.1.vm08.stdout:3/931: write d0/d6/de/d1b/d16/d17/f1d [4166465,76229] 0 2026-03-09T19:27:51.175 INFO:tasks.workunit.client.1.vm08.stdout:5/852: dread d16/d45/f46 [0,4194304] 0 2026-03-09T19:27:51.179 INFO:tasks.workunit.client.0.vm07.stdout:6/619: creat d0/d13/ff8 x:0 0 0 2026-03-09T19:27:51.182 INFO:tasks.workunit.client.1.vm08.stdout:2/790: rename d3/d4/d23/d2c/d39/d5e/de/l77 to d3/d4/d23/d2c/d39/d5e/db8/dff/l10d 0 2026-03-09T19:27:51.189 INFO:tasks.workunit.client.0.vm07.stdout:6/620: mknod d0/d2d/dd5/cf9 0 2026-03-09T19:27:51.190 INFO:tasks.workunit.client.1.vm08.stdout:3/932: creat d0/d6/de/d1b/d16/d17/dac/d109/f133 x:0 0 0 2026-03-09T19:27:51.199 INFO:tasks.workunit.client.1.vm08.stdout:3/933: chown d0/d6/de/d15/d96/l124 342 1 2026-03-09T19:27:51.203 INFO:tasks.workunit.client.1.vm08.stdout:9/853: unlink d0/d2/d80/de5/da2/da8/de8/c104 0 2026-03-09T19:27:51.206 INFO:tasks.workunit.client.0.vm07.stdout:0/631: rename d0/d6/d13/d1c/d11/la0 to d0/d6/dc8/lcd 0 2026-03-09T19:27:51.214 INFO:tasks.workunit.client.1.vm08.stdout:9/854: dread d0/d1b/f4b [0,4194304] 0 2026-03-09T19:27:51.216 INFO:tasks.workunit.client.0.vm07.stdout:9/670: write d0/d6/f9b [768676,18386] 0 2026-03-09T19:27:51.216 INFO:tasks.workunit.client.1.vm08.stdout:8/857: rmdir de/d1d/d2e 39 2026-03-09T19:27:51.218 INFO:tasks.workunit.client.0.vm07.stdout:4/655: rename d3/d11/d2b/d38/ddc/ca6 to d3/d11/d29/d34/de0/ce2 0 2026-03-09T19:27:51.219 INFO:tasks.workunit.client.0.vm07.stdout:9/671: unlink d0/db/d29/l2d 0 
2026-03-09T19:27:51.221 INFO:tasks.workunit.client.0.vm07.stdout:0/632: dread d0/d6/d13/d1c/d61/d69/fad [0,4194304] 0 2026-03-09T19:27:51.226 INFO:tasks.workunit.client.1.vm08.stdout:8/858: getdents de/d1d 0 2026-03-09T19:27:51.227 INFO:tasks.workunit.client.0.vm07.stdout:4/656: fsync d3/d11/d2b/d38/ddc/d22/f7a 0 2026-03-09T19:27:51.227 INFO:tasks.workunit.client.1.vm08.stdout:8/859: chown de/d47/dfd/d124 34597026 1 2026-03-09T19:27:51.227 INFO:tasks.workunit.client.0.vm07.stdout:4/657: dread - d3/d11/d2b/d38/ddc/fb4 zero size 2026-03-09T19:27:51.228 INFO:tasks.workunit.client.0.vm07.stdout:0/633: unlink d0/f65 0 2026-03-09T19:27:51.229 INFO:tasks.workunit.client.0.vm07.stdout:4/658: mknod d3/d11/d2b/d38/ddc/d22/d70/ce3 0 2026-03-09T19:27:51.230 INFO:tasks.workunit.client.0.vm07.stdout:0/634: mknod d0/d6/d13/d33/cce 0 2026-03-09T19:27:51.231 INFO:tasks.workunit.client.0.vm07.stdout:4/659: symlink d3/d11/d16/de1/le4 0 2026-03-09T19:27:51.231 INFO:tasks.workunit.client.0.vm07.stdout:0/635: creat d0/d6/fcf x:0 0 0 2026-03-09T19:27:51.231 INFO:tasks.workunit.client.0.vm07.stdout:0/636: chown d0/f9f 691 1 2026-03-09T19:27:51.232 INFO:tasks.workunit.client.0.vm07.stdout:4/660: creat d3/d11/d16/fe5 x:0 0 0 2026-03-09T19:27:51.232 INFO:tasks.workunit.client.0.vm07.stdout:0/637: fdatasync d0/d6/d13/f6c 0 2026-03-09T19:27:51.233 INFO:tasks.workunit.client.0.vm07.stdout:4/661: fsync d3/d4f/d56/d5f/f72 0 2026-03-09T19:27:51.233 INFO:tasks.workunit.client.0.vm07.stdout:0/638: chown d0/d6/dc8/c98 0 1 2026-03-09T19:27:51.235 INFO:tasks.workunit.client.0.vm07.stdout:4/662: creat d3/d11/d29/d34/de0/fe6 x:0 0 0 2026-03-09T19:27:51.236 INFO:tasks.workunit.client.0.vm07.stdout:4/663: mkdir d3/d11/d2b/d38/ddc/d22/d70/d99/de7 0 2026-03-09T19:27:51.236 INFO:tasks.workunit.client.0.vm07.stdout:0/639: read d0/d6/d13/d17/d19/d57/d6a/f74 [2747273,94100] 0 2026-03-09T19:27:51.239 INFO:tasks.workunit.client.0.vm07.stdout:0/640: mkdir d0/d6/d13/dd0 0 2026-03-09T19:27:51.240 
INFO:tasks.workunit.client.0.vm07.stdout:0/641: chown d0/l9d 46 1 2026-03-09T19:27:51.244 INFO:tasks.workunit.client.0.vm07.stdout:9/672: dread d0/d6/d57/f58 [0,4194304] 0 2026-03-09T19:27:51.245 INFO:tasks.workunit.client.0.vm07.stdout:0/642: link d0/f3d d0/d6/d13/d1c/d11/fd1 0 2026-03-09T19:27:51.247 INFO:tasks.workunit.client.0.vm07.stdout:0/643: mknod d0/d6/d13/d17/d19/cd2 0 2026-03-09T19:27:51.250 INFO:tasks.workunit.client.0.vm07.stdout:9/673: rename d0/db/d29/d2c/d36/d7d to d0/d6/d57/deb 0 2026-03-09T19:27:51.251 INFO:tasks.workunit.client.0.vm07.stdout:0/644: truncate d0/d6/d13/d17/d19/f1f 5049484 0 2026-03-09T19:27:51.252 INFO:tasks.workunit.client.0.vm07.stdout:0/645: dread - d0/d6/d13/d17/fc5 zero size 2026-03-09T19:27:51.252 INFO:tasks.workunit.client.0.vm07.stdout:9/674: symlink d0/d6f/d86/lec 0 2026-03-09T19:27:51.254 INFO:tasks.workunit.client.0.vm07.stdout:9/675: creat d0/d6/d73/fed x:0 0 0 2026-03-09T19:27:51.256 INFO:tasks.workunit.client.0.vm07.stdout:9/676: symlink d0/db/lee 0 2026-03-09T19:27:51.263 INFO:tasks.workunit.client.0.vm07.stdout:0/646: sync 2026-03-09T19:27:51.266 INFO:tasks.workunit.client.0.vm07.stdout:0/647: creat d0/d6/d13/d1c/d52/fd3 x:0 0 0 2026-03-09T19:27:51.268 INFO:tasks.workunit.client.0.vm07.stdout:0/648: creat d0/d6/dc8/d99/fd4 x:0 0 0 2026-03-09T19:27:51.270 INFO:tasks.workunit.client.0.vm07.stdout:0/649: creat d0/d6/d13/dd0/fd5 x:0 0 0 2026-03-09T19:27:51.274 INFO:tasks.workunit.client.0.vm07.stdout:0/650: dwrite d0/d6/d13/da1/fbd [0,4194304] 0 2026-03-09T19:27:51.278 INFO:tasks.workunit.client.1.vm08.stdout:4/829: dwrite da/d10/d26/d3a/fcc [0,4194304] 0 2026-03-09T19:27:51.283 INFO:tasks.workunit.client.0.vm07.stdout:0/651: fdatasync d0/d6/d13/d17/d19/d58/fab 0 2026-03-09T19:27:51.286 INFO:tasks.workunit.client.0.vm07.stdout:0/652: stat d0/d6/d13/d17/cae 0 2026-03-09T19:27:51.293 INFO:tasks.workunit.client.0.vm07.stdout:8/682: rmdir d7/d30 39 2026-03-09T19:27:51.295 INFO:tasks.workunit.client.0.vm07.stdout:1/686: 
write d1/d3/f4 [3468502,67852] 0 2026-03-09T19:27:51.295 INFO:tasks.workunit.client.1.vm08.stdout:1/987: rename d9/da/l21 to d9/l13b 0 2026-03-09T19:27:51.296 INFO:tasks.workunit.client.1.vm08.stdout:6/879: write d3/dbc/f126 [875073,102912] 0 2026-03-09T19:27:51.298 INFO:tasks.workunit.client.1.vm08.stdout:0/888: dwrite dd/d22/d24/d49/d50/db3/ffc [0,4194304] 0 2026-03-09T19:27:51.299 INFO:tasks.workunit.client.0.vm07.stdout:1/687: dwrite d1/f38 [0,4194304] 0 2026-03-09T19:27:51.307 INFO:tasks.workunit.client.1.vm08.stdout:7/942: rmdir d5/d14/d27/d54/d86 39 2026-03-09T19:27:51.315 INFO:tasks.workunit.client.1.vm08.stdout:6/880: symlink d3/d34/d5c/da2/l148 0 2026-03-09T19:27:51.317 INFO:tasks.workunit.client.1.vm08.stdout:0/889: rename dd/d22/d24/c64 to dd/d22/d100/c11d 0 2026-03-09T19:27:51.317 INFO:tasks.workunit.client.1.vm08.stdout:0/890: stat dd/d22/d63/d6e/f10e 0 2026-03-09T19:27:51.319 INFO:tasks.workunit.client.0.vm07.stdout:0/653: fdatasync d0/d6/d13/d1c/d50/f60 0 2026-03-09T19:27:51.323 INFO:tasks.workunit.client.1.vm08.stdout:6/881: symlink d3/d34/da9/da4/d117/l149 0 2026-03-09T19:27:51.325 INFO:tasks.workunit.client.0.vm07.stdout:7/654: dwrite d0/d4/d5/d8/f93 [0,4194304] 0 2026-03-09T19:27:51.331 INFO:tasks.workunit.client.1.vm08.stdout:5/853: dwrite d16/d45/daf/df5/fb4 [0,4194304] 0 2026-03-09T19:27:51.332 INFO:tasks.workunit.client.1.vm08.stdout:5/854: chown d16/d1e/d8c/d99 737773 1 2026-03-09T19:27:51.332 INFO:tasks.workunit.client.0.vm07.stdout:5/655: dwrite d3/fe [0,4194304] 0 2026-03-09T19:27:51.332 INFO:tasks.workunit.client.0.vm07.stdout:8/683: fdatasync d7/d9/d37/d34/f79 0 2026-03-09T19:27:51.334 INFO:tasks.workunit.client.0.vm07.stdout:2/734: dwrite d3/dd/d16/d29/f42 [0,4194304] 0 2026-03-09T19:27:51.340 INFO:tasks.workunit.client.0.vm07.stdout:0/654: fsync d0/f41 0 2026-03-09T19:27:51.341 INFO:tasks.workunit.client.0.vm07.stdout:6/621: write d0/d1/db/f70 [3718256,29502] 0 2026-03-09T19:27:51.342 INFO:tasks.workunit.client.0.vm07.stdout:6/622: 
stat d0/d4e/l56 0 2026-03-09T19:27:51.342 INFO:tasks.workunit.client.0.vm07.stdout:6/623: write d0/d4e/d7f/dbe/fe6 [715605,36602] 0 2026-03-09T19:27:51.344 INFO:tasks.workunit.client.0.vm07.stdout:3/730: dwrite d1/d74/f6e [0,4194304] 0 2026-03-09T19:27:51.355 INFO:tasks.workunit.client.1.vm08.stdout:7/943: symlink d5/d14/dae/d3a/d42/d85/da0/df5/d11b/l140 0 2026-03-09T19:27:51.356 INFO:tasks.workunit.client.1.vm08.stdout:2/791: write d3/d4/f49 [929663,55899] 0 2026-03-09T19:27:51.361 INFO:tasks.workunit.client.1.vm08.stdout:3/934: dwrite d0/d52/d6d/d77/f68 [0,4194304] 0 2026-03-09T19:27:51.362 INFO:tasks.workunit.client.1.vm08.stdout:3/935: chown d0/d6/de/d1a/f5a 7491517 1 2026-03-09T19:27:51.373 INFO:tasks.workunit.client.1.vm08.stdout:3/936: write d0/d6/de/d1b/d16/d17/dac/d109/fb7 [2806765,34888] 0 2026-03-09T19:27:51.373 INFO:tasks.workunit.client.0.vm07.stdout:1/688: rename d1/db/d31/d56/c73 to d1/d3e/dae/ce4 0 2026-03-09T19:27:51.373 INFO:tasks.workunit.client.0.vm07.stdout:1/689: stat d1/d3/d21 0 2026-03-09T19:27:51.373 INFO:tasks.workunit.client.0.vm07.stdout:1/690: write d1/d3e/db3/fda [31114,118310] 0 2026-03-09T19:27:51.373 INFO:tasks.workunit.client.0.vm07.stdout:1/691: chown d1/d11/fbc 16290473 1 2026-03-09T19:27:51.378 INFO:tasks.workunit.client.1.vm08.stdout:9/855: dwrite d0/d1b/d68/d7f/fe3 [0,4194304] 0 2026-03-09T19:27:51.391 INFO:tasks.workunit.client.1.vm08.stdout:8/860: dwrite de/d1d/d4f/fae [0,4194304] 0 2026-03-09T19:27:51.395 INFO:tasks.workunit.client.1.vm08.stdout:9/856: dread d0/d1b/d97/d48/f53 [0,4194304] 0 2026-03-09T19:27:51.410 INFO:tasks.workunit.client.0.vm07.stdout:4/664: dwrite d3/d11/d2b/d38/ddc/d91/fb3 [0,4194304] 0 2026-03-09T19:27:51.411 INFO:tasks.workunit.client.1.vm08.stdout:8/861: chown de/d25/d31/f8e 715279452 1 2026-03-09T19:27:51.413 INFO:tasks.workunit.client.1.vm08.stdout:0/891: rename dd/d22/d24/d49/d50 to dd/d22/d27/d11e 0 2026-03-09T19:27:51.444 INFO:tasks.workunit.client.1.vm08.stdout:6/882: sync 
2026-03-09T19:27:51.453 INFO:tasks.workunit.client.1.vm08.stdout:8/862: chown de/d25/d31/d82/l75 1 1 2026-03-09T19:27:51.460 INFO:tasks.workunit.client.0.vm07.stdout:9/677: dwrite d0/d6/d57/f59 [0,4194304] 0 2026-03-09T19:27:51.460 INFO:tasks.workunit.client.1.vm08.stdout:0/892: mknod dd/d22/d27/d6c/c11f 0 2026-03-09T19:27:51.460 INFO:tasks.workunit.client.1.vm08.stdout:5/855: link d16/c6d d16/d1e/dc9/d102/c113 0 2026-03-09T19:27:51.460 INFO:tasks.workunit.client.1.vm08.stdout:3/937: creat d0/d6/de/f134 x:0 0 0 2026-03-09T19:27:51.466 INFO:tasks.workunit.client.1.vm08.stdout:3/938: mkdir d0/d6/d10a/d135 0 2026-03-09T19:27:51.469 INFO:tasks.workunit.client.1.vm08.stdout:9/857: getdents d0/d2/d14/d98 0 2026-03-09T19:27:51.469 INFO:tasks.workunit.client.1.vm08.stdout:0/893: symlink dd/d22/d27/d65/l120 0 2026-03-09T19:27:51.473 INFO:tasks.workunit.client.1.vm08.stdout:8/863: link de/d47/dfd/d99/da5/db3/f9e de/d117/f12d 0 2026-03-09T19:27:51.474 INFO:tasks.workunit.client.1.vm08.stdout:3/939: creat d0/d6/de/d15/d96/f136 x:0 0 0 2026-03-09T19:27:51.475 INFO:tasks.workunit.client.1.vm08.stdout:0/894: mkdir dd/d22/d63/d6e/d72/d121 0 2026-03-09T19:27:51.479 INFO:tasks.workunit.client.1.vm08.stdout:9/858: rename d0/d2/d14/d98/d99/cf7 to d0/d2/d14/d98/d99/dd8/c11c 0 2026-03-09T19:27:51.481 INFO:tasks.workunit.client.0.vm07.stdout:2/735: mkdir d3/dd/d16/d29/d3c/da2/dff 0 2026-03-09T19:27:51.484 INFO:tasks.workunit.client.1.vm08.stdout:3/940: dread - d0/d8/d24/fee zero size 2026-03-09T19:27:51.497 INFO:tasks.workunit.client.1.vm08.stdout:0/895: creat dd/d22/d27/d11e/d78/db4/f122 x:0 0 0 2026-03-09T19:27:51.501 INFO:tasks.workunit.client.0.vm07.stdout:5/656: creat d3/dd/d26/d2d/d79/d9f/fd0 x:0 0 0 2026-03-09T19:27:51.501 INFO:tasks.workunit.client.0.vm07.stdout:6/624: symlink d0/d1/db/d52/d94/d81/lfa 0 2026-03-09T19:27:51.502 INFO:tasks.workunit.client.1.vm08.stdout:8/864: creat de/d47/dfd/d124/f12e x:0 0 0 2026-03-09T19:27:51.505 INFO:tasks.workunit.client.1.vm08.stdout:8/865: 
dwrite de/f10 [0,4194304] 0 2026-03-09T19:27:51.513 INFO:tasks.workunit.client.1.vm08.stdout:3/941: symlink d0/d6/d93/dcb/l137 0 2026-03-09T19:27:51.516 INFO:tasks.workunit.client.1.vm08.stdout:0/896: rename dd/d22/d27/d2e/cee to dd/d22/d27/d11e/dd4/c123 0 2026-03-09T19:27:51.517 INFO:tasks.workunit.client.0.vm07.stdout:7/655: mknod d0/d4/d5/d26/dc9/cdc 0 2026-03-09T19:27:51.518 INFO:tasks.workunit.client.1.vm08.stdout:9/859: symlink d0/l11d 0 2026-03-09T19:27:51.521 INFO:tasks.workunit.client.1.vm08.stdout:3/942: unlink d0/d6/de/d6e/d51/f11f 0 2026-03-09T19:27:51.524 INFO:tasks.workunit.client.1.vm08.stdout:3/943: dwrite d0/d52/d6d/f8b [0,4194304] 0 2026-03-09T19:27:51.526 INFO:tasks.workunit.client.1.vm08.stdout:3/944: chown d0/d52/d7c/fc1 235230 1 2026-03-09T19:27:51.535 INFO:tasks.workunit.client.0.vm07.stdout:4/665: symlink d3/d11/d2b/d38/ddc/d91/le8 0 2026-03-09T19:27:51.536 INFO:tasks.workunit.client.1.vm08.stdout:9/860: symlink d0/d1b/d68/dfe/l11e 0 2026-03-09T19:27:51.538 INFO:tasks.workunit.client.1.vm08.stdout:3/945: truncate d0/d6/de/d1b/d16/fcf 337602 0 2026-03-09T19:27:51.541 INFO:tasks.workunit.client.1.vm08.stdout:9/861: symlink d0/d1b/d97/d48/d5d/d74/ded/l11f 0 2026-03-09T19:27:51.546 INFO:tasks.workunit.client.1.vm08.stdout:0/897: getdents dd/d22/d27/d4f 0 2026-03-09T19:27:51.548 INFO:tasks.workunit.client.0.vm07.stdout:7/656: creat d0/d52/db4/fdd x:0 0 0 2026-03-09T19:27:51.549 INFO:tasks.workunit.client.0.vm07.stdout:8/684: link d7/d30/d32/fba d7/d16/d1e/ff3 0 2026-03-09T19:27:51.550 INFO:tasks.workunit.client.0.vm07.stdout:8/685: chown d7/d9/d10/d44/cd7 50791 1 2026-03-09T19:27:51.551 INFO:tasks.workunit.client.0.vm07.stdout:9/678: symlink d0/db/d29/d2c/lef 0 2026-03-09T19:27:51.560 INFO:tasks.workunit.client.0.vm07.stdout:9/679: stat d0/l2e 0 2026-03-09T19:27:51.560 INFO:tasks.workunit.client.0.vm07.stdout:0/655: creat d0/d6/d13/d1c/d50/fd6 x:0 0 0 2026-03-09T19:27:51.560 INFO:tasks.workunit.client.0.vm07.stdout:5/657: mknod d3/cd1 0 
2026-03-09T19:27:51.563 INFO:tasks.workunit.client.0.vm07.stdout:6/625: dread d0/d13/f26 [0,4194304] 0 2026-03-09T19:27:51.567 INFO:tasks.workunit.client.1.vm08.stdout:3/946: sync 2026-03-09T19:27:51.583 INFO:tasks.workunit.client.0.vm07.stdout:3/731: rename d1/d6/dd/dbf/cd8 to d1/d3d/d47/ce9 0 2026-03-09T19:27:51.584 INFO:tasks.workunit.client.0.vm07.stdout:3/732: readlink d1/d3d/d47/db3/dc2/d28/l7f 0 2026-03-09T19:27:51.625 INFO:tasks.workunit.client.0.vm07.stdout:7/657: rmdir d0/d52/d54/d95 39 2026-03-09T19:27:51.627 INFO:tasks.workunit.client.1.vm08.stdout:0/898: link dd/d22/d27/d6c/c11f dd/d22/d27/c124 0 2026-03-09T19:27:51.643 INFO:tasks.workunit.client.1.vm08.stdout:4/830: write da/d10/d16/d28/d2f/d4f/d64/d84/d8a/fb9 [445072,42220] 0 2026-03-09T19:27:51.646 INFO:tasks.workunit.client.0.vm07.stdout:8/686: fsync d7/d9/d37/d45/f7e 0 2026-03-09T19:27:51.650 INFO:tasks.workunit.client.0.vm07.stdout:8/687: dwrite d7/d50/fe5 [0,4194304] 0 2026-03-09T19:27:51.653 INFO:tasks.workunit.client.1.vm08.stdout:8/866: fdatasync de/d1d/d2e/f61 0 2026-03-09T19:27:51.656 INFO:tasks.workunit.client.0.vm07.stdout:9/680: chown d0/d6/l49 1 1 2026-03-09T19:27:51.674 INFO:tasks.workunit.client.1.vm08.stdout:8/867: sync 2026-03-09T19:27:51.689 INFO:tasks.workunit.client.1.vm08.stdout:7/944: write d5/d14/dae/d3a/d42/f71 [904976,6622] 0 2026-03-09T19:27:51.689 INFO:tasks.workunit.client.1.vm08.stdout:2/792: write d3/d4/d23/d2c/d39/d5e/fb7 [1263053,73147] 0 2026-03-09T19:27:51.699 INFO:tasks.workunit.client.0.vm07.stdout:6/626: creat d0/d1/d28/da8/ffb x:0 0 0 2026-03-09T19:27:51.703 INFO:tasks.workunit.client.1.vm08.stdout:0/899: link dd/d22/d24/f114 dd/d22/d63/d93/f125 0 2026-03-09T19:27:51.705 INFO:tasks.workunit.client.0.vm07.stdout:4/666: rename d3/d11/d16/l1f to d3/d11/d2b/d38/le9 0 2026-03-09T19:27:51.717 INFO:tasks.workunit.client.0.vm07.stdout:6/627: dread d0/d1/db/f4b [0,4194304] 0 2026-03-09T19:27:51.723 INFO:tasks.workunit.client.0.vm07.stdout:7/658: mknod 
d0/d4/d5/d26/d32/dbd/cde 0 2026-03-09T19:27:51.743 INFO:tasks.workunit.client.0.vm07.stdout:8/688: truncate d7/f19 800756 0 2026-03-09T19:27:51.745 INFO:tasks.workunit.client.1.vm08.stdout:6/883: dwrite d3/dbc/deb/f106 [0,4194304] 0 2026-03-09T19:27:51.746 INFO:tasks.workunit.client.1.vm08.stdout:6/884: chown d3/db/fdb 1012 1 2026-03-09T19:27:51.759 INFO:tasks.workunit.client.0.vm07.stdout:9/681: dwrite d0/d6/d73/fe6 [0,4194304] 0 2026-03-09T19:27:51.761 INFO:tasks.workunit.client.0.vm07.stdout:9/682: stat d0/db/d29/d68/f8e 0 2026-03-09T19:27:51.761 INFO:tasks.workunit.client.0.vm07.stdout:9/683: stat d0/db/d29/d68/c76 0 2026-03-09T19:27:51.783 INFO:tasks.workunit.client.0.vm07.stdout:5/658: link d3/dd/d95/fc1 d3/d1a/d28/d6c/d72/db5/fd2 0 2026-03-09T19:27:51.784 INFO:tasks.workunit.client.0.vm07.stdout:5/659: write d3/dd/d26/d3f/fbf [302945,122475] 0 2026-03-09T19:27:51.796 INFO:tasks.workunit.client.0.vm07.stdout:1/692: write d1/d11/fbc [669363,44391] 0 2026-03-09T19:27:51.800 INFO:tasks.workunit.client.0.vm07.stdout:6/628: creat d0/d1/db/d52/ffc x:0 0 0 2026-03-09T19:27:51.803 INFO:tasks.workunit.client.0.vm07.stdout:8/689: creat d7/d9/d37/d45/d97/dbc/ff4 x:0 0 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:9/862: dwrite d0/d2/d80/de5/da2/da8/de8/dcd/fda [0,4194304] 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:4/831: write da/d10/d26/dd8/f43 [2076493,26729] 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:3/947: dwrite d0/d52/d7c/f99 [4194304,4194304] 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:8/868: truncate de/d1d/d4f/f51 341844 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:5/856: dread d16/d1e/d8c/d99/da8/d9a/f106 [4194304,4194304] 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:7/945: truncate d5/d14/dae/dd1/d109/d8f/fe0 3102577 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:7/946: write 
d5/d14/dae/d1c/db5/df8/d13b/dc7/f10a [2850216,122488] 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:0/900: mkdir dd/d22/d27/d11e/d105/d82/d126 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:5/857: mkdir d16/d8e/ddb/d107/d114 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:0/901: mknod dd/d22/d63/d6e/df5/c127 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:6/885: mkdir d3/d68/d14a 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.1.vm08.stdout:9/863: fdatasync d0/d1b/f7c 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.0.vm07.stdout:9/684: truncate d0/db/d29/fb3 20186 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.0.vm07.stdout:9/685: chown d0/db/d29/d68/c76 635711569 1 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.0.vm07.stdout:0/656: creat d0/d6/d13/d1c/fd7 x:0 0 0 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.0.vm07.stdout:0/657: chown d0/d6/d13/d17/fc5 17080405 1 2026-03-09T19:27:51.957 INFO:tasks.workunit.client.0.vm07.stdout:2/736: write d3/d11/f39 [641645,111250] 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:1/693: fdatasync d1/d3/f12 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:6/629: dread - d0/dbf/d95/d31/fa6 zero size 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:8/690: fsync d7/d9/d37/d45/d56/d67/fa3 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:9/686: fsync d0/db/f41 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:4/667: creat d3/fea x:0 0 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:4/668: write d3/d11/d2b/d38/ddc/fb4 [188062,118135] 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:1/694: creat d1/d11/d37/d5d/d50/fe5 x:0 0 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:3/733: getdents d1/d6/d4c 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:6/630: mknod d0/d2d/cfd 0 
2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:2/737: mknod d3/c100 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:1/695: creat d1/d11/d37/d3f/d45/d87/fe6 x:0 0 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:6/631: truncate d0/d1/fd4 175411 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:6/632: chown d0/d1/d28/d76/dad 9 1 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:4/669: mknod d3/d4f/d56/ceb 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:2/738: creat d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4/f101 x:0 0 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:3/734: link d1/d6/d45/fbe d1/d6/d45/dac/fea 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:3/735: dread - d1/d3d/faa zero size 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:6/633: link d0/d1/db/d52/fa1 d0/d1/d28/da8/ffe 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:6/634: fdatasync d0/d1/db/d24/da4/ff3 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:6/635: dwrite d0/d1/db/d17/dc4/fd7 [0,4194304] 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:2/739: link d3/dd/d16/d29/f58 d3/dd/d16/d29/d2d/d45/d85/f102 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:3/736: rename d1/d3d/d47/db3/dc2/d28/f3c to d1/d6/d45/dac/feb 0 2026-03-09T19:27:51.958 INFO:tasks.workunit.client.0.vm07.stdout:6/636: rename d0/d1/db/d24/fde to d0/d1/db/d52/d94/fff 0 2026-03-09T19:27:51.967 INFO:tasks.workunit.client.1.vm08.stdout:0/902: fdatasync dd/d22/d27/d4f/fd7 0 2026-03-09T19:27:51.969 INFO:tasks.workunit.client.0.vm07.stdout:3/737: rename d1/d3d/d47/db3/f49 to d1/d6/d45/d54/de5/fec 0 2026-03-09T19:27:51.971 INFO:tasks.workunit.client.0.vm07.stdout:3/738: dwrite d1/d74/fe4 [0,4194304] 0 2026-03-09T19:27:51.983 INFO:tasks.workunit.client.0.vm07.stdout:8/691: sync 2026-03-09T19:27:52.003 
INFO:tasks.workunit.client.0.vm07.stdout:6/637: truncate d0/dbf/d95/f74 714713 0 2026-03-09T19:27:52.003 INFO:tasks.workunit.client.1.vm08.stdout:9/864: fdatasync d0/d2/f2f 0 2026-03-09T19:27:52.003 INFO:tasks.workunit.client.1.vm08.stdout:1/988: link d9/da/d2d/d4e/l5d d9/da/l13c 0 2026-03-09T19:27:52.013 INFO:tasks.workunit.client.0.vm07.stdout:3/739: creat d1/d6/dd/fed x:0 0 0 2026-03-09T19:27:52.013 INFO:tasks.workunit.client.0.vm07.stdout:8/692: fsync d7/d9/d10/fb9 0 2026-03-09T19:27:52.015 INFO:tasks.workunit.client.0.vm07.stdout:6/638: unlink d0/d1/db/d17/dc4/d7b/lcd 0 2026-03-09T19:27:52.018 INFO:tasks.workunit.client.1.vm08.stdout:0/903: unlink dd/d22/d27/d11e/de3/l118 0 2026-03-09T19:27:52.030 INFO:tasks.workunit.client.0.vm07.stdout:3/740: mkdir d1/d3d/d47/db3/d8e/dee 0 2026-03-09T19:27:52.032 INFO:tasks.workunit.client.1.vm08.stdout:5/858: rename d16/d8e/ddb/de3 to d16/d45/d115 0 2026-03-09T19:27:52.035 INFO:tasks.workunit.client.0.vm07.stdout:8/693: rename d7/d9/d37/d45/d56/d67/c8d to d7/d9/ddf/cf5 0 2026-03-09T19:27:52.035 INFO:tasks.workunit.client.0.vm07.stdout:8/694: chown d7/d30/d32/fba 431894 1 2026-03-09T19:27:52.036 INFO:tasks.workunit.client.0.vm07.stdout:8/695: fsync d7/d9/d37/d45/d97/dbc/fca 0 2026-03-09T19:27:52.038 INFO:tasks.workunit.client.1.vm08.stdout:9/865: write d0/d1b/d97/d48/d5d/ddf/f7d [216077,107255] 0 2026-03-09T19:27:52.046 INFO:tasks.workunit.client.0.vm07.stdout:6/639: getdents d0/dbf/d95/d31/d9e 0 2026-03-09T19:27:52.046 INFO:tasks.workunit.client.1.vm08.stdout:9/866: write d0/d1b/d97/d48/d5d/f92 [1675388,84795] 0 2026-03-09T19:27:52.055 INFO:tasks.workunit.client.0.vm07.stdout:8/696: dread d7/d30/f3e [0,4194304] 0 2026-03-09T19:27:52.066 INFO:tasks.workunit.client.1.vm08.stdout:5/859: truncate d16/d45/fdd 308366 0 2026-03-09T19:27:52.070 INFO:tasks.workunit.client.0.vm07.stdout:6/640: unlink d0/d1/db/d52/fa1 0 2026-03-09T19:27:52.073 INFO:tasks.workunit.client.0.vm07.stdout:8/697: creat d7/d9/d37/d34/ff6 x:0 0 0 
2026-03-09T19:27:52.077 INFO:tasks.workunit.client.1.vm08.stdout:9/867: mknod d0/d1b/d68/c120 0 2026-03-09T19:27:52.081 INFO:tasks.workunit.client.0.vm07.stdout:3/741: mknod d1/d3d/d47/db3/dc2/d28/d7c/cef 0 2026-03-09T19:27:52.083 INFO:tasks.workunit.client.0.vm07.stdout:3/742: fdatasync d1/d74/f6e 0 2026-03-09T19:27:52.085 INFO:tasks.workunit.client.1.vm08.stdout:6/886: getdents d3/d34/d5c/da2/dd6 0 2026-03-09T19:27:52.086 INFO:tasks.workunit.client.0.vm07.stdout:6/641: creat d0/d4e/d75/f100 x:0 0 0 2026-03-09T19:27:52.087 INFO:tasks.workunit.client.0.vm07.stdout:6/642: dread - d0/d4e/d7f/dbe/fee zero size 2026-03-09T19:27:52.088 INFO:tasks.workunit.client.0.vm07.stdout:8/698: creat d7/d50/da6/dc5/ff7 x:0 0 0 2026-03-09T19:27:52.102 INFO:tasks.workunit.client.0.vm07.stdout:6/643: mknod d0/d44/c101 0 2026-03-09T19:27:52.138 INFO:tasks.workunit.client.1.vm08.stdout:6/887: mknod d3/db/d12a/d147/c14b 0 2026-03-09T19:27:52.138 INFO:tasks.workunit.client.1.vm08.stdout:6/888: write d3/db/d43/fd3 [3662921,52997] 0 2026-03-09T19:27:52.138 INFO:tasks.workunit.client.0.vm07.stdout:8/699: rename d7/d30/d75/lda to d7/d16/dcf/lf8 0 2026-03-09T19:27:52.138 INFO:tasks.workunit.client.0.vm07.stdout:6/644: truncate d0/dbf/d95/d31/fa6 401118 0 2026-03-09T19:27:52.141 INFO:tasks.workunit.client.1.vm08.stdout:6/889: rmdir d3/d34/d6f 39 2026-03-09T19:27:52.151 INFO:tasks.workunit.client.0.vm07.stdout:8/700: dread d7/d50/da6/fde [0,4194304] 0 2026-03-09T19:27:52.169 INFO:tasks.workunit.client.0.vm07.stdout:8/701: dread d7/d9/d37/d34/f55 [0,4194304] 0 2026-03-09T19:27:52.172 INFO:tasks.workunit.client.0.vm07.stdout:8/702: rename d7/d16/l29 to d7/d9/d37/d45/d56/d67/de7/lf9 0 2026-03-09T19:27:52.184 INFO:tasks.workunit.client.0.vm07.stdout:8/703: getdents d7/d9/d37/d45/d4f 0 2026-03-09T19:27:52.190 INFO:tasks.workunit.client.0.vm07.stdout:8/704: truncate d7/d9/d37/d45/f76 1587128 0 2026-03-09T19:27:52.199 INFO:tasks.workunit.client.0.vm07.stdout:8/705: symlink d7/d9/d10/d44/lfa 0 
2026-03-09T19:27:52.200 INFO:tasks.workunit.client.0.vm07.stdout:8/706: fdatasync d7/d9/d37/d45/f7d 0 2026-03-09T19:27:52.200 INFO:tasks.workunit.client.0.vm07.stdout:8/707: symlink d7/d9/d37/d45/d56/d67/de7/lfb 0 2026-03-09T19:27:52.200 INFO:tasks.workunit.client.0.vm07.stdout:8/708: mkdir d7/d9/d10/dd8/dfc 0 2026-03-09T19:27:52.203 INFO:tasks.workunit.client.0.vm07.stdout:8/709: getdents d7/d30/d75/dcc 0 2026-03-09T19:27:52.282 INFO:tasks.workunit.client.0.vm07.stdout:5/660: write d3/d1a/d5d/f80 [1361312,110371] 0 2026-03-09T19:27:52.290 INFO:tasks.workunit.client.0.vm07.stdout:0/658: write d0/d6/d13/d1c/fa5 [318493,102208] 0 2026-03-09T19:27:52.294 INFO:tasks.workunit.client.0.vm07.stdout:7/659: dwrite d0/d52/f62 [0,4194304] 0 2026-03-09T19:27:52.295 INFO:tasks.workunit.client.0.vm07.stdout:9/687: dwrite d0/db/f41 [0,4194304] 0 2026-03-09T19:27:52.311 INFO:tasks.workunit.client.0.vm07.stdout:0/659: creat d0/d6/d13/d17/d19/d57/d6a/fd8 x:0 0 0 2026-03-09T19:27:52.313 INFO:tasks.workunit.client.0.vm07.stdout:9/688: stat d0/db/d29/d2c/d36/fa1 0 2026-03-09T19:27:52.315 INFO:tasks.workunit.client.1.vm08.stdout:2/793: dwrite d3/d9/f5d [0,4194304] 0 2026-03-09T19:27:52.316 INFO:tasks.workunit.client.1.vm08.stdout:2/794: readlink d3/d9/lc3 0 2026-03-09T19:27:52.317 INFO:tasks.workunit.client.0.vm07.stdout:7/660: chown d0/d4/d5/f20 411579 1 2026-03-09T19:27:52.318 INFO:tasks.workunit.client.0.vm07.stdout:7/661: write d0/d52/fd2 [537140,83898] 0 2026-03-09T19:27:52.321 INFO:tasks.workunit.client.0.vm07.stdout:0/660: mkdir d0/d6/d13/d17/d19/d58/dd9 0 2026-03-09T19:27:52.322 INFO:tasks.workunit.client.0.vm07.stdout:0/661: chown d0/d6/fcf 22005208 1 2026-03-09T19:27:52.326 INFO:tasks.workunit.client.0.vm07.stdout:7/662: creat d0/d4/d5/d8/d41/fdf x:0 0 0 2026-03-09T19:27:52.326 INFO:tasks.workunit.client.0.vm07.stdout:0/662: truncate d0/d6/d13/d1c/d50/f85 166451 0 2026-03-09T19:27:52.327 INFO:tasks.workunit.client.0.vm07.stdout:0/663: stat d0/d6/d13/d17/d19/d58/fab 0 
2026-03-09T19:27:52.329 INFO:tasks.workunit.client.0.vm07.stdout:9/689: readlink d0/d17/l44 0 2026-03-09T19:27:52.329 INFO:tasks.workunit.client.0.vm07.stdout:1/696: write d1/d11/d37/d3f/d45/f16 [5075367,72762] 0 2026-03-09T19:27:52.330 INFO:tasks.workunit.client.1.vm08.stdout:4/832: dwrite da/d10/d16/d28/d46/fb1 [0,4194304] 0 2026-03-09T19:27:52.332 INFO:tasks.workunit.client.0.vm07.stdout:7/663: truncate d0/d4/d5/d8/fa3 840389 0 2026-03-09T19:27:52.332 INFO:tasks.workunit.client.1.vm08.stdout:4/833: chown da/d10/c17 0 1 2026-03-09T19:27:52.332 INFO:tasks.workunit.client.0.vm07.stdout:7/664: chown d0/d4/d5/d26/d32/dbd 1843 1 2026-03-09T19:27:52.341 INFO:tasks.workunit.client.0.vm07.stdout:0/664: rename d0/d6/d13/d17/d19/d57/d6a/f7a to d0/d6/d13/d17/d19/d57/d6a/fda 0 2026-03-09T19:27:52.343 INFO:tasks.workunit.client.1.vm08.stdout:3/948: dwrite d0/d4b/f74 [0,4194304] 0 2026-03-09T19:27:52.344 INFO:tasks.workunit.client.0.vm07.stdout:1/697: sync 2026-03-09T19:27:52.345 INFO:tasks.workunit.client.0.vm07.stdout:1/698: fdatasync d1/d3/f4 0 2026-03-09T19:27:52.347 INFO:tasks.workunit.client.0.vm07.stdout:1/699: sync 2026-03-09T19:27:52.349 INFO:tasks.workunit.client.0.vm07.stdout:1/700: chown d1/d3e/f49 11085 1 2026-03-09T19:27:52.355 INFO:tasks.workunit.client.1.vm08.stdout:8/869: dwrite de/d47/dfd/d99/fd0 [0,4194304] 0 2026-03-09T19:27:52.359 INFO:tasks.workunit.client.0.vm07.stdout:9/690: creat d0/d6/d57/d8f/ff0 x:0 0 0 2026-03-09T19:27:52.365 INFO:tasks.workunit.client.1.vm08.stdout:5/860: rmdir d16/d8e/ddb/d107 39 2026-03-09T19:27:52.374 INFO:tasks.workunit.client.0.vm07.stdout:4/670: write d3/d11/d2b/d38/ddc/d22/f24 [4027376,40155] 0 2026-03-09T19:27:52.379 INFO:tasks.workunit.client.0.vm07.stdout:4/671: dwrite d3/d11/d2b/f71 [0,4194304] 0 2026-03-09T19:27:52.385 INFO:tasks.workunit.client.1.vm08.stdout:3/949: truncate d0/d6/de/d15/d96/f118 677395 0 2026-03-09T19:27:52.388 INFO:tasks.workunit.client.1.vm08.stdout:7/947: write d5/d14/d2b/fb0 [768869,303] 0 
2026-03-09T19:27:52.392 INFO:tasks.workunit.client.1.vm08.stdout:8/870: mkdir de/d25/d87/dc9/dfc/d12f 0 2026-03-09T19:27:52.392 INFO:tasks.workunit.client.1.vm08.stdout:8/871: fsync de/d1d/d4f/fae 0 2026-03-09T19:27:52.397 INFO:tasks.workunit.client.1.vm08.stdout:8/872: dwrite de/d47/dd4/f122 [0,4194304] 0 2026-03-09T19:27:52.405 INFO:tasks.workunit.client.1.vm08.stdout:5/861: rename d16/d8e/dff/cd7 to d16/c116 0 2026-03-09T19:27:52.412 INFO:tasks.workunit.client.0.vm07.stdout:2/740: write d3/f63 [647056,77752] 0 2026-03-09T19:27:52.434 INFO:tasks.workunit.client.1.vm08.stdout:8/873: unlink de/d1d/d69/f8f 0 2026-03-09T19:27:52.441 INFO:tasks.workunit.client.1.vm08.stdout:7/948: rename d5/d14/d27/d54/f75 to d5/d14/d2b/d128/f141 0 2026-03-09T19:27:52.448 INFO:tasks.workunit.client.1.vm08.stdout:1/989: write d9/da/d53/db3/fd2 [567080,111544] 0 2026-03-09T19:27:52.449 INFO:tasks.workunit.client.1.vm08.stdout:5/862: rmdir d16/d45/daf/df5 39 2026-03-09T19:27:52.453 INFO:tasks.workunit.client.1.vm08.stdout:0/904: dwrite dd/d22/d27/d11e/d105/f83 [4194304,4194304] 0 2026-03-09T19:27:52.462 INFO:tasks.workunit.client.0.vm07.stdout:0/665: dread d0/d6/d13/d1c/d11/f80 [0,4194304] 0 2026-03-09T19:27:52.465 INFO:tasks.workunit.client.1.vm08.stdout:7/949: creat d5/d14/d2b/daa/f142 x:0 0 0 2026-03-09T19:27:52.465 INFO:tasks.workunit.client.0.vm07.stdout:3/743: truncate d1/d6/f19 1530811 0 2026-03-09T19:27:52.468 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:52 vm08.local ceph-mon[57794]: pgmap v5: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-09T19:27:52.468 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:52 vm08.local ceph-mon[57794]: mgrmap e24: vm08.mxylvw(active, since 4s), standbys: vm07.xacuym 2026-03-09T19:27:52.471 INFO:tasks.workunit.client.1.vm08.stdout:3/950: link d0/d4b/ce2 d0/d52/d6d/d77/d88/df7/c138 0 2026-03-09T19:27:52.471 INFO:tasks.workunit.client.0.vm07.stdout:6/645: write d0/dbf/fd8 [46001,100928] 0 
2026-03-09T19:27:52.474 INFO:tasks.workunit.client.1.vm08.stdout:6/890: write d3/d34/da9/fc7 [385468,121772] 0 2026-03-09T19:27:52.474 INFO:tasks.workunit.client.1.vm08.stdout:6/891: readlink d3/dbc/deb/l12c 0 2026-03-09T19:27:52.477 INFO:tasks.workunit.client.1.vm08.stdout:9/868: dwrite d0/d1b/d97/d48/d5d/ddf/fd9 [4194304,4194304] 0 2026-03-09T19:27:52.477 INFO:tasks.workunit.client.1.vm08.stdout:0/905: read dd/fe [3667047,95688] 0 2026-03-09T19:27:52.480 INFO:tasks.workunit.client.1.vm08.stdout:7/950: creat d5/d14/dae/d3a/d42/d85/da0/df5/f143 x:0 0 0 2026-03-09T19:27:52.502 INFO:tasks.workunit.client.1.vm08.stdout:0/906: creat dd/d22/d63/f128 x:0 0 0 2026-03-09T19:27:52.505 INFO:tasks.workunit.client.1.vm08.stdout:7/951: symlink d5/d14/d2b/d4b/l144 0 2026-03-09T19:27:52.506 INFO:tasks.workunit.client.1.vm08.stdout:1/990: sync 2026-03-09T19:27:52.509 INFO:tasks.workunit.client.0.vm07.stdout:1/701: read d1/d11/d37/d5d/dc1/fce [216233,26413] 0 2026-03-09T19:27:52.515 INFO:tasks.workunit.client.1.vm08.stdout:0/907: fsync dd/d22/d27/d11e/d78/fc2 0 2026-03-09T19:27:52.516 INFO:tasks.workunit.client.0.vm07.stdout:4/672: mknod d3/d11/d2b/d38/ddc/d22/d70/cec 0 2026-03-09T19:27:52.518 INFO:tasks.workunit.client.0.vm07.stdout:2/741: unlink d3/dd/ffa 0 2026-03-09T19:27:52.519 INFO:tasks.workunit.client.0.vm07.stdout:2/742: write d3/dd/d16/d29/f42 [2254642,115789] 0 2026-03-09T19:27:52.531 INFO:tasks.workunit.client.0.vm07.stdout:8/710: rename d7/d9/d37/d45/d56 to d7/d9/d10/dd8/dfd 0 2026-03-09T19:27:52.533 INFO:tasks.workunit.client.0.vm07.stdout:0/666: creat d0/d6/dc8/d99/fdb x:0 0 0 2026-03-09T19:27:52.533 INFO:tasks.workunit.client.0.vm07.stdout:5/661: write d3/dd/d26/d2d/d60/fc5 [7031447,33934] 0 2026-03-09T19:27:52.537 INFO:tasks.workunit.client.1.vm08.stdout:3/951: getdents d0/d6/d10a 0 2026-03-09T19:27:52.539 INFO:tasks.workunit.client.1.vm08.stdout:1/991: sync 2026-03-09T19:27:52.543 INFO:tasks.workunit.client.1.vm08.stdout:6/892: link d3/d15/dc2/ce6 d3/d94/def/c14c 
0 2026-03-09T19:27:52.543 INFO:tasks.workunit.client.1.vm08.stdout:4/834: dwrite da/d10/d26/dd8/f93 [0,4194304] 0 2026-03-09T19:27:52.548 INFO:tasks.workunit.client.1.vm08.stdout:9/869: rename d0/d1b/c9a to d0/d2/c121 0 2026-03-09T19:27:52.548 INFO:tasks.workunit.client.0.vm07.stdout:4/673: truncate d3/d4f/f9d 931894 0 2026-03-09T19:27:52.551 INFO:tasks.workunit.client.0.vm07.stdout:8/711: sync 2026-03-09T19:27:52.551 INFO:tasks.workunit.client.0.vm07.stdout:4/674: chown d3/d11/d16/fe5 15 1 2026-03-09T19:27:52.552 INFO:tasks.workunit.client.1.vm08.stdout:3/952: creat d0/d6/d25/f139 x:0 0 0 2026-03-09T19:27:52.559 INFO:tasks.workunit.client.1.vm08.stdout:6/893: rename d3/d34/c59 to d3/d68/d14a/c14d 0 2026-03-09T19:27:52.561 INFO:tasks.workunit.client.0.vm07.stdout:7/665: rename d0/d4/d5/d8/d1a/d2a/fc8 to d0/d4/d5/d8/d41/fe0 0 2026-03-09T19:27:52.569 INFO:tasks.workunit.client.1.vm08.stdout:7/952: getdents d5/d14/dae/d12f 0 2026-03-09T19:27:52.576 INFO:tasks.workunit.client.0.vm07.stdout:1/702: creat d1/d11/d37/d3f/db5/fe7 x:0 0 0 2026-03-09T19:27:52.579 INFO:tasks.workunit.client.0.vm07.stdout:8/712: symlink d7/d9/d37/d34/lfe 0 2026-03-09T19:27:52.580 INFO:tasks.workunit.client.1.vm08.stdout:6/894: mknod d3/d34/d5c/da2/d107/c14e 0 2026-03-09T19:27:52.581 INFO:tasks.workunit.client.0.vm07.stdout:4/675: symlink d3/d11/d2b/d38/ddc/d22/d86/led 0 2026-03-09T19:27:52.586 INFO:tasks.workunit.client.1.vm08.stdout:9/870: mkdir d0/d1b/d122 0 2026-03-09T19:27:52.592 INFO:tasks.workunit.client.0.vm07.stdout:2/743: rename d3/dd/d16/d30/da7/dad to d3/dd/d103 0 2026-03-09T19:27:52.595 INFO:tasks.workunit.client.0.vm07.stdout:7/666: fdatasync d0/d4/d5/d8/d41/d64/d74/f88 0 2026-03-09T19:27:52.596 INFO:tasks.workunit.client.0.vm07.stdout:5/662: rmdir d3/d1a/d28/d40/d92 39 2026-03-09T19:27:52.598 INFO:tasks.workunit.client.1.vm08.stdout:0/908: getdents dd/d22/d27/d11e/db3 0 2026-03-09T19:27:52.601 INFO:tasks.workunit.client.0.vm07.stdout:7/667: write d0/d4/d5/d8/d41/d64/d74/d98/fd8 
[331605,66499] 0 2026-03-09T19:27:52.601 INFO:tasks.workunit.client.0.vm07.stdout:1/703: unlink d1/db/d31/d4f/f79 0 2026-03-09T19:27:52.601 INFO:tasks.workunit.client.1.vm08.stdout:3/953: symlink d0/d6/de/d54/d103/l13a 0 2026-03-09T19:27:52.603 INFO:tasks.workunit.client.0.vm07.stdout:5/663: write d3/d1a/d5d/f80 [2392397,28555] 0 2026-03-09T19:27:52.603 INFO:tasks.workunit.client.1.vm08.stdout:6/895: symlink d3/d34/da9/da4/d117/d10d/l14f 0 2026-03-09T19:27:52.604 INFO:tasks.workunit.client.0.vm07.stdout:2/744: creat d3/dd/d16/d29/d2d/d45/d8b/d98/f104 x:0 0 0 2026-03-09T19:27:52.604 INFO:tasks.workunit.client.0.vm07.stdout:2/745: read - d3/dd/d16/d29/d2d/d45/fd9 zero size 2026-03-09T19:27:52.606 INFO:tasks.workunit.client.0.vm07.stdout:2/746: chown d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4 13557234 1 2026-03-09T19:27:52.606 INFO:tasks.workunit.client.1.vm08.stdout:6/896: write d3/db/fdb [333235,129550] 0 2026-03-09T19:27:52.608 INFO:tasks.workunit.client.1.vm08.stdout:8/874: rmdir de/d25/d87/dc9/dfc 39 2026-03-09T19:27:52.608 INFO:tasks.workunit.client.1.vm08.stdout:3/954: creat d0/d52/d6d/d77/ddf/f13b x:0 0 0 2026-03-09T19:27:52.614 INFO:tasks.workunit.client.1.vm08.stdout:2/795: dread d3/d4/d3e/d4e/d88/db0/ff3 [0,4194304] 0 2026-03-09T19:27:52.617 INFO:tasks.workunit.client.1.vm08.stdout:4/835: rename da/d10/d16/d28/d4d to da/d10/d26/d3a/d69/df1 0 2026-03-09T19:27:52.620 INFO:tasks.workunit.client.1.vm08.stdout:2/796: truncate d3/d9/f5d 4510581 0 2026-03-09T19:27:52.625 INFO:tasks.workunit.client.1.vm08.stdout:2/797: read d3/d4/d23/d2c/d39/d5e/de/d18/f1a [2952549,117983] 0 2026-03-09T19:27:52.626 INFO:tasks.workunit.client.1.vm08.stdout:5/863: dwrite d16/d45/daf/df5/fc2 [0,4194304] 0 2026-03-09T19:27:52.626 INFO:tasks.workunit.client.1.vm08.stdout:2/798: chown d3/d9/d26/cd0 26 1 2026-03-09T19:27:52.637 INFO:tasks.workunit.client.0.vm07.stdout:8/713: creat d7/d30/d32/de9/fff x:0 0 0 2026-03-09T19:27:52.639 INFO:tasks.workunit.client.1.vm08.stdout:2/799: read d3/f7c 
[419285,44501] 0 2026-03-09T19:27:52.641 INFO:tasks.workunit.client.0.vm07.stdout:9/691: dwrite d0/db/d29/d68/f8e [0,4194304] 0 2026-03-09T19:27:52.643 INFO:tasks.workunit.client.0.vm07.stdout:6/646: write d0/d1/db/d17/dc4/d7b/fac [642023,60709] 0 2026-03-09T19:27:52.648 INFO:tasks.workunit.client.0.vm07.stdout:1/704: fdatasync d1/f51 0 2026-03-09T19:27:52.648 INFO:tasks.workunit.client.0.vm07.stdout:1/705: chown d1/d11/l70 26 1 2026-03-09T19:27:52.655 INFO:tasks.workunit.client.0.vm07.stdout:2/747: mknod d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4/c105 0 2026-03-09T19:27:52.658 INFO:tasks.workunit.client.1.vm08.stdout:1/992: write d9/d11/d7a/f80 [284667,86086] 0 2026-03-09T19:27:52.659 INFO:tasks.workunit.client.1.vm08.stdout:3/955: mknod d0/d6/de/d1a/c13c 0 2026-03-09T19:27:52.659 INFO:tasks.workunit.client.1.vm08.stdout:3/956: stat d0/d6/de/d15/l64 0 2026-03-09T19:27:52.659 INFO:tasks.workunit.client.1.vm08.stdout:1/993: stat d9/da/d53/d67/d6c/f121 0 2026-03-09T19:27:52.665 INFO:tasks.workunit.client.0.vm07.stdout:8/714: unlink d7/d1d/f4d 0 2026-03-09T19:27:52.667 INFO:tasks.workunit.client.0.vm07.stdout:8/715: fsync d7/d9/d37/d45/d4f/db1/fd6 0 2026-03-09T19:27:52.668 INFO:tasks.workunit.client.0.vm07.stdout:8/716: chown d7/d9/d10/f20 0 1 2026-03-09T19:27:52.670 INFO:tasks.workunit.client.0.vm07.stdout:0/667: dwrite d0/d6/d13/d17/d19/d57/f6f [4194304,4194304] 0 2026-03-09T19:27:52.673 INFO:tasks.workunit.client.1.vm08.stdout:9/871: rename d0/d2/d80/de5/c75 to d0/d1b/d122/c123 0 2026-03-09T19:27:52.683 INFO:tasks.workunit.client.1.vm08.stdout:4/836: creat da/d10/d16/d28/d2f/d4f/d56/ff2 x:0 0 0 2026-03-09T19:27:52.692 INFO:tasks.workunit.client.1.vm08.stdout:7/953: dwrite d5/d14/d2b/fa1 [0,4194304] 0 2026-03-09T19:27:52.698 INFO:tasks.workunit.client.1.vm08.stdout:7/954: readlink d5/d14/dae/dd1/d109/d8f/ld7 0 2026-03-09T19:27:52.705 INFO:tasks.workunit.client.0.vm07.stdout:4/676: write d3/d11/d2b/d38/f96 [811510,1624] 0 2026-03-09T19:27:52.710 
INFO:tasks.workunit.client.0.vm07.stdout:3/744: truncate d1/d6/f19 1497844 0 2026-03-09T19:27:52.716 INFO:tasks.workunit.client.1.vm08.stdout:5/864: mkdir d16/d117 0 2026-03-09T19:27:52.716 INFO:tasks.workunit.client.0.vm07.stdout:1/706: rename d1/d11/db7 to d1/db/d31/d4f/d7a/dd2/de8 0 2026-03-09T19:27:52.729 INFO:tasks.workunit.client.0.vm07.stdout:7/668: link d0/d4/d5/d8/f15 d0/d52/d54/d95/fe1 0 2026-03-09T19:27:52.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:52 vm07.local ceph-mon[48545]: pgmap v5: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-09T19:27:52.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:52 vm07.local ceph-mon[48545]: mgrmap e24: vm08.mxylvw(active, since 4s), standbys: vm07.xacuym 2026-03-09T19:27:52.733 INFO:tasks.workunit.client.0.vm07.stdout:9/692: write d0/db/d29/d32/d5c/d69/f8d [819549,67960] 0 2026-03-09T19:27:52.735 INFO:tasks.workunit.client.1.vm08.stdout:6/897: mknod d3/d34/d6f/d123/c150 0 2026-03-09T19:27:52.735 INFO:tasks.workunit.client.1.vm08.stdout:0/909: creat dd/d22/d27/d11e/d105/f129 x:0 0 0 2026-03-09T19:27:52.737 INFO:tasks.workunit.client.0.vm07.stdout:5/664: dwrite d3/d1a/fa [0,4194304] 0 2026-03-09T19:27:52.747 INFO:tasks.workunit.client.0.vm07.stdout:8/717: truncate d7/d50/f82 353300 0 2026-03-09T19:27:52.747 INFO:tasks.workunit.client.0.vm07.stdout:0/668: rmdir d0/d6/d13/d17 39 2026-03-09T19:27:52.751 INFO:tasks.workunit.client.0.vm07.stdout:4/677: truncate d3/d11/d29/f42 4740779 0 2026-03-09T19:27:52.754 INFO:tasks.workunit.client.0.vm07.stdout:4/678: fdatasync d3/d11/d2b/d38/fdf 0 2026-03-09T19:27:52.754 INFO:tasks.workunit.client.0.vm07.stdout:3/745: fsync d1/d3d/d47/db3/faf 0 2026-03-09T19:27:52.766 INFO:tasks.workunit.client.0.vm07.stdout:1/707: unlink d1/db/d31/d4f/d7a/ce0 0 2026-03-09T19:27:52.769 INFO:tasks.workunit.client.0.vm07.stdout:2/748: write d3/dd/d16/d29/d2d/d45/d85/d8a/fd2 [878116,59948] 0 2026-03-09T19:27:52.775 
INFO:tasks.workunit.client.0.vm07.stdout:5/665: creat d3/dd/d26/d3f/d47/d71/fd3 x:0 0 0 2026-03-09T19:27:52.776 INFO:tasks.workunit.client.1.vm08.stdout:8/875: write de/d117/f12d [980330,2462] 0 2026-03-09T19:27:52.776 INFO:tasks.workunit.client.1.vm08.stdout:3/957: write d0/d6/de/d1b/d16/d17/dac/d109/fdb [762189,87812] 0 2026-03-09T19:27:52.778 INFO:tasks.workunit.client.1.vm08.stdout:3/958: readlink d0/d6/la5 0 2026-03-09T19:27:52.778 INFO:tasks.workunit.client.1.vm08.stdout:3/959: chown d0/d6/de/d1b/f7d 3396114 1 2026-03-09T19:27:52.780 INFO:tasks.workunit.client.1.vm08.stdout:3/960: chown d0/d6/d25/f87 1 1 2026-03-09T19:27:52.783 INFO:tasks.workunit.client.0.vm07.stdout:9/693: write d0/db/fac [173749,87120] 0 2026-03-09T19:27:52.785 INFO:tasks.workunit.client.0.vm07.stdout:8/718: fdatasync d7/f2e 0 2026-03-09T19:27:52.789 INFO:tasks.workunit.client.0.vm07.stdout:0/669: rmdir d0/d6/dc8 39 2026-03-09T19:27:52.789 INFO:tasks.workunit.client.0.vm07.stdout:0/670: read d0/f41 [36558,27724] 0 2026-03-09T19:27:52.790 INFO:tasks.workunit.client.0.vm07.stdout:6/647: dwrite d0/fa3 [0,4194304] 0 2026-03-09T19:27:52.794 INFO:tasks.workunit.client.0.vm07.stdout:4/679: mknod d3/d11/d2b/d38/ddc/d22/d86/cee 0 2026-03-09T19:27:52.803 INFO:tasks.workunit.client.0.vm07.stdout:7/669: unlink d0/d4/l24 0 2026-03-09T19:27:52.810 INFO:tasks.workunit.client.0.vm07.stdout:3/746: dread d1/d6/d4c/f61 [0,4194304] 0 2026-03-09T19:27:52.812 INFO:tasks.workunit.client.0.vm07.stdout:2/749: symlink d3/d11/d38/l106 0 2026-03-09T19:27:52.813 INFO:tasks.workunit.client.0.vm07.stdout:6/648: dread d0/ff [0,4194304] 0 2026-03-09T19:27:52.815 INFO:tasks.workunit.client.0.vm07.stdout:2/750: dwrite d3/f5 [0,4194304] 0 2026-03-09T19:27:52.820 INFO:tasks.workunit.client.1.vm08.stdout:9/872: mknod d0/d1b/d68/d7f/c124 0 2026-03-09T19:27:52.820 INFO:tasks.workunit.client.0.vm07.stdout:5/666: dread - d3/d1a/d28/d40/d92/f5e zero size 2026-03-09T19:27:52.823 INFO:tasks.workunit.client.1.vm08.stdout:4/837: mkdir 
da/d10/d16/d28/d2f/d4f/d64/d84/d8a/da2/df3 0 2026-03-09T19:27:52.832 INFO:tasks.workunit.client.0.vm07.stdout:8/719: mkdir d7/d9/d10/dd8/dfd/d62/d100 0 2026-03-09T19:27:52.833 INFO:tasks.workunit.client.0.vm07.stdout:8/720: chown d7/d16/f71 1928728 1 2026-03-09T19:27:52.835 INFO:tasks.workunit.client.0.vm07.stdout:0/671: chown d0/d6/d13/d17/cae 265493019 1 2026-03-09T19:27:52.839 INFO:tasks.workunit.client.1.vm08.stdout:5/865: symlink d16/l118 0 2026-03-09T19:27:52.839 INFO:tasks.workunit.client.0.vm07.stdout:0/672: chown d0/d6/d13/d1c/d11/d8b 165 1 2026-03-09T19:27:52.840 INFO:tasks.workunit.client.0.vm07.stdout:0/673: write d0/d6/d13/fbf [656213,67244] 0 2026-03-09T19:27:52.843 INFO:tasks.workunit.client.1.vm08.stdout:5/866: stat d16/d1e/d8c/d99/fea 0 2026-03-09T19:27:52.844 INFO:tasks.workunit.client.1.vm08.stdout:2/800: mkdir d3/d4/d10e 0 2026-03-09T19:27:52.848 INFO:tasks.workunit.client.1.vm08.stdout:0/910: creat dd/d22/d63/d6e/df5/f12a x:0 0 0 2026-03-09T19:27:52.852 INFO:tasks.workunit.client.0.vm07.stdout:1/708: write d1/d11/d37/f2c [508199,111943] 0 2026-03-09T19:27:52.856 INFO:tasks.workunit.client.1.vm08.stdout:1/994: mknod d9/da/d53/d67/d6c/d76/db2/c13d 0 2026-03-09T19:27:52.866 INFO:tasks.workunit.client.1.vm08.stdout:6/898: write d3/d68/d7e/fbb [1009326,107427] 0 2026-03-09T19:27:52.869 INFO:tasks.workunit.client.1.vm08.stdout:9/873: creat d0/d2/d80/de5/f125 x:0 0 0 2026-03-09T19:27:52.873 INFO:tasks.workunit.client.0.vm07.stdout:2/751: creat d3/dd/d16/d30/d40/f107 x:0 0 0 2026-03-09T19:27:52.881 INFO:tasks.workunit.client.0.vm07.stdout:6/649: dwrite d0/dbf/d95/d31/f6c [0,4194304] 0 2026-03-09T19:27:52.883 INFO:tasks.workunit.client.0.vm07.stdout:5/667: symlink d3/dd/d26/d2d/d60/ld4 0 2026-03-09T19:27:52.883 INFO:tasks.workunit.client.0.vm07.stdout:5/668: readlink d3/d1a/l27 0 2026-03-09T19:27:52.890 INFO:tasks.workunit.client.0.vm07.stdout:9/694: mknod d0/db/d29/d32/cf1 0 2026-03-09T19:27:52.891 INFO:tasks.workunit.client.1.vm08.stdout:2/801: creat 
d3/d4/d23/d2c/d39/f10f x:0 0 0 2026-03-09T19:27:52.891 INFO:tasks.workunit.client.1.vm08.stdout:0/911: creat dd/d31/f12b x:0 0 0 2026-03-09T19:27:52.892 INFO:tasks.workunit.client.1.vm08.stdout:4/838: write da/d10/d16/d28/d2f/fd4 [14469,93218] 0 2026-03-09T19:27:52.892 INFO:tasks.workunit.client.1.vm08.stdout:6/899: dwrite d3/dbc/f126 [0,4194304] 0 2026-03-09T19:27:52.901 INFO:tasks.workunit.client.1.vm08.stdout:2/802: dread - d3/d9/fd2 zero size 2026-03-09T19:27:52.910 INFO:tasks.workunit.client.1.vm08.stdout:2/803: readlink d3/d9/d79/d46/d8c/d92/lb1 0 2026-03-09T19:27:52.910 INFO:tasks.workunit.client.1.vm08.stdout:7/955: write d5/f1a [529656,63812] 0 2026-03-09T19:27:52.913 INFO:tasks.workunit.client.1.vm08.stdout:7/956: stat d5/d14/dae/d3a/d42/l6d 0 2026-03-09T19:27:52.914 INFO:tasks.workunit.client.1.vm08.stdout:8/876: dwrite de/d47/fc1 [0,4194304] 0 2026-03-09T19:27:52.920 INFO:tasks.workunit.client.1.vm08.stdout:1/995: read d9/da/d17/fa9 [1014187,79903] 0 2026-03-09T19:27:52.921 INFO:tasks.workunit.client.1.vm08.stdout:5/867: dwrite ff [0,4194304] 0 2026-03-09T19:27:52.926 INFO:tasks.workunit.client.1.vm08.stdout:8/877: dread - de/d1d/d2e/f56 zero size 2026-03-09T19:27:52.926 INFO:tasks.workunit.client.1.vm08.stdout:5/868: chown d16/d1e/d8c/d99/dcc 5 1 2026-03-09T19:27:52.934 INFO:tasks.workunit.client.1.vm08.stdout:8/878: truncate de/d47/dfd/d99/da5/fdd 4486903 0 2026-03-09T19:27:52.940 INFO:tasks.workunit.client.0.vm07.stdout:4/680: unlink d3/d4f/f9d 0 2026-03-09T19:27:52.940 INFO:tasks.workunit.client.0.vm07.stdout:1/709: rmdir d1/d3e/db3 39 2026-03-09T19:27:52.943 INFO:tasks.workunit.client.0.vm07.stdout:2/752: mkdir d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4/d108 0 2026-03-09T19:27:52.945 INFO:tasks.workunit.client.1.vm08.stdout:0/912: unlink dd/d22/d27/f9f 0 2026-03-09T19:27:52.945 INFO:tasks.workunit.client.1.vm08.stdout:6/900: mknod d3/d15/dc2/d12f/c151 0 2026-03-09T19:27:52.945 INFO:tasks.workunit.client.1.vm08.stdout:4/839: symlink da/d10/d16/lf4 0 
2026-03-09T19:27:52.947 INFO:tasks.workunit.client.1.vm08.stdout:6/901: stat d3/d34/d5c/da2/c10a 0 2026-03-09T19:27:52.950 INFO:tasks.workunit.client.0.vm07.stdout:6/650: mknod d0/d4e/dae/daf/c102 0 2026-03-09T19:27:52.958 INFO:tasks.workunit.client.1.vm08.stdout:3/961: creat d0/f13d x:0 0 0 2026-03-09T19:27:52.959 INFO:tasks.workunit.client.0.vm07.stdout:0/674: mkdir d0/d6/dc8/d99/ddc 0 2026-03-09T19:27:52.959 INFO:tasks.workunit.client.0.vm07.stdout:9/695: creat d0/db/d29/d32/d5c/d80/ff2 x:0 0 0 2026-03-09T19:27:52.961 INFO:tasks.workunit.client.1.vm08.stdout:7/957: creat d5/d14/dae/d3a/d42/d85/f145 x:0 0 0 2026-03-09T19:27:52.963 INFO:tasks.workunit.client.0.vm07.stdout:7/670: link d0/d4/d5/d26/dc9/cdc d0/d52/d54/d5a/d87/ce2 0 2026-03-09T19:27:52.969 INFO:tasks.workunit.client.1.vm08.stdout:5/869: symlink d16/d1e/d6e/dcd/l119 0 2026-03-09T19:27:52.970 INFO:tasks.workunit.client.0.vm07.stdout:3/747: link d1/d6/dd/f44 d1/d6/dd/dbf/ddc/ff0 0 2026-03-09T19:27:52.970 INFO:tasks.workunit.client.0.vm07.stdout:6/651: mknod d0/d4e/d7f/c103 0 2026-03-09T19:27:52.971 INFO:tasks.workunit.client.0.vm07.stdout:5/669: dread d3/dd/d26/d2d/d60/fc5 [4194304,4194304] 0 2026-03-09T19:27:52.973 INFO:tasks.workunit.client.0.vm07.stdout:5/670: chown d3/d1a/d28/d40/fa5 797 1 2026-03-09T19:27:52.973 INFO:tasks.workunit.client.1.vm08.stdout:9/874: creat d0/d1b/d97/d106/f126 x:0 0 0 2026-03-09T19:27:52.973 INFO:tasks.workunit.client.0.vm07.stdout:8/721: dread d7/d9/d10/dd8/dfd/d62/fc3 [0,4194304] 0 2026-03-09T19:27:52.974 INFO:tasks.workunit.client.1.vm08.stdout:2/804: dread d3/d4/d23/d2c/d39/d5e/ff1 [0,4194304] 0 2026-03-09T19:27:52.976 INFO:tasks.workunit.client.0.vm07.stdout:4/681: sync 2026-03-09T19:27:52.977 INFO:tasks.workunit.client.1.vm08.stdout:4/840: dread da/d10/d16/d28/f8d [0,4194304] 0 2026-03-09T19:27:52.978 INFO:tasks.workunit.client.0.vm07.stdout:4/682: write d3/fc [93000,19486] 0 2026-03-09T19:27:52.981 INFO:tasks.workunit.client.0.vm07.stdout:0/675: truncate 
d0/d6/d13/d33/f39 282742 0 2026-03-09T19:27:52.981 INFO:tasks.workunit.client.0.vm07.stdout:1/710: mknod d1/db/d31/dca/ce9 0 2026-03-09T19:27:52.982 INFO:tasks.workunit.client.0.vm07.stdout:0/676: chown d0/d6/d13/d1c/d50/c5b 8122435 1 2026-03-09T19:27:52.986 INFO:tasks.workunit.client.0.vm07.stdout:6/652: creat d0/d1/d28/d76/dad/f104 x:0 0 0 2026-03-09T19:27:52.990 INFO:tasks.workunit.client.1.vm08.stdout:7/958: rmdir d5/d14/d2b/daa 39 2026-03-09T19:27:52.990 INFO:tasks.workunit.client.1.vm08.stdout:1/996: mknod d9/d11/d7a/d130/d132/c13e 0 2026-03-09T19:27:52.990 INFO:tasks.workunit.client.0.vm07.stdout:9/696: symlink d0/lf3 0 2026-03-09T19:27:52.990 INFO:tasks.workunit.client.0.vm07.stdout:1/711: dwrite d1/d11/d37/d5d/d50/fe5 [0,4194304] 0 2026-03-09T19:27:53.000 INFO:tasks.workunit.client.0.vm07.stdout:0/677: dwrite d0/d6/d13/d1c/d50/fc7 [0,4194304] 0 2026-03-09T19:27:53.014 INFO:tasks.workunit.client.1.vm08.stdout:4/841: dread da/d10/d16/d28/d46/fb1 [0,4194304] 0 2026-03-09T19:27:53.023 INFO:tasks.workunit.client.0.vm07.stdout:8/722: creat d7/d30/d32/de9/f101 x:0 0 0 2026-03-09T19:27:53.023 INFO:tasks.workunit.client.1.vm08.stdout:6/902: dread d3/d34/d3b/f58 [0,4194304] 0 2026-03-09T19:27:53.024 INFO:tasks.workunit.client.1.vm08.stdout:6/903: chown d3/d34/d6f/d123 123 1 2026-03-09T19:27:53.027 INFO:tasks.workunit.client.0.vm07.stdout:3/748: symlink d1/d6/dd/dbf/ddc/lf1 0 2026-03-09T19:27:53.027 INFO:tasks.workunit.client.1.vm08.stdout:9/875: rename d0/d1b/d97/d48/d5d/d74/ded/l11f to d0/d1b/d68/d7f/de6/d109/l127 0 2026-03-09T19:27:53.027 INFO:tasks.workunit.client.1.vm08.stdout:2/805: mkdir d3/d4/d23/d2c/d39/d5e/de/d18/da9/d110 0 2026-03-09T19:27:53.028 INFO:tasks.workunit.client.1.vm08.stdout:2/806: chown d3/d4/f6 73380 1 2026-03-09T19:27:53.029 INFO:tasks.workunit.client.0.vm07.stdout:7/671: truncate d0/d4/d5/d8/f15 1885307 0 2026-03-09T19:27:53.030 INFO:tasks.workunit.client.1.vm08.stdout:2/807: chown d3/d4/d23/d2c/d39/d5e 2912751 1 2026-03-09T19:27:53.030 
INFO:tasks.workunit.client.1.vm08.stdout:0/913: mknod dd/d22/d27/d6c/dad/c12c 0 2026-03-09T19:27:53.031 INFO:tasks.workunit.client.0.vm07.stdout:2/753: link d3/dd/fe d3/dd/d16/d30/da7/f109 0 2026-03-09T19:27:53.035 INFO:tasks.workunit.client.0.vm07.stdout:6/653: truncate d0/d13/f18 1367884 0 2026-03-09T19:27:53.036 INFO:tasks.workunit.client.0.vm07.stdout:3/749: dwrite d1/f73 [0,4194304] 0 2026-03-09T19:27:53.048 INFO:tasks.workunit.client.1.vm08.stdout:1/997: creat d9/da/d53/db3/f13f x:0 0 0 2026-03-09T19:27:53.052 INFO:tasks.workunit.client.1.vm08.stdout:7/959: symlink d5/d14/dae/d3a/d42/d85/da0/df5/l146 0 2026-03-09T19:27:53.061 INFO:tasks.workunit.client.1.vm08.stdout:4/842: mknod da/d10/d26/d3a/db5/de6/cf5 0 2026-03-09T19:27:53.061 INFO:tasks.workunit.client.1.vm08.stdout:8/879: rmdir de/d117/d10f 0 2026-03-09T19:27:53.065 INFO:tasks.workunit.client.0.vm07.stdout:0/678: symlink d0/d6/dc8/d99/ldd 0 2026-03-09T19:27:53.068 INFO:tasks.workunit.client.1.vm08.stdout:6/904: mknod d3/d94/def/c152 0 2026-03-09T19:27:53.069 INFO:tasks.workunit.client.0.vm07.stdout:0/679: sync 2026-03-09T19:27:53.070 INFO:tasks.workunit.client.1.vm08.stdout:6/905: readlink d3/d34/d5c/l60 0 2026-03-09T19:27:53.070 INFO:tasks.workunit.client.1.vm08.stdout:9/876: symlink d0/d1b/d68/d7f/de6/l128 0 2026-03-09T19:27:53.075 INFO:tasks.workunit.client.1.vm08.stdout:9/877: read d0/d2/d80/de5/da2/da8/de8/f61 [7793631,20416] 0 2026-03-09T19:27:53.082 INFO:tasks.workunit.client.0.vm07.stdout:0/680: dwrite d0/d6/dc8/fca [0,4194304] 0 2026-03-09T19:27:53.084 INFO:tasks.workunit.client.0.vm07.stdout:2/754: mknod d3/dd/d16/d30/d40/c10a 0 2026-03-09T19:27:53.084 INFO:tasks.workunit.client.0.vm07.stdout:9/697: write d0/d6/f7b [90240,12330] 0 2026-03-09T19:27:53.094 INFO:tasks.workunit.client.0.vm07.stdout:5/671: dwrite d3/d1a/d28/f2e [4194304,4194304] 0 2026-03-09T19:27:53.097 INFO:tasks.workunit.client.0.vm07.stdout:5/672: dread - d3/dd/d26/d3f/fc4 zero size 2026-03-09T19:27:53.098 
INFO:tasks.workunit.client.0.vm07.stdout:3/750: fsync d1/d6/dd/f57 0 2026-03-09T19:27:53.098 INFO:tasks.workunit.client.1.vm08.stdout:5/870: link d16/d45/f54 d16/d1e/d8c/d99/f11a 0 2026-03-09T19:27:53.099 INFO:tasks.workunit.client.0.vm07.stdout:5/673: truncate d3/d1a/d28/d6c/d72/db5/fd2 877274 0 2026-03-09T19:27:53.099 INFO:tasks.workunit.client.0.vm07.stdout:3/751: chown d1/d6/d4c/d97/cbc 25842 1 2026-03-09T19:27:53.101 INFO:tasks.workunit.client.0.vm07.stdout:5/674: write d3/d1a/d28/d40/d92/fa9 [2206062,4740] 0 2026-03-09T19:27:53.104 INFO:tasks.workunit.client.1.vm08.stdout:1/998: mkdir d9/d11/d7a/d89/de7/d140 0 2026-03-09T19:27:53.107 INFO:tasks.workunit.client.1.vm08.stdout:4/843: truncate da/d10/d16/f4b 4113752 0 2026-03-09T19:27:53.108 INFO:tasks.workunit.client.1.vm08.stdout:8/880: mkdir de/d117/df2/d130 0 2026-03-09T19:27:53.111 INFO:tasks.workunit.client.1.vm08.stdout:6/906: symlink d3/d94/def/dc4/l153 0 2026-03-09T19:27:53.112 INFO:tasks.workunit.client.1.vm08.stdout:3/962: getdents d0/d6/de/d6e/d51 0 2026-03-09T19:27:53.112 INFO:tasks.workunit.client.1.vm08.stdout:3/963: stat d0/d6/de/d1b/d16/dd1/f125 0 2026-03-09T19:27:53.116 INFO:tasks.workunit.client.0.vm07.stdout:1/712: chown d1/d3e/db3/d6d/le1 29 1 2026-03-09T19:27:53.117 INFO:tasks.workunit.client.0.vm07.stdout:3/752: dwrite d1/f73 [0,4194304] 0 2026-03-09T19:27:53.117 INFO:tasks.workunit.client.0.vm07.stdout:9/698: mknod d0/d6/d57/d8f/cf4 0 2026-03-09T19:27:53.122 INFO:tasks.workunit.client.1.vm08.stdout:1/999: creat d9/d40/d49/d9e/f141 x:0 0 0 2026-03-09T19:27:53.122 INFO:tasks.workunit.client.1.vm08.stdout:8/881: read - de/d47/dfd/d99/dde/f10e zero size 2026-03-09T19:27:53.131 INFO:tasks.workunit.client.0.vm07.stdout:3/753: read d1/d6/d71/f69 [1396336,119193] 0 2026-03-09T19:27:53.134 INFO:tasks.workunit.client.0.vm07.stdout:5/675: mknod d3/d1a/d28/d6c/d72/cd5 0 2026-03-09T19:27:53.137 INFO:tasks.workunit.client.1.vm08.stdout:6/907: mkdir d3/db/d12a/d147/d154 0 2026-03-09T19:27:53.137 
INFO:tasks.workunit.client.1.vm08.stdout:4/844: mknod da/d10/d16/d28/d2f/cf6 0 2026-03-09T19:27:53.138 INFO:tasks.workunit.client.0.vm07.stdout:0/681: symlink d0/d6/d13/d1c/d61/d69/lde 0 2026-03-09T19:27:53.143 INFO:tasks.workunit.client.0.vm07.stdout:2/755: rmdir d3/dd/d16/d29/d2d/d45/d3b/d44/d96/ddf 39 2026-03-09T19:27:53.144 INFO:tasks.workunit.client.1.vm08.stdout:8/882: creat de/d25/d31/f131 x:0 0 0 2026-03-09T19:27:53.144 INFO:tasks.workunit.client.1.vm08.stdout:9/878: dread d0/d1b/d97/d48/fb5 [0,4194304] 0 2026-03-09T19:27:53.145 INFO:tasks.workunit.client.1.vm08.stdout:0/914: link dd/d22/d63/d6e/ld3 dd/de4/l12d 0 2026-03-09T19:27:53.146 INFO:tasks.workunit.client.1.vm08.stdout:6/908: rmdir d3/d34/d5c 39 2026-03-09T19:27:53.146 INFO:tasks.workunit.client.0.vm07.stdout:1/713: symlink d1/db/d31/d4f/d7a/dd2/de8/lea 0 2026-03-09T19:27:53.150 INFO:tasks.workunit.client.1.vm08.stdout:4/845: fdatasync da/d10/d26/d27/f35 0 2026-03-09T19:27:53.150 INFO:tasks.workunit.client.0.vm07.stdout:2/756: chown d3/dd/d16/d29/d2d/d45/d3b/dae/fda 219061824 1 2026-03-09T19:27:53.150 INFO:tasks.workunit.client.1.vm08.stdout:0/915: creat dd/d22/d27/d11e/de3/f12e x:0 0 0 2026-03-09T19:27:53.151 INFO:tasks.workunit.client.0.vm07.stdout:5/676: mknod d3/d1a/d28/d6c/d72/d8f/cd6 0 2026-03-09T19:27:53.151 INFO:tasks.workunit.client.0.vm07.stdout:4/683: link d3/d11/d2b/d38/c6b d3/d11/d2b/d38/ddc/cef 0 2026-03-09T19:27:53.155 INFO:tasks.workunit.client.0.vm07.stdout:0/682: creat d0/d6/d13/d1c/d52/fdf x:0 0 0 2026-03-09T19:27:53.155 INFO:tasks.workunit.client.0.vm07.stdout:9/699: mknod d0/db/d29/d68/d99/cf5 0 2026-03-09T19:27:53.156 INFO:tasks.workunit.client.0.vm07.stdout:7/672: getdents d0/d4/d5/d8/d1a/d2a 0 2026-03-09T19:27:53.156 INFO:tasks.workunit.client.0.vm07.stdout:1/714: symlink d1/db/d31/d4f/d7a/dd2/de8/leb 0 2026-03-09T19:27:53.160 INFO:tasks.workunit.client.0.vm07.stdout:4/684: write d3/d4f/f5b [2159749,114421] 0 2026-03-09T19:27:53.161 
INFO:tasks.workunit.client.0.vm07.stdout:7/673: readlink d0/d4/d5/d8/d41/d64/d74/d98/l19 0 2026-03-09T19:27:53.161 INFO:tasks.workunit.client.0.vm07.stdout:5/677: creat d3/d1a/d5a/db8/fd7 x:0 0 0 2026-03-09T19:27:53.161 INFO:tasks.workunit.client.0.vm07.stdout:5/678: chown d3/dd/f52 442785 1 2026-03-09T19:27:53.164 INFO:tasks.workunit.client.0.vm07.stdout:0/683: mkdir d0/d6/d13/d17/d19/d58/de0 0 2026-03-09T19:27:53.168 INFO:tasks.workunit.client.0.vm07.stdout:9/700: creat d0/d6/d3a/d94/ff6 x:0 0 0 2026-03-09T19:27:53.172 INFO:tasks.workunit.client.1.vm08.stdout:4/846: dread da/d10/d26/d27/d32/f45 [0,4194304] 0 2026-03-09T19:27:53.180 INFO:tasks.workunit.client.0.vm07.stdout:7/674: creat d0/d4/d5/d26/db9/fe3 x:0 0 0 2026-03-09T19:27:53.184 INFO:tasks.workunit.client.1.vm08.stdout:8/883: dwrite de/d1d/d2e/d5f/fbb [0,4194304] 0 2026-03-09T19:27:53.190 INFO:tasks.workunit.client.0.vm07.stdout:0/684: creat d0/d6/d13/d33/fe1 x:0 0 0 2026-03-09T19:27:53.190 INFO:tasks.workunit.client.0.vm07.stdout:8/723: write d7/d9/d10/dd8/dfd/f5f [395234,31918] 0 2026-03-09T19:27:53.197 INFO:tasks.workunit.client.0.vm07.stdout:5/679: symlink d3/dd/d26/d2d/d79/d9f/ld8 0 2026-03-09T19:27:53.202 INFO:tasks.workunit.client.0.vm07.stdout:2/757: link d3/dd/d16/d29/d3c/d4c/ffb d3/dd/d16/d29/d3c/d4c/f10b 0 2026-03-09T19:27:53.209 INFO:tasks.workunit.client.1.vm08.stdout:2/808: write d3/d9/d26/f6a [3962634,119714] 0 2026-03-09T19:27:53.217 INFO:tasks.workunit.client.0.vm07.stdout:6/654: write d0/dbf/fa2 [2066738,58036] 0 2026-03-09T19:27:53.218 INFO:tasks.workunit.client.1.vm08.stdout:3/964: write d0/d52/d6d/d77/d88/fe7 [1276014,116917] 0 2026-03-09T19:27:53.219 INFO:tasks.workunit.client.1.vm08.stdout:5/871: dwrite d16/d1e/f7d [0,4194304] 0 2026-03-09T19:27:53.223 INFO:tasks.workunit.client.1.vm08.stdout:3/965: truncate d0/d52/d6d/d77/ddf/f13b 875526 0 2026-03-09T19:27:53.231 INFO:tasks.workunit.client.1.vm08.stdout:7/960: dread d5/d14/d2b/d128/f141 [0,4194304] 0 2026-03-09T19:27:53.232 
INFO:tasks.workunit.client.1.vm08.stdout:7/961: chown d5/d14/dae/d1c/d73/f12a 128180 1 2026-03-09T19:27:53.232 INFO:tasks.workunit.client.1.vm08.stdout:3/966: dwrite d0/d52/d7c/d7e/f12f [0,4194304] 0 2026-03-09T19:27:53.239 INFO:tasks.workunit.client.1.vm08.stdout:3/967: write d0/f13d [782618,10908] 0 2026-03-09T19:27:53.259 INFO:tasks.workunit.client.0.vm07.stdout:1/715: mkdir d1/d11/d37/d3f/d45/d87/d89/dec 0 2026-03-09T19:27:53.260 INFO:tasks.workunit.client.1.vm08.stdout:0/916: getdents dd/d22/d27/d11e/d78 0 2026-03-09T19:27:53.264 INFO:tasks.workunit.client.0.vm07.stdout:3/754: write d1/d6/fb [2176322,123133] 0 2026-03-09T19:27:53.267 INFO:tasks.workunit.client.1.vm08.stdout:9/879: write d0/d2/d14/d5c/fd [2468937,13333] 0 2026-03-09T19:27:53.273 INFO:tasks.workunit.client.0.vm07.stdout:7/675: rename d0/d52/d54/d5a/c8f to d0/d4/d5/d8/d41/d64/d74/ce4 0 2026-03-09T19:27:53.299 INFO:tasks.workunit.client.1.vm08.stdout:6/909: dwrite d3/d34/d6f/f41 [0,4194304] 0 2026-03-09T19:27:53.312 INFO:tasks.workunit.client.1.vm08.stdout:3/968: creat d0/d6/de/d6e/d51/d7f/f13e x:0 0 0 2026-03-09T19:27:53.313 INFO:tasks.workunit.client.1.vm08.stdout:3/969: dread - d0/d6/d25/f139 zero size 2026-03-09T19:27:53.320 INFO:tasks.workunit.client.1.vm08.stdout:4/847: getdents da/d10/d16/d28/d2f/d4f/d56/d90 0 2026-03-09T19:27:53.321 INFO:tasks.workunit.client.1.vm08.stdout:8/884: write de/d25/d31/f11c [71545,19143] 0 2026-03-09T19:27:53.321 INFO:tasks.workunit.client.0.vm07.stdout:6/655: fsync d0/d13/f5f 0 2026-03-09T19:27:53.321 INFO:tasks.workunit.client.0.vm07.stdout:4/685: write d3/d4f/d56/fcf [415275,121933] 0 2026-03-09T19:27:53.321 INFO:tasks.workunit.client.0.vm07.stdout:1/716: creat d1/d11/d37/fed x:0 0 0 2026-03-09T19:27:53.329 INFO:tasks.workunit.client.1.vm08.stdout:9/880: dread d0/d1b/d97/d48/d5d/d74/f115 [0,4194304] 0 2026-03-09T19:27:53.333 INFO:tasks.workunit.client.0.vm07.stdout:8/724: rename d7/d1d/d83/d9f/ce4 to d7/d9/ddf/c102 0 2026-03-09T19:27:53.347 
INFO:tasks.workunit.client.1.vm08.stdout:7/962: dwrite d5/d14/d38/dad/f12d [4194304,4194304] 0 2026-03-09T19:27:53.359 INFO:tasks.workunit.client.0.vm07.stdout:9/701: dwrite d0/db/d29/d4d/f95 [0,4194304] 0 2026-03-09T19:27:53.362 INFO:tasks.workunit.client.1.vm08.stdout:5/872: write d16/d1e/d3b/f5e [1687180,82072] 0 2026-03-09T19:27:53.365 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:53 vm07.local ceph-mon[48545]: [09/Mar/2026:19:27:52] ENGINE Bus STARTING 2026-03-09T19:27:53.371 INFO:tasks.workunit.client.1.vm08.stdout:0/917: fdatasync dd/f1e 0 2026-03-09T19:27:53.373 INFO:tasks.workunit.client.0.vm07.stdout:7/676: dwrite d0/d4/d5/d8/d41/fe0 [0,4194304] 0 2026-03-09T19:27:53.380 INFO:tasks.workunit.client.1.vm08.stdout:4/848: creat da/d10/d16/d28/d2f/d4f/d64/d84/d8a/da2/ff7 x:0 0 0 2026-03-09T19:27:53.384 INFO:tasks.workunit.client.0.vm07.stdout:9/702: sync 2026-03-09T19:27:53.384 INFO:tasks.workunit.client.1.vm08.stdout:9/881: rmdir d0/d1b/de9 39 2026-03-09T19:27:53.388 INFO:tasks.workunit.client.0.vm07.stdout:3/755: symlink d1/lf2 0 2026-03-09T19:27:53.392 INFO:tasks.workunit.client.0.vm07.stdout:0/685: write d0/d6/d13/d17/d19/d57/d6a/fda [4013083,118439] 0 2026-03-09T19:27:53.394 INFO:tasks.workunit.client.1.vm08.stdout:5/873: dread d16/fbe [0,4194304] 0 2026-03-09T19:27:53.397 INFO:tasks.workunit.client.0.vm07.stdout:8/725: rename f5 to d7/d50/da6/dc5/f103 0 2026-03-09T19:27:53.418 INFO:tasks.workunit.client.1.vm08.stdout:8/885: read - de/d25/d87/dc9/dfc/f106 zero size 2026-03-09T19:27:53.422 INFO:tasks.workunit.client.1.vm08.stdout:4/849: unlink da/d10/d16/d28/d2f/d4f/d56/f9a 0 2026-03-09T19:27:53.427 INFO:tasks.workunit.client.0.vm07.stdout:6/656: dwrite d0/d1/db/d17/fb4 [0,4194304] 0 2026-03-09T19:27:53.433 INFO:tasks.workunit.client.0.vm07.stdout:4/686: write d3/d11/d2b/f98 [256881,14818] 0 2026-03-09T19:27:53.439 INFO:tasks.workunit.client.1.vm08.stdout:6/910: dwrite d3/d34/dce/de3/f114 [0,4194304] 0 2026-03-09T19:27:53.439 
INFO:tasks.workunit.client.1.vm08.stdout:2/809: getdents d3/d4/d23/d2c/d39/d5e/de/d18/da9 0 2026-03-09T19:27:53.449 INFO:tasks.workunit.client.1.vm08.stdout:3/970: creat d0/f13f x:0 0 0 2026-03-09T19:27:53.454 INFO:tasks.workunit.client.1.vm08.stdout:5/874: creat d16/d45/daf/df5/d6f/f11b x:0 0 0 2026-03-09T19:27:53.455 INFO:tasks.workunit.client.1.vm08.stdout:7/963: truncate d5/d14/dae/d3a/d42/f9a 1527766 0 2026-03-09T19:27:53.463 INFO:tasks.workunit.client.1.vm08.stdout:0/918: chown dd/d22/d27/d11e/db3/ffc 124 1 2026-03-09T19:27:53.466 INFO:tasks.workunit.client.1.vm08.stdout:4/850: creat da/d10/d16/d28/d46/d52/d6e/d40/ff8 x:0 0 0 2026-03-09T19:27:53.474 INFO:tasks.workunit.client.1.vm08.stdout:6/911: fsync d3/d34/f37 0 2026-03-09T19:27:53.483 INFO:tasks.workunit.client.1.vm08.stdout:9/882: write d0/d2/d14/d98/dbb/fe1 [1061209,59930] 0 2026-03-09T19:27:53.489 INFO:tasks.workunit.client.1.vm08.stdout:3/971: mknod d0/d6/de/d15/dec/c140 0 2026-03-09T19:27:53.493 INFO:tasks.workunit.client.1.vm08.stdout:3/972: read - d0/d6/de/f134 zero size 2026-03-09T19:27:53.494 INFO:tasks.workunit.client.1.vm08.stdout:0/919: mknod dd/d9d/c12f 0 2026-03-09T19:27:53.501 INFO:tasks.workunit.client.1.vm08.stdout:6/912: symlink d3/db/d43/d69/da0/l155 0 2026-03-09T19:27:53.519 INFO:tasks.workunit.client.1.vm08.stdout:5/875: mknod d16/d1e/d6e/dcd/def/c11c 0 2026-03-09T19:27:53.535 INFO:tasks.workunit.client.0.vm07.stdout:5/680: creat d3/d1a/d28/d40/fd9 x:0 0 0 2026-03-09T19:27:53.535 INFO:tasks.workunit.client.1.vm08.stdout:4/851: dread da/d10/d16/f4b [0,4194304] 0 2026-03-09T19:27:53.535 INFO:tasks.workunit.client.1.vm08.stdout:0/920: creat dd/d22/d27/d6c/f130 x:0 0 0 2026-03-09T19:27:53.535 INFO:tasks.workunit.client.1.vm08.stdout:2/810: link d3/d9/fd2 d3/d4/d23/d2c/d39/d5e/de/d18/d99/f111 0 2026-03-09T19:27:53.535 INFO:tasks.workunit.client.1.vm08.stdout:8/886: getdents de/d1d/d2e/db4 0 2026-03-09T19:27:53.535 INFO:tasks.workunit.client.1.vm08.stdout:9/883: dread 
d0/d2/d80/de5/da2/da8/de8/dcd/fb6 [0,4194304] 0 2026-03-09T19:27:53.535 INFO:tasks.workunit.client.1.vm08.stdout:8/887: read de/d25/d31/f8e [664581,96547] 0 2026-03-09T19:27:53.536 INFO:tasks.workunit.client.1.vm08.stdout:2/811: readlink d3/d4/d23/d2c/d39/d5e/l106 0 2026-03-09T19:27:53.543 INFO:tasks.workunit.client.1.vm08.stdout:4/852: mkdir da/d10/d26/d27/da6/df9 0 2026-03-09T19:27:53.543 INFO:tasks.workunit.client.1.vm08.stdout:8/888: dread de/d1d/d2e/f61 [0,4194304] 0 2026-03-09T19:27:53.545 INFO:tasks.workunit.client.1.vm08.stdout:4/853: chown da/d10/d16/d28/d46/d52/d6e/d2c/l5e 4522883 1 2026-03-09T19:27:53.545 INFO:tasks.workunit.client.1.vm08.stdout:5/876: dread d16/d45/fb1 [0,4194304] 0 2026-03-09T19:27:53.551 INFO:tasks.workunit.client.1.vm08.stdout:0/921: creat dd/d22/d63/d6e/df5/f131 x:0 0 0 2026-03-09T19:27:53.566 INFO:tasks.workunit.client.0.vm07.stdout:3/756: mknod d1/d74/cf3 0 2026-03-09T19:27:53.567 INFO:tasks.workunit.client.0.vm07.stdout:0/686: creat d0/d6/dc8/fe2 x:0 0 0 2026-03-09T19:27:53.567 INFO:tasks.workunit.client.0.vm07.stdout:0/687: stat d0/f3a 0 2026-03-09T19:27:53.567 INFO:tasks.workunit.client.1.vm08.stdout:0/922: chown dd/d22/d27/c47 566 1 2026-03-09T19:27:53.567 INFO:tasks.workunit.client.1.vm08.stdout:6/913: dread - d3/d34/d5c/da2/f124 zero size 2026-03-09T19:27:53.567 INFO:tasks.workunit.client.1.vm08.stdout:0/923: write dd/d22/d63/d6e/df5/f117 [606978,92549] 0 2026-03-09T19:27:53.567 INFO:tasks.workunit.client.1.vm08.stdout:0/924: chown dd/d22/d27/d11e/d78/ff0 798 1 2026-03-09T19:27:53.569 INFO:tasks.workunit.client.0.vm07.stdout:7/677: rename d0/d52 to d0/d80/db1/de5 0 2026-03-09T19:27:53.574 INFO:tasks.workunit.client.1.vm08.stdout:2/812: creat d3/d4/d23/d2c/d39/d5e/d14/f112 x:0 0 0 2026-03-09T19:27:53.574 INFO:tasks.workunit.client.1.vm08.stdout:8/889: creat de/d91/dc8/f132 x:0 0 0 2026-03-09T19:27:53.577 INFO:tasks.workunit.client.1.vm08.stdout:5/877: chown d16/d1e/d8c/d99/da8/d9a/ce4 7406 1 2026-03-09T19:27:53.578 
INFO:tasks.workunit.client.1.vm08.stdout:9/884: dread d0/d2/d80/fde [0,4194304] 0 2026-03-09T19:27:53.583 INFO:tasks.workunit.client.0.vm07.stdout:4/687: truncate d3/d11/d51/f8e 4059154 0 2026-03-09T19:27:53.585 INFO:tasks.workunit.client.1.vm08.stdout:7/964: sync 2026-03-09T19:27:53.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:53 vm08.local ceph-mon[57794]: [09/Mar/2026:19:27:52] ENGINE Bus STARTING 2026-03-09T19:27:53.596 INFO:tasks.workunit.client.1.vm08.stdout:4/854: rename da/d10/d16/d28/d46/fb1 to da/d10/d16/d28/d46/d52/d6e/ded/ffa 0 2026-03-09T19:27:53.596 INFO:tasks.workunit.client.1.vm08.stdout:6/914: creat d3/db/d12a/d147/f156 x:0 0 0 2026-03-09T19:27:53.596 INFO:tasks.workunit.client.0.vm07.stdout:5/681: mkdir d3/dd/dda 0 2026-03-09T19:27:53.596 INFO:tasks.workunit.client.0.vm07.stdout:2/758: link d3/f93 d3/f10c 0 2026-03-09T19:27:53.596 INFO:tasks.workunit.client.0.vm07.stdout:1/717: link d1/d11/l19 d1/d11/d37/d3f/d45/d87/d89/dec/lee 0 2026-03-09T19:27:53.599 INFO:tasks.workunit.client.0.vm07.stdout:9/703: mknod d0/db/d29/d68/cf7 0 2026-03-09T19:27:53.603 INFO:tasks.workunit.client.1.vm08.stdout:4/855: dread da/d10/d16/d28/d2f/fd4 [0,4194304] 0 2026-03-09T19:27:53.605 INFO:tasks.workunit.client.1.vm08.stdout:5/878: creat d16/d1e/dc9/f11d x:0 0 0 2026-03-09T19:27:53.606 INFO:tasks.workunit.client.1.vm08.stdout:4/856: chown da/d10/d16/fbf 8383 1 2026-03-09T19:27:53.606 INFO:tasks.workunit.client.1.vm08.stdout:3/973: dwrite d0/d8/f4c [0,4194304] 0 2026-03-09T19:27:53.607 INFO:tasks.workunit.client.0.vm07.stdout:3/757: readlink d1/d6/d45/d54/dd1/le0 0 2026-03-09T19:27:53.609 INFO:tasks.workunit.client.0.vm07.stdout:8/726: mknod d7/d30/c104 0 2026-03-09T19:27:53.612 INFO:tasks.workunit.client.0.vm07.stdout:0/688: rmdir d0/d6/d13/d17/d19/d57 39 2026-03-09T19:27:53.613 INFO:tasks.workunit.client.1.vm08.stdout:2/813: read d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f3a [3386012,80402] 0 2026-03-09T19:27:53.619 INFO:tasks.workunit.client.1.vm08.stdout:9/885: 
unlink d0/l11d 0 2026-03-09T19:27:53.620 INFO:tasks.workunit.client.1.vm08.stdout:8/890: dwrite de/d7c/fe1 [0,4194304] 0 2026-03-09T19:27:53.622 INFO:tasks.workunit.client.0.vm07.stdout:0/689: dread d0/f41 [0,4194304] 0 2026-03-09T19:27:53.623 INFO:tasks.workunit.client.1.vm08.stdout:8/891: chown de/d47/dfd/d99/da0/lf0 269 1 2026-03-09T19:27:53.636 INFO:tasks.workunit.client.0.vm07.stdout:7/678: unlink d0/d4/d5/d26/f8e 0 2026-03-09T19:27:53.637 INFO:tasks.workunit.client.0.vm07.stdout:4/688: rename d3/d4f/d56/ceb to d3/d11/d2b/d37/cf0 0 2026-03-09T19:27:53.637 INFO:tasks.workunit.client.0.vm07.stdout:1/718: creat d1/d11/d37/d3f/db5/fef x:0 0 0 2026-03-09T19:27:53.644 INFO:tasks.workunit.client.0.vm07.stdout:1/719: dread d1/d11/d37/f2c [0,4194304] 0 2026-03-09T19:27:53.656 INFO:tasks.workunit.client.0.vm07.stdout:9/704: dread d0/d6/d57/deb/fc0 [0,4194304] 0 2026-03-09T19:27:53.659 INFO:tasks.workunit.client.1.vm08.stdout:6/915: rename d3/db/f14 to d3/db/d12a/d147/f157 0 2026-03-09T19:27:53.660 INFO:tasks.workunit.client.1.vm08.stdout:6/916: chown d3/d34/dce 3 1 2026-03-09T19:27:53.661 INFO:tasks.workunit.client.1.vm08.stdout:7/965: write d5/d14/dae/f49 [2048192,81256] 0 2026-03-09T19:27:53.663 INFO:tasks.workunit.client.1.vm08.stdout:5/879: rmdir d16/d8e/dff 39 2026-03-09T19:27:53.664 INFO:tasks.workunit.client.0.vm07.stdout:5/682: dwrite d3/d1a/d28/d40/f46 [0,4194304] 0 2026-03-09T19:27:53.664 INFO:tasks.workunit.client.0.vm07.stdout:5/683: stat d3/d1a/f1c 0 2026-03-09T19:27:53.667 INFO:tasks.workunit.client.0.vm07.stdout:8/727: symlink d7/d9/d57/l105 0 2026-03-09T19:27:53.673 INFO:tasks.workunit.client.1.vm08.stdout:3/974: mkdir d0/d6/de/d6e/d51/d7f/de3/d141 0 2026-03-09T19:27:53.677 INFO:tasks.workunit.client.1.vm08.stdout:0/925: dwrite dd/d22/d27/d2e/db0/fdc [4194304,4194304] 0 2026-03-09T19:27:53.687 INFO:tasks.workunit.client.1.vm08.stdout:2/814: dread - d3/d4/d23/d2c/d39/d5e/fd9 zero size 2026-03-09T19:27:53.689 INFO:tasks.workunit.client.1.vm08.stdout:9/886: 
dread - d0/d2/d80/de5/da2/da8/de8/f108 zero size 2026-03-09T19:27:53.692 INFO:tasks.workunit.client.0.vm07.stdout:2/759: rename d3/dd/d16/d29/d2d/d45/d3b/d44/d97/fec to d3/dd/d16/d29/d3c/da2/f10d 0 2026-03-09T19:27:53.701 INFO:tasks.workunit.client.1.vm08.stdout:2/815: dwrite d3/d4/d3e/f10a [0,4194304] 0 2026-03-09T19:27:53.706 INFO:tasks.workunit.client.0.vm07.stdout:1/720: symlink d1/d3e/dc8/lf0 0 2026-03-09T19:27:53.718 INFO:tasks.workunit.client.1.vm08.stdout:3/975: sync 2026-03-09T19:27:53.723 INFO:tasks.workunit.client.0.vm07.stdout:6/657: getdents d0/d1/d28/da9 0 2026-03-09T19:27:53.726 INFO:tasks.workunit.client.0.vm07.stdout:6/658: dread d0/d4e/d7f/fe9 [0,4194304] 0 2026-03-09T19:27:53.730 INFO:tasks.workunit.client.1.vm08.stdout:6/917: dread d3/d34/f37 [0,4194304] 0 2026-03-09T19:27:53.734 INFO:tasks.workunit.client.0.vm07.stdout:2/760: sync 2026-03-09T19:27:53.735 INFO:tasks.workunit.client.0.vm07.stdout:2/761: dread - d3/dd/d16/d29/d2d/d45/d3b/fe5 zero size 2026-03-09T19:27:53.735 INFO:tasks.workunit.client.1.vm08.stdout:3/976: sync 2026-03-09T19:27:53.736 INFO:tasks.workunit.client.0.vm07.stdout:9/705: dread d0/db/d29/d2c/f30 [0,4194304] 0 2026-03-09T19:27:53.743 INFO:tasks.workunit.client.1.vm08.stdout:7/966: truncate d5/d14/d38/f4c 787624 0 2026-03-09T19:27:53.746 INFO:tasks.workunit.client.0.vm07.stdout:5/684: mknod d3/cdb 0 2026-03-09T19:27:53.750 INFO:tasks.workunit.client.1.vm08.stdout:4/857: dwrite da/d10/f13 [0,4194304] 0 2026-03-09T19:27:53.753 INFO:tasks.workunit.client.0.vm07.stdout:8/728: creat d7/d1d/d83/d9f/dd2/def/f106 x:0 0 0 2026-03-09T19:27:53.754 INFO:tasks.workunit.client.0.vm07.stdout:6/659: fdatasync d0/d1/d28/f64 0 2026-03-09T19:27:53.757 INFO:tasks.workunit.client.0.vm07.stdout:9/706: mkdir d0/d6f/dc3/df8 0 2026-03-09T19:27:53.757 INFO:tasks.workunit.client.0.vm07.stdout:2/762: creat d3/dd/d16/d29/d2d/d45/d8b/d98/dee/f10e x:0 0 0 2026-03-09T19:27:53.758 INFO:tasks.workunit.client.1.vm08.stdout:0/926: rename dd/d22/d63/d93 to 
dd/d31/d132 0 2026-03-09T19:27:53.758 INFO:tasks.workunit.client.0.vm07.stdout:8/729: fdatasync d7/d9/d37/fe8 0 2026-03-09T19:27:53.758 INFO:tasks.workunit.client.1.vm08.stdout:6/918: dread - d3/d94/fea zero size 2026-03-09T19:27:53.758 INFO:tasks.workunit.client.1.vm08.stdout:3/977: mknod d0/d6/d10a/c142 0 2026-03-09T19:27:53.759 INFO:tasks.workunit.client.0.vm07.stdout:0/690: dwrite d0/d6/d13/d17/d19/d58/fa2 [0,4194304] 0 2026-03-09T19:27:53.765 INFO:tasks.workunit.client.0.vm07.stdout:5/685: mkdir d3/d1a/d28/d40/d92/d89/ddc 0 2026-03-09T19:27:53.765 INFO:tasks.workunit.client.1.vm08.stdout:3/978: creat d0/d6/dad/f143 x:0 0 0 2026-03-09T19:27:53.765 INFO:tasks.workunit.client.1.vm08.stdout:0/927: stat dd/d22/d27/d6c/c11f 0 2026-03-09T19:27:53.765 INFO:tasks.workunit.client.1.vm08.stdout:7/967: fdatasync d5/d14/d2b/daa/f105 0 2026-03-09T19:27:53.767 INFO:tasks.workunit.client.0.vm07.stdout:4/689: getdents d3/d11/d29/d34/de0 0 2026-03-09T19:27:53.772 INFO:tasks.workunit.client.1.vm08.stdout:6/919: truncate d3/d34/f10e 513962 0 2026-03-09T19:27:53.772 INFO:tasks.workunit.client.1.vm08.stdout:4/858: dread da/d10/d16/d28/d2f/d4f/d64/d84/d8a/fb9 [0,4194304] 0 2026-03-09T19:27:53.772 INFO:tasks.workunit.client.1.vm08.stdout:3/979: creat d0/d6/d93/dcb/d129/f144 x:0 0 0 2026-03-09T19:27:53.772 INFO:tasks.workunit.client.1.vm08.stdout:0/928: rename dd/d22/d27/d11e/db3/ffc to dd/d22/d27/d11e/de3/f133 0 2026-03-09T19:27:53.773 INFO:tasks.workunit.client.1.vm08.stdout:4/859: stat da/d10/d16/d28/d46 0 2026-03-09T19:27:53.776 INFO:tasks.workunit.client.0.vm07.stdout:8/730: mkdir d7/d9/d37/d45/d4f/db1/d107 0 2026-03-09T19:27:53.780 INFO:tasks.workunit.client.1.vm08.stdout:7/968: mkdir d5/d14/d2b/d147 0 2026-03-09T19:27:53.794 INFO:tasks.workunit.client.0.vm07.stdout:6/660: rename d0/d1/db/d17/dc4/d7b/da0/fcf to d0/dbf/d95/f105 0 2026-03-09T19:27:53.795 INFO:tasks.workunit.client.0.vm07.stdout:6/661: truncate d0/d4e/d7f/dbe/fee 62159 0 2026-03-09T19:27:53.801 
INFO:tasks.workunit.client.0.vm07.stdout:4/690: symlink d3/d11/d2b/d38/ddc/lf1 0 2026-03-09T19:27:53.801 INFO:tasks.workunit.client.1.vm08.stdout:4/860: mkdir da/d10/d16/d28/d2f/d4f/d64/d81/dfb 0 2026-03-09T19:27:53.802 INFO:tasks.workunit.client.1.vm08.stdout:3/980: readlink d0/d6/d93/ld5 0 2026-03-09T19:27:53.802 INFO:tasks.workunit.client.0.vm07.stdout:8/731: truncate d7/d9/d37/d34/f91 507203 0 2026-03-09T19:27:53.802 INFO:tasks.workunit.client.0.vm07.stdout:5/686: fsync d3/dd/fab 0 2026-03-09T19:27:53.802 INFO:tasks.workunit.client.0.vm07.stdout:8/732: chown d7/d9/d10/d44/fdc 44604 1 2026-03-09T19:27:53.806 INFO:tasks.workunit.client.1.vm08.stdout:6/920: mkdir d3/d34/d5c/d158 0 2026-03-09T19:27:53.806 INFO:tasks.workunit.client.1.vm08.stdout:8/892: write de/d1d/d21/f72 [1431019,124800] 0 2026-03-09T19:27:53.818 INFO:tasks.workunit.client.0.vm07.stdout:6/662: creat d0/d44/dd3/f106 x:0 0 0 2026-03-09T19:27:53.818 INFO:tasks.workunit.client.0.vm07.stdout:6/663: dread - d0/dbf/d95/feb zero size 2026-03-09T19:27:53.818 INFO:tasks.workunit.client.0.vm07.stdout:6/664: write d0/d1/d28/d76/dad/f104 [308474,96814] 0 2026-03-09T19:27:53.818 INFO:tasks.workunit.client.0.vm07.stdout:4/691: fdatasync d3/d11/f6c 0 2026-03-09T19:27:53.818 INFO:tasks.workunit.client.1.vm08.stdout:6/921: dread - d3/d34/dce/de3/f146 zero size 2026-03-09T19:27:53.818 INFO:tasks.workunit.client.1.vm08.stdout:0/929: fsync dd/d7e/f8e 0 2026-03-09T19:27:53.818 INFO:tasks.workunit.client.1.vm08.stdout:4/861: dread - da/d10/d1b/d23/f98 zero size 2026-03-09T19:27:53.818 INFO:tasks.workunit.client.1.vm08.stdout:4/862: stat l7 0 2026-03-09T19:27:53.823 INFO:tasks.workunit.client.0.vm07.stdout:3/758: write d1/d6/f19 [1041987,84057] 0 2026-03-09T19:27:53.829 INFO:tasks.workunit.client.0.vm07.stdout:1/721: dwrite d1/f51 [0,4194304] 0 2026-03-09T19:27:53.836 INFO:tasks.workunit.client.0.vm07.stdout:5/687: unlink d3/d1a/l11 0 2026-03-09T19:27:53.838 INFO:tasks.workunit.client.1.vm08.stdout:9/887: write 
d0/d2/f1d [7967290,13817] 0 2026-03-09T19:27:53.839 INFO:tasks.workunit.client.0.vm07.stdout:7/679: write d0/d4/d5/d8/d41/d64/d74/d98/dcb/d39/f97 [1520885,43560] 0 2026-03-09T19:27:53.839 INFO:tasks.workunit.client.1.vm08.stdout:2/816: write d3/d4/f55 [1692232,130661] 0 2026-03-09T19:27:53.845 INFO:tasks.workunit.client.1.vm08.stdout:5/880: truncate d16/d1e/d8c/d99/f105 126249 0 2026-03-09T19:27:53.847 INFO:tasks.workunit.client.0.vm07.stdout:9/707: write d0/d6/d3a/d81/fa3 [4744496,115829] 0 2026-03-09T19:27:53.851 INFO:tasks.workunit.client.0.vm07.stdout:2/763: dwrite d3/dd/d16/d29/d2d/d45/d85/fa5 [0,4194304] 0 2026-03-09T19:27:53.867 INFO:tasks.workunit.client.0.vm07.stdout:0/691: rename d0/d6/d13/d17/c8d to d0/d6/d13/dd0/ce3 0 2026-03-09T19:27:53.873 INFO:tasks.workunit.client.1.vm08.stdout:8/893: read - de/d91/f108 zero size 2026-03-09T19:27:53.876 INFO:tasks.workunit.client.0.vm07.stdout:6/665: unlink d0/d4e/d75/l9c 0 2026-03-09T19:27:53.877 INFO:tasks.workunit.client.0.vm07.stdout:6/666: readlink d0/d1/db/d1d/l63 0 2026-03-09T19:27:53.880 INFO:tasks.workunit.client.1.vm08.stdout:0/930: unlink dd/d22/d27/d4f/c59 0 2026-03-09T19:27:53.883 INFO:tasks.workunit.client.1.vm08.stdout:7/969: dwrite d5/d14/d27/d54/dfb/f9b [4194304,4194304] 0 2026-03-09T19:27:53.889 INFO:tasks.workunit.client.1.vm08.stdout:4/863: mknod da/d10/d26/dd8/cfc 0 2026-03-09T19:27:53.890 INFO:tasks.workunit.client.0.vm07.stdout:3/759: mknod d1/d6/dd/cf4 0 2026-03-09T19:27:53.902 INFO:tasks.workunit.client.1.vm08.stdout:5/881: rename d16/d8e/ddb to d16/d1e/d3b/d61/d11e 0 2026-03-09T19:27:53.910 INFO:tasks.workunit.client.1.vm08.stdout:2/817: read d3/d4/d23/d2c/d39/d5e/d14/f73 [4190631,113268] 0 2026-03-09T19:27:53.925 INFO:tasks.workunit.client.1.vm08.stdout:2/818: chown d3/d4/d23/d2c/d39/d5e/db8/f10c 1063912 1 2026-03-09T19:27:53.925 INFO:tasks.workunit.client.1.vm08.stdout:0/931: creat dd/d22/d27/d11e/d78/f134 x:0 0 0 2026-03-09T19:27:53.926 INFO:tasks.workunit.client.0.vm07.stdout:2/764: 
dread d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4/fd1 [0,4194304] 0 2026-03-09T19:27:53.926 INFO:tasks.workunit.client.0.vm07.stdout:6/667: dread d0/ff [0,4194304] 0 2026-03-09T19:27:53.926 INFO:tasks.workunit.client.0.vm07.stdout:6/668: write d0/d1/db/d24/da4/ff3 [133198,96085] 0 2026-03-09T19:27:53.926 INFO:tasks.workunit.client.0.vm07.stdout:1/722: mknod d1/d11/d37/d3f/d6e/d9c/db6/cf1 0 2026-03-09T19:27:53.926 INFO:tasks.workunit.client.0.vm07.stdout:5/688: rmdir d3/dd/d26/d3f/d47 39 2026-03-09T19:27:53.927 INFO:tasks.workunit.client.0.vm07.stdout:9/708: creat d0/d6/d3a/dd3/ff9 x:0 0 0 2026-03-09T19:27:53.929 INFO:tasks.workunit.client.0.vm07.stdout:4/692: link d3/d4f/f7c d3/d4f/d56/ff2 0 2026-03-09T19:27:53.931 INFO:tasks.workunit.client.0.vm07.stdout:9/709: dwrite d0/d6/d57/d8f/ff0 [0,4194304] 0 2026-03-09T19:27:53.939 INFO:tasks.workunit.client.0.vm07.stdout:6/669: mkdir d0/d1/db/d52/d94/d87/d107 0 2026-03-09T19:27:53.949 INFO:tasks.workunit.client.0.vm07.stdout:1/723: creat d1/d11/d37/ff2 x:0 0 0 2026-03-09T19:27:53.950 INFO:tasks.workunit.client.0.vm07.stdout:5/689: truncate d3/d1a/f10 3823434 0 2026-03-09T19:27:53.950 INFO:tasks.workunit.client.0.vm07.stdout:1/724: readlink d1/d3/l34 0 2026-03-09T19:27:53.950 INFO:tasks.workunit.client.0.vm07.stdout:5/690: stat d3/dd/d26/d2d 0 2026-03-09T19:27:53.950 INFO:tasks.workunit.client.0.vm07.stdout:9/710: symlink d0/d6/d3a/dd3/lfa 0 2026-03-09T19:27:53.950 INFO:tasks.workunit.client.0.vm07.stdout:0/692: getdents d0/d6/d13/d1c 0 2026-03-09T19:27:53.950 INFO:tasks.workunit.client.0.vm07.stdout:4/693: creat d3/dbe/ff3 x:0 0 0 2026-03-09T19:27:53.950 INFO:tasks.workunit.client.0.vm07.stdout:1/725: symlink d1/db/d31/d4f/d7a/lf3 0 2026-03-09T19:27:53.950 INFO:tasks.workunit.client.0.vm07.stdout:6/670: unlink d0/d1/db/d17/dc4/d7b/l98 0 2026-03-09T19:27:53.950 INFO:tasks.workunit.client.0.vm07.stdout:4/694: chown d3/d11/d2b/d37/cf0 0 1 2026-03-09T19:27:53.950 INFO:tasks.workunit.client.0.vm07.stdout:1/726: chown d1/d11/d37/fed 
7 1 2026-03-09T19:27:53.950 INFO:tasks.workunit.client.0.vm07.stdout:5/691: dwrite d3/d1a/fa [4194304,4194304] 0 2026-03-09T19:27:53.955 INFO:tasks.workunit.client.0.vm07.stdout:4/695: read d3/d11/d2b/f2c [2687818,110643] 0 2026-03-09T19:27:53.960 INFO:tasks.workunit.client.0.vm07.stdout:1/727: mkdir d1/d11/d37/d3f/d45/d87/d88/df4 0 2026-03-09T19:27:53.961 INFO:tasks.workunit.client.0.vm07.stdout:1/728: write d1/d11/d37/d3f/db5/fef [556429,32727] 0 2026-03-09T19:27:53.961 INFO:tasks.workunit.client.0.vm07.stdout:0/693: truncate d0/d6/d13/d17/d19/d57/d6a/fb5 233706 0 2026-03-09T19:27:53.962 INFO:tasks.workunit.client.1.vm08.stdout:7/970: sync 2026-03-09T19:27:53.963 INFO:tasks.workunit.client.0.vm07.stdout:3/760: sync 2026-03-09T19:27:53.963 INFO:tasks.workunit.client.0.vm07.stdout:0/694: write d0/d6/d13/d1c/d52/fd3 [913753,53142] 0 2026-03-09T19:27:53.967 INFO:tasks.workunit.client.0.vm07.stdout:4/696: truncate d3/d11/d51/f8e 2069111 0 2026-03-09T19:27:53.969 INFO:tasks.workunit.client.0.vm07.stdout:6/671: link d0/d13/l5c d0/d1/db/d24/da4/dda/l108 0 2026-03-09T19:27:53.969 INFO:tasks.workunit.client.0.vm07.stdout:0/695: symlink d0/d6/d13/d1c/d11/d8b/le4 0 2026-03-09T19:27:53.970 INFO:tasks.workunit.client.0.vm07.stdout:6/672: creat d0/d1/db/f109 x:0 0 0 2026-03-09T19:27:53.972 INFO:tasks.workunit.client.0.vm07.stdout:3/761: dread d1/d3d/d47/db3/dc2/f3a [0,4194304] 0 2026-03-09T19:27:53.972 INFO:tasks.workunit.client.0.vm07.stdout:0/696: mkdir d0/d6/d13/d17/d19/de5 0 2026-03-09T19:27:53.976 INFO:tasks.workunit.client.0.vm07.stdout:6/673: truncate d0/d2d/f88 1747860 0 2026-03-09T19:27:53.978 INFO:tasks.workunit.client.0.vm07.stdout:4/697: dwrite d3/d4f/f5b [0,4194304] 0 2026-03-09T19:27:53.979 INFO:tasks.workunit.client.0.vm07.stdout:4/698: read - d3/d11/d2b/d38/fdf zero size 2026-03-09T19:27:53.988 INFO:tasks.workunit.client.1.vm08.stdout:3/981: dwrite d0/d6/de/d6e/d51/fbb [0,4194304] 0 2026-03-09T19:27:53.988 INFO:tasks.workunit.client.1.vm08.stdout:3/982: fsync 
d0/d6/de/d1b/d16/d17/dac/d109/f133 0 2026-03-09T19:27:53.988 INFO:tasks.workunit.client.0.vm07.stdout:4/699: stat d3/d11/d2b/f49 0 2026-03-09T19:27:53.988 INFO:tasks.workunit.client.0.vm07.stdout:4/700: chown d3/d11/d29/f3c 38 1 2026-03-09T19:27:53.992 INFO:tasks.workunit.client.0.vm07.stdout:8/733: write d7/d1d/f96 [893590,68323] 0 2026-03-09T19:27:53.995 INFO:tasks.workunit.client.0.vm07.stdout:3/762: dread d1/d6/d71/fdb [0,4194304] 0 2026-03-09T19:27:53.996 INFO:tasks.workunit.client.1.vm08.stdout:6/922: dwrite d3/d34/d3b/df5/f120 [0,4194304] 0 2026-03-09T19:27:53.999 INFO:tasks.workunit.client.0.vm07.stdout:6/674: sync 2026-03-09T19:27:54.002 INFO:tasks.workunit.client.1.vm08.stdout:6/923: symlink d3/d55/l159 0 2026-03-09T19:27:54.004 INFO:tasks.workunit.client.0.vm07.stdout:3/763: symlink d1/d6/d45/d54/lf5 0 2026-03-09T19:27:54.005 INFO:tasks.workunit.client.0.vm07.stdout:3/764: chown d1/d6/d71/f69 321 1 2026-03-09T19:27:54.007 INFO:tasks.workunit.client.0.vm07.stdout:6/675: dread d0/d44/fe2 [0,4194304] 0 2026-03-09T19:27:54.015 INFO:tasks.workunit.client.0.vm07.stdout:6/676: write d0/d1/db/d24/da4/ff3 [287349,8071] 0 2026-03-09T19:27:54.015 INFO:tasks.workunit.client.0.vm07.stdout:4/701: rename d3/d11/d2b/d37/faf to d3/d11/d29/ff4 0 2026-03-09T19:27:54.015 INFO:tasks.workunit.client.0.vm07.stdout:3/765: symlink d1/d6/d4c/lf6 0 2026-03-09T19:27:54.015 INFO:tasks.workunit.client.0.vm07.stdout:4/702: mkdir d3/d11/d16/df5 0 2026-03-09T19:27:54.015 INFO:tasks.workunit.client.0.vm07.stdout:3/766: creat d1/d3d/d47/db3/d87/ff7 x:0 0 0 2026-03-09T19:27:54.015 INFO:tasks.workunit.client.0.vm07.stdout:4/703: symlink d3/d11/d2b/d38/ddc/d22/d86/lf6 0 2026-03-09T19:27:54.018 INFO:tasks.workunit.client.0.vm07.stdout:3/767: unlink d1/d3d/d47/db3/dc2/d28/d7c/cef 0 2026-03-09T19:27:54.018 INFO:tasks.workunit.client.0.vm07.stdout:4/704: dread - d3/f64 zero size 2026-03-09T19:27:54.018 INFO:tasks.workunit.client.0.vm07.stdout:3/768: readlink d1/d3d/d47/db3/dc2/l42 0 
2026-03-09T19:27:54.019 INFO:tasks.workunit.client.0.vm07.stdout:3/769: chown d1/d6/dd/dbf/fcf 66222 1 2026-03-09T19:27:54.022 INFO:tasks.workunit.client.0.vm07.stdout:4/705: mkdir d3/d11/d2b/d38/ddc/df7 0 2026-03-09T19:27:54.027 INFO:tasks.workunit.client.0.vm07.stdout:3/770: symlink d1/d6/d45/d54/de5/lf8 0 2026-03-09T19:27:54.027 INFO:tasks.workunit.client.0.vm07.stdout:4/706: symlink d3/d11/lf8 0 2026-03-09T19:27:54.039 INFO:tasks.workunit.client.0.vm07.stdout:3/771: getdents d1/d6/dd 0 2026-03-09T19:27:54.039 INFO:tasks.workunit.client.1.vm08.stdout:9/888: dwrite d0/d2/d80/fde [0,4194304] 0 2026-03-09T19:27:54.041 INFO:tasks.workunit.client.1.vm08.stdout:9/889: chown d0/d2/d80/de5/da2/da8/de8/f61 55248 1 2026-03-09T19:27:54.045 INFO:tasks.workunit.client.0.vm07.stdout:3/772: write d1/d3d/d47/db3/dc2/fe1 [804578,58319] 0 2026-03-09T19:27:54.045 INFO:tasks.workunit.client.1.vm08.stdout:8/894: write de/d47/dfd/d99/da5/db3/f2d [425543,12426] 0 2026-03-09T19:27:54.045 INFO:tasks.workunit.client.1.vm08.stdout:8/895: chown de/d1d/c2b 14789035 1 2026-03-09T19:27:54.049 INFO:tasks.workunit.client.0.vm07.stdout:7/680: dwrite d0/d4/d5/d8/d41/d64/d74/d98/dcb/f63 [0,4194304] 0 2026-03-09T19:27:54.051 INFO:tasks.workunit.client.1.vm08.stdout:9/890: creat d0/d1b/d97/d48/d5d/ddf/d111/f129 x:0 0 0 2026-03-09T19:27:54.069 INFO:tasks.workunit.client.1.vm08.stdout:4/864: write da/d10/d16/d28/fde [570283,129527] 0 2026-03-09T19:27:54.071 INFO:tasks.workunit.client.1.vm08.stdout:5/882: write d16/d8e/fb2 [2057659,44623] 0 2026-03-09T19:27:54.072 INFO:tasks.workunit.client.1.vm08.stdout:2/819: write d3/d4/d23/d2c/dc1/fe2 [279677,25266] 0 2026-03-09T19:27:54.072 INFO:tasks.workunit.client.1.vm08.stdout:2/820: dread - d3/d4/d23/fd1 zero size 2026-03-09T19:27:54.079 INFO:tasks.workunit.client.1.vm08.stdout:8/896: dread de/d47/dfd/d99/da5/db3/f2d [0,4194304] 0 2026-03-09T19:27:54.088 INFO:tasks.workunit.client.0.vm07.stdout:2/765: truncate d3/f5 1692284 0 2026-03-09T19:27:54.093 
INFO:tasks.workunit.client.0.vm07.stdout:7/681: mknod d0/d80/ce6 0 2026-03-09T19:27:54.094 INFO:tasks.workunit.client.0.vm07.stdout:3/773: mknod d1/d3d/d47/db3/dc2/d28/dc4/cf9 0 2026-03-09T19:27:54.094 INFO:tasks.workunit.client.1.vm08.stdout:4/865: rename da/d10/d16/d28/d2f/de9/db0/cce to da/d10/cfd 0 2026-03-09T19:27:54.103 INFO:tasks.workunit.client.0.vm07.stdout:2/766: mknod d3/dd/d103/ddd/ded/df3/c10f 0 2026-03-09T19:27:54.103 INFO:tasks.workunit.client.1.vm08.stdout:2/821: dread d3/d9/d26/f52 [0,4194304] 0 2026-03-09T19:27:54.104 INFO:tasks.workunit.client.0.vm07.stdout:5/692: write d3/f4e [2922230,125399] 0 2026-03-09T19:27:54.110 INFO:tasks.workunit.client.1.vm08.stdout:7/971: write d5/d14/dae/f103 [251739,125545] 0 2026-03-09T19:27:54.112 INFO:tasks.workunit.client.0.vm07.stdout:9/711: dwrite d0/db/d29/d68/f6b [0,4194304] 0 2026-03-09T19:27:54.113 INFO:tasks.workunit.client.1.vm08.stdout:7/972: readlink d5/d14/dae/le8 0 2026-03-09T19:27:54.115 INFO:tasks.workunit.client.0.vm07.stdout:1/729: write d1/d11/d37/d5d/d50/f6b [52590,96829] 0 2026-03-09T19:27:54.119 INFO:tasks.workunit.client.1.vm08.stdout:9/891: rename d0/d2/d80/de5 to d0/d1b/de9/d12a 0 2026-03-09T19:27:54.131 INFO:tasks.workunit.client.0.vm07.stdout:7/682: chown d0/d4/d5/d8/d41/d64/d74/f82 54112 1 2026-03-09T19:27:54.133 INFO:tasks.workunit.client.1.vm08.stdout:5/883: truncate d16/d1e/f5b 8508590 0 2026-03-09T19:27:54.134 INFO:tasks.workunit.client.0.vm07.stdout:2/767: symlink d3/dd/d16/d29/d3c/d5a/l110 0 2026-03-09T19:27:54.137 INFO:tasks.workunit.client.0.vm07.stdout:2/768: write d3/dd/d16/d29/d2d/d45/d3b/fe5 [513212,15396] 0 2026-03-09T19:27:54.137 INFO:tasks.workunit.client.0.vm07.stdout:5/693: dread d3/f93 [0,4194304] 0 2026-03-09T19:27:54.138 INFO:tasks.workunit.client.1.vm08.stdout:4/866: mknod da/d10/d26/dd8/cfe 0 2026-03-09T19:27:54.145 INFO:tasks.workunit.client.1.vm08.stdout:7/973: rmdir d5/d14 39 2026-03-09T19:27:54.145 INFO:tasks.workunit.client.0.vm07.stdout:9/712: symlink 
d0/db/d29/d32/d5c/lfb 0 2026-03-09T19:27:54.148 INFO:tasks.workunit.client.0.vm07.stdout:9/713: chown d0/d6/d57/deb/fc0 1 1 2026-03-09T19:27:54.151 INFO:tasks.workunit.client.1.vm08.stdout:9/892: mknod d0/d1b/de9/d12a/c12b 0 2026-03-09T19:27:54.154 INFO:tasks.workunit.client.0.vm07.stdout:7/683: dread d0/d4/d5/d26/d32/dbd/fa0 [0,4194304] 0 2026-03-09T19:27:54.156 INFO:tasks.workunit.client.0.vm07.stdout:0/697: getdents d0/d6/d13/d1c/d11/d8b 0 2026-03-09T19:27:54.160 INFO:tasks.workunit.client.0.vm07.stdout:2/769: mkdir d3/d11/d38/d111 0 2026-03-09T19:27:54.163 INFO:tasks.workunit.client.0.vm07.stdout:4/707: dread d3/d11/d2b/d38/f8a [0,4194304] 0 2026-03-09T19:27:54.171 INFO:tasks.workunit.client.1.vm08.stdout:3/983: dwrite d0/d6/de/d6e/fbd [0,4194304] 0 2026-03-09T19:27:54.171 INFO:tasks.workunit.client.1.vm08.stdout:5/884: mknod d16/d1e/d8c/d99/c11f 0 2026-03-09T19:27:54.184 INFO:tasks.workunit.client.0.vm07.stdout:1/730: truncate d1/f76 408616 0 2026-03-09T19:27:54.187 INFO:tasks.workunit.client.1.vm08.stdout:6/924: write d3/d68/fb9 [1360346,127349] 0 2026-03-09T19:27:54.191 INFO:tasks.workunit.client.0.vm07.stdout:8/734: dwrite d7/d9/d37/d45/d4f/fd4 [0,4194304] 0 2026-03-09T19:27:54.191 INFO:tasks.workunit.client.1.vm08.stdout:4/867: rmdir da/d10/d26/d27/d32 39 2026-03-09T19:27:54.210 INFO:tasks.workunit.client.0.vm07.stdout:0/698: symlink d0/d6/d13/d17/dc3/le6 0 2026-03-09T19:27:54.210 INFO:tasks.workunit.client.1.vm08.stdout:9/893: creat d0/d1b/de9/f12c x:0 0 0 2026-03-09T19:27:54.214 INFO:tasks.workunit.client.0.vm07.stdout:6/677: dwrite d0/d1/db/d1d/d77/ff0 [0,4194304] 0 2026-03-09T19:27:54.215 INFO:tasks.workunit.client.0.vm07.stdout:7/684: truncate d0/d4/d5/d8/d41/d64/fc1 1923809 0 2026-03-09T19:27:54.217 INFO:tasks.workunit.client.0.vm07.stdout:2/770: mkdir d3/dd/d103/ddd/ded/df3/d112 0 2026-03-09T19:27:54.225 INFO:tasks.workunit.client.1.vm08.stdout:5/885: creat d16/d1e/d8c/d99/dcc/f120 x:0 0 0 2026-03-09T19:27:54.232 
INFO:tasks.workunit.client.0.vm07.stdout:0/699: dread d0/d6/f4f [0,4194304] 0 2026-03-09T19:27:54.232 INFO:tasks.workunit.client.0.vm07.stdout:0/700: stat d0/d6/d13/d17/d19/d57/f6f 0 2026-03-09T19:27:54.236 INFO:tasks.workunit.client.0.vm07.stdout:0/701: chown d0/d6/d13/d17/f20 5 1 2026-03-09T19:27:54.236 INFO:tasks.workunit.client.0.vm07.stdout:4/708: mknod d3/d4f/cf9 0 2026-03-09T19:27:54.238 INFO:tasks.workunit.client.0.vm07.stdout:6/678: dread d0/d1/d28/d76/dad/f104 [0,4194304] 0 2026-03-09T19:27:54.240 INFO:tasks.workunit.client.1.vm08.stdout:6/925: mknod d3/d34/da9/da4/d117/c15a 0 2026-03-09T19:27:54.246 INFO:tasks.workunit.client.0.vm07.stdout:5/694: rename d3/fe to d3/dd/d26/d3f/d47/d71/d76/d98/fdd 0 2026-03-09T19:27:54.254 INFO:tasks.workunit.client.1.vm08.stdout:8/897: dwrite de/d25/d31/fc0 [0,4194304] 0 2026-03-09T19:27:54.256 INFO:tasks.workunit.client.1.vm08.stdout:2/822: link d3/d4/d3e/d4e/d88/cfd d3/d4/d10e/c113 0 2026-03-09T19:27:54.259 INFO:tasks.workunit.client.1.vm08.stdout:7/974: rmdir d5/d14/d27/d54/d107 39 2026-03-09T19:27:54.265 INFO:tasks.workunit.client.0.vm07.stdout:3/774: write d1/d74/f52 [782290,45519] 0 2026-03-09T19:27:54.265 INFO:tasks.workunit.client.1.vm08.stdout:0/932: dwrite dd/d22/d27/d11e/d78/ff0 [0,4194304] 0 2026-03-09T19:27:54.267 INFO:tasks.workunit.client.0.vm07.stdout:8/735: fdatasync d7/d9/d10/dd8/dfd/d67/fa3 0 2026-03-09T19:27:54.272 INFO:tasks.workunit.client.0.vm07.stdout:9/714: mkdir d0/d6f/dc3/df8/dfc 0 2026-03-09T19:27:54.273 INFO:tasks.workunit.client.0.vm07.stdout:9/715: chown d0/d6f/dc3/fc4 28 1 2026-03-09T19:27:54.273 INFO:tasks.workunit.client.1.vm08.stdout:5/886: fdatasync d16/d1e/d3b/f68 0 2026-03-09T19:27:54.284 INFO:tasks.workunit.client.1.vm08.stdout:3/984: creat d0/d6/de/d1b/d16/d17/dac/dd2/dd3/f145 x:0 0 0 2026-03-09T19:27:54.284 INFO:tasks.workunit.client.1.vm08.stdout:9/894: dread d0/f13 [0,4194304] 0 2026-03-09T19:27:54.288 INFO:tasks.workunit.client.1.vm08.stdout:6/926: readlink d3/db/l1a 0 
2026-03-09T19:27:54.289 INFO:tasks.workunit.client.1.vm08.stdout:9/895: dwrite d0/d2/d14/d5c/fd [0,4194304] 0 2026-03-09T19:27:54.299 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:54 vm07.local ceph-mon[48545]: [09/Mar/2026:19:27:52] ENGINE Serving on https://192.168.123.108:7150 2026-03-09T19:27:54.299 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:54 vm07.local ceph-mon[48545]: [09/Mar/2026:19:27:52] ENGINE Client ('192.168.123.108', 37700) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T19:27:54.299 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:54 vm07.local ceph-mon[48545]: [09/Mar/2026:19:27:52] ENGINE Serving on http://192.168.123.108:8765 2026-03-09T19:27:54.299 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:54 vm07.local ceph-mon[48545]: [09/Mar/2026:19:27:52] ENGINE Bus STARTED 2026-03-09T19:27:54.299 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:54 vm07.local ceph-mon[48545]: pgmap v6: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-09T19:27:54.299 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:54 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:54.299 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:54 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:54.302 INFO:tasks.workunit.client.0.vm07.stdout:7/685: dread d0/d4/d5/d26/d32/fa6 [0,4194304] 0 2026-03-09T19:27:54.304 INFO:tasks.workunit.client.1.vm08.stdout:4/868: symlink da/d10/d1b/dd3/lff 0 2026-03-09T19:27:54.306 INFO:tasks.workunit.client.1.vm08.stdout:4/869: read da/d10/d16/d28/d46/fbe [392988,20055] 0 2026-03-09T19:27:54.311 INFO:tasks.workunit.client.0.vm07.stdout:0/702: symlink d0/d6/d13/d1c/d11/le7 0 2026-03-09T19:27:54.316 INFO:tasks.workunit.client.1.vm08.stdout:2/823: symlink d3/d4/d23/d2c/d39/d5e/db8/dff/l114 0 
2026-03-09T19:27:54.319 INFO:tasks.workunit.client.0.vm07.stdout:4/709: read - d3/d11/d51/f9a zero size 2026-03-09T19:27:54.320 INFO:tasks.workunit.client.1.vm08.stdout:2/824: stat d3/d9/fdd 0 2026-03-09T19:27:54.323 INFO:tasks.workunit.client.1.vm08.stdout:2/825: write d3/d4/f8 [8144348,128323] 0 2026-03-09T19:27:54.326 INFO:tasks.workunit.client.0.vm07.stdout:6/679: mknod d0/d1/db/d52/c10a 0 2026-03-09T19:27:54.331 INFO:tasks.workunit.client.1.vm08.stdout:0/933: rmdir dd/d31 39 2026-03-09T19:27:54.352 INFO:tasks.workunit.client.1.vm08.stdout:3/985: fdatasync d0/d6/de/d15/d96/df5/df8/f119 0 2026-03-09T19:27:54.367 INFO:tasks.workunit.client.1.vm08.stdout:3/986: dread d0/d6/de/d1b/d16/d17/fbc [0,4194304] 0 2026-03-09T19:27:54.370 INFO:tasks.workunit.client.0.vm07.stdout:8/736: creat d7/d16/dcf/f108 x:0 0 0 2026-03-09T19:27:54.372 INFO:tasks.workunit.client.1.vm08.stdout:4/870: unlink da/d10/d1b/dd3/lff 0 2026-03-09T19:27:54.376 INFO:tasks.workunit.client.0.vm07.stdout:9/716: dread - d0/d6f/d86/fd1 zero size 2026-03-09T19:27:54.376 INFO:tasks.workunit.client.1.vm08.stdout:7/975: write d5/d14/d27/d54/dfb/d9c/fef [1621582,74794] 0 2026-03-09T19:27:54.377 INFO:tasks.workunit.client.0.vm07.stdout:2/771: mkdir d3/d11/d38/d111/d113 0 2026-03-09T19:27:54.379 INFO:tasks.workunit.client.0.vm07.stdout:1/731: dwrite d1/d3e/fe3 [0,4194304] 0 2026-03-09T19:27:54.386 INFO:tasks.workunit.client.0.vm07.stdout:7/686: mkdir d0/d4/d5/d8/d41/d64/d74/d98/de7 0 2026-03-09T19:27:54.399 INFO:tasks.workunit.client.0.vm07.stdout:3/775: dwrite d1/d6/dd/f8a [0,4194304] 0 2026-03-09T19:27:54.403 INFO:tasks.workunit.client.0.vm07.stdout:4/710: read - d3/d4f/d56/d5f/fc2 zero size 2026-03-09T19:27:54.406 INFO:tasks.workunit.client.1.vm08.stdout:6/927: dwrite d3/d94/fea [0,4194304] 0 2026-03-09T19:27:54.406 INFO:tasks.workunit.client.0.vm07.stdout:0/703: dread d0/d6/d13/d17/d19/f7c [0,4194304] 0 2026-03-09T19:27:54.407 INFO:tasks.workunit.client.0.vm07.stdout:0/704: stat d0/d6/c42 0 
2026-03-09T19:27:54.412 INFO:tasks.workunit.client.1.vm08.stdout:5/887: creat d16/d1e/dc9/d10c/d112/f121 x:0 0 0 2026-03-09T19:27:54.413 INFO:tasks.workunit.client.1.vm08.stdout:6/928: write d3/db/d43/d69/da0/fdf [1801428,54406] 0 2026-03-09T19:27:54.420 INFO:tasks.workunit.client.0.vm07.stdout:6/680: truncate d0/d4e/d7f/fca 139703 0 2026-03-09T19:27:54.421 INFO:tasks.workunit.client.0.vm07.stdout:5/695: mkdir d3/d1a/d28/d40/d92/d89/ddc/dde 0 2026-03-09T19:27:54.423 INFO:tasks.workunit.client.0.vm07.stdout:5/696: read d3/dd/d26/d3f/fbf [85136,41188] 0 2026-03-09T19:27:54.425 INFO:tasks.workunit.client.1.vm08.stdout:9/896: dwrite d0/d1b/de9/d12a/da2/da8/de8/f101 [0,4194304] 0 2026-03-09T19:27:54.426 INFO:tasks.workunit.client.1.vm08.stdout:3/987: unlink d0/d6/de/d54/d103/l13a 0 2026-03-09T19:27:54.430 INFO:tasks.workunit.client.1.vm08.stdout:3/988: readlink d0/d6/de/d6e/lc6 0 2026-03-09T19:27:54.451 INFO:tasks.workunit.client.1.vm08.stdout:0/934: dwrite dd/d22/d27/d11e/dd4/fd9 [4194304,4194304] 0 2026-03-09T19:27:54.455 INFO:tasks.workunit.client.1.vm08.stdout:2/826: mknod d3/d4/d10e/c115 0 2026-03-09T19:27:54.485 INFO:tasks.workunit.client.1.vm08.stdout:3/989: dread - d0/d52/d7c/f112 zero size 2026-03-09T19:27:54.499 INFO:tasks.workunit.client.1.vm08.stdout:9/897: write d0/d2/fc1 [779410,97753] 0 2026-03-09T19:27:54.504 INFO:tasks.workunit.client.1.vm08.stdout:8/898: symlink de/d25/d33/d127/l133 0 2026-03-09T19:27:54.504 INFO:tasks.workunit.client.1.vm08.stdout:4/871: creat da/d10/d16/d28/d2f/d4f/d64/d84/d8a/da2/df3/f100 x:0 0 0 2026-03-09T19:27:54.509 INFO:tasks.workunit.client.1.vm08.stdout:7/976: fsync d5/d14/d2b/d5d/fb2 0 2026-03-09T19:27:54.513 INFO:tasks.workunit.client.1.vm08.stdout:0/935: chown dd/lc3 202613 1 2026-03-09T19:27:54.519 INFO:tasks.workunit.client.1.vm08.stdout:2/827: symlink d3/d4/d23/d2c/dc1/l116 0 2026-03-09T19:27:54.522 INFO:tasks.workunit.client.1.vm08.stdout:5/888: creat d16/d1e/d3b/d61/d11e/d107/f122 x:0 0 0 2026-03-09T19:27:54.532 
INFO:tasks.workunit.client.1.vm08.stdout:5/889: dread d16/d1e/d9f/fd3 [0,4194304] 0 2026-03-09T19:27:54.534 INFO:tasks.workunit.client.1.vm08.stdout:6/929: rename d3/d55/caa to d3/dbc/c15b 0 2026-03-09T19:27:54.538 INFO:tasks.workunit.client.1.vm08.stdout:9/898: creat d0/d1b/de9/dfd/f12d x:0 0 0 2026-03-09T19:27:54.538 INFO:tasks.workunit.client.1.vm08.stdout:8/899: unlink de/d1d/d2e/d5f/fbb 0 2026-03-09T19:27:54.540 INFO:tasks.workunit.client.0.vm07.stdout:8/737: creat d7/d50/da6/f109 x:0 0 0 2026-03-09T19:27:54.549 INFO:tasks.workunit.client.1.vm08.stdout:4/872: unlink da/d10/d16/ceb 0 2026-03-09T19:27:54.555 INFO:tasks.workunit.client.0.vm07.stdout:2/772: mkdir d3/d49/d114 0 2026-03-09T19:27:54.556 INFO:tasks.workunit.client.0.vm07.stdout:2/773: readlink d3/dd/d16/d29/d2d/l35 0 2026-03-09T19:27:54.557 INFO:tasks.workunit.client.0.vm07.stdout:2/774: chown d3/dd/d103/cf7 65728 1 2026-03-09T19:27:54.565 INFO:tasks.workunit.client.0.vm07.stdout:9/717: mkdir d0/db/d29/d32/d5c/d80/dad/dfd 0 2026-03-09T19:27:54.566 INFO:tasks.workunit.client.1.vm08.stdout:0/936: symlink dd/de4/l135 0 2026-03-09T19:27:54.566 INFO:tasks.workunit.client.1.vm08.stdout:7/977: dread d5/d14/d27/d54/f5e [0,4194304] 0 2026-03-09T19:27:54.567 INFO:tasks.workunit.client.1.vm08.stdout:7/978: chown d5/d14/dae/d1c/d73/f12a 2 1 2026-03-09T19:27:54.574 INFO:tasks.workunit.client.1.vm08.stdout:5/890: mkdir d16/d8e/dd5/d123 0 2026-03-09T19:27:54.581 INFO:tasks.workunit.client.0.vm07.stdout:1/732: dread d1/d11/d37/d3f/f4a [0,4194304] 0 2026-03-09T19:27:54.582 INFO:tasks.workunit.client.0.vm07.stdout:1/733: chown d1/d3/l25 167408261 1 2026-03-09T19:27:54.583 INFO:tasks.workunit.client.0.vm07.stdout:3/776: mkdir d1/d6/d4c/dfa 0 2026-03-09T19:27:54.583 INFO:tasks.workunit.client.0.vm07.stdout:3/777: dread - d1/d3d/d47/fe7 zero size 2026-03-09T19:27:54.585 INFO:tasks.workunit.client.1.vm08.stdout:3/990: symlink d0/d52/d6d/d77/d88/df7/l146 0 2026-03-09T19:27:54.591 
INFO:tasks.workunit.client.1.vm08.stdout:8/900: creat de/d47/dfd/d99/dde/f134 x:0 0 0 2026-03-09T19:27:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:54 vm08.local ceph-mon[57794]: [09/Mar/2026:19:27:52] ENGINE Serving on https://192.168.123.108:7150 2026-03-09T19:27:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:54 vm08.local ceph-mon[57794]: [09/Mar/2026:19:27:52] ENGINE Client ('192.168.123.108', 37700) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T19:27:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:54 vm08.local ceph-mon[57794]: [09/Mar/2026:19:27:52] ENGINE Serving on http://192.168.123.108:8765 2026-03-09T19:27:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:54 vm08.local ceph-mon[57794]: [09/Mar/2026:19:27:52] ENGINE Bus STARTED 2026-03-09T19:27:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:54 vm08.local ceph-mon[57794]: pgmap v6: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-09T19:27:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:54 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:54 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:54.600 INFO:tasks.workunit.client.1.vm08.stdout:9/899: write d0/d2/d80/f6a [3500366,26383] 0 2026-03-09T19:27:54.603 INFO:tasks.workunit.client.1.vm08.stdout:8/901: dwrite de/d47/fc1 [0,4194304] 0 2026-03-09T19:27:54.613 INFO:tasks.workunit.client.0.vm07.stdout:9/718: creat d0/db/d29/d32/d5c/d80/ffe x:0 0 0 2026-03-09T19:27:54.614 INFO:tasks.workunit.client.0.vm07.stdout:9/719: read - d0/db/d29/d32/d5c/d80/fe4 zero size 2026-03-09T19:27:54.614 INFO:tasks.workunit.client.0.vm07.stdout:9/720: readlink d0/d6/l25 0 2026-03-09T19:27:54.617 
INFO:tasks.workunit.client.1.vm08.stdout:7/979: mknod d5/d14/d2b/d128/c148 0 2026-03-09T19:27:54.617 INFO:tasks.workunit.client.0.vm07.stdout:7/687: mkdir d0/d4/d5/d8/d1a/de8 0 2026-03-09T19:27:54.621 INFO:tasks.workunit.client.1.vm08.stdout:0/937: fsync dd/d22/f29 0 2026-03-09T19:27:54.632 INFO:tasks.workunit.client.0.vm07.stdout:3/778: mknod d1/cfb 0 2026-03-09T19:27:54.633 INFO:tasks.workunit.client.1.vm08.stdout:6/930: rename d3/dbc/deb/l12c to d3/d15/l15c 0 2026-03-09T19:27:54.633 INFO:tasks.workunit.client.1.vm08.stdout:7/980: dread - d5/d14/d27/d54/dfb/d9c/dcb/dd2/f130 zero size 2026-03-09T19:27:54.635 INFO:tasks.workunit.client.1.vm08.stdout:8/902: dread de/d91/fbd [0,4194304] 0 2026-03-09T19:27:54.639 INFO:tasks.workunit.client.1.vm08.stdout:4/873: creat da/d10/d16/d28/d2f/d4f/d64/d84/f101 x:0 0 0 2026-03-09T19:27:54.639 INFO:tasks.workunit.client.1.vm08.stdout:4/874: chown da/d10/d26/d27/fac 0 1 2026-03-09T19:27:54.643 INFO:tasks.workunit.client.0.vm07.stdout:0/705: rmdir d0/d6/d13/d17/d19/d57/d9e 0 2026-03-09T19:27:54.644 INFO:tasks.workunit.client.1.vm08.stdout:5/891: write d16/d1e/d8c/fab [4041452,58961] 0 2026-03-09T19:27:54.644 INFO:tasks.workunit.client.0.vm07.stdout:8/738: write d7/d1d/d83/d9f/fcb [291873,94297] 0 2026-03-09T19:27:54.644 INFO:tasks.workunit.client.1.vm08.stdout:5/892: write ff [1254960,61719] 0 2026-03-09T19:27:54.647 INFO:tasks.workunit.client.0.vm07.stdout:2/775: dwrite d3/dd/d16/d29/d2d/d45/fd9 [0,4194304] 0 2026-03-09T19:27:54.658 INFO:tasks.workunit.client.0.vm07.stdout:5/697: dwrite d3/dd/d95/fba [0,4194304] 0 2026-03-09T19:27:54.661 INFO:tasks.workunit.client.0.vm07.stdout:1/734: dwrite d1/d11/d37/d3f/d45/d87/d88/fd5 [0,4194304] 0 2026-03-09T19:27:54.663 INFO:tasks.workunit.client.0.vm07.stdout:1/735: readlink d1/d11/l19 0 2026-03-09T19:27:54.667 INFO:tasks.workunit.client.0.vm07.stdout:5/698: readlink d3/d1a/d28/d40/d92/d89/la6 0 2026-03-09T19:27:54.669 INFO:tasks.workunit.client.0.vm07.stdout:6/681: link 
d0/d1/db/d1d/d77/c90 d0/d1/db/d52/d94/d87/c10b 0 2026-03-09T19:27:54.670 INFO:tasks.workunit.client.1.vm08.stdout:2/828: link d3/d9/lc3 d3/d4/d3e/l117 0 2026-03-09T19:27:54.675 INFO:tasks.workunit.client.1.vm08.stdout:3/991: rename d0/d6/de/c60 to d0/d52/d11d/c147 0 2026-03-09T19:27:54.681 INFO:tasks.workunit.client.1.vm08.stdout:8/903: creat de/d25/d87/dc9/dd8/f135 x:0 0 0 2026-03-09T19:27:54.689 INFO:tasks.workunit.client.0.vm07.stdout:2/776: sync 2026-03-09T19:27:54.694 INFO:tasks.workunit.client.0.vm07.stdout:2/777: chown d3/dd/d16/d29/d3c/d5a/d7a/d74/dc9 1086 1 2026-03-09T19:27:54.702 INFO:tasks.workunit.client.1.vm08.stdout:5/893: dread d16/d45/daf/df5/fb4 [0,4194304] 0 2026-03-09T19:27:54.712 INFO:tasks.workunit.client.0.vm07.stdout:8/739: symlink d7/d50/da6/dc5/l10a 0 2026-03-09T19:27:54.714 INFO:tasks.workunit.client.1.vm08.stdout:5/894: dwrite d16/d45/daf/df5/fc2 [0,4194304] 0 2026-03-09T19:27:54.716 INFO:tasks.workunit.client.0.vm07.stdout:7/688: mknod d0/d4/d5/d8/d1a/de8/ce9 0 2026-03-09T19:27:54.717 INFO:tasks.workunit.client.1.vm08.stdout:5/895: stat d16/d45/daf/df5/l52 0 2026-03-09T19:27:54.725 INFO:tasks.workunit.client.1.vm08.stdout:7/981: dread d5/fa [0,4194304] 0 2026-03-09T19:27:54.732 INFO:tasks.workunit.client.0.vm07.stdout:4/711: link d3/f13 d3/d11/d2b/d38/ddc/ffa 0 2026-03-09T19:27:54.750 INFO:tasks.workunit.client.1.vm08.stdout:5/896: dread d16/d45/f65 [4194304,4194304] 0 2026-03-09T19:27:54.750 INFO:tasks.workunit.client.0.vm07.stdout:9/721: rmdir d0/db/d29/d32/d5c/d80/dad/dfd 0 2026-03-09T19:27:54.750 INFO:tasks.workunit.client.0.vm07.stdout:3/779: symlink d1/d3d/d47/db3/d8e/dee/lfc 0 2026-03-09T19:27:54.750 INFO:tasks.workunit.client.0.vm07.stdout:2/778: fdatasync d3/dd/d16/d29/d2d/d45/d3b/d44/d97/fe6 0 2026-03-09T19:27:54.750 INFO:tasks.workunit.client.0.vm07.stdout:0/706: fdatasync d0/d6/d13/d17/dc3/fb6 0 2026-03-09T19:27:54.750 INFO:tasks.workunit.client.0.vm07.stdout:8/740: truncate d7/d1d/f3d 1332285 0 2026-03-09T19:27:54.755 
INFO:tasks.workunit.client.0.vm07.stdout:2/779: unlink d3/dd/d16/d29/d2d/d45/d3b/d44/c9f 0 2026-03-09T19:27:54.758 INFO:tasks.workunit.client.0.vm07.stdout:3/780: mkdir d1/d3d/d47/db3/dc2/d28/dfd 0 2026-03-09T19:27:54.758 INFO:tasks.workunit.client.0.vm07.stdout:5/699: getdents d3/d1a/d28/d40 0 2026-03-09T19:27:54.764 INFO:tasks.workunit.client.0.vm07.stdout:5/700: truncate d3/d1a/fb 2237548 0 2026-03-09T19:27:54.770 INFO:tasks.workunit.client.1.vm08.stdout:2/829: creat d3/d4/d23/d2c/d39/d5e/de/d18/da9/f118 x:0 0 0 2026-03-09T19:27:54.770 INFO:tasks.workunit.client.1.vm08.stdout:2/830: readlink d3/dca/lce 0 2026-03-09T19:27:54.770 INFO:tasks.workunit.client.1.vm08.stdout:9/900: getdents d0/d1b/d97/d48/d5d/ddf/d111 0 2026-03-09T19:27:54.770 INFO:tasks.workunit.client.1.vm08.stdout:4/875: mkdir da/d10/d16/d28/d2f/d4f/d102 0 2026-03-09T19:27:54.770 INFO:tasks.workunit.client.1.vm08.stdout:4/876: truncate da/d10/d16/d28/d46/d52/d6e/d40/ff8 613724 0 2026-03-09T19:27:54.771 INFO:tasks.workunit.client.1.vm08.stdout:8/904: sync 2026-03-09T19:27:54.774 INFO:tasks.workunit.client.0.vm07.stdout:5/701: read d3/dd/d26/d3f/d47/fb6 [3369084,108906] 0 2026-03-09T19:27:54.783 INFO:tasks.workunit.client.0.vm07.stdout:8/741: dread d7/d16/d1e/ff3 [0,4194304] 0 2026-03-09T19:27:54.784 INFO:tasks.workunit.client.1.vm08.stdout:5/897: creat d16/d45/daf/df5/d6f/f124 x:0 0 0 2026-03-09T19:27:54.784 INFO:tasks.workunit.client.1.vm08.stdout:7/982: mknod d5/d14/d2b/d4b/c149 0 2026-03-09T19:27:54.785 INFO:tasks.workunit.client.1.vm08.stdout:4/877: fdatasync da/d10/d16/fd5 0 2026-03-09T19:27:54.789 INFO:tasks.workunit.client.1.vm08.stdout:8/905: rename de/d91/dc8/lcf to de/d25/d87/l136 0 2026-03-09T19:27:54.791 INFO:tasks.workunit.client.1.vm08.stdout:6/931: getdents d3/d34/d6f/d123 0 2026-03-09T19:27:54.792 INFO:tasks.workunit.client.0.vm07.stdout:5/702: chown d3/f4d 605 1 2026-03-09T19:27:54.798 INFO:tasks.workunit.client.1.vm08.stdout:2/831: creat d3/d4/d23/d2c/d39/db9/df6/f119 x:0 0 0 
2026-03-09T19:27:54.804 INFO:tasks.workunit.client.0.vm07.stdout:8/742: mkdir d7/d9/d10/dd8/d10b 0 2026-03-09T19:27:54.808 INFO:tasks.workunit.client.0.vm07.stdout:1/736: write d1/d3e/db3/d6d/fac [97269,38811] 0 2026-03-09T19:27:54.809 INFO:tasks.workunit.client.0.vm07.stdout:6/682: write d0/d2d/f4a [218454,90952] 0 2026-03-09T19:27:54.815 INFO:tasks.workunit.client.0.vm07.stdout:4/712: write d3/fd7 [643000,16071] 0 2026-03-09T19:27:54.816 INFO:tasks.workunit.client.0.vm07.stdout:9/722: write d0/db/d29/d32/d5c/d69/f83 [906858,55976] 0 2026-03-09T19:27:54.818 INFO:tasks.workunit.client.0.vm07.stdout:5/703: rename d3/dd/d26/d3f/d47/d56/f75 to d3/dd/d26/d3f/d47/d71/d76/fdf 0 2026-03-09T19:27:54.819 INFO:tasks.workunit.client.1.vm08.stdout:7/983: symlink d5/d14/d2b/daa/l14a 0 2026-03-09T19:27:54.820 INFO:tasks.workunit.client.0.vm07.stdout:8/743: creat d7/d50/f10c x:0 0 0 2026-03-09T19:27:54.823 INFO:tasks.workunit.client.0.vm07.stdout:7/689: dwrite d0/d80/db1/de5/d54/fa4 [0,4194304] 0 2026-03-09T19:27:54.825 INFO:tasks.workunit.client.1.vm08.stdout:8/906: creat de/d25/d87/f137 x:0 0 0 2026-03-09T19:27:54.831 INFO:tasks.workunit.client.1.vm08.stdout:6/932: dread d3/db/f12d [0,4194304] 0 2026-03-09T19:27:54.831 INFO:tasks.workunit.client.1.vm08.stdout:7/984: mknod d5/d14/dae/d1c/db5/df8/d13b/dc7/dce/c14b 0 2026-03-09T19:27:54.832 INFO:tasks.workunit.client.1.vm08.stdout:7/985: readlink d5/d14/dae/l26 0 2026-03-09T19:27:54.833 INFO:tasks.workunit.client.0.vm07.stdout:2/780: write d3/fc [996687,127393] 0 2026-03-09T19:27:54.833 INFO:tasks.workunit.client.0.vm07.stdout:3/781: write d1/d3d/d47/db3/d8e/da9/f7d [2520109,3503] 0 2026-03-09T19:27:54.837 INFO:tasks.workunit.client.0.vm07.stdout:0/707: dwrite d0/d6/d13/d1c/f27 [0,4194304] 0 2026-03-09T19:27:54.837 INFO:tasks.workunit.client.1.vm08.stdout:0/938: write dd/d31/d132/f101 [1008969,12383] 0 2026-03-09T19:27:54.838 INFO:tasks.workunit.client.0.vm07.stdout:2/781: write d3/dd/d16/d29/d2d/d45/d85/fa5 [5044098,32801] 0 
2026-03-09T19:27:54.838 INFO:tasks.workunit.client.1.vm08.stdout:6/933: stat d3/db/d12a/d147/f157 0 2026-03-09T19:27:54.839 INFO:tasks.workunit.client.0.vm07.stdout:3/782: truncate d1/d6/d45/dac/fe8 255876 0 2026-03-09T19:27:54.839 INFO:tasks.workunit.client.1.vm08.stdout:9/901: write d0/d2/d14/f3b [1301,86093] 0 2026-03-09T19:27:54.843 INFO:tasks.workunit.client.1.vm08.stdout:9/902: readlink d0/d1b/de9/d12a/da2/da8/l107 0 2026-03-09T19:27:54.846 INFO:tasks.workunit.client.1.vm08.stdout:8/907: dwrite de/d47/dfd/d99/dde/f134 [0,4194304] 0 2026-03-09T19:27:54.846 INFO:tasks.workunit.client.1.vm08.stdout:0/939: write dd/d22/d27/d11e/de3/f12e [824129,3869] 0 2026-03-09T19:27:54.854 INFO:tasks.workunit.client.1.vm08.stdout:3/992: dwrite d0/d8/fc2 [0,4194304] 0 2026-03-09T19:27:54.857 INFO:tasks.workunit.client.0.vm07.stdout:9/723: dread d0/db/fe0 [0,4194304] 0 2026-03-09T19:27:54.868 INFO:tasks.workunit.client.1.vm08.stdout:5/898: mkdir d16/d1e/d3b/d61/d11e/d107/d114/d125 0 2026-03-09T19:27:54.869 INFO:tasks.workunit.client.1.vm08.stdout:5/899: chown d16/d1e/d8c/d99/da8/fbc 1553406 1 2026-03-09T19:27:54.873 INFO:tasks.workunit.client.1.vm08.stdout:6/934: dread d3/f6 [0,4194304] 0 2026-03-09T19:27:54.886 INFO:tasks.workunit.client.1.vm08.stdout:6/935: dread d3/dbc/deb/f106 [0,4194304] 0 2026-03-09T19:27:54.887 INFO:tasks.workunit.client.0.vm07.stdout:4/713: creat d3/d11/d2b/d38/ddc/d91/dd6/ffb x:0 0 0 2026-03-09T19:27:54.887 INFO:tasks.workunit.client.0.vm07.stdout:5/704: mkdir d3/dd/d26/d3f/de0 0 2026-03-09T19:27:54.893 INFO:tasks.workunit.client.0.vm07.stdout:7/690: truncate d0/d4/d5/d26/d32/fa6 904664 0 2026-03-09T19:27:54.898 INFO:tasks.workunit.client.0.vm07.stdout:1/737: dwrite d1/db/d31/f64 [4194304,4194304] 0 2026-03-09T19:27:54.904 INFO:tasks.workunit.client.1.vm08.stdout:9/903: dread d0/d1b/d97/f34 [0,4194304] 0 2026-03-09T19:27:54.904 INFO:tasks.workunit.client.1.vm08.stdout:0/940: mknod dd/d9d/dcc/c136 0 2026-03-09T19:27:54.905 
INFO:tasks.workunit.client.1.vm08.stdout:3/993: unlink d0/d6/de/d1b/d16/d17/dac/d109/fcc 0 2026-03-09T19:27:54.905 INFO:tasks.workunit.client.0.vm07.stdout:3/783: creat d1/d3d/ffe x:0 0 0 2026-03-09T19:27:54.911 INFO:tasks.workunit.client.1.vm08.stdout:3/994: write d0/d6/de/d15/d96/f136 [547007,118671] 0 2026-03-09T19:27:54.920 INFO:tasks.workunit.client.0.vm07.stdout:3/784: dread d1/d74/f52 [0,4194304] 0 2026-03-09T19:27:54.920 INFO:tasks.workunit.client.1.vm08.stdout:9/904: sync 2026-03-09T19:27:54.921 INFO:tasks.workunit.client.1.vm08.stdout:9/905: write d0/d2/d14/d5c/fd [6127472,68473] 0 2026-03-09T19:27:54.932 INFO:tasks.workunit.client.1.vm08.stdout:4/878: rename da/d10/d16/d28/d46/d52/d6e to da/d10/d16/d28/d2f/d4f/d103 0 2026-03-09T19:27:54.934 INFO:tasks.workunit.client.0.vm07.stdout:9/724: stat d0/d17/c26 0 2026-03-09T19:27:54.936 INFO:tasks.workunit.client.1.vm08.stdout:0/941: rmdir dd/d22/d27/d65/ddf 39 2026-03-09T19:27:54.940 INFO:tasks.workunit.client.1.vm08.stdout:5/900: mkdir d16/d45/daf/d126 0 2026-03-09T19:27:54.940 INFO:tasks.workunit.client.0.vm07.stdout:5/705: write d3/f68 [1515730,126805] 0 2026-03-09T19:27:54.945 INFO:tasks.workunit.client.0.vm07.stdout:5/706: dread d3/d1a/fa [4194304,4194304] 0 2026-03-09T19:27:54.945 INFO:tasks.workunit.client.1.vm08.stdout:3/995: mknod d0/d6/de/d1b/d16/d17/c148 0 2026-03-09T19:27:54.946 INFO:tasks.workunit.client.1.vm08.stdout:3/996: stat d0/d6/de/d1a 0 2026-03-09T19:27:54.948 INFO:tasks.workunit.client.1.vm08.stdout:3/997: write d0/d6/de/d6e/d51/d7f/f13e [972781,43213] 0 2026-03-09T19:27:54.964 INFO:tasks.workunit.client.0.vm07.stdout:0/708: link d0/d6/d13/d17/d19/d57/d6a/fd8 d0/d6/d13/d1c/fe8 0 2026-03-09T19:27:54.965 INFO:tasks.workunit.client.1.vm08.stdout:9/906: symlink d0/d1b/d97/d48/d5d/d74/ded/l12e 0 2026-03-09T19:27:54.966 INFO:tasks.workunit.client.0.vm07.stdout:8/744: creat d7/d9/d37/d45/f10d x:0 0 0 2026-03-09T19:27:54.967 INFO:tasks.workunit.client.0.vm07.stdout:3/785: write d1/d3d/f5e 
[2178073,8816] 0 2026-03-09T19:27:54.967 INFO:tasks.workunit.client.0.vm07.stdout:7/691: write d0/d4/d5/d8/d41/d64/d74/d98/f47 [489383,65954] 0 2026-03-09T19:27:54.967 INFO:tasks.workunit.client.1.vm08.stdout:6/936: truncate d3/d34/da9/da4/fda 539708 0 2026-03-09T19:27:54.970 INFO:tasks.workunit.client.0.vm07.stdout:3/786: write d1/d3d/d47/db3/d8e/da9/fe6 [874556,44355] 0 2026-03-09T19:27:54.970 INFO:tasks.workunit.client.0.vm07.stdout:7/692: chown d0/d4/d5/d8/d41/d64/d74 39 1 2026-03-09T19:27:54.971 INFO:tasks.workunit.client.1.vm08.stdout:5/901: fdatasync d16/d45/daf/df5/fb9 0 2026-03-09T19:27:54.974 INFO:tasks.workunit.client.1.vm08.stdout:6/937: sync 2026-03-09T19:27:54.975 INFO:tasks.workunit.client.0.vm07.stdout:6/683: link d0/d1/db/c2a d0/d1/c10c 0 2026-03-09T19:27:54.976 INFO:tasks.workunit.client.0.vm07.stdout:6/684: readlink d0/d1/db/d52/d94/d81/lb2 0 2026-03-09T19:27:54.977 INFO:tasks.workunit.client.0.vm07.stdout:4/714: mkdir d3/dfc 0 2026-03-09T19:27:54.978 INFO:tasks.workunit.client.0.vm07.stdout:1/738: dwrite d1/f96 [0,4194304] 0 2026-03-09T19:27:54.979 INFO:tasks.workunit.client.1.vm08.stdout:9/907: unlink d0/d1b/d68/d7f/c124 0 2026-03-09T19:27:54.983 INFO:tasks.workunit.client.1.vm08.stdout:9/908: dread d0/d2/d14/d98/dbb/fe1 [0,4194304] 0 2026-03-09T19:27:54.994 INFO:tasks.workunit.client.1.vm08.stdout:5/902: creat d16/d1e/dc9/d10c/d112/f127 x:0 0 0 2026-03-09T19:27:54.996 INFO:tasks.workunit.client.0.vm07.stdout:8/745: rmdir d7/d9/d10 39 2026-03-09T19:27:54.996 INFO:tasks.workunit.client.0.vm07.stdout:2/782: creat d3/dd/d16/d29/d2d/d45/d3b/d44/d96/f115 x:0 0 0 2026-03-09T19:27:54.999 INFO:tasks.workunit.client.0.vm07.stdout:7/693: read d0/d4/d5/d26/d32/fa6 [567950,103569] 0 2026-03-09T19:27:54.999 INFO:tasks.workunit.client.0.vm07.stdout:5/707: truncate d3/dd/dbe/fce 851622 0 2026-03-09T19:27:55.002 INFO:tasks.workunit.client.0.vm07.stdout:3/787: dread d1/d1f/f36 [0,4194304] 0 2026-03-09T19:27:55.009 INFO:tasks.workunit.client.1.vm08.stdout:2/832: 
rename d3/d4/fa7 to d3/d4/d23/f11a 0 2026-03-09T19:27:55.009 INFO:tasks.workunit.client.1.vm08.stdout:4/879: creat da/d10/d26/d3a/d69/f104 x:0 0 0 2026-03-09T19:27:55.009 INFO:tasks.workunit.client.0.vm07.stdout:3/788: read d1/d1f/f9c [11969,130189] 0 2026-03-09T19:27:55.010 INFO:tasks.workunit.client.0.vm07.stdout:3/789: read - d1/d89/fc7 zero size 2026-03-09T19:27:55.011 INFO:tasks.workunit.client.1.vm08.stdout:9/909: truncate d0/d1b/d97/d48/d5d/d74/ded/fa1 1903091 0 2026-03-09T19:27:55.013 INFO:tasks.workunit.client.1.vm08.stdout:9/910: write d0/d1b/d97/d48/d5d/d74/ded/fb7 [3162124,89547] 0 2026-03-09T19:27:55.017 INFO:tasks.workunit.client.0.vm07.stdout:2/783: sync 2026-03-09T19:27:55.018 INFO:tasks.workunit.client.0.vm07.stdout:2/784: stat d3/fc 0 2026-03-09T19:27:55.026 INFO:tasks.workunit.client.0.vm07.stdout:5/708: write d3/d1a/f12 [3429768,61205] 0 2026-03-09T19:27:55.031 INFO:tasks.workunit.client.1.vm08.stdout:7/986: rename d5/fb to d5/d14/d27/d54/dfb/d9c/dcb/dd2/f14c 0 2026-03-09T19:27:55.031 INFO:tasks.workunit.client.1.vm08.stdout:3/998: dwrite d0/d52/d7c/fc1 [0,4194304] 0 2026-03-09T19:27:55.034 INFO:tasks.workunit.client.1.vm08.stdout:7/987: chown d5/dc4/f10c 4 1 2026-03-09T19:27:55.040 INFO:tasks.workunit.client.0.vm07.stdout:7/694: creat d0/d4/d5/d8/d41/d64/d79/fea x:0 0 0 2026-03-09T19:27:55.040 INFO:tasks.workunit.client.0.vm07.stdout:4/715: write d3/d11/d2b/d38/ddc/d22/d70/f8b [178711,100401] 0 2026-03-09T19:27:55.042 INFO:tasks.workunit.client.1.vm08.stdout:6/938: dwrite d3/d34/d5c/da2/dd6/ffa [0,4194304] 0 2026-03-09T19:27:55.043 INFO:tasks.workunit.client.0.vm07.stdout:3/790: mknod d1/d6/d45/cff 0 2026-03-09T19:27:55.047 INFO:tasks.workunit.client.0.vm07.stdout:7/695: sync 2026-03-09T19:27:55.050 INFO:tasks.workunit.client.1.vm08.stdout:5/903: getdents d16/d8e/dd5/dfa 0 2026-03-09T19:27:55.050 INFO:tasks.workunit.client.1.vm08.stdout:5/904: chown d16/d1e/f5a 0 1 2026-03-09T19:27:55.059 INFO:tasks.workunit.client.1.vm08.stdout:8/908: rename 
de/d1d/d21/d73/fa6 to de/d25/d33/d127/f138 0 2026-03-09T19:27:55.065 INFO:tasks.workunit.client.0.vm07.stdout:5/709: mknod d3/d1a/d5a/ce1 0 2026-03-09T19:27:55.067 INFO:tasks.workunit.client.1.vm08.stdout:2/833: symlink d3/d4/d23/d2c/d39/d5e/de/d18/d1f/de7/l11b 0 2026-03-09T19:27:55.068 INFO:tasks.workunit.client.0.vm07.stdout:0/709: symlink d0/d6/d13/d1c/d11/d56/le9 0 2026-03-09T19:27:55.069 INFO:tasks.workunit.client.0.vm07.stdout:0/710: dread - d0/d6/d13/d17/d19/d57/f5a zero size 2026-03-09T19:27:55.070 INFO:tasks.workunit.client.0.vm07.stdout:9/725: getdents d0/d17 0 2026-03-09T19:27:55.074 INFO:tasks.workunit.client.1.vm08.stdout:6/939: read - d3/d34/d6f/f12b zero size 2026-03-09T19:27:55.074 INFO:tasks.workunit.client.0.vm07.stdout:4/716: read d3/d11/d29/d34/fa5 [1874471,87721] 0 2026-03-09T19:27:55.074 INFO:tasks.workunit.client.0.vm07.stdout:3/791: truncate d1/d6/dd/f3b 1848419 0 2026-03-09T19:27:55.079 INFO:tasks.workunit.client.0.vm07.stdout:6/685: link d0/d44/c101 d0/d44/dd3/c10d 0 2026-03-09T19:27:55.080 INFO:tasks.workunit.client.0.vm07.stdout:6/686: readlink d0/d4e/l56 0 2026-03-09T19:27:55.085 INFO:tasks.workunit.client.1.vm08.stdout:0/942: rename dd/d22/d27/d11e/d105/d82/d126 to dd/de4/d137 0 2026-03-09T19:27:55.090 INFO:tasks.workunit.client.1.vm08.stdout:4/880: write da/d10/d26/d27/f96 [502124,6373] 0 2026-03-09T19:27:55.096 INFO:tasks.workunit.client.1.vm08.stdout:9/911: write d0/f83 [609969,87974] 0 2026-03-09T19:27:55.098 INFO:tasks.workunit.client.1.vm08.stdout:9/912: read - d0/d1b/de9/d12a/da2/da8/de8/f108 zero size 2026-03-09T19:27:55.099 INFO:tasks.workunit.client.0.vm07.stdout:7/696: write d0/f13 [5498118,54701] 0 2026-03-09T19:27:55.101 INFO:tasks.workunit.client.1.vm08.stdout:2/834: creat d3/dca/f11c x:0 0 0 2026-03-09T19:27:55.105 INFO:tasks.workunit.client.1.vm08.stdout:7/988: mknod d5/d14/d2b/d147/c14d 0 2026-03-09T19:27:55.106 INFO:tasks.workunit.client.0.vm07.stdout:8/746: dwrite d7/d9/d10/d44/d9a/f8a [0,4194304] 0 
2026-03-09T19:27:55.110 INFO:tasks.workunit.client.1.vm08.stdout:3/999: dwrite d0/d6/d25/f87 [0,4194304] 0 2026-03-09T19:27:55.114 INFO:tasks.workunit.client.0.vm07.stdout:0/711: rename d0/d6/d13/d1c/d11/d56/f67 to d0/d6/d13/da1/fea 0 2026-03-09T19:27:55.115 INFO:tasks.workunit.client.0.vm07.stdout:3/792: mknod d1/d6/dd/c100 0 2026-03-09T19:27:55.117 INFO:tasks.workunit.client.1.vm08.stdout:4/881: dwrite da/d10/d26/d3a/d69/f104 [0,4194304] 0 2026-03-09T19:27:55.127 INFO:tasks.workunit.client.0.vm07.stdout:1/739: link d1/db/d31/d4f/d7a/cd1 d1/d11/d37/d3f/db5/cf5 0 2026-03-09T19:27:55.136 INFO:tasks.workunit.client.1.vm08.stdout:9/913: mkdir d0/d2/d14/d98/d99/d12f 0 2026-03-09T19:27:55.136 INFO:tasks.workunit.client.1.vm08.stdout:9/914: readlink d0/d2/l67 0 2026-03-09T19:27:55.145 INFO:tasks.workunit.client.1.vm08.stdout:2/835: fdatasync d3/d4/d23/d2c/d39/d5e/fd9 0 2026-03-09T19:27:55.146 INFO:tasks.workunit.client.1.vm08.stdout:2/836: readlink d3/d4/d3e/d4e/la2 0 2026-03-09T19:27:55.146 INFO:tasks.workunit.client.1.vm08.stdout:2/837: chown d3/d4/d23/d2c/d39/d5e/db8/dff 200 1 2026-03-09T19:27:55.150 INFO:tasks.workunit.client.1.vm08.stdout:7/989: mkdir d5/d14/dae/d1c/db5/df8/d13b/dc7/d14e 0 2026-03-09T19:27:55.150 INFO:tasks.workunit.client.1.vm08.stdout:7/990: stat d5/d14/d2b/daa/l14a 0 2026-03-09T19:27:55.154 INFO:tasks.workunit.client.1.vm08.stdout:5/905: rmdir d16/d8e/dd5/d123 0 2026-03-09T19:27:55.162 INFO:tasks.workunit.client.0.vm07.stdout:3/793: truncate d1/d3d/d47/db3/f6b 819420 0 2026-03-09T19:27:55.165 INFO:tasks.workunit.client.1.vm08.stdout:7/991: dread d5/d14/dae/f6b [0,4194304] 0 2026-03-09T19:27:55.166 INFO:tasks.workunit.client.1.vm08.stdout:7/992: chown d5/d14/dae/d3a/d42/c108 3622 1 2026-03-09T19:27:55.167 INFO:tasks.workunit.client.0.vm07.stdout:1/740: creat d1/d3e/db3/d6d/ff6 x:0 0 0 2026-03-09T19:27:55.170 INFO:tasks.workunit.client.1.vm08.stdout:5/906: symlink d16/d1e/dc9/d10c/l128 0 2026-03-09T19:27:55.172 
INFO:tasks.workunit.client.0.vm07.stdout:1/741: dread d1/d3e/db3/d6d/fac [0,4194304] 0 2026-03-09T19:27:55.174 INFO:tasks.workunit.client.1.vm08.stdout:6/940: write d3/d94/def/dc4/fe5 [2638942,58378] 0 2026-03-09T19:27:55.178 INFO:tasks.workunit.client.1.vm08.stdout:8/909: dwrite de/d47/dfd/d99/dde/f10e [0,4194304] 0 2026-03-09T19:27:55.183 INFO:tasks.workunit.client.1.vm08.stdout:4/882: symlink da/d10/d26/d27/d32/l105 0 2026-03-09T19:27:55.184 INFO:tasks.workunit.client.0.vm07.stdout:1/742: sync 2026-03-09T19:27:55.185 INFO:tasks.workunit.client.0.vm07.stdout:2/785: dwrite d3/dd/d16/d29/d2d/d45/d3b/ffd [0,4194304] 0 2026-03-09T19:27:55.186 INFO:tasks.workunit.client.1.vm08.stdout:8/910: dwrite de/d91/f9d [4194304,4194304] 0 2026-03-09T19:27:55.198 INFO:tasks.workunit.client.1.vm08.stdout:7/993: rmdir d5 39 2026-03-09T19:27:55.199 INFO:tasks.workunit.client.0.vm07.stdout:4/717: dwrite d3/d4f/f5e [4194304,4194304] 0 2026-03-09T19:27:55.203 INFO:tasks.workunit.client.0.vm07.stdout:4/718: readlink d3/d11/d29/l3e 0 2026-03-09T19:27:55.211 INFO:tasks.workunit.client.0.vm07.stdout:4/719: readlink d3/d11/d2b/d38/ddc/d22/d86/lf6 0 2026-03-09T19:27:55.212 INFO:tasks.workunit.client.1.vm08.stdout:0/943: write dd/f7a [1962769,26899] 0 2026-03-09T19:27:55.212 INFO:tasks.workunit.client.0.vm07.stdout:7/697: dwrite d0/d4/d5/d8/d41/f89 [0,4194304] 0 2026-03-09T19:27:55.217 INFO:tasks.workunit.client.0.vm07.stdout:9/726: getdents d0/db/d29/d32 0 2026-03-09T19:27:55.227 INFO:tasks.workunit.client.0.vm07.stdout:5/710: rename d3/d1a/d5a/l9c to d3/d1a/d28/d48/le2 0 2026-03-09T19:27:55.236 INFO:tasks.workunit.client.1.vm08.stdout:4/883: creat da/d10/d26/d27/d32/f106 x:0 0 0 2026-03-09T19:27:55.240 INFO:tasks.workunit.client.1.vm08.stdout:9/915: link d0/d2/cb1 d0/d1b/d68/c130 0 2026-03-09T19:27:55.251 INFO:tasks.workunit.client.1.vm08.stdout:9/916: chown d0/d1b/d68/dfe/c10a 27835 1 2026-03-09T19:27:55.256 INFO:tasks.workunit.client.1.vm08.stdout:2/838: creat d3/d4/d23/f11d x:0 0 0 
2026-03-09T19:27:55.266 INFO:tasks.workunit.client.1.vm08.stdout:5/907: creat d16/d8e/dd5/dfa/f129 x:0 0 0
2026-03-09T19:27:55.273 INFO:tasks.workunit.client.1.vm08.stdout:6/941: rename d3/db/d12a/c132 to d3/d94/def/dc4/d130/c15d 0
2026-03-09T19:27:55.278 INFO:tasks.workunit.client.0.vm07.stdout:6/687: write d0/dbf/d95/f74 [1704539,9003] 0
2026-03-09T19:27:55.281 INFO:tasks.workunit.client.0.vm07.stdout:0/712: write d0/d6/d13/d1c/d61/d69/fb9 [333226,63444] 0
2026-03-09T19:27:55.284 INFO:tasks.workunit.client.0.vm07.stdout:6/688: sync
2026-03-09T19:27:55.287 INFO:tasks.workunit.client.0.vm07.stdout:3/794: write d1/d6/dd/fb0 [6605,66093] 0
2026-03-09T19:27:55.289 INFO:tasks.workunit.client.0.vm07.stdout:9/727: creat d0/d6f/dc3/fff x:0 0 0
2026-03-09T19:27:55.289 INFO:tasks.workunit.client.0.vm07.stdout:3/795: write d1/d3d/d47/db3/d87/ff7 [1017143,66909] 0
2026-03-09T19:27:55.290 INFO:tasks.workunit.client.1.vm08.stdout:9/917: readlink d0/l12 0
2026-03-09T19:27:55.296 INFO:tasks.workunit.client.0.vm07.stdout:5/711: symlink d3/d1a/d5a/le3 0
2026-03-09T19:27:55.298 INFO:tasks.workunit.client.0.vm07.stdout:7/698: write d0/d4/d5/f20 [3981853,105915] 0
2026-03-09T19:27:55.300 INFO:tasks.workunit.client.0.vm07.stdout:7/699: write d0/d4/d5/d26/db9/fe3 [233844,101471] 0
2026-03-09T19:27:55.301 INFO:tasks.workunit.client.1.vm08.stdout:8/911: dwrite de/d25/d33/f55 [0,4194304] 0
2026-03-09T19:27:55.303 INFO:tasks.workunit.client.1.vm08.stdout:8/912: write de/fb2 [1128928,22348] 0
2026-03-09T19:27:55.314 INFO:tasks.workunit.client.0.vm07.stdout:8/747: getdents d7/d9/d37/d45/d97/dbc 0
2026-03-09T19:27:55.314 INFO:tasks.workunit.client.0.vm07.stdout:8/748: chown d7/d16/c2d 2740 1
2026-03-09T19:27:55.315 INFO:tasks.workunit.client.1.vm08.stdout:0/944: unlink dd/d31/dca/lfa 0
2026-03-09T19:27:55.317 INFO:tasks.workunit.client.0.vm07.stdout:1/743: unlink d1/d11/d37/d3f/db5/cf5 0
2026-03-09T19:27:55.334 INFO:tasks.workunit.client.1.vm08.stdout:2/839: rmdir d3/d9/d79/d46/d8c/d92 39
2026-03-09T19:27:55.341 INFO:tasks.workunit.client.1.vm08.stdout:4/884: rename da/d10/d26/dd8/cfc to da/d10/d16/d28/d2f/d4f/d56/dd0/c107 0
2026-03-09T19:27:55.345 INFO:tasks.workunit.client.0.vm07.stdout:0/713: mknod d0/d6/d13/d33/ceb 0
2026-03-09T19:27:55.346 INFO:tasks.workunit.client.1.vm08.stdout:4/885: write da/d10/d16/d28/d2f/d4f/d103/d40/ff8 [300875,122758] 0
2026-03-09T19:27:55.349 INFO:tasks.workunit.client.1.vm08.stdout:7/994: creat d5/d14/dae/d1c/d13f/f14f x:0 0 0
2026-03-09T19:27:55.350 INFO:tasks.workunit.client.0.vm07.stdout:3/796: mkdir d1/d6/d71/d101 0
2026-03-09T19:27:55.357 INFO:tasks.workunit.client.0.vm07.stdout:8/749: mkdir d7/d9/d37/d45/d4f/d10e 0
2026-03-09T19:27:55.360 INFO:tasks.workunit.client.0.vm07.stdout:1/744: read - d1/d11/d37/d3f/d7e/dad/fbd zero size
2026-03-09T19:27:55.361 INFO:tasks.workunit.client.1.vm08.stdout:6/942: unlink d3/d94/def/dc4/d130/c15d 0
2026-03-09T19:27:55.364 INFO:tasks.workunit.client.0.vm07.stdout:4/720: creat d3/d11/ffd x:0 0 0
2026-03-09T19:27:55.365 INFO:tasks.workunit.client.1.vm08.stdout:4/886: rename da/d10/d26/d3a/fcc to da/d10/d26/d27/d9b/f108 0
2026-03-09T19:27:55.368 INFO:tasks.workunit.client.1.vm08.stdout:5/908: write d16/d1e/d8c/d99/da8/fbc [4251526,39386] 0
2026-03-09T19:27:55.381 INFO:tasks.workunit.client.0.vm07.stdout:2/786: dwrite d3/f15 [0,4194304] 0
2026-03-09T19:27:55.394 INFO:tasks.workunit.client.0.vm07.stdout:9/728: write d0/d6/d3a/fc6 [392754,70109] 0
2026-03-09T19:27:55.395 INFO:tasks.workunit.client.0.vm07.stdout:9/729: fdatasync d0/d6/d73/fed 0
2026-03-09T19:27:55.399 INFO:tasks.workunit.client.1.vm08.stdout:0/945: mkdir dd/d22/d138 0
2026-03-09T19:27:55.408 INFO:tasks.workunit.client.1.vm08.stdout:8/913: write de/d47/dfd/d99/da5/ff8 [663162,68365] 0
2026-03-09T19:27:55.409 INFO:tasks.workunit.client.1.vm08.stdout:8/914: read de/d47/dfd/d99/dde/f134 [4063199,7271] 0
2026-03-09T19:27:55.410 INFO:tasks.workunit.client.1.vm08.stdout:5/909: dread d16/d1e/f5c [0,4194304] 0
2026-03-09T19:27:55.414 INFO:tasks.workunit.client.1.vm08.stdout:6/943: truncate d3/db/fb0 4847167 0
2026-03-09T19:27:55.417 INFO:tasks.workunit.client.0.vm07.stdout:7/700: dwrite d0/d4/d5/d26/d32/fa6 [0,4194304] 0
2026-03-09T19:27:55.438 INFO:tasks.workunit.client.1.vm08.stdout:9/918: creat d0/d1b/d97/d48/f131 x:0 0 0
2026-03-09T19:27:55.451 INFO:tasks.workunit.client.1.vm08.stdout:5/910: mkdir d16/d1e/d8c/d99/d12a 0
2026-03-09T19:27:55.452 INFO:tasks.workunit.client.1.vm08.stdout:0/946: write dd/d9d/dcc/ff6 [723094,32350] 0
2026-03-09T19:27:55.455 INFO:tasks.workunit.client.1.vm08.stdout:7/995: mknod d5/d14/dae/d1c/db5/df8/d13b/dc7/dce/d120/d122/c150 0
2026-03-09T19:27:55.456 INFO:tasks.workunit.client.1.vm08.stdout:6/944: rmdir d3/d15 39
2026-03-09T19:27:55.462 INFO:tasks.workunit.client.1.vm08.stdout:9/919: creat d0/d1b/d97/d48/d6f/df5/f132 x:0 0 0
2026-03-09T19:27:55.462 INFO:tasks.workunit.client.1.vm08.stdout:9/920: chown d0/d2/d14/f19 1 1
2026-03-09T19:27:55.466 INFO:tasks.workunit.client.1.vm08.stdout:9/921: dwrite d0/d2/fc1 [0,4194304] 0
2026-03-09T19:27:55.468 INFO:tasks.workunit.client.1.vm08.stdout:9/922: readlink d0/d1b/d97/d48/d6f/la5 0
2026-03-09T19:27:55.484 INFO:tasks.workunit.client.1.vm08.stdout:0/947: rmdir dd/d22/d27/d2e/db0 39
2026-03-09T19:27:55.490 INFO:tasks.workunit.client.1.vm08.stdout:5/911: truncate d16/d1e/d8c/d99/da8/d9a/f106 3982991 0
2026-03-09T19:27:55.497 INFO:tasks.workunit.client.1.vm08.stdout:4/887: dwrite da/d10/d16/d28/d2f/d4f/d103/d2c/f36 [0,4194304] 0
2026-03-09T19:27:55.497 INFO:tasks.workunit.client.1.vm08.stdout:7/996: creat d5/d14/d27/d54/dfb/f151 x:0 0 0
2026-03-09T19:27:55.499 INFO:tasks.workunit.client.1.vm08.stdout:5/912: dread d16/d1e/d8c/d99/dcc/fd1 [0,4194304] 0
2026-03-09T19:27:55.503 INFO:tasks.workunit.client.1.vm08.stdout:2/840: getdents d3/d9/d4a 0
2026-03-09T19:27:55.521 INFO:tasks.workunit.client.1.vm08.stdout:8/915: link de/d117/l116 de/d113/l139 0
2026-03-09T19:27:55.523 INFO:tasks.workunit.client.0.vm07.stdout:0/714: stat d0/d6/d13/d33/f39 0
2026-03-09T19:27:55.527 INFO:tasks.workunit.client.1.vm08.stdout:9/923: rmdir d0/d2/d14/d5c 39
2026-03-09T19:27:55.527 INFO:tasks.workunit.client.1.vm08.stdout:0/948: read - dd/d22/d27/d11e/d78/fc2 zero size
2026-03-09T19:27:55.528 INFO:tasks.workunit.client.1.vm08.stdout:7/997: symlink d5/d14/dae/d3a/d42/d85/da0/df5/d11b/l152 0
2026-03-09T19:27:55.529 INFO:tasks.workunit.client.1.vm08.stdout:7/998: chown d5/d14/d27/d54/dfb/d9c/dcb/dd2/f130 1 1
2026-03-09T19:27:55.530 INFO:tasks.workunit.client.1.vm08.stdout:7/999: chown d5/d14/dae/f7c 117 1
2026-03-09T19:27:55.531 INFO:tasks.workunit.client.1.vm08.stdout:5/913: mkdir d16/d1e/d9f/d12b 0
2026-03-09T19:27:55.531 INFO:tasks.workunit.client.0.vm07.stdout:3/797: fdatasync d1/d6/d4c/f61 0
2026-03-09T19:27:55.532 INFO:tasks.workunit.client.0.vm07.stdout:1/745: creat d1/d11/d37/d3f/d6e/d9c/ff7 x:0 0 0
2026-03-09T19:27:55.534 INFO:tasks.workunit.client.0.vm07.stdout:8/750: dread - d7/d1d/d83/d9f/fa4 zero size
2026-03-09T19:27:55.537 INFO:tasks.workunit.client.1.vm08.stdout:6/945: creat d3/d15/dc2/f15e x:0 0 0
2026-03-09T19:27:55.544 INFO:tasks.workunit.client.0.vm07.stdout:2/787: mkdir d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4/d116 0
2026-03-09T19:27:55.551 INFO:tasks.workunit.client.1.vm08.stdout:2/841: creat d3/d9/d79/d46/d8c/d92/f11e x:0 0 0
2026-03-09T19:27:55.554 INFO:tasks.workunit.client.1.vm08.stdout:6/946: mkdir d3/d34/da9/da4/d117/d10d/d15f 0
2026-03-09T19:27:55.556 INFO:tasks.workunit.client.0.vm07.stdout:6/689: creat d0/dbf/d95/f10e x:0 0 0
2026-03-09T19:27:55.557 INFO:tasks.workunit.client.0.vm07.stdout:0/715: creat d0/d6/d13/d17/d19/d57/d6a/fec x:0 0 0
2026-03-09T19:27:55.559 INFO:tasks.workunit.client.1.vm08.stdout:4/888: creat da/d10/d16/d28/d2f/f109 x:0 0 0
2026-03-09T19:27:55.559 INFO:tasks.workunit.client.0.vm07.stdout:5/712: write d3/dd/f8a [705723,95515] 0
2026-03-09T19:27:55.560 INFO:tasks.workunit.client.1.vm08.stdout:9/924: sync
2026-03-09T19:27:55.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.559+0000 7f0182e4a640 1 -- 192.168.123.107:0/1157381345 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01740a4830 msgr2=0x7f01740a4c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:27:55.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.559+0000 7f0182e4a640 1 --2- 192.168.123.107:0/1157381345 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01740a4830 0x7f01740a4c30 secure :-1 s=READY pgs=344 cs=0 l=1 rev1=1 crypto rx=0x7f01780099b0 tx=0x7f017802f240 comp rx=0 tx=0).stop
2026-03-09T19:27:55.562 INFO:tasks.workunit.client.0.vm07.stdout:5/713: chown d3/dd/dbe/fce 1 1
2026-03-09T19:27:55.563 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.564+0000 7f0182e4a640 1 -- 192.168.123.107:0/1157381345 shutdown_connections
2026-03-09T19:27:55.563 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.564+0000 7f0182e4a640 1 --2- 192.168.123.107:0/1157381345 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f01740a5920 0x7f01740a5d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:27:55.563 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.564+0000 7f0182e4a640 1 --2- 192.168.123.107:0/1157381345 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01740a4830 0x7f01740a4c30 unknown :-1 s=CLOSED pgs=344 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:27:55.564 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.564+0000 7f0182e4a640 1 -- 192.168.123.107:0/1157381345 >> 192.168.123.107:0/1157381345 conn(0x7f017409fe80 msgr2=0x7f01740a22e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:27:55.564 INFO:tasks.workunit.client.1.vm08.stdout:4/889: dwrite da/d10/d16/d28/d2f/d4f/d103/d40/ff8 [0,4194304] 0
2026-03-09T19:27:55.566 INFO:tasks.workunit.client.0.vm07.stdout:6/690: dwrite d0/dbf/d95/f10e [0,4194304] 0
2026-03-09T19:27:55.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.564+0000 7f0182e4a640 1 -- 192.168.123.107:0/1157381345 shutdown_connections
2026-03-09T19:27:55.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.565+0000 7f0182e4a640 1 -- 192.168.123.107:0/1157381345 wait complete.
2026-03-09T19:27:55.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.567+0000 7f0182e4a640 1 Processor -- start
2026-03-09T19:27:55.567 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.567+0000 7f0182e4a640 1 -- start start
2026-03-09T19:27:55.567 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.567+0000 7f0182e4a640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f01740a4830 0x7f017414d6e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:27:55.567 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.567+0000 7f0182e4a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01740a5920 0x7f017414dc20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:27:55.568 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.567+0000 7f0182e4a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f017414e220 con 0x7f01740a5920
2026-03-09T19:27:55.568 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.567+0000 7f0182e4a640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f017414e390 con 0x7f01740a4830
2026-03-09T19:27:55.569 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.569+0000 7f0181647640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01740a5920 0x7f017414dc20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:27:55.570 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.569+0000 7f0181647640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01740a5920 0x7f017414dc20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:39182/0 (socket says 192.168.123.107:39182)
2026-03-09T19:27:55.570 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.569+0000 7f0181647640 1 -- 192.168.123.107:0/1770449821 learned_addr learned my addr 192.168.123.107:0/1770449821 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:27:55.570 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.569+0000 7f0181e48640 1 --2- 192.168.123.107:0/1770449821 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f01740a4830 0x7f017414d6e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:27:55.571 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.572+0000 7f0181e48640 1 -- 192.168.123.107:0/1770449821 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01740a5920 msgr2=0x7f017414dc20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:27:55.571 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.572+0000 7f0181e48640 1 --2- 192.168.123.107:0/1770449821 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01740a5920 0x7f017414dc20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:27:55.571 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.572+0000 7f0181e48640 1 -- 192.168.123.107:0/1770449821 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0178009660 con 0x7f01740a4830
2026-03-09T19:27:55.572 INFO:tasks.workunit.client.0.vm07.stdout:3/798: dread d1/d74/f52 [0,4194304] 0
2026-03-09T19:27:55.574 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.575+0000 7f0181e48640 1 --2- 192.168.123.107:0/1770449821 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f01740a4830 0x7f017414d6e0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f017802f750 tx=0x7f0178002700 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:27:55.577 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.578+0000 7f0172ffd640 1 -- 192.168.123.107:0/1770449821 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f017803d070 con 0x7f01740a4830
2026-03-09T19:27:55.577 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.578+0000 7f0182e4a640 1 -- 192.168.123.107:0/1770449821 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0174152bd0 con 0x7f01740a4830
2026-03-09T19:27:55.577 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.578+0000 7f0182e4a640 1 -- 192.168.123.107:0/1770449821 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f01741530c0 con 0x7f01740a4830
2026-03-09T19:27:55.577 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.578+0000 7f0172ffd640 1 -- 192.168.123.107:0/1770449821 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0178038730 con 0x7f01740a4830
2026-03-09T19:27:55.578 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.578+0000 7f0182e4a640 1 -- 192.168.123.107:0/1770449821 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0144005350 con 0x7f01740a4830
2026-03-09T19:27:55.578 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.578+0000 7f0172ffd640 1 -- 192.168.123.107:0/1770449821 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f017804b7e0 con 0x7f01740a4830
2026-03-09T19:27:55.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.582+0000 7f0172ffd640 1 -- 192.168.123.107:0/1770449821 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 24) v1 ==== 99330+0+0 (secure 0 0 0) 0x7f017802faa0 con 0x7f01740a4830
2026-03-09T19:27:55.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.582+0000 7f0172ffd640 1 --2- 192.168.123.107:0/1770449821 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f0154076c60 0x7f0154079120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:27:55.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.582+0000 7f0172ffd640 1 -- 192.168.123.107:0/1770449821 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f01780bdc50 con 0x7f01740a4830
2026-03-09T19:27:55.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.584+0000 7f0181647640 1 --2- 192.168.123.107:0/1770449821 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f0154076c60 0x7f0154079120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:27:55.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.588+0000 7f0172ffd640 1 -- 192.168.123.107:0/1770449821 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f0178086890 con 0x7f01740a4830
2026-03-09T19:27:55.592 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.588+0000 7f0181647640 1 --2- 192.168.123.107:0/1770449821 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f0154076c60 0x7f0154079120 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f017414ea00 tx=0x7f01640074e0 comp rx=0 tx=0).ready entity=mgr.14696 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:27:55.592 INFO:tasks.workunit.client.1.vm08.stdout:2/842: creat d3/d4/d23/d2c/d39/d5e/de/d18/f11f x:0 0 0
2026-03-09T19:27:55.600 INFO:tasks.workunit.client.0.vm07.stdout:1/746: creat d1/d91/ff8 x:0 0 0
2026-03-09T19:27:55.616 INFO:tasks.workunit.client.1.vm08.stdout:8/916: write de/d1d/d21/f45 [658556,88784] 0
2026-03-09T19:27:55.622 INFO:tasks.workunit.client.0.vm07.stdout:4/721: write d3/fa2 [169485,41166] 0
2026-03-09T19:27:55.626 INFO:tasks.workunit.client.1.vm08.stdout:8/917: dwrite de/d47/dfd/d124/f12e [0,4194304] 0
2026-03-09T19:27:55.633 INFO:tasks.workunit.client.1.vm08.stdout:8/918: dread de/d91/f9d [4194304,4194304] 0
2026-03-09T19:27:55.634 INFO:tasks.workunit.client.1.vm08.stdout:5/914: write d16/d45/f5d [2355235,83808] 0
2026-03-09T19:27:55.639 INFO:tasks.workunit.client.0.vm07.stdout:9/730: dwrite d0/d6/d3a/f89 [0,4194304] 0
2026-03-09T19:27:55.640 INFO:tasks.workunit.client.0.vm07.stdout:9/731: chown d0/d6/d3a/d94 2079648 1
2026-03-09T19:27:55.642 INFO:tasks.workunit.client.1.vm08.stdout:0/949: truncate dd/d22/f28 213389 0
2026-03-09T19:27:55.646 INFO:tasks.workunit.client.1.vm08.stdout:0/950: fdatasync dd/d22/d27/d11e/d105/f129 0
2026-03-09T19:27:55.646 INFO:tasks.workunit.client.1.vm08.stdout:6/947: write d3/db/f8f [1699481,62977] 0
2026-03-09T19:27:55.647 INFO:tasks.workunit.client.0.vm07.stdout:8/751: write d7/d9/f87 [488893,8412] 0
2026-03-09T19:27:55.652 INFO:tasks.workunit.client.1.vm08.stdout:4/890: mkdir da/d10/d16/d28/d46/d52/d10a 0
2026-03-09T19:27:55.654 INFO:tasks.workunit.client.1.vm08.stdout:2/843: mkdir d3/d9/d79/d46/d8c/d92/d120 0
2026-03-09T19:27:55.665 INFO:tasks.workunit.client.0.vm07.stdout:0/716: dread d0/d6/dc8/fba [0,4194304] 0
2026-03-09T19:27:55.665 INFO:tasks.workunit.client.0.vm07.stdout:6/691: rename d0/d1/d28/d76/dad/db0 to d0/d1/db/d17/dc4/d7b/da0/d10f 0
2026-03-09T19:27:55.666 INFO:tasks.workunit.client.0.vm07.stdout:0/717: chown d0/d6/d13/l4a 798113395 1
2026-03-09T19:27:55.666 INFO:tasks.workunit.client.0.vm07.stdout:6/692: truncate d0/fa3 4797900 0
2026-03-09T19:27:55.668 INFO:tasks.workunit.client.0.vm07.stdout:3/799: mknod d1/d3d/d47/db3/d87/c102 0
2026-03-09T19:27:55.677 INFO:tasks.workunit.client.1.vm08.stdout:5/915: dread d16/d1e/d6e/f72 [0,4194304] 0
2026-03-09T19:27:55.679 INFO:tasks.workunit.client.1.vm08.stdout:5/916: truncate d16/d1e/d3b/d61/d11e/d107/f122 613215 0
2026-03-09T19:27:55.683 INFO:tasks.workunit.client.1.vm08.stdout:0/951: dwrite dd/d22/d27/d11e/d78/db4/f122 [0,4194304] 0
2026-03-09T19:27:55.685 INFO:tasks.workunit.client.1.vm08.stdout:0/952: read - dd/d22/d27/d6c/f89 zero size
2026-03-09T19:27:55.686 INFO:tasks.workunit.client.1.vm08.stdout:5/917: dread d16/d1e/fa5 [0,4194304] 0
2026-03-09T19:27:55.729 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.730+0000 7f0182e4a640 1 -- 192.168.123.107:0/1770449821 --> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0144002bf0 con 0x7f0154076c60
2026-03-09T19:27:55.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.732+0000 7f0172ffd640 1 -- 192.168.123.107:0/1770449821 <== mgr.14696 v2:192.168.123.108:6828/502005203 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+318 (secure 0 0 0) 0x7f0144002bf0 con 0x7f0154076c60
2026-03-09T19:27:55.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.736+0000 7f0170ff9640 1 -- 192.168.123.107:0/1770449821 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f0154076c60 msgr2=0x7f0154079120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:27:55.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.736+0000 7f0170ff9640 1 --2- 192.168.123.107:0/1770449821 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f0154076c60 0x7f0154079120 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f017414ea00 tx=0x7f01640074e0 comp rx=0 tx=0).stop
2026-03-09T19:27:55.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.736+0000 7f0170ff9640 1 -- 192.168.123.107:0/1770449821 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f01740a4830 msgr2=0x7f017414d6e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:27:55.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.736+0000 7f0170ff9640 1 --2- 192.168.123.107:0/1770449821 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f01740a4830 0x7f017414d6e0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f017802f750 tx=0x7f0178002700 comp rx=0 tx=0).stop
2026-03-09T19:27:55.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.737+0000 7f0170ff9640 1 -- 192.168.123.107:0/1770449821 shutdown_connections
2026-03-09T19:27:55.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.737+0000 7f0170ff9640 1 --2- 192.168.123.107:0/1770449821 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f0154076c60 0x7f0154079120 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:27:55.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.737+0000 7f0170ff9640 1 --2- 192.168.123.107:0/1770449821 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f01740a5920 0x7f017414dc20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:27:55.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.737+0000 7f0170ff9640 1 --2- 192.168.123.107:0/1770449821 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f01740a4830 0x7f017414d6e0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:27:55.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.737+0000 7f0170ff9640 1 -- 192.168.123.107:0/1770449821 >> 192.168.123.107:0/1770449821 conn(0x7f017409fe80 msgr2=0x7f01740a1750 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:27:55.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.738+0000 7f0170ff9640 1 -- 192.168.123.107:0/1770449821 shutdown_connections
2026-03-09T19:27:55.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.739+0000 7f0170ff9640 1 -- 192.168.123.107:0/1770449821 wait complete.
2026-03-09T19:27:55.749 INFO:teuthology.orchestra.run.vm07.stdout:true
2026-03-09T19:27:55.789 INFO:tasks.workunit.client.0.vm07.stdout:9/732: symlink d0/d6/d57/l100 0
2026-03-09T19:27:55.790 INFO:tasks.workunit.client.0.vm07.stdout:9/733: write d0/d6/d3a/dd3/ff9 [912087,129474] 0
2026-03-09T19:27:55.795 INFO:tasks.workunit.client.1.vm08.stdout:4/891: creat da/d10/d16/d28/d2f/d4f/d103/d40/f10b x:0 0 0
2026-03-09T19:27:55.805 INFO:tasks.workunit.client.1.vm08.stdout:8/919: write de/d47/dfd/d99/da5/db3/f50 [379950,90511] 0
2026-03-09T19:27:55.816 INFO:tasks.workunit.client.0.vm07.stdout:1/747: dwrite d1/f1d [0,4194304] 0
2026-03-09T19:27:55.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:55 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:27:55.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:55 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:27:55.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:55 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:27:55.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:55 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:27:55.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:55 vm07.local ceph-mon[48545]: pgmap v7: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 34 MiB/s rd, 74 MiB/s wr, 214 op/s
2026-03-09T19:27:55.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:55 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:27:55.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:55 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:27:55.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:55 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:27:55.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:55 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:27:55.821 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.821+0000 7f39dffff640 1 -- 192.168.123.107:0/185713978 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f39e0071a50 msgr2=0x7f39e0071e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:27:55.821 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.821+0000 7f39dffff640 1 --2- 192.168.123.107:0/185713978 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f39e0071a50 0x7f39e0071e50 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f39d0010a30 tx=0x7f39d0033310 comp rx=0 tx=0).stop
2026-03-09T19:27:55.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.823+0000 7f39dffff640 1 -- 192.168.123.107:0/185713978 shutdown_connections
2026-03-09T19:27:55.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.823+0000 7f39dffff640 1 --2- 192.168.123.107:0/185713978 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f39e0072420 0x7f39e0077190 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:27:55.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.823+0000 7f39dffff640 1 --2- 192.168.123.107:0/185713978 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f39e0071a50 0x7f39e0071e50 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:27:55.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.823+0000 7f39dffff640 1 -- 192.168.123.107:0/185713978 >> 192.168.123.107:0/185713978 conn(0x7f39e006d4f0 msgr2=0x7f39e006f930 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:27:55.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.823+0000 7f39dffff640 1 -- 192.168.123.107:0/185713978 shutdown_connections
2026-03-09T19:27:55.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.823+0000 7f39dffff640 1 -- 192.168.123.107:0/185713978 wait complete.
2026-03-09T19:27:55.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.824+0000 7f39dffff640 1 Processor -- start
2026-03-09T19:27:55.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.824+0000 7f39dffff640 1 -- start start
2026-03-09T19:27:55.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.825+0000 7f39dffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f39e0072420 0x7f39e00840e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:27:55.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.825+0000 7f39dffff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f39e0082730 0x7f39e0082bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:27:55.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.825+0000 7f39dffff640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f39e0084620 con 0x7f39e0072420
2026-03-09T19:27:55.824 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.825+0000 7f39dffff640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f39e00830f0 con 0x7f39e0082730
2026-03-09T19:27:55.825 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.825+0000 7f39de7fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f39e0082730 0x7f39e0082bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:27:55.825 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.825+0000 7f39de7fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f39e0082730 0x7f39e0082bb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:56992/0 (socket says 192.168.123.107:56992)
2026-03-09T19:27:55.825 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.825+0000 7f39de7fc640 1 -- 192.168.123.107:0/3064901532 learned_addr learned my addr 192.168.123.107:0/3064901532 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:27:55.826 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.827+0000 7f39deffd640 1 --2- 192.168.123.107:0/3064901532 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f39e0072420 0x7f39e00840e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:27:55.827 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.828+0000 7f39deffd640 1 -- 192.168.123.107:0/3064901532 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f39e0082730 msgr2=0x7f39e0082bb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:27:55.827 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.828+0000 7f39deffd640 1 --2- 192.168.123.107:0/3064901532 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f39e0082730 0x7f39e0082bb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:27:55.827 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.828+0000 7f39deffd640 1 -- 192.168.123.107:0/3064901532 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f39d00106e0 con 0x7f39e0072420
2026-03-09T19:27:55.827 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.828+0000 7f39deffd640 1 --2- 192.168.123.107:0/3064901532 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f39e0072420 0x7f39e00840e0 secure :-1 s=READY pgs=345 cs=0 l=1 rev1=1 crypto rx=0x7f39d0010a30 tx=0x7f39d0002c80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:27:55.827 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.829+0000 7f39bffff640 1 -- 192.168.123.107:0/3064901532 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f39d0004700 con 0x7f39e0072420
2026-03-09T19:27:55.828 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.829+0000 7f39dffff640 1 -- 192.168.123.107:0/3064901532 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f39e0083370 con 0x7f39e0072420
2026-03-09T19:27:55.828 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.829+0000 7f39dffff640 1 -- 192.168.123.107:0/3064901532 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f39e01b5bc0 con 0x7f39e0072420
2026-03-09T19:27:55.828 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.830+0000 7f39bffff640 1 -- 192.168.123.107:0/3064901532 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f39d0044430 con 0x7f39e0072420
2026-03-09T19:27:55.828 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.830+0000 7f39bffff640 1 -- 192.168.123.107:0/3064901532 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f39d003b5e0 con 0x7f39e0072420
2026-03-09T19:27:55.829 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.830+0000 7f39dffff640 1 -- 192.168.123.107:0/3064901532 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f39e007a810 con 0x7f39e0072420
2026-03-09T19:27:55.831 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.832+0000 7f39bffff640 1 -- 192.168.123.107:0/3064901532 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 24) v1 ==== 99330+0+0 (secure 0 0 0) 0x7f39d0044a30 con 0x7f39e0072420
2026-03-09T19:27:55.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.835+0000 7f39bffff640 1 --2- 192.168.123.107:0/3064901532 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f39c4076e20 0x7f39c40792e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:27:55.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.835+0000 7f39bffff640 1 -- 192.168.123.107:0/3064901532 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f39d0002e40 con 0x7f39e0072420
2026-03-09T19:27:55.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.835+0000 7f39de7fc640 1 --2- 192.168.123.107:0/3064901532 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f39c4076e20 0x7f39c40792e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:27:55.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.836+0000 7f39bffff640 1 -- 192.168.123.107:0/3064901532 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f39d008a8f0 con 0x7f39e0072420
2026-03-09T19:27:55.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.836+0000 7f39de7fc640 1 --2- 192.168.123.107:0/3064901532 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f39c4076e20 0x7f39c40792e0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f39e0083e60 tx=0x7f39d800d040 comp rx=0 tx=0).ready entity=mgr.14696 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:27:55.839 INFO:tasks.workunit.client.0.vm07.stdout:7/701: getdents d0/d4/d5/d99 0
2026-03-09T19:27:55.840 INFO:tasks.workunit.client.0.vm07.stdout:7/702: chown d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/f71 1 1
2026-03-09T19:27:55.841 INFO:tasks.workunit.client.0.vm07.stdout:7/703: stat d0/d80/ce6 0
2026-03-09T19:27:55.860 INFO:tasks.workunit.client.0.vm07.stdout:6/693: dread d0/d1/fa [4194304,4194304] 0
2026-03-09T19:27:55.867 INFO:tasks.workunit.client.0.vm07.stdout:2/788: creat d3/f117 x:0 0 0
2026-03-09T19:27:55.868 INFO:tasks.workunit.client.0.vm07.stdout:1/748: rename d1/c1a to d1/d11/d37/d3f/d6e/d9c/cf9 0
2026-03-09T19:27:55.868 INFO:tasks.workunit.client.0.vm07.stdout:5/714: creat d3/d1a/fe4 x:0 0 0
2026-03-09T19:27:55.869 INFO:tasks.workunit.client.0.vm07.stdout:3/800: read d1/d3d/d47/db3/f6b [716929,46974] 0
2026-03-09T19:27:55.871 INFO:tasks.workunit.client.0.vm07.stdout:0/718: creat d0/d6/d13/d1c/d52/fed x:0 0 0
2026-03-09T19:27:55.873 INFO:tasks.workunit.client.0.vm07.stdout:9/734: mkdir d0/d6/d3a/d81/d101 0
2026-03-09T19:27:55.879 INFO:tasks.workunit.client.0.vm07.stdout:2/789: symlink d3/dd/d16/d29/d3c/d4c/l118 0
2026-03-09T19:27:55.882 INFO:tasks.workunit.client.0.vm07.stdout:1/749: rename d1/d3e/dc8/lf0 to d1/d11/d37/d3f/d7e/dad/lfa 0
2026-03-09T19:27:55.888 INFO:tasks.workunit.client.0.vm07.stdout:3/801: rmdir d1/d3d 39
2026-03-09T19:27:55.891 INFO:tasks.workunit.client.0.vm07.stdout:0/719:
fdatasync d0/d6/f4f 0 2026-03-09T19:27:55.898 INFO:tasks.workunit.client.0.vm07.stdout:9/735: mkdir d0/db/d29/d68/d99/d102 0 2026-03-09T19:27:55.904 INFO:tasks.workunit.client.0.vm07.stdout:2/790: rmdir d3/dd/d16/d30/d40 39 2026-03-09T19:27:55.908 INFO:tasks.workunit.client.0.vm07.stdout:2/791: dwrite d3/fc [0,4194304] 0 2026-03-09T19:27:55.909 INFO:tasks.workunit.client.0.vm07.stdout:2/792: stat d3/dd/d16/d29/d3c/c8d 0 2026-03-09T19:27:55.917 INFO:tasks.workunit.client.0.vm07.stdout:8/752: dwrite d7/d50/f6f [0,4194304] 0 2026-03-09T19:27:55.917 INFO:tasks.workunit.client.1.vm08.stdout:2/844: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/d99/dd4/fdc [0,4194304] 0 2026-03-09T19:27:55.945 INFO:tasks.workunit.client.0.vm07.stdout:2/793: creat d3/dd/d103/ddd/ded/f119 x:0 0 0 2026-03-09T19:27:55.946 INFO:tasks.workunit.client.0.vm07.stdout:2/794: write d3/fc [3600471,86853] 0 2026-03-09T19:27:55.946 INFO:tasks.workunit.client.0.vm07.stdout:3/802: mkdir d1/d6/d4c/dfa/d103 0 2026-03-09T19:27:55.951 INFO:tasks.workunit.client.0.vm07.stdout:8/753: creat d7/d9/d10/d44/d9a/f10f x:0 0 0 2026-03-09T19:27:55.953 INFO:tasks.workunit.client.0.vm07.stdout:5/715: rmdir d3/dd/d95 39 2026-03-09T19:27:55.954 INFO:tasks.workunit.client.0.vm07.stdout:9/736: link d0/db/d29/d68/f6b d0/d6f/d86/f103 0 2026-03-09T19:27:55.954 INFO:tasks.workunit.client.0.vm07.stdout:5/716: chown d3/dd/d26/d3f/d47/d56 5 1 2026-03-09T19:27:55.955 INFO:tasks.workunit.client.0.vm07.stdout:6/694: link d0/d1/db/d24/l25 d0/d4e/l110 0 2026-03-09T19:27:55.958 INFO:tasks.workunit.client.0.vm07.stdout:8/754: dwrite d7/d9/f87 [0,4194304] 0 2026-03-09T19:27:55.958 INFO:tasks.workunit.client.1.vm08.stdout:4/892: mkdir da/d10/d1b/d10c 0 2026-03-09T19:27:55.962 INFO:tasks.workunit.client.0.vm07.stdout:1/750: creat d1/d11/d37/d3f/d45/ffb x:0 0 0 2026-03-09T19:27:55.968 INFO:tasks.workunit.client.0.vm07.stdout:5/717: dwrite d3/f68 [4194304,4194304] 0 2026-03-09T19:27:55.988 INFO:tasks.workunit.client.1.vm08.stdout:8/920: rmdir 
de/d1d/d2e 39 2026-03-09T19:27:55.990 INFO:tasks.workunit.client.1.vm08.stdout:2/845: creat d3/d4/d23/d2c/d39/db9/f121 x:0 0 0 2026-03-09T19:27:55.991 INFO:tasks.workunit.client.1.vm08.stdout:2/846: chown d3/d4/d23/d2c/d39/d5e/d87/ff8 981157 1 2026-03-09T19:27:55.993 INFO:tasks.workunit.client.0.vm07.stdout:9/737: fsync d0/db/d29/d2c/f4a 0 2026-03-09T19:27:55.995 INFO:tasks.workunit.client.0.vm07.stdout:9/738: read d0/d6/d3a/dd3/ff9 [385350,93881] 0 2026-03-09T19:27:55.995 INFO:tasks.workunit.client.1.vm08.stdout:6/948: creat d3/d34/f160 x:0 0 0 2026-03-09T19:27:55.996 INFO:tasks.workunit.client.0.vm07.stdout:1/751: fsync d1/db/d31/d4f/f77 0 2026-03-09T19:27:55.997 INFO:tasks.workunit.client.1.vm08.stdout:9/925: getdents d0/d1b/d97/d48/d6f/df5 0 2026-03-09T19:27:55.998 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:55.999+0000 7f39dffff640 1 -- 192.168.123.107:0/3064901532 --> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f39e0075b90 con 0x7f39c4076e20 2026-03-09T19:27:55.999 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.000+0000 7f39bffff640 1 -- 192.168.123.107:0/3064901532 <== mgr.14696 v2:192.168.123.108:6828/502005203 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+318 (secure 0 0 0) 0x7f39e0075b90 con 0x7f39c4076e20 2026-03-09T19:27:56.001 INFO:tasks.workunit.client.0.vm07.stdout:7/704: write d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/f71 [691401,94095] 0 2026-03-09T19:27:56.003 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.004+0000 7f39bdffb640 1 -- 192.168.123.107:0/3064901532 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f39c4076e20 msgr2=0x7f39c40792e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.003 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.005+0000 7f39bdffb640 1 --2- 192.168.123.107:0/3064901532 >> 
[v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f39c4076e20 0x7f39c40792e0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f39e0083e60 tx=0x7f39d800d040 comp rx=0 tx=0).stop 2026-03-09T19:27:56.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.005+0000 7f39bdffb640 1 -- 192.168.123.107:0/3064901532 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f39e0072420 msgr2=0x7f39e00840e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.005+0000 7f39bdffb640 1 --2- 192.168.123.107:0/3064901532 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f39e0072420 0x7f39e00840e0 secure :-1 s=READY pgs=345 cs=0 l=1 rev1=1 crypto rx=0x7f39d0010a30 tx=0x7f39d0002c80 comp rx=0 tx=0).stop 2026-03-09T19:27:56.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.005+0000 7f39bdffb640 1 -- 192.168.123.107:0/3064901532 shutdown_connections 2026-03-09T19:27:56.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.005+0000 7f39bdffb640 1 --2- 192.168.123.107:0/3064901532 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f39c4076e20 0x7f39c40792e0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.004 INFO:tasks.workunit.client.0.vm07.stdout:2/795: mknod d3/dd/d16/d30/c11a 0 2026-03-09T19:27:56.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.005+0000 7f39bdffb640 1 --2- 192.168.123.107:0/3064901532 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f39e0082730 0x7f39e0082bb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.005+0000 7f39bdffb640 1 --2- 192.168.123.107:0/3064901532 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f39e0072420 0x7f39e00840e0 
unknown :-1 s=CLOSED pgs=345 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.006+0000 7f39bdffb640 1 -- 192.168.123.107:0/3064901532 >> 192.168.123.107:0/3064901532 conn(0x7f39e006d4f0 msgr2=0x7f39e0075480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:56.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.006+0000 7f39bdffb640 1 -- 192.168.123.107:0/3064901532 shutdown_connections 2026-03-09T19:27:56.005 INFO:tasks.workunit.client.1.vm08.stdout:8/921: symlink de/d25/d87/dc9/dd8/l13a 0 2026-03-09T19:27:56.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.006+0000 7f39bdffb640 1 -- 192.168.123.107:0/3064901532 wait complete. 2026-03-09T19:27:56.007 INFO:tasks.workunit.client.0.vm07.stdout:3/803: dread d1/d6/d45/dac/feb [0,4194304] 0 2026-03-09T19:27:56.016 INFO:tasks.workunit.client.1.vm08.stdout:2/847: dread - d3/d9/fd2 zero size 2026-03-09T19:27:56.016 INFO:tasks.workunit.client.0.vm07.stdout:0/720: getdents d0/d6/d13 0 2026-03-09T19:27:56.017 INFO:tasks.workunit.client.0.vm07.stdout:6/695: sync 2026-03-09T19:27:56.022 INFO:tasks.workunit.client.0.vm07.stdout:9/739: dread - d0/db/fda zero size 2026-03-09T19:27:56.025 INFO:tasks.workunit.client.0.vm07.stdout:1/752: truncate d1/db/d31/dca/fa7 6079788 0 2026-03-09T19:27:56.028 INFO:tasks.workunit.client.1.vm08.stdout:2/848: truncate d3/d9/d4a/f59 1711561 0 2026-03-09T19:27:56.030 INFO:tasks.workunit.client.1.vm08.stdout:0/953: getdents dd/d22/d27/d2e/d37 0 2026-03-09T19:27:56.031 INFO:tasks.workunit.client.0.vm07.stdout:5/718: dread - d3/d1a/d28/d36/f63 zero size 2026-03-09T19:27:56.032 INFO:tasks.workunit.client.0.vm07.stdout:1/753: sync 2026-03-09T19:27:56.033 INFO:tasks.workunit.client.0.vm07.stdout:1/754: fsync d1/d11/d37/d3f/d6e/d9c/ff7 0 2026-03-09T19:27:56.044 INFO:tasks.workunit.client.1.vm08.stdout:4/893: creat da/f10d x:0 0 0 2026-03-09T19:27:56.052 
INFO:tasks.workunit.client.1.vm08.stdout:2/849: fdatasync d3/d4/d23/d2c/d39/da3/fba 0 2026-03-09T19:27:56.061 INFO:tasks.workunit.client.0.vm07.stdout:9/740: fdatasync d0/db/d29/d4d/fa5 0 2026-03-09T19:27:56.062 INFO:tasks.workunit.client.0.vm07.stdout:4/722: rmdir d3/d11/d2b 39 2026-03-09T19:27:56.070 INFO:tasks.workunit.client.0.vm07.stdout:7/705: creat d0/d4/d5/d99/feb x:0 0 0 2026-03-09T19:27:56.073 INFO:tasks.workunit.client.0.vm07.stdout:5/719: read d3/d1a/d5d/f78 [164964,18285] 0 2026-03-09T19:27:56.074 INFO:tasks.workunit.client.1.vm08.stdout:8/922: mknod de/d1d/d2e/d5f/c13b 0 2026-03-09T19:27:56.074 INFO:tasks.workunit.client.1.vm08.stdout:8/923: stat de/d47/dfd/d99/da5/db3 0 2026-03-09T19:27:56.074 INFO:tasks.workunit.client.0.vm07.stdout:5/720: write d3/dd/dbe/fcd [915531,570] 0 2026-03-09T19:27:56.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.078+0000 7f248cdbe640 1 -- 192.168.123.107:0/729785135 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2480098a40 msgr2=0x7f2480098ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.078+0000 7f248cdbe640 1 --2- 192.168.123.107:0/729785135 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2480098a40 0x7f2480098ec0 secure :-1 s=READY pgs=346 cs=0 l=1 rev1=1 crypto rx=0x7f24780099b0 tx=0x7f247802f220 comp rx=0 tx=0).stop 2026-03-09T19:27:56.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.079+0000 7f248cdbe640 1 -- 192.168.123.107:0/729785135 shutdown_connections 2026-03-09T19:27:56.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.079+0000 7f248cdbe640 1 --2- 192.168.123.107:0/729785135 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2480098a40 0x7f2480098ec0 unknown :-1 s=CLOSED pgs=346 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.079 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.079+0000 7f248cdbe640 1 --2- 192.168.123.107:0/729785135 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2480097840 0x7f2480097c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.079+0000 7f248cdbe640 1 -- 192.168.123.107:0/729785135 >> 192.168.123.107:0/729785135 conn(0x7f2480092ff0 msgr2=0x7f2480095410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:56.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.080+0000 7f248cdbe640 1 -- 192.168.123.107:0/729785135 shutdown_connections 2026-03-09T19:27:56.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.080+0000 7f248cdbe640 1 -- 192.168.123.107:0/729785135 wait complete. 2026-03-09T19:27:56.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.081+0000 7f248cdbe640 1 Processor -- start 2026-03-09T19:27:56.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.081+0000 7f248cdbe640 1 -- start start 2026-03-09T19:27:56.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.081+0000 7f248cdbe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2480097840 0x7f248012f1c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.081+0000 7f248cdbe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2480098a40 0x7f248012f700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.081+0000 7f248cdbe640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f248012fcd0 con 0x7f2480097840 2026-03-09T19:27:56.080 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.081+0000 7f248cdbe640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f248012fe40 con 0x7f2480098a40 2026-03-09T19:27:56.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.081+0000 7f24877fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2480097840 0x7f248012f1c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:56.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.081+0000 7f24877fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2480097840 0x7f248012f1c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:39224/0 (socket says 192.168.123.107:39224) 2026-03-09T19:27:56.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.081+0000 7f24877fe640 1 -- 192.168.123.107:0/664035234 learned_addr learned my addr 192.168.123.107:0/664035234 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:56.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.082+0000 7f2486ffd640 1 --2- 192.168.123.107:0/664035234 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2480098a40 0x7f248012f700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:56.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.082+0000 7f24877fe640 1 -- 192.168.123.107:0/664035234 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2480098a40 msgr2=0x7f248012f700 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.082+0000 7f24877fe640 1 --2- 
192.168.123.107:0/664035234 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2480098a40 0x7f248012f700 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.082+0000 7f24877fe640 1 -- 192.168.123.107:0/664035234 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2478009660 con 0x7f2480097840 2026-03-09T19:27:56.083 INFO:tasks.workunit.client.1.vm08.stdout:5/918: write d16/d1e/d9f/fd3 [3059680,112893] 0 2026-03-09T19:27:56.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.082+0000 7f24877fe640 1 --2- 192.168.123.107:0/664035234 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2480097840 0x7f248012f1c0 secure :-1 s=READY pgs=347 cs=0 l=1 rev1=1 crypto rx=0x7f247400ece0 tx=0x7f247400c6a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:56.083 INFO:tasks.workunit.client.0.vm07.stdout:0/721: truncate d0/d6/d13/d17/d19/f1f 1387284 0 2026-03-09T19:27:56.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.082+0000 7f2484ff9640 1 -- 192.168.123.107:0/664035234 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2474009800 con 0x7f2480097840 2026-03-09T19:27:56.084 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.083+0000 7f2484ff9640 1 -- 192.168.123.107:0/664035234 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f247400eea0 con 0x7f2480097840 2026-03-09T19:27:56.084 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.083+0000 7f248cdbe640 1 -- 192.168.123.107:0/664035234 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f24801348e0 con 0x7f2480097840 2026-03-09T19:27:56.084 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.083+0000 7f248cdbe640 1 -- 192.168.123.107:0/664035234 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2480134e30 con 0x7f2480097840 2026-03-09T19:27:56.084 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.083+0000 7f2484ff9640 1 -- 192.168.123.107:0/664035234 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2474010640 con 0x7f2480097840 2026-03-09T19:27:56.084 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.085+0000 7f2484ff9640 1 -- 192.168.123.107:0/664035234 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 24) v1 ==== 99330+0+0 (secure 0 0 0) 0x7f24740107a0 con 0x7f2480097840 2026-03-09T19:27:56.085 INFO:tasks.workunit.client.0.vm07.stdout:0/722: sync 2026-03-09T19:27:56.086 INFO:tasks.workunit.client.1.vm08.stdout:4/894: symlink da/d10/d26/d27/da6/dc9/l10e 0 2026-03-09T19:27:56.087 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.087+0000 7f2484ff9640 1 --2- 192.168.123.107:0/664035234 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f245c07f790 0x7f245c081c50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.087 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.088+0000 7f2484ff9640 1 -- 192.168.123.107:0/664035234 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f2474014070 con 0x7f2480097840 2026-03-09T19:27:56.087 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.088+0000 7f248cdbe640 1 -- 192.168.123.107:0/664035234 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f244c005350 con 0x7f2480097840 2026-03-09T19:27:56.092 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.090+0000 7f2486ffd640 1 --2- 
192.168.123.107:0/664035234 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f245c07f790 0x7f245c081c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:56.092 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.091+0000 7f2486ffd640 1 --2- 192.168.123.107:0/664035234 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f245c07f790 0x7f245c081c50 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f24801306e0 tx=0x7f247803a040 comp rx=0 tx=0).ready entity=mgr.14696 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:56.092 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.093+0000 7f2484ff9640 1 -- 192.168.123.107:0/664035234 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f2474062c40 con 0x7f2480097840 2026-03-09T19:27:56.094 INFO:tasks.workunit.client.0.vm07.stdout:6/696: link d0/d1/db/f9d d0/d1/db/d52/f111 0 2026-03-09T19:27:56.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:55 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:56.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:55 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:56.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:55 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:56.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:55 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:56.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:55 vm08.local ceph-mon[57794]: pgmap v7: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 34 MiB/s rd, 74 MiB/s wr, 214 op/s 2026-03-09T19:27:56.095 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:55 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:56.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:55 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:56.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:55 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:27:56.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:55 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:27:56.103 INFO:tasks.workunit.client.0.vm07.stdout:8/755: truncate d7/d50/da6/dc5/f103 3903213 0 2026-03-09T19:27:56.104 INFO:tasks.workunit.client.0.vm07.stdout:8/756: read d7/d50/f8f [1567564,97336] 0 2026-03-09T19:27:56.106 INFO:tasks.workunit.client.1.vm08.stdout:6/949: dwrite d3/d15/f64 [0,4194304] 0 2026-03-09T19:27:56.107 INFO:tasks.workunit.client.1.vm08.stdout:6/950: readlink d3/d34/d3b/l73 0 2026-03-09T19:27:56.122 INFO:tasks.workunit.client.1.vm08.stdout:4/895: rename da/d10/d26/d3a/d91/ff0 to da/d10/d16/d28/d2f/d4f/d56/dd0/f10f 0 2026-03-09T19:27:56.132 INFO:tasks.workunit.client.0.vm07.stdout:7/706: rename d0/d4/d5/d8/dcd/fdb to d0/d80/db1/de5/fec 0 2026-03-09T19:27:56.137 INFO:tasks.workunit.client.1.vm08.stdout:0/954: dwrite dd/d22/d27/d4f/fd7 [0,4194304] 0 2026-03-09T19:27:56.143 INFO:tasks.workunit.client.1.vm08.stdout:5/919: mknod d16/d1e/d3b/d61/d11e/c12c 0 2026-03-09T19:27:56.143 INFO:tasks.workunit.client.1.vm08.stdout:5/920: chown d16/d45/c6c 121 1 2026-03-09T19:27:56.158 INFO:tasks.workunit.client.1.vm08.stdout:9/926: write d0/d2/d14/d98/d99/ff9 [1724875,127066] 0 2026-03-09T19:27:56.160 INFO:tasks.workunit.client.0.vm07.stdout:0/723: 
fdatasync d0/d6/d13/d17/d19/d57/d6a/fd8 0 2026-03-09T19:27:56.160 INFO:tasks.workunit.client.1.vm08.stdout:9/927: write d0/d2/d14/d98/d99/ff9 [2110588,17432] 0 2026-03-09T19:27:56.170 INFO:tasks.workunit.client.0.vm07.stdout:2/796: truncate d3/d11/f39 545255 0 2026-03-09T19:27:56.174 INFO:tasks.workunit.client.0.vm07.stdout:6/697: dread d0/d13/f57 [0,4194304] 0 2026-03-09T19:27:56.177 INFO:tasks.workunit.client.1.vm08.stdout:4/896: symlink da/d10/d16/d28/d2f/d4f/d103/d40/d6c/l110 0 2026-03-09T19:27:56.178 INFO:tasks.workunit.client.0.vm07.stdout:9/741: dwrite d0/db/d29/d32/d5c/d69/fc9 [0,4194304] 0 2026-03-09T19:27:56.178 INFO:tasks.workunit.client.1.vm08.stdout:0/955: unlink dd/d22/d27/d11e/d78/db4/f122 0 2026-03-09T19:27:56.178 INFO:tasks.workunit.client.1.vm08.stdout:2/850: truncate d3/d4/d3e/d4e/d88/db0/ff3 21015 0 2026-03-09T19:27:56.182 INFO:tasks.workunit.client.1.vm08.stdout:8/924: dwrite de/d47/dfd/d99/da5/db3/f2d [0,4194304] 0 2026-03-09T19:27:56.208 INFO:tasks.workunit.client.0.vm07.stdout:4/723: mkdir d3/d11/d16/df5/dfe 0 2026-03-09T19:27:56.213 INFO:tasks.workunit.client.1.vm08.stdout:9/928: fsync d0/d1b/de9/d12a/da2/fdd 0 2026-03-09T19:27:56.216 INFO:tasks.workunit.client.1.vm08.stdout:2/851: rmdir d3/d4/d23/d2c/d39/d5e/d87 39 2026-03-09T19:27:56.219 INFO:tasks.workunit.client.0.vm07.stdout:4/724: sync 2026-03-09T19:27:56.219 INFO:tasks.workunit.client.1.vm08.stdout:0/956: write dd/d22/d27/f91 [3792638,114465] 0 2026-03-09T19:27:56.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.221+0000 7f248cdbe640 1 -- 192.168.123.107:0/664035234 --> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f244c002bf0 con 0x7f245c07f790 2026-03-09T19:27:56.228 INFO:tasks.workunit.client.0.vm07.stdout:3/804: link d1/d3d/f95 d1/d3d/d47/db3/f104 0 2026-03-09T19:27:56.230 INFO:tasks.workunit.client.0.vm07.stdout:3/805: dread d1/d6/dd/f8a [0,4194304] 0 
2026-03-09T19:27:56.231 INFO:tasks.workunit.client.0.vm07.stdout:3/806: chown d1/d3d/d47/cc9 1511475311 1 2026-03-09T19:27:56.231 INFO:tasks.workunit.client.1.vm08.stdout:5/921: dread d16/d1e/d3b/f50 [0,4194304] 0 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.238+0000 7f2484ff9640 1 -- 192.168.123.107:0/664035234 <== mgr.14696 v2:192.168.123.108:6828/502005203 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f244c002bf0 con 0x7f245c07f790 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (4m) 1s ago 5m 22.6M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (5m) 1s ago 5m 8606k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (5m) 2s ago 5m 9.90M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (5m) 1s ago 5m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2799ea3e4bf3 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (5m) 2s ago 5m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 81fc95c210b6 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (4m) 1s ago 5m 82.1M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (3m) 1s ago 3m 15.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (3m) 1s ago 3m 17.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 
2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (3m) 2s ago 3m 26.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (3m) 2s ago 3m 224M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:9283,8765,8443 running (6m) 1s ago 6m 488M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 706e626ecd10 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (20s) 2s ago 4m 563M - 19.2.3-678-ge911bdeb 654f31e6858e c6797644a5bd 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (6m) 1s ago 6m 53.7M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ccb644205fb3 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (4m) 2s ago 4m 44.4M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 8d7b1da9e1e2 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (5m) 1s ago 5m 14.1M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (4m) 2s ago 4m 16.1M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (4m) 1s ago 4m 336M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d7417e3377af 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (4m) 1s ago 4m 384M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (4m) 1s ago 4m 321M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (4m) 2s ago 4m 469M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 
2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (4m) 2s ago 4m 408M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (3m) 2s ago 3m 394M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:27:56.238 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (4m) 1s ago 5m 53.3M - 2.43.0 a07b618ecd1d 238baaac36ff 2026-03-09T19:27:56.238 INFO:tasks.workunit.client.1.vm08.stdout:4/897: mknod da/d10/d26/d3a/d69/df1/c111 0 2026-03-09T19:27:56.240 INFO:tasks.workunit.client.0.vm07.stdout:0/724: truncate d0/d6/d13/d17/dc3/f7d 749733 0 2026-03-09T19:27:56.241 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.242+0000 7f248cdbe640 1 -- 192.168.123.107:0/664035234 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f245c07f790 msgr2=0x7f245c081c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.241 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.242+0000 7f248cdbe640 1 --2- 192.168.123.107:0/664035234 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f245c07f790 0x7f245c081c50 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f24801306e0 tx=0x7f247803a040 comp rx=0 tx=0).stop 2026-03-09T19:27:56.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.242+0000 7f248cdbe640 1 -- 192.168.123.107:0/664035234 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2480097840 msgr2=0x7f248012f1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.242+0000 7f248cdbe640 1 --2- 192.168.123.107:0/664035234 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2480097840 0x7f248012f1c0 secure :-1 s=READY pgs=347 cs=0 l=1 rev1=1 crypto rx=0x7f247400ece0 tx=0x7f247400c6a0 comp rx=0 
tx=0).stop 2026-03-09T19:27:56.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.243+0000 7f248cdbe640 1 -- 192.168.123.107:0/664035234 shutdown_connections 2026-03-09T19:27:56.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.243+0000 7f248cdbe640 1 --2- 192.168.123.107:0/664035234 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f245c07f790 0x7f245c081c50 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.243+0000 7f248cdbe640 1 --2- 192.168.123.107:0/664035234 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2480098a40 0x7f248012f700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.243+0000 7f248cdbe640 1 --2- 192.168.123.107:0/664035234 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2480097840 0x7f248012f1c0 unknown :-1 s=CLOSED pgs=347 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.243+0000 7f248cdbe640 1 -- 192.168.123.107:0/664035234 >> 192.168.123.107:0/664035234 conn(0x7f2480092ff0 msgr2=0x7f2480094ab0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:56.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.243+0000 7f248cdbe640 1 -- 192.168.123.107:0/664035234 shutdown_connections 2026-03-09T19:27:56.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.243+0000 7f248cdbe640 1 -- 192.168.123.107:0/664035234 wait complete. 
2026-03-09T19:27:56.244 INFO:tasks.workunit.client.1.vm08.stdout:8/925: fsync de/d25/d31/d82/f96 0 2026-03-09T19:27:56.249 INFO:tasks.workunit.client.1.vm08.stdout:6/951: link d3/db/d43/f71 d3/d34/d3b/df5/f161 0 2026-03-09T19:27:56.257 INFO:tasks.workunit.client.0.vm07.stdout:6/698: mkdir d0/d1/db/d52/d94/d87/d112 0 2026-03-09T19:27:56.270 INFO:tasks.workunit.client.0.vm07.stdout:9/742: symlink d0/d6/d3a/d94/l104 0 2026-03-09T19:27:56.277 INFO:tasks.workunit.client.1.vm08.stdout:9/929: dread d0/d1b/d97/d48/d5d/d74/ded/fa1 [0,4194304] 0 2026-03-09T19:27:56.281 INFO:tasks.workunit.client.1.vm08.stdout:9/930: dwrite d0/d1b/d97/d48/d5d/f92 [0,4194304] 0 2026-03-09T19:27:56.287 INFO:tasks.workunit.client.1.vm08.stdout:9/931: read - d0/d2/d14/d98/f10b zero size 2026-03-09T19:27:56.288 INFO:tasks.workunit.client.1.vm08.stdout:9/932: chown d0/d1b/d97/d48/d5d/lfb 49 1 2026-03-09T19:27:56.288 INFO:tasks.workunit.client.0.vm07.stdout:7/707: creat d0/d80/db1/de5/d54/dc4/fed x:0 0 0 2026-03-09T19:27:56.291 INFO:tasks.workunit.client.1.vm08.stdout:5/922: symlink d16/d8e/dd5/dfa/l12d 0 2026-03-09T19:27:56.296 INFO:tasks.workunit.client.0.vm07.stdout:1/755: getdents d1/d11/d37/d3f/d6e/d9c 0 2026-03-09T19:27:56.298 INFO:tasks.workunit.client.1.vm08.stdout:4/898: symlink da/d10/d26/d3a/d91/l112 0 2026-03-09T19:27:56.313 INFO:tasks.workunit.client.1.vm08.stdout:2/852: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/f2d [8388608,4194304] 0 2026-03-09T19:27:56.319 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.319+0000 7f68c1ca0640 1 -- 192.168.123.107:0/1480621976 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68b40a4350 msgr2=0x7f68b40a47d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.319 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.319+0000 7f68c1ca0640 1 --2- 192.168.123.107:0/1480621976 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68b40a4350 0x7f68b40a47d0 secure :-1 s=READY pgs=348 cs=0 
l=1 rev1=1 crypto rx=0x7f68bc066a00 tx=0x7f68bc092a10 comp rx=0 tx=0).stop 2026-03-09T19:27:56.319 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.320+0000 7f68c1ca0640 1 -- 192.168.123.107:0/1480621976 shutdown_connections 2026-03-09T19:27:56.319 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.320+0000 7f68c1ca0640 1 --2- 192.168.123.107:0/1480621976 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68b40a4350 0x7f68b40a47d0 unknown :-1 s=CLOSED pgs=348 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.319 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.320+0000 7f68c1ca0640 1 --2- 192.168.123.107:0/1480621976 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f68b40a5d10 0x7f68b40a6110 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.319 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.320+0000 7f68c1ca0640 1 -- 192.168.123.107:0/1480621976 >> 192.168.123.107:0/1480621976 conn(0x7f68b409fea0 msgr2=0x7f68b40a2300 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:56.320 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.320+0000 7f68c1ca0640 1 -- 192.168.123.107:0/1480621976 shutdown_connections 2026-03-09T19:27:56.321 INFO:tasks.workunit.client.0.vm07.stdout:3/807: symlink d1/d6/d45/d54/dd1/l105 0 2026-03-09T19:27:56.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.320+0000 7f68c1ca0640 1 -- 192.168.123.107:0/1480621976 wait complete. 
2026-03-09T19:27:56.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.321+0000 7f68c1ca0640 1 Processor -- start 2026-03-09T19:27:56.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.321+0000 7f68c1ca0640 1 -- start start 2026-03-09T19:27:56.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.321+0000 7f68c1ca0640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f68b40a5d10 0x7f68b40cfea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.321+0000 7f68c1ca0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68b40d1850 0x7f68b40d03e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.321+0000 7f68c1ca0640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f68b40d0920 con 0x7f68b40d1850 2026-03-09T19:27:56.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.321+0000 7f68c1ca0640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f68b40d0a90 con 0x7f68b40a5d10 2026-03-09T19:27:56.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.321+0000 7f68bbfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68b40d1850 0x7f68b40d03e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:56.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.322+0000 7f68bbfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68b40d1850 0x7f68b40d03e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:39252/0 (socket says 192.168.123.107:39252) 2026-03-09T19:27:56.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.322+0000 7f68bbfff640 1 -- 192.168.123.107:0/4059703852 learned_addr learned my addr 192.168.123.107:0/4059703852 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:56.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.322+0000 7f68c0c9e640 1 --2- 192.168.123.107:0/4059703852 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f68b40a5d10 0x7f68b40cfea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:56.323 INFO:tasks.workunit.client.1.vm08.stdout:0/957: write dd/d31/d132/f125 [1049815,11412] 0 2026-03-09T19:27:56.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.323+0000 7f68bbfff640 1 -- 192.168.123.107:0/4059703852 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f68b40a5d10 msgr2=0x7f68b40cfea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.323+0000 7f68bbfff640 1 --2- 192.168.123.107:0/4059703852 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f68b40a5d10 0x7f68b40cfea0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.323+0000 7f68bbfff640 1 -- 192.168.123.107:0/4059703852 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f68bc04f090 con 0x7f68b40d1850 2026-03-09T19:27:56.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.323+0000 7f68bbfff640 1 --2- 192.168.123.107:0/4059703852 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68b40d1850 0x7f68b40d03e0 secure :-1 s=READY pgs=349 cs=0 l=1 rev1=1 crypto rx=0x7f68bc066c50 
tx=0x7f68bc093030 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:56.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.325+0000 7f68b9ffb640 1 -- 192.168.123.107:0/4059703852 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f68bc09f070 con 0x7f68b40d1850 2026-03-09T19:27:56.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.325+0000 7f68b9ffb640 1 -- 192.168.123.107:0/4059703852 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f68bc067a90 con 0x7f68b40d1850 2026-03-09T19:27:56.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.325+0000 7f68c1ca0640 1 -- 192.168.123.107:0/4059703852 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f68b40d0c60 con 0x7f68b40d1850 2026-03-09T19:27:56.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.325+0000 7f68c1ca0640 1 -- 192.168.123.107:0/4059703852 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f68b40112b0 con 0x7f68b40d1850 2026-03-09T19:27:56.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.327+0000 7f68b9ffb640 1 -- 192.168.123.107:0/4059703852 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f68bc09acb0 con 0x7f68b40d1850 2026-03-09T19:27:56.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.328+0000 7f68b9ffb640 1 -- 192.168.123.107:0/4059703852 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 24) v1 ==== 99330+0+0 (secure 0 0 0) 0x7f68bc06e310 con 0x7f68b40d1850 2026-03-09T19:27:56.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.329+0000 7f68b9ffb640 1 --2- 192.168.123.107:0/4059703852 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f6898076ef0 0x7f68980793b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.330+0000 7f68c0c9e640 1 --2- 192.168.123.107:0/4059703852 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f6898076ef0 0x7f68980793b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:56.330 INFO:tasks.workunit.client.1.vm08.stdout:6/952: creat d3/d34/da9/da4/d117/f162 x:0 0 0 2026-03-09T19:27:56.331 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.332+0000 7f68c0c9e640 1 --2- 192.168.123.107:0/4059703852 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f6898076ef0 0x7f68980793b0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f68b40a5790 tx=0x7f68b0009290 comp rx=0 tx=0).ready entity=mgr.14696 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:56.332 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.333+0000 7f68b9ffb640 1 -- 192.168.123.107:0/4059703852 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f68bc120c50 con 0x7f68b40d1850 2026-03-09T19:27:56.333 INFO:tasks.workunit.client.0.vm07.stdout:6/699: unlink d0/d1/db/d52/ff4 0 2026-03-09T19:27:56.333 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.334+0000 7f68c1ca0640 1 -- 192.168.123.107:0/4059703852 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6880005350 con 0x7f68b40d1850 2026-03-09T19:27:56.337 INFO:tasks.workunit.client.0.vm07.stdout:5/721: rename d3/dd/d26/d3f/d47/d71/fb4 to d3/dd/fe5 0 2026-03-09T19:27:56.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.340+0000 7f68b9ffb640 1 -- 192.168.123.107:0/4059703852 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f68bc120e60 con 0x7f68b40d1850 2026-03-09T19:27:56.339 INFO:tasks.workunit.client.0.vm07.stdout:1/756: fdatasync d1/d3e/db3/d6d/fb0 0 2026-03-09T19:27:56.340 INFO:tasks.workunit.client.1.vm08.stdout:8/926: rename de/d25/d31/d82/fc5 to de/d47/dfd/d99/da5/f13c 0 2026-03-09T19:27:56.346 INFO:tasks.workunit.client.1.vm08.stdout:6/953: unlink d3/d15/dc2/f15e 0 2026-03-09T19:27:56.347 INFO:tasks.workunit.client.1.vm08.stdout:6/954: write d3/d15/f64 [2651105,10615] 0 2026-03-09T19:27:56.348 INFO:tasks.workunit.client.1.vm08.stdout:8/927: dwrite de/d47/dfd/d99/dde/f10e [0,4194304] 0 2026-03-09T19:27:56.358 INFO:tasks.workunit.client.1.vm08.stdout:6/955: dwrite d3/d34/d5c/da2/dd6/ffa [4194304,4194304] 0 2026-03-09T19:27:56.358 INFO:tasks.workunit.client.1.vm08.stdout:8/928: dwrite de/d7c/fe1 [0,4194304] 0 2026-03-09T19:27:56.360 INFO:tasks.workunit.client.1.vm08.stdout:6/956: chown d3/d15/fcb 1405653 1 2026-03-09T19:27:56.376 INFO:tasks.workunit.client.1.vm08.stdout:4/899: dread da/d10/d26/d27/fac [0,4194304] 0 2026-03-09T19:27:56.382 INFO:tasks.workunit.client.0.vm07.stdout:2/797: dwrite d3/dd/d16/d29/d2d/d45/d85/d8a/f9e [0,4194304] 0 2026-03-09T19:27:56.384 INFO:tasks.workunit.client.1.vm08.stdout:9/933: write d0/d2/d80/d69/f7a [1336923,128059] 0 2026-03-09T19:27:56.395 INFO:tasks.workunit.client.0.vm07.stdout:4/725: dwrite d3/d11/d2b/d38/ddc/db2/fc8 [0,4194304] 0 2026-03-09T19:27:56.396 INFO:tasks.workunit.client.1.vm08.stdout:2/853: dwrite d3/d4/d23/d2c/d39/d5e/de/fe9 [0,4194304] 0 2026-03-09T19:27:56.397 INFO:tasks.workunit.client.1.vm08.stdout:2/854: chown d3/d4/d23/d2c/d39/d5e/de/d18/da9/d110 27302 1 2026-03-09T19:27:56.398 INFO:tasks.workunit.client.0.vm07.stdout:4/726: write d3/d4f/f5b [3214054,44945] 0 2026-03-09T19:27:56.402 INFO:tasks.workunit.client.1.vm08.stdout:9/934: read d0/f83 [138891,95918] 0 2026-03-09T19:27:56.402 INFO:tasks.workunit.client.1.vm08.stdout:9/935: chown 
d0/d1b/de9/d12a/da2/da8/c110 6853 1 2026-03-09T19:27:56.406 INFO:tasks.workunit.client.1.vm08.stdout:5/923: mkdir d16/d1e/d8c/d99/da8/d9a/d12e 0 2026-03-09T19:27:56.409 INFO:tasks.workunit.client.1.vm08.stdout:0/958: rename dd/d22/d63/d6e/df5/c127 to dd/d31/dca/c139 0 2026-03-09T19:27:56.413 INFO:tasks.workunit.client.0.vm07.stdout:4/727: read d3/d11/d29/f3c [2545458,54161] 0 2026-03-09T19:27:56.416 INFO:tasks.workunit.client.0.vm07.stdout:6/700: rmdir d0/d4e 39 2026-03-09T19:27:56.417 INFO:tasks.workunit.client.0.vm07.stdout:8/757: rename d7/d9/d10/dd8/dfd/d67/de7/lfb to d7/d30/d32/l110 0 2026-03-09T19:27:56.428 INFO:tasks.workunit.client.1.vm08.stdout:8/929: dread de/d25/d33/f41 [0,4194304] 0 2026-03-09T19:27:56.445 INFO:tasks.workunit.client.1.vm08.stdout:6/957: dread d3/f5 [0,4194304] 0 2026-03-09T19:27:56.450 INFO:tasks.workunit.client.1.vm08.stdout:4/900: rename da/d10/d26/d27/d32/f9e to da/d10/f113 0 2026-03-09T19:27:56.483 INFO:tasks.workunit.client.1.vm08.stdout:0/959: symlink dd/d22/de1/d104/l13a 0 2026-03-09T19:27:56.485 INFO:tasks.workunit.client.1.vm08.stdout:9/936: creat d0/d2/f133 x:0 0 0 2026-03-09T19:27:56.488 INFO:tasks.workunit.client.1.vm08.stdout:4/901: read da/d10/d16/d28/d2f/d4f/d103/ded/ffa [3164256,90173] 0 2026-03-09T19:27:56.489 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.490+0000 7f68c1ca0640 1 -- 192.168.123.107:0/4059703852 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f68800058d0 con 0x7f68b40d1850 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.491+0000 7f68b9ffb640 1 -- 192.168.123.107:0/4059703852 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+845 (secure 0 0 0) 0x7f68bc0e97e0 con 0x7f68b40d1850 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 
2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 1, 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 13, 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:27:56.490 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:27:56.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.493+0000 7f68c1ca0640 1 -- 
192.168.123.107:0/4059703852 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f6898076ef0 msgr2=0x7f68980793b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.493+0000 7f68c1ca0640 1 --2- 192.168.123.107:0/4059703852 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f6898076ef0 0x7f68980793b0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f68b40a5790 tx=0x7f68b0009290 comp rx=0 tx=0).stop 2026-03-09T19:27:56.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.493+0000 7f68c1ca0640 1 -- 192.168.123.107:0/4059703852 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68b40d1850 msgr2=0x7f68b40d03e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.493+0000 7f68c1ca0640 1 --2- 192.168.123.107:0/4059703852 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68b40d1850 0x7f68b40d03e0 secure :-1 s=READY pgs=349 cs=0 l=1 rev1=1 crypto rx=0x7f68bc066c50 tx=0x7f68bc093030 comp rx=0 tx=0).stop 2026-03-09T19:27:56.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.494+0000 7f68c1ca0640 1 -- 192.168.123.107:0/4059703852 shutdown_connections 2026-03-09T19:27:56.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.494+0000 7f68c1ca0640 1 --2- 192.168.123.107:0/4059703852 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f6898076ef0 0x7f68980793b0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.494+0000 7f68c1ca0640 1 --2- 192.168.123.107:0/4059703852 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68b40d1850 0x7f68b40d03e0 unknown :-1 s=CLOSED pgs=349 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-09T19:27:56.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.494+0000 7f68c1ca0640 1 --2- 192.168.123.107:0/4059703852 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f68b40a5d10 0x7f68b40cfea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.494+0000 7f68c1ca0640 1 -- 192.168.123.107:0/4059703852 >> 192.168.123.107:0/4059703852 conn(0x7f68b409fea0 msgr2=0x7f68b4006710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:56.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.494+0000 7f68c1ca0640 1 -- 192.168.123.107:0/4059703852 shutdown_connections 2026-03-09T19:27:56.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.494+0000 7f68c1ca0640 1 -- 192.168.123.107:0/4059703852 wait complete. 2026-03-09T19:27:56.495 INFO:tasks.workunit.client.1.vm08.stdout:9/937: mkdir d0/d2/d14/d98/d99/dd8/d134 0 2026-03-09T19:27:56.500 INFO:tasks.workunit.client.1.vm08.stdout:0/960: truncate dd/f18 3646079 0 2026-03-09T19:27:56.508 INFO:tasks.workunit.client.1.vm08.stdout:0/961: write dd/d22/d63/d6e/df5/f131 [879189,10842] 0 2026-03-09T19:27:56.509 INFO:tasks.workunit.client.1.vm08.stdout:9/938: mknod d0/d1b/de9/d12a/da2/da8/de8/dcd/c135 0 2026-03-09T19:27:56.509 INFO:tasks.workunit.client.1.vm08.stdout:0/962: mknod dd/d9d/c13b 0 2026-03-09T19:27:56.515 INFO:tasks.workunit.client.1.vm08.stdout:4/902: rmdir da/d10/d26/d27/da6/df9 0 2026-03-09T19:27:56.516 INFO:tasks.workunit.client.1.vm08.stdout:4/903: write da/d10/d26/d3a/d69/f104 [2627091,91903] 0 2026-03-09T19:27:56.517 INFO:tasks.workunit.client.1.vm08.stdout:9/939: dwrite d0/d2/d14/d5c/fd [8388608,4194304] 0 2026-03-09T19:27:56.519 INFO:tasks.workunit.client.1.vm08.stdout:0/963: chown dd/d31/fac 90225707 1 2026-03-09T19:27:56.528 INFO:tasks.workunit.client.1.vm08.stdout:8/930: sync 2026-03-09T19:27:56.528 
INFO:tasks.workunit.client.1.vm08.stdout:9/940: sync 2026-03-09T19:27:56.544 INFO:tasks.workunit.client.1.vm08.stdout:0/964: unlink dd/f19 0 2026-03-09T19:27:56.564 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.565+0000 7fed86f35640 1 -- 192.168.123.107:0/80876835 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fed80075ba0 msgr2=0x7fed80075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.564 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.565+0000 7fed86f35640 1 --2- 192.168.123.107:0/80876835 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fed80075ba0 0x7fed80075fa0 secure :-1 s=READY pgs=350 cs=0 l=1 rev1=1 crypto rx=0x7fed7400b0a0 tx=0x7fed7402f4a0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.564 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.565+0000 7fed86f35640 1 -- 192.168.123.107:0/80876835 shutdown_connections 2026-03-09T19:27:56.564 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.565+0000 7fed86f35640 1 --2- 192.168.123.107:0/80876835 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed80076df0 0x7fed80077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.564 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.565+0000 7fed86f35640 1 --2- 192.168.123.107:0/80876835 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fed80075ba0 0x7fed80075fa0 secure :-1 s=CLOSED pgs=350 cs=0 l=1 rev1=1 crypto rx=0x7fed7400b0a0 tx=0x7fed7402f4a0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.564 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.565+0000 7fed86f35640 1 -- 192.168.123.107:0/80876835 >> 192.168.123.107:0/80876835 conn(0x7fed800fe250 msgr2=0x7fed80100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:56.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.566+0000 7fed86f35640 1 -- 192.168.123.107:0/80876835 
shutdown_connections 2026-03-09T19:27:56.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.566+0000 7fed86f35640 1 -- 192.168.123.107:0/80876835 wait complete. 2026-03-09T19:27:56.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.566+0000 7fed86f35640 1 Processor -- start 2026-03-09T19:27:56.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.566+0000 7fed86f35640 1 -- start start 2026-03-09T19:27:56.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.567+0000 7fed86f35640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed80076df0 0x7fed8010d380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.567+0000 7fed86f35640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fed8010d8c0 0x7fed8010dd40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.567+0000 7fed86f35640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fed8010ed30 con 0x7fed8010d8c0 2026-03-09T19:27:56.566 INFO:tasks.workunit.client.1.vm08.stdout:0/965: symlink dd/d22/d24/d49/d92/l13c 0 2026-03-09T19:27:56.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.567+0000 7fed86f35640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fed801acec0 con 0x7fed80076df0 2026-03-09T19:27:56.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.567+0000 7fed7ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fed8010d8c0 0x7fed8010dd40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:56.566 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.567+0000 7fed7ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fed8010d8c0 0x7fed8010dd40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:39274/0 (socket says 192.168.123.107:39274) 2026-03-09T19:27:56.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.567+0000 7fed7ffff640 1 -- 192.168.123.107:0/837121415 learned_addr learned my addr 192.168.123.107:0/837121415 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:56.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.567+0000 7fed7ffff640 1 -- 192.168.123.107:0/837121415 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed80076df0 msgr2=0x7fed8010d380 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T19:27:56.567 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.567+0000 7fed7ffff640 1 --2- 192.168.123.107:0/837121415 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed80076df0 0x7fed8010d380 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.567 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.568+0000 7fed7ffff640 1 -- 192.168.123.107:0/837121415 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fed74009d00 con 0x7fed8010d8c0 2026-03-09T19:27:56.567 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.568+0000 7fed7ffff640 1 --2- 192.168.123.107:0/837121415 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fed8010d8c0 0x7fed8010dd40 secure :-1 s=READY pgs=351 cs=0 l=1 rev1=1 crypto rx=0x7fed7000d8d0 tx=0x7fed7000dda0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:56.567 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.568+0000 7fed7dffb640 1 -- 192.168.123.107:0/837121415 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fed70004490 con 0x7fed8010d8c0 2026-03-09T19:27:56.567 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.568+0000 7fed7dffb640 1 -- 192.168.123.107:0/837121415 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fed70004d60 con 0x7fed8010d8c0 2026-03-09T19:27:56.567 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.568+0000 7fed7dffb640 1 -- 192.168.123.107:0/837121415 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fed70005230 con 0x7fed8010d8c0 2026-03-09T19:27:56.567 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.568+0000 7fed86f35640 1 -- 192.168.123.107:0/837121415 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fed801ad060 con 0x7fed8010d8c0 2026-03-09T19:27:56.568 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.569+0000 7fed86f35640 1 -- 192.168.123.107:0/837121415 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fed801ad560 con 0x7fed8010d8c0 2026-03-09T19:27:56.569 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.570+0000 7fed86f35640 1 -- 192.168.123.107:0/837121415 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fed48005350 con 0x7fed8010d8c0 2026-03-09T19:27:56.570 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.571+0000 7fed7dffb640 1 -- 192.168.123.107:0/837121415 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 24) v1 ==== 99330+0+0 (secure 0 0 0) 0x7fed7000b9d0 con 0x7fed8010d8c0 2026-03-09T19:27:56.570 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.571+0000 7fed7dffb640 1 --2- 
192.168.123.107:0/837121415 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7fed54076c60 0x7fed54079120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.573 INFO:tasks.workunit.client.1.vm08.stdout:8/931: truncate de/d25/d33/fb6 321732 0 2026-03-09T19:27:56.573 INFO:tasks.workunit.client.1.vm08.stdout:5/924: dwrite d16/d1e/d3b/fbd [0,4194304] 0 2026-03-09T19:27:56.575 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.573+0000 7fed7dffb640 1 -- 192.168.123.107:0/837121415 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fed7000b7e0 con 0x7fed8010d8c0 2026-03-09T19:27:56.575 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.575+0000 7fed7dffb640 1 -- 192.168.123.107:0/837121415 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fed7009d050 con 0x7fed8010d8c0 2026-03-09T19:27:56.577 INFO:tasks.workunit.client.1.vm08.stdout:2/855: dwrite d3/d4/d23/d2c/d39/d5e/de/f1c [4194304,4194304] 0 2026-03-09T19:27:56.580 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.581+0000 7fed84caa640 1 --2- 192.168.123.107:0/837121415 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7fed54076c60 0x7fed54079120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:56.588 INFO:tasks.workunit.client.1.vm08.stdout:6/958: dwrite d3/db/d43/d69/f142 [0,4194304] 0 2026-03-09T19:27:56.591 INFO:tasks.workunit.client.1.vm08.stdout:2/856: read d3/d4/d23/d2c/d39/d5e/d14/f78 [167956,124903] 0 2026-03-09T19:27:56.591 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.592+0000 7fed84caa640 1 --2- 192.168.123.107:0/837121415 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] 
conn(0x7fed54076c60 0x7fed54079120 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fed7402f9b0 tx=0x7fed7400afe0 comp rx=0 tx=0).ready entity=mgr.14696 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:56.592 INFO:tasks.workunit.client.1.vm08.stdout:2/857: truncate d3/d4/d23/f11d 286566 0 2026-03-09T19:27:56.592 INFO:tasks.workunit.client.1.vm08.stdout:2/858: chown d3/d9/lc3 105310544 1 2026-03-09T19:27:56.593 INFO:tasks.workunit.client.0.vm07.stdout:9/743: rename d0/d6/d3a/d81/d101 to d0/d6/d73/d105 0 2026-03-09T19:27:56.593 INFO:tasks.workunit.client.0.vm07.stdout:9/744: dread - d0/db/d29/d32/d5c/d80/fe4 zero size 2026-03-09T19:27:56.610 INFO:tasks.workunit.client.1.vm08.stdout:2/859: dread d3/d4/d3e/d9d/fc5 [0,4194304] 0 2026-03-09T19:27:56.611 INFO:tasks.workunit.client.1.vm08.stdout:2/860: dread - d3/dca/f11c zero size 2026-03-09T19:27:56.618 INFO:tasks.workunit.client.0.vm07.stdout:5/722: mkdir d3/dd/d26/d3f/d47/de6 0 2026-03-09T19:27:56.621 INFO:tasks.workunit.client.0.vm07.stdout:7/708: creat d0/d4/d5/d8/d41/d64/fee x:0 0 0 2026-03-09T19:27:56.628 INFO:tasks.workunit.client.1.vm08.stdout:8/932: truncate de/d47/dfd/f10c 6960 0 2026-03-09T19:27:56.633 INFO:tasks.workunit.client.1.vm08.stdout:4/904: write da/d10/d16/d28/d2f/d4f/d103/d40/d6c/f92 [1351,46106] 0 2026-03-09T19:27:56.639 INFO:tasks.workunit.client.0.vm07.stdout:3/808: dwrite d1/d6/dd/f44 [0,4194304] 0 2026-03-09T19:27:56.640 INFO:tasks.workunit.client.0.vm07.stdout:3/809: stat d1/d1f/f9c 0 2026-03-09T19:27:56.651 INFO:tasks.workunit.client.1.vm08.stdout:9/941: write d0/d1b/de9/d12a/da2/da8/de8/f8e [1387561,95682] 0 2026-03-09T19:27:56.652 INFO:tasks.workunit.client.1.vm08.stdout:9/942: dread - d0/d1b/de9/d12a/f125 zero size 2026-03-09T19:27:56.657 INFO:tasks.workunit.client.0.vm07.stdout:2/798: mknod d3/dd/d103/ddd/ded/df3/d112/c11b 0 2026-03-09T19:27:56.658 INFO:tasks.workunit.client.0.vm07.stdout:8/758: write d7/d9/f65 [353693,61791] 0 2026-03-09T19:27:56.660 
INFO:tasks.workunit.client.0.vm07.stdout:0/725: getdents d0/d6/d13/d17/dc3 0 2026-03-09T19:27:56.669 INFO:tasks.workunit.client.1.vm08.stdout:8/933: fdatasync de/d1d/d21/f72 0 2026-03-09T19:27:56.669 INFO:tasks.workunit.client.1.vm08.stdout:4/905: rmdir da/d10/d26/d3a 39 2026-03-09T19:27:56.669 INFO:tasks.workunit.client.0.vm07.stdout:0/726: chown d0/d6/fcf 80 1 2026-03-09T19:27:56.669 INFO:tasks.workunit.client.0.vm07.stdout:6/701: dwrite d0/d1/d28/d76/dad/fe1 [0,4194304] 0 2026-03-09T19:27:56.681 INFO:tasks.workunit.client.0.vm07.stdout:5/723: creat d3/d1a/d28/d40/d92/d89/ddc/fe7 x:0 0 0 2026-03-09T19:27:56.687 INFO:tasks.workunit.client.1.vm08.stdout:6/959: unlink d3/d34/d6f/c28 0 2026-03-09T19:27:56.691 INFO:tasks.workunit.client.1.vm08.stdout:9/943: fdatasync d0/d2/f21 0 2026-03-09T19:27:56.695 INFO:tasks.workunit.client.0.vm07.stdout:4/728: mknod d3/d11/d2b/d38/ddc/cff 0 2026-03-09T19:27:56.696 INFO:tasks.workunit.client.1.vm08.stdout:5/925: write d16/d1e/d8c/d99/da8/d9a/fe0 [4202507,95825] 0 2026-03-09T19:27:56.698 INFO:tasks.workunit.client.0.vm07.stdout:7/709: dwrite d0/d4/d5/d8/d41/f73 [0,4194304] 0 2026-03-09T19:27:56.710 INFO:tasks.workunit.client.1.vm08.stdout:2/861: creat d3/d4/d23/d2c/d39/d5e/de/d18/da9/d110/f122 x:0 0 0 2026-03-09T19:27:56.713 INFO:tasks.workunit.client.0.vm07.stdout:3/810: mkdir d1/d3d/d47/db3/d87/d106 0 2026-03-09T19:27:56.714 INFO:tasks.workunit.client.0.vm07.stdout:1/757: link d1/d11/d37/d3f/d7e/dad/fcd d1/db/d31/d4f/ffc 0 2026-03-09T19:27:56.715 INFO:tasks.workunit.client.0.vm07.stdout:1/758: chown d1/d3e/db3/d6d/ff6 1 1 2026-03-09T19:27:56.716 INFO:tasks.workunit.client.0.vm07.stdout:2/799: chown d3/dd/d16/d30/d40/f107 90722002 1 2026-03-09T19:27:56.719 INFO:tasks.workunit.client.0.vm07.stdout:8/759: rename d7/d16/c46 to d7/d50/da6/c111 0 2026-03-09T19:27:56.721 INFO:tasks.workunit.client.1.vm08.stdout:4/906: creat da/d10/d16/d28/d2f/d4f/d56/d90/f114 x:0 0 0 2026-03-09T19:27:56.731 
INFO:tasks.workunit.client.0.vm07.stdout:9/745: creat d0/db/d29/d32/d5c/d80/ddf/f106 x:0 0 0 2026-03-09T19:27:56.734 INFO:tasks.workunit.client.0.vm07.stdout:5/724: dread d3/d1a/fa [4194304,4194304] 0 2026-03-09T19:27:56.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.736+0000 7fed86f35640 1 -- 192.168.123.107:0/837121415 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fed480058d0 con 0x7fed8010d8c0 2026-03-09T19:27:56.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.736+0000 7fed7dffb640 1 -- 192.168.123.107:0/837121415 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1856 (secure 0 0 0) 0x7fed700607a0 con 0x7fed8010d8c0 2026-03-09T19:27:56.735 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:27:56.735 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:27:56.736 
INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:balancer 
2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:27:56.736 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:27:56.737 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:27:56.737 INFO:tasks.workunit.client.0.vm07.stdout:4/729: unlink d3/d11/d2b/d37/cd3 0 2026-03-09T19:27:56.737 INFO:tasks.workunit.client.0.vm07.stdout:4/730: dread - d3/d11/d2b/d38/ddc/d22/d86/f97 zero size 2026-03-09T19:27:56.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.739+0000 7fed86f35640 1 -- 192.168.123.107:0/837121415 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7fed54076c60 msgr2=0x7fed54079120 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.739+0000 7fed86f35640 1 --2- 192.168.123.107:0/837121415 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7fed54076c60 0x7fed54079120 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fed7402f9b0 tx=0x7fed7400afe0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.739+0000 7fed86f35640 1 -- 192.168.123.107:0/837121415 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fed8010d8c0 msgr2=0x7fed8010dd40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.739+0000 7fed86f35640 1 --2- 192.168.123.107:0/837121415 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fed8010d8c0 0x7fed8010dd40 secure :-1 s=READY pgs=351 cs=0 l=1 rev1=1 crypto rx=0x7fed7000d8d0 tx=0x7fed7000dda0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.740+0000 7fed86f35640 1 -- 192.168.123.107:0/837121415 shutdown_connections 2026-03-09T19:27:56.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.740+0000 7fed86f35640 1 --2- 192.168.123.107:0/837121415 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7fed54076c60 0x7fed54079120 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.740+0000 7fed86f35640 1 --2- 192.168.123.107:0/837121415 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fed8010d8c0 0x7fed8010dd40 unknown :-1 s=CLOSED pgs=351 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.740+0000 7fed86f35640 1 --2- 192.168.123.107:0/837121415 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed80076df0 0x7fed8010d380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.740+0000 7fed86f35640 1 -- 192.168.123.107:0/837121415 >> 192.168.123.107:0/837121415 conn(0x7fed800fe250 msgr2=0x7fed800ffb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:56.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.740+0000 7fed86f35640 1 -- 192.168.123.107:0/837121415 shutdown_connections 2026-03-09T19:27:56.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.740+0000 7fed86f35640 1 -- 192.168.123.107:0/837121415 wait complete. 2026-03-09T19:27:56.739 INFO:tasks.workunit.client.0.vm07.stdout:7/710: creat d0/d80/db1/de5/d54/d5a/fef x:0 0 0 2026-03-09T19:27:56.742 INFO:tasks.workunit.client.0.vm07.stdout:3/811: unlink d1/d3d/d47/db3/dc2/d28/f64 0 2026-03-09T19:27:56.751 INFO:tasks.workunit.client.0.vm07.stdout:2/800: rename d3/dd/d16/d29/d2d/d45/d8b/d98/f104 to d3/dd/d16/d29/d3c/d4c/f11c 0 2026-03-09T19:27:56.755 INFO:tasks.workunit.client.0.vm07.stdout:0/727: symlink d0/d6/d13/d1c/lee 0 2026-03-09T19:27:56.756 INFO:tasks.workunit.client.0.vm07.stdout:1/759: dwrite d1/d11/d37/d3f/d7e/f7f [0,4194304] 0 2026-03-09T19:27:56.778 INFO:tasks.workunit.client.0.vm07.stdout:4/731: mkdir d3/d11/d16/d100 0 2026-03-09T19:27:56.779 INFO:tasks.workunit.client.0.vm07.stdout:7/711: creat d0/d80/db1/de5/d54/dc4/ff0 x:0 0 0 2026-03-09T19:27:56.780 INFO:tasks.workunit.client.0.vm07.stdout:4/732: chown d3/d11/d29/f9b 2 1 2026-03-09T19:27:56.784 INFO:tasks.workunit.client.0.vm07.stdout:3/812: symlink d1/d6/d4c/d97/l107 0 2026-03-09T19:27:56.785 INFO:tasks.workunit.client.0.vm07.stdout:3/813: fsync d1/d3d/d47/db3/d87/ff7 0 2026-03-09T19:27:56.785 INFO:tasks.workunit.client.0.vm07.stdout:5/725: dwrite d3/d1a/fa [0,4194304] 0 2026-03-09T19:27:56.795 
INFO:tasks.workunit.client.0.vm07.stdout:8/760: rename d7/d9/d37/fe8 to d7/d9/d10/dd8/dfd/d67/f112 0 2026-03-09T19:27:56.795 INFO:tasks.workunit.client.0.vm07.stdout:2/801: mkdir d3/dd/d16/d29/d2d/d45/d85/d11d 0 2026-03-09T19:27:56.796 INFO:tasks.workunit.client.1.vm08.stdout:6/960: creat d3/d34/d6f/dd2/f163 x:0 0 0 2026-03-09T19:27:56.801 INFO:tasks.workunit.client.1.vm08.stdout:5/926: mkdir d16/d1e/d6e/dcd/def/d12f 0 2026-03-09T19:27:56.803 INFO:tasks.workunit.client.1.vm08.stdout:5/927: write d16/d1e/f7d [2820564,9554] 0 2026-03-09T19:27:56.804 INFO:tasks.workunit.client.1.vm08.stdout:0/966: rename dd/d22/d27/d11e/d78/fbb to dd/d22/d27/d11e/d78/f13d 0 2026-03-09T19:27:56.804 INFO:tasks.workunit.client.0.vm07.stdout:9/746: symlink d0/d6/d73/dbe/l107 0 2026-03-09T19:27:56.810 INFO:tasks.workunit.client.1.vm08.stdout:6/961: fdatasync d3/d94/def/f133 0 2026-03-09T19:27:56.811 INFO:tasks.workunit.client.0.vm07.stdout:3/814: creat d1/d89/f108 x:0 0 0 2026-03-09T19:27:56.816 INFO:tasks.workunit.client.0.vm07.stdout:5/726: sync 2026-03-09T19:27:56.816 INFO:tasks.workunit.client.0.vm07.stdout:8/761: truncate d7/d16/f71 730723 0 2026-03-09T19:27:56.817 INFO:tasks.workunit.client.0.vm07.stdout:5/727: chown d3/dd/d26/d3f/d47/d56/f65 14953672 1 2026-03-09T19:27:56.827 INFO:tasks.workunit.client.1.vm08.stdout:0/967: mkdir dd/d7e/d13e 0 2026-03-09T19:27:56.827 INFO:tasks.workunit.client.1.vm08.stdout:8/934: link de/d47/dfd/d99/dde/ffb de/d47/dfd/d99/da0/d10d/f13d 0 2026-03-09T19:27:56.827 INFO:tasks.workunit.client.1.vm08.stdout:4/907: mknod da/c115 0 2026-03-09T19:27:56.827 INFO:tasks.workunit.client.0.vm07.stdout:0/728: symlink d0/d6/d13/d17/d19/d58/dd9/lef 0 2026-03-09T19:27:56.828 INFO:tasks.workunit.client.1.vm08.stdout:8/935: read - de/d47/dfd/d99/da0/fda zero size 2026-03-09T19:27:56.829 INFO:tasks.workunit.client.0.vm07.stdout:3/815: symlink d1/d6/d45/l109 0 2026-03-09T19:27:56.835 INFO:tasks.workunit.client.0.vm07.stdout:7/712: rename d0/d4/d5/d8/f15 to 
d0/d4/d5/d26/dc9/ff1 0 2026-03-09T19:27:56.837 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.838+0000 7fd699674640 1 -- 192.168.123.107:0/2743059171 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd694105da0 msgr2=0x7fd694108190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.837 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.838+0000 7fd699674640 1 --2- 192.168.123.107:0/2743059171 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd694105da0 0x7fd694108190 secure :-1 s=READY pgs=352 cs=0 l=1 rev1=1 crypto rx=0x7fd6880099b0 tx=0x7fd68802f220 comp rx=0 tx=0).stop 2026-03-09T19:27:56.838 INFO:tasks.workunit.client.0.vm07.stdout:2/802: mkdir d3/dd/d16/d29/d2d/d45/df6/d11e 0 2026-03-09T19:27:56.839 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.839+0000 7fd699674640 1 -- 192.168.123.107:0/2743059171 shutdown_connections 2026-03-09T19:27:56.839 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.839+0000 7fd699674640 1 --2- 192.168.123.107:0/2743059171 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd694105da0 0x7fd694108190 unknown :-1 s=CLOSED pgs=352 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.839 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.839+0000 7fd699674640 1 --2- 192.168.123.107:0/2743059171 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd694069930 0x7fd694105860 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.839 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.839+0000 7fd699674640 1 -- 192.168.123.107:0/2743059171 >> 192.168.123.107:0/2743059171 conn(0x7fd6940fae50 msgr2=0x7fd6940fd2b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:56.841 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.842+0000 7fd699674640 1 -- 192.168.123.107:0/2743059171 shutdown_connections 
2026-03-09T19:27:56.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.843+0000 7fd699674640 1 -- 192.168.123.107:0/2743059171 wait complete. 2026-03-09T19:27:56.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.843+0000 7fd699674640 1 Processor -- start 2026-03-09T19:27:56.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.843+0000 7fd699674640 1 -- start start 2026-03-09T19:27:56.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.844+0000 7fd699674640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd694069930 0x7fd69419a440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.844+0000 7fd699674640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd694105da0 0x7fd69419a980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.844+0000 7fd699674640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd69419af50 con 0x7fd694105da0 2026-03-09T19:27:56.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.844+0000 7fd699674640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd69419b0c0 con 0x7fd694069930 2026-03-09T19:27:56.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.844+0000 7fd6927fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd694105da0 0x7fd69419a980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:56.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.844+0000 7fd6927fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd694105da0 
0x7fd69419a980 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:39290/0 (socket says 192.168.123.107:39290) 2026-03-09T19:27:56.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.844+0000 7fd6927fc640 1 -- 192.168.123.107:0/364779766 learned_addr learned my addr 192.168.123.107:0/364779766 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:56.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.844+0000 7fd692ffd640 1 --2- 192.168.123.107:0/364779766 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd694069930 0x7fd69419a440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:56.844 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.845+0000 7fd6927fc640 1 -- 192.168.123.107:0/364779766 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd694069930 msgr2=0x7fd69419a440 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:56.844 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.845+0000 7fd6927fc640 1 --2- 192.168.123.107:0/364779766 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd694069930 0x7fd69419a440 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:56.844 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.845+0000 7fd6927fc640 1 -- 192.168.123.107:0/364779766 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd688009660 con 0x7fd694105da0 2026-03-09T19:27:56.844 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.845+0000 7fd6927fc640 1 --2- 192.168.123.107:0/364779766 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd694105da0 0x7fd69419a980 secure :-1 s=READY pgs=353 cs=0 
l=1 rev1=1 crypto rx=0x7fd688009980 tx=0x7fd688031cd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:56.844 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.845+0000 7fd677fff640 1 -- 192.168.123.107:0/364779766 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd68803d070 con 0x7fd694105da0 2026-03-09T19:27:56.845 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.845+0000 7fd677fff640 1 -- 192.168.123.107:0/364779766 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd6880043d0 con 0x7fd694105da0 2026-03-09T19:27:56.845 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.845+0000 7fd677fff640 1 -- 192.168.123.107:0/364779766 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd688031280 con 0x7fd694105da0 2026-03-09T19:27:56.845 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.845+0000 7fd699674640 1 -- 192.168.123.107:0/364779766 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd69419fb00 con 0x7fd694105da0 2026-03-09T19:27:56.845 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.846+0000 7fd699674640 1 -- 192.168.123.107:0/364779766 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd69419fff0 con 0x7fd694105da0 2026-03-09T19:27:56.846 INFO:tasks.workunit.client.1.vm08.stdout:9/944: creat d0/d1b/de9/d12a/da2/da8/de8/f136 x:0 0 0 2026-03-09T19:27:56.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.847+0000 7fd677fff640 1 -- 192.168.123.107:0/364779766 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 24) v1 ==== 99330+0+0 (secure 0 0 0) 0x7fd6880388c0 con 0x7fd694105da0 2026-03-09T19:27:56.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.847+0000 7fd699674640 1 -- 192.168.123.107:0/364779766 
--> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd660005350 con 0x7fd694105da0 2026-03-09T19:27:56.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.848+0000 7fd677fff640 1 --2- 192.168.123.107:0/364779766 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7fd66c076d80 0x7fd66c079240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:56.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.848+0000 7fd677fff640 1 -- 192.168.123.107:0/364779766 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fd6880bd890 con 0x7fd694105da0 2026-03-09T19:27:56.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.851+0000 7fd692ffd640 1 --2- 192.168.123.107:0/364779766 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7fd66c076d80 0x7fd66c079240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:56.852 INFO:tasks.workunit.client.1.vm08.stdout:2/862: dwrite d3/d4/d23/d2c/d39/d5e/ff7 [0,4194304] 0 2026-03-09T19:27:56.854 INFO:tasks.workunit.client.0.vm07.stdout:6/702: dwrite d0/d1/d28/da8/ffe [0,4194304] 0 2026-03-09T19:27:56.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.853+0000 7fd677fff640 1 -- 192.168.123.107:0/364779766 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fd688086420 con 0x7fd694105da0 2026-03-09T19:27:56.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:56.857+0000 7fd692ffd640 1 --2- 192.168.123.107:0/364779766 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7fd66c076d80 0x7fd66c079240 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 
crypto rx=0x7fd6800059c0 tx=0x7fd68000a5c0 comp rx=0 tx=0).ready entity=mgr.14696 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:56.914 INFO:tasks.workunit.client.0.vm07.stdout:1/760: mknod d1/d11/d37/d5d/dc1/cfd 0 2026-03-09T19:27:56.914 INFO:tasks.workunit.client.1.vm08.stdout:5/928: dwrite d16/d45/daf/df5/fb4 [4194304,4194304] 0 2026-03-09T19:27:56.924 INFO:tasks.workunit.client.0.vm07.stdout:8/762: write d7/d30/d75/f88 [964602,86176] 0 2026-03-09T19:27:56.925 INFO:tasks.workunit.client.0.vm07.stdout:5/728: write d3/f93 [4365369,110086] 0 2026-03-09T19:27:56.928 INFO:tasks.workunit.client.0.vm07.stdout:8/763: sync 2026-03-09T19:27:56.928 INFO:tasks.workunit.client.0.vm07.stdout:8/764: sync 2026-03-09T19:27:56.929 INFO:tasks.workunit.client.0.vm07.stdout:8/765: chown d7/d9/d10/f20 2 1 2026-03-09T19:27:56.930 INFO:tasks.workunit.client.0.vm07.stdout:8/766: chown d7/d1d/d83 410159892 1 2026-03-09T19:27:56.930 INFO:tasks.workunit.client.1.vm08.stdout:8/936: mkdir de/d1d/d21/d13e 0 2026-03-09T19:27:56.938 INFO:tasks.workunit.client.0.vm07.stdout:3/816: fdatasync d1/d6/fd6 0 2026-03-09T19:27:56.938 INFO:tasks.workunit.client.1.vm08.stdout:9/945: rmdir d0/d2/d14/d98 39 2026-03-09T19:27:56.938 INFO:tasks.workunit.client.0.vm07.stdout:3/817: write d1/d3d/f5e [3374771,50382] 0 2026-03-09T19:27:56.941 INFO:tasks.workunit.client.0.vm07.stdout:7/713: mkdir d0/d4/d5/d8/d41/d64/d74/d98/dcb/df2 0 2026-03-09T19:27:56.944 INFO:tasks.workunit.client.1.vm08.stdout:9/946: dread d0/d1b/de9/d12a/da2/da8/de8/dcd/fb6 [0,4194304] 0 2026-03-09T19:27:56.947 INFO:tasks.workunit.client.1.vm08.stdout:9/947: dread - d0/d2/f133 zero size 2026-03-09T19:27:56.947 INFO:tasks.workunit.client.1.vm08.stdout:5/929: symlink d16/d1e/d3b/l130 0 2026-03-09T19:27:56.957 INFO:tasks.workunit.client.1.vm08.stdout:8/937: mkdir de/d7c/d13f 0 2026-03-09T19:27:56.958 INFO:tasks.workunit.client.0.vm07.stdout:6/703: dread d0/d1/db/d1d/f3e [0,4194304] 0 2026-03-09T19:27:56.959 
INFO:tasks.workunit.client.0.vm07.stdout:9/747: creat d0/db/f108 x:0 0 0 2026-03-09T19:27:56.960 INFO:tasks.workunit.client.1.vm08.stdout:6/962: link d3/f7 d3/db/d12a/d147/d154/f164 0 2026-03-09T19:27:56.964 INFO:tasks.workunit.client.1.vm08.stdout:5/930: truncate d16/d45/f6a 3175635 0 2026-03-09T19:27:56.966 INFO:tasks.workunit.client.1.vm08.stdout:0/968: truncate dd/d22/d27/d11e/d105/f83 2512760 0 2026-03-09T19:27:56.967 INFO:tasks.workunit.client.1.vm08.stdout:8/938: dread - de/d47/dfd/d99/da5/f13c zero size 2026-03-09T19:27:56.967 INFO:tasks.workunit.client.1.vm08.stdout:8/939: readlink de/d25/d33/d127/l133 0 2026-03-09T19:27:56.968 INFO:tasks.workunit.client.0.vm07.stdout:8/767: fsync d7/d1d/d83/d9f/fa4 0 2026-03-09T19:27:56.984 INFO:tasks.workunit.client.1.vm08.stdout:8/940: dread de/d25/d31/f118 [0,4194304] 0 2026-03-09T19:27:56.997 INFO:tasks.workunit.client.1.vm08.stdout:4/908: getdents da/d10/d16/d28/d2f/de9/db0 0 2026-03-09T19:27:56.998 INFO:tasks.workunit.client.0.vm07.stdout:3/818: mknod d1/d3d/d47/db3/d8e/dee/c10a 0 2026-03-09T19:27:56.998 INFO:tasks.workunit.client.1.vm08.stdout:4/909: chown da/d10/d16/d28/d2f/d4f/d56/dd0/lad 455854 1 2026-03-09T19:27:57.008 INFO:tasks.workunit.client.0.vm07.stdout:7/714: symlink d0/d4/d5/d8/d41/d64/d79/lf3 0 2026-03-09T19:27:57.009 INFO:tasks.workunit.client.1.vm08.stdout:6/963: fdatasync d3/d34/d6f/f2f 0 2026-03-09T19:27:57.009 INFO:tasks.workunit.client.1.vm08.stdout:8/941: chown de/d1d/d4f 4330 1 2026-03-09T19:27:57.010 INFO:tasks.workunit.client.0.vm07.stdout:9/748: truncate d0/d6/ff 6036370 0 2026-03-09T19:27:57.011 INFO:tasks.workunit.client.0.vm07.stdout:9/749: chown d0/d6/d57/d8f/ce8 439119915 1 2026-03-09T19:27:57.019 INFO:tasks.workunit.client.1.vm08.stdout:5/931: dread d16/d1e/d8c/f104 [0,4194304] 0 2026-03-09T19:27:57.025 INFO:tasks.workunit.client.0.vm07.stdout:5/729: mkdir d3/d1a/d28/d6c/de8 0 2026-03-09T19:27:57.025 INFO:tasks.workunit.client.1.vm08.stdout:2/863: link d3/d4/d23/d2c/d39/d5e/l65 d3/l123 
0 2026-03-09T19:27:57.027 INFO:tasks.workunit.client.0.vm07.stdout:2/803: dwrite d3/dd/d16/d30/da7/f109 [4194304,4194304] 0 2026-03-09T19:27:57.031 INFO:tasks.workunit.client.1.vm08.stdout:4/910: rename da/d10/d16/d28/d2f/d4f/d103/d40/f10b to da/d10/d16/d28/d2f/d4f/d102/f116 0 2026-03-09T19:27:57.031 INFO:tasks.workunit.client.1.vm08.stdout:0/969: dwrite dd/d22/d27/d2e/fe0 [0,4194304] 0 2026-03-09T19:27:57.031 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.030+0000 7fd699674640 1 -- 192.168.123.107:0/364779766 --> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd660002bf0 con 0x7fd66c076d80 2026-03-09T19:27:57.032 INFO:tasks.workunit.client.0.vm07.stdout:0/729: rmdir d0/d6/d13/d17/d19/de5 0 2026-03-09T19:27:57.032 INFO:tasks.workunit.client.0.vm07.stdout:2/804: fdatasync d3/dd/d103/ddd/ff8 0 2026-03-09T19:27:57.041 INFO:tasks.workunit.client.0.vm07.stdout:2/805: dwrite d3/dd/d16/d30/d40/f107 [0,4194304] 0 2026-03-09T19:27:57.044 INFO:tasks.workunit.client.0.vm07.stdout:9/750: dread d0/d6/f7b [0,4194304] 0 2026-03-09T19:27:57.049 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:27:57.049 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:27:57.049 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:27:57.049 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:27:57.049 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [], 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "1/23 daemons upgraded", 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stdout: "message": "", 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:27:57.050 
INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.045+0000 7fd677fff640 1 -- 192.168.123.107:0/364779766 <== mgr.14696 v2:192.168.123.108:6828/502005203 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+318 (secure 0 0 0) 0x7fd660002bf0 con 0x7fd66c076d80 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.050+0000 7fd675ffb640 1 -- 192.168.123.107:0/364779766 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7fd66c076d80 msgr2=0x7fd66c079240 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.050+0000 7fd675ffb640 1 --2- 192.168.123.107:0/364779766 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7fd66c076d80 0x7fd66c079240 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fd6800059c0 tx=0x7fd68000a5c0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.050+0000 7fd675ffb640 1 -- 192.168.123.107:0/364779766 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd694105da0 msgr2=0x7fd69419a980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.050+0000 7fd675ffb640 1 --2- 192.168.123.107:0/364779766 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd694105da0 0x7fd69419a980 secure :-1 s=READY pgs=353 cs=0 l=1 rev1=1 crypto rx=0x7fd688009980 tx=0x7fd688031cd0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.050+0000 7fd675ffb640 1 -- 192.168.123.107:0/364779766 shutdown_connections 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.050+0000 7fd675ffb640 1 --2- 192.168.123.107:0/364779766 >> 
[v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7fd66c076d80 0x7fd66c079240 secure :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fd6800059c0 tx=0x7fd68000a5c0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.050+0000 7fd675ffb640 1 --2- 192.168.123.107:0/364779766 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd694105da0 0x7fd69419a980 unknown :-1 s=CLOSED pgs=353 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.050+0000 7fd675ffb640 1 --2- 192.168.123.107:0/364779766 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd694069930 0x7fd69419a440 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.050+0000 7fd675ffb640 1 -- 192.168.123.107:0/364779766 >> 192.168.123.107:0/364779766 conn(0x7fd6940fae50 msgr2=0x7fd6940fd2b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.050+0000 7fd675ffb640 1 -- 192.168.123.107:0/364779766 shutdown_connections 2026-03-09T19:27:57.050 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.050+0000 7fd675ffb640 1 -- 192.168.123.107:0/364779766 wait complete. 
2026-03-09T19:27:57.056 INFO:tasks.workunit.client.0.vm07.stdout:9/751: readlink d0/d6/d3a/dd3/lfa 0 2026-03-09T19:27:57.066 INFO:tasks.workunit.client.0.vm07.stdout:1/761: write d1/d11/d37/d3f/d45/f3b [449734,125144] 0 2026-03-09T19:27:57.074 INFO:tasks.workunit.client.1.vm08.stdout:6/964: mknod d3/d34/dce/c165 0 2026-03-09T19:27:57.089 INFO:tasks.workunit.client.0.vm07.stdout:7/715: fdatasync d0/d4/f6f 0 2026-03-09T19:27:57.089 INFO:tasks.workunit.client.0.vm07.stdout:4/733: rename d3/d11/d2b/d38/ddc/d22/d70 to d3/d11/d29/d101 0 2026-03-09T19:27:57.089 INFO:tasks.workunit.client.0.vm07.stdout:6/704: dwrite d0/d1/fa [8388608,4194304] 0 2026-03-09T19:27:57.089 INFO:tasks.workunit.client.0.vm07.stdout:9/752: sync 2026-03-09T19:27:57.089 INFO:tasks.workunit.client.1.vm08.stdout:8/942: creat de/d25/d87/dc9/dd8/f140 x:0 0 0 2026-03-09T19:27:57.089 INFO:tasks.workunit.client.1.vm08.stdout:5/932: mknod d16/d45/d115/c131 0 2026-03-09T19:27:57.089 INFO:tasks.workunit.client.1.vm08.stdout:2/864: dread - d3/d4/d23/d2c/dc1/f101 zero size 2026-03-09T19:27:57.089 INFO:tasks.workunit.client.1.vm08.stdout:9/948: link d0/d1b/de9/d12a/da2/da8/de8/dcd/fc6 d0/d2/f137 0 2026-03-09T19:27:57.095 INFO:tasks.workunit.client.0.vm07.stdout:7/716: sync 2026-03-09T19:27:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:57 vm08.local ceph-mon[57794]: from='client.24489 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:57 vm08.local ceph-mon[57794]: from='client.14722 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:57 vm08.local ceph-mon[57794]: from='client.14726 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:27:57 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/4059703852' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:27:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:57 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/837121415' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:27:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:57 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:57 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:57 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:27:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:57 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:27:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:57 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:27:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:57 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:27:57.099 INFO:tasks.workunit.client.0.vm07.stdout:6/705: dwrite d0/d1/d28/d76/dad/fe1 [0,4194304] 0 2026-03-09T19:27:57.125 INFO:tasks.workunit.client.1.vm08.stdout:0/970: mknod dd/d22/d63/d6e/d72/c13f 0 2026-03-09T19:27:57.137 INFO:tasks.workunit.client.1.vm08.stdout:6/965: creat d3/d34/da9/da4/d117/d10d/f166 x:0 
0 0 2026-03-09T19:27:57.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:57 vm07.local ceph-mon[48545]: from='client.24489 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:57.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:57 vm07.local ceph-mon[48545]: from='client.14722 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:57.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:57 vm07.local ceph-mon[48545]: from='client.14726 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:57.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:57 vm07.local ceph-mon[48545]: from='client.? 192.168.123.107:0/4059703852' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:27:57.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:57 vm07.local ceph-mon[48545]: from='client.? 
192.168.123.107:0/837121415' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:27:57.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:57 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:57.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:57 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:57.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:57 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:27:57.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:57 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:27:57.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:57 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:27:57.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:57 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:27:57.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.139+0000 7f9991af1640 1 -- 192.168.123.107:0/457002144 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f998c072420 msgr2=0x7f998c077190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:57.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.139+0000 7f9991af1640 1 --2- 192.168.123.107:0/457002144 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f998c072420 0x7f998c077190 secure :-1 s=READY pgs=354 cs=0 l=1 rev1=1 crypto rx=0x7f998400b0a0 
tx=0x7f998402f4c0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.140+0000 7f9991af1640 1 -- 192.168.123.107:0/457002144 shutdown_connections 2026-03-09T19:27:57.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.140+0000 7f9991af1640 1 --2- 192.168.123.107:0/457002144 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f998c072420 0x7f998c077190 unknown :-1 s=CLOSED pgs=354 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.140+0000 7f9991af1640 1 --2- 192.168.123.107:0/457002144 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f998c071a50 0x7f998c071e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.140+0000 7f9991af1640 1 -- 192.168.123.107:0/457002144 >> 192.168.123.107:0/457002144 conn(0x7f998c06d4f0 msgr2=0x7f998c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:57.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.140+0000 7f9991af1640 1 -- 192.168.123.107:0/457002144 shutdown_connections 2026-03-09T19:27:57.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.140+0000 7f9991af1640 1 -- 192.168.123.107:0/457002144 wait complete. 
2026-03-09T19:27:57.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.142+0000 7f9991af1640 1 Processor -- start 2026-03-09T19:27:57.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.142+0000 7f9991af1640 1 -- start start 2026-03-09T19:27:57.143 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.142+0000 7f9991af1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f998c071a50 0x7f998c084060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.142+0000 7f9991af1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f998c0826b0 0x7f998c082b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.142+0000 7f9991af1640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f998c0845a0 con 0x7f998c071a50 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.142+0000 7f9991af1640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f998c083070 con 0x7f998c0826b0 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.143+0000 7f9990aef640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f998c071a50 0x7f998c084060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.143+0000 7f9990aef640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f998c071a50 0x7f998c084060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:39308/0 (socket says 192.168.123.107:39308) 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.144+0000 7f9990aef640 1 -- 192.168.123.107:0/4186052139 learned_addr learned my addr 192.168.123.107:0/4186052139 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.144+0000 7f9990aef640 1 -- 192.168.123.107:0/4186052139 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f998c0826b0 msgr2=0x7f998c082b30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.144+0000 7f9990aef640 1 --2- 192.168.123.107:0/4186052139 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f998c0826b0 0x7f998c082b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.144+0000 7f9990aef640 1 -- 192.168.123.107:0/4186052139 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9984009d00 con 0x7f998c071a50 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.144+0000 7f9990aef640 1 --2- 192.168.123.107:0/4186052139 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f998c071a50 0x7f998c084060 secure :-1 s=READY pgs=355 cs=0 l=1 rev1=1 crypto rx=0x7f997c00afc0 tx=0x7f997c007590 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.146+0000 7f9989ffb640 1 -- 192.168.123.107:0/4186052139 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f997c019c00 con 0x7f998c071a50 2026-03-09T19:27:57.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.147+0000 7f9991af1640 1 -- 
192.168.123.107:0/4186052139 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f998c083350 con 0x7f998c071a50 2026-03-09T19:27:57.149 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.147+0000 7f9991af1640 1 -- 192.168.123.107:0/4186052139 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f998c1b5bc0 con 0x7f998c071a50 2026-03-09T19:27:57.149 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.147+0000 7f9989ffb640 1 -- 192.168.123.107:0/4186052139 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f997c019d60 con 0x7f998c071a50 2026-03-09T19:27:57.149 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.147+0000 7f9989ffb640 1 -- 192.168.123.107:0/4186052139 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f997c005230 con 0x7f998c071a50 2026-03-09T19:27:57.149 INFO:tasks.workunit.client.1.vm08.stdout:6/966: dread d3/f10 [0,4194304] 0 2026-03-09T19:27:57.149 INFO:tasks.workunit.client.0.vm07.stdout:1/762: symlink d1/d3e/db3/lfe 0 2026-03-09T19:27:57.152 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.152+0000 7f9989ffb640 1 -- 192.168.123.107:0/4186052139 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 24) v1 ==== 99330+0+0 (secure 0 0 0) 0x7f997c005390 con 0x7f998c071a50 2026-03-09T19:27:57.155 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.153+0000 7f9991af1640 1 -- 192.168.123.107:0/4186052139 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f998c07a7f0 con 0x7f998c071a50 2026-03-09T19:27:57.155 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.153+0000 7f9989ffb640 1 --2- 192.168.123.107:0/4186052139 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f9978076cb0 0x7f9978079170 unknown :-1 s=NONE pgs=0 
cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:27:57.155 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.154+0000 7f9989ffb640 1 -- 192.168.123.107:0/4186052139 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f997c027030 con 0x7f998c071a50 2026-03-09T19:27:57.155 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.154+0000 7f998bfff640 1 --2- 192.168.123.107:0/4186052139 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f9978076cb0 0x7f9978079170 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:27:57.156 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.157+0000 7f9989ffb640 1 -- 192.168.123.107:0/4186052139 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f997c06af60 con 0x7f998c071a50 2026-03-09T19:27:57.158 INFO:tasks.workunit.client.1.vm08.stdout:5/933: unlink d16/d45/daf/df5/l51 0 2026-03-09T19:27:57.159 INFO:tasks.workunit.client.0.vm07.stdout:3/819: creat d1/d3d/d47/db3/d87/d106/f10b x:0 0 0 2026-03-09T19:27:57.163 INFO:tasks.workunit.client.1.vm08.stdout:4/911: getdents da/d10/d16/d28/d2f/de9/db0 0 2026-03-09T19:27:57.163 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.162+0000 7f998bfff640 1 --2- 192.168.123.107:0/4186052139 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f9978076cb0 0x7f9978079170 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f998400b070 tx=0x7f9984002750 comp rx=0 tx=0).ready entity=mgr.14696 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:27:57.167 INFO:tasks.workunit.client.1.vm08.stdout:5/934: dread d16/d45/fb1 [0,4194304] 0 2026-03-09T19:27:57.170 INFO:tasks.workunit.client.0.vm07.stdout:5/730: mknod d3/dd/d26/d2d/ce9 0 
2026-03-09T19:27:57.172 INFO:tasks.workunit.client.1.vm08.stdout:0/971: sync 2026-03-09T19:27:57.174 INFO:tasks.workunit.client.0.vm07.stdout:4/734: truncate d3/d11/d2b/d38/ddc/f67 510811 0 2026-03-09T19:27:57.175 INFO:tasks.workunit.client.0.vm07.stdout:4/735: write d3/d11/d2b/d38/fdf [204997,11530] 0 2026-03-09T19:27:57.178 INFO:tasks.workunit.client.1.vm08.stdout:6/967: truncate d3/d34/da9/fae 29437 0 2026-03-09T19:27:57.181 INFO:tasks.workunit.client.0.vm07.stdout:9/753: creat d0/d6f/dc3/f109 x:0 0 0 2026-03-09T19:27:57.185 INFO:tasks.workunit.client.1.vm08.stdout:8/943: write de/d91/fdb [119016,129654] 0 2026-03-09T19:27:57.185 INFO:tasks.workunit.client.0.vm07.stdout:8/768: write d7/d9/d37/d45/f76 [9564,91432] 0 2026-03-09T19:27:57.188 INFO:tasks.workunit.client.0.vm07.stdout:4/736: fsync d3/d11/d2b/d38/fdf 0 2026-03-09T19:27:57.190 INFO:tasks.workunit.client.1.vm08.stdout:2/865: creat d3/d9/d26/ded/d104/f124 x:0 0 0 2026-03-09T19:27:57.193 INFO:tasks.workunit.client.1.vm08.stdout:9/949: write d0/d2/d14/d98/f9e [749664,39032] 0 2026-03-09T19:27:57.194 INFO:tasks.workunit.client.1.vm08.stdout:9/950: stat d0/d2/d14/d98/d99/d12f 0 2026-03-09T19:27:57.201 INFO:tasks.workunit.client.1.vm08.stdout:5/935: rename d16/d45/daf/ldf to d16/d1e/dc9/d10c/d112/l132 0 2026-03-09T19:27:57.203 INFO:tasks.workunit.client.0.vm07.stdout:2/806: mknod d3/dd/d103/dd4/c11f 0 2026-03-09T19:27:57.210 INFO:tasks.workunit.client.1.vm08.stdout:6/968: mkdir d3/d68/d14a/d167 0 2026-03-09T19:27:57.212 INFO:tasks.workunit.client.1.vm08.stdout:2/866: rmdir d3/d9/d79/d46 39 2026-03-09T19:27:57.214 INFO:tasks.workunit.client.1.vm08.stdout:2/867: write d3/d4/d23/d2c/d39/d5e/de/fe9 [2600751,1666] 0 2026-03-09T19:27:57.216 INFO:tasks.workunit.client.1.vm08.stdout:2/868: chown d3/d4/d23/d2c/d39/d5e/d14/fcd 3597309 1 2026-03-09T19:27:57.219 INFO:tasks.workunit.client.1.vm08.stdout:4/912: fdatasync da/d10/f113 0 2026-03-09T19:27:57.220 INFO:tasks.workunit.client.0.vm07.stdout:0/730: dwrite 
d0/d6/d13/d1c/d11/f80 [0,4194304] 0 2026-03-09T19:27:57.220 INFO:tasks.workunit.client.1.vm08.stdout:9/951: rename d0/d1b/de9/d12a/da2/da8/de8/f136 to d0/d1b/d97/d48/d5d/ddf/f138 0 2026-03-09T19:27:57.227 INFO:tasks.workunit.client.1.vm08.stdout:4/913: rename da/d10/d16/d28/d2f/d4f/d64 to da/d10/d16/d28/d2f/d4f/d64/d81/dfb/d117 22 2026-03-09T19:27:57.228 INFO:tasks.workunit.client.1.vm08.stdout:5/936: read - d16/d1e/dc9/fd8 zero size 2026-03-09T19:27:57.230 INFO:tasks.workunit.client.1.vm08.stdout:5/937: chown d16/d45/f54 829 1 2026-03-09T19:27:57.231 INFO:tasks.workunit.client.1.vm08.stdout:9/952: sync 2026-03-09T19:27:57.239 INFO:tasks.workunit.client.1.vm08.stdout:9/953: dread d0/d1b/f7c [0,4194304] 0 2026-03-09T19:27:57.244 INFO:tasks.workunit.client.1.vm08.stdout:8/944: dwrite de/f1f [0,4194304] 0 2026-03-09T19:27:57.246 INFO:tasks.workunit.client.1.vm08.stdout:0/972: mkdir dd/d22/d27/d2e/db0/d140 0 2026-03-09T19:27:57.264 INFO:tasks.workunit.client.1.vm08.stdout:6/969: chown d3/d94/cb6 1291597 1 2026-03-09T19:27:57.270 INFO:tasks.workunit.client.0.vm07.stdout:1/763: dwrite d1/d11/d37/d3f/d45/d87/faf [4194304,4194304] 0 2026-03-09T19:27:57.271 INFO:tasks.workunit.client.0.vm07.stdout:4/737: rename d3/fda to d3/d11/d2b/d38/ddc/db2/f102 0 2026-03-09T19:27:57.272 INFO:tasks.workunit.client.0.vm07.stdout:6/706: mknod d0/d1/c113 0 2026-03-09T19:27:57.277 INFO:tasks.workunit.client.0.vm07.stdout:3/820: dwrite d1/d1f/f13 [0,4194304] 0 2026-03-09T19:27:57.298 INFO:tasks.workunit.client.0.vm07.stdout:4/738: dread d3/d4f/d56/fcf [0,4194304] 0 2026-03-09T19:27:57.303 INFO:tasks.workunit.client.1.vm08.stdout:9/954: fsync d0/d1b/d97/dd3/f11a 0 2026-03-09T19:27:57.307 INFO:tasks.workunit.client.1.vm08.stdout:8/945: mkdir de/d1d/d4f/d141 0 2026-03-09T19:27:57.307 INFO:tasks.workunit.client.0.vm07.stdout:9/754: symlink d0/d6/d3a/d81/l10a 0 2026-03-09T19:27:57.307 INFO:tasks.workunit.client.1.vm08.stdout:8/946: stat de/d91/fd6 0 2026-03-09T19:27:57.319 
INFO:tasks.workunit.client.1.vm08.stdout:0/973: dread dd/d22/fe7 [0,4194304] 0 2026-03-09T19:27:57.329 INFO:tasks.workunit.client.1.vm08.stdout:6/970: dread d3/d15/f40 [0,4194304] 0 2026-03-09T19:27:57.333 INFO:tasks.workunit.client.0.vm07.stdout:7/717: write d0/d4/d5/d8/d41/d64/fc1 [2872650,44077] 0 2026-03-09T19:27:57.334 INFO:tasks.workunit.client.1.vm08.stdout:2/869: write d3/d4/d23/d2c/fe6 [2102775,120175] 0 2026-03-09T19:27:57.336 INFO:tasks.workunit.client.1.vm08.stdout:2/870: readlink d3/d4/d23/d2c/d39/d5e/de/d18/d1f/l43 0 2026-03-09T19:27:57.340 INFO:tasks.workunit.client.1.vm08.stdout:2/871: stat d3/d4/d23/d2c/d39/d5e/de/d18/da9/la6 0 2026-03-09T19:27:57.343 INFO:tasks.workunit.client.0.vm07.stdout:6/707: creat d0/d1/db/f114 x:0 0 0 2026-03-09T19:27:57.343 INFO:tasks.workunit.client.0.vm07.stdout:2/807: dwrite d3/dd/d16/d29/d3c/fe4 [0,4194304] 0 2026-03-09T19:27:57.351 INFO:tasks.workunit.client.1.vm08.stdout:2/872: dwrite d3/d4/d23/d2c/d39/d5e/de/fe9 [0,4194304] 0 2026-03-09T19:27:57.364 INFO:tasks.workunit.client.0.vm07.stdout:1/764: rename d1/d11/d37/d3f/db5 to d1/d3e/db3/d6d/dff 0 2026-03-09T19:27:57.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.375+0000 7f9991af1640 1 -- 192.168.123.107:0/4186052139 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f998c083590 con 0x7f998c071a50 2026-03-09T19:27:57.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.376+0000 7f9989ffb640 1 -- 192.168.123.107:0/4186052139 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f997c06a6b0 con 0x7f998c071a50 2026-03-09T19:27:57.377 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:27:57.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.380+0000 7f995b7fe640 1 -- 192.168.123.107:0/4186052139 >> 
[v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f9978076cb0 msgr2=0x7f9978079170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:57.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.380+0000 7f995b7fe640 1 --2- 192.168.123.107:0/4186052139 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f9978076cb0 0x7f9978079170 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f998400b070 tx=0x7f9984002750 comp rx=0 tx=0).stop 2026-03-09T19:27:57.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.380+0000 7f995b7fe640 1 -- 192.168.123.107:0/4186052139 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f998c071a50 msgr2=0x7f998c084060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:27:57.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.380+0000 7f995b7fe640 1 --2- 192.168.123.107:0/4186052139 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f998c071a50 0x7f998c084060 secure :-1 s=READY pgs=355 cs=0 l=1 rev1=1 crypto rx=0x7f997c00afc0 tx=0x7f997c007590 comp rx=0 tx=0).stop 2026-03-09T19:27:57.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.380+0000 7f995b7fe640 1 -- 192.168.123.107:0/4186052139 shutdown_connections 2026-03-09T19:27:57.379 INFO:tasks.workunit.client.1.vm08.stdout:8/947: dread - de/d25/d31/d82/f96 zero size 2026-03-09T19:27:57.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.380+0000 7f995b7fe640 1 --2- 192.168.123.107:0/4186052139 >> [v2:192.168.123.108:6828/502005203,v1:192.168.123.108:6829/502005203] conn(0x7f9978076cb0 0x7f9978079170 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.380+0000 7f995b7fe640 1 --2- 192.168.123.107:0/4186052139 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f998c0826b0 
0x7f998c082b30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.380+0000 7f995b7fe640 1 --2- 192.168.123.107:0/4186052139 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f998c071a50 0x7f998c084060 unknown :-1 s=CLOSED pgs=355 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:27:57.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.380+0000 7f995b7fe640 1 -- 192.168.123.107:0/4186052139 >> 192.168.123.107:0/4186052139 conn(0x7f998c06d4f0 msgr2=0x7f998c070400 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:27:57.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.380+0000 7f995b7fe640 1 -- 192.168.123.107:0/4186052139 shutdown_connections 2026-03-09T19:27:57.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:27:57.381+0000 7f995b7fe640 1 -- 192.168.123.107:0/4186052139 wait complete. 2026-03-09T19:27:57.390 INFO:tasks.workunit.client.1.vm08.stdout:0/974: dread dd/d22/d27/d2e/d37/f46 [8388608,4194304] 0 2026-03-09T19:27:57.406 INFO:tasks.workunit.client.1.vm08.stdout:6/971: creat d3/d15/d111/f168 x:0 0 0 2026-03-09T19:27:57.407 INFO:tasks.workunit.client.0.vm07.stdout:5/731: link d3/d1a/d28/d6c/d72/cd5 d3/dd/d26/d2d/d9e/cea 0 2026-03-09T19:27:57.407 INFO:tasks.workunit.client.1.vm08.stdout:4/914: write da/d10/d26/d3a/d69/df1/fa9 [638292,108422] 0 2026-03-09T19:27:57.407 INFO:tasks.workunit.client.0.vm07.stdout:5/732: chown d3/d1a/d28/d6c/ccc 430322504 1 2026-03-09T19:27:57.408 INFO:tasks.workunit.client.1.vm08.stdout:4/915: chown da/d10/d16/d28/d2f/d4f/d64/d81/fb2 12503950 1 2026-03-09T19:27:57.411 INFO:tasks.workunit.client.0.vm07.stdout:9/755: chown d0/db/f1d 66654351 1 2026-03-09T19:27:57.412 INFO:tasks.workunit.client.1.vm08.stdout:5/938: creat d16/d1e/d3b/d61/d11e/f133 x:0 0 0 2026-03-09T19:27:57.412 INFO:tasks.workunit.client.1.vm08.stdout:2/873: creat 
d3/d4/d23/d2c/d39/d5e/de/d18/d99/f125 x:0 0 0 2026-03-09T19:27:57.419 INFO:tasks.workunit.client.0.vm07.stdout:9/756: sync 2026-03-09T19:27:57.423 INFO:tasks.workunit.client.1.vm08.stdout:5/939: dwrite d16/d45/daf/df5/d6f/f11b [0,4194304] 0 2026-03-09T19:27:57.429 INFO:tasks.workunit.client.1.vm08.stdout:9/955: dwrite d0/d2/d80/d69/ff3 [0,4194304] 0 2026-03-09T19:27:57.437 INFO:tasks.workunit.client.1.vm08.stdout:5/940: sync 2026-03-09T19:27:57.448 INFO:tasks.workunit.client.1.vm08.stdout:5/941: sync 2026-03-09T19:27:57.456 INFO:tasks.workunit.client.0.vm07.stdout:4/739: dwrite d3/d11/d2b/d38/ddc/d22/fd9 [0,4194304] 0 2026-03-09T19:27:57.457 INFO:tasks.workunit.client.0.vm07.stdout:4/740: fdatasync d3/fd7 0 2026-03-09T19:27:57.462 INFO:tasks.workunit.client.0.vm07.stdout:4/741: chown d3/dbe 290610542 1 2026-03-09T19:27:57.463 INFO:tasks.workunit.client.1.vm08.stdout:0/975: dwrite dd/d22/d27/d2e/f51 [0,4194304] 0 2026-03-09T19:27:57.477 INFO:tasks.workunit.client.1.vm08.stdout:4/916: dwrite f9 [0,4194304] 0 2026-03-09T19:27:57.481 INFO:tasks.workunit.client.1.vm08.stdout:4/917: sync 2026-03-09T19:27:57.490 INFO:tasks.workunit.client.1.vm08.stdout:9/956: mkdir d0/d1b/d68/d7f/d139 0 2026-03-09T19:27:57.498 INFO:tasks.workunit.client.1.vm08.stdout:9/957: readlink d0/d1b/d97/d48/d5d/d74/ded/ld0 0 2026-03-09T19:27:57.498 INFO:tasks.workunit.client.1.vm08.stdout:9/958: write d0/d2/d14/d98/f9e [433531,43495] 0 2026-03-09T19:27:57.522 INFO:tasks.workunit.client.0.vm07.stdout:4/742: dread d3/d11/d2b/d38/ddc/d22/f24 [0,4194304] 0 2026-03-09T19:27:57.523 INFO:tasks.workunit.client.0.vm07.stdout:4/743: fsync d3/d4f/f5e 0 2026-03-09T19:27:57.527 INFO:tasks.workunit.client.1.vm08.stdout:5/942: dread d16/d1e/f2c [0,4194304] 0 2026-03-09T19:27:57.527 INFO:tasks.workunit.client.1.vm08.stdout:5/943: readlink d16/d45/l48 0 2026-03-09T19:27:57.532 INFO:tasks.workunit.client.1.vm08.stdout:8/948: creat de/d25/f142 x:0 0 0 2026-03-09T19:27:57.535 
INFO:tasks.workunit.client.1.vm08.stdout:4/918: rmdir da/d10/d16/d28/d2f/d4f/d56 39 2026-03-09T19:27:57.541 INFO:tasks.workunit.client.1.vm08.stdout:6/972: creat d3/d34/f169 x:0 0 0 2026-03-09T19:27:57.548 INFO:tasks.workunit.client.1.vm08.stdout:2/874: write d3/d4/d23/f11a [4432752,4344] 0 2026-03-09T19:27:57.548 INFO:tasks.workunit.client.1.vm08.stdout:0/976: write dd/d22/fba [2909942,108318] 0 2026-03-09T19:27:57.551 INFO:tasks.workunit.client.1.vm08.stdout:2/875: fdatasync d3/d4/d23/d2c/d39/d5e/de/d18/f2d 0 2026-03-09T19:27:57.557 INFO:tasks.workunit.client.1.vm08.stdout:9/959: fsync d0/d2/d14/d98/f9d 0 2026-03-09T19:27:57.577 INFO:tasks.workunit.client.1.vm08.stdout:8/949: dwrite de/d91/dc8/fe4 [0,4194304] 0 2026-03-09T19:27:57.579 INFO:tasks.workunit.client.1.vm08.stdout:6/973: truncate d3/d34/da9/f108 1024608 0 2026-03-09T19:27:57.589 INFO:tasks.workunit.client.1.vm08.stdout:4/919: dwrite da/d10/d16/d28/d2f/d4f/d103/fd1 [0,4194304] 0 2026-03-09T19:27:57.623 INFO:tasks.workunit.client.0.vm07.stdout:2/808: symlink d3/dd/daa/l120 0 2026-03-09T19:27:57.624 INFO:tasks.workunit.client.1.vm08.stdout:9/960: dwrite d0/d1b/de9/d12a/da2/da8/f113 [0,4194304] 0 2026-03-09T19:27:57.628 INFO:tasks.workunit.client.1.vm08.stdout:8/950: fsync de/d25/d31/f8e 0 2026-03-09T19:27:57.628 INFO:tasks.workunit.client.0.vm07.stdout:7/718: rename d0/d4/d5/fd0 to d0/d80/db1/de5/d54/d5a/ff4 0 2026-03-09T19:27:57.635 INFO:tasks.workunit.client.1.vm08.stdout:6/974: rename d3/c91 to d3/d34/d5c/d158/c16a 0 2026-03-09T19:27:57.639 INFO:tasks.workunit.client.0.vm07.stdout:1/765: dread - d1/d11/d37/d3f/d45/d87/fa9 zero size 2026-03-09T19:27:57.640 INFO:tasks.workunit.client.0.vm07.stdout:1/766: readlink d1/d3e/dc8/ldf 0 2026-03-09T19:27:57.645 INFO:tasks.workunit.client.1.vm08.stdout:4/920: readlink da/d10/d26/d3a/db5/lc4 0 2026-03-09T19:27:57.650 INFO:tasks.workunit.client.0.vm07.stdout:3/821: link d1/d6/dd/fb0 d1/d6/d45/d54/de5/f10c 0 2026-03-09T19:27:57.654 
INFO:tasks.workunit.client.0.vm07.stdout:3/822: dwrite d1/d6/d45/dac/fe8 [0,4194304] 0 2026-03-09T19:27:57.660 INFO:tasks.workunit.client.0.vm07.stdout:5/733: mknod d3/d1a/d5d/ceb 0 2026-03-09T19:27:57.661 INFO:tasks.workunit.client.0.vm07.stdout:5/734: dread - d3/d1a/d28/d40/d92/f5e zero size 2026-03-09T19:27:57.667 INFO:tasks.workunit.client.0.vm07.stdout:0/731: creat d0/d6/d13/d17/ff0 x:0 0 0 2026-03-09T19:27:57.668 INFO:tasks.workunit.client.0.vm07.stdout:8/769: link d7/d30/c9b d7/d9/d37/d45/c113 0 2026-03-09T19:27:57.669 INFO:tasks.workunit.client.0.vm07.stdout:8/770: chown d7/d9/d10/dd8/dfd/d62/ca2 167 1 2026-03-09T19:27:57.676 INFO:tasks.workunit.client.0.vm07.stdout:0/732: dread d0/d6/d13/d17/dc3/fb6 [0,4194304] 0 2026-03-09T19:27:57.677 INFO:tasks.workunit.client.1.vm08.stdout:5/944: rmdir d16/d1e/d8c/d99/d12a 0 2026-03-09T19:27:57.692 INFO:tasks.workunit.client.0.vm07.stdout:4/744: read - d3/d11/d2b/d37/f95 zero size 2026-03-09T19:27:57.697 INFO:tasks.workunit.client.1.vm08.stdout:4/921: unlink da/d10/d1b/f37 0 2026-03-09T19:27:57.700 INFO:tasks.workunit.client.0.vm07.stdout:7/719: symlink d0/d80/db1/de5/d54/d55/lf5 0 2026-03-09T19:27:57.702 INFO:tasks.workunit.client.0.vm07.stdout:2/809: write d3/dd/d16/ffe [625052,68698] 0 2026-03-09T19:27:57.703 INFO:tasks.workunit.client.1.vm08.stdout:2/876: mknod d3/d9/d79/d46/d8c/c126 0 2026-03-09T19:27:57.704 INFO:tasks.workunit.client.1.vm08.stdout:9/961: write d0/d1b/d97/f117 [3567773,56341] 0 2026-03-09T19:27:57.708 INFO:tasks.workunit.client.1.vm08.stdout:2/877: write d3/d4/d23/d2c/d39/d5e/de/d18/da9/d110/f122 [867048,67103] 0 2026-03-09T19:27:57.720 INFO:tasks.workunit.client.0.vm07.stdout:5/735: stat d3/dd/l45 0 2026-03-09T19:27:57.723 INFO:tasks.workunit.client.0.vm07.stdout:9/757: fsync d0/f56 0 2026-03-09T19:27:57.724 INFO:tasks.workunit.client.0.vm07.stdout:0/733: fsync d0/d6/fa4 0 2026-03-09T19:27:57.725 INFO:tasks.workunit.client.0.vm07.stdout:3/823: dread d1/d1f/f38 [0,4194304] 0 
2026-03-09T19:27:57.734 INFO:tasks.workunit.client.0.vm07.stdout:8/771: write d7/d9/d10/dd8/dfd/fc0 [983493,51265] 0 2026-03-09T19:27:57.736 INFO:tasks.workunit.client.0.vm07.stdout:3/824: dread d1/d6/dd/f57 [0,4194304] 0 2026-03-09T19:27:57.737 INFO:tasks.workunit.client.0.vm07.stdout:3/825: chown d1/cfb 10 1 2026-03-09T19:27:57.748 INFO:tasks.workunit.client.0.vm07.stdout:4/745: write d3/d11/d2b/d38/ddc/d22/d86/f97 [183838,48006] 0 2026-03-09T19:27:57.748 INFO:tasks.workunit.client.0.vm07.stdout:7/720: symlink d0/d4/d5/d99/lf6 0 2026-03-09T19:27:57.750 INFO:tasks.workunit.client.0.vm07.stdout:9/758: dread d0/dc1/f7c [0,4194304] 0 2026-03-09T19:27:57.750 INFO:tasks.workunit.client.0.vm07.stdout:2/810: symlink d3/dd/d16/d29/d2d/d45/d3b/dae/l121 0 2026-03-09T19:27:57.753 INFO:tasks.workunit.client.0.vm07.stdout:2/811: write d3/dd/d16/d29/d2d/d45/d3b/d44/d96/f115 [673091,40113] 0 2026-03-09T19:27:57.754 INFO:tasks.workunit.client.1.vm08.stdout:5/945: read d16/d1e/dc9/d10c/f98 [3370566,89942] 0 2026-03-09T19:27:57.754 INFO:tasks.workunit.client.0.vm07.stdout:0/734: rename d0/c72 to d0/d6/dc8/d99/ddc/cf1 0 2026-03-09T19:27:57.755 INFO:tasks.workunit.client.0.vm07.stdout:6/708: getdents d0/d2d/dd5 0 2026-03-09T19:27:57.755 INFO:tasks.workunit.client.0.vm07.stdout:2/812: chown d3/dd/d16/ffe 386890274 1 2026-03-09T19:27:57.756 INFO:tasks.workunit.client.0.vm07.stdout:1/767: mknod d1/d11/d37/d5d/dc1/c100 0 2026-03-09T19:27:57.761 INFO:tasks.workunit.client.0.vm07.stdout:5/736: read d3/d1a/d28/d6c/d72/d8f/f91 [910,88433] 0 2026-03-09T19:27:57.767 INFO:tasks.workunit.client.0.vm07.stdout:3/826: dread d1/f98 [0,4194304] 0 2026-03-09T19:27:57.767 INFO:tasks.workunit.client.1.vm08.stdout:8/951: creat de/d117/df2/d130/f143 x:0 0 0 2026-03-09T19:27:57.774 INFO:tasks.workunit.client.0.vm07.stdout:9/759: write d0/d17/f42 [502160,103055] 0 2026-03-09T19:27:57.778 INFO:tasks.workunit.client.1.vm08.stdout:6/975: unlink d3/d34/d6f/l38 0 2026-03-09T19:27:57.784 
INFO:tasks.workunit.client.0.vm07.stdout:8/772: rename f3 to d7/d1d/f114 0 2026-03-09T19:27:57.785 INFO:tasks.workunit.client.0.vm07.stdout:0/735: truncate d0/f1e 3190751 0 2026-03-09T19:27:57.785 INFO:tasks.workunit.client.1.vm08.stdout:0/977: rename dd/d22/d63/d6e/ld3 to dd/d22/d27/d11e/d78/d86/l141 0 2026-03-09T19:27:57.785 INFO:tasks.workunit.client.1.vm08.stdout:9/962: unlink d0/d2/d14/d98/d99/ff9 0 2026-03-09T19:27:57.786 INFO:tasks.workunit.client.0.vm07.stdout:6/709: unlink d0/d1/db/d17/dc4/f60 0 2026-03-09T19:27:57.786 INFO:tasks.workunit.client.0.vm07.stdout:2/813: mkdir d3/dd/d16/d30/d40/d122 0 2026-03-09T19:27:57.788 INFO:tasks.workunit.client.1.vm08.stdout:8/952: creat de/d7c/f144 x:0 0 0 2026-03-09T19:27:57.789 INFO:tasks.workunit.client.0.vm07.stdout:2/814: write d3/dd/d16/d29/d2d/d45/d3b/d44/d96/f115 [108892,30111] 0 2026-03-09T19:27:57.800 INFO:tasks.workunit.client.0.vm07.stdout:5/737: symlink d3/d1a/d28/d6c/d72/lec 0 2026-03-09T19:27:57.807 INFO:tasks.workunit.client.0.vm07.stdout:1/768: write d1/d11/d37/d3f/dd0/fdc [773776,107194] 0 2026-03-09T19:27:57.811 INFO:tasks.workunit.client.1.vm08.stdout:0/978: rename dd/d22/df7 to dd/d22/d27/d11e/d142 0 2026-03-09T19:27:57.812 INFO:tasks.workunit.client.1.vm08.stdout:0/979: chown dd/d9d/dcc/ff8 2812 1 2026-03-09T19:27:57.817 INFO:tasks.workunit.client.1.vm08.stdout:6/976: dwrite d3/d34/d6f/f2f [0,4194304] 0 2026-03-09T19:27:57.823 INFO:tasks.workunit.client.1.vm08.stdout:2/878: symlink d3/d9/l127 0 2026-03-09T19:27:57.827 INFO:tasks.workunit.client.1.vm08.stdout:9/963: write d0/d2/f21 [2876532,75932] 0 2026-03-09T19:27:57.830 INFO:tasks.workunit.client.0.vm07.stdout:7/721: fsync d0/d4/d5/d26/d32/dbd/fba 0 2026-03-09T19:27:57.830 INFO:tasks.workunit.client.1.vm08.stdout:4/922: creat da/d10/d16/d28/d2f/f118 x:0 0 0 2026-03-09T19:27:57.834 INFO:tasks.workunit.client.0.vm07.stdout:8/773: symlink d7/d9/d10/dd8/l115 0 2026-03-09T19:27:57.837 INFO:tasks.workunit.client.0.vm07.stdout:1/769: symlink 
d1/d3e/db3/l101 0 2026-03-09T19:27:57.838 INFO:tasks.workunit.client.0.vm07.stdout:3/827: creat d1/d3d/d47/db3/dc2/d28/dfd/f10d x:0 0 0 2026-03-09T19:27:57.840 INFO:tasks.workunit.client.1.vm08.stdout:6/977: creat d3/d15/dc2/f16b x:0 0 0 2026-03-09T19:27:57.841 INFO:tasks.workunit.client.0.vm07.stdout:9/760: truncate d0/d6/ff 2667644 0 2026-03-09T19:27:57.854 INFO:tasks.workunit.client.1.vm08.stdout:5/946: dwrite d16/d1e/d8c/d99/f105 [0,4194304] 0 2026-03-09T19:27:57.854 INFO:tasks.workunit.client.1.vm08.stdout:4/923: fdatasync da/d10/d26/d27/d32/f45 0 2026-03-09T19:27:57.854 INFO:tasks.workunit.client.0.vm07.stdout:0/736: mknod d0/d6/d13/d1c/d61/cf2 0 2026-03-09T19:27:57.859 INFO:tasks.workunit.client.1.vm08.stdout:8/953: symlink de/d1d/d69/l145 0 2026-03-09T19:27:57.877 INFO:tasks.workunit.client.1.vm08.stdout:2/879: unlink d3/d4/d23/d2c/d39/d5e/db8/dff/l10d 0 2026-03-09T19:27:57.882 INFO:tasks.workunit.client.0.vm07.stdout:6/710: rmdir d0/d4e/d7f 39 2026-03-09T19:27:57.884 INFO:tasks.workunit.client.1.vm08.stdout:9/964: dread d0/d2/d14/d98/d99/fd4 [0,4194304] 0 2026-03-09T19:27:57.891 INFO:tasks.workunit.client.0.vm07.stdout:9/761: creat d0/d6f/f10b x:0 0 0 2026-03-09T19:27:57.896 INFO:tasks.workunit.client.0.vm07.stdout:4/746: rename d3/d11/d2b/d38/ddc/d22/fd9 to d3/d11/f103 0 2026-03-09T19:27:57.906 INFO:tasks.workunit.client.1.vm08.stdout:4/924: unlink f9 0 2026-03-09T19:27:57.909 INFO:tasks.workunit.client.0.vm07.stdout:7/722: unlink d0/d4/d5/d8/d41/d64/d74/d98/l9c 0 2026-03-09T19:27:57.915 INFO:tasks.workunit.client.1.vm08.stdout:0/980: dwrite dd/d22/d27/d65/ddf/f108 [0,4194304] 0 2026-03-09T19:27:57.918 INFO:tasks.workunit.client.1.vm08.stdout:6/978: write d3/d34/d5c/da2/d107/f115 [787654,90048] 0 2026-03-09T19:27:57.920 INFO:tasks.workunit.client.0.vm07.stdout:0/737: truncate d0/f41 1250452 0 2026-03-09T19:27:57.920 INFO:tasks.workunit.client.1.vm08.stdout:6/979: chown d3/d34/da9/da4/d117 1 1 2026-03-09T19:27:57.921 
INFO:tasks.workunit.client.1.vm08.stdout:6/980: write d3/db/d43/d69/da0/fdf [2203586,110156] 0 2026-03-09T19:27:57.923 INFO:tasks.workunit.client.1.vm08.stdout:6/981: chown d3/l95 2044 1 2026-03-09T19:27:57.929 INFO:tasks.workunit.client.1.vm08.stdout:2/880: chown d3/d4/d23/d2c/c62 984 1 2026-03-09T19:27:57.932 INFO:tasks.workunit.client.0.vm07.stdout:3/828: mkdir d1/d3d/d47/d10e 0 2026-03-09T19:27:57.934 INFO:tasks.workunit.client.0.vm07.stdout:9/762: creat d0/d6/d3a/f10c x:0 0 0 2026-03-09T19:27:57.937 INFO:tasks.workunit.client.0.vm07.stdout:2/815: rename d3/ff to d3/dd/d16/d29/d2d/d45/d8b/d98/dee/f123 0 2026-03-09T19:27:57.937 INFO:tasks.workunit.client.1.vm08.stdout:4/925: fdatasync da/d10/d26/d27/d9b/f108 0 2026-03-09T19:27:57.938 INFO:tasks.workunit.client.0.vm07.stdout:2/816: stat d3/dd/d16/d29/d3c/d4c/ca8 0 2026-03-09T19:27:57.939 INFO:tasks.workunit.client.0.vm07.stdout:7/723: readlink d0/d4/d5/d8/d1a/l8c 0 2026-03-09T19:27:57.940 INFO:tasks.workunit.client.0.vm07.stdout:2/817: stat d3/dd/d16/d29/d3c/d5a/d7a/d74/la6 0 2026-03-09T19:27:57.941 INFO:tasks.workunit.client.0.vm07.stdout:2/818: chown d3/dd/d16/d29/d2d/d45/df6 26247 1 2026-03-09T19:27:57.942 INFO:tasks.workunit.client.1.vm08.stdout:5/947: mknod d16/d1e/d8c/d99/da8/d9a/c134 0 2026-03-09T19:27:57.944 INFO:tasks.workunit.client.1.vm08.stdout:0/981: creat dd/d31/d132/f143 x:0 0 0 2026-03-09T19:27:57.945 INFO:tasks.workunit.client.0.vm07.stdout:9/763: sync 2026-03-09T19:27:57.946 INFO:tasks.workunit.client.0.vm07.stdout:2/819: dwrite d3/dd/fe [4194304,4194304] 0 2026-03-09T19:27:57.953 INFO:tasks.workunit.client.0.vm07.stdout:6/711: mknod d0/d2d/dd5/df5/c115 0 2026-03-09T19:27:57.961 INFO:tasks.workunit.client.0.vm07.stdout:6/712: dwrite d0/d2d/f4a [4194304,4194304] 0 2026-03-09T19:27:57.981 INFO:tasks.workunit.client.1.vm08.stdout:6/982: fsync d3/d15/f2b 0 2026-03-09T19:27:57.981 INFO:tasks.workunit.client.1.vm08.stdout:2/881: creat d3/d4/d23/d2c/d39/db9/df6/f128 x:0 0 0 2026-03-09T19:27:57.981 
INFO:tasks.workunit.client.1.vm08.stdout:9/965: mknod d0/d1b/de9/d12a/da2/da8/de8/c13a 0 2026-03-09T19:27:57.981 INFO:tasks.workunit.client.1.vm08.stdout:8/954: rmdir de/d7c/d13f 0 2026-03-09T19:27:57.982 INFO:tasks.workunit.client.0.vm07.stdout:3/829: fdatasync d1/d6/dd/f4d 0 2026-03-09T19:27:57.982 INFO:tasks.workunit.client.0.vm07.stdout:4/747: dread d3/d11/d51/f8e [0,4194304] 0 2026-03-09T19:27:57.982 INFO:tasks.workunit.client.0.vm07.stdout:5/738: rename d3/dd/d26/d3f/d47/fb6 to d3/d1a/d5a/db8/fed 0 2026-03-09T19:27:57.982 INFO:tasks.workunit.client.0.vm07.stdout:7/724: dread - d0/d4/f86 zero size 2026-03-09T19:27:57.982 INFO:tasks.workunit.client.0.vm07.stdout:8/774: getdents d7/d9/d10/dd8/dfd/d67/de7 0 2026-03-09T19:27:57.984 INFO:tasks.workunit.client.0.vm07.stdout:3/830: dread d1/d74/f55 [0,4194304] 0 2026-03-09T19:27:57.987 INFO:tasks.workunit.client.1.vm08.stdout:8/955: write de/d25/f142 [830800,15988] 0 2026-03-09T19:27:57.994 INFO:tasks.workunit.client.1.vm08.stdout:2/882: dread d3/d9/d79/d46/f72 [0,4194304] 0 2026-03-09T19:27:57.995 INFO:tasks.workunit.client.1.vm08.stdout:6/983: dread d3/d34/da9/f97 [0,4194304] 0 2026-03-09T19:27:57.995 INFO:tasks.workunit.client.1.vm08.stdout:6/984: read - d3/d15/dc2/f16b zero size 2026-03-09T19:27:58.000 INFO:tasks.workunit.client.0.vm07.stdout:2/820: dread d3/dd/d103/fb6 [0,4194304] 0 2026-03-09T19:27:58.004 INFO:tasks.workunit.client.1.vm08.stdout:8/956: stat de/d1d/f1e 0 2026-03-09T19:27:58.004 INFO:tasks.workunit.client.1.vm08.stdout:4/926: rename da/d10/d1b/f85 to da/d10/d26/d3a/f119 0 2026-03-09T19:27:58.004 INFO:tasks.workunit.client.0.vm07.stdout:6/713: creat d0/d1/d28/da9/f116 x:0 0 0 2026-03-09T19:27:58.004 INFO:tasks.workunit.client.0.vm07.stdout:1/770: rename d1/d3e/db3/d6d/fb0 to d1/d11/d37/dcb/f102 0 2026-03-09T19:27:58.004 INFO:tasks.workunit.client.0.vm07.stdout:0/738: creat d0/ff3 x:0 0 0 2026-03-09T19:27:58.005 INFO:tasks.workunit.client.1.vm08.stdout:0/982: sync 2026-03-09T19:27:58.012 
INFO:tasks.workunit.client.1.vm08.stdout:2/883: dread d3/d9/d79/f98 [0,4194304] 0 2026-03-09T19:27:58.016 INFO:tasks.workunit.client.0.vm07.stdout:2/821: unlink d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4/lea 0 2026-03-09T19:27:58.017 INFO:tasks.workunit.client.1.vm08.stdout:2/884: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/da9/f118 [0,4194304] 0 2026-03-09T19:27:58.021 INFO:tasks.workunit.client.1.vm08.stdout:8/957: link de/d47/fc1 de/d1d/d21/f146 0 2026-03-09T19:27:58.021 INFO:tasks.workunit.client.1.vm08.stdout:6/985: creat d3/db/d43/d69/f16c x:0 0 0 2026-03-09T19:27:58.024 INFO:tasks.workunit.client.0.vm07.stdout:2/822: truncate d3/dd/d16/d30/d40/f107 4723688 0 2026-03-09T19:27:58.028 INFO:tasks.workunit.client.0.vm07.stdout:2/823: stat d3/fc 0 2026-03-09T19:27:58.028 INFO:tasks.workunit.client.1.vm08.stdout:5/948: write d16/d45/fb1 [860011,97781] 0 2026-03-09T19:27:58.031 INFO:tasks.workunit.client.1.vm08.stdout:5/949: read d16/d1e/d8c/d99/f11a [4031105,92366] 0 2026-03-09T19:27:58.033 INFO:tasks.workunit.client.0.vm07.stdout:9/764: write d0/db/d29/d32/d5c/f78 [1199822,49580] 0 2026-03-09T19:27:58.034 INFO:tasks.workunit.client.0.vm07.stdout:9/765: readlink d0/d6f/d86/lec 0 2026-03-09T19:27:58.035 INFO:tasks.workunit.client.0.vm07.stdout:9/766: dread - d0/db/fda zero size 2026-03-09T19:27:58.038 INFO:tasks.workunit.client.0.vm07.stdout:1/771: mknod d1/d3e/db3/d6d/dff/c103 0 2026-03-09T19:27:58.047 INFO:tasks.workunit.client.1.vm08.stdout:9/966: dwrite d0/d2/d14/d98/f10b [0,4194304] 0 2026-03-09T19:27:58.047 INFO:tasks.workunit.client.1.vm08.stdout:2/885: unlink d3/d4/d23/d2c/f94 0 2026-03-09T19:27:58.047 INFO:tasks.workunit.client.0.vm07.stdout:0/739: chown d0/cf 845750 1 2026-03-09T19:27:58.047 INFO:tasks.workunit.client.0.vm07.stdout:5/739: write d3/dd/d26/d2d/faa [404881,95906] 0 2026-03-09T19:27:58.051 INFO:tasks.workunit.client.0.vm07.stdout:8/775: write d7/d9/d10/fb9 [6060895,76319] 0 2026-03-09T19:27:58.051 INFO:tasks.workunit.client.1.vm08.stdout:0/983: write 
dd/d22/d27/d2e/f39 [716052,40593] 0 2026-03-09T19:27:58.052 INFO:tasks.workunit.client.0.vm07.stdout:8/776: fsync d7/d9/d37/fb4 0 2026-03-09T19:27:58.056 INFO:tasks.workunit.client.1.vm08.stdout:6/986: symlink d3/d15/dc2/d12f/l16d 0 2026-03-09T19:27:58.057 INFO:tasks.workunit.client.0.vm07.stdout:7/725: link d0/d4/d5/d8/d41/d64/fc1 d0/d80/db1/ff7 0 2026-03-09T19:27:58.058 INFO:tasks.workunit.client.1.vm08.stdout:5/950: creat d16/d8e/dd5/f135 x:0 0 0 2026-03-09T19:27:58.061 INFO:tasks.workunit.client.0.vm07.stdout:4/748: creat d3/d11/f104 x:0 0 0 2026-03-09T19:27:58.061 INFO:tasks.workunit.client.0.vm07.stdout:2/824: creat d3/d49/f124 x:0 0 0 2026-03-09T19:27:58.061 INFO:tasks.workunit.client.1.vm08.stdout:9/967: rmdir d0/d1b/d97/d48/d5d/d74 39 2026-03-09T19:27:58.061 INFO:tasks.workunit.client.0.vm07.stdout:2/825: stat d3/d49/cc4 0 2026-03-09T19:27:58.062 INFO:tasks.workunit.client.0.vm07.stdout:4/749: dread - d3/d11/d51/faa zero size 2026-03-09T19:27:58.063 INFO:tasks.workunit.client.0.vm07.stdout:2/826: chown d3/dd/d103/ddd/ded/df3/c10f 8209440 1 2026-03-09T19:27:58.064 INFO:tasks.workunit.client.0.vm07.stdout:6/714: creat d0/d1/db/d91/f117 x:0 0 0 2026-03-09T19:27:58.065 INFO:tasks.workunit.client.0.vm07.stdout:9/767: truncate d0/d6/f7b 238831 0 2026-03-09T19:27:58.065 INFO:tasks.workunit.client.1.vm08.stdout:0/984: readlink dd/lda 0 2026-03-09T19:27:58.066 INFO:tasks.workunit.client.0.vm07.stdout:1/772: mknod d1/d3/d21/c104 0 2026-03-09T19:27:58.066 INFO:tasks.workunit.client.1.vm08.stdout:6/987: rmdir d3/dbc/deb 39 2026-03-09T19:27:58.068 INFO:tasks.workunit.client.1.vm08.stdout:8/958: rename de/d25/d87/dc9/dd8/c100 to de/d1d/d4f/d141/c147 0 2026-03-09T19:27:58.071 INFO:tasks.workunit.client.0.vm07.stdout:8/777: sync 2026-03-09T19:27:58.073 INFO:tasks.workunit.client.1.vm08.stdout:5/951: dread - d16/d45/daf/df5/d6f/f110 zero size 2026-03-09T19:27:58.074 INFO:tasks.workunit.client.0.vm07.stdout:5/740: mkdir d3/d1a/d5d/dee 0 2026-03-09T19:27:58.078 
INFO:tasks.workunit.client.1.vm08.stdout:4/927: symlink da/d10/l11a 0 2026-03-09T19:27:58.079 INFO:tasks.workunit.client.0.vm07.stdout:7/726: fdatasync d0/d4/d5/d26/db9/dc2/fd1 0 2026-03-09T19:27:58.080 INFO:tasks.workunit.client.1.vm08.stdout:9/968: rmdir d0/d2/d14/d98/d99/dea 39 2026-03-09T19:27:58.081 INFO:tasks.workunit.client.1.vm08.stdout:2/886: symlink d3/d4/l129 0 2026-03-09T19:27:58.083 INFO:tasks.workunit.client.0.vm07.stdout:7/727: chown d0/d4/d5/d8/d41/d64/d74/f82 940119 1 2026-03-09T19:27:58.084 INFO:tasks.workunit.client.1.vm08.stdout:2/887: dread d3/d4/d23/d2c/d39/d5e/de/f1c [4194304,4194304] 0 2026-03-09T19:27:58.084 INFO:tasks.workunit.client.0.vm07.stdout:3/831: dwrite d1/f20 [0,4194304] 0 2026-03-09T19:27:58.089 INFO:tasks.workunit.client.0.vm07.stdout:4/750: dread d3/d11/d2b/d38/ddc/f67 [0,4194304] 0 2026-03-09T19:27:58.096 INFO:tasks.workunit.client.1.vm08.stdout:0/985: read dd/d22/f28 [59573,96158] 0 2026-03-09T19:27:58.099 INFO:tasks.workunit.client.0.vm07.stdout:4/751: dwrite d3/fa2 [0,4194304] 0 2026-03-09T19:27:58.102 INFO:tasks.workunit.client.1.vm08.stdout:6/988: fdatasync d3/d34/d6f/f4f 0 2026-03-09T19:27:58.114 INFO:tasks.workunit.client.0.vm07.stdout:9/768: fdatasync d0/db/d29/d2c/d36/f62 0 2026-03-09T19:27:58.115 INFO:tasks.workunit.client.0.vm07.stdout:9/769: dread - d0/db/f108 zero size 2026-03-09T19:27:58.115 INFO:tasks.workunit.client.1.vm08.stdout:8/959: symlink de/d117/df2/d130/l148 0 2026-03-09T19:27:58.121 INFO:tasks.workunit.client.0.vm07.stdout:1/773: unlink d1/d3/l41 0 2026-03-09T19:27:58.121 INFO:tasks.workunit.client.1.vm08.stdout:5/952: fdatasync d16/d1e/d8c/d99/da8/fd0 0 2026-03-09T19:27:58.123 INFO:tasks.workunit.client.0.vm07.stdout:0/740: creat d0/d6/d13/d1c/d52/ff4 x:0 0 0 2026-03-09T19:27:58.128 INFO:tasks.workunit.client.1.vm08.stdout:0/986: truncate dd/d7e/fde 906933 0 2026-03-09T19:27:58.128 INFO:tasks.workunit.client.0.vm07.stdout:5/741: symlink d3/d1a/d5a/lef 0 2026-03-09T19:27:58.128 
INFO:tasks.workunit.client.0.vm07.stdout:8/778: dread d7/d9/d10/d44/fdc [0,4194304] 0 2026-03-09T19:27:58.128 INFO:tasks.workunit.client.0.vm07.stdout:7/728: mknod d0/d80/db1/de5/db4/cf8 0 2026-03-09T19:27:58.130 INFO:tasks.workunit.client.0.vm07.stdout:3/832: rmdir d1/d6/dd/dbf 39 2026-03-09T19:27:58.130 INFO:tasks.workunit.client.1.vm08.stdout:8/960: creat de/d1d/d2e/f149 x:0 0 0 2026-03-09T19:27:58.131 INFO:tasks.workunit.client.0.vm07.stdout:8/779: dread d7/d9/d10/dd8/dfd/d67/fa3 [0,4194304] 0 2026-03-09T19:27:58.131 INFO:tasks.workunit.client.0.vm07.stdout:4/752: stat d3/d11/d29/c9e 0 2026-03-09T19:27:58.131 INFO:tasks.workunit.client.1.vm08.stdout:5/953: dread - d16/d1e/d3b/d61/d11e/d107/fde zero size 2026-03-09T19:27:58.132 INFO:tasks.workunit.client.0.vm07.stdout:2/827: symlink d3/dd/daa/dc7/l125 0 2026-03-09T19:27:58.133 INFO:tasks.workunit.client.1.vm08.stdout:4/928: mknod da/d10/d26/d3a/d69/df1/c11b 0 2026-03-09T19:27:58.133 INFO:tasks.workunit.client.0.vm07.stdout:2/828: chown d3/dd/d16/d29/d3c/d4c/c6f 4469631 1 2026-03-09T19:27:58.134 INFO:tasks.workunit.client.0.vm07.stdout:9/770: symlink d0/d6f/dc3/l10d 0 2026-03-09T19:27:58.135 INFO:tasks.workunit.client.1.vm08.stdout:0/987: symlink dd/d22/d27/d11e/d78/d86/l144 0 2026-03-09T19:27:58.147 INFO:tasks.workunit.client.1.vm08.stdout:4/929: write da/d10/d16/d28/d2f/d4f/d103/d40/d6c/f92 [997167,108406] 0 2026-03-09T19:27:58.151 INFO:tasks.workunit.client.0.vm07.stdout:6/715: dwrite d0/d1/db/d1d/d77/ff0 [4194304,4194304] 0 2026-03-09T19:27:58.152 INFO:tasks.workunit.client.1.vm08.stdout:9/969: write d0/d1b/f7c [284828,22455] 0 2026-03-09T19:27:58.153 INFO:tasks.workunit.client.1.vm08.stdout:9/970: readlink d0/d1b/d68/dfe/l11e 0 2026-03-09T19:27:58.159 INFO:tasks.workunit.client.0.vm07.stdout:3/833: rmdir d1/d1f 39 2026-03-09T19:27:58.159 INFO:tasks.workunit.client.1.vm08.stdout:8/961: symlink de/d25/d33/l14a 0 2026-03-09T19:27:58.161 INFO:tasks.workunit.client.0.vm07.stdout:8/780: mknod 
d7/d1d/d83/d9f/dd2/def/c116 0 2026-03-09T19:27:58.162 INFO:tasks.workunit.client.0.vm07.stdout:2/829: mkdir d3/dd/d16/d29/d3c/da2/d126 0 2026-03-09T19:27:58.163 INFO:tasks.workunit.client.0.vm07.stdout:2/830: stat d3/d11/f19 0 2026-03-09T19:27:58.165 INFO:tasks.workunit.client.1.vm08.stdout:2/888: getdents d3/d4/d23/d2c/d39/db9/df6 0 2026-03-09T19:27:58.167 INFO:tasks.workunit.client.0.vm07.stdout:5/742: mkdir d3/dd/d26/d2d/d9e/df0 0 2026-03-09T19:27:58.176 INFO:tasks.workunit.client.0.vm07.stdout:1/774: dwrite d1/d11/d37/d5d/d50/f8b [0,4194304] 0 2026-03-09T19:27:58.176 INFO:tasks.workunit.client.0.vm07.stdout:1/775: chown d1/d11/d37/d3f/d45/d87/d88/df4 1982015828 1 2026-03-09T19:27:58.176 INFO:tasks.workunit.client.1.vm08.stdout:9/971: chown d0/d1b/d97/d48/d5d/d74/ded/lab 3581 1 2026-03-09T19:27:58.176 INFO:tasks.workunit.client.1.vm08.stdout:9/972: dread - d0/d1b/d97/d106/f126 zero size 2026-03-09T19:27:58.176 INFO:tasks.workunit.client.1.vm08.stdout:6/989: write d3/f7 [2924700,47078] 0 2026-03-09T19:27:58.179 INFO:tasks.workunit.client.1.vm08.stdout:8/962: fdatasync de/d91/fd6 0 2026-03-09T19:27:58.186 INFO:tasks.workunit.client.1.vm08.stdout:5/954: write d16/d1e/db3/fc6 [316109,79883] 0 2026-03-09T19:27:58.189 INFO:tasks.workunit.client.1.vm08.stdout:4/930: mknod da/d10/d16/d28/d2f/d4f/d103/c11c 0 2026-03-09T19:27:58.190 INFO:tasks.workunit.client.0.vm07.stdout:6/716: rmdir d0/dbf/d95 39 2026-03-09T19:27:58.191 INFO:tasks.workunit.client.0.vm07.stdout:4/753: mknod d3/dfc/c105 0 2026-03-09T19:27:58.191 INFO:tasks.workunit.client.0.vm07.stdout:3/834: mkdir d1/d6/d45/d54/de5/d10f 0 2026-03-09T19:27:58.192 INFO:tasks.workunit.client.1.vm08.stdout:9/973: creat d0/d1b/d122/f13b x:0 0 0 2026-03-09T19:27:58.192 INFO:tasks.workunit.client.0.vm07.stdout:3/835: chown d1/d74/f52 3114 1 2026-03-09T19:27:58.193 INFO:tasks.workunit.client.0.vm07.stdout:3/836: dread - d1/d89/fc7 zero size 2026-03-09T19:27:58.194 INFO:tasks.workunit.client.0.vm07.stdout:8/781: rename 
d7/d9/d37/d45/d4f/fd4 to d7/d9/d37/d45/d97/f117 0 2026-03-09T19:27:58.194 INFO:tasks.workunit.client.0.vm07.stdout:3/837: truncate d1/d6/d45/fbe 668602 0 2026-03-09T19:27:58.198 INFO:tasks.workunit.client.1.vm08.stdout:8/963: mknod de/d1d/d21/c14b 0 2026-03-09T19:27:58.198 INFO:tasks.workunit.client.0.vm07.stdout:0/741: rmdir d0/d6/d13/d1c/d11/d56/d62 0 2026-03-09T19:27:58.198 INFO:tasks.workunit.client.1.vm08.stdout:0/988: getdents dd/d22/d27/d2e 0 2026-03-09T19:27:58.202 INFO:tasks.workunit.client.0.vm07.stdout:4/754: dread d3/fa2 [0,4194304] 0 2026-03-09T19:27:58.204 INFO:tasks.workunit.client.1.vm08.stdout:9/974: dread d0/d2/d14/d98/f38 [0,4194304] 0 2026-03-09T19:27:58.204 INFO:tasks.workunit.client.0.vm07.stdout:2/831: dread d3/d49/fa1 [0,4194304] 0 2026-03-09T19:27:58.204 INFO:tasks.workunit.client.0.vm07.stdout:7/729: rmdir d0/d4/d5/d8/dcd 0 2026-03-09T19:27:58.208 INFO:tasks.workunit.client.0.vm07.stdout:0/742: dwrite d0/d6/d13/d33/fe1 [0,4194304] 0 2026-03-09T19:27:58.210 INFO:tasks.workunit.client.1.vm08.stdout:5/955: dread d16/d1e/d8c/d99/da8/fbc [0,4194304] 0 2026-03-09T19:27:58.212 INFO:tasks.workunit.client.0.vm07.stdout:6/717: dread d0/d1/fd4 [0,4194304] 0 2026-03-09T19:27:58.215 INFO:tasks.workunit.client.0.vm07.stdout:4/755: dread d3/d11/d29/ff4 [0,4194304] 0 2026-03-09T19:27:58.220 INFO:tasks.workunit.client.1.vm08.stdout:4/931: mknod da/d10/d26/d3a/db5/ddb/c11d 0 2026-03-09T19:27:58.221 INFO:tasks.workunit.client.1.vm08.stdout:4/932: chown da/d10/d26/d27/f35 4624606 1 2026-03-09T19:27:58.222 INFO:tasks.workunit.client.0.vm07.stdout:6/718: dread d0/d1/db/f70 [0,4194304] 0 2026-03-09T19:27:58.222 INFO:tasks.workunit.client.1.vm08.stdout:2/889: link d3/d4/d3e/d4e/d88/l8a d3/d4/d3e/df2/l12a 0 2026-03-09T19:27:58.224 INFO:tasks.workunit.client.1.vm08.stdout:0/989: dread - dd/d9d/dcc/ff8 zero size 2026-03-09T19:27:58.225 INFO:tasks.workunit.client.0.vm07.stdout:5/743: symlink d3/dd/dda/lf1 0 2026-03-09T19:27:58.226 
INFO:tasks.workunit.client.0.vm07.stdout:5/744: truncate d3/dd/dbe/fce 1874238 0 2026-03-09T19:27:58.226 INFO:tasks.workunit.client.1.vm08.stdout:6/990: creat d3/db/f16e x:0 0 0 2026-03-09T19:27:58.227 INFO:tasks.workunit.client.0.vm07.stdout:5/745: readlink d3/dd/d26/d3f/d47/d71/lc3 0 2026-03-09T19:27:58.227 INFO:tasks.workunit.client.0.vm07.stdout:1/776: symlink d1/db/d31/dca/l105 0 2026-03-09T19:27:58.228 INFO:tasks.workunit.client.0.vm07.stdout:1/777: chown d1/d3e/db3/d6d/dff/fef 2585819 1 2026-03-09T19:27:58.233 INFO:tasks.workunit.client.0.vm07.stdout:0/743: fsync d0/d6/d13/f6c 0 2026-03-09T19:27:58.233 INFO:tasks.workunit.client.0.vm07.stdout:9/771: dwrite d0/d6/ff [0,4194304] 0 2026-03-09T19:27:58.233 INFO:tasks.workunit.client.1.vm08.stdout:4/933: creat da/d10/d26/d27/d9b/f11e x:0 0 0 2026-03-09T19:27:58.234 INFO:tasks.workunit.client.1.vm08.stdout:4/934: chown da/d10/d26/d27/d9b 1566 1 2026-03-09T19:27:58.234 INFO:tasks.workunit.client.0.vm07.stdout:9/772: write d0/db/d29/d68/f8e [3304459,4064] 0 2026-03-09T19:27:58.241 INFO:tasks.workunit.client.1.vm08.stdout:0/990: creat dd/d31/dca/f145 x:0 0 0 2026-03-09T19:27:58.251 INFO:tasks.workunit.client.0.vm07.stdout:7/730: creat d0/d4/d5/d8/d41/d64/d74/d98/dcb/d39/ff9 x:0 0 0 2026-03-09T19:27:58.251 INFO:tasks.workunit.client.0.vm07.stdout:2/832: creat d3/dd/d16/d29/d2d/d45/d3b/f127 x:0 0 0 2026-03-09T19:27:58.251 INFO:tasks.workunit.client.0.vm07.stdout:8/782: mkdir d7/d9/d37/d45/d4f/db1/d107/d118 0 2026-03-09T19:27:58.251 INFO:tasks.workunit.client.1.vm08.stdout:0/991: fsync dd/d22/fba 0 2026-03-09T19:27:58.251 INFO:tasks.workunit.client.1.vm08.stdout:5/956: fsync d16/d8e/dff/fb0 0 2026-03-09T19:27:58.251 INFO:tasks.workunit.client.1.vm08.stdout:2/890: mknod d3/d9/d79/d46/d8c/d92/d120/c12b 0 2026-03-09T19:27:58.253 INFO:tasks.workunit.client.1.vm08.stdout:9/975: sync 2026-03-09T19:27:58.253 INFO:tasks.workunit.client.0.vm07.stdout:6/719: sync 2026-03-09T19:27:58.260 
INFO:tasks.workunit.client.1.vm08.stdout:0/992: symlink dd/d22/d27/d2e/d37/l146 0 2026-03-09T19:27:58.260 INFO:tasks.workunit.client.0.vm07.stdout:0/744: symlink d0/d6/d13/d17/dc3/dbc/lf5 0 2026-03-09T19:27:58.262 INFO:tasks.workunit.client.1.vm08.stdout:5/957: truncate d16/f34 3868973 0 2026-03-09T19:27:58.262 INFO:tasks.workunit.client.0.vm07.stdout:9/773: mkdir d0/d6/d73/d10e 0 2026-03-09T19:27:58.262 INFO:tasks.workunit.client.0.vm07.stdout:7/731: fsync d0/d80/db1/de5/d54/f5e 0 2026-03-09T19:27:58.263 INFO:tasks.workunit.client.1.vm08.stdout:2/891: fsync d3/d4/d23/d2c/d39/d5e/de/f32 0 2026-03-09T19:27:58.264 INFO:tasks.workunit.client.1.vm08.stdout:9/976: fsync d0/d2/d14/d5c/fb0 0 2026-03-09T19:27:58.268 INFO:tasks.workunit.client.1.vm08.stdout:5/958: mkdir d16/d1e/dc9/d102/d136 0 2026-03-09T19:27:58.270 INFO:tasks.workunit.client.1.vm08.stdout:2/892: creat d3/d4/d10e/f12c x:0 0 0 2026-03-09T19:27:58.271 INFO:tasks.workunit.client.0.vm07.stdout:6/720: chown d0/dbf/d95/feb 0 1 2026-03-09T19:27:58.271 INFO:tasks.workunit.client.0.vm07.stdout:1/778: symlink d1/db/d31/l106 0 2026-03-09T19:27:58.275 INFO:tasks.workunit.client.1.vm08.stdout:0/993: sync 2026-03-09T19:27:58.281 INFO:tasks.workunit.client.1.vm08.stdout:2/893: creat d3/d9/d26/f12d x:0 0 0 2026-03-09T19:27:58.282 INFO:tasks.workunit.client.0.vm07.stdout:0/745: mkdir d0/d6/d13/d17/dc3/df6 0 2026-03-09T19:27:58.283 INFO:tasks.workunit.client.0.vm07.stdout:9/774: creat d0/db/d29/d4d/f10f x:0 0 0 2026-03-09T19:27:58.286 INFO:tasks.workunit.client.1.vm08.stdout:8/964: truncate de/f1f 768346 0 2026-03-09T19:27:58.287 INFO:tasks.workunit.client.1.vm08.stdout:8/965: chown de/d1d/d21/f146 12966 1 2026-03-09T19:27:58.295 INFO:tasks.workunit.client.1.vm08.stdout:4/935: write da/d10/d16/d28/d46/fbe [781995,127563] 0 2026-03-09T19:27:58.296 INFO:tasks.workunit.client.1.vm08.stdout:6/991: dwrite d3/d15/f19 [0,4194304] 0 2026-03-09T19:27:58.312 INFO:tasks.workunit.client.1.vm08.stdout:9/977: dwrite d0/d1b/de9/dfd/f10d 
[0,4194304] 0 2026-03-09T19:27:58.312 INFO:tasks.workunit.client.1.vm08.stdout:5/959: dwrite d16/d1e/fa5 [4194304,4194304] 0 2026-03-09T19:27:58.326 INFO:tasks.workunit.client.0.vm07.stdout:2/833: unlink d3/dd/d16/c21 0 2026-03-09T19:27:58.327 INFO:tasks.workunit.client.0.vm07.stdout:5/746: write d3/dd/d26/d2d/fae [1044697,109115] 0 2026-03-09T19:27:58.332 INFO:tasks.workunit.client.0.vm07.stdout:4/756: link d3/d11/d2b/d38/ddc/d22/d86/lf6 d3/d11/d29/d101/d99/l106 0 2026-03-09T19:27:58.333 INFO:tasks.workunit.client.0.vm07.stdout:5/747: chown d3/dd/d26/d3f/d47/d71/db7/fbc 4 1 2026-03-09T19:27:58.333 INFO:tasks.workunit.client.0.vm07.stdout:8/783: symlink d7/d9/l119 0 2026-03-09T19:27:58.337 INFO:tasks.workunit.client.0.vm07.stdout:3/838: getdents d1/d6/d4c/dfa 0 2026-03-09T19:27:58.338 INFO:tasks.workunit.client.0.vm07.stdout:3/839: readlink d1/d6/d45/d54/de5/lf8 0 2026-03-09T19:27:58.338 INFO:tasks.workunit.client.0.vm07.stdout:0/746: truncate d0/fa8 682946 0 2026-03-09T19:27:58.338 INFO:tasks.workunit.client.0.vm07.stdout:9/775: symlink d0/d6f/dc3/df8/l110 0 2026-03-09T19:27:58.341 INFO:tasks.workunit.client.0.vm07.stdout:9/776: dread d0/d6/d3a/fc6 [0,4194304] 0 2026-03-09T19:27:58.341 INFO:tasks.workunit.client.1.vm08.stdout:2/894: mknod d3/d4/d23/d2c/d39/d5e/de/d18/d99/c12e 0 2026-03-09T19:27:58.341 INFO:tasks.workunit.client.0.vm07.stdout:9/777: write d0/db/d29/d4d/f95 [2956693,125797] 0 2026-03-09T19:27:58.342 INFO:tasks.workunit.client.1.vm08.stdout:6/992: creat d3/d15/dc2/d12f/f16f x:0 0 0 2026-03-09T19:27:58.342 INFO:tasks.workunit.client.0.vm07.stdout:9/778: fsync d0/d6f/dc3/fff 0 2026-03-09T19:27:58.342 INFO:tasks.workunit.client.1.vm08.stdout:0/994: mknod dd/d22/d27/d11e/d78/c147 0 2026-03-09T19:27:58.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:58 vm08.local ceph-mon[57794]: Updating vm07:/etc/ceph/ceph.conf 2026-03-09T19:27:58.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:58 vm08.local ceph-mon[57794]: Updating 
vm08:/etc/ceph/ceph.conf 2026-03-09T19:27:58.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:58 vm08.local ceph-mon[57794]: pgmap v8: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 27 MiB/s rd, 57 MiB/s wr, 166 op/s 2026-03-09T19:27:58.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:58 vm08.local ceph-mon[57794]: from='client.14738 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:58.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:58 vm08.local ceph-mon[57794]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:27:58.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:58 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/4186052139' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:27:58.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:58 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:58.345 INFO:tasks.workunit.client.0.vm07.stdout:2/834: mkdir d3/dd/d16/d29/d3c/d4c/d128 0 2026-03-09T19:27:58.347 INFO:tasks.workunit.client.0.vm07.stdout:2/835: readlink d3/dd/d16/leb 0 2026-03-09T19:27:58.349 INFO:tasks.workunit.client.1.vm08.stdout:5/960: creat d16/d1e/d3b/d61/d11e/d107/d114/f137 x:0 0 0 2026-03-09T19:27:58.351 INFO:tasks.workunit.client.1.vm08.stdout:8/966: symlink de/d47/dfd/d99/da5/db3/d128/l14c 0 2026-03-09T19:27:58.351 INFO:tasks.workunit.client.1.vm08.stdout:5/961: dread - d16/fb8 zero size 2026-03-09T19:27:58.352 INFO:tasks.workunit.client.1.vm08.stdout:5/962: truncate d16/d1e/d8c/fab 4226815 0 2026-03-09T19:27:58.355 INFO:tasks.workunit.client.0.vm07.stdout:5/748: dread d3/f25 [0,4194304] 0 2026-03-09T19:27:58.356 INFO:tasks.workunit.client.0.vm07.stdout:5/749: chown d3/d1a/d28/d40/d92/d89/cc0 5057 1 2026-03-09T19:27:58.360 INFO:tasks.workunit.client.0.vm07.stdout:8/784: 
dread d7/d50/da6/fb8 [0,4194304] 0 2026-03-09T19:27:58.361 INFO:tasks.workunit.client.0.vm07.stdout:8/785: readlink d7/d9/d10/dd8/dfd/d67/l77 0 2026-03-09T19:27:58.361 INFO:tasks.workunit.client.0.vm07.stdout:6/721: symlink d0/d4e/l118 0 2026-03-09T19:27:58.372 INFO:tasks.workunit.client.0.vm07.stdout:7/732: creat d0/d4/d5/d26/ffa x:0 0 0 2026-03-09T19:27:58.372 INFO:tasks.workunit.client.1.vm08.stdout:4/936: creat da/d10/d16/d28/d2f/d4f/d64/d81/dfb/f11f x:0 0 0 2026-03-09T19:27:58.373 INFO:tasks.workunit.client.1.vm08.stdout:6/993: chown d3/d34/f10e 3 1 2026-03-09T19:27:58.376 INFO:tasks.workunit.client.1.vm08.stdout:6/994: dread d3/d15/f19 [0,4194304] 0 2026-03-09T19:27:58.395 INFO:tasks.workunit.client.1.vm08.stdout:9/978: dwrite d0/d2/d14/d98/d99/fd4 [0,4194304] 0 2026-03-09T19:27:58.402 INFO:tasks.workunit.client.0.vm07.stdout:4/757: dwrite d3/d11/d2b/d38/ddc/fb4 [0,4194304] 0 2026-03-09T19:27:58.406 INFO:tasks.workunit.client.0.vm07.stdout:4/758: truncate d3/d11/d2b/d38/fdf 605691 0 2026-03-09T19:27:58.407 INFO:tasks.workunit.client.0.vm07.stdout:4/759: write d3/d11/d2b/d38/ddc/d91/dd6/ffb [490939,81522] 0 2026-03-09T19:27:58.410 INFO:tasks.workunit.client.0.vm07.stdout:0/747: creat d0/d6/d13/d1c/d61/ff7 x:0 0 0 2026-03-09T19:27:58.412 INFO:tasks.workunit.client.1.vm08.stdout:8/967: dread - de/d1d/d2e/f12a zero size 2026-03-09T19:27:58.413 INFO:tasks.workunit.client.1.vm08.stdout:8/968: write de/d91/dc8/fe4 [856188,10054] 0 2026-03-09T19:27:58.417 INFO:tasks.workunit.client.1.vm08.stdout:5/963: symlink d16/d1e/d8c/d99/dcc/l138 0 2026-03-09T19:27:58.428 INFO:tasks.workunit.client.1.vm08.stdout:4/937: fsync da/d10/d26/f87 0 2026-03-09T19:27:58.428 INFO:tasks.workunit.client.1.vm08.stdout:4/938: dwrite da/d10/d26/d27/d9b/f108 [0,4194304] 0 2026-03-09T19:27:58.434 INFO:tasks.workunit.client.0.vm07.stdout:2/836: fsync d3/dd/d16/d29/d2d/d45/dc3/f9c 0 2026-03-09T19:27:58.452 INFO:tasks.workunit.client.1.vm08.stdout:9/979: dread d0/d1b/d97/d48/d5d/f9b [0,4194304] 0 
2026-03-09T19:27:58.456 INFO:tasks.workunit.client.1.vm08.stdout:5/964: creat d16/d8e/dd5/dfa/f139 x:0 0 0 2026-03-09T19:27:58.459 INFO:tasks.workunit.client.1.vm08.stdout:5/965: dwrite d16/d8e/dd5/dfa/f139 [0,4194304] 0 2026-03-09T19:27:58.475 INFO:tasks.workunit.client.0.vm07.stdout:9/779: write d0/db/d29/d2c/f34 [2587731,59420] 0 2026-03-09T19:27:58.475 INFO:tasks.workunit.client.0.vm07.stdout:9/780: dread - d0/d6/d73/fed zero size 2026-03-09T19:27:58.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:58 vm07.local ceph-mon[48545]: Updating vm07:/etc/ceph/ceph.conf 2026-03-09T19:27:58.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:58 vm07.local ceph-mon[48545]: Updating vm08:/etc/ceph/ceph.conf 2026-03-09T19:27:58.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:58 vm07.local ceph-mon[48545]: pgmap v8: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 27 MiB/s rd, 57 MiB/s wr, 166 op/s 2026-03-09T19:27:58.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:58 vm07.local ceph-mon[48545]: from='client.14738 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:27:58.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:58 vm07.local ceph-mon[48545]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:27:58.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:58 vm07.local ceph-mon[48545]: from='client.? 
192.168.123.107:0/4186052139' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:27:58.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:58 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:58.485 INFO:tasks.workunit.client.1.vm08.stdout:0/995: link dd/d22/d27/d4f/fd7 dd/d22/d27/d11e/db3/f148 0 2026-03-09T19:27:58.486 INFO:tasks.workunit.client.1.vm08.stdout:0/996: write dd/d22/d27/d11e/de3/f12e [318268,42320] 0 2026-03-09T19:27:58.490 INFO:tasks.workunit.client.1.vm08.stdout:8/969: write de/d1d/d2e/f56 [173930,115339] 0 2026-03-09T19:27:58.493 INFO:tasks.workunit.client.1.vm08.stdout:4/939: dwrite da/d10/d16/d28/d2f/f80 [0,4194304] 0 2026-03-09T19:27:58.502 INFO:tasks.workunit.client.1.vm08.stdout:4/940: dread da/d10/d26/d27/f35 [0,4194304] 0 2026-03-09T19:27:58.514 INFO:tasks.workunit.client.1.vm08.stdout:2/895: rename d3/d4/d23/d2c/d39/d5e/de/d18/d99/f125 to d3/d4/d23/d2c/f12f 0 2026-03-09T19:27:58.515 INFO:tasks.workunit.client.1.vm08.stdout:2/896: write d3/d9/d79/d46/d8c/f90 [5164555,6000] 0 2026-03-09T19:27:58.526 INFO:tasks.workunit.client.1.vm08.stdout:6/995: creat d3/db/d43/d69/f170 x:0 0 0 2026-03-09T19:27:58.530 INFO:tasks.workunit.client.1.vm08.stdout:0/997: creat dd/d31/dca/f149 x:0 0 0 2026-03-09T19:27:58.532 INFO:tasks.workunit.client.1.vm08.stdout:8/970: symlink de/d1d/d2e/l14d 0 2026-03-09T19:27:58.542 INFO:tasks.workunit.client.1.vm08.stdout:2/897: unlink f2 0 2026-03-09T19:27:58.548 INFO:tasks.workunit.client.1.vm08.stdout:9/980: write d0/d2/d14/f56 [2371847,69023] 0 2026-03-09T19:27:58.553 INFO:tasks.workunit.client.1.vm08.stdout:0/998: write dd/d22/d27/d6c/fb9 [837491,35744] 0 2026-03-09T19:27:58.557 INFO:tasks.workunit.client.1.vm08.stdout:5/966: rename d16/lc3 to d16/d1e/d9f/d12b/l13a 0 2026-03-09T19:27:58.568 INFO:tasks.workunit.client.1.vm08.stdout:6/996: dwrite d3/d34/f13e [0,4194304] 0 2026-03-09T19:27:58.579 
INFO:tasks.workunit.client.1.vm08.stdout:9/981: creat d0/d2/d14/d98/f13c x:0 0 0 2026-03-09T19:27:58.585 INFO:tasks.workunit.client.1.vm08.stdout:6/997: dread d3/d34/d6f/f41 [0,4194304] 0 2026-03-09T19:27:58.590 INFO:tasks.workunit.client.1.vm08.stdout:0/999: dread dd/d22/d27/d2e/db0/fb2 [0,4194304] 0 2026-03-09T19:27:58.594 INFO:tasks.workunit.client.1.vm08.stdout:6/998: dread d3/d15/f64 [0,4194304] 0 2026-03-09T19:27:58.605 INFO:tasks.workunit.client.1.vm08.stdout:5/967: symlink d16/d45/daf/df5/d6f/l13b 0 2026-03-09T19:27:58.605 INFO:tasks.workunit.client.0.vm07.stdout:5/750: dwrite d3/dd/d26/d3f/fc4 [0,4194304] 0 2026-03-09T19:27:58.605 INFO:tasks.workunit.client.1.vm08.stdout:5/968: write d16/d1e/dc9/d10c/d112/f127 [982730,121225] 0 2026-03-09T19:27:58.607 INFO:tasks.workunit.client.1.vm08.stdout:5/969: chown d16/d45/daf/df5/d6f/f11b 55867 1 2026-03-09T19:27:58.617 INFO:tasks.workunit.client.0.vm07.stdout:6/722: stat d0/d4e/d7f/dbe 0 2026-03-09T19:27:58.619 INFO:tasks.workunit.client.0.vm07.stdout:1/779: mkdir d1/d11/d37/d5d/dc1/d107 0 2026-03-09T19:27:58.621 INFO:tasks.workunit.client.0.vm07.stdout:3/840: rename d1/d74/c27 to d1/d6/dd/dbf/ddc/c110 0 2026-03-09T19:27:58.625 INFO:tasks.workunit.client.0.vm07.stdout:4/760: unlink d3/d11/c15 0 2026-03-09T19:27:58.627 INFO:tasks.workunit.client.0.vm07.stdout:0/748: mkdir d0/d6/d13/d17/d19/d58/dd9/df8 0 2026-03-09T19:27:58.628 INFO:tasks.workunit.client.1.vm08.stdout:4/941: rename da/d10/d16/d28/d2f/de9/fbb to da/d10/d16/d28/d2f/d4f/d103/ded/f120 0 2026-03-09T19:27:58.628 INFO:tasks.workunit.client.1.vm08.stdout:5/970: chown d16/d1e/d3b/d61/le7 30 1 2026-03-09T19:27:58.628 INFO:tasks.workunit.client.0.vm07.stdout:0/749: readlink d0/d6/d13/d1c/d11/l2c 0 2026-03-09T19:27:58.631 INFO:tasks.workunit.client.1.vm08.stdout:6/999: chown d3/d34/d5c/da2/dd6/l134 598823422 1 2026-03-09T19:27:58.635 INFO:tasks.workunit.client.0.vm07.stdout:4/761: dwrite d3/d11/d2b/d38/ddc/fb4 [0,4194304] 0 2026-03-09T19:27:58.635 
INFO:tasks.workunit.client.0.vm07.stdout:0/750: readlink d0/d6/d13/d1c/d61/d69/lde 0 2026-03-09T19:27:58.637 INFO:tasks.workunit.client.0.vm07.stdout:9/781: stat d0/db/d29/d68/d99/lb1 0 2026-03-09T19:27:58.643 INFO:tasks.workunit.client.1.vm08.stdout:8/971: rename de/d47/dfd/d99/ccd to de/d1d/d4f/d141/c14e 0 2026-03-09T19:27:58.645 INFO:tasks.workunit.client.1.vm08.stdout:9/982: dwrite d0/d1b/d97/d48/d6f/f79 [0,4194304] 0 2026-03-09T19:27:58.648 INFO:tasks.workunit.client.1.vm08.stdout:2/898: link d3/d4/d23/d2c/dc1/l116 d3/d4/d3e/l130 0 2026-03-09T19:27:58.652 INFO:tasks.workunit.client.1.vm08.stdout:8/972: dread de/d25/d33/f41 [0,4194304] 0 2026-03-09T19:27:58.657 INFO:tasks.workunit.client.1.vm08.stdout:9/983: rmdir d0/d1b/de9/d12a/da2/da8 39 2026-03-09T19:27:58.660 INFO:tasks.workunit.client.0.vm07.stdout:5/751: mkdir d3/d1a/d28/d40/d92/d89/ddc/df2 0 2026-03-09T19:27:58.661 INFO:tasks.workunit.client.1.vm08.stdout:2/899: truncate d3/d4/d23/d2c/d39/d5e/de/d18/fad 3938767 0 2026-03-09T19:27:58.661 INFO:tasks.workunit.client.0.vm07.stdout:8/786: mkdir d7/d9/d10/dd8/d10b/d11a 0 2026-03-09T19:27:58.661 INFO:tasks.workunit.client.1.vm08.stdout:2/900: chown d3/d9/d26/l37 0 1 2026-03-09T19:27:58.662 INFO:tasks.workunit.client.0.vm07.stdout:7/733: mkdir d0/d4/d5/d26/dfb 0 2026-03-09T19:27:58.663 INFO:tasks.workunit.client.1.vm08.stdout:9/984: dwrite d0/d2/d14/d5c/fd [0,4194304] 0 2026-03-09T19:27:58.663 INFO:tasks.workunit.client.0.vm07.stdout:3/841: truncate d1/d1f/f13 4834015 0 2026-03-09T19:27:58.664 INFO:tasks.workunit.client.1.vm08.stdout:8/973: stat de/d25/d31/d82/l70 0 2026-03-09T19:27:58.670 INFO:tasks.workunit.client.1.vm08.stdout:8/974: mknod de/d25/d87/c14f 0 2026-03-09T19:27:58.673 INFO:tasks.workunit.client.1.vm08.stdout:8/975: dread - de/d47/fe8 zero size 2026-03-09T19:27:58.673 INFO:tasks.workunit.client.1.vm08.stdout:9/985: creat d0/d1b/d97/d48/d5d/ddf/f13d x:0 0 0 2026-03-09T19:27:58.682 INFO:tasks.workunit.client.1.vm08.stdout:8/976: dwrite de/d7c/f144 
[0,4194304] 0 2026-03-09T19:27:58.688 INFO:tasks.workunit.client.1.vm08.stdout:5/971: rename d16/d1e/dc9/d10c/f98 to d16/d1e/d8c/d99/da8/f13c 0 2026-03-09T19:27:58.688 INFO:tasks.workunit.client.1.vm08.stdout:5/972: chown d16/d1e/d3b/d61 2 1 2026-03-09T19:27:58.689 INFO:tasks.workunit.client.1.vm08.stdout:5/973: chown d16/d45/daf/d126 26706 1 2026-03-09T19:27:58.693 INFO:tasks.workunit.client.1.vm08.stdout:9/986: dread - d0/d2/d14/d5c/f118 zero size 2026-03-09T19:27:58.694 INFO:tasks.workunit.client.0.vm07.stdout:9/782: symlink d0/d6/d57/l111 0 2026-03-09T19:27:58.695 INFO:tasks.workunit.client.0.vm07.stdout:9/783: chown d0/d6f/d86 13242596 1 2026-03-09T19:27:58.703 INFO:tasks.workunit.client.0.vm07.stdout:5/752: symlink d3/dd/d26/d2d/d60/d83/lf3 0 2026-03-09T19:27:58.704 INFO:tasks.workunit.client.0.vm07.stdout:5/753: chown d3/d1a/d28/d40/l55 56601 1 2026-03-09T19:27:58.716 INFO:tasks.workunit.client.0.vm07.stdout:3/842: creat d1/d6/dd/dbf/f111 x:0 0 0 2026-03-09T19:27:58.717 INFO:tasks.workunit.client.0.vm07.stdout:8/787: dread d7/d50/da6/dc5/f103 [0,4194304] 0 2026-03-09T19:27:58.721 INFO:tasks.workunit.client.1.vm08.stdout:2/901: dread d3/d9/d79/d46/fcb [0,4194304] 0 2026-03-09T19:27:58.725 INFO:tasks.workunit.client.1.vm08.stdout:2/902: dwrite d3/d9/d26/f12d [0,4194304] 0 2026-03-09T19:27:58.728 INFO:tasks.workunit.client.0.vm07.stdout:0/751: mkdir d0/d6/d13/d17/dc3/df6/df9 0 2026-03-09T19:27:58.735 INFO:tasks.workunit.client.0.vm07.stdout:9/784: creat d0/d6/d57/f112 x:0 0 0 2026-03-09T19:27:58.736 INFO:tasks.workunit.client.0.vm07.stdout:6/723: mknod d0/dbf/d95/d31/d9e/c119 0 2026-03-09T19:27:58.741 INFO:tasks.workunit.client.0.vm07.stdout:7/734: creat d0/d4/d5/d8/d41/d64/dd5/ffc x:0 0 0 2026-03-09T19:27:58.743 INFO:tasks.workunit.client.0.vm07.stdout:7/735: chown d0/d80/db1/de5/d54/d55/lf5 1162026714 1 2026-03-09T19:27:58.744 INFO:tasks.workunit.client.0.vm07.stdout:5/754: dread f2 [4194304,4194304] 0 2026-03-09T19:27:58.746 
INFO:tasks.workunit.client.0.vm07.stdout:0/752: dread d0/d6/f43 [0,4194304] 0 2026-03-09T19:27:58.747 INFO:tasks.workunit.client.0.vm07.stdout:3/843: creat d1/d3d/d47/db3/dc2/d28/d7c/f112 x:0 0 0 2026-03-09T19:27:58.747 INFO:tasks.workunit.client.0.vm07.stdout:5/755: write d3/d1a/d28/d40/d92/d89/ddc/fe7 [115702,46854] 0 2026-03-09T19:27:58.749 INFO:tasks.workunit.client.0.vm07.stdout:9/785: dwrite d0/d6f/dc3/f109 [0,4194304] 0 2026-03-09T19:27:58.753 INFO:tasks.workunit.client.0.vm07.stdout:9/786: write d0/db/d29/d2c/f34 [1309595,102387] 0 2026-03-09T19:27:58.761 INFO:tasks.workunit.client.0.vm07.stdout:2/837: getdents d3/dd/d16/d29/d2d/d45/d8b/d98 0 2026-03-09T19:27:58.761 INFO:tasks.workunit.client.0.vm07.stdout:6/724: fdatasync d0/d1/db/d1d/f47 0 2026-03-09T19:27:58.775 INFO:tasks.workunit.client.0.vm07.stdout:1/780: write d1/d11/d37/d3f/d6e/d9c/faa [138014,8599] 0 2026-03-09T19:27:58.780 INFO:tasks.workunit.client.0.vm07.stdout:4/762: write d3/d4f/d56/d5f/fc2 [836488,38846] 0 2026-03-09T19:27:58.786 INFO:tasks.workunit.client.0.vm07.stdout:3/844: rmdir d1/d3d/d47/db3/dc2/d28 39 2026-03-09T19:27:58.788 INFO:tasks.workunit.client.0.vm07.stdout:7/736: write d0/d4/f6f [3868440,82507] 0 2026-03-09T19:27:58.792 INFO:tasks.workunit.client.0.vm07.stdout:5/756: write d3/dd/d26/d3f/d47/d56/f65 [113173,64556] 0 2026-03-09T19:27:58.795 INFO:tasks.workunit.client.0.vm07.stdout:2/838: symlink d3/dd/d16/d29/d2d/d45/dc3/l129 0 2026-03-09T19:27:58.795 INFO:tasks.workunit.client.0.vm07.stdout:8/788: dwrite d7/d1d/d83/d9f/fc6 [0,4194304] 0 2026-03-09T19:27:58.800 INFO:tasks.workunit.client.0.vm07.stdout:5/757: write d3/dd/d26/d3f/d47/d56/f65 [4253074,130745] 0 2026-03-09T19:27:58.801 INFO:tasks.workunit.client.0.vm07.stdout:1/781: mknod d1/d11/d37/d3f/d7e/dad/c108 0 2026-03-09T19:27:58.802 INFO:tasks.workunit.client.0.vm07.stdout:4/763: mkdir d3/d11/d2b/d38/d107 0 2026-03-09T19:27:58.802 INFO:tasks.workunit.client.0.vm07.stdout:2/839: dread d3/dd/d16/d30/f7e [0,4194304] 0 
2026-03-09T19:27:58.810 INFO:tasks.workunit.client.0.vm07.stdout:0/753: creat d0/d6/d13/d17/d19/ffa x:0 0 0 2026-03-09T19:27:58.813 INFO:tasks.workunit.client.1.vm08.stdout:4/942: rename da/d10/d1b/d23 to da/d10/d16/d28/d46/d121 0 2026-03-09T19:27:58.815 INFO:tasks.workunit.client.1.vm08.stdout:5/974: truncate d16/d1e/d3b/d61/f7a 2559798 0 2026-03-09T19:27:58.822 INFO:tasks.workunit.client.1.vm08.stdout:8/977: symlink de/d47/l150 0 2026-03-09T19:27:58.826 INFO:tasks.workunit.client.0.vm07.stdout:9/787: dwrite d0/db/d29/f67 [0,4194304] 0 2026-03-09T19:27:58.826 INFO:tasks.workunit.client.1.vm08.stdout:9/987: fsync d0/d2/d14/f31 0 2026-03-09T19:27:58.828 INFO:tasks.workunit.client.0.vm07.stdout:9/788: chown d0/db/d29/d4d 31577 1 2026-03-09T19:27:58.829 INFO:tasks.workunit.client.0.vm07.stdout:6/725: creat d0/dbf/d95/f11a x:0 0 0 2026-03-09T19:27:58.839 INFO:tasks.workunit.client.0.vm07.stdout:0/754: symlink d0/d6/dc8/lfb 0 2026-03-09T19:27:58.842 INFO:tasks.workunit.client.1.vm08.stdout:4/943: mkdir da/d10/d16/d28/d122 0 2026-03-09T19:27:58.842 INFO:tasks.workunit.client.0.vm07.stdout:7/737: write d0/d4/d5/d26/d32/dbd/fba [508778,17838] 0 2026-03-09T19:27:58.843 INFO:tasks.workunit.client.1.vm08.stdout:4/944: fdatasync da/d10/d16/d28/d2f/d4f/d103/d40/ff8 0 2026-03-09T19:27:58.843 INFO:tasks.workunit.client.0.vm07.stdout:6/726: dwrite d0/d1/d28/da8/ffe [0,4194304] 0 2026-03-09T19:27:58.848 INFO:tasks.workunit.client.0.vm07.stdout:2/840: fsync d3/dd/d16/f5f 0 2026-03-09T19:27:58.853 INFO:tasks.workunit.client.0.vm07.stdout:0/755: dread d0/d6/d13/d33/fe1 [0,4194304] 0 2026-03-09T19:27:58.863 INFO:tasks.workunit.client.0.vm07.stdout:8/789: dwrite d7/d9/d10/dd8/dfd/d62/f64 [0,4194304] 0 2026-03-09T19:27:58.868 INFO:tasks.workunit.client.1.vm08.stdout:9/988: fdatasync d0/d1b/de9/d12a/da2/da8/de8/f101 0 2026-03-09T19:27:58.869 INFO:tasks.workunit.client.1.vm08.stdout:9/989: write d0/d1b/de9/d12a/da2/da8/de8/f8e [1358031,76596] 0 2026-03-09T19:27:58.870 
INFO:tasks.workunit.client.0.vm07.stdout:3/845: creat d1/d3d/d47/f113 x:0 0 0 2026-03-09T19:27:58.870 INFO:tasks.workunit.client.0.vm07.stdout:5/758: write d3/dd/d26/d3f/f66 [1098447,50167] 0 2026-03-09T19:27:58.870 INFO:tasks.workunit.client.0.vm07.stdout:8/790: chown d7/d9/d10/d44/c49 18721717 1 2026-03-09T19:27:58.876 INFO:tasks.workunit.client.0.vm07.stdout:9/789: mknod d0/d6f/dc3/c113 0 2026-03-09T19:27:58.878 INFO:tasks.workunit.client.0.vm07.stdout:9/790: readlink d0/db/d29/d32/d5c/d80/dad/lcf 0 2026-03-09T19:27:58.884 INFO:tasks.workunit.client.1.vm08.stdout:8/978: dwrite de/d91/f108 [0,4194304] 0 2026-03-09T19:27:58.895 INFO:tasks.workunit.client.1.vm08.stdout:9/990: creat d0/d1b/d97/d48/d5d/d74/f13e x:0 0 0 2026-03-09T19:27:58.895 INFO:tasks.workunit.client.1.vm08.stdout:2/903: getdents d3/d4/d3e/d4e/d88/db0 0 2026-03-09T19:27:58.895 INFO:tasks.workunit.client.1.vm08.stdout:2/904: chown d3/d4/d23/fc0 2 1 2026-03-09T19:27:58.895 INFO:tasks.workunit.client.1.vm08.stdout:2/905: readlink d3/d4/d23/d2c/d39/d5e/de/d18/d1f/ld6 0 2026-03-09T19:27:58.895 INFO:tasks.workunit.client.1.vm08.stdout:2/906: chown d3/d9/d26/ded 0 1 2026-03-09T19:27:58.899 INFO:tasks.workunit.client.0.vm07.stdout:6/727: creat d0/d1/db/d17/dc4/d7b/d7d/f11b x:0 0 0 2026-03-09T19:27:58.899 INFO:tasks.workunit.client.0.vm07.stdout:2/841: creat d3/dd/d16/d29/d2d/d45/dc3/f12a x:0 0 0 2026-03-09T19:27:58.900 INFO:tasks.workunit.client.1.vm08.stdout:9/991: creat d0/d2/d14/d98/f13f x:0 0 0 2026-03-09T19:27:58.904 INFO:tasks.workunit.client.1.vm08.stdout:2/907: mkdir d3/d9/d4a/d131 0 2026-03-09T19:27:58.910 INFO:tasks.workunit.client.1.vm08.stdout:4/945: dread da/d10/d16/d28/d2f/d4f/d103/d40/f41 [0,4194304] 0 2026-03-09T19:27:58.911 INFO:tasks.workunit.client.1.vm08.stdout:8/979: mknod de/d47/dfd/d99/da0/d10d/c151 0 2026-03-09T19:27:58.912 INFO:tasks.workunit.client.0.vm07.stdout:8/791: rmdir d7/d9/d37/d45 39 2026-03-09T19:27:58.915 INFO:tasks.workunit.client.0.vm07.stdout:3/846: rename d1/d89/fc7 
to d1/d6/d45/d54/dd1/f114 0 2026-03-09T19:27:58.916 INFO:tasks.workunit.client.1.vm08.stdout:9/992: creat d0/d2/d14/d5c/f140 x:0 0 0 2026-03-09T19:27:58.917 INFO:tasks.workunit.client.1.vm08.stdout:5/975: truncate d16/d1e/d6e/f72 6950483 0 2026-03-09T19:27:58.919 INFO:tasks.workunit.client.0.vm07.stdout:9/791: truncate d0/d6/d57/d8f/f9f 698803 0 2026-03-09T19:27:58.925 INFO:tasks.workunit.client.1.vm08.stdout:4/946: stat da/d10/d16/d28/d2f/d4f/d103/ded/f120 0 2026-03-09T19:27:58.929 INFO:tasks.workunit.client.1.vm08.stdout:5/976: dread d16/d1e/d8c/f101 [0,4194304] 0 2026-03-09T19:27:58.934 INFO:tasks.workunit.client.1.vm08.stdout:5/977: symlink d16/d1e/dc9/l13d 0 2026-03-09T19:27:58.936 INFO:tasks.workunit.client.0.vm07.stdout:4/764: link d3/d4f/f7c d3/d11/d29/f108 0 2026-03-09T19:27:58.936 INFO:tasks.workunit.client.1.vm08.stdout:4/947: unlink da/d10/d16/d28/d46/d52/cd7 0 2026-03-09T19:27:58.936 INFO:tasks.workunit.client.0.vm07.stdout:3/847: read d1/d1f/fd0 [1545306,32696] 0 2026-03-09T19:27:58.937 INFO:tasks.workunit.client.1.vm08.stdout:5/978: symlink d16/d1e/d6e/dcd/l13e 0 2026-03-09T19:27:58.938 INFO:tasks.workunit.client.0.vm07.stdout:1/782: link d1/d11/d37/d5d/d50/fe5 d1/d11/d37/d3f/f109 0 2026-03-09T19:27:58.942 INFO:tasks.workunit.client.0.vm07.stdout:3/848: dread - d1/d3d/d47/f9e zero size 2026-03-09T19:27:58.945 INFO:tasks.workunit.client.0.vm07.stdout:4/765: symlink d3/d11/d29/d101/d99/de7/l109 0 2026-03-09T19:27:58.945 INFO:tasks.workunit.client.0.vm07.stdout:2/842: sync 2026-03-09T19:27:58.945 INFO:tasks.workunit.client.0.vm07.stdout:1/783: mknod d1/c10a 0 2026-03-09T19:27:58.947 INFO:tasks.workunit.client.0.vm07.stdout:6/728: getdents d0/d1/d28/da9 0 2026-03-09T19:27:58.951 INFO:tasks.workunit.client.0.vm07.stdout:4/766: dread d3/d11/d2b/d38/fdf [0,4194304] 0 2026-03-09T19:27:58.953 INFO:tasks.workunit.client.0.vm07.stdout:7/738: write d0/d80/db1/de5/d54/f9e [405929,117991] 0 2026-03-09T19:27:58.953 INFO:tasks.workunit.client.0.vm07.stdout:6/729: 
write d0/d1/db/d52/d94/fff [772257,46390] 0 2026-03-09T19:27:58.957 INFO:tasks.workunit.client.0.vm07.stdout:1/784: rename d1/db/l67 to d1/db/d31/d4f/d7a/dd2/l10b 0 2026-03-09T19:27:58.958 INFO:tasks.workunit.client.0.vm07.stdout:0/756: write d0/d6/d13/f31 [578907,53384] 0 2026-03-09T19:27:58.958 INFO:tasks.workunit.client.0.vm07.stdout:9/792: dread d0/d6/f48 [0,4194304] 0 2026-03-09T19:27:58.961 INFO:tasks.workunit.client.0.vm07.stdout:8/792: sync 2026-03-09T19:27:58.963 INFO:tasks.workunit.client.0.vm07.stdout:1/785: rmdir d1/d3e/db3/d9a 39 2026-03-09T19:27:58.969 INFO:tasks.workunit.client.0.vm07.stdout:9/793: mknod d0/d6/d57/d5d/c114 0 2026-03-09T19:27:58.969 INFO:tasks.workunit.client.0.vm07.stdout:6/730: link d0/d1/db/d1d/d77/cab d0/d1/db/d17/dc4/dc7/c11c 0 2026-03-09T19:27:58.969 INFO:tasks.workunit.client.0.vm07.stdout:4/767: rename d3/d11/d29/c54 to d3/d11/c10a 0 2026-03-09T19:27:58.969 INFO:tasks.workunit.client.0.vm07.stdout:0/757: creat d0/d6/d13/d17/d19/ffc x:0 0 0 2026-03-09T19:27:58.971 INFO:tasks.workunit.client.0.vm07.stdout:2/843: sync 2026-03-09T19:27:58.974 INFO:tasks.workunit.client.1.vm08.stdout:2/908: sync 2026-03-09T19:27:58.975 INFO:tasks.workunit.client.1.vm08.stdout:4/948: sync 2026-03-09T19:27:58.976 INFO:tasks.workunit.client.0.vm07.stdout:4/768: creat d3/d11/d2b/d38/ddc/db2/f10b x:0 0 0 2026-03-09T19:27:58.982 INFO:tasks.workunit.client.0.vm07.stdout:5/759: write d3/dd/d26/d3f/d47/d71/d76/fdf [2625944,94712] 0 2026-03-09T19:27:58.982 INFO:tasks.workunit.client.1.vm08.stdout:9/993: dwrite d0/d2/d14/f4d [0,4194304] 0 2026-03-09T19:27:58.983 INFO:tasks.workunit.client.1.vm08.stdout:8/980: dwrite de/d117/f126 [0,4194304] 0 2026-03-09T19:27:58.985 INFO:tasks.workunit.client.0.vm07.stdout:5/760: write d3/d1a/d28/d40/f46 [3296580,26503] 0 2026-03-09T19:27:58.996 INFO:tasks.workunit.client.0.vm07.stdout:7/739: link d0/d4/d5/d26/dc9/ff1 d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/ffd 0 2026-03-09T19:27:58.997 
INFO:tasks.workunit.client.1.vm08.stdout:2/909: mknod d3/d9/d4a/d9a/c132 0 2026-03-09T19:27:59.003 INFO:tasks.workunit.client.1.vm08.stdout:5/979: dwrite d16/d8e/ff8 [0,4194304] 0 2026-03-09T19:27:59.008 INFO:tasks.workunit.client.0.vm07.stdout:1/786: creat d1/f10c x:0 0 0 2026-03-09T19:27:59.015 INFO:tasks.workunit.client.0.vm07.stdout:0/758: rename d0/d6/d13/d17/dc3/le6 to d0/d6/d13/d17/d19/d57/d6a/lfd 0 2026-03-09T19:27:59.017 INFO:tasks.workunit.client.0.vm07.stdout:2/844: mknod d3/d11/d38/d111/c12b 0 2026-03-09T19:27:59.018 INFO:tasks.workunit.client.1.vm08.stdout:8/981: fsync de/d1d/d4f/fd9 0 2026-03-09T19:27:59.019 INFO:tasks.workunit.client.0.vm07.stdout:2/845: chown d3/dd/d103/fef 71 1 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.0.vm07.stdout:3/849: dwrite d1/d6/dd/f8a [4194304,4194304] 0 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.0.vm07.stdout:4/769: rmdir d3/d4f/d56 39 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.0.vm07.stdout:5/761: creat d3/d1a/d28/d40/ff4 x:0 0 0 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.1.vm08.stdout:9/994: truncate d0/d2/d14/d98/dbb/fe4 527453 0 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.1.vm08.stdout:9/995: stat d0/d1b/d97/c6b 0 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.1.vm08.stdout:2/910: creat d3/d4/d23/d2c/d39/da3/f133 x:0 0 0 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.1.vm08.stdout:4/949: creat da/d10/d16/d28/d2f/d4f/d56/dd0/dc0/f123 x:0 0 0 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.1.vm08.stdout:8/982: rename de/d47/l150 to de/d1d/d21/d73/l152 0 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.1.vm08.stdout:9/996: rmdir d0/d2/d14/d98/d99 39 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.1.vm08.stdout:2/911: truncate d3/d4/d23/d2c/f5b 1600723 0 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.1.vm08.stdout:5/980: mknod d16/d1e/c13f 0 2026-03-09T19:27:59.033 INFO:tasks.workunit.client.0.vm07.stdout:8/793: getdents d7/d1d 0 2026-03-09T19:27:59.034 
INFO:tasks.workunit.client.1.vm08.stdout:2/912: dread - d3/d4/d23/d2c/d39/d5e/de/d8b/f7e zero size 2026-03-09T19:27:59.039 INFO:tasks.workunit.client.0.vm07.stdout:3/850: sync 2026-03-09T19:27:59.040 INFO:tasks.workunit.client.0.vm07.stdout:1/787: creat d1/d3/d21/f10d x:0 0 0 2026-03-09T19:27:59.041 INFO:tasks.workunit.client.0.vm07.stdout:1/788: chown d1/d91/l95 939860 1 2026-03-09T19:27:59.042 INFO:tasks.workunit.client.0.vm07.stdout:9/794: dwrite d0/db/d29/d2c/d36/d5a/fb2 [0,4194304] 0 2026-03-09T19:27:59.043 INFO:tasks.workunit.client.0.vm07.stdout:6/731: dwrite d0/d1/db/d17/dc4/d7b/d7d/fa7 [0,4194304] 0 2026-03-09T19:27:59.049 INFO:tasks.workunit.client.1.vm08.stdout:8/983: creat de/d25/d87/dc9/dd8/f153 x:0 0 0 2026-03-09T19:27:59.061 INFO:tasks.workunit.client.1.vm08.stdout:2/913: mkdir d3/d4/d23/d2c/d39/d5e/de/d18/da9/d110/d134 0 2026-03-09T19:27:59.061 INFO:tasks.workunit.client.0.vm07.stdout:0/759: dread d0/d6/fa4 [0,4194304] 0 2026-03-09T19:27:59.061 INFO:tasks.workunit.client.0.vm07.stdout:9/795: fdatasync d0/db/d29/d32/d5c/d80/ddf/f106 0 2026-03-09T19:27:59.061 INFO:tasks.workunit.client.0.vm07.stdout:9/796: read - d0/d6f/d86/fd1 zero size 2026-03-09T19:27:59.061 INFO:tasks.workunit.client.0.vm07.stdout:4/770: read - d3/d4f/d56/d5f/f7b zero size 2026-03-09T19:27:59.064 INFO:tasks.workunit.client.0.vm07.stdout:7/740: truncate d0/d4/d5/d26/f75 2553355 0 2026-03-09T19:27:59.064 INFO:tasks.workunit.client.1.vm08.stdout:2/914: truncate d3/f7 2215523 0 2026-03-09T19:27:59.065 INFO:tasks.workunit.client.0.vm07.stdout:6/732: mknod d0/d4e/d7f/dbe/c11d 0 2026-03-09T19:27:59.066 INFO:tasks.workunit.client.0.vm07.stdout:6/733: write d0/dbf/d95/f74 [572192,24519] 0 2026-03-09T19:27:59.071 INFO:tasks.workunit.client.0.vm07.stdout:5/762: fdatasync d3/d1a/f10 0 2026-03-09T19:27:59.077 INFO:tasks.workunit.client.0.vm07.stdout:0/760: fsync d0/d6/d13/d1c/d61/d69/f9c 0 2026-03-09T19:27:59.077 INFO:tasks.workunit.client.0.vm07.stdout:5/763: chown d3/d1a/d28/d36/l38 29 1 
2026-03-09T19:27:59.077 INFO:tasks.workunit.client.0.vm07.stdout:4/771: creat d3/d11/d51/f10c x:0 0 0 2026-03-09T19:27:59.077 INFO:tasks.workunit.client.0.vm07.stdout:6/734: mkdir d0/d1/db/d17/dc4/d7b/da0/d11e 0 2026-03-09T19:27:59.077 INFO:tasks.workunit.client.0.vm07.stdout:5/764: chown d3/dd/l45 0 1 2026-03-09T19:27:59.077 INFO:tasks.workunit.client.0.vm07.stdout:0/761: rename d0/d6/d13/d1c/lee to d0/d6/dc8/d99/ddc/lfe 0 2026-03-09T19:27:59.080 INFO:tasks.workunit.client.0.vm07.stdout:3/851: creat d1/f115 x:0 0 0 2026-03-09T19:27:59.082 INFO:tasks.workunit.client.0.vm07.stdout:8/794: dread d7/d30/fb7 [0,4194304] 0 2026-03-09T19:27:59.083 INFO:tasks.workunit.client.0.vm07.stdout:6/735: mkdir d0/d1/db/d1d/d77/d11f 0 2026-03-09T19:27:59.086 INFO:tasks.workunit.client.1.vm08.stdout:2/915: dread d3/d9/f5d [0,4194304] 0 2026-03-09T19:27:59.100 INFO:tasks.workunit.client.1.vm08.stdout:2/916: fsync d3/d4/d23/d2c/d39/d5e/d14/f58 0 2026-03-09T19:27:59.100 INFO:tasks.workunit.client.1.vm08.stdout:2/917: stat d3/d4/d23/d2c/d39/d5e/de/d18/d1f/lc7 0 2026-03-09T19:27:59.100 INFO:tasks.workunit.client.1.vm08.stdout:2/918: dread d3/d4/f55 [0,4194304] 0 2026-03-09T19:27:59.100 INFO:tasks.workunit.client.1.vm08.stdout:2/919: mkdir d3/d9/d26/ded/d135 0 2026-03-09T19:27:59.100 INFO:tasks.workunit.client.0.vm07.stdout:3/852: dread d1/d6/dd/f67 [0,4194304] 0 2026-03-09T19:27:59.101 INFO:tasks.workunit.client.0.vm07.stdout:1/789: link d1/d11/d37/d3f/d7e/dad/c108 d1/db/d31/dca/c10e 0 2026-03-09T19:27:59.101 INFO:tasks.workunit.client.0.vm07.stdout:3/853: dread - d1/d3d/d47/f113 zero size 2026-03-09T19:27:59.101 INFO:tasks.workunit.client.0.vm07.stdout:6/736: creat d0/d13/f120 x:0 0 0 2026-03-09T19:27:59.101 INFO:tasks.workunit.client.0.vm07.stdout:0/762: rename d0/d6/d13/d1c/d11/l28 to d0/d6/d13/d1c/d52/d81/lff 0 2026-03-09T19:27:59.101 INFO:tasks.workunit.client.0.vm07.stdout:5/765: creat d3/dd/d26/ff5 x:0 0 0 2026-03-09T19:27:59.101 INFO:tasks.workunit.client.0.vm07.stdout:6/737: 
mkdir d0/d4e/dae/daf/d121 0 2026-03-09T19:27:59.101 INFO:tasks.workunit.client.0.vm07.stdout:5/766: chown d3/d1a/d5a 29212420 1 2026-03-09T19:27:59.101 INFO:tasks.workunit.client.0.vm07.stdout:0/763: fsync d0/d6/d13/d17/dc3/fcc 0 2026-03-09T19:27:59.101 INFO:tasks.workunit.client.0.vm07.stdout:5/767: stat d3/dd/d95/fc7 0 2026-03-09T19:27:59.102 INFO:tasks.workunit.client.0.vm07.stdout:5/768: chown d3/d1a 2532284 1 2026-03-09T19:27:59.111 INFO:tasks.workunit.client.0.vm07.stdout:0/764: dread - d0/d6/d13/d1c/fd7 zero size 2026-03-09T19:27:59.114 INFO:tasks.workunit.client.0.vm07.stdout:6/738: truncate d0/d4e/d75/f100 229850 0 2026-03-09T19:27:59.115 INFO:tasks.workunit.client.1.vm08.stdout:2/920: dread d3/d4/d23/d2c/f31 [0,4194304] 0 2026-03-09T19:27:59.116 INFO:tasks.workunit.client.0.vm07.stdout:3/854: truncate d1/d3d/d47/db3/dc2/f1e 1231313 0 2026-03-09T19:27:59.122 INFO:tasks.workunit.client.1.vm08.stdout:2/921: truncate d3/d4/d23/d2c/d39/d5e/de/f17 76022 0 2026-03-09T19:27:59.123 INFO:tasks.workunit.client.1.vm08.stdout:2/922: dread - d3/d4/d23/d2c/fdb zero size 2026-03-09T19:27:59.123 INFO:tasks.workunit.client.1.vm08.stdout:2/923: chown d3/d4/d23/d2c/fe6 11883 1 2026-03-09T19:27:59.126 INFO:tasks.workunit.client.0.vm07.stdout:5/769: truncate d3/dd/d26/d2d/d60/fc5 6046154 0 2026-03-09T19:27:59.129 INFO:tasks.workunit.client.1.vm08.stdout:4/950: write da/d10/d26/d27/fac [1802887,111648] 0 2026-03-09T19:27:59.131 INFO:tasks.workunit.client.1.vm08.stdout:9/997: write d0/d1b/d68/d7f/fe3 [3405630,101007] 0 2026-03-09T19:27:59.135 INFO:tasks.workunit.client.1.vm08.stdout:4/951: unlink da/d10/d16/d28/d2f/d4f/d64/d81/c97 0 2026-03-09T19:27:59.136 INFO:tasks.workunit.client.0.vm07.stdout:6/739: mkdir d0/d44/d122 0 2026-03-09T19:27:59.138 INFO:tasks.workunit.client.0.vm07.stdout:2/846: write d3/dd/d16/d29/d2d/d45/d85/d8a/fce [981476,68785] 0 2026-03-09T19:27:59.139 INFO:tasks.workunit.client.0.vm07.stdout:1/790: link d1/d3/d21/f2e d1/d11/d37/d3f/d45/d87/d88/df4/f10f 0 
2026-03-09T19:27:59.140 INFO:tasks.workunit.client.1.vm08.stdout:8/984: write de/d91/fd6 [5327868,26866] 0 2026-03-09T19:27:59.142 INFO:tasks.workunit.client.0.vm07.stdout:6/740: sync 2026-03-09T19:27:59.147 INFO:tasks.workunit.client.1.vm08.stdout:5/981: dwrite d16/d1e/d8c/d99/da8/f13c [4194304,4194304] 0 2026-03-09T19:27:59.147 INFO:tasks.workunit.client.0.vm07.stdout:2/847: dwrite d3/dd/fe [0,4194304] 0 2026-03-09T19:27:59.153 INFO:tasks.workunit.client.0.vm07.stdout:2/848: readlink d3/dd/d16/d29/d2d/d45/d3b/l79 0 2026-03-09T19:27:59.157 INFO:tasks.workunit.client.0.vm07.stdout:2/849: fdatasync d3/dd/d16/d29/d2d/d45/d8b/d98/dee/f10e 0 2026-03-09T19:27:59.159 INFO:tasks.workunit.client.0.vm07.stdout:2/850: chown d3/dd/d16/d29/d2d/d45/dc3/l129 139236 1 2026-03-09T19:27:59.170 INFO:tasks.workunit.client.0.vm07.stdout:6/741: rmdir d0/d1/d28/d76/dad 39 2026-03-09T19:27:59.174 INFO:tasks.workunit.client.0.vm07.stdout:9/797: write d0/db/fe9 [466886,1048] 0 2026-03-09T19:27:59.179 INFO:tasks.workunit.client.1.vm08.stdout:4/952: getdents da 0 2026-03-09T19:27:59.179 INFO:tasks.workunit.client.0.vm07.stdout:7/741: write d0/d4/d5/f36 [775875,79967] 0 2026-03-09T19:27:59.180 INFO:tasks.workunit.client.0.vm07.stdout:9/798: creat d0/d6f/f115 x:0 0 0 2026-03-09T19:27:59.181 INFO:tasks.workunit.client.0.vm07.stdout:4/772: dwrite d3/d4f/d56/d5f/f7b [0,4194304] 0 2026-03-09T19:27:59.181 INFO:tasks.workunit.client.0.vm07.stdout:9/799: chown d0/d6/d57/d5d/l7a 21521402 1 2026-03-09T19:27:59.184 INFO:tasks.workunit.client.0.vm07.stdout:0/765: rmdir d0/d6/d13/d1c/d52 39 2026-03-09T19:27:59.187 INFO:tasks.workunit.client.0.vm07.stdout:2/851: getdents d3/d11/d38/d111/d113 0 2026-03-09T19:27:59.187 INFO:tasks.workunit.client.1.vm08.stdout:4/953: truncate da/d10/d26/d3a/f119 1177260 0 2026-03-09T19:27:59.187 INFO:tasks.workunit.client.1.vm08.stdout:4/954: chown da/d10/d16/d28/fde 7144 1 2026-03-09T19:27:59.187 INFO:tasks.workunit.client.0.vm07.stdout:0/766: chown d0/d6/c55 204372452 1 
2026-03-09T19:27:59.187 INFO:tasks.workunit.client.0.vm07.stdout:2/852: stat d3/dd/d16/d29/d2d/d45/df6/d11e 0 2026-03-09T19:27:59.197 INFO:tasks.workunit.client.1.vm08.stdout:5/982: sync 2026-03-09T19:27:59.197 INFO:tasks.workunit.client.1.vm08.stdout:4/955: unlink da/d10/d26/d27/d9b/f108 0 2026-03-09T19:27:59.198 INFO:tasks.workunit.client.1.vm08.stdout:5/983: chown d16/d45/daf/df5/d6f/l13b 2646841 1 2026-03-09T19:27:59.208 INFO:tasks.workunit.client.0.vm07.stdout:0/767: unlink d0/d6/fa4 0 2026-03-09T19:27:59.210 INFO:tasks.workunit.client.0.vm07.stdout:0/768: stat d0/d6/d13/d1c/d61/d69/lde 0 2026-03-09T19:27:59.210 INFO:tasks.workunit.client.0.vm07.stdout:8/795: dwrite d7/d9/d10/dd8/dfd/d67/f7b [0,4194304] 0 2026-03-09T19:27:59.212 INFO:tasks.workunit.client.1.vm08.stdout:4/956: mkdir da/d10/d16/d28/d2f/d4f/d103/d73/d124 0 2026-03-09T19:27:59.217 INFO:tasks.workunit.client.0.vm07.stdout:4/773: fsync d3/d11/d29/d34/fa5 0 2026-03-09T19:27:59.221 INFO:tasks.workunit.client.1.vm08.stdout:2/924: dwrite d3/d9/d4a/d9a/fc8 [0,4194304] 0 2026-03-09T19:27:59.222 INFO:tasks.workunit.client.0.vm07.stdout:2/853: rename d3/dd/d16/d29/d2d/d45/d3b/d44/d96/ff0 to d3/d11/d38/d111/d113/f12c 0 2026-03-09T19:27:59.224 INFO:tasks.workunit.client.1.vm08.stdout:2/925: write d3/d4/d23/d2c/d39/d5e/d14/f2b [2246158,75103] 0 2026-03-09T19:27:59.225 INFO:tasks.workunit.client.1.vm08.stdout:2/926: stat d3/d4/d23/d2c/d39/d5e/d14/f58 0 2026-03-09T19:27:59.229 INFO:tasks.workunit.client.0.vm07.stdout:3/855: dwrite d1/d6/d4c/d97/fc0 [0,4194304] 0 2026-03-09T19:27:59.233 INFO:tasks.workunit.client.1.vm08.stdout:5/984: mknod d16/d1e/d6e/dcd/c140 0 2026-03-09T19:27:59.234 INFO:tasks.workunit.client.1.vm08.stdout:9/998: write d0/d1b/d97/fca [1162230,79180] 0 2026-03-09T19:27:59.237 INFO:tasks.workunit.client.0.vm07.stdout:3/856: write d1/d6/d4c/d97/fc0 [3756648,37196] 0 2026-03-09T19:27:59.237 INFO:tasks.workunit.client.1.vm08.stdout:9/999: dread d0/d2/d14/d98/f10b [0,4194304] 0 
2026-03-09T19:27:59.239 INFO:tasks.workunit.client.0.vm07.stdout:1/791: getdents d1/d11/d37/d3f/d45/d87/d88/df4 0 2026-03-09T19:27:59.239 INFO:tasks.workunit.client.1.vm08.stdout:2/927: readlink d3/d4/d23/d2c/d39/da3/lb2 0 2026-03-09T19:27:59.245 INFO:tasks.workunit.client.0.vm07.stdout:5/770: write d3/d1a/d28/d36/f63 [48017,124520] 0 2026-03-09T19:27:59.249 INFO:tasks.workunit.client.0.vm07.stdout:6/742: write d0/d4e/d7f/dbe/fe6 [632910,64995] 0 2026-03-09T19:27:59.250 INFO:tasks.workunit.client.1.vm08.stdout:8/985: dwrite de/d7c/f95 [0,4194304] 0 2026-03-09T19:27:59.263 INFO:tasks.workunit.client.0.vm07.stdout:9/800: write d0/d6/d3a/d94/ff6 [225525,126311] 0 2026-03-09T19:27:59.263 INFO:tasks.workunit.client.0.vm07.stdout:9/801: readlink d0/lf3 0 2026-03-09T19:27:59.266 INFO:tasks.workunit.client.0.vm07.stdout:4/774: fsync d3/fd2 0 2026-03-09T19:27:59.274 INFO:tasks.workunit.client.0.vm07.stdout:3/857: rmdir d1/d3d/d47/db3/d87/d106 39 2026-03-09T19:27:59.281 INFO:tasks.workunit.client.1.vm08.stdout:2/928: symlink d3/d4/d23/l136 0 2026-03-09T19:27:59.281 INFO:tasks.workunit.client.0.vm07.stdout:3/858: dread d1/d1f/f9c [0,4194304] 0 2026-03-09T19:27:59.281 INFO:tasks.workunit.client.0.vm07.stdout:1/792: dread - d1/d11/d37/fed zero size 2026-03-09T19:27:59.281 INFO:tasks.workunit.client.0.vm07.stdout:0/769: truncate d0/d6/d13/d1c/d11/f2e 781191 0 2026-03-09T19:27:59.282 INFO:tasks.workunit.client.0.vm07.stdout:5/771: mkdir d3/d1a/d28/d6c/d72/df6 0 2026-03-09T19:27:59.284 INFO:tasks.workunit.client.1.vm08.stdout:4/957: getdents da/d10/d26/d3a/db5/ddb 0 2026-03-09T19:27:59.288 INFO:tasks.workunit.client.0.vm07.stdout:0/770: dwrite d0/d6/d13/f31 [0,4194304] 0 2026-03-09T19:27:59.291 INFO:tasks.workunit.client.1.vm08.stdout:2/929: truncate d3/d4/d23/d2c/d39/d5e/de/d8b/f76 27634 0 2026-03-09T19:27:59.300 INFO:tasks.workunit.client.1.vm08.stdout:5/985: getdents d16/d1e/d6e 0 2026-03-09T19:27:59.305 INFO:tasks.workunit.client.0.vm07.stdout:6/743: dread 
d0/d1/d28/d76/dad/fe1 [0,4194304] 0 2026-03-09T19:27:59.309 INFO:tasks.workunit.client.1.vm08.stdout:4/958: unlink da/d10/d16/d28/d2f/d4f/d103/d2c/l5e 0 2026-03-09T19:27:59.309 INFO:tasks.workunit.client.1.vm08.stdout:4/959: dread da/d10/d16/d28/d2f/d4f/d103/d40/f41 [4194304,4194304] 0 2026-03-09T19:27:59.311 INFO:tasks.workunit.client.1.vm08.stdout:5/986: mknod d16/d45/d115/c141 0 2026-03-09T19:27:59.316 INFO:tasks.workunit.client.1.vm08.stdout:4/960: symlink da/d10/d16/d28/d2f/d4f/d56/dd0/dc0/l125 0 2026-03-09T19:27:59.319 INFO:tasks.workunit.client.0.vm07.stdout:3/859: creat d1/d6/d45/d54/f116 x:0 0 0 2026-03-09T19:27:59.332 INFO:tasks.workunit.client.0.vm07.stdout:2/854: write d3/dd/d16/d29/d3c/d5a/fbe [521764,101913] 0 2026-03-09T19:27:59.335 INFO:tasks.workunit.client.1.vm08.stdout:5/987: symlink d16/d1e/d3b/d61/d11e/d107/d114/l142 0 2026-03-09T19:27:59.336 INFO:tasks.workunit.client.1.vm08.stdout:8/986: dwrite de/d7c/fe7 [0,4194304] 0 2026-03-09T19:27:59.337 INFO:tasks.workunit.client.0.vm07.stdout:7/742: dwrite d0/d4/d5/d8/d41/d64/d74/d98/f18 [4194304,4194304] 0 2026-03-09T19:27:59.341 INFO:tasks.workunit.client.1.vm08.stdout:2/930: link d3/l123 d3/d4/d23/d2c/d39/l137 0 2026-03-09T19:27:59.342 INFO:tasks.workunit.client.0.vm07.stdout:5/772: truncate d3/dd/d26/d3f/d47/d71/db7/fbc 885828 0 2026-03-09T19:27:59.350 INFO:tasks.workunit.client.0.vm07.stdout:9/802: mkdir d0/d6f/dc3/df8/dfc/d116 0 2026-03-09T19:27:59.359 INFO:tasks.workunit.client.1.vm08.stdout:8/987: creat de/d1d/d21/d73/f154 x:0 0 0 2026-03-09T19:27:59.360 INFO:tasks.workunit.client.1.vm08.stdout:5/988: dread d16/d45/fdd [0,4194304] 0 2026-03-09T19:27:59.366 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:59 vm07.local ceph-mon[48545]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:27:59.366 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:59 vm07.local ceph-mon[48545]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 
2026-03-09T19:27:59.366 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:59 vm07.local ceph-mon[48545]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:27:59.366 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:59 vm07.local ceph-mon[48545]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring 2026-03-09T19:27:59.366 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:59 vm07.local ceph-mon[48545]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring 2026-03-09T19:27:59.366 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:59 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:59.366 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:59 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:59.366 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:59 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:59.366 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:27:59 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:59.369 INFO:tasks.workunit.client.1.vm08.stdout:4/961: creat da/d10/d16/d28/d2f/d4f/d103/d73/d124/f126 x:0 0 0 2026-03-09T19:27:59.374 INFO:tasks.workunit.client.1.vm08.stdout:2/931: rename d3/d4/d23/d2c/d39/d5e/de/f1c to d3/d4/d10e/f138 0 2026-03-09T19:27:59.378 INFO:tasks.workunit.client.0.vm07.stdout:2/855: truncate d3/dd/d16/d29/d2d/d45/d85/d8a/fd2 1566863 0 2026-03-09T19:27:59.381 INFO:tasks.workunit.client.0.vm07.stdout:8/796: getdents d7/d9/d37/d45/d97/dbc 0 2026-03-09T19:27:59.387 INFO:tasks.workunit.client.0.vm07.stdout:5/773: chown d3/dd/d26/d3f/d47/f62 1684 1 2026-03-09T19:27:59.388 INFO:tasks.workunit.client.0.vm07.stdout:5/774: read d3/d1a/f12 [1817549,47585] 0 2026-03-09T19:27:59.389 INFO:tasks.workunit.client.1.vm08.stdout:5/989: getdents d16/d1e/dc9/d10c/dfc 0 
2026-03-09T19:27:59.389 INFO:tasks.workunit.client.0.vm07.stdout:0/771: dread d0/d6/d13/d1c/d52/d81/fb0 [0,4194304] 0 2026-03-09T19:27:59.395 INFO:tasks.workunit.client.0.vm07.stdout:9/803: rename d0/db/d29/d32/d5c/d80/lea to d0/d6/d57/deb/dba/l117 0 2026-03-09T19:27:59.395 INFO:tasks.workunit.client.0.vm07.stdout:9/804: write d0/d6f/dc3/fff [858259,6689] 0 2026-03-09T19:27:59.406 INFO:tasks.workunit.client.0.vm07.stdout:3/860: rmdir d1 39 2026-03-09T19:27:59.406 INFO:tasks.workunit.client.0.vm07.stdout:1/793: write d1/d3e/db3/d6d/dff/fe7 [371400,32800] 0 2026-03-09T19:27:59.414 INFO:tasks.workunit.client.0.vm07.stdout:6/744: write d0/dbf/d95/d31/f3a [658049,93032] 0 2026-03-09T19:27:59.416 INFO:tasks.workunit.client.1.vm08.stdout:8/988: truncate de/d7c/fe7 2804128 0 2026-03-09T19:27:59.418 INFO:tasks.workunit.client.0.vm07.stdout:6/745: dread d0/d13/f18 [0,4194304] 0 2026-03-09T19:27:59.419 INFO:tasks.workunit.client.1.vm08.stdout:4/962: getdents da/d10/d16/d28/d2f/d4f/d64 0 2026-03-09T19:27:59.421 INFO:tasks.workunit.client.1.vm08.stdout:5/990: dwrite d16/fb8 [0,4194304] 0 2026-03-09T19:27:59.429 INFO:tasks.workunit.client.1.vm08.stdout:2/932: getdents d3/d9/d26/ded 0 2026-03-09T19:27:59.431 INFO:tasks.workunit.client.1.vm08.stdout:8/989: creat de/d47/dfd/d99/da0/d10d/f155 x:0 0 0 2026-03-09T19:27:59.432 INFO:tasks.workunit.client.1.vm08.stdout:4/963: unlink da/c1a 0 2026-03-09T19:27:59.434 INFO:tasks.workunit.client.1.vm08.stdout:5/991: creat d16/d1e/db3/f143 x:0 0 0 2026-03-09T19:27:59.436 INFO:tasks.workunit.client.1.vm08.stdout:2/933: mkdir d3/d4/d23/d2c/d39/d5e/de/d18/d99/d139 0 2026-03-09T19:27:59.445 INFO:tasks.workunit.client.1.vm08.stdout:2/934: dread d3/d4/f6 [4194304,4194304] 0 2026-03-09T19:27:59.446 INFO:tasks.workunit.client.1.vm08.stdout:2/935: chown d3/d4/f55 56597274 1 2026-03-09T19:27:59.447 INFO:tasks.workunit.client.1.vm08.stdout:8/990: truncate de/d47/fc1 708138 0 2026-03-09T19:27:59.447 INFO:tasks.workunit.client.1.vm08.stdout:8/991: dread - 
de/d47/fe8 zero size 2026-03-09T19:27:59.451 INFO:tasks.workunit.client.1.vm08.stdout:8/992: dwrite de/d47/dfd/d99/da5/ff8 [0,4194304] 0 2026-03-09T19:27:59.454 INFO:tasks.workunit.client.1.vm08.stdout:8/993: stat de/d1d/d69/fb7 0 2026-03-09T19:27:59.463 INFO:tasks.workunit.client.1.vm08.stdout:2/936: symlink d3/d4/d3e/l13a 0 2026-03-09T19:27:59.465 INFO:tasks.workunit.client.1.vm08.stdout:2/937: creat d3/d4/d3e/d4e/d88/f13b x:0 0 0 2026-03-09T19:27:59.466 INFO:tasks.workunit.client.1.vm08.stdout:2/938: dread - d3/d4/d23/d2c/d39/d5e/de/d18/f11f zero size 2026-03-09T19:27:59.476 INFO:tasks.workunit.client.1.vm08.stdout:5/992: dwrite d16/d45/daf/ff0 [0,4194304] 0 2026-03-09T19:27:59.483 INFO:tasks.workunit.client.0.vm07.stdout:4/775: link d3/f1a d3/d11/d29/d101/d99/de7/f10d 0 2026-03-09T19:27:59.485 INFO:tasks.workunit.client.0.vm07.stdout:8/797: dread - d7/d30/d75/fbf zero size 2026-03-09T19:27:59.487 INFO:tasks.workunit.client.0.vm07.stdout:5/775: dread - d3/d1a/d5a/fc6 zero size 2026-03-09T19:27:59.488 INFO:tasks.workunit.client.1.vm08.stdout:8/994: creat de/d1d/f156 x:0 0 0 2026-03-09T19:27:59.490 INFO:tasks.workunit.client.0.vm07.stdout:0/772: creat d0/d6/d13/d1c/d11/d56/f100 x:0 0 0 2026-03-09T19:27:59.493 INFO:tasks.workunit.client.1.vm08.stdout:5/993: dread d16/d1e/f7d [0,4194304] 0 2026-03-09T19:27:59.499 INFO:tasks.workunit.client.0.vm07.stdout:9/805: unlink d0/d17/ldc 0 2026-03-09T19:27:59.500 INFO:tasks.workunit.client.1.vm08.stdout:4/964: dread da/d10/d16/f4b [0,4194304] 0 2026-03-09T19:27:59.503 INFO:tasks.workunit.client.1.vm08.stdout:8/995: read - de/d25/d31/f131 zero size 2026-03-09T19:27:59.505 INFO:tasks.workunit.client.1.vm08.stdout:5/994: fsync d16/d1e/d3b/d61/d11e/d107/fce 0 2026-03-09T19:27:59.506 INFO:tasks.workunit.client.1.vm08.stdout:8/996: dread de/d7c/f95 [0,4194304] 0 2026-03-09T19:27:59.507 INFO:tasks.workunit.client.1.vm08.stdout:8/997: chown de/d91/dc8/de9/fff 30541 1 2026-03-09T19:27:59.507 
INFO:tasks.workunit.client.1.vm08.stdout:8/998: readlink de/d1d/d2e/l3c 0 2026-03-09T19:27:59.508 INFO:tasks.workunit.client.1.vm08.stdout:4/965: mknod da/d10/d26/d27/da6/c127 0 2026-03-09T19:27:59.509 INFO:tasks.workunit.client.1.vm08.stdout:4/966: read - da/d10/d16/d28/d2f/d4f/d56/dd0/dc0/f123 zero size 2026-03-09T19:27:59.518 INFO:tasks.workunit.client.1.vm08.stdout:2/939: truncate d3/d4/d23/d2c/fe6 4873647 0 2026-03-09T19:27:59.520 INFO:tasks.workunit.client.0.vm07.stdout:2/856: write d3/dd/d16/d29/d2d/d45/dc3/f9c [5876059,126598] 0 2026-03-09T19:27:59.521 INFO:tasks.workunit.client.0.vm07.stdout:6/746: rmdir d0/dbf 39 2026-03-09T19:27:59.527 INFO:tasks.workunit.client.0.vm07.stdout:7/743: link d0/d4/d5/d26/lc0 d0/d80/db1/lfe 0 2026-03-09T19:27:59.527 INFO:tasks.workunit.client.1.vm08.stdout:5/995: fsync d16/d1e/f27 0 2026-03-09T19:27:59.528 INFO:tasks.workunit.client.0.vm07.stdout:1/794: dwrite d1/db/d31/d4f/ffc [0,4194304] 0 2026-03-09T19:27:59.530 INFO:tasks.workunit.client.0.vm07.stdout:1/795: stat d1/d11/d37/d3f/d45/d87/fe6 0 2026-03-09T19:27:59.530 INFO:tasks.workunit.client.1.vm08.stdout:8/999: mknod de/d91/dd5/c157 0 2026-03-09T19:27:59.532 INFO:tasks.workunit.client.0.vm07.stdout:4/776: write d3/d11/f74 [871383,111192] 0 2026-03-09T19:27:59.538 INFO:tasks.workunit.client.1.vm08.stdout:2/940: creat d3/d4/d10e/f13c x:0 0 0 2026-03-09T19:27:59.538 INFO:tasks.workunit.client.0.vm07.stdout:4/777: chown d3/d11/d2b/f2c 1256651996 1 2026-03-09T19:27:59.539 INFO:tasks.workunit.client.1.vm08.stdout:2/941: chown d3/d4/d23/d2c/d39 297406 1 2026-03-09T19:27:59.540 INFO:tasks.workunit.client.0.vm07.stdout:8/798: creat d7/d9/d37/d45/d4f/db1/d107/f11b x:0 0 0 2026-03-09T19:27:59.541 INFO:tasks.workunit.client.0.vm07.stdout:5/776: mkdir d3/dd/d26/d2d/d60/df7 0 2026-03-09T19:27:59.542 INFO:tasks.workunit.client.0.vm07.stdout:8/799: write d7/d9/d10/dd8/dfd/d62/fc2 [2998728,112206] 0 2026-03-09T19:27:59.542 INFO:tasks.workunit.client.0.vm07.stdout:0/773: rmdir 
d0/d6/d13/dd0 39 2026-03-09T19:27:59.545 INFO:tasks.workunit.client.0.vm07.stdout:8/800: write d7/d50/da6/f109 [541871,57539] 0 2026-03-09T19:27:59.549 INFO:tasks.workunit.client.1.vm08.stdout:2/942: symlink d3/d4/d23/d2c/d39/d5e/db8/l13d 0 2026-03-09T19:27:59.555 INFO:tasks.workunit.client.1.vm08.stdout:2/943: dread d3/d4/d23/d2c/d39/d5e/de/d8b/f76 [0,4194304] 0 2026-03-09T19:27:59.555 INFO:tasks.workunit.client.0.vm07.stdout:7/744: rmdir d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58 39 2026-03-09T19:27:59.555 INFO:tasks.workunit.client.0.vm07.stdout:1/796: fsync d1/d3/f12 0 2026-03-09T19:27:59.555 INFO:tasks.workunit.client.0.vm07.stdout:9/806: mkdir d0/d6/d57/d5d/dde/d118 0 2026-03-09T19:27:59.555 INFO:tasks.workunit.client.0.vm07.stdout:3/861: mknod d1/d1f/c117 0 2026-03-09T19:27:59.562 INFO:tasks.workunit.client.0.vm07.stdout:7/745: readlink d0/d4/l48 0 2026-03-09T19:27:59.562 INFO:tasks.workunit.client.0.vm07.stdout:1/797: fdatasync d1/d11/d37/d3f/d45/f16 0 2026-03-09T19:27:59.563 INFO:tasks.workunit.client.0.vm07.stdout:5/777: symlink d3/dd/lf8 0 2026-03-09T19:27:59.565 INFO:tasks.workunit.client.1.vm08.stdout:2/944: dread d3/d4/d23/d2c/d39/d5e/de/d18/f50 [0,4194304] 0 2026-03-09T19:27:59.568 INFO:tasks.workunit.client.0.vm07.stdout:8/801: sync 2026-03-09T19:27:59.570 INFO:tasks.workunit.client.1.vm08.stdout:4/967: write da/d10/d16/d28/d46/d52/f5b [1385628,126049] 0 2026-03-09T19:27:59.570 INFO:tasks.workunit.client.0.vm07.stdout:9/807: truncate d0/d6/d57/deb/fd2 35830 0 2026-03-09T19:27:59.570 INFO:tasks.workunit.client.0.vm07.stdout:4/778: link d3/d11/d29/d101/d99/de7/l109 d3/d11/d16/df5/l10e 0 2026-03-09T19:27:59.572 INFO:tasks.workunit.client.1.vm08.stdout:5/996: dwrite d16/d1e/d8c/d99/da8/fe2 [0,4194304] 0 2026-03-09T19:27:59.574 INFO:tasks.workunit.client.1.vm08.stdout:5/997: read - d16/d1e/d3b/d61/d11e/f133 zero size 2026-03-09T19:27:59.577 INFO:tasks.workunit.client.0.vm07.stdout:6/747: write d0/d1/d28/da8/fd2 [663888,1110] 0 2026-03-09T19:27:59.577 
INFO:tasks.workunit.client.0.vm07.stdout:4/779: stat d3/d11/d2b/fba 0 2026-03-09T19:27:59.579 INFO:tasks.workunit.client.1.vm08.stdout:2/945: mknod d3/d9/d79/d46/d8c/d92/c13e 0 2026-03-09T19:27:59.582 INFO:tasks.workunit.client.0.vm07.stdout:3/862: write d1/f98 [721553,123756] 0 2026-03-09T19:27:59.586 INFO:tasks.workunit.client.0.vm07.stdout:0/774: symlink d0/d6/d13/dd0/l101 0 2026-03-09T19:27:59.586 INFO:tasks.workunit.client.0.vm07.stdout:2/857: dwrite d3/dd/d16/d29/d2d/d45/f62 [4194304,4194304] 0 2026-03-09T19:27:59.594 INFO:tasks.workunit.client.1.vm08.stdout:5/998: fdatasync d16/d1e/f27 0 2026-03-09T19:27:59.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:59 vm08.local ceph-mon[57794]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:27:59.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:59 vm08.local ceph-mon[57794]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:27:59.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:59 vm08.local ceph-mon[57794]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:27:59.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:59 vm08.local ceph-mon[57794]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring 2026-03-09T19:27:59.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:59 vm08.local ceph-mon[57794]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring 2026-03-09T19:27:59.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:59 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:59.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:59 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:59.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:59 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 
2026-03-09T19:27:59.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:27:59 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:27:59.595 INFO:tasks.workunit.client.1.vm08.stdout:2/946: read d3/d4/f91 [3963734,13645] 0 2026-03-09T19:27:59.599 INFO:tasks.workunit.client.1.vm08.stdout:4/968: mknod da/d10/d26/d27/c128 0 2026-03-09T19:27:59.599 INFO:tasks.workunit.client.1.vm08.stdout:4/969: chown da/d10/d26/d27/fac 1 1 2026-03-09T19:27:59.602 INFO:tasks.workunit.client.0.vm07.stdout:9/808: unlink d0/d6/d57/l100 0 2026-03-09T19:27:59.604 INFO:tasks.workunit.client.1.vm08.stdout:2/947: unlink d3/d4/d23/d2c/d39/f10f 0 2026-03-09T19:27:59.611 INFO:tasks.workunit.client.0.vm07.stdout:4/780: creat d3/dbe/f10f x:0 0 0 2026-03-09T19:27:59.612 INFO:tasks.workunit.client.0.vm07.stdout:1/798: symlink d1/db/d31/l110 0 2026-03-09T19:27:59.613 INFO:tasks.workunit.client.0.vm07.stdout:3/863: read - d1/d3d/d47/db3/faf zero size 2026-03-09T19:27:59.614 INFO:tasks.workunit.client.0.vm07.stdout:3/864: truncate d1/d6/d45/d54/de5/f10c 763752 0 2026-03-09T19:27:59.614 INFO:tasks.workunit.client.1.vm08.stdout:2/948: fdatasync d3/d4/d23/d2c/d39/d5e/de/d18/fad 0 2026-03-09T19:27:59.616 INFO:tasks.workunit.client.1.vm08.stdout:2/949: write d3/d4/d23/d2c/d39/db9/df6/f119 [939795,23923] 0 2026-03-09T19:27:59.625 INFO:tasks.workunit.client.0.vm07.stdout:2/858: truncate d3/dd/d16/f5f 1245081 0 2026-03-09T19:27:59.627 INFO:tasks.workunit.client.0.vm07.stdout:7/746: symlink d0/d4/d5/d26/dfb/lff 0 2026-03-09T19:27:59.628 INFO:tasks.workunit.client.1.vm08.stdout:2/950: creat d3/d4/d23/d2c/d39/d5e/de/d18/d99/f13f x:0 0 0 2026-03-09T19:27:59.629 INFO:tasks.workunit.client.0.vm07.stdout:5/778: dwrite d3/d1a/d28/d6c/d72/d8f/f91 [0,4194304] 0 2026-03-09T19:27:59.632 INFO:tasks.workunit.client.1.vm08.stdout:5/999: dwrite d16/d1e/f7d [0,4194304] 0 2026-03-09T19:27:59.636 INFO:tasks.workunit.client.0.vm07.stdout:8/802: truncate d7/d16/f71 1288090 0 
2026-03-09T19:27:59.640 INFO:tasks.workunit.client.1.vm08.stdout:2/951: creat d3/d4/d23/d2c/d39/d5e/de/d18/da9/d110/f140 x:0 0 0 2026-03-09T19:27:59.641 INFO:tasks.workunit.client.0.vm07.stdout:9/809: symlink d0/d6f/dc3/df8/dfc/l119 0 2026-03-09T19:27:59.642 INFO:tasks.workunit.client.0.vm07.stdout:6/748: rmdir d0/dbf/d95/d31/d9e 39 2026-03-09T19:27:59.644 INFO:tasks.workunit.client.1.vm08.stdout:2/952: write d3/d4/d23/d2c/f31 [14934,58423] 0 2026-03-09T19:27:59.645 INFO:tasks.workunit.client.0.vm07.stdout:4/781: truncate d3/d11/d51/faa 868170 0 2026-03-09T19:27:59.648 INFO:tasks.workunit.client.1.vm08.stdout:4/970: dwrite da/d10/f25 [0,4194304] 0 2026-03-09T19:27:59.652 INFO:tasks.workunit.client.0.vm07.stdout:3/865: rmdir d1/d6/dd 39 2026-03-09T19:27:59.652 INFO:tasks.workunit.client.0.vm07.stdout:3/866: chown d1/d3d/d47/db3/dc2/f3a 149 1 2026-03-09T19:27:59.656 INFO:tasks.workunit.client.1.vm08.stdout:2/953: fdatasync d3/d9/d79/f6b 0 2026-03-09T19:27:59.658 INFO:tasks.workunit.client.0.vm07.stdout:8/803: symlink d7/d1d/d83/d9f/l11c 0 2026-03-09T19:27:59.659 INFO:tasks.workunit.client.1.vm08.stdout:4/971: creat da/d10/de3/f129 x:0 0 0 2026-03-09T19:27:59.661 INFO:tasks.workunit.client.0.vm07.stdout:9/810: unlink d0/db/d29/d32/l35 0 2026-03-09T19:27:59.661 INFO:tasks.workunit.client.0.vm07.stdout:6/749: truncate d0/d4e/d7f/fbc 264319 0 2026-03-09T19:27:59.662 INFO:tasks.workunit.client.0.vm07.stdout:4/782: dread - d3/dbe/ff3 zero size 2026-03-09T19:27:59.663 INFO:tasks.workunit.client.0.vm07.stdout:4/783: readlink d3/d11/d29/l62 0 2026-03-09T19:27:59.663 INFO:tasks.workunit.client.1.vm08.stdout:4/972: mkdir da/d10/d1b/d10c/d12a 0 2026-03-09T19:27:59.664 INFO:tasks.workunit.client.0.vm07.stdout:4/784: readlink d3/lf 0 2026-03-09T19:27:59.666 INFO:tasks.workunit.client.0.vm07.stdout:3/867: unlink d1/d6/d4c/d97/ca2 0 2026-03-09T19:27:59.671 INFO:tasks.workunit.client.0.vm07.stdout:3/868: dread d1/d1f/f9c [0,4194304] 0 2026-03-09T19:27:59.671 
INFO:tasks.workunit.client.1.vm08.stdout:4/973: dwrite da/d10/d16/d28/d2f/d4f/d103/d40/ff8 [4194304,4194304] 0 2026-03-09T19:27:59.671 INFO:tasks.workunit.client.1.vm08.stdout:2/954: creat d3/d9/d79/d46/d8c/f141 x:0 0 0 2026-03-09T19:27:59.671 INFO:tasks.workunit.client.1.vm08.stdout:2/955: creat d3/dca/f142 x:0 0 0 2026-03-09T19:27:59.674 INFO:tasks.workunit.client.0.vm07.stdout:5/779: creat d3/dd/d26/d3f/d47/de6/ff9 x:0 0 0 2026-03-09T19:27:59.682 INFO:tasks.workunit.client.0.vm07.stdout:8/804: symlink d7/d30/d32/l11d 0 2026-03-09T19:27:59.683 INFO:tasks.workunit.client.1.vm08.stdout:2/956: symlink d3/d4/d23/d2c/d39/d5e/de/d18/da9/d110/l143 0 2026-03-09T19:27:59.685 INFO:tasks.workunit.client.0.vm07.stdout:9/811: rename d0/db/d29/d4d/l97 to d0/d6/d3a/dd3/l11a 0 2026-03-09T19:27:59.685 INFO:tasks.workunit.client.0.vm07.stdout:0/775: getdents d0/d6/d13 0 2026-03-09T19:27:59.688 INFO:tasks.workunit.client.0.vm07.stdout:4/785: dread - d3/d11/d16/fae zero size 2026-03-09T19:27:59.689 INFO:tasks.workunit.client.0.vm07.stdout:5/780: mkdir d3/dd/d26/d3f/d47/d71/dfa 0 2026-03-09T19:27:59.692 INFO:tasks.workunit.client.0.vm07.stdout:9/812: symlink d0/db/d29/d2c/d36/d5a/l11b 0 2026-03-09T19:27:59.694 INFO:tasks.workunit.client.0.vm07.stdout:4/786: truncate d3/f7 1827811 0 2026-03-09T19:27:59.700 INFO:tasks.workunit.client.0.vm07.stdout:3/869: creat d1/d3d/d47/db3/dc2/d28/dc4/f118 x:0 0 0 2026-03-09T19:27:59.700 INFO:tasks.workunit.client.0.vm07.stdout:0/776: mknod d0/d6/d13/d17/d19/d58/dd9/df8/c102 0 2026-03-09T19:27:59.700 INFO:tasks.workunit.client.0.vm07.stdout:3/870: mknod d1/d3d/d47/db3/dc2/d28/dc4/c119 0 2026-03-09T19:27:59.707 INFO:tasks.workunit.client.0.vm07.stdout:5/781: dread d3/d1a/fb [0,4194304] 0 2026-03-09T19:27:59.714 INFO:tasks.workunit.client.0.vm07.stdout:5/782: dread d3/dd/d26/d3f/fb3 [0,4194304] 0 2026-03-09T19:27:59.717 INFO:tasks.workunit.client.0.vm07.stdout:9/813: dread d0/db/d29/da8/fab [0,4194304] 0 2026-03-09T19:27:59.747 
INFO:tasks.workunit.client.0.vm07.stdout:0/777: sync 2026-03-09T19:27:59.754 INFO:tasks.workunit.client.1.vm08.stdout:2/957: sync 2026-03-09T19:27:59.755 INFO:tasks.workunit.client.1.vm08.stdout:2/958: dread - d3/d4/d23/d2c/f64 zero size 2026-03-09T19:27:59.764 INFO:tasks.workunit.client.1.vm08.stdout:2/959: dread d3/d4/d23/d2c/d39/d5e/de/d18/f2d [0,4194304] 0 2026-03-09T19:27:59.766 INFO:tasks.workunit.client.1.vm08.stdout:2/960: mkdir d3/d4/d23/d2c/d39/db9/d144 0 2026-03-09T19:27:59.766 INFO:tasks.workunit.client.1.vm08.stdout:2/961: fdatasync d3/d4/d10e/f12c 0 2026-03-09T19:27:59.772 INFO:tasks.workunit.client.1.vm08.stdout:2/962: creat d3/d4/d23/d2c/d39/d5e/de/d18/d99/dd4/f145 x:0 0 0 2026-03-09T19:27:59.777 INFO:tasks.workunit.client.0.vm07.stdout:1/799: dwrite d1/d11/d37/d3f/f82 [0,4194304] 0 2026-03-09T19:27:59.779 INFO:tasks.workunit.client.0.vm07.stdout:1/800: chown d1/d11/d37/d3f/d6e/d9c/db6/cf1 2356300 1 2026-03-09T19:27:59.779 INFO:tasks.workunit.client.1.vm08.stdout:4/974: dwrite da/d10/d16/d28/d2f/fd4 [0,4194304] 0 2026-03-09T19:27:59.779 INFO:tasks.workunit.client.0.vm07.stdout:2/859: dwrite d3/dd/d16/d29/d2d/f6d [4194304,4194304] 0 2026-03-09T19:27:59.782 INFO:tasks.workunit.client.0.vm07.stdout:7/747: dwrite d0/d80/db1/ff7 [0,4194304] 0 2026-03-09T19:27:59.792 INFO:tasks.workunit.client.1.vm08.stdout:4/975: dwrite da/d10/d16/d28/d2f/d4f/d56/d90/f114 [0,4194304] 0 2026-03-09T19:27:59.807 INFO:tasks.workunit.client.0.vm07.stdout:8/805: write d7/d9/f36 [928124,121554] 0 2026-03-09T19:27:59.808 INFO:tasks.workunit.client.0.vm07.stdout:1/801: dread d1/f1d [0,4194304] 0 2026-03-09T19:27:59.812 INFO:tasks.workunit.client.0.vm07.stdout:2/860: rmdir d3/dd/d16/d29/d3c/d5a/d7a 39 2026-03-09T19:27:59.815 INFO:tasks.workunit.client.0.vm07.stdout:6/750: truncate d0/d1/db/d17/dc4/d7b/d7d/fa7 19847 0 2026-03-09T19:27:59.815 INFO:tasks.workunit.client.0.vm07.stdout:8/806: rmdir d7/d30/d75 39 2026-03-09T19:27:59.825 INFO:tasks.workunit.client.0.vm07.stdout:4/787: 
truncate d3/d4f/d56/d5f/fc2 840150 0 2026-03-09T19:27:59.825 INFO:tasks.workunit.client.0.vm07.stdout:4/788: chown d3/d11/d2b/f2c 161414496 1 2026-03-09T19:27:59.826 INFO:tasks.workunit.client.0.vm07.stdout:2/861: truncate d3/dd/d16/d29/d2d/d45/d3b/d44/d96/ddf/fe3 3852404 0 2026-03-09T19:27:59.827 INFO:tasks.workunit.client.0.vm07.stdout:7/748: link d0/d4/d5/d8/d1a/lc3 d0/d4/d5/d8/l100 0 2026-03-09T19:27:59.829 INFO:tasks.workunit.client.0.vm07.stdout:3/871: write d1/d6/dd/dbf/fcf [544203,35172] 0 2026-03-09T19:27:59.830 INFO:tasks.workunit.client.0.vm07.stdout:2/862: write d3/dd/d16/d29/d3c/da2/f10d [843546,15665] 0 2026-03-09T19:27:59.832 INFO:tasks.workunit.client.0.vm07.stdout:7/749: write d0/d4/d5/d8/d41/fe0 [323093,81433] 0 2026-03-09T19:27:59.837 INFO:tasks.workunit.client.0.vm07.stdout:3/872: dread d1/f20 [0,4194304] 0 2026-03-09T19:27:59.837 INFO:tasks.workunit.client.0.vm07.stdout:9/814: write d0/d6/f48 [1858952,112392] 0 2026-03-09T19:27:59.837 INFO:tasks.workunit.client.0.vm07.stdout:5/783: write d3/d1a/d28/d40/d92/f5e [772577,48398] 0 2026-03-09T19:27:59.837 INFO:tasks.workunit.client.0.vm07.stdout:2/863: chown d3/dd/d16/d29/d3c/d5a/d7a/d74 0 1 2026-03-09T19:27:59.837 INFO:tasks.workunit.client.0.vm07.stdout:0/778: write d0/d6/d13/dd0/fd5 [666730,119102] 0 2026-03-09T19:27:59.844 INFO:tasks.workunit.client.0.vm07.stdout:0/779: rmdir d0/d6/d13/d17/d19/d57/d6a 39 2026-03-09T19:27:59.844 INFO:tasks.workunit.client.1.vm08.stdout:4/976: dread da/d10/d16/d28/d2f/d4f/d64/d81/f86 [0,4194304] 0 2026-03-09T19:27:59.845 INFO:tasks.workunit.client.0.vm07.stdout:7/750: read d0/d4/d5/d8/d1a/f4d [1769999,123560] 0 2026-03-09T19:27:59.845 INFO:tasks.workunit.client.1.vm08.stdout:4/977: read da/d10/d16/d28/d2f/d4f/d103/d40/d6c/f92 [726746,59667] 0 2026-03-09T19:27:59.848 INFO:tasks.workunit.client.0.vm07.stdout:2/864: mknod d3/dd/d16/d29/d3c/d5a/db3/c12d 0 2026-03-09T19:27:59.849 INFO:tasks.workunit.client.1.vm08.stdout:4/978: truncate da/d10/d16/f9f 4320232 0 
2026-03-09T19:27:59.851 INFO:tasks.workunit.client.0.vm07.stdout:5/784: mkdir d3/d1a/d28/d40/d92/d89/ddc/dde/dfb 0 2026-03-09T19:27:59.854 INFO:tasks.workunit.client.1.vm08.stdout:4/979: fsync da/d10/d26/d3a/f88 0 2026-03-09T19:27:59.858 INFO:tasks.workunit.client.1.vm08.stdout:4/980: dread - da/d10/d16/d28/d2f/d4f/d64/d81/dfb/f11f zero size 2026-03-09T19:27:59.863 INFO:tasks.workunit.client.0.vm07.stdout:7/751: sync 2026-03-09T19:27:59.864 INFO:tasks.workunit.client.0.vm07.stdout:7/752: write d0/d4/d5/d8/d41/d64/dd5/ffc [160539,115732] 0 2026-03-09T19:27:59.871 INFO:tasks.workunit.client.1.vm08.stdout:2/963: write d3/d4/d3e/d4e/d88/db0/ff3 [628747,108253] 0 2026-03-09T19:27:59.876 INFO:tasks.workunit.client.0.vm07.stdout:9/815: link d0/d6/d3a/d94/caa d0/db/d29/c11c 0 2026-03-09T19:27:59.880 INFO:tasks.workunit.client.0.vm07.stdout:2/865: creat d3/dd/d16/d29/d3c/d4c/d128/f12e x:0 0 0 2026-03-09T19:27:59.882 INFO:tasks.workunit.client.1.vm08.stdout:2/964: unlink d3/d4/d3e/d4e/d88/l8a 0 2026-03-09T19:27:59.884 INFO:tasks.workunit.client.0.vm07.stdout:0/780: link d0/f9f d0/d6/d13/d17/d19/d58/dd9/f103 0 2026-03-09T19:27:59.887 INFO:tasks.workunit.client.1.vm08.stdout:2/965: dread - d3/d9/fdd zero size 2026-03-09T19:27:59.890 INFO:tasks.workunit.client.1.vm08.stdout:2/966: dread d3/d9/d26/f12d [0,4194304] 0 2026-03-09T19:27:59.891 INFO:tasks.workunit.client.1.vm08.stdout:2/967: dread - d3/d4/d23/d2c/d39/da3/f133 zero size 2026-03-09T19:27:59.897 INFO:tasks.workunit.client.0.vm07.stdout:7/753: creat d0/d4/d5/d8/d1a/d2a/dc5/f101 x:0 0 0 2026-03-09T19:27:59.900 INFO:tasks.workunit.client.0.vm07.stdout:0/781: dread d0/d6/d13/d33/f35 [0,4194304] 0 2026-03-09T19:27:59.905 INFO:tasks.workunit.client.0.vm07.stdout:5/785: creat d3/d1a/d28/d6c/ffc x:0 0 0 2026-03-09T19:27:59.906 INFO:tasks.workunit.client.0.vm07.stdout:0/782: symlink d0/d6/d13/d17/d19/d58/dd9/l104 0 2026-03-09T19:27:59.907 INFO:tasks.workunit.client.0.vm07.stdout:5/786: fsync d3/dd/dbe/fce 0 
2026-03-09T19:27:59.908 INFO:tasks.workunit.client.0.vm07.stdout:7/754: rmdir d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58 39 2026-03-09T19:27:59.908 INFO:tasks.workunit.client.0.vm07.stdout:9/816: dread d0/db/d29/d68/f8e [0,4194304] 0 2026-03-09T19:27:59.915 INFO:tasks.workunit.client.0.vm07.stdout:1/802: dwrite d1/d3e/db3/fda [0,4194304] 0 2026-03-09T19:27:59.917 INFO:tasks.workunit.client.0.vm07.stdout:0/783: unlink d0/d6/d13/d17/dc3/f7d 0 2026-03-09T19:27:59.918 INFO:tasks.workunit.client.0.vm07.stdout:6/751: dwrite d0/d1/db/d1d/f2e [0,4194304] 0 2026-03-09T19:27:59.919 INFO:tasks.workunit.client.0.vm07.stdout:6/752: chown d0/d4e/l56 6 1 2026-03-09T19:27:59.921 INFO:tasks.workunit.client.0.vm07.stdout:8/807: write d7/d50/da6/fde [1898116,13652] 0 2026-03-09T19:27:59.921 INFO:tasks.workunit.client.0.vm07.stdout:4/789: write d3/d4f/d56/d5f/f72 [385301,640] 0 2026-03-09T19:27:59.926 INFO:tasks.workunit.client.0.vm07.stdout:3/873: write d1/d6/f9 [1763475,36033] 0 2026-03-09T19:27:59.931 INFO:tasks.workunit.client.0.vm07.stdout:1/803: mkdir d1/d11/d37/d3f/d7e/d111 0 2026-03-09T19:27:59.933 INFO:tasks.workunit.client.1.vm08.stdout:4/981: truncate da/d10/d16/d28/d2f/d4f/d56/d90/f114 464743 0 2026-03-09T19:27:59.935 INFO:tasks.workunit.client.1.vm08.stdout:4/982: chown da/d10/d16/d28/d2f/d4f/d56/dd0/fc6 271 1 2026-03-09T19:27:59.937 INFO:tasks.workunit.client.0.vm07.stdout:2/866: write d3/f93 [4470031,44792] 0 2026-03-09T19:27:59.937 INFO:tasks.workunit.client.1.vm08.stdout:2/968: write d3/d9/d26/f52 [4551718,96346] 0 2026-03-09T19:27:59.938 INFO:tasks.workunit.client.1.vm08.stdout:2/969: readlink d3/d4/d3e/d4e/la2 0 2026-03-09T19:27:59.948 INFO:tasks.workunit.client.1.vm08.stdout:4/983: chown da/c76 203 1 2026-03-09T19:27:59.948 INFO:tasks.workunit.client.1.vm08.stdout:2/970: mkdir d3/d4/d3e/df2/d146 0 2026-03-09T19:27:59.948 INFO:tasks.workunit.client.0.vm07.stdout:0/784: truncate d0/d6/dc8/f94 1530062 0 2026-03-09T19:27:59.948 INFO:tasks.workunit.client.0.vm07.stdout:0/785: 
readlink d0/l9d 0 2026-03-09T19:27:59.948 INFO:tasks.workunit.client.0.vm07.stdout:6/753: rename d0/d1/db/d17/dc4 to d0/d2d/dd5/d123 0 2026-03-09T19:27:59.949 INFO:tasks.workunit.client.0.vm07.stdout:7/755: mknod d0/c102 0 2026-03-09T19:27:59.949 INFO:tasks.workunit.client.0.vm07.stdout:7/756: write d0/d4/d5/d8/d41/d64/d74/d98/f83 [960052,92836] 0 2026-03-09T19:27:59.949 INFO:tasks.workunit.client.0.vm07.stdout:7/757: write d0/d4/d5/d8/f37 [2518251,38297] 0 2026-03-09T19:27:59.949 INFO:tasks.workunit.client.0.vm07.stdout:8/808: mkdir d7/d9/d37/d45/d4f/db1/d107/d11e 0 2026-03-09T19:27:59.952 INFO:tasks.workunit.client.1.vm08.stdout:4/984: rmdir da/d10/d16/d28/d2f/d4f/d64 39 2026-03-09T19:27:59.956 INFO:tasks.workunit.client.0.vm07.stdout:3/874: readlink d1/l2c 0 2026-03-09T19:27:59.959 INFO:tasks.workunit.client.1.vm08.stdout:2/971: getdents d3/d4/d23/d2c/d39/d5e/db8 0 2026-03-09T19:27:59.963 INFO:tasks.workunit.client.0.vm07.stdout:8/809: read d7/d50/fe5 [1563864,9538] 0 2026-03-09T19:27:59.964 INFO:tasks.workunit.client.1.vm08.stdout:2/972: link d3/dca/f11c d3/d9/d79/d46/d8c/d92/d120/f147 0 2026-03-09T19:27:59.969 INFO:tasks.workunit.client.1.vm08.stdout:2/973: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/da9/f118 [0,4194304] 0 2026-03-09T19:27:59.973 INFO:tasks.workunit.client.0.vm07.stdout:3/875: rmdir d1/d3d/d47/db3/d8e/dee 39 2026-03-09T19:27:59.974 INFO:tasks.workunit.client.0.vm07.stdout:2/867: mkdir d3/d49/d114/d12f 0 2026-03-09T19:27:59.974 INFO:tasks.workunit.client.1.vm08.stdout:2/974: link d3/d4/d23/d2c/dc1/fe2 d3/d9/d79/d46/f148 0 2026-03-09T19:27:59.975 INFO:tasks.workunit.client.0.vm07.stdout:8/810: creat d7/d9/d10/dd8/dfd/d67/f11f x:0 0 0 2026-03-09T19:27:59.990 INFO:tasks.workunit.client.0.vm07.stdout:5/787: write d3/d1a/d5a/db8/fed [4081539,11736] 0 2026-03-09T19:27:59.990 INFO:tasks.workunit.client.1.vm08.stdout:4/985: sync 2026-03-09T19:27:59.991 INFO:tasks.workunit.client.0.vm07.stdout:5/788: truncate d3/d1a/fe4 802001 0 2026-03-09T19:27:59.991 
INFO:tasks.workunit.client.1.vm08.stdout:4/986: chown da/d10/d26/d3a/db5/de6 1959 1 2026-03-09T19:27:59.994 INFO:tasks.workunit.client.1.vm08.stdout:2/975: rename d3/d4/d23/d2c/d39/d5e/de/d18/f2d to d3/d9/d79/d46/d8c/f149 0 2026-03-09T19:27:59.994 INFO:tasks.workunit.client.0.vm07.stdout:5/789: dwrite d3/f93 [0,4194304] 0 2026-03-09T19:28:00.002 INFO:tasks.workunit.client.1.vm08.stdout:4/987: mknod da/d10/d16/d28/d46/d121/c12b 0 2026-03-09T19:28:00.003 INFO:tasks.workunit.client.0.vm07.stdout:3/876: fsync d1/d3d/d47/db3/faf 0 2026-03-09T19:28:00.013 INFO:tasks.workunit.client.1.vm08.stdout:4/988: dread da/d10/d26/d27/d32/f45 [4194304,4194304] 0 2026-03-09T19:28:00.015 INFO:tasks.workunit.client.0.vm07.stdout:1/804: write d1/db/d31/d4f/f8c [2716367,87395] 0 2026-03-09T19:28:00.019 INFO:tasks.workunit.client.0.vm07.stdout:0/786: write d0/d6/d13/f6c [1944237,80] 0 2026-03-09T19:28:00.020 INFO:tasks.workunit.client.0.vm07.stdout:1/805: chown d1/f6 28 1 2026-03-09T19:28:00.020 INFO:tasks.workunit.client.0.vm07.stdout:6/754: getdents d0/d1/db/d52 0 2026-03-09T19:28:00.021 INFO:tasks.workunit.client.0.vm07.stdout:9/817: dwrite d0/d17/f1f [0,4194304] 0 2026-03-09T19:28:00.022 INFO:tasks.workunit.client.0.vm07.stdout:9/818: chown d0/d6/d3a/f89 1781 1 2026-03-09T19:28:00.029 INFO:tasks.workunit.client.0.vm07.stdout:7/758: dread d0/d4/d5/d8/d41/fe0 [0,4194304] 0 2026-03-09T19:28:00.032 INFO:tasks.workunit.client.1.vm08.stdout:4/989: creat da/d10/d26/d3a/db5/ddb/f12c x:0 0 0 2026-03-09T19:28:00.032 INFO:tasks.workunit.client.1.vm08.stdout:4/990: readlink l7 0 2026-03-09T19:28:00.032 INFO:tasks.workunit.client.0.vm07.stdout:5/790: mknod d3/dd/d26/d2d/d9e/cfd 0 2026-03-09T19:28:00.033 INFO:tasks.workunit.client.1.vm08.stdout:4/991: chown da/d10/d16/d28/d46/d121/c12b 3459 1 2026-03-09T19:28:00.034 INFO:tasks.workunit.client.1.vm08.stdout:2/976: write d3/d9/d79/f86 [822023,48980] 0 2026-03-09T19:28:00.035 INFO:tasks.workunit.client.0.vm07.stdout:2/868: dwrite d3/dd/d16/d30/da7/fd8 
[0,4194304] 0 2026-03-09T19:28:00.037 INFO:tasks.workunit.client.1.vm08.stdout:4/992: fdatasync da/d10/d16/d28/d2f/d4f/d103/d40/fc2 0 2026-03-09T19:28:00.037 INFO:tasks.workunit.client.0.vm07.stdout:4/790: link d3/d11/d29/d101/d99/de7/l109 d3/d4f/d56/l110 0 2026-03-09T19:28:00.042 INFO:tasks.workunit.client.1.vm08.stdout:4/993: symlink da/d10/d26/d3a/l12d 0 2026-03-09T19:28:00.048 INFO:tasks.workunit.client.1.vm08.stdout:2/977: dread d3/d9/d79/d46/d8c/fa5 [0,4194304] 0 2026-03-09T19:28:00.048 INFO:tasks.workunit.client.0.vm07.stdout:1/806: symlink d1/d11/d37/dcb/l112 0 2026-03-09T19:28:00.048 INFO:tasks.workunit.client.0.vm07.stdout:1/807: chown d1/d11/l70 98699886 1 2026-03-09T19:28:00.051 INFO:tasks.workunit.client.0.vm07.stdout:9/819: rename d0/db/d29/d68/fcb to d0/d6/d73/dbe/f11d 0 2026-03-09T19:28:00.052 INFO:tasks.workunit.client.1.vm08.stdout:2/978: creat d3/d4/d3e/d9d/f14a x:0 0 0 2026-03-09T19:28:00.058 INFO:tasks.workunit.client.0.vm07.stdout:1/808: fdatasync d1/db/d31/fa8 0 2026-03-09T19:28:00.061 INFO:tasks.workunit.client.0.vm07.stdout:7/759: mknod d0/d4/d5/d26/db9/dc2/c103 0 2026-03-09T19:28:00.062 INFO:tasks.workunit.client.0.vm07.stdout:7/760: readlink d0/d4/d5/d99/lf6 0 2026-03-09T19:28:00.062 INFO:tasks.workunit.client.0.vm07.stdout:8/811: truncate d7/d9/d10/d44/d9a/f8a 1828775 0 2026-03-09T19:28:00.063 INFO:tasks.workunit.client.0.vm07.stdout:3/877: write d1/d6/dd/f33 [369175,64684] 0 2026-03-09T19:28:00.064 INFO:tasks.workunit.client.0.vm07.stdout:7/761: write d0/d4/d5/d8/d41/d64/d74/d98/f47 [797755,22081] 0 2026-03-09T19:28:00.065 INFO:tasks.workunit.client.0.vm07.stdout:7/762: chown d0/d4/d5/d8/d1a/f4d 172 1 2026-03-09T19:28:00.068 INFO:tasks.workunit.client.0.vm07.stdout:5/791: creat d3/dd/ffe x:0 0 0 2026-03-09T19:28:00.070 INFO:tasks.workunit.client.0.vm07.stdout:5/792: dread d3/dd/d95/fc1 [0,4194304] 0 2026-03-09T19:28:00.071 INFO:tasks.workunit.client.0.vm07.stdout:8/812: truncate d7/d1d/f96 1704281 0 2026-03-09T19:28:00.072 
INFO:tasks.workunit.client.0.vm07.stdout:8/813: chown d7/d9/d10/dd8/dfd/d67/l9c 1 1 2026-03-09T19:28:00.074 INFO:tasks.workunit.client.1.vm08.stdout:2/979: write d3/dca/f11c [211357,27565] 0 2026-03-09T19:28:00.078 INFO:tasks.workunit.client.1.vm08.stdout:4/994: dread da/d10/d16/d28/d2f/d4f/d103/fd1 [0,4194304] 0 2026-03-09T19:28:00.082 INFO:tasks.workunit.client.0.vm07.stdout:9/820: write d0/d6/fa [4489837,6640] 0 2026-03-09T19:28:00.083 INFO:tasks.workunit.client.0.vm07.stdout:8/814: dread d7/d16/f69 [0,4194304] 0 2026-03-09T19:28:00.087 INFO:tasks.workunit.client.1.vm08.stdout:2/980: dread d3/d4/d23/d2c/f80 [0,4194304] 0 2026-03-09T19:28:00.091 INFO:tasks.workunit.client.0.vm07.stdout:6/755: getdents d0/d1/d28/d76 0 2026-03-09T19:28:00.092 INFO:tasks.workunit.client.0.vm07.stdout:6/756: write d0/d1/db/d91/f117 [67398,39989] 0 2026-03-09T19:28:00.093 INFO:tasks.workunit.client.0.vm07.stdout:7/763: stat d0/d80/db1/de5/d54/d5a/lc6 0 2026-03-09T19:28:00.095 INFO:tasks.workunit.client.1.vm08.stdout:2/981: mknod d3/d4/d3e/d4e/c14b 0 2026-03-09T19:28:00.098 INFO:tasks.workunit.client.0.vm07.stdout:2/869: getdents d3/dd/d16/d29/d2d/d45/dc3 0 2026-03-09T19:28:00.098 INFO:tasks.workunit.client.0.vm07.stdout:4/791: link d3/d11/d29/f42 d3/d11/d2b/d38/d107/f111 0 2026-03-09T19:28:00.099 INFO:tasks.workunit.client.1.vm08.stdout:4/995: fdatasync da/d10/d16/d28/d2f/d4f/d103/d40/f70 0 2026-03-09T19:28:00.099 INFO:tasks.workunit.client.0.vm07.stdout:0/787: getdents d0/d6/d13/d17/d19/d57/d6a 0 2026-03-09T19:28:00.099 INFO:tasks.workunit.client.0.vm07.stdout:0/788: fdatasync d0/d6/d13/d1c/d11/d56/f100 0 2026-03-09T19:28:00.101 INFO:tasks.workunit.client.0.vm07.stdout:4/792: rename d3/d11/d29/d101/d99 to d3/d11/d29/d101/d99/de7/d112 22 2026-03-09T19:28:00.102 INFO:tasks.workunit.client.0.vm07.stdout:5/793: creat d3/d1a/d28/d36/fff x:0 0 0 2026-03-09T19:28:00.102 INFO:tasks.workunit.client.0.vm07.stdout:3/878: fdatasync d1/fe 0 2026-03-09T19:28:00.104 
INFO:tasks.workunit.client.0.vm07.stdout:7/764: sync 2026-03-09T19:28:00.109 INFO:tasks.workunit.client.0.vm07.stdout:2/870: dwrite d3/dd/d16/d29/d2d/d45/fd9 [0,4194304] 0 2026-03-09T19:28:00.110 INFO:tasks.workunit.client.1.vm08.stdout:4/996: symlink da/d10/d1b/d10c/d12a/l12e 0 2026-03-09T19:28:00.110 INFO:tasks.workunit.client.0.vm07.stdout:8/815: symlink d7/d9/d10/dd8/l120 0 2026-03-09T19:28:00.111 INFO:tasks.workunit.client.0.vm07.stdout:2/871: write d3/dd/d16/d30/da7/f109 [8793800,23774] 0 2026-03-09T19:28:00.114 INFO:tasks.workunit.client.0.vm07.stdout:1/809: dwrite d1/d11/d37/d3f/fd7 [0,4194304] 0 2026-03-09T19:28:00.114 INFO:tasks.workunit.client.0.vm07.stdout:1/810: stat d1/db/d31/d4f/l58 0 2026-03-09T19:28:00.114 INFO:tasks.workunit.client.0.vm07.stdout:9/821: write d0/d6/fe3 [585522,20832] 0 2026-03-09T19:28:00.114 INFO:tasks.workunit.client.0.vm07.stdout:6/757: rmdir d0/d4e/dae 39 2026-03-09T19:28:00.115 INFO:tasks.workunit.client.1.vm08.stdout:2/982: dwrite d3/d4/d23/d2c/d39/d5e/de/d18/d1f/f7f [4194304,4194304] 0 2026-03-09T19:28:00.134 INFO:tasks.workunit.client.0.vm07.stdout:3/879: truncate d1/d6/f9d 834550 0 2026-03-09T19:28:00.134 INFO:tasks.workunit.client.1.vm08.stdout:2/983: mkdir d3/d4/d23/d14c 0 2026-03-09T19:28:00.149 INFO:tasks.workunit.client.1.vm08.stdout:2/984: fdatasync d3/d4/d23/d2c/d39/d5e/de/d8b/f81 0 2026-03-09T19:28:00.150 INFO:tasks.workunit.client.0.vm07.stdout:6/758: truncate d0/d1/db/d1d/fcb 478012 0 2026-03-09T19:28:00.150 INFO:tasks.workunit.client.0.vm07.stdout:9/822: fsync d0/d6f/fa9 0 2026-03-09T19:28:00.155 INFO:tasks.workunit.client.1.vm08.stdout:2/985: symlink d3/d4/d23/d2c/d39/d5e/db8/dff/l14d 0 2026-03-09T19:28:00.155 INFO:tasks.workunit.client.1.vm08.stdout:2/986: stat d3/c100 0 2026-03-09T19:28:00.156 INFO:tasks.workunit.client.1.vm08.stdout:2/987: write d3/dca/f11c [713808,5955] 0 2026-03-09T19:28:00.162 INFO:tasks.workunit.client.0.vm07.stdout:4/793: link d3/d4f/d56/l110 d3/d11/d29/d101/l113 0 
2026-03-09T19:28:00.164 INFO:tasks.workunit.client.0.vm07.stdout:4/794: rmdir d3/d11/d2b/d37/db6 39 2026-03-09T19:28:00.164 INFO:tasks.workunit.client.0.vm07.stdout:3/880: link d1/d3d/d47/db3/d87/c102 d1/d3d/d47/db3/dc2/d28/d7c/c11a 0 2026-03-09T19:28:00.166 INFO:tasks.workunit.client.0.vm07.stdout:8/816: getdents d7/d30/d32 0 2026-03-09T19:28:00.167 INFO:tasks.workunit.client.0.vm07.stdout:8/817: readlink d7/d9/d57/l105 0 2026-03-09T19:28:00.168 INFO:tasks.workunit.client.0.vm07.stdout:6/759: getdents d0/d1/db/d1d 0 2026-03-09T19:28:00.169 INFO:tasks.workunit.client.0.vm07.stdout:1/811: dread d1/db/d31/dca/fa7 [0,4194304] 0 2026-03-09T19:28:00.170 INFO:tasks.workunit.client.0.vm07.stdout:6/760: stat d0/d1/db/d52/d94/d81/la5 0 2026-03-09T19:28:00.170 INFO:tasks.workunit.client.0.vm07.stdout:6/761: dread - d0/dbf/f8d zero size 2026-03-09T19:28:00.172 INFO:tasks.workunit.client.0.vm07.stdout:9/823: getdents d0/db/d29/d32 0 2026-03-09T19:28:00.173 INFO:tasks.workunit.client.0.vm07.stdout:3/881: rename d1/d3d/d47/db3/dc2/d28/l40 to d1/d6/d4c/dfa/l11b 0 2026-03-09T19:28:00.173 INFO:tasks.workunit.client.0.vm07.stdout:9/824: chown d0/db/d29/d32/d5c/f78 0 1 2026-03-09T19:28:00.175 INFO:tasks.workunit.client.0.vm07.stdout:1/812: mkdir d1/d11/d37/d3f/d45/d87/d89/d113 0 2026-03-09T19:28:00.186 INFO:tasks.workunit.client.0.vm07.stdout:8/818: creat d7/d30/d32/f121 x:0 0 0 2026-03-09T19:28:00.187 INFO:tasks.workunit.client.1.vm08.stdout:4/997: dread da/d10/d16/d28/d2f/d4f/f65 [0,4194304] 0 2026-03-09T19:28:00.187 INFO:tasks.workunit.client.1.vm08.stdout:4/998: link da/d10/d16/d28/d2f/d4f/d64/d81/cc7 da/d10/d16/d28/d2f/de9/db0/c12f 0 2026-03-09T19:28:00.187 INFO:tasks.workunit.client.0.vm07.stdout:9/825: symlink d0/d6/d73/d105/l11e 0 2026-03-09T19:28:00.187 INFO:tasks.workunit.client.0.vm07.stdout:8/819: chown d7/d9/d37/d45/d97/dbc/de2/cee 746821153 1 2026-03-09T19:28:00.187 INFO:tasks.workunit.client.0.vm07.stdout:5/794: dwrite d3/f25 [0,4194304] 0 2026-03-09T19:28:00.187 
INFO:tasks.workunit.client.0.vm07.stdout:8/820: fsync d7/d9/d10/d44/f48 0 2026-03-09T19:28:00.188 INFO:tasks.workunit.client.0.vm07.stdout:5/795: readlink d3/d1a/d28/d40/l84 0 2026-03-09T19:28:00.192 INFO:tasks.workunit.client.0.vm07.stdout:1/813: dread d1/d11/d37/d3f/d7e/dad/fcd [0,4194304] 0 2026-03-09T19:28:00.196 INFO:tasks.workunit.client.1.vm08.stdout:2/988: sync 2026-03-09T19:28:00.197 INFO:tasks.workunit.client.0.vm07.stdout:8/821: dread d7/d50/da6/f109 [0,4194304] 0 2026-03-09T19:28:00.197 INFO:tasks.workunit.client.0.vm07.stdout:3/882: sync 2026-03-09T19:28:00.197 INFO:tasks.workunit.client.1.vm08.stdout:2/989: chown d3/d4/d23/d2c/d39/d5e/db8/dff 403987882 1 2026-03-09T19:28:00.198 INFO:tasks.workunit.client.1.vm08.stdout:2/990: chown d3/d9/d79 148583 1 2026-03-09T19:28:00.203 INFO:tasks.workunit.client.0.vm07.stdout:6/762: dread d0/dbf/fa2 [0,4194304] 0 2026-03-09T19:28:00.203 INFO:tasks.workunit.client.0.vm07.stdout:6/763: fdatasync d0/d2d/f4a 0 2026-03-09T19:28:00.204 INFO:tasks.workunit.client.0.vm07.stdout:9/826: dread d0/db/d29/d2c/f54 [0,4194304] 0 2026-03-09T19:28:00.204 INFO:tasks.workunit.client.0.vm07.stdout:5/796: rename d3/dd/dbe to d3/dd/d26/d2d/d100 0 2026-03-09T19:28:00.207 INFO:tasks.workunit.client.0.vm07.stdout:8/822: fsync d7/f1c 0 2026-03-09T19:28:00.207 INFO:tasks.workunit.client.0.vm07.stdout:9/827: stat d0/db/d29/d2c 0 2026-03-09T19:28:00.209 INFO:tasks.workunit.client.0.vm07.stdout:8/823: creat d7/d9/d37/d34/f122 x:0 0 0 2026-03-09T19:28:00.212 INFO:tasks.workunit.client.0.vm07.stdout:8/824: truncate d7/d1d/f3d 1694895 0 2026-03-09T19:28:00.212 INFO:tasks.workunit.client.0.vm07.stdout:3/883: rename d1/d89/lb8 to d1/d6/l11c 0 2026-03-09T19:28:00.213 INFO:tasks.workunit.client.0.vm07.stdout:9/828: fsync d0/d6/f7b 0 2026-03-09T19:28:00.217 INFO:tasks.workunit.client.0.vm07.stdout:9/829: mknod d0/d6/d3a/c11f 0 2026-03-09T19:28:00.219 INFO:tasks.workunit.client.0.vm07.stdout:3/884: rename d1/d74/fb6 to d1/d3d/d47/db3/dc2/f11d 0 
2026-03-09T19:28:00.220 INFO:tasks.workunit.client.0.vm07.stdout:8/825: dwrite d7/d50/f8f [4194304,4194304] 0 2026-03-09T19:28:00.222 INFO:tasks.workunit.client.0.vm07.stdout:8/826: chown d7/d9/d10/d44/cb5 739 1 2026-03-09T19:28:00.227 INFO:tasks.workunit.client.0.vm07.stdout:2/872: dwrite d3/dd/d16/d29/f58 [0,4194304] 0 2026-03-09T19:28:00.234 INFO:tasks.workunit.client.0.vm07.stdout:7/765: write d0/d4/d5/d26/d32/f45 [4893634,122432] 0 2026-03-09T19:28:00.236 INFO:tasks.workunit.client.0.vm07.stdout:9/830: dwrite d0/db/fe9 [0,4194304] 0 2026-03-09T19:28:00.237 INFO:tasks.workunit.client.0.vm07.stdout:0/789: dwrite d0/d6/d13/d1c/d50/fc6 [0,4194304] 0 2026-03-09T19:28:00.239 INFO:tasks.workunit.client.1.vm08.stdout:2/991: write d3/d9/fd2 [368879,75044] 0 2026-03-09T19:28:00.242 INFO:tasks.workunit.client.1.vm08.stdout:4/999: dwrite da/d10/f113 [0,4194304] 0 2026-03-09T19:28:00.246 INFO:tasks.workunit.client.0.vm07.stdout:4/795: dwrite d3/d11/d29/f9b [0,4194304] 0 2026-03-09T19:28:00.257 INFO:tasks.workunit.client.1.vm08.stdout:2/992: rename d3/d4/d23/d2c/d39/d5e/de/d8b/c8e to d3/d4/d3e/d9d/c14e 0 2026-03-09T19:28:00.258 INFO:tasks.workunit.client.0.vm07.stdout:8/827: rmdir d7/d9/d37/d45/d97 39 2026-03-09T19:28:00.260 INFO:tasks.workunit.client.0.vm07.stdout:8/828: chown d7/d9/d37/d34 927 1 2026-03-09T19:28:00.261 INFO:tasks.workunit.client.0.vm07.stdout:1/814: dwrite d1/d11/d37/d3f/d45/d87/fa9 [0,4194304] 0 2026-03-09T19:28:00.263 INFO:tasks.workunit.client.0.vm07.stdout:6/764: write d0/d1/db/d24/da4/fc3 [3192420,79701] 0 2026-03-09T19:28:00.269 INFO:tasks.workunit.client.0.vm07.stdout:5/797: dwrite d3/d1a/f86 [0,4194304] 0 2026-03-09T19:28:00.270 INFO:tasks.workunit.client.0.vm07.stdout:5/798: dread - d3/dd/d26/ff5 zero size 2026-03-09T19:28:00.272 INFO:tasks.workunit.client.1.vm08.stdout:2/993: unlink d3/d9/d79/d46/d8c/d92/d120/f147 0 2026-03-09T19:28:00.273 INFO:tasks.workunit.client.0.vm07.stdout:5/799: stat d3/dd/d26/d2d/d60/c8d 0 2026-03-09T19:28:00.280 
INFO:tasks.workunit.client.1.vm08.stdout:2/994: creat d3/d9/d4a/d131/f14f x:0 0 0 2026-03-09T19:28:00.281 INFO:tasks.workunit.client.0.vm07.stdout:7/766: mknod d0/d80/db1/de5/d54/d55/c104 0 2026-03-09T19:28:00.281 INFO:tasks.workunit.client.0.vm07.stdout:0/790: fsync d0/d6/d13/d1c/fe8 0 2026-03-09T19:28:00.291 INFO:tasks.workunit.client.0.vm07.stdout:9/831: rename d0/db/d29/d68/f8e to d0/d6/d73/d10e/f120 0 2026-03-09T19:28:00.292 INFO:tasks.workunit.client.0.vm07.stdout:9/832: chown d0/db/d29/d2c/d36/d5a/fb2 289 1 2026-03-09T19:28:00.295 INFO:tasks.workunit.client.1.vm08.stdout:2/995: creat d3/f150 x:0 0 0 2026-03-09T19:28:00.296 INFO:tasks.workunit.client.0.vm07.stdout:9/833: dread d0/db/d29/d2c/d36/f62 [0,4194304] 0 2026-03-09T19:28:00.297 INFO:tasks.workunit.client.1.vm08.stdout:2/996: creat d3/d4/d23/d2c/d39/d5e/d14/f151 x:0 0 0 2026-03-09T19:28:00.299 INFO:tasks.workunit.client.0.vm07.stdout:8/829: truncate d7/d1d/d83/d9f/dd2/def/f106 737294 0 2026-03-09T19:28:00.299 INFO:tasks.workunit.client.0.vm07.stdout:8/830: chown d7/d16/c99 43013 1 2026-03-09T19:28:00.304 INFO:tasks.workunit.client.0.vm07.stdout:8/831: dwrite d7/d16/dcf/f108 [0,4194304] 0 2026-03-09T19:28:00.308 INFO:tasks.workunit.client.1.vm08.stdout:2/997: fdatasync d3/d9/d79/d46/f148 0 2026-03-09T19:28:00.308 INFO:tasks.workunit.client.0.vm07.stdout:6/765: mkdir d0/d2d/dd5/d123/d7b/da0/d10f/d124 0 2026-03-09T19:28:00.308 INFO:tasks.workunit.client.1.vm08.stdout:2/998: fdatasync d3/d4/d23/fc0 0 2026-03-09T19:28:00.312 INFO:tasks.workunit.client.0.vm07.stdout:9/834: symlink d0/d6/d57/d5d/dde/l121 0 2026-03-09T19:28:00.313 INFO:tasks.workunit.client.0.vm07.stdout:9/835: write d0/db/d29/d4d/f10f [1000645,127275] 0 2026-03-09T19:28:00.314 INFO:tasks.workunit.client.0.vm07.stdout:8/832: fsync d7/d16/d1e/f6e 0 2026-03-09T19:28:00.314 INFO:tasks.workunit.client.0.vm07.stdout:8/833: chown d7/d16/c1f 303875 1 2026-03-09T19:28:00.325 INFO:tasks.workunit.client.1.vm08.stdout:2/999: unlink d3/d4/d23/d2c/c74 0 
2026-03-09T19:28:00.325 INFO:tasks.workunit.client.1.vm08.stderr:+ rm -rf -- ./tmp.V6OdvJ4d0F 2026-03-09T19:28:00.326 INFO:tasks.workunit.client.0.vm07.stdout:6/766: creat d0/d1/db/d52/d94/d81/f125 x:0 0 0 2026-03-09T19:28:00.326 INFO:tasks.workunit.client.0.vm07.stdout:6/767: creat d0/d1/db/d91/f126 x:0 0 0 2026-03-09T19:28:00.326 INFO:tasks.workunit.client.0.vm07.stdout:2/873: getdents d3/dd 0 2026-03-09T19:28:00.326 INFO:tasks.workunit.client.0.vm07.stdout:9/836: symlink d0/l122 0 2026-03-09T19:28:00.326 INFO:tasks.workunit.client.0.vm07.stdout:6/768: readlink d0/d1/db/d24/l25 0 2026-03-09T19:28:00.326 INFO:tasks.workunit.client.0.vm07.stdout:7/767: getdents d0/d80/db1 0 2026-03-09T19:28:00.326 INFO:tasks.workunit.client.0.vm07.stdout:8/834: rename d7/d9/d10/d44/f48 to d7/d9/d10/f123 0 2026-03-09T19:28:00.326 INFO:tasks.workunit.client.0.vm07.stdout:7/768: fsync d0/d80/db1/de5/f9a 0 2026-03-09T19:28:00.331 INFO:tasks.workunit.client.0.vm07.stdout:6/769: link d0/d4e/d7f/fe9 d0/d1/db/d17/f127 0 2026-03-09T19:28:00.345 INFO:tasks.workunit.client.0.vm07.stdout:1/815: dread d1/f51 [0,4194304] 0 2026-03-09T19:28:00.352 INFO:tasks.workunit.client.0.vm07.stdout:6/770: dread d0/d1/d28/d76/f97 [0,4194304] 0 2026-03-09T19:28:00.352 INFO:tasks.workunit.client.0.vm07.stdout:1/816: fdatasync d1/db/fb2 0 2026-03-09T19:28:00.354 INFO:tasks.workunit.client.0.vm07.stdout:6/771: unlink d0/d1/db/d52/d94/fff 0 2026-03-09T19:28:00.357 INFO:tasks.workunit.client.0.vm07.stdout:6/772: dwrite d0/d1/db/d52/f111 [4194304,4194304] 0 2026-03-09T19:28:00.364 INFO:tasks.workunit.client.0.vm07.stdout:6/773: chown d0/dbf/d95/d31/f89 968 1 2026-03-09T19:28:00.388 INFO:tasks.workunit.client.0.vm07.stdout:6/774: rename d0/d1/db/d1d/l58 to d0/dbf/l128 0 2026-03-09T19:28:00.391 INFO:tasks.workunit.client.0.vm07.stdout:6/775: truncate d0/d44/dd3/f106 844702 0 2026-03-09T19:28:00.398 INFO:tasks.workunit.client.0.vm07.stdout:6/776: creat d0/d1/db/d1d/d77/f129 x:0 0 0 2026-03-09T19:28:00.404 
INFO:tasks.workunit.client.0.vm07.stdout:6/777: rmdir d0/d4e/d7f/dbe 39 2026-03-09T19:28:00.413 INFO:tasks.workunit.client.0.vm07.stdout:3/885: dwrite d1/d6/d45/dac/fea [0,4194304] 0 2026-03-09T19:28:00.414 INFO:tasks.workunit.client.0.vm07.stdout:3/886: write d1/d6/dd/dbf/fcf [1017607,81974] 0 2026-03-09T19:28:00.425 INFO:tasks.workunit.client.0.vm07.stdout:3/887: rmdir d1/d6/dd/dbf/ddc 39 2026-03-09T19:28:00.429 INFO:tasks.workunit.client.0.vm07.stdout:3/888: truncate d1/d74/f52 1209112 0 2026-03-09T19:28:00.433 INFO:tasks.workunit.client.0.vm07.stdout:3/889: symlink d1/d3d/d47/d10e/l11e 0 2026-03-09T19:28:00.436 INFO:tasks.workunit.client.0.vm07.stdout:3/890: truncate d1/d3d/d47/db3/d8e/da9/f93 2032737 0 2026-03-09T19:28:00.443 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:00 vm07.local ceph-mon[48545]: Reconfiguring prometheus.vm07 (dependencies changed)... 2026-03-09T19:28:00.443 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:00 vm07.local ceph-mon[48545]: Reconfiguring daemon prometheus.vm07 on vm07 2026-03-09T19:28:00.443 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:00 vm07.local ceph-mon[48545]: pgmap v9: 65 pgs: 65 active+clean; 3.7 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 44 MiB/s rd, 95 MiB/s wr, 264 op/s 2026-03-09T19:28:00.462 INFO:tasks.workunit.client.0.vm07.stdout:7/769: dread d0/d4/d5/d26/f31 [0,4194304] 0 2026-03-09T19:28:00.463 INFO:tasks.workunit.client.0.vm07.stdout:7/770: dread - d0/d4/d5/d8/d41/d64/d74/d98/dcb/d39/ff9 zero size 2026-03-09T19:28:00.464 INFO:tasks.workunit.client.0.vm07.stdout:3/891: dread d1/fe [0,4194304] 0 2026-03-09T19:28:00.468 INFO:tasks.workunit.client.0.vm07.stdout:3/892: mkdir d1/d6/d45/d54/d11f 0 2026-03-09T19:28:00.471 INFO:tasks.workunit.client.0.vm07.stdout:3/893: creat d1/d6/d4c/dfa/f120 x:0 0 0 2026-03-09T19:28:00.472 INFO:tasks.workunit.client.0.vm07.stdout:4/796: write d3/d11/d2b/d38/ddc/f67 [623615,95629] 0 2026-03-09T19:28:00.476 
INFO:tasks.workunit.client.0.vm07.stdout:7/771: dread d0/f13 [4194304,4194304] 0 2026-03-09T19:28:00.479 INFO:tasks.workunit.client.0.vm07.stdout:5/800: write d3/dd/d26/d2d/d79/d9f/fd0 [168356,73542] 0 2026-03-09T19:28:00.480 INFO:tasks.workunit.client.0.vm07.stdout:7/772: dwrite d0/d4/d5/d26/db9/dc2/fd1 [0,4194304] 0 2026-03-09T19:28:00.484 INFO:tasks.workunit.client.0.vm07.stdout:5/801: chown d3/d1a/d28/d40/d92/d89 21250876 1 2026-03-09T19:28:00.486 INFO:tasks.workunit.client.0.vm07.stdout:4/797: sync 2026-03-09T19:28:00.489 INFO:tasks.workunit.client.0.vm07.stdout:3/894: rename d1/d1f/lb2 to d1/d3d/d47/db3/dc2/l121 0 2026-03-09T19:28:00.489 INFO:tasks.workunit.client.0.vm07.stdout:0/791: write d0/d6/f43 [5219546,121647] 0 2026-03-09T19:28:00.490 INFO:tasks.workunit.client.0.vm07.stdout:3/895: write d1/d3d/d47/db3/dc2/d28/dfd/f10d [990270,6042] 0 2026-03-09T19:28:00.492 INFO:tasks.workunit.client.0.vm07.stdout:7/773: truncate d0/d4/d5/d8/d1a/d2a/fb2 2065000 0 2026-03-09T19:28:00.495 INFO:tasks.workunit.client.0.vm07.stdout:5/802: symlink d3/d1a/d28/d6c/d72/d8f/l101 0 2026-03-09T19:28:00.499 INFO:tasks.workunit.client.0.vm07.stdout:9/837: dwrite d0/db/d29/d32/d5c/d69/f8d [0,4194304] 0 2026-03-09T19:28:00.501 INFO:tasks.workunit.client.0.vm07.stdout:0/792: symlink d0/d6/d13/d17/d19/d58/dd9/df8/l105 0 2026-03-09T19:28:00.501 INFO:tasks.workunit.client.0.vm07.stdout:8/835: write d7/d1d/f3f [3436244,61398] 0 2026-03-09T19:28:00.501 INFO:tasks.workunit.client.0.vm07.stdout:2/874: dwrite d3/d49/faf [0,4194304] 0 2026-03-09T19:28:00.503 INFO:tasks.workunit.client.0.vm07.stdout:1/817: write d1/d11/d37/f2c [3406323,108429] 0 2026-03-09T19:28:00.504 INFO:tasks.workunit.client.0.vm07.stdout:5/803: creat d3/d1a/d5d/f102 x:0 0 0 2026-03-09T19:28:00.508 INFO:tasks.workunit.client.0.vm07.stdout:4/798: creat d3/d4f/d56/d5f/d88/dd0/f114 x:0 0 0 2026-03-09T19:28:00.510 INFO:tasks.workunit.client.0.vm07.stdout:5/804: chown d3/dd/fab 102 1 2026-03-09T19:28:00.511 
INFO:tasks.workunit.client.0.vm07.stdout:4/799: dread - d3/fd2 zero size 2026-03-09T19:28:00.521 INFO:tasks.workunit.client.0.vm07.stdout:0/793: rename d0/d6/d13/d1c/d11/d56/le9 to d0/d6/d13/d17/d19/d58/dd9/l106 0 2026-03-09T19:28:00.525 INFO:tasks.workunit.client.0.vm07.stdout:9/838: fsync d0/db/d9e/faf 0 2026-03-09T19:28:00.527 INFO:tasks.workunit.client.0.vm07.stdout:8/836: dread d7/d30/fb7 [0,4194304] 0 2026-03-09T19:28:00.528 INFO:tasks.workunit.client.0.vm07.stdout:2/875: dwrite d3/d49/f124 [0,4194304] 0 2026-03-09T19:28:00.532 INFO:tasks.workunit.client.0.vm07.stdout:6/778: dwrite d0/d44/fe2 [0,4194304] 0 2026-03-09T19:28:00.534 INFO:tasks.workunit.client.0.vm07.stdout:6/779: write d0/d1/d28/da8/ffe [1686974,103905] 0 2026-03-09T19:28:00.558 INFO:tasks.workunit.client.0.vm07.stdout:8/837: dread - d7/d50/da6/faf zero size 2026-03-09T19:28:00.560 INFO:tasks.workunit.client.0.vm07.stdout:2/876: creat d3/dd/d16/d29/d2d/f130 x:0 0 0 2026-03-09T19:28:00.561 INFO:tasks.workunit.client.0.vm07.stdout:4/800: creat d3/d11/d2b/d38/ddc/d22/f115 x:0 0 0 2026-03-09T19:28:00.562 INFO:tasks.workunit.client.0.vm07.stdout:6/780: unlink d0/d1/d28/fb3 0 2026-03-09T19:28:00.562 INFO:tasks.workunit.client.0.vm07.stdout:5/805: link d3/f19 d3/d1a/d28/d48/f103 0 2026-03-09T19:28:00.563 INFO:tasks.workunit.client.0.vm07.stdout:8/838: symlink d7/d1d/d83/d9f/dd2/l124 0 2026-03-09T19:28:00.563 INFO:tasks.workunit.client.0.vm07.stdout:6/781: write d0/dbf/d95/f11a [106637,91452] 0 2026-03-09T19:28:00.566 INFO:tasks.workunit.client.0.vm07.stdout:2/877: creat d3/dd/d16/d29/d3c/d5a/d7a/d74/f131 x:0 0 0 2026-03-09T19:28:00.567 INFO:tasks.workunit.client.0.vm07.stdout:4/801: mkdir d3/d11/d29/d101/d99/d116 0 2026-03-09T19:28:00.572 INFO:tasks.workunit.client.0.vm07.stdout:5/806: symlink d3/d1a/d28/l104 0 2026-03-09T19:28:00.573 INFO:tasks.workunit.client.0.vm07.stdout:4/802: creat d3/d11/d2b/d38/d8f/f117 x:0 0 0 2026-03-09T19:28:00.573 INFO:tasks.workunit.client.0.vm07.stdout:5/807: fsync 
d3/dd/d26/d3f/d47/d71/d76/fdf 0 2026-03-09T19:28:00.574 INFO:tasks.workunit.client.0.vm07.stdout:5/808: dread - d3/d1a/d28/d40/ff4 zero size 2026-03-09T19:28:00.575 INFO:tasks.workunit.client.0.vm07.stdout:4/803: write d3/d11/d2b/d38/ddc/d22/f115 [261171,21327] 0 2026-03-09T19:28:00.577 INFO:tasks.workunit.client.0.vm07.stdout:8/839: creat d7/d9/d37/d45/d97/dbc/de2/f125 x:0 0 0 2026-03-09T19:28:00.578 INFO:tasks.workunit.client.0.vm07.stdout:4/804: stat d3/d11/d51/fcb 0 2026-03-09T19:28:00.581 INFO:tasks.workunit.client.0.vm07.stdout:7/774: truncate d0/d4/d5/d8/d41/d64/d74/d98/f83 3590007 0 2026-03-09T19:28:00.581 INFO:tasks.workunit.client.0.vm07.stdout:1/818: truncate d1/d11/d37/d3f/f82 1150084 0 2026-03-09T19:28:00.581 INFO:tasks.workunit.client.0.vm07.stdout:1/819: chown d1/db 2731627 1 2026-03-09T19:28:00.581 INFO:tasks.workunit.client.0.vm07.stdout:0/794: write d0/d6/d13/d1c/d61/d69/fb9 [1605159,83098] 0 2026-03-09T19:28:00.581 INFO:tasks.workunit.client.0.vm07.stdout:3/896: dwrite d1/f68 [0,4194304] 0 2026-03-09T19:28:00.584 INFO:tasks.workunit.client.0.vm07.stdout:6/782: mkdir d0/d4e/d7f/dbe/d12a 0 2026-03-09T19:28:00.593 INFO:tasks.workunit.client.0.vm07.stdout:5/809: rmdir d3/dd/d26/d3f/d47 39 2026-03-09T19:28:00.597 INFO:tasks.workunit.client.0.vm07.stdout:8/840: mkdir d7/d9/d10/d44/d126 0 2026-03-09T19:28:00.597 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:00 vm08.local ceph-mon[57794]: Reconfiguring prometheus.vm07 (dependencies changed)... 
2026-03-09T19:28:00.597 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:00 vm08.local ceph-mon[57794]: Reconfiguring daemon prometheus.vm07 on vm07 2026-03-09T19:28:00.597 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:00 vm08.local ceph-mon[57794]: pgmap v9: 65 pgs: 65 active+clean; 3.7 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 44 MiB/s rd, 95 MiB/s wr, 264 op/s 2026-03-09T19:28:00.597 INFO:tasks.workunit.client.0.vm07.stdout:1/820: fdatasync d1/d3e/db3/fba 0 2026-03-09T19:28:00.597 INFO:tasks.workunit.client.0.vm07.stdout:7/775: read d0/d80/db1/de5/d54/f6a [502463,16417] 0 2026-03-09T19:28:00.597 INFO:tasks.workunit.client.0.vm07.stdout:8/841: stat d7/d30/d32/de9/fff 0 2026-03-09T19:28:00.597 INFO:tasks.workunit.client.0.vm07.stdout:7/776: dwrite d0/d80/db1/de5/d54/dc4/fed [0,4194304] 0 2026-03-09T19:28:00.605 INFO:tasks.workunit.client.0.vm07.stdout:6/783: rmdir d0/d1/d28/d76 39 2026-03-09T19:28:00.605 INFO:tasks.workunit.client.0.vm07.stdout:7/777: write d0/d4/d5/d8/d41/d64/dd5/ffc [208736,4489] 0 2026-03-09T19:28:00.605 INFO:tasks.workunit.client.0.vm07.stdout:2/878: link d3/dd/d16/d29/d3c/d4c/ffb d3/dd/d16/d29/d3c/da2/d126/f132 0 2026-03-09T19:28:00.614 INFO:tasks.workunit.client.0.vm07.stdout:3/897: sync 2026-03-09T19:28:00.614 INFO:tasks.workunit.client.0.vm07.stdout:8/842: sync 2026-03-09T19:28:00.615 INFO:tasks.workunit.client.0.vm07.stdout:0/795: unlink d0/d6/d13/d17/c37 0 2026-03-09T19:28:00.615 INFO:tasks.workunit.client.0.vm07.stdout:6/784: mkdir d0/d1/db/d52/d12b 0 2026-03-09T19:28:00.616 INFO:tasks.workunit.client.0.vm07.stdout:7/778: rmdir d0/d4/d5/d8/d1a/d2a/dc5 39 2026-03-09T19:28:00.617 INFO:tasks.workunit.client.0.vm07.stdout:5/810: creat d3/dd/d26/d2d/d60/dcf/f105 x:0 0 0 2026-03-09T19:28:00.621 INFO:tasks.workunit.client.0.vm07.stdout:2/879: mknod d3/dd/d16/d29/d2d/d45/d85/d8a/c133 0 2026-03-09T19:28:00.621 INFO:tasks.workunit.client.0.vm07.stdout:6/785: rmdir d0/d1/db/d1d 39 2026-03-09T19:28:00.621 
INFO:tasks.workunit.client.0.vm07.stdout:0/796: mknod d0/d6/d13/d17/d19/d57/d6a/c107 0 2026-03-09T19:28:00.623 INFO:tasks.workunit.client.0.vm07.stdout:7/779: read - d0/d4/d5/d8/d41/d64/d74/d98/fd9 zero size 2026-03-09T19:28:00.627 INFO:tasks.workunit.client.0.vm07.stdout:1/821: dread d1/d3/d21/f2e [0,4194304] 0 2026-03-09T19:28:00.630 INFO:tasks.workunit.client.0.vm07.stdout:5/811: rename d3/dd/d26/ff5 to d3/d1a/d28/d6c/d72/f106 0 2026-03-09T19:28:00.634 INFO:tasks.workunit.client.0.vm07.stdout:3/898: mkdir d1/d3d/d47/db3/d8e/d122 0 2026-03-09T19:28:00.634 INFO:tasks.workunit.client.0.vm07.stdout:7/780: symlink d0/d4/d5/d8/d41/l105 0 2026-03-09T19:28:00.634 INFO:tasks.workunit.client.0.vm07.stdout:0/797: stat d0/d6/d13/d17/d19/f1f 0 2026-03-09T19:28:00.637 INFO:tasks.workunit.client.0.vm07.stdout:6/786: dread d0/d4e/d7f/dbe/fe6 [0,4194304] 0 2026-03-09T19:28:00.637 INFO:tasks.workunit.client.0.vm07.stdout:6/787: stat d0/d4e/d7f/dbe/fe6 0 2026-03-09T19:28:00.638 INFO:tasks.workunit.client.0.vm07.stdout:9/839: write d0/d6/f4c [6916119,73078] 0 2026-03-09T19:28:00.642 INFO:tasks.workunit.client.0.vm07.stdout:6/788: chown d0/d13/l96 6 1 2026-03-09T19:28:00.644 INFO:tasks.workunit.client.0.vm07.stdout:1/822: rename d1/f6 to d1/d3e/dae/f114 0 2026-03-09T19:28:00.644 INFO:tasks.workunit.client.0.vm07.stdout:8/843: dread d7/d9/d37/d45/d97/f117 [0,4194304] 0 2026-03-09T19:28:00.644 INFO:tasks.workunit.client.0.vm07.stdout:9/840: dread d0/db/d29/d2c/d36/fa1 [0,4194304] 0 2026-03-09T19:28:00.655 INFO:tasks.workunit.client.0.vm07.stdout:2/880: mkdir d3/dd/d16/d29/d3c/df1/d134 0 2026-03-09T19:28:00.656 INFO:tasks.workunit.client.0.vm07.stdout:2/881: stat d3/dd/d16/d29/d3c/d5a 0 2026-03-09T19:28:00.657 INFO:tasks.workunit.client.0.vm07.stdout:5/812: mkdir d3/d1a/d5a/d107 0 2026-03-09T19:28:00.659 INFO:tasks.workunit.client.0.vm07.stdout:3/899: creat d1/d3d/d47/db3/f123 x:0 0 0 2026-03-09T19:28:00.664 INFO:tasks.workunit.client.0.vm07.stdout:1/823: creat d1/d3e/dae/f115 x:0 0 0 
2026-03-09T19:28:00.664 INFO:tasks.workunit.client.0.vm07.stdout:4/805: dwrite d3/d4f/d56/f7f [0,4194304] 0 2026-03-09T19:28:00.664 INFO:tasks.workunit.client.0.vm07.stdout:2/882: stat d3/dd/d16/d29/d2d/d45/d8b/d98/fab 0 2026-03-09T19:28:00.665 INFO:tasks.workunit.client.0.vm07.stdout:0/798: symlink d0/d6/d13/d1c/l108 0 2026-03-09T19:28:00.669 INFO:tasks.workunit.client.0.vm07.stdout:8/844: chown d7/d50/da6/c111 184410 1 2026-03-09T19:28:00.669 INFO:tasks.workunit.client.0.vm07.stdout:6/789: symlink d0/d4e/dae/l12c 0 2026-03-09T19:28:00.676 INFO:tasks.workunit.client.0.vm07.stdout:8/845: write d7/d9/f87 [4683977,40979] 0 2026-03-09T19:28:00.681 INFO:tasks.workunit.client.0.vm07.stdout:5/813: dwrite d3/dd/ffe [0,4194304] 0 2026-03-09T19:28:00.693 INFO:tasks.workunit.client.0.vm07.stdout:6/790: dread - d0/d1/fdb zero size 2026-03-09T19:28:00.717 INFO:tasks.workunit.client.0.vm07.stdout:3/900: dread d1/d3d/d47/db3/dc2/f39 [0,4194304] 0 2026-03-09T19:28:00.717 INFO:tasks.workunit.client.0.vm07.stdout:3/901: chown d1/cfb 11 1 2026-03-09T19:28:00.722 INFO:tasks.workunit.client.0.vm07.stdout:9/841: dread d0/d6/d73/fe6 [0,4194304] 0 2026-03-09T19:28:00.816 INFO:tasks.workunit.client.0.vm07.stdout:0/799: mknod d0/d6/d13/d17/d19/d58/c109 0 2026-03-09T19:28:00.829 INFO:tasks.workunit.client.0.vm07.stdout:1/824: dwrite d1/db/d31/d56/f6a [0,4194304] 0 2026-03-09T19:28:00.835 INFO:tasks.workunit.client.0.vm07.stdout:0/800: fdatasync d0/d6/dc8/d99/fac 0 2026-03-09T19:28:00.860 INFO:tasks.workunit.client.0.vm07.stdout:1/825: fsync d1/d3e/db3/d6d/ff6 0 2026-03-09T19:28:00.866 INFO:tasks.workunit.client.0.vm07.stdout:0/801: truncate d0/f3a 5176050 0 2026-03-09T19:28:00.866 INFO:tasks.workunit.client.0.vm07.stdout:0/802: chown d0/d6/d13/d33/cce 0 1 2026-03-09T19:28:00.872 INFO:tasks.workunit.client.0.vm07.stdout:4/806: getdents d3 0 2026-03-09T19:28:00.883 INFO:tasks.workunit.client.0.vm07.stdout:4/807: mkdir d3/d11/d16/de1/d118 0 2026-03-09T19:28:00.889 
INFO:tasks.workunit.client.0.vm07.stdout:4/808: dread d3/d11/f7d [4194304,4194304] 0 2026-03-09T19:28:00.907 INFO:tasks.workunit.client.0.vm07.stdout:0/803: dread d0/d6/d13/d1c/d50/f60 [0,4194304] 0 2026-03-09T19:28:00.915 INFO:tasks.workunit.client.0.vm07.stdout:0/804: creat d0/d6/d13/d17/dc3/df6/df9/f10a x:0 0 0 2026-03-09T19:28:01.050 INFO:tasks.workunit.client.0.vm07.stdout:7/781: rename d0/d4/d5/d26/db9/fe3 to d0/d4/d5/d8/d1a/f106 0 2026-03-09T19:28:01.053 INFO:tasks.workunit.client.0.vm07.stdout:2/883: rename d3/d49/d114 to d3/dd/d16/d29/d3c/d5a/d7a/d74/dc9/d135 0 2026-03-09T19:28:01.057 INFO:tasks.workunit.client.0.vm07.stdout:8/846: rename d7/d9/d10/d44/f6c to d7/d16/dcf/f127 0 2026-03-09T19:28:01.057 INFO:tasks.workunit.client.0.vm07.stdout:6/791: dwrite d0/dbf/fa2 [0,4194304] 0 2026-03-09T19:28:01.063 INFO:tasks.workunit.client.0.vm07.stdout:7/782: dwrite d0/d80/db1/de5/d54/d5a/fef [0,4194304] 0 2026-03-09T19:28:01.070 INFO:tasks.workunit.client.0.vm07.stdout:2/884: creat d3/d11/d38/d111/d113/f136 x:0 0 0 2026-03-09T19:28:01.074 INFO:tasks.workunit.client.0.vm07.stdout:8/847: mknod d7/d1d/d83/d9f/dd2/def/c128 0 2026-03-09T19:28:01.076 INFO:tasks.workunit.client.0.vm07.stdout:2/885: readlink d3/dd/d16/d29/d3c/d4c/l118 0 2026-03-09T19:28:01.077 INFO:tasks.workunit.client.0.vm07.stdout:7/783: rmdir d0/d4/d5/d8/d41/d64/d74/d98/de7 0 2026-03-09T19:28:01.077 INFO:tasks.workunit.client.0.vm07.stdout:3/902: write d1/d3d/d47/db3/f104 [74164,36148] 0 2026-03-09T19:28:01.078 INFO:tasks.workunit.client.0.vm07.stdout:7/784: chown d0/d4/d5/f20 47430989 1 2026-03-09T19:28:01.078 INFO:tasks.workunit.client.0.vm07.stdout:7/785: read - d0/d4/d5/d26/f4a zero size 2026-03-09T19:28:01.083 INFO:tasks.workunit.client.0.vm07.stdout:6/792: dread d0/dbf/d95/d31/f3a [0,4194304] 0 2026-03-09T19:28:01.085 INFO:tasks.workunit.client.0.vm07.stdout:9/842: dwrite d0/d6/d73/fe6 [0,4194304] 0 2026-03-09T19:28:01.091 INFO:tasks.workunit.client.0.vm07.stdout:7/786: creat d0/d4/d5/d26/f107 
x:0 0 0 2026-03-09T19:28:01.096 INFO:tasks.workunit.client.0.vm07.stdout:6/793: dwrite d0/d4e/d7f/dbe/fe6 [0,4194304] 0 2026-03-09T19:28:01.096 INFO:tasks.workunit.client.0.vm07.stdout:9/843: stat d0/d6f/d86/fd1 0 2026-03-09T19:28:01.097 INFO:tasks.workunit.client.0.vm07.stdout:6/794: dread - d0/d2d/dd5/d123/d7b/d7d/f11b zero size 2026-03-09T19:28:01.102 INFO:tasks.workunit.client.0.vm07.stdout:8/848: sync 2026-03-09T19:28:01.105 INFO:tasks.workunit.client.0.vm07.stdout:1/826: write d1/db/d31/f64 [4167448,73532] 0 2026-03-09T19:28:01.120 INFO:tasks.workunit.client.0.vm07.stdout:7/787: mknod d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/c108 0 2026-03-09T19:28:01.137 INFO:tasks.workunit.client.0.vm07.stdout:1/827: unlink d1/d3e/db3/d6d/f85 0 2026-03-09T19:28:01.138 INFO:tasks.workunit.client.0.vm07.stdout:4/809: write d3/d11/d51/f9a [794266,6232] 0 2026-03-09T19:28:01.146 INFO:tasks.workunit.client.0.vm07.stdout:1/828: unlink d1/d3e/dae/ce4 0 2026-03-09T19:28:01.149 INFO:tasks.workunit.client.0.vm07.stdout:4/810: dwrite d3/d11/d29/ff4 [0,4194304] 0 2026-03-09T19:28:01.158 INFO:tasks.workunit.client.0.vm07.stdout:2/886: rename f0 to d3/d11/d38/d111/f137 0 2026-03-09T19:28:01.166 INFO:tasks.workunit.client.0.vm07.stdout:8/849: rename d7/d9/d10/dd8/dfd/d67/l68 to d7/d9/d37/d45/d4f/d10e/l129 0 2026-03-09T19:28:01.168 INFO:tasks.workunit.client.0.vm07.stdout:0/805: mknod d0/d6/dc8/c10b 0 2026-03-09T19:28:01.170 INFO:tasks.workunit.client.0.vm07.stdout:5/814: unlink d3/d1a/f86 0 2026-03-09T19:28:01.173 INFO:tasks.workunit.client.0.vm07.stdout:2/887: creat d3/dd/d16/d29/d3c/d5a/f138 x:0 0 0 2026-03-09T19:28:01.174 INFO:tasks.workunit.client.0.vm07.stdout:0/806: symlink d0/d6/d13/d1c/d11/d56/l10c 0 2026-03-09T19:28:01.174 INFO:tasks.workunit.client.0.vm07.stdout:3/903: unlink d1/d3d/d47/db3/f6b 0 2026-03-09T19:28:01.182 INFO:tasks.workunit.client.0.vm07.stdout:3/904: creat d1/d3d/d47/db3/d8e/da9/f124 x:0 0 0 2026-03-09T19:28:01.184 INFO:tasks.workunit.client.0.vm07.stdout:8/850: 
getdents d7/d9/d10/dd8/d10b 0 2026-03-09T19:28:01.184 INFO:tasks.workunit.client.0.vm07.stdout:3/905: stat d1/d3d/f95 0 2026-03-09T19:28:01.185 INFO:tasks.workunit.client.0.vm07.stdout:3/906: readlink d1/d6/dd/l99 0 2026-03-09T19:28:01.186 INFO:tasks.workunit.client.0.vm07.stdout:7/788: dread d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/f70 [0,4194304] 0 2026-03-09T19:28:01.186 INFO:tasks.workunit.client.0.vm07.stdout:8/851: read d7/d30/d32/fa9 [231603,76363] 0 2026-03-09T19:28:01.190 INFO:tasks.workunit.client.0.vm07.stdout:8/852: unlink d7/d9/d37/d45/c113 0 2026-03-09T19:28:01.190 INFO:tasks.workunit.client.0.vm07.stdout:3/907: truncate d1/d3d/d47/db3/d8e/da9/f7d 1360776 0 2026-03-09T19:28:01.190 INFO:tasks.workunit.client.0.vm07.stdout:9/844: write d0/db/d29/fb3 [181653,4639] 0 2026-03-09T19:28:01.192 INFO:tasks.workunit.client.0.vm07.stdout:3/908: write d1/d3d/d47/f113 [863489,28028] 0 2026-03-09T19:28:01.197 INFO:tasks.workunit.client.0.vm07.stdout:6/795: write d0/d1/db/d17/fe5 [567177,101574] 0 2026-03-09T19:28:01.198 INFO:tasks.workunit.client.0.vm07.stdout:1/829: dread d1/d11/d37/f40 [0,4194304] 0 2026-03-09T19:28:01.199 INFO:tasks.workunit.client.0.vm07.stdout:8/853: creat d7/d9/d37/d45/d97/dbc/de2/f12a x:0 0 0 2026-03-09T19:28:01.203 INFO:tasks.workunit.client.0.vm07.stdout:1/830: unlink d1/db/d31/dca/l105 0 2026-03-09T19:28:01.208 INFO:tasks.workunit.client.0.vm07.stdout:7/789: link d0/d4/d5/d8/d41/f89 d0/d4/d5/f109 0 2026-03-09T19:28:01.208 INFO:tasks.workunit.client.0.vm07.stdout:7/790: dread - d0/d4/d5/d99/feb zero size 2026-03-09T19:28:01.210 INFO:tasks.workunit.client.0.vm07.stdout:1/831: unlink d1/c10a 0 2026-03-09T19:28:01.210 INFO:tasks.workunit.client.0.vm07.stdout:4/811: write d3/d11/d29/d34/fa5 [2824549,121547] 0 2026-03-09T19:28:01.210 INFO:tasks.workunit.client.0.vm07.stdout:1/832: stat d1/d11/d37/d5d/f8a 0 2026-03-09T19:28:01.232 INFO:tasks.workunit.client.0.vm07.stdout:5/815: dwrite d3/d1a/d28/fca [0,4194304] 0 2026-03-09T19:28:01.233 
INFO:tasks.workunit.client.0.vm07.stdout:4/812: rmdir d3/d11 39 2026-03-09T19:28:01.239 INFO:tasks.workunit.client.0.vm07.stdout:0/807: dwrite d0/d6/d13/da1/fc1 [4194304,4194304] 0 2026-03-09T19:28:01.239 INFO:tasks.workunit.client.0.vm07.stdout:0/808: readlink d0/d6/l3b 0 2026-03-09T19:28:01.249 INFO:tasks.workunit.client.0.vm07.stdout:6/796: rmdir d0/d1/db/d1d/d77/d11f 0 2026-03-09T19:28:01.249 INFO:tasks.workunit.client.0.vm07.stdout:8/854: write d7/d9/d37/d45/f7d [5124725,7162] 0 2026-03-09T19:28:01.249 INFO:tasks.workunit.client.0.vm07.stdout:9/845: dread d0/db/d29/d32/d5c/d69/f83 [0,4194304] 0 2026-03-09T19:28:01.259 INFO:tasks.workunit.client.0.vm07.stdout:7/791: dwrite d0/d4/f12 [0,4194304] 0 2026-03-09T19:28:01.260 INFO:tasks.workunit.client.0.vm07.stdout:2/888: dwrite d3/dd/f9a [0,4194304] 0 2026-03-09T19:28:01.260 INFO:tasks.workunit.client.0.vm07.stdout:7/792: dread - d0/d4/d5/d26/f107 zero size 2026-03-09T19:28:01.260 INFO:tasks.workunit.client.0.vm07.stdout:5/816: creat d3/dd/d26/d2d/d79/f108 x:0 0 0 2026-03-09T19:28:01.269 INFO:tasks.workunit.client.0.vm07.stdout:3/909: getdents d1/d3d/d47/db3/dc2/d28/d7c 0 2026-03-09T19:28:01.277 INFO:tasks.workunit.client.0.vm07.stdout:0/809: symlink d0/d6/d13/d17/d19/d58/l10d 0 2026-03-09T19:28:01.277 INFO:tasks.workunit.client.0.vm07.stdout:8/855: rename d7/d9/l119 to d7/d1d/l12b 0 2026-03-09T19:28:01.284 INFO:tasks.workunit.client.0.vm07.stdout:8/856: dread d7/d9/d10/dd8/dfd/fc0 [0,4194304] 0 2026-03-09T19:28:01.286 INFO:tasks.workunit.client.0.vm07.stdout:7/793: stat d0/d80/db1/de5/d54/d95/fcf 0 2026-03-09T19:28:01.287 INFO:tasks.workunit.client.0.vm07.stdout:5/817: symlink d3/d1a/d5d/l109 0 2026-03-09T19:28:01.293 INFO:tasks.workunit.client.0.vm07.stdout:5/818: dwrite d3/d1a/d28/d6c/d72/d8f/f91 [4194304,4194304] 0 2026-03-09T19:28:01.299 INFO:tasks.workunit.client.0.vm07.stdout:1/833: mknod d1/d11/d37/d3f/d7e/dad/c116 0 2026-03-09T19:28:01.299 INFO:tasks.workunit.client.0.vm07.stdout:6/797: fsync d0/d2d/f88 0 
2026-03-09T19:28:01.299 INFO:tasks.workunit.client.0.vm07.stdout:2/889: mknod d3/dd/d16/d29/d2d/d45/d8b/d98/c139 0
2026-03-09T19:28:01.299 INFO:tasks.workunit.client.0.vm07.stdout:7/794: creat d0/d4/d5/d8/d41/d64/d74/d98/f10a x:0 0 0
2026-03-09T19:28:01.299 INFO:tasks.workunit.client.0.vm07.stdout:8/857: chown d7/d30/d75/dcc 64796201 1
2026-03-09T19:28:01.300 INFO:tasks.workunit.client.0.vm07.stdout:0/810: mknod d0/d6/d13/d17/c10e 0
2026-03-09T19:28:01.311 INFO:tasks.workunit.client.0.vm07.stdout:4/813: link d3/d11/d2b/d38/ddc/f60 d3/d11/d2b/d38/ddc/d91/dd6/f119 0
2026-03-09T19:28:01.311 INFO:tasks.workunit.client.0.vm07.stdout:5/819: link d3/d1a/d28/d40/d92/f8e d3/d1a/d5d/dee/f10a 0
2026-03-09T19:28:01.311 INFO:tasks.workunit.client.0.vm07.stdout:4/814: fdatasync d3/d11/d2b/d38/ddc/db2/fc8 0
2026-03-09T19:28:01.312 INFO:tasks.workunit.client.0.vm07.stdout:8/858: mkdir d7/d9/d37/d45/d4f/db1/d107/d11e/d12c 0
2026-03-09T19:28:01.317 INFO:tasks.workunit.client.0.vm07.stdout:5/820: creat d3/f10b x:0 0 0
2026-03-09T19:28:01.318 INFO:tasks.workunit.client.0.vm07.stdout:4/815: dwrite d3/d11/d51/f9a [0,4194304] 0
2026-03-09T19:28:01.324 INFO:tasks.workunit.client.0.vm07.stdout:2/890: sync
2026-03-09T19:28:01.324 INFO:tasks.workunit.client.0.vm07.stdout:7/795: sync
2026-03-09T19:28:01.327 INFO:tasks.workunit.client.0.vm07.stdout:2/891: stat d3/dd/d16/d30/d40/c10a 0
2026-03-09T19:28:01.328 INFO:tasks.workunit.client.0.vm07.stdout:7/796: read d0/d80/db1/de5/d54/dc4/fed [186878,99442] 0
2026-03-09T19:28:01.328 INFO:tasks.workunit.client.0.vm07.stdout:4/816: unlink d3/d11/d29/ff4 0
2026-03-09T19:28:01.329 INFO:tasks.workunit.client.0.vm07.stdout:2/892: readlink d3/dd/d16/d29/d2d/d45/d3b/l79 0
2026-03-09T19:28:01.329 INFO:tasks.workunit.client.0.vm07.stdout:4/817: fdatasync d3/d11/f74 0
2026-03-09T19:28:01.333 INFO:tasks.workunit.client.0.vm07.stdout:4/818: creat d3/d11/d16/df5/f11a x:0 0 0
2026-03-09T19:28:01.336 INFO:tasks.workunit.client.0.vm07.stdout:2/893: mknod d3/dd/d16/d29/d2d/d45/dc3/c13a 0
2026-03-09T19:28:01.338 INFO:tasks.workunit.client.0.vm07.stdout:5/821: rename d3/d1a/d5a/db8 to d3/d1a/d28/d10c 0
2026-03-09T19:28:01.339 INFO:tasks.workunit.client.0.vm07.stdout:4/819: creat d3/d11/d29/d34/f11b x:0 0 0
2026-03-09T19:28:01.339 INFO:tasks.workunit.client.0.vm07.stdout:2/894: truncate d3/dd/d16/d29/d3c/d5a/fb7 1103022 0
2026-03-09T19:28:01.340 INFO:tasks.workunit.client.0.vm07.stdout:2/895: chown d3/dd/l28 361 1
2026-03-09T19:28:01.342 INFO:tasks.workunit.client.0.vm07.stdout:4/820: sync
2026-03-09T19:28:01.346 INFO:tasks.workunit.client.0.vm07.stdout:2/896: symlink d3/dd/d16/d29/d2d/d45/d3b/d44/d96/ddf/l13b 0
2026-03-09T19:28:01.349 INFO:tasks.workunit.client.0.vm07.stdout:5/822: dwrite d3/dd/d26/d3f/d47/d56/f65 [0,4194304] 0
2026-03-09T19:28:01.351 INFO:tasks.workunit.client.0.vm07.stdout:2/897: fsync d3/dd/d16/d29/d2d/f6d 0
2026-03-09T19:28:01.354 INFO:tasks.workunit.client.0.vm07.stdout:5/823: stat d3/d1a/d5d/f102 0
2026-03-09T19:28:01.359 INFO:tasks.workunit.client.0.vm07.stdout:4/821: dread d3/d4f/d56/d5f/f6f [0,4194304] 0
2026-03-09T19:28:01.359 INFO:tasks.workunit.client.0.vm07.stdout:2/898: mknod d3/dd/d16/d29/d2d/d45/df6/d11e/c13c 0
2026-03-09T19:28:01.378 INFO:tasks.workunit.client.0.vm07.stdout:4/822: symlink d3/d11/d51/l11c 0
2026-03-09T19:28:01.382 INFO:tasks.workunit.client.0.vm07.stdout:4/823: truncate d3/d11/d2b/d38/ddc/db2/f102 769058 0
2026-03-09T19:28:01.395 INFO:tasks.workunit.client.0.vm07.stdout:4/824: dwrite d3/d4f/d56/d5f/f7b [0,4194304] 0
2026-03-09T19:28:01.443 INFO:tasks.workunit.client.0.vm07.stdout:9/846: dwrite d0/db/d29/d68/d99/fae [0,4194304] 0
2026-03-09T19:28:01.443 INFO:tasks.workunit.client.0.vm07.stdout:0/811: rmdir d0/d6 39
2026-03-09T19:28:01.444 INFO:tasks.workunit.client.0.vm07.stdout:4/825: dread d3/d11/d2b/d38/ddc/d22/d86/f8c [0,4194304] 0
2026-03-09T19:28:01.452 INFO:tasks.workunit.client.0.vm07.stdout:6/798: truncate d0/d44/fe2 1917600 0
2026-03-09T19:28:01.453 INFO:tasks.workunit.client.0.vm07.stdout:9/847: fdatasync d0/db/fe9 0
2026-03-09T19:28:01.458 INFO:tasks.workunit.client.0.vm07.stdout:4/826: fdatasync d3/d4f/f5b 0
2026-03-09T19:28:01.464 INFO:tasks.workunit.client.0.vm07.stdout:7/797: write d0/d4/d5/d8/f93 [2129201,28442] 0
2026-03-09T19:28:01.465 INFO:tasks.workunit.client.0.vm07.stdout:8/859: write d7/d9/d10/dd8/dfd/d62/fc8 [2620823,70081] 0
2026-03-09T19:28:01.465 INFO:tasks.workunit.client.0.vm07.stdout:1/834: dwrite d1/d3e/db3/d6d/fac [0,4194304] 0
2026-03-09T19:28:01.465 INFO:tasks.workunit.client.0.vm07.stdout:3/910: dwrite d1/d6/d71/f69 [0,4194304] 0
2026-03-09T19:28:01.465 INFO:tasks.workunit.client.0.vm07.stdout:3/911: write d1/d3d/d47/f113 [8336,29073] 0
2026-03-09T19:28:01.467 INFO:tasks.workunit.client.0.vm07.stdout:9/848: write d0/d6/ff [2591965,40539] 0
2026-03-09T19:28:01.471 INFO:tasks.workunit.client.0.vm07.stdout:5/824: write d3/d1a/d28/d6c/d72/d8f/fbd [809947,101717] 0
2026-03-09T19:28:01.471 INFO:tasks.workunit.client.0.vm07.stdout:5/825: fsync d3/d1a/d28/d40/d92/f5e 0
2026-03-09T19:28:01.471 INFO:tasks.workunit.client.0.vm07.stdout:2/899: write d3/dd/d16/d30/f67 [3797448,14326] 0
2026-03-09T19:28:01.479 INFO:tasks.workunit.client.0.vm07.stdout:8/860: fdatasync d7/d9/d37/d34/faa 0
2026-03-09T19:28:01.479 INFO:tasks.workunit.client.0.vm07.stdout:9/849: stat d0/db/d29/d68/f6b 0
2026-03-09T19:28:01.480 INFO:tasks.workunit.client.0.vm07.stdout:1/835: dread d1/d3e/db3/d6d/dff/fe7 [0,4194304] 0
2026-03-09T19:28:01.480 INFO:tasks.workunit.client.0.vm07.stdout:2/900: readlink d3/dd/d16/d29/d3c/d5a/le2 0
2026-03-09T19:28:01.484 INFO:tasks.workunit.client.0.vm07.stdout:7/798: dwrite d0/d80/db1/de5/d54/d55/f6d [0,4194304] 0
2026-03-09T19:28:01.487 INFO:tasks.workunit.client.0.vm07.stdout:1/836: dread d1/d3/d21/f2e [0,4194304] 0
2026-03-09T19:28:01.498 INFO:tasks.workunit.client.0.vm07.stdout:0/812: truncate d0/d6/dc8/d99/fa7 1086166 0
2026-03-09T19:28:01.498 INFO:tasks.workunit.client.0.vm07.stdout:3/912: link d1/d3d/d47/db3/f123 d1/d74/f125 0
2026-03-09T19:28:01.513 INFO:tasks.workunit.client.0.vm07.stdout:1/837: mkdir d1/d91/d117 0
2026-03-09T19:28:01.525 INFO:tasks.workunit.client.0.vm07.stdout:3/913: rename d1/d6/l11c to d1/d6/d45/d54/de5/d10f/l126 0
2026-03-09T19:28:01.525 INFO:tasks.workunit.client.0.vm07.stdout:2/901: creat d3/dd/d16/d29/d2d/d45/d3b/d44/d97/da4/d116/f13d x:0 0 0
2026-03-09T19:28:01.525 INFO:tasks.workunit.client.0.vm07.stdout:3/914: fsync d1/f98 0
2026-03-09T19:28:01.534 INFO:tasks.workunit.client.0.vm07.stdout:9/850: rename d0/db/d29/d2c/f43 to d0/db/d29/d32/d5c/d69/f123 0
2026-03-09T19:28:01.534 INFO:tasks.workunit.client.0.vm07.stdout:2/902: fsync d3/dd/d16/f25 0
2026-03-09T19:28:01.540 INFO:tasks.workunit.client.0.vm07.stdout:3/915: truncate d1/d3d/d47/f9e 194765 0
2026-03-09T19:28:01.543 INFO:tasks.workunit.client.0.vm07.stdout:5/826: dread d3/d1a/d28/d6c/d72/d8f/fbd [0,4194304] 0
2026-03-09T19:28:01.543 INFO:tasks.workunit.client.0.vm07.stdout:2/903: getdents d3/dd/d16/d29/d2d/d45/d3b/d44 0
2026-03-09T19:28:01.544 INFO:tasks.workunit.client.0.vm07.stdout:5/827: readlink d3/dd/d26/d2d/d79/d9f/ld8 0
2026-03-09T19:28:01.544 INFO:tasks.workunit.client.0.vm07.stdout:2/904: write d3/dd/f9a [5226247,2525] 0
2026-03-09T19:28:01.545 INFO:tasks.workunit.client.0.vm07.stdout:3/916: mkdir d1/d3d/d47/db3/d8e/da9/d127 0
2026-03-09T19:28:01.546 INFO:tasks.workunit.client.0.vm07.stdout:5/828: dread - d3/d1a/d28/d36/fff zero size
2026-03-09T19:28:01.552 INFO:tasks.workunit.client.0.vm07.stdout:5/829: fdatasync d3/d1a/d28/d40/f46 0
2026-03-09T19:28:01.553 INFO:tasks.workunit.client.0.vm07.stdout:2/905: unlink d3/dd/d16/d29/d3c/l41 0
2026-03-09T19:28:01.553 INFO:tasks.workunit.client.0.vm07.stdout:5/830: fsync f2 0
2026-03-09T19:28:01.557 INFO:tasks.workunit.client.0.vm07.stdout:9/851: sync
2026-03-09T19:28:01.557 INFO:tasks.workunit.client.0.vm07.stdout:2/906: creat d3/dd/d16/d29/d3c/d4c/f13e x:0 0 0
2026-03-09T19:28:01.557 INFO:tasks.workunit.client.0.vm07.stdout:5/831: getdents d3/dd/d26/d2d/d9e/df0 0
2026-03-09T19:28:01.558 INFO:tasks.workunit.client.0.vm07.stdout:5/832: rmdir d3/d1a/d28/d40/d92 39
2026-03-09T19:28:01.559 INFO:tasks.workunit.client.0.vm07.stdout:5/833: creat d3/d1a/d5a/f10d x:0 0 0
2026-03-09T19:28:01.560 INFO:tasks.workunit.client.0.vm07.stdout:2/907: write d3/dd/d16/d29/d2d/f6d [4096781,80307] 0
2026-03-09T19:28:01.562 INFO:tasks.workunit.client.0.vm07.stdout:5/834: readlink d3/d1a/d28/d6c/d72/lec 0
2026-03-09T19:28:01.575 INFO:tasks.workunit.client.0.vm07.stdout:5/835: getdents d3/dd/d26/d2d/d9e/df0 0
2026-03-09T19:28:01.575 INFO:tasks.workunit.client.0.vm07.stdout:2/908: mknod d3/c13f 0
2026-03-09T19:28:01.576 INFO:tasks.workunit.client.0.vm07.stdout:2/909: chown d3/dd/d16/d29/d3c/d5a/fb7 762644302 1
2026-03-09T19:28:01.577 INFO:tasks.workunit.client.0.vm07.stdout:9/852: dread d0/d6/d73/d10e/f120 [0,4194304] 0
2026-03-09T19:28:01.579 INFO:tasks.workunit.client.0.vm07.stdout:5/836: dread d3/dd/d26/d2d/d79/d9f/fd0 [0,4194304] 0
2026-03-09T19:28:01.579 INFO:tasks.workunit.client.0.vm07.stdout:2/910: mknod d3/dd/d103/ddd/ded/df3/c140 0
2026-03-09T19:28:01.588 INFO:tasks.workunit.client.0.vm07.stdout:5/837: rename d3/dd/d26/d2d/f54 to d3/dd/d26/d2d/d9e/f10e 0
2026-03-09T19:28:01.588 INFO:tasks.workunit.client.0.vm07.stdout:5/838: truncate d3/dd/d95/fc7 174038 0
2026-03-09T19:28:01.588 INFO:tasks.workunit.client.0.vm07.stdout:5/839: mknod d3/dd/d26/d2d/d79/c10f 0
2026-03-09T19:28:01.588 INFO:tasks.workunit.client.0.vm07.stdout:5/840: rename d3/dd/d26/d3f/d47/de6/ff9 to d3/d1a/d5d/dee/f110 0
2026-03-09T19:28:01.593 INFO:tasks.workunit.client.0.vm07.stdout:2/911: dwrite d3/dd/d16/d29/d2d/d45/d8b/d98/dee/f10e [0,4194304] 0
2026-03-09T19:28:01.593 INFO:tasks.workunit.client.0.vm07.stdout:5/841: sync
2026-03-09T19:28:01.597 INFO:tasks.workunit.client.0.vm07.stdout:5/842: dwrite d3/dd/d26/d3f/fc4 [0,4194304] 0
2026-03-09T19:28:01.605 INFO:tasks.workunit.client.0.vm07.stdout:2/912: read d3/dd/f24 [1047511,1960] 0
2026-03-09T19:28:01.609 INFO:tasks.workunit.client.0.vm07.stdout:5/843: creat d3/dd/d26/d2d/d9e/f111 x:0 0 0
2026-03-09T19:28:01.617 INFO:tasks.workunit.client.0.vm07.stdout:2/913: getdents d3/dd/d16/d29/d2d/d45/d8b/d98/dee 0
2026-03-09T19:28:01.624 INFO:tasks.workunit.client.0.vm07.stdout:5/844: dread - d3/d1a/d28/d40/fd9 zero size
2026-03-09T19:28:01.625 INFO:tasks.workunit.client.0.vm07.stdout:2/914: mkdir d3/dd/d16/d29/d3c/d5a/d7a/d74/d141 0
2026-03-09T19:28:01.630 INFO:tasks.workunit.client.0.vm07.stdout:2/915: chown d3/d11/d38/d111/c12b 617194826 1
2026-03-09T19:28:01.633 INFO:tasks.workunit.client.0.vm07.stdout:6/799: dwrite d0/d1/db/f70 [0,4194304] 0
2026-03-09T19:28:01.641 INFO:tasks.workunit.client.0.vm07.stdout:6/800: symlink d0/d1/db/d52/d94/d81/l12d 0
2026-03-09T19:28:01.650 INFO:tasks.workunit.client.0.vm07.stdout:5/845: dread d3/f93 [0,4194304] 0
2026-03-09T19:28:01.667 INFO:tasks.workunit.client.0.vm07.stdout:5/846: dread d3/f19 [4194304,4194304] 0
2026-03-09T19:28:01.682 INFO:tasks.workunit.client.0.vm07.stdout:5/847: fdatasync d3/d1a/d28/d6c/d72/f9a 0
2026-03-09T19:28:01.682 INFO:tasks.workunit.client.0.vm07.stdout:4/827: write d3/d11/d2b/d38/ddc/d91/fb3 [1437491,16093] 0
2026-03-09T19:28:01.686 INFO:tasks.workunit.client.0.vm07.stdout:8/861: dwrite d7/d50/fa0 [0,4194304] 0
2026-03-09T19:28:01.689 INFO:tasks.workunit.client.0.vm07.stdout:8/862: chown d7/d9/d37 7420255 1
2026-03-09T19:28:01.690 INFO:tasks.workunit.client.0.vm07.stdout:7/799: truncate d0/d4/f6f 3476966 0
2026-03-09T19:28:01.690 INFO:tasks.workunit.client.0.vm07.stdout:4/828: fdatasync d3/d11/d16/f82 0
2026-03-09T19:28:01.704 INFO:tasks.workunit.client.0.vm07.stdout:5/848: rename d3/d1a/d28/d36/f63 to d3/d1a/d5a/d107/f112 0
2026-03-09T19:28:01.707 INFO:tasks.workunit.client.0.vm07.stdout:0/813: dwrite d0/d6/d13/d1c/d50/fd6 [0,4194304] 0
2026-03-09T19:28:01.707 INFO:tasks.workunit.client.0.vm07.stdout:2/916: dread d3/dd/d16/d30/d40/f107 [0,4194304] 0
2026-03-09T19:28:01.707 INFO:tasks.workunit.client.0.vm07.stdout:1/838: dwrite d1/db/fb2 [0,4194304] 0
2026-03-09T19:28:01.717 INFO:tasks.workunit.client.0.vm07.stdout:9/853: dwrite d0/db/d29/d2c/f54 [4194304,4194304] 0
2026-03-09T19:28:01.717 INFO:tasks.workunit.client.0.vm07.stdout:7/800: readlink d0/d4/d5/d8/d41/d64/l68 0
2026-03-09T19:28:01.719 INFO:tasks.workunit.client.0.vm07.stdout:1/839: unlink d1/db/d31/c46 0
2026-03-09T19:28:01.722 INFO:tasks.workunit.client.0.vm07.stdout:0/814: stat d0/d6/d13/d17/d19/d57/d6a/lfd 0
2026-03-09T19:28:01.723 INFO:tasks.workunit.client.0.vm07.stdout:3/917: dwrite d1/d3d/d47/db3/d8e/da9/fe6 [0,4194304] 0
2026-03-09T19:28:01.733 INFO:tasks.workunit.client.0.vm07.stdout:4/829: dread d3/d11/d2b/d38/fdf [0,4194304] 0
2026-03-09T19:28:01.736 INFO:tasks.workunit.client.0.vm07.stdout:9/854: dwrite d0/d6/f4c [4194304,4194304] 0
2026-03-09T19:28:01.742 INFO:tasks.workunit.client.0.vm07.stdout:2/917: creat d3/dd/d16/d29/d3c/d5a/d7a/d74/f142 x:0 0 0
2026-03-09T19:28:01.743 INFO:tasks.workunit.client.0.vm07.stdout:8/863: dread d7/d50/f6d [0,4194304] 0
2026-03-09T19:28:01.756 INFO:tasks.workunit.client.0.vm07.stdout:1/840: mkdir d1/db/d31/d56/d118 0
2026-03-09T19:28:01.756 INFO:tasks.workunit.client.0.vm07.stdout:3/918: rename d1/d6/d71/cb9 to d1/d3d/d47/db3/d8e/da9/d127/c128 0
2026-03-09T19:28:01.757 INFO:tasks.workunit.client.0.vm07.stdout:4/830: creat d3/d11/d2b/d38/ddc/d91/f11d x:0 0 0
2026-03-09T19:28:01.758 INFO:tasks.workunit.client.0.vm07.stdout:8/864: creat d7/d9/d37/d45/d97/dbc/f12d x:0 0 0
2026-03-09T19:28:01.758 INFO:tasks.workunit.client.0.vm07.stdout:3/919: chown d1/d3d/d47/d10e/l11e 0 1
2026-03-09T19:28:01.766 INFO:tasks.workunit.client.0.vm07.stdout:5/849: dread d3/d1a/d28/d6c/d72/d8f/f91 [0,4194304] 0
2026-03-09T19:28:01.767 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:28:01.767 INFO:tasks.workunit.client.0.vm07.stdout:1/841: dread d1/f76 [0,4194304] 0
2026-03-09T19:28:01.772 INFO:tasks.workunit.client.0.vm07.stdout:8/865: creat d7/d16/dcf/f12e x:0 0 0
2026-03-09T19:28:01.772 INFO:tasks.workunit.client.0.vm07.stdout:1/842: chown d1/db/f14 697201 1
2026-03-09T19:28:01.775 INFO:tasks.workunit.client.0.vm07.stdout:1/843: fsync d1/d3e/db3/d6d/dff/fe7 0
2026-03-09T19:28:01.777 INFO:tasks.workunit.client.0.vm07.stdout:8/866: creat d7/d9/d10/dd8/dfd/d67/de7/f12f x:0 0 0
2026-03-09T19:28:01.781 INFO:tasks.workunit.client.0.vm07.stdout:5/850: fsync d3/dd/f23 0
2026-03-09T19:28:01.784 INFO:tasks.workunit.client.0.vm07.stdout:6/801: write d0/dbf/d95/f105 [167830,77496] 0
2026-03-09T19:28:01.784 INFO:tasks.workunit.client.0.vm07.stdout:3/920: rename d1/d6/dd/f33 to d1/d3d/d47/db3/d8e/f129 0
2026-03-09T19:28:01.785 INFO:tasks.workunit.client.0.vm07.stdout:9/855: dread d0/db/fe9 [0,4194304] 0
2026-03-09T19:28:01.786 INFO:tasks.workunit.client.0.vm07.stdout:1/844: sync
2026-03-09T19:28:01.792 INFO:tasks.workunit.client.0.vm07.stdout:8/867: dread d7/d30/d32/fba [0,4194304] 0
2026-03-09T19:28:01.793 INFO:tasks.workunit.client.0.vm07.stdout:8/868: chown d7/d9/d10/d44 14 1
2026-03-09T19:28:01.800 INFO:tasks.workunit.client.0.vm07.stdout:5/851: dread d3/f18 [0,4194304] 0
2026-03-09T19:28:01.803 INFO:tasks.workunit.client.0.vm07.stdout:0/815: read d0/d6/d13/d17/d19/d58/f77 [1136687,67952] 0
2026-03-09T19:28:01.810 INFO:tasks.workunit.client.0.vm07.stdout:9/856: fsync d0/d6/d73/d10e/f120 0
2026-03-09T19:28:01.812 INFO:tasks.workunit.client.0.vm07.stdout:1/845: rmdir d1/d3e 39
2026-03-09T19:28:01.815 INFO:tasks.workunit.client.0.vm07.stdout:8/869: mknod d7/d9/d37/d45/d4f/db1/d107/c130 0
2026-03-09T19:28:01.816 INFO:tasks.workunit.client.0.vm07.stdout:5/852: read - d3/d1a/d28/d10c/fd7 zero size
2026-03-09T19:28:01.818 INFO:tasks.workunit.client.0.vm07.stdout:5/853: chown d3/d1a/d28/d10c/fd7 0 1
2026-03-09T19:28:01.821 INFO:tasks.workunit.client.0.vm07.stdout:0/816: read d0/d6/d13/d17/dc3/fb3 [4078307,30809] 0
2026-03-09T19:28:01.822 INFO:tasks.workunit.client.0.vm07.stdout:9/857: creat d0/db/d9e/f124 x:0 0 0
2026-03-09T19:28:01.827 INFO:tasks.workunit.client.0.vm07.stdout:0/817: fdatasync d0/d6/dc8/fca 0
2026-03-09T19:28:01.830 INFO:tasks.workunit.client.0.vm07.stdout:5/854: mkdir d3/d1a/d28/d40/d92/d89/ddc/dde/d113 0
2026-03-09T19:28:01.831 INFO:tasks.workunit.client.0.vm07.stdout:0/818: creat d0/d6/dc8/f10f x:0 0 0
2026-03-09T19:28:01.832 INFO:tasks.workunit.client.0.vm07.stdout:9/858: mkdir d0/db/d29/d2c/de5/d125 0
2026-03-09T19:28:01.839 INFO:tasks.workunit.client.0.vm07.stdout:7/801: dwrite d0/d4/d5/d8/d1a/d2a/f34 [0,4194304] 0
2026-03-09T19:28:01.854 INFO:tasks.workunit.client.0.vm07.stdout:1/846: dread d1/d3e/f49 [0,4194304] 0
2026-03-09T19:28:01.854 INFO:tasks.workunit.client.0.vm07.stdout:1/847: stat d1/d91 0
2026-03-09T19:28:01.854 INFO:tasks.workunit.client.0.vm07.stdout:0/819: mknod d0/d6/d13/d17/d19/d58/dd9/df8/c110 0
2026-03-09T19:28:01.857 INFO:tasks.workunit.client.0.vm07.stdout:7/802: mkdir d0/d4/d5/d8/d41/d10b 0
2026-03-09T19:28:01.861 INFO:tasks.workunit.client.0.vm07.stdout:0/820: stat d0/d6/d13/d1c/d61/c84 0
2026-03-09T19:28:01.861 INFO:tasks.workunit.client.0.vm07.stdout:7/803: truncate d0/d4/f12 4212230 0
2026-03-09T19:28:01.863 INFO:tasks.workunit.client.0.vm07.stdout:7/804: dread - d0/d4/d5/d26/fbe zero size
2026-03-09T19:28:01.878 INFO:tasks.workunit.client.0.vm07.stdout:7/805: truncate d0/d4/d5/d8/d1a/d2a/fb3 1016199 0
2026-03-09T19:28:01.881 INFO:tasks.workunit.client.0.vm07.stdout:7/806: mkdir d0/d4/d5/d26/d32/d10c 0
2026-03-09T19:28:01.881 INFO:tasks.workunit.client.0.vm07.stdout:0/821: dwrite d0/d6/d13/d1c/d50/fd6 [0,4194304] 0
2026-03-09T19:28:01.889 INFO:tasks.workunit.client.0.vm07.stdout:7/807: truncate d0/d4/d5/d26/d32/dbd/fa0 1195189 0
2026-03-09T19:28:01.927 INFO:tasks.workunit.client.0.vm07.stdout:2/918: write d3/dd/d16/d29/d2d/d45/d3b/dae/fda [1647943,95496] 0
2026-03-09T19:28:01.928 INFO:tasks.workunit.client.0.vm07.stdout:4/831: write d3/f64 [351744,74648] 0
2026-03-09T19:28:01.930 INFO:tasks.workunit.client.0.vm07.stdout:2/919: write d3/dd/d16/d30/da7/f109 [6192079,11712] 0
2026-03-09T19:28:01.940 INFO:tasks.workunit.client.0.vm07.stdout:2/920: rmdir d3/dd/d16/d29/d3c/d5a/db3 39
2026-03-09T19:28:01.940 INFO:tasks.workunit.client.0.vm07.stdout:3/921: write d1/f73 [2943988,76464] 0
2026-03-09T19:28:01.941 INFO:tasks.workunit.client.0.vm07.stdout:4/832: mkdir d3/d4f/d56/d11e 0
2026-03-09T19:28:01.955 INFO:tasks.workunit.client.0.vm07.stdout:4/833: mkdir d3/d11/d2b/d38/ddc/d91/d11f 0
2026-03-09T19:28:01.956 INFO:tasks.workunit.client.0.vm07.stdout:2/921: read d3/dd/d16/d29/fa3 [2758234,86244] 0
2026-03-09T19:28:01.956 INFO:tasks.workunit.client.0.vm07.stdout:6/802: dwrite d0/d4e/d75/f100 [0,4194304] 0
2026-03-09T19:28:01.958 INFO:tasks.workunit.client.0.vm07.stdout:2/922: fsync d3/f63 0
2026-03-09T19:28:01.964 INFO:tasks.workunit.client.0.vm07.stdout:4/834: sync
2026-03-09T19:28:01.964 INFO:tasks.workunit.client.0.vm07.stdout:4/835: write d3/d4f/f5b [61604,36272] 0
2026-03-09T19:28:01.965 INFO:tasks.workunit.client.0.vm07.stdout:2/923: truncate d3/d11/d38/d111/d113/f12c 873574 0
2026-03-09T19:28:01.971 INFO:tasks.workunit.client.0.vm07.stdout:2/924: unlink d3/dd/d16/d29/d2d/d45/d3b/d44/f81 0
2026-03-09T19:28:01.971 INFO:tasks.workunit.client.0.vm07.stdout:4/836: creat d3/d11/d2b/d37/f120 x:0 0 0
2026-03-09T19:28:01.972 INFO:tasks.workunit.client.0.vm07.stdout:4/837: chown d3/d11/d29/d101/lb7 432745 1
2026-03-09T19:28:01.975 INFO:tasks.workunit.client.0.vm07.stdout:4/838: creat d3/d11/d51/f121 x:0 0 0
2026-03-09T19:28:01.976 INFO:tasks.workunit.client.0.vm07.stdout:4/839: truncate d3/d11/d29/d34/f11b 168432 0
2026-03-09T19:28:01.979 INFO:tasks.workunit.client.0.vm07.stdout:4/840: rename d3/d11/d2b/d38/ddc/d91/dd6/ffb to d3/d11/d29/d101/f122 0
2026-03-09T19:28:01.980 INFO:tasks.workunit.client.0.vm07.stdout:4/841: rename d3/d11 to d3/d11/d16/df5/d123 22
2026-03-09T19:28:01.983 INFO:tasks.workunit.client.0.vm07.stdout:4/842: rmdir d3/d11/d2b/d37/db6 39
2026-03-09T19:28:01.993 INFO:tasks.workunit.client.0.vm07.stdout:8/870: dwrite d7/d30/fb7 [0,4194304] 0
2026-03-09T19:28:01.997 INFO:tasks.workunit.client.0.vm07.stdout:5/855: truncate d3/dd/d26/d3f/d47/d71/d76/d98/fdd 2546611 0
2026-03-09T19:28:02.001 INFO:tasks.workunit.client.0.vm07.stdout:8/871: chown d7/d9/d37/d45/d4f/db1/fd6 24772887 1
2026-03-09T19:28:02.008 INFO:tasks.workunit.client.0.vm07.stdout:0/822: dwrite d0/d6/d13/d1c/d11/f2e [0,4194304] 0
2026-03-09T19:28:02.015 INFO:tasks.workunit.client.0.vm07.stdout:5/856: creat d3/d1a/d28/d40/f114 x:0 0 0
2026-03-09T19:28:02.017 INFO:tasks.workunit.client.0.vm07.stdout:1/848: dwrite d1/d11/d37/d3f/d6e/d9c/ff7 [0,4194304] 0
2026-03-09T19:28:02.018 INFO:tasks.workunit.client.0.vm07.stdout:9/859: dwrite d0/db/fac [0,4194304] 0
2026-03-09T19:28:02.025 INFO:tasks.workunit.client.0.vm07.stdout:1/849: chown d1/d3e/db3/d6d 700 1
2026-03-09T19:28:02.028 INFO:tasks.workunit.client.0.vm07.stdout:3/922: dwrite d1/d3d/d47/db3/dc2/d28/fd4 [0,4194304] 0
2026-03-09T19:28:02.032 INFO:tasks.workunit.client.0.vm07.stdout:0/823: unlink d0/d6/d13/d33/c66 0
2026-03-09T19:28:02.033 INFO:tasks.workunit.client.0.vm07.stdout:0/824: chown d0/d6/d13/d17 24 1
2026-03-09T19:28:02.033 INFO:tasks.workunit.client.0.vm07.stdout:9/860: truncate d0/db/d29/d2c/fb6 944551 0
2026-03-09T19:28:02.037 INFO:tasks.workunit.client.0.vm07.stdout:5/857: rmdir d3/d1a/d28/d6c/de8 0
2026-03-09T19:28:02.045 INFO:tasks.workunit.client.0.vm07.stdout:2/925: rename d3/dd/d16/d29/d2d/d45/d3b/d44/d97 to d3/d11/d143 0
2026-03-09T19:28:02.045 INFO:tasks.workunit.client.0.vm07.stdout:1/850: truncate d1/d11/d37/d3f/d45/f15 80914 0
2026-03-09T19:28:02.045 INFO:tasks.workunit.client.0.vm07.stdout:9/861: creat d0/d6/d57/d5d/dde/f126 x:0 0 0
2026-03-09T19:28:02.045 INFO:tasks.workunit.client.0.vm07.stdout:2/926: mknod d3/dd/d16/d29/c144 0
2026-03-09T19:28:02.045 INFO:tasks.workunit.client.0.vm07.stdout:1/851: fdatasync d1/d3/d21/f5f 0
2026-03-09T19:28:02.045 INFO:tasks.workunit.client.0.vm07.stdout:5/858: getdents d3/dd/d26/d2d/d60/df7 0
2026-03-09T19:28:02.050 INFO:tasks.workunit.client.0.vm07.stdout:2/927: mkdir d3/dd/d16/d29/d2d/d45/d8b/d98/dee/d145 0
2026-03-09T19:28:02.052 INFO:tasks.workunit.client.0.vm07.stdout:2/928: chown d3/dd/d16/d29/d3c/d5a/d7a/d74/dc9/d135 24 1
2026-03-09T19:28:02.053 INFO:tasks.workunit.client.0.vm07.stdout:5/859: mknod d3/d1a/d28/d6c/d72/db5/c115 0
2026-03-09T19:28:02.055 INFO:tasks.workunit.client.0.vm07.stdout:5/860: readlink d3/d1a/d28/d48/le2 0
2026-03-09T19:28:02.059 INFO:tasks.workunit.client.0.vm07.stdout:5/861: fdatasync d3/dd/d26/d2d/d79/f108 0
2026-03-09T19:28:02.070 INFO:tasks.workunit.client.0.vm07.stdout:9/862: dread d0/d6/d3a/d81/fa3 [0,4194304] 0
2026-03-09T19:28:02.080 INFO:tasks.workunit.client.0.vm07.stdout:9/863: getdents d0/db/d29/d32/d5c/d69 0
2026-03-09T19:28:02.081 INFO:tasks.workunit.client.0.vm07.stdout:9/864: chown d0/d6f/lcc 1695586 1
2026-03-09T19:28:02.087 INFO:tasks.workunit.client.0.vm07.stdout:5/862: dwrite d3/dd/d26/d3f/fc4 [0,4194304] 0
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: Upgrade: Need to upgrade myself (mgr.vm08.mxylvw)
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: Upgrade: Need to upgrade myself (mgr.vm08.mxylvw)
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: pgmap v10: 65 pgs: 65 active+clean; 3.7 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 40 MiB/s rd, 87 MiB/s wr, 239 op/s
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.xacuym", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.xacuym", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T19:28:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:01 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:02.102 INFO:tasks.workunit.client.0.vm07.stdout:5/863: rename d3/f25 to d3/dd/d95/f116 0
2026-03-09T19:28:02.102 INFO:tasks.workunit.client.0.vm07.stdout:1/852: dread d1/d11/fbc [0,4194304] 0
2026-03-09T19:28:02.108 INFO:tasks.workunit.client.0.vm07.stdout:9/865: dwrite d0/d6/fe3 [0,4194304] 0
2026-03-09T19:28:02.115 INFO:tasks.workunit.client.0.vm07.stdout:5/864: mknod d3/d1a/d5d/dee/c117 0
2026-03-09T19:28:02.117 INFO:tasks.workunit.client.0.vm07.stdout:5/865: readlink d3/dd/l4c 0
2026-03-09T19:28:02.126 INFO:tasks.workunit.client.0.vm07.stdout:9/866: creat d0/d6/d57/d5d/dde/d118/f127 x:0 0 0
2026-03-09T19:28:02.127 INFO:tasks.workunit.client.0.vm07.stdout:1/853: link d1/d11/d37/d3f/d45/f9d d1/d3/d21/f119 0
2026-03-09T19:28:02.128 INFO:tasks.workunit.client.0.vm07.stdout:9/867: symlink d0/db/d29/d2c/de5/d125/l128 0
2026-03-09T19:28:02.135 INFO:tasks.workunit.client.0.vm07.stdout:5/866: link d3/dd/d26/d3f/d47/c64 d3/dd/d26/d2d/d60/df7/c118 0
2026-03-09T19:28:02.135 INFO:tasks.workunit.client.0.vm07.stdout:1/854: readlink d1/d11/d37/d5d/dc1/lc6 0
2026-03-09T19:28:02.135 INFO:tasks.workunit.client.0.vm07.stdout:9/868: creat d0/d6f/d86/f129 x:0 0 0
2026-03-09T19:28:02.139 INFO:tasks.workunit.client.0.vm07.stdout:1/855: getdents d1/db/d31/d56/d118 0
2026-03-09T19:28:02.149 INFO:tasks.workunit.client.0.vm07.stdout:5/867: dwrite d3/d1a/d28/d40/d92/d89/ddc/fe7 [0,4194304] 0
2026-03-09T19:28:02.153 INFO:tasks.workunit.client.0.vm07.stdout:9/869: fdatasync d0/d6/d73/fed 0
2026-03-09T19:28:02.159 INFO:tasks.workunit.client.0.vm07.stdout:9/870: symlink d0/db/d29/d68/l12a 0
2026-03-09T19:28:02.161 INFO:tasks.workunit.client.0.vm07.stdout:5/868: dread d3/d1a/d28/d6c/d72/db5/fd2 [0,4194304] 0
2026-03-09T19:28:02.165 INFO:tasks.workunit.client.0.vm07.stdout:9/871: creat d0/db/d29/d32/d5c/f12b x:0 0 0
2026-03-09T19:28:02.174 INFO:tasks.workunit.client.0.vm07.stdout:5/869: chown d3/d1a/d28/d48/c57 307125126 1
2026-03-09T19:28:02.175 INFO:tasks.workunit.client.0.vm07.stdout:1/856: dread d1/d11/d37/f2c [0,4194304] 0
2026-03-09T19:28:02.177 INFO:tasks.workunit.client.0.vm07.stdout:5/870: getdents d3/d1a/d28/d10c 0
2026-03-09T19:28:02.177 INFO:tasks.workunit.client.0.vm07.stdout:9/872: mkdir d0/db/d29/d32/d5c/d80/ddf/d12c 0
2026-03-09T19:28:02.184 INFO:tasks.workunit.client.0.vm07.stdout:5/871: dread - d3/d1a/d28/d36/fff zero size
2026-03-09T19:28:02.185 INFO:tasks.workunit.client.0.vm07.stdout:9/873: write d0/db/d29/d2c/d36/d5a/fb2 [755150,13465] 0
2026-03-09T19:28:02.187 INFO:tasks.workunit.client.0.vm07.stdout:5/872: mknod d3/dd/d26/d2d/d60/d83/c119 0
2026-03-09T19:28:02.189 INFO:tasks.workunit.client.0.vm07.stdout:1/857: sync
2026-03-09T19:28:02.190 INFO:tasks.workunit.client.0.vm07.stdout:0/825: dread d0/d6/d13/d17/d19/f34 [0,4194304] 0
2026-03-09T19:28:02.198 INFO:tasks.workunit.client.0.vm07.stdout:1/858: rename d1/d11/d37/dcb/l112 to d1/d11/d37/d3f/d7e/dad/l11a 0
2026-03-09T19:28:02.202 INFO:tasks.workunit.client.0.vm07.stdout:9/874: dread - d0/db/d29/d32/d5c/d80/ff2 zero size
2026-03-09T19:28:02.205 INFO:tasks.workunit.client.0.vm07.stdout:5/873: mkdir d3/dd/d26/d3f/d47/d71/dfa/d11a 0
2026-03-09T19:28:02.206 INFO:tasks.workunit.client.0.vm07.stdout:5/874: stat d3/d1a/d28/d6c/f7a 0
2026-03-09T19:28:02.206 INFO:tasks.workunit.client.0.vm07.stdout:5/875: mkdir d3/dd/d26/d3f/d47/d71/db7/d11b 0
2026-03-09T19:28:02.213 INFO:tasks.workunit.client.0.vm07.stdout:9/875: creat d0/db/d29/d68/d99/f12d x:0 0 0
2026-03-09T19:28:02.213 INFO:tasks.workunit.client.0.vm07.stdout:9/876: chown d0/db/d29/fb3 5513 1
2026-03-09T19:28:02.217 INFO:tasks.workunit.client.0.vm07.stdout:5/876: link d3/dd/ffe d3/d1a/d28/d48/f11c 0
2026-03-09T19:28:02.218 INFO:tasks.workunit.client.0.vm07.stdout:1/859: dread d1/d11/d37/d5d/dc1/fce [0,4194304] 0
2026-03-09T19:28:02.218 INFO:tasks.workunit.client.0.vm07.stdout:5/877: chown d3/dd/dda 12190176 1
2026-03-09T19:28:02.219 INFO:tasks.workunit.client.0.vm07.stdout:5/878: chown d3/dd/d26/d3f/d47/d71/d76/fdf 67244517 1
2026-03-09T19:28:02.219 INFO:tasks.workunit.client.0.vm07.stdout:9/877: unlink d0/db/d29/d32/d5c/d80/ddf/f106 0
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: Upgrade: Need to upgrade myself (mgr.vm08.mxylvw)
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: Upgrade: Need to upgrade myself (mgr.vm08.mxylvw)
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: pgmap v10: 65 pgs: 65 active+clean; 3.7 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 40 MiB/s rd, 87 MiB/s wr, 239 op/s
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw'
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.xacuym", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.xacuym", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T19:28:02.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:01 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:02.223 INFO:tasks.workunit.client.0.vm07.stdout:5/879: readlink d3/dd/d26/d3f/d47/d71/lc3 0
2026-03-09T19:28:02.223 INFO:tasks.workunit.client.0.vm07.stdout:1/860: mknod d1/d91/c11b 0
2026-03-09T19:28:02.223 INFO:tasks.workunit.client.0.vm07.stdout:9/878: rename d0/db/d29/d4d to d0/d6f/d12e 0
2026-03-09T19:28:02.224 INFO:tasks.workunit.client.0.vm07.stdout:1/861: chown d1/d11/c65 5637984 1
2026-03-09T19:28:02.224 INFO:tasks.workunit.client.0.vm07.stdout:9/879: stat d0/d6/d73/d10e 0
2026-03-09T19:28:02.227 INFO:tasks.workunit.client.0.vm07.stdout:5/880: fdatasync d3/d1a/d28/f3c 0
2026-03-09T19:28:02.232 INFO:tasks.workunit.client.0.vm07.stdout:8/872: write d7/d9/d37/d45/d97/dbc/ff4 [1018799,26415] 0
2026-03-09T19:28:02.242 INFO:tasks.workunit.client.0.vm07.stdout:7/808: dwrite d0/d4/d5/d8/d1a/d2a/fb3 [0,4194304] 0
2026-03-09T19:28:02.244 INFO:tasks.workunit.client.0.vm07.stdout:5/881: rmdir d3/dd/d26/d2d 39
2026-03-09T19:28:02.248 INFO:tasks.workunit.client.0.vm07.stdout:5/882: fsync d3/d1a/d28/d10c/fed 0
2026-03-09T19:28:02.248 INFO:tasks.workunit.client.0.vm07.stdout:5/883: write d3/d1a/d5a/f10d [165437,71698] 0
2026-03-09T19:28:02.262 INFO:tasks.workunit.client.0.vm07.stdout:5/884: mknod d3/dd/d26/d2d/d79/c11d 0
2026-03-09T19:28:02.262 INFO:tasks.workunit.client.0.vm07.stdout:2/929: write d3/dd/d16/d29/d2d/d45/d3b/ffd [1499009,99622] 0
2026-03-09T19:28:02.267 INFO:tasks.workunit.client.0.vm07.stdout:3/923: dread d1/d3d/d47/f62 [0,4194304] 0
2026-03-09T19:28:02.267 INFO:tasks.workunit.client.0.vm07.stdout:2/930: creat d3/d11/d143/da4/f146 x:0 0 0
2026-03-09T19:28:02.269 INFO:tasks.workunit.client.0.vm07.stdout:3/924: unlink d1/d6/f21 0
2026-03-09T19:28:02.277 INFO:tasks.workunit.client.0.vm07.stdout:3/925: mkdir d1/d1f/d12a 0
2026-03-09T19:28:02.279 INFO:tasks.workunit.client.0.vm07.stdout:3/926: unlink d1/d89/f108 0
2026-03-09T19:28:02.291 INFO:tasks.workunit.client.0.vm07.stdout:3/927: sync
2026-03-09T19:28:02.295 INFO:tasks.workunit.client.0.vm07.stdout:3/928: fdatasync d1/f98 0
2026-03-09T19:28:02.296 INFO:tasks.workunit.client.0.vm07.stdout:6/803: dread d0/d1/db/d24/da4/ff3 [0,4194304] 0
2026-03-09T19:28:02.299 INFO:tasks.workunit.client.0.vm07.stdout:6/804: fdatasync d0/d1/db/f109 0
2026-03-09T19:28:02.302 INFO:tasks.workunit.client.0.vm07.stdout:3/929: symlink d1/d74/l12b 0
2026-03-09T19:28:02.302 INFO:tasks.workunit.client.0.vm07.stdout:6/805: mkdir d0/d1/db/d17/d12e 0
2026-03-09T19:28:02.305 INFO:tasks.workunit.client.0.vm07.stdout:0/826: write d0/d6/d13/d17/d19/f7c [1518172,29308] 0
2026-03-09T19:28:02.305 INFO:tasks.workunit.client.0.vm07.stdout:6/806: mkdir d0/d4e/dae/daf/d12f 0
2026-03-09T19:28:02.306 INFO:tasks.workunit.client.0.vm07.stdout:6/807: stat d0/d1/db/d24/da4/dda 0
2026-03-09T19:28:02.311 INFO:tasks.workunit.client.0.vm07.stdout:3/930: getdents d1/d3d/d47/db3/dc2/d28/dc4 0
2026-03-09T19:28:02.314 INFO:tasks.workunit.client.0.vm07.stdout:6/808: link d0/d2d/dd5/d123/dc7/fce d0/d1/f130 0
2026-03-09T19:28:02.316 INFO:tasks.workunit.client.0.vm07.stdout:6/809: creat d0/d2d/dd5/d123/f131 x:0 0 0
2026-03-09T19:28:02.319 INFO:tasks.workunit.client.0.vm07.stdout:6/810: mkdir d0/d4e/dae/daf/d12f/d132 0
2026-03-09T19:28:02.321 INFO:tasks.workunit.client.0.vm07.stdout:6/811: truncate d0/d1/db/d52/d94/d81/f125 201470 0
2026-03-09T19:28:02.327 INFO:tasks.workunit.client.0.vm07.stdout:6/812: mkdir d0/dbf/d133 0
2026-03-09T19:28:02.332 INFO:tasks.workunit.client.0.vm07.stdout:3/931: dread d1/d74/fe4 [0,4194304] 0
2026-03-09T19:28:02.339 INFO:tasks.workunit.client.0.vm07.stdout:6/813: dwrite d0/d1/db/d24/da4/fc3 [0,4194304] 0
2026-03-09T19:28:02.348 INFO:tasks.workunit.client.0.vm07.stdout:3/932: dread d1/d6/f19 [0,4194304] 0
2026-03-09T19:28:02.355 INFO:tasks.workunit.client.0.vm07.stdout:1/862: dread d1/db/d31/d56/f71 [0,4194304] 0
2026-03-09T19:28:02.355 INFO:tasks.workunit.client.0.vm07.stdout:9/880: write d0/dc1/f7c [3954215,116826] 0
2026-03-09T19:28:02.365 INFO:tasks.workunit.client.0.vm07.stdout:1/863: read - d1/d11/d37/d3f/d7e/dad/fbd zero size
2026-03-09T19:28:02.365 INFO:tasks.workunit.client.0.vm07.stdout:7/809: dwrite d0/d4/d5/d26/d32/fa6 [0,4194304] 0
2026-03-09T19:28:02.377 INFO:tasks.workunit.client.0.vm07.stdout:5/885: dwrite d3/d1a/d28/d48/f4f [4194304,4194304] 0
2026-03-09T19:28:02.377 INFO:tasks.workunit.client.0.vm07.stdout:3/933: mknod d1/d6/d4c/d97/c12c 0
2026-03-09T19:28:02.377 INFO:tasks.workunit.client.0.vm07.stdout:2/931: write d3/d11/d38/d111/d113/f12c [1579625,108796] 0
2026-03-09T19:28:02.381 INFO:tasks.workunit.client.0.vm07.stdout:3/934: chown d1/d6/dd/f2b 8 1
2026-03-09T19:28:02.384 INFO:tasks.workunit.client.0.vm07.stdout:8/873: dwrite d7/d9/d37/d45/d97/dbc/fca [0,4194304] 0
2026-03-09T19:28:02.384 INFO:tasks.workunit.client.0.vm07.stdout:1/864: creat d1/d3e/db3/d6d/dff/f11c x:0 0 0
2026-03-09T19:28:02.389 INFO:tasks.workunit.client.0.vm07.stdout:3/935: write d1/d6/d4c/d97/fc0 [1882911,13796] 0
2026-03-09T19:28:02.392
INFO:tasks.workunit.client.0.vm07.stdout:8/874: truncate d7/d9/d37/d45/d4f/db1/d107/f11b 608499 0 2026-03-09T19:28:02.396 INFO:tasks.workunit.client.0.vm07.stdout:1/865: mknod d1/d11/d37/d3f/d6e/d9c/c11d 0 2026-03-09T19:28:02.397 INFO:tasks.workunit.client.0.vm07.stdout:4/843: dread d3/d11/d2b/d38/d107/f111 [0,4194304] 0 2026-03-09T19:28:02.398 INFO:tasks.workunit.client.0.vm07.stdout:3/936: rmdir d1/d3d/d47/db3/d8e/da9/d127 39 2026-03-09T19:28:02.400 INFO:tasks.workunit.client.0.vm07.stdout:8/875: rmdir d7/d9/d10/dd8/dfd/d62 39 2026-03-09T19:28:02.404 INFO:tasks.workunit.client.0.vm07.stdout:1/866: chown d1/d11/d37/d5d/d50/f8b 77280 1 2026-03-09T19:28:02.407 INFO:tasks.workunit.client.0.vm07.stdout:1/867: fdatasync d1/d11/d37/d3f/d6e/d9c/ff7 0 2026-03-09T19:28:02.407 INFO:tasks.workunit.client.0.vm07.stdout:8/876: stat d7/d9/d37/d45/d97/f117 0 2026-03-09T19:28:02.414 INFO:tasks.workunit.client.0.vm07.stdout:8/877: unlink d7/d50/da6/c111 0 2026-03-09T19:28:02.420 INFO:tasks.workunit.client.0.vm07.stdout:4/844: dread d3/d11/d29/d101/f122 [0,4194304] 0 2026-03-09T19:28:02.421 INFO:tasks.workunit.client.0.vm07.stdout:8/878: dread d7/d1d/d83/d9f/fcb [0,4194304] 0 2026-03-09T19:28:02.424 INFO:tasks.workunit.client.0.vm07.stdout:8/879: unlink d7/d9/d10/d44/cb5 0 2026-03-09T19:28:02.426 INFO:tasks.workunit.client.0.vm07.stdout:4/845: getdents d3/d11/d2b/d38/ddc 0 2026-03-09T19:28:02.433 INFO:tasks.workunit.client.0.vm07.stdout:4/846: rename d3/d11/d2b/d38/ddc/f67 to d3/d4f/d56/f124 0 2026-03-09T19:28:02.433 INFO:tasks.workunit.client.0.vm07.stdout:4/847: chown d3/d11/d2b/fb5 795890 1 2026-03-09T19:28:02.441 INFO:tasks.workunit.client.0.vm07.stdout:0/827: write d0/d6/d13/d1c/d52/d81/fb0 [216849,77323] 0 2026-03-09T19:28:02.456 INFO:tasks.workunit.client.0.vm07.stdout:0/828: sync 2026-03-09T19:28:02.474 INFO:tasks.workunit.client.0.vm07.stdout:9/881: dwrite d0/d6/d3a/d94/fa0 [0,4194304] 0 2026-03-09T19:28:02.477 INFO:tasks.workunit.client.0.vm07.stdout:5/886: write 
d3/d1a/d28/f2e [7332560,103065] 0 2026-03-09T19:28:02.477 INFO:tasks.workunit.client.0.vm07.stdout:2/932: write d3/fa [4667326,117809] 0 2026-03-09T19:28:02.480 INFO:tasks.workunit.client.0.vm07.stdout:3/937: write d1/d3d/d47/f62 [2060808,36597] 0 2026-03-09T19:28:02.480 INFO:tasks.workunit.client.0.vm07.stdout:8/880: truncate d7/d9/f4c 5048989 0 2026-03-09T19:28:02.481 INFO:tasks.workunit.client.0.vm07.stdout:8/881: dread - d7/d9/d37/d45/f7e zero size 2026-03-09T19:28:02.482 INFO:tasks.workunit.client.0.vm07.stdout:8/882: chown d7/d50/da6/f109 153 1 2026-03-09T19:28:02.487 INFO:tasks.workunit.client.0.vm07.stdout:9/882: creat d0/d6/d73/f12f x:0 0 0 2026-03-09T19:28:02.489 INFO:tasks.workunit.client.0.vm07.stdout:2/933: fsync d3/d11/d143/da4/f101 0 2026-03-09T19:28:02.492 INFO:tasks.workunit.client.0.vm07.stdout:3/938: write d1/d3d/d47/db3/d8e/f129 [4340601,85349] 0 2026-03-09T19:28:02.492 INFO:tasks.workunit.client.0.vm07.stdout:8/883: symlink d7/d50/da6/l131 0 2026-03-09T19:28:02.495 INFO:tasks.workunit.client.0.vm07.stdout:4/848: read - d3/d11/d2b/d38/ddc/d91/fbb zero size 2026-03-09T19:28:02.496 INFO:tasks.workunit.client.0.vm07.stdout:4/849: chown d3/d11 305 1 2026-03-09T19:28:02.505 INFO:tasks.workunit.client.0.vm07.stdout:8/884: rename d7/d50/da6/dc5/ff7 to d7/d50/da6/f132 0 2026-03-09T19:28:02.506 INFO:tasks.workunit.client.0.vm07.stdout:2/934: creat d3/dd/d16/f147 x:0 0 0 2026-03-09T19:28:02.511 INFO:tasks.workunit.client.0.vm07.stdout:4/850: rename d3/d11/d16/df5 to d3/d4f/d56/d5f/d125 0 2026-03-09T19:28:02.511 INFO:tasks.workunit.client.0.vm07.stdout:8/885: dread - d7/d50/f10c zero size 2026-03-09T19:28:02.525 INFO:tasks.workunit.client.0.vm07.stdout:3/939: dread d1/d3d/d47/db3/dc2/f30 [0,4194304] 0 2026-03-09T19:28:02.530 INFO:tasks.workunit.client.0.vm07.stdout:2/935: creat d3/dd/d16/d29/d3c/f148 x:0 0 0 2026-03-09T19:28:02.532 INFO:tasks.workunit.client.0.vm07.stdout:8/886: mkdir d7/d9/d10/dd8/dfc/d133 0 2026-03-09T19:28:02.532 
INFO:tasks.workunit.client.0.vm07.stdout:4/851: fsync d3/fc 0 2026-03-09T19:28:02.532 INFO:tasks.workunit.client.0.vm07.stdout:2/936: creat d3/dd/d16/d29/d3c/d4c/f149 x:0 0 0 2026-03-09T19:28:02.533 INFO:tasks.workunit.client.0.vm07.stdout:8/887: write d7/d9/d37/d45/d97/f117 [3558584,63220] 0 2026-03-09T19:28:02.534 INFO:tasks.workunit.client.0.vm07.stdout:8/888: chown d7/d1d/d83 7760190 1 2026-03-09T19:28:02.535 INFO:tasks.workunit.client.0.vm07.stdout:5/887: dread d3/d1a/d28/d36/f8c [0,4194304] 0 2026-03-09T19:28:02.538 INFO:tasks.workunit.client.0.vm07.stdout:2/937: sync 2026-03-09T19:28:02.546 INFO:tasks.workunit.client.0.vm07.stdout:0/829: dwrite d0/d6/dc8/fb4 [0,4194304] 0 2026-03-09T19:28:02.556 INFO:tasks.workunit.client.0.vm07.stdout:8/889: rmdir d7/d16/d1e 39 2026-03-09T19:28:02.563 INFO:tasks.workunit.client.0.vm07.stdout:6/814: dread d0/d2d/dd5/d123/fd7 [0,4194304] 0 2026-03-09T19:28:02.572 INFO:tasks.workunit.client.0.vm07.stdout:0/830: link d0/d6/d13/d1c/d61/cf2 d0/d6/d13/d1c/d11/d8b/c111 0 2026-03-09T19:28:02.572 INFO:tasks.workunit.client.0.vm07.stdout:6/815: readlink d0/d1/d28/da8/le8 0 2026-03-09T19:28:02.572 INFO:tasks.workunit.client.0.vm07.stdout:0/831: chown d0/d6/d13/d1c/d61/cb7 281 1 2026-03-09T19:28:02.573 INFO:tasks.workunit.client.0.vm07.stdout:6/816: dread - d0/d2d/dd5/d123/d7b/d7d/f11b zero size 2026-03-09T19:28:02.574 INFO:tasks.workunit.client.0.vm07.stdout:6/817: rename d0/d1/db to d0/d1/db/d52/d134 22 2026-03-09T19:28:02.584 INFO:tasks.workunit.client.0.vm07.stdout:1/868: dread d1/d3/d21/f5f [0,4194304] 0 2026-03-09T19:28:02.585 INFO:tasks.workunit.client.0.vm07.stdout:1/869: stat d1/d11/d37/dcb/f102 0 2026-03-09T19:28:02.586 INFO:tasks.workunit.client.0.vm07.stdout:1/870: mkdir d1/d11/d37/d3f/d11e 0 2026-03-09T19:28:02.587 INFO:tasks.workunit.client.0.vm07.stdout:1/871: mknod d1/db/d31/d56/d118/c11f 0 2026-03-09T19:28:02.589 INFO:tasks.workunit.client.0.vm07.stdout:1/872: mknod d1/d11/d37/d3f/d45/d87/d88/c120 0 
2026-03-09T19:28:02.592 INFO:tasks.workunit.client.0.vm07.stdout:1/873: sync 2026-03-09T19:28:02.603 INFO:tasks.workunit.client.0.vm07.stdout:7/810: dread d0/d4/d5/d8/f35 [0,4194304] 0 2026-03-09T19:28:02.605 INFO:tasks.workunit.client.0.vm07.stdout:7/811: write d0/d4/d5/d8/d41/d64/d74/d98/f47 [1438487,41020] 0 2026-03-09T19:28:02.608 INFO:tasks.workunit.client.0.vm07.stdout:7/812: unlink d0/d4/d5/d26/d32/c66 0 2026-03-09T19:28:02.634 INFO:tasks.workunit.client.0.vm07.stdout:5/888: read d3/dd/d26/d3f/d47/d71/d76/fdf [3329,125380] 0 2026-03-09T19:28:02.646 INFO:tasks.workunit.client.0.vm07.stdout:9/883: write d0/db/d29/d32/d5c/d69/fc9 [5057385,25644] 0 2026-03-09T19:28:02.647 INFO:tasks.workunit.client.0.vm07.stdout:9/884: chown d0/db/d9e/faf 0 1 2026-03-09T19:28:02.653 INFO:tasks.workunit.client.0.vm07.stdout:4/852: dwrite d3/d11/d29/d34/f5c [0,4194304] 0 2026-03-09T19:28:02.657 INFO:tasks.workunit.client.0.vm07.stdout:9/885: dwrite d0/d6/f48 [0,4194304] 0 2026-03-09T19:28:02.657 INFO:tasks.workunit.client.0.vm07.stdout:4/853: chown d3/d11/cc5 4423 1 2026-03-09T19:28:02.663 INFO:tasks.workunit.client.0.vm07.stdout:0/832: write d0/d6/d13/d1c/fd7 [142387,126284] 0 2026-03-09T19:28:02.669 INFO:tasks.workunit.client.0.vm07.stdout:6/818: write d0/d4e/d7f/fbc [716569,5904] 0 2026-03-09T19:28:02.674 INFO:tasks.workunit.client.0.vm07.stdout:6/819: stat d0/d2d/cfd 0 2026-03-09T19:28:02.674 INFO:tasks.workunit.client.0.vm07.stdout:3/940: dwrite d1/d74/f77 [0,4194304] 0 2026-03-09T19:28:02.678 INFO:tasks.workunit.client.0.vm07.stdout:8/890: dwrite d7/d9/d37/d45/d4f/db1/fce [0,4194304] 0 2026-03-09T19:28:02.680 INFO:tasks.workunit.client.0.vm07.stdout:3/941: chown d1/d3d/d47/db3/dc2/f30 168 1 2026-03-09T19:28:02.687 INFO:tasks.workunit.client.0.vm07.stdout:4/854: dread - d3/d11/d2b/d38/ddc/d22/d86/fc9 zero size 2026-03-09T19:28:02.703 INFO:tasks.workunit.client.0.vm07.stdout:9/886: dread d0/d6/d57/d8f/f9f [0,4194304] 0 2026-03-09T19:28:02.713 
INFO:tasks.workunit.client.0.vm07.stdout:1/874: write d1/db/d31/d4f/fd4 [763216,25856] 0 2026-03-09T19:28:02.714 INFO:tasks.workunit.client.0.vm07.stdout:6/820: getdents d0/d4e/dae/daf/d12f/d132 0 2026-03-09T19:28:02.715 INFO:tasks.workunit.client.0.vm07.stdout:6/821: chown d0/d2d/dd5/d123/dc7 836 1 2026-03-09T19:28:02.720 INFO:tasks.workunit.client.0.vm07.stdout:3/942: fdatasync d1/d3d/d47/db3/d8e/da9/f93 0 2026-03-09T19:28:02.735 INFO:tasks.workunit.client.0.vm07.stdout:1/875: creat d1/db/d31/d4f/d7a/dd2/f121 x:0 0 0 2026-03-09T19:28:02.738 INFO:tasks.workunit.client.0.vm07.stdout:6/822: dread - d0/d1/d28/da8/ffb zero size 2026-03-09T19:28:02.738 INFO:tasks.workunit.client.0.vm07.stdout:4/855: creat d3/dbe/f126 x:0 0 0 2026-03-09T19:28:02.746 INFO:tasks.workunit.client.0.vm07.stdout:8/891: link d7/d9/d37/d45/d4f/db1/d107/f11b d7/d9/d37/f134 0 2026-03-09T19:28:02.747 INFO:tasks.workunit.client.0.vm07.stdout:6/823: creat d0/d2d/f135 x:0 0 0 2026-03-09T19:28:02.751 INFO:tasks.workunit.client.0.vm07.stdout:8/892: mkdir d7/d1d/d83/d9f/d135 0 2026-03-09T19:28:02.752 INFO:tasks.workunit.client.0.vm07.stdout:8/893: readlink d7/d1d/d83/d9f/l11c 0 2026-03-09T19:28:02.752 INFO:tasks.workunit.client.0.vm07.stdout:8/894: fsync d7/d9/d37/d45/f73 0 2026-03-09T19:28:02.753 INFO:tasks.workunit.client.0.vm07.stdout:8/895: stat d7/d9/d10/dd8/dfd/d67/l9c 0 2026-03-09T19:28:02.755 INFO:tasks.workunit.client.0.vm07.stdout:4/856: mkdir d3/d11/d2b/d38/ddc/db2/d127 0 2026-03-09T19:28:02.755 INFO:tasks.workunit.client.0.vm07.stdout:6/824: truncate d0/d1/db/fe3 550518 0 2026-03-09T19:28:02.757 INFO:tasks.workunit.client.0.vm07.stdout:6/825: readlink d0/d1/db/d24/l25 0 2026-03-09T19:28:02.760 INFO:tasks.workunit.client.0.vm07.stdout:4/857: rename d3/d11/d2b/d38/ddc/db2/fc8 to d3/d11/d2b/d38/ddc/db2/d127/f128 0 2026-03-09T19:28:02.761 INFO:tasks.workunit.client.0.vm07.stdout:6/826: unlink d0/d1/db/d1d/f2e 0 2026-03-09T19:28:02.762 INFO:tasks.workunit.client.0.vm07.stdout:9/887: read 
d0/d6f/dc3/fc4 [263733,84428] 0 2026-03-09T19:28:02.767 INFO:tasks.workunit.client.0.vm07.stdout:4/858: symlink d3/d11/d2b/d38/ddc/db2/d127/l129 0 2026-03-09T19:28:02.773 INFO:tasks.workunit.client.0.vm07.stdout:9/888: rename d0/d6f/d12e/c4e to d0/db/d9e/c130 0 2026-03-09T19:28:02.774 INFO:tasks.workunit.client.0.vm07.stdout:8/896: dread d7/d9/d10/dd8/dfd/d62/fc8 [0,4194304] 0 2026-03-09T19:28:02.774 INFO:tasks.workunit.client.0.vm07.stdout:8/897: readlink d7/d30/d32/l11d 0 2026-03-09T19:28:02.779 INFO:tasks.workunit.client.0.vm07.stdout:6/827: truncate d0/d1/d28/d76/fec 578803 0 2026-03-09T19:28:02.781 INFO:tasks.workunit.client.0.vm07.stdout:9/889: sync 2026-03-09T19:28:02.784 INFO:tasks.workunit.client.0.vm07.stdout:9/890: readlink d0/d17/l50 0 2026-03-09T19:28:02.786 INFO:tasks.workunit.client.0.vm07.stdout:7/813: truncate d0/d80/db1/de5/d54/d55/f6d 524384 0 2026-03-09T19:28:02.787 INFO:tasks.workunit.client.0.vm07.stdout:9/891: symlink d0/d6/d73/d10e/l131 0 2026-03-09T19:28:02.788 INFO:tasks.workunit.client.0.vm07.stdout:7/814: truncate d0/d4/d5/f36 2856310 0 2026-03-09T19:28:02.790 INFO:tasks.workunit.client.0.vm07.stdout:7/815: chown d0/l8a 1716 1 2026-03-09T19:28:02.791 INFO:tasks.workunit.client.0.vm07.stdout:9/892: truncate d0/d6f/dc3/fc4 330560 0 2026-03-09T19:28:02.792 INFO:tasks.workunit.client.0.vm07.stdout:5/889: dwrite d3/f68 [0,4194304] 0 2026-03-09T19:28:02.800 INFO:tasks.workunit.client.0.vm07.stdout:9/893: write d0/d6f/d12e/fa5 [2886606,5553] 0 2026-03-09T19:28:02.806 INFO:tasks.workunit.client.0.vm07.stdout:9/894: chown d0/d6/d73/dbe 13485 1 2026-03-09T19:28:02.817 INFO:tasks.workunit.client.0.vm07.stdout:9/895: mknod d0/d6/d3a/c132 0 2026-03-09T19:28:02.821 INFO:tasks.workunit.client.0.vm07.stdout:0/833: dwrite d0/d6/d13/d17/d19/d58/dd9/f103 [0,4194304] 0 2026-03-09T19:28:02.834 INFO:tasks.workunit.client.0.vm07.stdout:5/890: dwrite d3/f93 [0,4194304] 0 2026-03-09T19:28:02.839 INFO:tasks.workunit.client.0.vm07.stdout:0/834: rename 
d0/d6/d13/d17/dc3/lc2 to d0/d6/d13/d1c/d61/l112 0 2026-03-09T19:28:02.844 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:02 vm07.local ceph-mon[48545]: Upgrade: Updating mgr.vm07.xacuym 2026-03-09T19:28:02.849 INFO:tasks.workunit.client.0.vm07.stdout:8/898: dread d7/d9/d10/f41 [0,4194304] 0 2026-03-09T19:28:02.850 INFO:tasks.workunit.client.0.vm07.stdout:5/891: mknod d3/dd/d26/d3f/d47/d71/db7/c11e 0 2026-03-09T19:28:02.853 INFO:tasks.workunit.client.0.vm07.stdout:0/835: link d0/d6/d13/d17/ff0 d0/d6/d13/d1c/d11/d56/f113 0 2026-03-09T19:28:02.853 INFO:tasks.workunit.client.0.vm07.stdout:8/899: mkdir d7/d1d/d83/d136 0 2026-03-09T19:28:02.855 INFO:tasks.workunit.client.0.vm07.stdout:5/892: creat d3/dd/dda/f11f x:0 0 0 2026-03-09T19:28:02.860 INFO:tasks.workunit.client.0.vm07.stdout:0/836: creat d0/d6/d13/d17/d19/f114 x:0 0 0 2026-03-09T19:28:02.861 INFO:tasks.workunit.client.0.vm07.stdout:3/943: dwrite d1/d74/f125 [0,4194304] 0 2026-03-09T19:28:02.868 INFO:tasks.workunit.client.0.vm07.stdout:3/944: stat d1/d6/f76 0 2026-03-09T19:28:02.868 INFO:tasks.workunit.client.0.vm07.stdout:0/837: readlink d0/d6/d13/d1c/d11/d8b/le4 0 2026-03-09T19:28:02.870 INFO:tasks.workunit.client.0.vm07.stdout:8/900: link d7/d16/f71 d7/d9/d37/d45/f137 0 2026-03-09T19:28:02.871 INFO:tasks.workunit.client.0.vm07.stdout:0/838: creat d0/d6/d13/d17/d19/d58/f115 x:0 0 0 2026-03-09T19:28:02.872 INFO:tasks.workunit.client.0.vm07.stdout:5/893: getdents d3/dd/d26/d3f/d47/d71/d76 0 2026-03-09T19:28:02.872 INFO:tasks.workunit.client.0.vm07.stdout:8/901: symlink d7/d50/da6/l138 0 2026-03-09T19:28:02.873 INFO:tasks.workunit.client.0.vm07.stdout:3/945: rename d1/d3d/d47/db3/lba to d1/d3d/d47/db3/d8e/da9/d127/l12d 0 2026-03-09T19:28:02.879 INFO:tasks.workunit.client.0.vm07.stdout:0/839: symlink d0/d6/d13/d17/dc3/df6/df9/l116 0 2026-03-09T19:28:02.880 INFO:tasks.workunit.client.0.vm07.stdout:5/894: chown d3/d1a/d28/c30 26148 1 2026-03-09T19:28:02.883 INFO:tasks.workunit.client.0.vm07.stdout:5/895: 
rmdir d3/d1a/d28 39 2026-03-09T19:28:02.884 INFO:tasks.workunit.client.0.vm07.stdout:3/946: dread d1/d6/d45/f5d [0,4194304] 0 2026-03-09T19:28:02.885 INFO:tasks.workunit.client.0.vm07.stdout:3/947: dread - d1/d3d/d47/db3/fb5 zero size 2026-03-09T19:28:02.888 INFO:tasks.workunit.client.0.vm07.stdout:3/948: symlink d1/d6/d45/d54/de5/l12e 0 2026-03-09T19:28:02.889 INFO:tasks.workunit.client.0.vm07.stdout:5/896: getdents d3/dd/d26/d3f/d47/d56 0 2026-03-09T19:28:02.891 INFO:tasks.workunit.client.0.vm07.stdout:3/949: sync 2026-03-09T19:28:02.891 INFO:tasks.workunit.client.0.vm07.stdout:0/840: sync 2026-03-09T19:28:02.892 INFO:tasks.workunit.client.0.vm07.stdout:5/897: symlink d3/dd/d26/d3f/d47/de6/l120 0 2026-03-09T19:28:02.892 INFO:tasks.workunit.client.0.vm07.stdout:0/841: readlink d0/d6/d13/d17/d19/d58/dd9/lef 0 2026-03-09T19:28:02.894 INFO:tasks.workunit.client.0.vm07.stdout:3/950: dread - d1/d3d/d47/db3/faf zero size 2026-03-09T19:28:02.900 INFO:tasks.workunit.client.0.vm07.stdout:0/842: fdatasync d0/d6/d13/d33/f35 0 2026-03-09T19:28:02.900 INFO:tasks.workunit.client.0.vm07.stdout:3/951: creat d1/d6/d45/d54/f12f x:0 0 0 2026-03-09T19:28:02.901 INFO:tasks.workunit.client.0.vm07.stdout:0/843: chown d0/d6/d13/d1c/d50/fc7 74499 1 2026-03-09T19:28:02.908 INFO:tasks.workunit.client.0.vm07.stdout:1/876: dread d1/db/d31/fa8 [0,4194304] 0 2026-03-09T19:28:02.911 INFO:tasks.workunit.client.0.vm07.stdout:1/877: sync 2026-03-09T19:28:02.919 INFO:tasks.workunit.client.0.vm07.stdout:0/844: symlink d0/d6/d13/d1c/d61/d69/l117 0 2026-03-09T19:28:02.922 INFO:tasks.workunit.client.0.vm07.stdout:1/878: rename d1/d11/d37/d3f/d45/d87/fe6 to d1/d91/d117/f122 0 2026-03-09T19:28:02.924 INFO:tasks.workunit.client.0.vm07.stdout:0/845: dread d0/d6/dc8/fba [0,4194304] 0 2026-03-09T19:28:02.924 INFO:tasks.workunit.client.0.vm07.stdout:3/952: creat d1/d6/d71/d101/f130 x:0 0 0 2026-03-09T19:28:02.931 INFO:tasks.workunit.client.0.vm07.stdout:4/859: dwrite d3/d4f/f7c [0,4194304] 0 
2026-03-09T19:28:02.935 INFO:tasks.workunit.client.0.vm07.stdout:3/953: creat d1/d3d/d47/d10e/f131 x:0 0 0 2026-03-09T19:28:02.940 INFO:tasks.workunit.client.0.vm07.stdout:4/860: dread - d3/d11/d2b/d38/ddc/d91/fbb zero size 2026-03-09T19:28:02.944 INFO:tasks.workunit.client.0.vm07.stdout:3/954: rename d1/d3d/d47/fe7 to d1/d3d/d47/db3/dc2/f132 0 2026-03-09T19:28:02.945 INFO:tasks.workunit.client.0.vm07.stdout:3/955: stat d1/d6/d71/d101 0 2026-03-09T19:28:02.945 INFO:tasks.workunit.client.0.vm07.stdout:4/861: rename d3/d11/d2b/d38/fdf to d3/d11/d2b/d38/d8f/f12a 0 2026-03-09T19:28:02.945 INFO:tasks.workunit.client.0.vm07.stdout:3/956: write d1/d3d/d47/f113 [851335,129282] 0 2026-03-09T19:28:02.949 INFO:tasks.workunit.client.0.vm07.stdout:3/957: truncate d1/d3d/d47/db3/dc2/f39 3120408 0 2026-03-09T19:28:02.949 INFO:tasks.workunit.client.0.vm07.stdout:4/862: rename d3/d11/d29/f108 to d3/d11/d2b/d38/f12b 0 2026-03-09T19:28:02.950 INFO:tasks.workunit.client.0.vm07.stdout:3/958: readlink d1/d3d/d47/db3/d87/la4 0 2026-03-09T19:28:02.955 INFO:tasks.workunit.client.0.vm07.stdout:3/959: symlink d1/d3d/d47/db3/d8e/da9/d127/l133 0 2026-03-09T19:28:02.956 INFO:tasks.workunit.client.0.vm07.stdout:4/863: creat d3/d11/d2b/d37/db6/f12c x:0 0 0 2026-03-09T19:28:02.958 INFO:tasks.workunit.client.0.vm07.stdout:3/960: creat d1/d74/f134 x:0 0 0 2026-03-09T19:28:02.960 INFO:tasks.workunit.client.0.vm07.stdout:4/864: mknod d3/c12d 0 2026-03-09T19:28:02.960 INFO:tasks.workunit.client.0.vm07.stdout:3/961: truncate d1/d6/d45/dac/feb 4090645 0 2026-03-09T19:28:02.961 INFO:tasks.workunit.client.0.vm07.stdout:4/865: stat d3/d11/f103 0 2026-03-09T19:28:02.962 INFO:tasks.workunit.client.0.vm07.stdout:3/962: fsync d1/d6/fd6 0 2026-03-09T19:28:02.963 INFO:tasks.workunit.client.0.vm07.stdout:3/963: chown d1/d6/d45/d54/de5/fec 30 1 2026-03-09T19:28:02.968 INFO:tasks.workunit.client.0.vm07.stdout:3/964: dwrite d1/d3d/f95 [0,4194304] 0 2026-03-09T19:28:02.974 
INFO:tasks.workunit.client.0.vm07.stdout:3/965: write d1/d1f/f1a [2668310,62697] 0 2026-03-09T19:28:02.978 INFO:tasks.workunit.client.0.vm07.stdout:3/966: unlink d1/d3d/d47/db3/dc2/d28/d7c/fbd 0 2026-03-09T19:28:02.979 INFO:tasks.workunit.client.0.vm07.stdout:3/967: chown d1/d3d/d47/f113 3156 1 2026-03-09T19:28:02.982 INFO:tasks.workunit.client.0.vm07.stdout:3/968: creat d1/d1f/f135 x:0 0 0 2026-03-09T19:28:02.986 INFO:tasks.workunit.client.0.vm07.stdout:6/828: dwrite d0/d1/db/d24/fc2 [0,4194304] 0 2026-03-09T19:28:02.996 INFO:tasks.workunit.client.0.vm07.stdout:7/816: dwrite d0/d4/d5/d8/d41/d64/d74/d98/fd9 [0,4194304] 0 2026-03-09T19:28:03.003 INFO:tasks.workunit.client.0.vm07.stdout:9/896: dwrite d0/d6f/fa9 [0,4194304] 0 2026-03-09T19:28:03.008 INFO:tasks.workunit.client.0.vm07.stdout:7/817: symlink d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/l10d 0 2026-03-09T19:28:03.012 INFO:tasks.workunit.client.0.vm07.stdout:8/902: write d7/d9/d37/f85 [656920,19807] 0 2026-03-09T19:28:03.013 INFO:tasks.workunit.client.0.vm07.stdout:8/903: chown d7/d50/c53 13 1 2026-03-09T19:28:03.013 INFO:tasks.workunit.client.0.vm07.stdout:9/897: link d0/db/d9e/faf d0/d6/d57/deb/f133 0 2026-03-09T19:28:03.013 INFO:tasks.workunit.client.0.vm07.stdout:7/818: truncate d0/d4/d5/d8/d41/d64/d74/d98/fd8 1042884 0 2026-03-09T19:28:03.014 INFO:tasks.workunit.client.0.vm07.stdout:9/898: stat d0/db/d29/d32/d5c 0 2026-03-09T19:28:03.016 INFO:tasks.workunit.client.0.vm07.stdout:9/899: fdatasync d0/d6f/f115 0 2026-03-09T19:28:03.019 INFO:tasks.workunit.client.0.vm07.stdout:7/819: link d0/d4/d5/d26/db9/dc2/fd1 d0/d4/d5/d8/d1a/d2a/f10e 0 2026-03-09T19:28:03.023 INFO:tasks.workunit.client.0.vm07.stdout:9/900: getdents d0/d6/d57/d8f 0 2026-03-09T19:28:03.031 INFO:tasks.workunit.client.0.vm07.stdout:9/901: dread d0/d6f/d12e/f95 [0,4194304] 0 2026-03-09T19:28:03.032 INFO:tasks.workunit.client.0.vm07.stdout:9/902: unlink d0/d17/f1f 0 2026-03-09T19:28:03.033 INFO:tasks.workunit.client.0.vm07.stdout:9/903: unlink 
d0/d6/d57/f112 0 2026-03-09T19:28:03.034 INFO:tasks.workunit.client.0.vm07.stdout:9/904: chown d0/d6/d57/deb/fd2 551 1 2026-03-09T19:28:03.035 INFO:tasks.workunit.client.0.vm07.stdout:9/905: write d0/dc1/f7c [1771814,23503] 0 2026-03-09T19:28:03.036 INFO:tasks.workunit.client.0.vm07.stdout:9/906: mknod d0/d6f/d12e/c134 0 2026-03-09T19:28:03.062 INFO:tasks.workunit.client.0.vm07.stdout:2/938: dread d3/dd/d16/d29/f58 [0,4194304] 0 2026-03-09T19:28:03.064 INFO:tasks.workunit.client.0.vm07.stdout:2/939: creat d3/dd/d103/ddd/f14a x:0 0 0 2026-03-09T19:28:03.064 INFO:tasks.workunit.client.0.vm07.stdout:2/940: write d3/fa [2564610,38598] 0 2026-03-09T19:28:03.066 INFO:tasks.workunit.client.0.vm07.stdout:2/941: symlink d3/dd/d16/d29/d2d/d45/df6/d11e/l14b 0 2026-03-09T19:28:03.067 INFO:tasks.workunit.client.0.vm07.stdout:2/942: fdatasync d3/dd/d103/ddd/f14a 0 2026-03-09T19:28:03.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:02 vm08.local ceph-mon[57794]: Upgrade: Updating mgr.vm07.xacuym 2026-03-09T19:28:03.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:02 vm08.local ceph-mon[57794]: Deploying daemon mgr.vm07.xacuym on vm07 2026-03-09T19:28:03.128 INFO:tasks.workunit.client.0.vm07.stdout:5/898: write d3/d1a/d28/d40/d92/f8e [550935,91251] 0 2026-03-09T19:28:03.128 INFO:tasks.workunit.client.0.vm07.stdout:1/879: write d1/d3e/db3/d6d/ff6 [126079,37123] 0 2026-03-09T19:28:03.129 INFO:tasks.workunit.client.0.vm07.stdout:0/846: write d0/d6/d13/d17/d19/d57/d6a/f74 [2435828,95027] 0 2026-03-09T19:28:03.130 INFO:tasks.workunit.client.0.vm07.stdout:0/847: chown d0/d6/d13/d17/d19/f114 770008 1 2026-03-09T19:28:03.131 INFO:tasks.workunit.client.0.vm07.stdout:0/848: fsync d0/d6/d13/d1c/d61/d69/fb9 0 2026-03-09T19:28:03.143 INFO:tasks.workunit.client.0.vm07.stdout:4/866: write d3/d11/d2b/f71 [503630,84483] 0 2026-03-09T19:28:03.155 INFO:tasks.workunit.client.0.vm07.stdout:3/969: dwrite d1/d3d/fb7 [0,4194304] 0 2026-03-09T19:28:03.161 
INFO:tasks.workunit.client.0.vm07.stdout:6/829: dwrite d0/d44/fe2 [0,4194304] 0 2026-03-09T19:28:03.170 INFO:tasks.workunit.client.0.vm07.stdout:0/849: mkdir d0/d6/d13/d17/d118 0 2026-03-09T19:28:03.170 INFO:tasks.workunit.client.0.vm07.stdout:8/904: truncate d7/d9/d10/dd8/dfd/d67/de7/fea 208275 0 2026-03-09T19:28:03.175 INFO:tasks.workunit.client.0.vm07.stdout:9/907: truncate d0/d6f/fdb 3682351 0 2026-03-09T19:28:03.188 INFO:tasks.workunit.client.0.vm07.stdout:6/830: unlink d0/d4e/cdc 0 2026-03-09T19:28:03.189 INFO:tasks.workunit.client.0.vm07.stdout:2/943: write d3/dd/d16/d29/d2d/d45/d8b/d98/fab [365665,77233] 0 2026-03-09T19:28:03.189 INFO:tasks.workunit.client.0.vm07.stdout:5/899: link d3/d1a/d28/d40/f46 d3/f121 0 2026-03-09T19:28:03.189 INFO:tasks.workunit.client.0.vm07.stdout:2/944: dread - d3/dd/d16/d29/d3c/d5a/f138 zero size 2026-03-09T19:28:03.189 INFO:tasks.workunit.client.0.vm07.stdout:8/905: rename d7/d9/d37/d45/c54 to d7/d9/ddf/c139 0 2026-03-09T19:28:03.189 INFO:tasks.workunit.client.0.vm07.stdout:1/880: getdents d1/d3/d21 0 2026-03-09T19:28:03.189 INFO:tasks.workunit.client.0.vm07.stdout:9/908: symlink d0/db/d29/d2c/d36/l135 0 2026-03-09T19:28:03.189 INFO:tasks.workunit.client.0.vm07.stdout:3/970: creat d1/f136 x:0 0 0 2026-03-09T19:28:03.189 INFO:tasks.workunit.client.0.vm07.stdout:3/971: stat d1/d6/d45 0 2026-03-09T19:28:03.189 INFO:tasks.workunit.client.0.vm07.stdout:6/831: rename d0/d4e/dae/daf/d12f/d132 to d0/d1/db/d52/d94/d87/d136 0 2026-03-09T19:28:03.189 INFO:tasks.workunit.client.0.vm07.stdout:5/900: mkdir d3/d1a/d28/d122 0 2026-03-09T19:28:03.189 INFO:tasks.workunit.client.0.vm07.stdout:9/909: unlink d0/d6f/d86/fd1 0 2026-03-09T19:28:03.190 INFO:tasks.workunit.client.0.vm07.stdout:3/972: dread d1/fe [0,4194304] 0 2026-03-09T19:28:03.191 INFO:tasks.workunit.client.0.vm07.stdout:5/901: truncate d3/d1a/d28/d6c/ffc 1038763 0 2026-03-09T19:28:03.192 INFO:tasks.workunit.client.0.vm07.stdout:6/832: mknod d0/d13/c137 0 2026-03-09T19:28:03.197 
INFO:tasks.workunit.client.0.vm07.stdout:6/833: fdatasync d0/dbf/d95/d31/f6c 0 2026-03-09T19:28:03.199 INFO:tasks.workunit.client.0.vm07.stdout:3/973: read d1/d6/d4c/fb1 [461213,45317] 0 2026-03-09T19:28:03.199 INFO:tasks.workunit.client.0.vm07.stdout:9/910: creat d0/db/d29/d2c/f136 x:0 0 0 2026-03-09T19:28:03.201 INFO:tasks.workunit.client.0.vm07.stdout:6/834: truncate d0/d1/d28/d76/dad/f104 925165 0 2026-03-09T19:28:03.202 INFO:tasks.workunit.client.0.vm07.stdout:9/911: fdatasync d0/d6/d57/d8f/ff0 0 2026-03-09T19:28:03.205 INFO:tasks.workunit.client.0.vm07.stdout:9/912: dread d0/d6f/fa9 [0,4194304] 0 2026-03-09T19:28:03.207 INFO:tasks.workunit.client.0.vm07.stdout:3/974: link d1/d3d/d47/db3/d8e/da9/fe6 d1/d3d/d47/db3/d8e/da9/d127/f137 0 2026-03-09T19:28:03.209 INFO:tasks.workunit.client.0.vm07.stdout:2/945: sync 2026-03-09T19:28:03.211 INFO:tasks.workunit.client.0.vm07.stdout:3/975: fsync d1/d6/fb 0 2026-03-09T19:28:03.212 INFO:tasks.workunit.client.0.vm07.stdout:3/976: truncate d1/d6/d45/d54/de5/f10c 1212927 0 2026-03-09T19:28:03.214 INFO:tasks.workunit.client.0.vm07.stdout:6/835: creat d0/dbf/f138 x:0 0 0 2026-03-09T19:28:03.214 INFO:tasks.workunit.client.0.vm07.stdout:6/836: dread - d0/d2d/dd5/d123/d7b/d7d/f11b zero size 2026-03-09T19:28:03.217 INFO:tasks.workunit.client.0.vm07.stdout:3/977: rmdir d1/d3d/d47/db3/dc2/d28/d7c 39 2026-03-09T19:28:03.220 INFO:tasks.workunit.client.0.vm07.stdout:3/978: creat d1/d3d/d47/db3/f138 x:0 0 0 2026-03-09T19:28:03.220 INFO:tasks.workunit.client.0.vm07.stdout:2/946: getdents d3/dd/d16/d30/d40 0 2026-03-09T19:28:03.222 INFO:tasks.workunit.client.0.vm07.stdout:3/979: creat d1/d6/d45/dac/f139 x:0 0 0 2026-03-09T19:28:03.223 INFO:tasks.workunit.client.0.vm07.stdout:2/947: mknod d3/dd/d16/d29/d3c/da2/c14c 0 2026-03-09T19:28:03.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:02 vm07.local ceph-mon[48545]: Deploying daemon mgr.vm07.xacuym on vm07 2026-03-09T19:28:03.238 INFO:tasks.workunit.client.0.vm07.stdout:4/867: 
write d3/d11/d29/fc4 [229724,22621] 0
2026-03-09T19:28:03.252 INFO:tasks.workunit.client.0.vm07.stdout:4/868: rename d3/d4f/d56/d5f/d125/f11a to d3/d4f/f12e 0
2026-03-09T19:28:03.257 INFO:tasks.workunit.client.0.vm07.stdout:4/869: dwrite d3/d4f/d56/ff2 [4194304,4194304] 0
2026-03-09T19:28:03.259 INFO:tasks.workunit.client.0.vm07.stdout:0/850: dwrite d0/d6/d13/d17/d19/f34 [0,4194304] 0
2026-03-09T19:28:03.263 INFO:tasks.workunit.client.0.vm07.stdout:8/906: write d7/d30/f61 [1673093,111601] 0
2026-03-09T19:28:03.266 INFO:tasks.workunit.client.0.vm07.stdout:1/881: dwrite d1/db/f9b [0,4194304] 0
2026-03-09T19:28:03.269 INFO:tasks.workunit.client.0.vm07.stdout:0/851: dread d0/d6/d13/d1c/d50/f60 [0,4194304] 0
2026-03-09T19:28:03.271 INFO:tasks.workunit.client.0.vm07.stdout:0/852: chown d0/d6/d13/d17/d19/d58/de0 280543218 1
2026-03-09T19:28:03.271 INFO:tasks.workunit.client.0.vm07.stdout:5/902: write d3/dd/d26/d2d/d100/fcd [662998,77604] 0
2026-03-09T19:28:03.275 INFO:tasks.workunit.client.0.vm07.stdout:9/913: write d0/db/fb0 [269184,71735] 0
2026-03-09T19:28:03.276 INFO:tasks.workunit.client.0.vm07.stdout:8/907: creat d7/d30/d75/f13a x:0 0 0
2026-03-09T19:28:03.278 INFO:tasks.workunit.client.0.vm07.stdout:2/948: write d3/dd/d103/ddd/ded/f119 [1002961,124990] 0
2026-03-09T19:28:03.279 INFO:tasks.workunit.client.0.vm07.stdout:0/853: dread - d0/d6/dc8/fe2 zero size
2026-03-09T19:28:03.283 INFO:tasks.workunit.client.0.vm07.stdout:9/914: creat d0/d6/d57/d8f/f137 x:0 0 0
2026-03-09T19:28:03.283 INFO:tasks.workunit.client.0.vm07.stdout:8/908: fsync d7/d9/d37/d45/d97/dbc/fc4 0
2026-03-09T19:28:03.287 INFO:tasks.workunit.client.0.vm07.stdout:2/949: symlink d3/dd/d103/ddd/ded/l14d 0
2026-03-09T19:28:03.288 INFO:tasks.workunit.client.0.vm07.stdout:5/903: mkdir d3/d1a/d28/d48/d123 0
2026-03-09T19:28:03.289 INFO:tasks.workunit.client.0.vm07.stdout:9/915: creat d0/d6/d57/d5d/dde/f138 x:0 0 0
2026-03-09T19:28:03.290 INFO:tasks.workunit.client.0.vm07.stdout:8/909: mkdir d7/d9/d37/d45/d97/dbc/d13b 0
2026-03-09T19:28:03.291 INFO:tasks.workunit.client.0.vm07.stdout:1/882: link d1/d11/d37/d3f/d45/c35 d1/db/c123 0
2026-03-09T19:28:03.293 INFO:tasks.workunit.client.0.vm07.stdout:9/916: creat d0/dc1/f139 x:0 0 0
2026-03-09T19:28:03.294 INFO:tasks.workunit.client.0.vm07.stdout:8/910: read - d7/d9/f9e zero size
2026-03-09T19:28:03.294 INFO:tasks.workunit.client.0.vm07.stdout:6/837: dread d0/d1/d28/fc5 [0,4194304] 0
2026-03-09T19:28:03.295 INFO:tasks.workunit.client.0.vm07.stdout:0/854: rename d0/d6/d13/d33/l91 to d0/d6/d13/d1c/l119 0
2026-03-09T19:28:03.298 INFO:tasks.workunit.client.0.vm07.stdout:9/917: fsync d0/d6/d57/d8f/ff0 0
2026-03-09T19:28:03.299 INFO:tasks.workunit.client.0.vm07.stdout:6/838: fdatasync d0/d13/f5f 0
2026-03-09T19:28:03.300 INFO:tasks.workunit.client.0.vm07.stdout:6/839: stat d0/d4e/d7f 0
2026-03-09T19:28:03.300 INFO:tasks.workunit.client.0.vm07.stdout:8/911: mknod d7/d1d/c13c 0
2026-03-09T19:28:03.301 INFO:tasks.workunit.client.0.vm07.stdout:2/950: rename d3/dd/d16/d30/d40/d122 to d3/dd/d16/d29/d3c/da2/dff/d14e 0
2026-03-09T19:28:03.302 INFO:tasks.workunit.client.0.vm07.stdout:1/883: mknod d1/d11/d37/d3f/d6e/d9c/c124 0
2026-03-09T19:28:03.302 INFO:tasks.workunit.client.0.vm07.stdout:9/918: creat d0/d6f/dc3/df8/dfc/f13a x:0 0 0
2026-03-09T19:28:03.303 INFO:tasks.workunit.client.0.vm07.stdout:6/840: fsync d0/d13/ff8 0
2026-03-09T19:28:03.309 INFO:tasks.workunit.client.0.vm07.stdout:0/855: link d0/d6/d13/d17/d19/d57/f5a d0/d6/d13/d1c/d11/d56/f11a 0
2026-03-09T19:28:03.309 INFO:tasks.workunit.client.0.vm07.stdout:9/919: symlink d0/d6/d3a/d94/l13b 0
2026-03-09T19:28:03.309 INFO:tasks.workunit.client.0.vm07.stdout:6/841: read d0/dbf/d95/d31/f89 [7096127,24903] 0
2026-03-09T19:28:03.310 INFO:tasks.workunit.client.0.vm07.stdout:2/951: mknod d3/dd/d16/d29/d2d/d45/d8b/d98/dee/d145/c14f 0
2026-03-09T19:28:03.311 INFO:tasks.workunit.client.0.vm07.stdout:9/920: dread - d0/d6/d57/deb/f133 zero size
2026-03-09T19:28:03.317 INFO:tasks.workunit.client.0.vm07.stdout:0/856: creat d0/d6/d13/d17/dc3/df6/df9/f11b x:0 0 0
2026-03-09T19:28:03.318 INFO:tasks.workunit.client.0.vm07.stdout:9/921: dread d0/db/d29/d2c/d36/f62 [0,4194304] 0
2026-03-09T19:28:03.318 INFO:tasks.workunit.client.0.vm07.stdout:0/857: rmdir d0/d6/d13/d17/d19/d58/dd9/df8 39
2026-03-09T19:28:03.327 INFO:tasks.workunit.client.0.vm07.stdout:1/884: dread d1/db/d31/f64 [4194304,4194304] 0
2026-03-09T19:28:03.328 INFO:tasks.workunit.client.0.vm07.stdout:9/922: rename d0/c93 to d0/d6/d57/d5d/c13c 0
2026-03-09T19:28:03.329 INFO:tasks.workunit.client.0.vm07.stdout:2/952: getdents d3/dd/d16/d29/d2d/d45/d3b/d44/d96 0
2026-03-09T19:28:03.330 INFO:tasks.workunit.client.0.vm07.stdout:2/953: write d3/d49/faf [683636,1846] 0
2026-03-09T19:28:03.331 INFO:tasks.workunit.client.0.vm07.stdout:2/954: chown d3/dd/d16/d29/d3c/d5a/d7a/d74/dc9/fd3 3224805 1
2026-03-09T19:28:03.331 INFO:tasks.workunit.client.0.vm07.stdout:1/885: read d1/d11/d37/d3f/fd7 [2171798,38757] 0
2026-03-09T19:28:03.335 INFO:tasks.workunit.client.0.vm07.stdout:0/858: rename d0/d6/d13/d1c/d11/d56/f113 to d0/d6/d13/d17/f11c 0
2026-03-09T19:28:03.337 INFO:tasks.workunit.client.0.vm07.stdout:0/859: creat d0/d6/dc8/d99/f11d x:0 0 0
2026-03-09T19:28:03.341 INFO:tasks.workunit.client.0.vm07.stdout:1/886: rmdir d1/d11/d37/d3f/d45/d87/d89/d113 0
2026-03-09T19:28:03.341 INFO:tasks.workunit.client.0.vm07.stdout:3/980: dread d1/d6/d4c/d97/fc0 [0,4194304] 0
2026-03-09T19:28:03.342 INFO:tasks.workunit.client.0.vm07.stdout:0/860: link d0/d6/d13/d1c/d50/f73 d0/d6/f11e 0
2026-03-09T19:28:03.346 INFO:tasks.workunit.client.0.vm07.stdout:3/981: chown d1/d3d/d47/db3/d87/la4 3 1
2026-03-09T19:28:03.349 INFO:tasks.workunit.client.0.vm07.stdout:7/820: dread d0/d80/db1/de5/d54/d55/f6d [0,4194304] 0
2026-03-09T19:28:03.349 INFO:tasks.workunit.client.0.vm07.stdout:1/887: link d1/d11/d37/f40 d1/d11/d37/dcb/f125 0
2026-03-09T19:28:03.355 INFO:tasks.workunit.client.0.vm07.stdout:0/861: truncate d0/d6/d13/d1c/d50/f73 818513 0
2026-03-09T19:28:03.356 INFO:tasks.workunit.client.0.vm07.stdout:7/821: symlink d0/d4/d5/d8/d1a/d2a/l10f 0
2026-03-09T19:28:03.356 INFO:tasks.workunit.client.0.vm07.stdout:0/862: stat d0/d6/d13/l46 0
2026-03-09T19:28:03.369 INFO:tasks.workunit.client.0.vm07.stdout:3/982: dread d1/d3d/d47/f62 [0,4194304] 0
2026-03-09T19:28:03.370 INFO:tasks.workunit.client.0.vm07.stdout:7/822: read d0/d80/db1/de5/f62 [1317902,94070] 0
2026-03-09T19:28:03.370 INFO:tasks.workunit.client.0.vm07.stdout:7/823: readlink d0/d4/d5/d8/d41/d64/l68 0
2026-03-09T19:28:03.381 INFO:tasks.workunit.client.0.vm07.stdout:4/870: dread d3/d11/d2b/d38/ddc/d22/d86/f8c [0,4194304] 0
2026-03-09T19:28:03.382 INFO:tasks.workunit.client.0.vm07.stdout:9/923: rmdir d0/d6/d57/d8f 39
2026-03-09T19:28:03.383 INFO:tasks.workunit.client.0.vm07.stdout:7/824: dwrite d0/d4/d5/d26/d32/fa6 [0,4194304] 0
2026-03-09T19:28:03.386 INFO:tasks.workunit.client.0.vm07.stdout:7/825: stat d0/d4/d5/d8/d41/d64/fee 0
2026-03-09T19:28:03.389 INFO:tasks.workunit.client.0.vm07.stdout:3/983: dwrite d1/d6/d45/d54/f116 [0,4194304] 0
2026-03-09T19:28:03.390 INFO:tasks.workunit.client.0.vm07.stdout:3/984: readlink d1/d6/d4c/le3 0
2026-03-09T19:28:03.397 INFO:tasks.workunit.client.0.vm07.stdout:4/871: mknod d3/d4f/d56/d5f/c12f 0
2026-03-09T19:28:03.400 INFO:tasks.workunit.client.0.vm07.stdout:9/924: stat d0/db/d29/d2c/d36/cbd 0
2026-03-09T19:28:03.402 INFO:tasks.workunit.client.0.vm07.stdout:7/826: mknod d0/d4/d5/d8/d41/d64/d74/c110 0
2026-03-09T19:28:03.407 INFO:tasks.workunit.client.0.vm07.stdout:9/925: dwrite d0/db/d29/d2c/f54 [4194304,4194304] 0
2026-03-09T19:28:03.416 INFO:tasks.workunit.client.0.vm07.stdout:4/872: sync
2026-03-09T19:28:03.420 INFO:tasks.workunit.client.0.vm07.stdout:4/873: creat d3/d11/d29/d101/d93/f130 x:0 0 0
2026-03-09T19:28:03.421 INFO:tasks.workunit.client.0.vm07.stdout:4/874: chown d3/dfc/c105 223896513 1
2026-03-09T19:28:03.422 INFO:tasks.workunit.client.0.vm07.stdout:4/875: write d3/d11/d2b/d37/f120 [701692,1133] 0
2026-03-09T19:28:03.426 INFO:tasks.workunit.client.0.vm07.stdout:4/876: symlink d3/d11/d2b/d38/ddc/d22/d86/l131 0
2026-03-09T19:28:03.437 INFO:tasks.workunit.client.0.vm07.stdout:4/877: creat d3/d11/d2b/d38/ddc/db2/d127/f132 x:0 0 0
2026-03-09T19:28:03.437 INFO:tasks.workunit.client.0.vm07.stdout:4/878: fsync d3/d11/d29/d101/d93/f9c 0
2026-03-09T19:28:03.437 INFO:tasks.workunit.client.0.vm07.stdout:4/879: dread - d3/d11/d51/fcb zero size
2026-03-09T19:28:03.437 INFO:tasks.workunit.client.0.vm07.stdout:4/880: unlink d3/d11/d2b/d38/ddc/f60 0
2026-03-09T19:28:03.437 INFO:tasks.workunit.client.0.vm07.stdout:4/881: mkdir d3/d4f/d56/d5f/d125/d133 0
2026-03-09T19:28:03.440 INFO:tasks.workunit.client.0.vm07.stdout:4/882: rename d3/d11/d2b/d38/ddc/db2/f102 to d3/d11/d29/d101/d99/de7/f134 0
2026-03-09T19:28:03.470 INFO:tasks.workunit.client.0.vm07.stdout:8/912: write d7/d9/d10/dd8/dfd/f7a [35985,98667] 0
2026-03-09T19:28:03.471 INFO:tasks.workunit.client.0.vm07.stdout:8/913: readlink d7/d30/l8c 0
2026-03-09T19:28:03.472 INFO:tasks.workunit.client.0.vm07.stdout:5/904: dwrite d3/dd/d26/d3f/d47/d71/fd3 [0,4194304] 0
2026-03-09T19:28:03.478 INFO:tasks.workunit.client.0.vm07.stdout:8/914: creat d7/d9/d10/f13d x:0 0 0
2026-03-09T19:28:03.481 INFO:tasks.workunit.client.0.vm07.stdout:5/905: mkdir d3/d1a/d28/d40/d92/d89/ddc/dde/dfb/d124 0
2026-03-09T19:28:03.481 INFO:tasks.workunit.client.0.vm07.stdout:6/842: dwrite d0/fe [0,4194304] 0
2026-03-09T19:28:03.483 INFO:tasks.workunit.client.0.vm07.stdout:8/915: creat d7/d9/d37/d45/d4f/f13e x:0 0 0
2026-03-09T19:28:03.485 INFO:tasks.workunit.client.0.vm07.stdout:5/906: mknod d3/d1a/d28/d40/d92/d89/ddc/dde/c125 0
2026-03-09T19:28:03.485 INFO:tasks.workunit.client.0.vm07.stdout:8/916: chown d7/d9/d37/d45/d4f/c78 121934907 1
2026-03-09T19:28:03.493 INFO:tasks.workunit.client.0.vm07.stdout:6/843: dread d0/d1/db/d1d/f3e [0,4194304] 0
2026-03-09T19:28:03.496 INFO:tasks.workunit.client.0.vm07.stdout:8/917: dwrite d7/d1d/d83/d9f/fc6 [0,4194304] 0
2026-03-09T19:28:03.502 INFO:tasks.workunit.client.0.vm07.stdout:1/888: dwrite d1/d11/f42 [0,4194304] 0
2026-03-09T19:28:03.506 INFO:tasks.workunit.client.0.vm07.stdout:2/955: dwrite d3/dd/d16/d29/d2d/d45/d85/fa5 [4194304,4194304] 0
2026-03-09T19:28:03.519 INFO:tasks.workunit.client.0.vm07.stdout:0/863: dwrite d0/d6/d13/fbf [0,4194304] 0
2026-03-09T19:28:03.521 INFO:tasks.workunit.client.0.vm07.stdout:2/956: mkdir d3/dd/d16/d30/da7/d150 0
2026-03-09T19:28:03.529 INFO:tasks.workunit.client.0.vm07.stdout:1/889: mknod d1/d11/d37/d3f/d45/d87/d89/dec/c126 0
2026-03-09T19:28:03.533 INFO:tasks.workunit.client.0.vm07.stdout:0/864: creat d0/d6/d13/d1c/d61/d69/f11f x:0 0 0
2026-03-09T19:28:03.535 INFO:tasks.workunit.client.0.vm07.stdout:2/957: dread d3/f10c [0,4194304] 0
2026-03-09T19:28:03.536 INFO:tasks.workunit.client.0.vm07.stdout:3/985: write d1/d3d/d47/f9e [365972,1286] 0
2026-03-09T19:28:03.537 INFO:tasks.workunit.client.0.vm07.stdout:1/890: creat d1/d3e/f127 x:0 0 0
2026-03-09T19:28:03.538 INFO:tasks.workunit.client.0.vm07.stdout:1/891: write d1/d11/d37/d3f/d6e/d9c/faa [16072,62384] 0
2026-03-09T19:28:03.538 INFO:tasks.workunit.client.0.vm07.stdout:2/958: mknod d3/dd/daa/c151 0
2026-03-09T19:28:03.539 INFO:tasks.workunit.client.0.vm07.stdout:3/986: mkdir d1/d6/d71/d13a 0
2026-03-09T19:28:03.541 INFO:tasks.workunit.client.0.vm07.stdout:0/865: getdents d0/d6/d13/d17/d19 0
2026-03-09T19:28:03.547 INFO:tasks.workunit.client.0.vm07.stdout:7/827: dwrite d0/d4/d5/f109 [0,4194304] 0
2026-03-09T19:28:03.551 INFO:tasks.workunit.client.0.vm07.stdout:9/926: dwrite d0/f4 [4194304,4194304] 0
2026-03-09T19:28:03.551 INFO:tasks.workunit.client.0.vm07.stdout:1/892: rmdir d1/d11/d37/d3f 39
2026-03-09T19:28:03.551 INFO:tasks.workunit.client.0.vm07.stdout:3/987: unlink d1/d6/d45/d54/f116 0
2026-03-09T19:28:03.555 INFO:tasks.workunit.client.0.vm07.stdout:4/883: write d3/d11/d51/fcb [910312,52551] 0
2026-03-09T19:28:03.558 INFO:tasks.workunit.client.0.vm07.stdout:2/959: mknod d3/c152 0
2026-03-09T19:28:03.565 INFO:tasks.workunit.client.0.vm07.stdout:4/884: creat d3/d11/d16/de1/f135 x:0 0 0
2026-03-09T19:28:03.567 INFO:tasks.workunit.client.0.vm07.stdout:3/988: dwrite d1/f136 [0,4194304] 0
2026-03-09T19:28:03.570 INFO:tasks.workunit.client.0.vm07.stdout:3/989: stat d1/d3d/d47/db3/d8e/da9/f82 0
2026-03-09T19:28:03.570 INFO:tasks.workunit.client.0.vm07.stdout:0/866: dread d0/d6/d13/d17/dc3/fb6 [0,4194304] 0
2026-03-09T19:28:03.583 INFO:tasks.workunit.client.0.vm07.stdout:2/960: dwrite d3/dd/d16/d29/f58 [4194304,4194304] 0
2026-03-09T19:28:03.585 INFO:tasks.workunit.client.0.vm07.stdout:2/961: dread - d3/dd/d16/d29/d3c/d5a/f138 zero size
2026-03-09T19:28:03.593 INFO:tasks.workunit.client.0.vm07.stdout:4/885: truncate d3/dbe/ff3 379683 0
2026-03-09T19:28:03.597 INFO:tasks.workunit.client.0.vm07.stdout:0/867: creat d0/d6/d13/d17/d118/f120 x:0 0 0
2026-03-09T19:28:03.597 INFO:tasks.workunit.client.0.vm07.stdout:0/868: chown d0/d6/c55 1863565 1
2026-03-09T19:28:03.598 INFO:tasks.workunit.client.0.vm07.stdout:3/990: mknod d1/d6/c13b 0
2026-03-09T19:28:03.599 INFO:tasks.workunit.client.0.vm07.stdout:4/886: truncate d3/d11/d2b/f98 349922 0
2026-03-09T19:28:03.600 INFO:tasks.workunit.client.0.vm07.stdout:7/828: getdents d0/d80/db1/de5/d54/dc4 0
2026-03-09T19:28:03.605 INFO:tasks.workunit.client.0.vm07.stdout:3/991: rmdir d1/d3d/d47/db3/d8e/da9 39
2026-03-09T19:28:03.606 INFO:tasks.workunit.client.0.vm07.stdout:4/887: fsync d3/d11/d2b/d38/ddc/fb4 0
2026-03-09T19:28:03.608 INFO:tasks.workunit.client.0.vm07.stdout:3/992: mkdir d1/d1f/d13c 0
2026-03-09T19:28:03.608 INFO:tasks.workunit.client.0.vm07.stdout:4/888: write d3/d4f/d56/f124 [585098,87935] 0
2026-03-09T19:28:03.609 INFO:tasks.workunit.client.0.vm07.stdout:3/993: write d1/d6/d4c/dfa/f120 [944988,72942] 0
2026-03-09T19:28:03.609 INFO:tasks.workunit.client.0.vm07.stdout:4/889: write d3/d11/d51/f121 [409772,124578] 0
2026-03-09T19:28:03.610 INFO:tasks.workunit.client.0.vm07.stdout:4/890: chown d3/d11/d2b 4 1
2026-03-09T19:28:03.614 INFO:tasks.workunit.client.0.vm07.stdout:3/994: creat d1/d6/d71/d101/f13d x:0 0 0
2026-03-09T19:28:03.614 INFO:tasks.workunit.client.0.vm07.stdout:4/891: creat d3/d11/d2b/d38/d8f/f136 x:0 0 0
2026-03-09T19:28:03.615 INFO:tasks.workunit.client.0.vm07.stdout:4/892: readlink d3/d11/lf8 0
2026-03-09T19:28:03.615 INFO:tasks.workunit.client.0.vm07.stdout:3/995: symlink d1/d3d/l13e 0
2026-03-09T19:28:03.621 INFO:tasks.workunit.client.0.vm07.stdout:3/996: truncate d1/d3d/d47/db3/faf 109723 0
2026-03-09T19:28:03.678 INFO:tasks.workunit.client.0.vm07.stdout:5/907: rename d3/d1a/d28/d40 to d3/dd/d26/d3f/d47/d71/d76/d98/d126 0
2026-03-09T19:28:03.681 INFO:tasks.workunit.client.0.vm07.stdout:7/829: dread d0/d4/d5/d8/d1a/f4d [0,4194304] 0
2026-03-09T19:28:03.681 INFO:tasks.workunit.client.0.vm07.stdout:6/844: rename d0/d1/db/d1d/c46 to d0/d4e/d75/c139 0
2026-03-09T19:28:03.682 INFO:tasks.workunit.client.0.vm07.stdout:6/845: write d0/d44/fe2 [2554063,92631] 0
2026-03-09T19:28:03.691 INFO:tasks.workunit.client.0.vm07.stdout:6/846: symlink d0/d1/db/d52/d12b/l13a 0
2026-03-09T19:28:03.693 INFO:tasks.workunit.client.0.vm07.stdout:6/847: unlink d0/d1/db/d1d/d77/ff0 0
2026-03-09T19:28:03.694 INFO:tasks.workunit.client.0.vm07.stdout:6/848: fsync d0/d1/db/d91/f126 0
2026-03-09T19:28:03.699 INFO:tasks.workunit.client.0.vm07.stdout:6/849: creat d0/d1/db/d52/d94/d87/f13b x:0 0 0
2026-03-09T19:28:03.711 INFO:tasks.workunit.client.0.vm07.stdout:2/962: dread d3/d11/f2e [0,4194304] 0
2026-03-09T19:28:03.713 INFO:tasks.workunit.client.0.vm07.stdout:8/918: dwrite d7/d9/d37/d45/d97/dbc/fc4 [0,4194304] 0
2026-03-09T19:28:03.718 INFO:tasks.workunit.client.0.vm07.stdout:8/919: mknod d7/d1d/c13f 0
2026-03-09T19:28:03.730 INFO:tasks.workunit.client.0.vm07.stdout:8/920: rename d7/d16/dcf/f127 to d7/d9/d10/dd8/dfc/f140 0
2026-03-09T19:28:03.732 INFO:tasks.workunit.client.0.vm07.stdout:9/927: dwrite d0/db/f39 [0,4194304] 0
2026-03-09T19:28:03.734 INFO:tasks.workunit.client.0.vm07.stdout:2/963: sync
2026-03-09T19:28:03.735 INFO:tasks.workunit.client.0.vm07.stdout:8/921: truncate d7/d30/d32/f121 373581 0
2026-03-09T19:28:03.741 INFO:tasks.workunit.client.0.vm07.stdout:2/964: stat d3/dd/d16/d29/d2d/d45/d8b/d98/dee/cf9 0
2026-03-09T19:28:03.741 INFO:tasks.workunit.client.0.vm07.stdout:1/893: dwrite d1/d11/d37/d3f/d6e/f9f [0,4194304] 0
2026-03-09T19:28:03.748 INFO:tasks.workunit.client.0.vm07.stdout:1/894: stat d1/d11/d37/d5d/dc1/lc6 0
2026-03-09T19:28:03.751 INFO:tasks.workunit.client.0.vm07.stdout:1/895: sync
2026-03-09T19:28:03.758 INFO:tasks.workunit.client.0.vm07.stdout:2/965: mknod d3/dd/d16/d29/d2d/d45/d85/d8a/c153 0
2026-03-09T19:28:03.758 INFO:tasks.workunit.client.0.vm07.stdout:0/869: dwrite d0/d6/d13/d17/d19/d58/f77 [0,4194304] 0
2026-03-09T19:28:03.765 INFO:tasks.workunit.client.0.vm07.stdout:8/922: creat d7/d16/d1e/f141 x:0 0 0
2026-03-09T19:28:03.765 INFO:tasks.workunit.client.0.vm07.stdout:8/923: chown d7/d50 220339417 1
2026-03-09T19:28:03.768 INFO:tasks.workunit.client.0.vm07.stdout:9/928: dread d0/d17/f42 [0,4194304] 0
2026-03-09T19:28:03.768 INFO:tasks.workunit.client.0.vm07.stdout:1/896: dread - d1/d11/d37/d5d/dc1/fd6 zero size
2026-03-09T19:28:03.772 INFO:tasks.workunit.client.0.vm07.stdout:2/966: mknod d3/dd/d16/d29/d2d/c154 0
2026-03-09T19:28:03.775 INFO:tasks.workunit.client.0.vm07.stdout:2/967: write d3/dd/d16/f147 [531808,60579] 0
2026-03-09T19:28:03.775 INFO:tasks.workunit.client.0.vm07.stdout:8/924: creat d7/d1d/d83/d9f/dd2/f142 x:0 0 0
2026-03-09T19:28:03.775 INFO:tasks.workunit.client.0.vm07.stdout:1/897: mknod d1/d3/d21/c128 0
2026-03-09T19:28:03.783 INFO:tasks.workunit.client.0.vm07.stdout:0/870: dread d0/f41 [0,4194304] 0
2026-03-09T19:28:03.785 INFO:tasks.workunit.client.0.vm07.stdout:9/929: dwrite d0/d6f/d12e/f10f [0,4194304] 0
2026-03-09T19:28:03.797 INFO:tasks.workunit.client.0.vm07.stdout:2/968: creat d3/dd/d103/ddd/ded/df3/d112/f155 x:0 0 0
2026-03-09T19:28:03.797 INFO:tasks.workunit.client.0.vm07.stdout:3/997: write d1/d6/d45/d54/dd1/f114 [910605,1268] 0
2026-03-09T19:28:03.800 INFO:tasks.workunit.client.0.vm07.stdout:8/925: dread d7/d50/f82 [0,4194304] 0
2026-03-09T19:28:03.806 INFO:tasks.workunit.client.0.vm07.stdout:1/898: stat d1/d11/d37/d3f/f4a 0
2026-03-09T19:28:03.809 INFO:tasks.workunit.client.0.vm07.stdout:5/908: write d3/dd/d26/d3f/fb3 [1762728,86138] 0
2026-03-09T19:28:03.814 INFO:tasks.workunit.client.0.vm07.stdout:4/893: dwrite d3/d4f/d56/d5f/fc3 [0,4194304] 0
2026-03-09T19:28:03.822 INFO:tasks.workunit.client.0.vm07.stdout:5/909: stat d3/d1a/d28/d6c/ccc 0
2026-03-09T19:28:03.827 INFO:tasks.workunit.client.0.vm07.stdout:7/830: write d0/d80/db1/de5/d54/f7d [4965439,108580] 0
2026-03-09T19:28:03.828 INFO:tasks.workunit.client.0.vm07.stdout:1/899: symlink d1/d11/d37/d3f/d45/l129 0
2026-03-09T19:28:03.829 INFO:tasks.workunit.client.0.vm07.stdout:3/998: chown d1/d6/d45/d54/dd1 1301 1
2026-03-09T19:28:03.830 INFO:tasks.workunit.client.0.vm07.stdout:4/894: mkdir d3/d11/d2b/d38/d107/d137 0
2026-03-09T19:28:03.832 INFO:tasks.workunit.client.0.vm07.stdout:7/831: dread d0/d4/d5/d26/d32/f45 [0,4194304] 0
2026-03-09T19:28:03.832 INFO:tasks.workunit.client.0.vm07.stdout:7/832: chown d0/d4/d5/d8/d41/d64 1478520191 1
2026-03-09T19:28:03.833 INFO:tasks.workunit.client.0.vm07.stdout:9/930: dwrite d0/db/d29/d2c/f4a [0,4194304] 0
2026-03-09T19:28:03.834 INFO:tasks.workunit.client.0.vm07.stdout:4/895: fsync d3/d4f/d56/d5f/d88/dd0/f114 0
2026-03-09T19:28:03.847 INFO:tasks.workunit.client.0.vm07.stdout:7/833: readlink d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/l10d 0
2026-03-09T19:28:03.847 INFO:tasks.workunit.client.0.vm07.stdout:7/834: write d0/d4/d5/f36 [2026086,97665] 0
2026-03-09T19:28:03.847 INFO:tasks.workunit.client.0.vm07.stdout:4/896: mknod d3/d4f/d56/d5f/d88/c138 0
2026-03-09T19:28:03.847 INFO:tasks.workunit.client.0.vm07.stdout:7/835: mkdir d0/d4/d5/d26/d32/dbd/d111 0
2026-03-09T19:28:03.848 INFO:tasks.workunit.client.0.vm07.stdout:6/850: dwrite d0/d1/d28/da9/f116 [0,4194304] 0
2026-03-09T19:28:03.848 INFO:tasks.workunit.client.0.vm07.stdout:7/836: dread - d0/d80/db1/de5/d54/d95/fcf zero size
2026-03-09T19:28:03.849 INFO:tasks.workunit.client.0.vm07.stdout:6/851: write d0/d4e/d7f/fe9 [470569,23648] 0
2026-03-09T19:28:03.852 INFO:tasks.workunit.client.0.vm07.stdout:9/931: sync
2026-03-09T19:28:03.856 INFO:tasks.workunit.client.0.vm07.stdout:5/910: mknod d3/dd/d26/d3f/d47/d71/d76/d98/c127 0
2026-03-09T19:28:03.861 INFO:tasks.workunit.client.0.vm07.stdout:3/999: truncate d1/d3d/d47/db3/dc2/f1e 199818 0
2026-03-09T19:28:03.863 INFO:tasks.workunit.client.0.vm07.stdout:7/837: dread d0/d4/d5/d26/d32/fa6 [0,4194304] 0
2026-03-09T19:28:03.863 INFO:tasks.workunit.client.0.vm07.stdout:1/900: creat d1/d3e/db3/f12a x:0 0 0
2026-03-09T19:28:03.871 INFO:tasks.workunit.client.0.vm07.stdout:5/911: symlink d3/dd/d26/d3f/d47/d71/d76/d98/d126/d92/d89/ddc/dde/l128 0
2026-03-09T19:28:03.871 INFO:tasks.workunit.client.0.vm07.stdout:7/838: symlink d0/d4/d5/d26/dfb/l112 0
2026-03-09T19:28:03.872 INFO:tasks.workunit.client.0.vm07.stdout:4/897: truncate d3/d11/d2b/d38/ddc/d91/fbb 654645 0
2026-03-09T19:28:03.875 INFO:tasks.workunit.client.0.vm07.stdout:5/912: fsync d3/d1a/d28/d6c/f7a 0
2026-03-09T19:28:03.879 INFO:tasks.workunit.client.0.vm07.stdout:4/898: mknod d3/d4f/d56/d5f/d125/dfe/c139 0
2026-03-09T19:28:03.881 INFO:tasks.workunit.client.0.vm07.stdout:1/901: dread d1/db/f9b [0,4194304] 0
2026-03-09T19:28:03.888 INFO:tasks.workunit.client.0.vm07.stdout:5/913: truncate d3/d1a/d28/d48/f11c 3395496 0
2026-03-09T19:28:03.891 INFO:tasks.workunit.client.0.vm07.stdout:5/914: mkdir d3/d1a/d28/d6c/d72/db5/d129 0
2026-03-09T19:28:03.899 INFO:tasks.workunit.client.0.vm07.stdout:5/915: mkdir d3/dd/d26/d3f/d47/d71/d76/d98/d126/d92/d89/ddc/dde/dfb/d124/d12a 0
2026-03-09T19:28:03.901 INFO:tasks.workunit.client.0.vm07.stdout:8/926: rename d7/d9/d10/dd8/dfc/f140 to d7/d9/d37/d45/d97/dbc/f143 0
2026-03-09T19:28:03.903 INFO:tasks.workunit.client.0.vm07.stdout:6/852: rename d0/d4e/d7f/cc1 to d0/d1/db/d24/c13c 0
2026-03-09T19:28:03.904 INFO:tasks.workunit.client.0.vm07.stdout:5/916: getdents d3/dd/d26/d2d 0
2026-03-09T19:28:03.910 INFO:tasks.workunit.client.0.vm07.stdout:5/917: truncate d3/d1a/d28/d48/f103 9735258 0
2026-03-09T19:28:03.911 INFO:tasks.workunit.client.0.vm07.stdout:6/853: truncate d0/d2d/dd5/d123/f7c 1252859 0
2026-03-09T19:28:03.912 INFO:tasks.workunit.client.0.vm07.stdout:8/927: rename d7/d9/d10/dd8/dfd/d67/cb6 to d7/c144 0
2026-03-09T19:28:03.912 INFO:tasks.workunit.client.0.vm07.stdout:5/918: unlink d3/dd/l45 0
2026-03-09T19:28:03.913 INFO:tasks.workunit.client.0.vm07.stdout:8/928: chown d7/d9/d10/dd8/l120 56314649 1
2026-03-09T19:28:03.915 INFO:tasks.workunit.client.0.vm07.stdout:8/929: chown d7/d9/d57/lad 11 1
2026-03-09T19:28:03.920 INFO:tasks.workunit.client.0.vm07.stdout:5/919: creat d3/dd/d26/d3f/d47/d71/d76/d98/d126/d92/d89/f12b x:0 0 0
2026-03-09T19:28:03.923 INFO:tasks.workunit.client.0.vm07.stdout:5/920: mkdir d3/dd/d26/d3f/d47/d71/db7/d12c 0
2026-03-09T19:28:03.924 INFO:tasks.workunit.client.0.vm07.stdout:5/921: symlink d3/d1a/d5d/l12d 0
2026-03-09T19:28:03.927 INFO:tasks.workunit.client.0.vm07.stdout:5/922: truncate d3/dd/d26/d3f/d47/d71/d76/d98/d126/fa5 1092902 0
2026-03-09T19:28:03.931 INFO:tasks.workunit.client.0.vm07.stdout:9/932: dread d0/db/d29/da8/fab [0,4194304] 0
2026-03-09T19:28:03.933 INFO:tasks.workunit.client.0.vm07.stdout:9/933: getdents d0/d6/d57 0
2026-03-09T19:28:03.933 INFO:tasks.workunit.client.0.vm07.stdout:9/934: read d0/d6/d57/d8f/f9f [184485,88893] 0
2026-03-09T19:28:03.934 INFO:tasks.workunit.client.0.vm07.stdout:9/935: symlink d0/db/d29/da8/l13d 0
2026-03-09T19:28:03.973 INFO:tasks.workunit.client.0.vm07.stdout:2/969: write d3/dd/d16/d29/d2d/d45/dc3/fb8 [936157,102556] 0
2026-03-09T19:28:03.973 INFO:tasks.workunit.client.0.vm07.stdout:0/871: write d0/d6/d13/d1c/f3e [19345,82918] 0
2026-03-09T19:28:03.974 INFO:tasks.workunit.client.0.vm07.stdout:2/970: readlink d3/dd/d16/d29/d3c/d4c/l118 0
2026-03-09T19:28:03.980 INFO:tasks.workunit.client.0.vm07.stdout:2/971: creat d3/dd/d16/d30/da7/f156 x:0 0 0
2026-03-09T19:28:04.001 INFO:tasks.workunit.client.0.vm07.stdout:4/899: write d3/d11/d2b/d38/ddc/fb0 [219804,28364] 0
2026-03-09T19:28:04.002 INFO:tasks.workunit.client.0.vm07.stdout:1/902: write d1/d11/d37/d3f/d45/d87/d88/fd5 [1625032,105553] 0
2026-03-09T19:28:04.002 INFO:tasks.workunit.client.0.vm07.stdout:1/903: readlink d1/d11/l19 0
2026-03-09T19:28:04.005 INFO:tasks.workunit.client.0.vm07.stdout:7/839: dwrite d0/d4/f86 [0,4194304] 0
2026-03-09T19:28:04.007 INFO:tasks.workunit.client.0.vm07.stdout:7/840: dread d0/d4/f86 [0,4194304] 0
2026-03-09T19:28:04.015 INFO:tasks.workunit.client.0.vm07.stdout:4/900: truncate d3/d11/d2b/f69 293002 0
2026-03-09T19:28:04.015 INFO:tasks.workunit.client.0.vm07.stdout:7/841: stat d0/d80/f81 0
2026-03-09T19:28:04.015 INFO:tasks.workunit.client.0.vm07.stdout:8/930: write d7/d9/d10/f20 [951478,98335] 0
2026-03-09T19:28:04.016 INFO:tasks.workunit.client.0.vm07.stdout:6/854: write d0/dbf/fd8 [4263950,81438] 0
2026-03-09T19:28:04.016 INFO:tasks.workunit.client.0.vm07.stdout:5/923: write d3/dd/f24 [2071917,84485] 0
2026-03-09T19:28:04.017 INFO:tasks.workunit.client.0.vm07.stdout:1/904: creat d1/d11/d37/d5d/dc1/d107/f12b x:0 0 0
2026-03-09T19:28:04.019 INFO:tasks.workunit.client.0.vm07.stdout:1/905: read d1/d11/f42 [3413296,121962] 0
2026-03-09T19:28:04.019 INFO:tasks.workunit.client.0.vm07.stdout:4/901: chown d3/d11/d2b/d38/f8a 17 1
2026-03-09T19:28:04.030 INFO:tasks.workunit.client.0.vm07.stdout:8/931: sync
2026-03-09T19:28:04.033 INFO:tasks.workunit.client.0.vm07.stdout:9/936: dwrite d0/d6f/fdb [0,4194304] 0
2026-03-09T19:28:04.037 INFO:tasks.workunit.client.0.vm07.stdout:1/906: mkdir d1/d3e/dc8/d12c 0
2026-03-09T19:28:04.040 INFO:tasks.workunit.client.0.vm07.stdout:6/855: mkdir d0/dbf/dd9/d13d 0
2026-03-09T19:28:04.040 INFO:tasks.workunit.client.0.vm07.stdout:0/872: dwrite d0/d6/d13/d1c/d61/f63 [0,4194304] 0
2026-03-09T19:28:04.041 INFO:tasks.workunit.client.0.vm07.stdout:6/856: sync
2026-03-09T19:28:04.056 INFO:tasks.workunit.client.0.vm07.stdout:6/857: symlink d0/d2d/dd5/l13e 0
2026-03-09T19:28:04.056 INFO:tasks.workunit.client.0.vm07.stdout:4/902: dwrite d3/d11/d29/fc4 [0,4194304] 0
2026-03-09T19:28:04.062 INFO:tasks.workunit.client.0.vm07.stdout:8/932: rename d7/d9/d10/dd8/l120 to d7/d50/l145 0
2026-03-09T19:28:04.069 INFO:tasks.workunit.client.0.vm07.stdout:5/924: dread d3/f121 [0,4194304] 0
2026-03-09T19:28:04.069 INFO:tasks.workunit.client.0.vm07.stdout:1/907: mkdir d1/d91/d117/d12d 0
2026-03-09T19:28:04.069 INFO:tasks.workunit.client.0.vm07.stdout:5/925: fsync f2 0
2026-03-09T19:28:04.069 INFO:tasks.workunit.client.0.vm07.stdout:6/858: truncate d0/d44/dd3/f106 1221291 0
2026-03-09T19:28:04.069 INFO:tasks.workunit.client.0.vm07.stdout:1/908: dread - d1/d11/d37/ff2 zero size
2026-03-09T19:28:04.070 INFO:tasks.workunit.client.0.vm07.stdout:8/933: mkdir d7/d9/d10/dd8/dfd/d67/de7/d146 0
2026-03-09T19:28:04.082 INFO:tasks.workunit.client.0.vm07.stdout:8/934: mknod d7/d9/d10/d44/c147 0
2026-03-09T19:28:04.082 INFO:tasks.workunit.client.0.vm07.stdout:2/972: dwrite d3/d11/f31 [4194304,4194304] 0
2026-03-09T19:28:04.087 INFO:tasks.workunit.client.0.vm07.stdout:2/973: write d3/dd/d16/d30/da7/f156 [521550,48669] 0
2026-03-09T19:28:04.093 INFO:tasks.workunit.client.0.vm07.stdout:4/903: getdents d3/d4f/d56/d11e 0
2026-03-09T19:28:04.096 INFO:tasks.workunit.client.0.vm07.stdout:6/859: unlink d0/d1/db/d52/c66 0
2026-03-09T19:28:04.101 INFO:tasks.workunit.client.0.vm07.stdout:8/935: fdatasync d7/d9/fd 0
2026-03-09T19:28:04.104 INFO:tasks.workunit.client.0.vm07.stdout:5/926: truncate d3/dd/d26/d2d/d60/fc5 4211379 0
2026-03-09T19:28:04.106 INFO:tasks.workunit.client.0.vm07.stdout:2/974: creat d3/d11/d38/f157 x:0 0 0
2026-03-09T19:28:04.108 INFO:tasks.workunit.client.0.vm07.stdout:4/904: fdatasync d3/fc 0
2026-03-09T19:28:04.108 INFO:tasks.workunit.client.0.vm07.stdout:2/975: truncate d3/dd/d16/d29/d3c/d4c/f149 470221 0
2026-03-09T19:28:04.110 INFO:tasks.workunit.client.0.vm07.stdout:7/842: write d0/d80/db1/de5/d54/fa4 [447777,37120] 0
2026-03-09T19:28:04.115 INFO:tasks.workunit.client.0.vm07.stdout:0/873: write d0/d6/d13/d17/dc3/fb3 [3537427,79740] 0
2026-03-09T19:28:04.115 INFO:tasks.workunit.client.0.vm07.stdout:0/874: chown d0/d6/d13/d17/c26 127988 1
2026-03-09T19:28:04.116 INFO:tasks.workunit.client.0.vm07.stdout:0/875: write d0/d6/d13/d17/d19/d57/d6a/f74 [470521,49385] 0
2026-03-09T19:28:04.120 INFO:tasks.workunit.client.0.vm07.stdout:4/905: dwrite d3/d4f/d56/ff2 [8388608,4194304] 0
2026-03-09T19:28:04.129 INFO:tasks.workunit.client.0.vm07.stdout:5/927: mkdir d3/dd/d26/d3f/d47/d71/dfa/d12e 0
2026-03-09T19:28:04.130 INFO:tasks.workunit.client.0.vm07.stdout:4/906: dwrite d3/d11/d2b/d37/f120 [0,4194304] 0
2026-03-09T19:28:04.130 INFO:tasks.workunit.client.0.vm07.stdout:6/860: creat d0/d2d/dd5/d123/d7b/da0/d11e/f13f x:0 0 0
2026-03-09T19:28:04.133 INFO:tasks.workunit.client.0.vm07.stdout:2/976: rename d3/dd/d16/d29/d3c to d3/dd/d158 0
2026-03-09T19:28:04.133 INFO:tasks.workunit.client.0.vm07.stdout:6/861: chown d0/d4e/d7f 335 1
2026-03-09T19:28:04.133 INFO:tasks.workunit.client.0.vm07.stdout:0/876: unlink d0/d6/d13/d1c/d11/c70 0
2026-03-09T19:28:04.133 INFO:tasks.workunit.client.0.vm07.stdout:6/862: chown d0/d1/c5 23942347 1
2026-03-09T19:28:04.147 INFO:tasks.workunit.client.0.vm07.stdout:2/977: rmdir d3/dd/d158/d4c 39
2026-03-09T19:28:04.147 INFO:tasks.workunit.client.0.vm07.stdout:9/937: dwrite d0/db/f108 [0,4194304] 0
2026-03-09T19:28:04.150 INFO:tasks.workunit.client.0.vm07.stdout:6/863: chown d0/d1/f130 2838 1
2026-03-09T19:28:04.152 INFO:tasks.workunit.client.0.vm07.stdout:4/907: rename d3/d11/d29/d34/c46 to d3/d4f/d56/d5f/d88/dd0/c13a 0
2026-03-09T19:28:04.155 INFO:tasks.workunit.client.0.vm07.stdout:9/938: dread d0/f4 [4194304,4194304] 0
2026-03-09T19:28:04.169 INFO:tasks.workunit.client.0.vm07.stdout:2/978: readlink d3/dd/d16/d29/d2d/d45/d3b/d44/d96/ddf/l13b 0
2026-03-09T19:28:04.169 INFO:tasks.workunit.client.0.vm07.stdout:4/908: rename d3/d11/d2b/d37/l94 to d3/d11/d2b/d38/ddc/d91/l13b 0
2026-03-09T19:28:04.169 INFO:tasks.workunit.client.0.vm07.stdout:6/864: rmdir d0/d1/d28/d76/dad 39
2026-03-09T19:28:04.169 INFO:tasks.workunit.client.0.vm07.stdout:9/939: chown d0/db/d29/d32/c37 25 1
2026-03-09T19:28:04.169 INFO:tasks.workunit.client.0.vm07.stdout:0/877: symlink d0/d6/d13/d17/d19/d58/dd9/l121 0
2026-03-09T19:28:04.169 INFO:tasks.workunit.client.0.vm07.stdout:2/979: mkdir d3/dd/d103/d159 0
2026-03-09T19:28:04.169 INFO:tasks.workunit.client.0.vm07.stdout:6/865: rmdir d0/dbf 39
2026-03-09T19:28:04.169 INFO:tasks.workunit.client.0.vm07.stdout:4/909: mknod d3/d11/d29/d101/d99/c13c 0
2026-03-09T19:28:04.169 INFO:tasks.workunit.client.0.vm07.stdout:9/940: rename d0/d6/d57/d8f/ce8 to d0/db/d29/d32/d5c/d80/ddf/c13e 0
2026-03-09T19:28:04.169 INFO:tasks.workunit.client.0.vm07.stdout:1/909: write d1/d11/d37/dcb/f102 [4339163,33909] 0
2026-03-09T19:28:04.175 INFO:tasks.workunit.client.0.vm07.stdout:4/910: mkdir d3/d11/d2b/d13d 0
2026-03-09T19:28:04.177 INFO:tasks.workunit.client.0.vm07.stdout:7/843: dread d0/d4/d5/d8/d41/d64/d74/d98/dcb/d39/f97 [0,4194304] 0
2026-03-09T19:28:04.178 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:03 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:28:04.178 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:03 vm07.local ceph-mon[48545]: pgmap v11: 65 pgs: 65 active+clean; 3.7 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 40 MiB/s rd, 87 MiB/s wr, 239 op/s
2026-03-09T19:28:04.180 INFO:tasks.workunit.client.0.vm07.stdout:6/866: symlink d0/dbf/d133/l140 0
2026-03-09T19:28:04.182 INFO:tasks.workunit.client.0.vm07.stdout:1/910: symlink d1/d91/l12e 0
2026-03-09T19:28:04.183 INFO:tasks.workunit.client.0.vm07.stdout:9/941: truncate d0/d6/d3a/fc6 1345774 0
2026-03-09T19:28:04.186 INFO:tasks.workunit.client.0.vm07.stdout:9/942: chown d0/d6f/fa9 172923166 1
2026-03-09T19:28:04.186 INFO:tasks.workunit.client.0.vm07.stdout:4/911: creat d3/d11/f13e x:0 0 0
2026-03-09T19:28:04.188 INFO:tasks.workunit.client.0.vm07.stdout:2/980: dread d3/dd/d158/d5a/fbe [0,4194304] 0
2026-03-09T19:28:04.189 INFO:tasks.workunit.client.0.vm07.stdout:7/844: rename d0/d4/d5/d8/d41/d64/fc1 to d0/d4/d5/d26/dfb/f113 0
2026-03-09T19:28:04.190 INFO:tasks.workunit.client.0.vm07.stdout:1/911: fdatasync d1/d11/d37/d5d/dc1/fd6 0
2026-03-09T19:28:04.192 INFO:tasks.workunit.client.0.vm07.stdout:7/845: chown d0/d4/d5/d8/d41/d64/d74/d98/dcb/c9f 9895 1
2026-03-09T19:28:04.194 INFO:tasks.workunit.client.0.vm07.stdout:2/981: fdatasync d3/dd/d16/f25 0
2026-03-09T19:28:04.194 INFO:tasks.workunit.client.0.vm07.stdout:4/912: readlink d3/d11/d29/d101/l113 0
2026-03-09T19:28:04.194 INFO:tasks.workunit.client.0.vm07.stdout:1/912: mknod d1/d11/c12f 0
2026-03-09T19:28:04.196 INFO:tasks.workunit.client.0.vm07.stdout:9/943: truncate d0/db/d29/d32/d5c/d80/ffe 767389 0
2026-03-09T19:28:04.196 INFO:tasks.workunit.client.0.vm07.stdout:1/913: read d1/db/d31/d4f/fd4 [185283,23396] 0
2026-03-09T19:28:04.196 INFO:tasks.workunit.client.0.vm07.stdout:6/867: truncate d0/d1/d28/d76/dad/fe1 4346576 0
2026-03-09T19:28:04.200 INFO:tasks.workunit.client.0.vm07.stdout:4/913: mkdir d3/d4f/d56/d5f/d125/dfe/d13f 0
2026-03-09T19:28:04.204 INFO:tasks.workunit.client.0.vm07.stdout:6/868: truncate d0/d4e/f78 785330 0
2026-03-09T19:28:04.205 INFO:tasks.workunit.client.0.vm07.stdout:1/914: write d1/db/fb2 [4919682,5530] 0
2026-03-09T19:28:04.206 INFO:tasks.workunit.client.0.vm07.stdout:6/869: chown d0/d1/lb1 922568387 1
2026-03-09T19:28:04.210 INFO:tasks.workunit.client.0.vm07.stdout:4/914: creat d3/d11/d2b/d38/ddc/d91/f140 x:0 0 0
2026-03-09T19:28:04.211 INFO:tasks.workunit.client.0.vm07.stdout:1/915: unlink d1/d3/f23 0
2026-03-09T19:28:04.218 INFO:tasks.workunit.client.0.vm07.stdout:9/944: dwrite d0/d6f/dc3/df8/dfc/f13a [0,4194304] 0
2026-03-09T19:28:04.222 INFO:tasks.workunit.client.0.vm07.stdout:7/846: dread d0/d4/d5/d26/dc9/ff1 [0,4194304] 0
2026-03-09T19:28:04.223 INFO:tasks.workunit.client.0.vm07.stdout:1/916: fsync d1/d11/d37/dcb/f125 0
2026-03-09T19:28:04.223 INFO:tasks.workunit.client.0.vm07.stdout:7/847: stat d0/d4/d5/d26/d32/dbd 0
2026-03-09T19:28:04.228 INFO:tasks.workunit.client.0.vm07.stdout:9/945: mkdir d0/d13f 0
2026-03-09T19:28:04.229 INFO:tasks.workunit.client.0.vm07.stdout:4/915: dread d3/d4f/f5e [0,4194304] 0
2026-03-09T19:28:04.236 INFO:tasks.workunit.client.0.vm07.stdout:8/936: write d7/d50/da6/fde [1917980,60775] 0
2026-03-09T19:28:04.236 INFO:tasks.workunit.client.0.vm07.stdout:5/928: write d3/dd/f8a [58022,118716] 0
2026-03-09T19:28:04.239 INFO:tasks.workunit.client.0.vm07.stdout:9/946: sync
2026-03-09T19:28:04.244 INFO:tasks.workunit.client.0.vm07.stdout:5/929: creat d3/dd/dda/f12f x:0 0 0
2026-03-09T19:28:04.246 INFO:tasks.workunit.client.0.vm07.stdout:8/937: symlink d7/d9/d10/dd8/dfd/l148 0
2026-03-09T19:28:04.250 INFO:tasks.workunit.client.0.vm07.stdout:5/930: readlink d3/d1a/d28/d36/l38 0
2026-03-09T19:28:04.252 INFO:tasks.workunit.client.0.vm07.stdout:5/931: mkdir d3/dd/d26/d3f/d47/d130 0
2026-03-09T19:28:04.257 INFO:tasks.workunit.client.0.vm07.stdout:5/932: chown d3/dd/d26/d2d/d79/d9f 48 1
2026-03-09T19:28:04.265 INFO:tasks.workunit.client.0.vm07.stdout:9/947: dwrite d0/db/f108 [4194304,4194304] 0
2026-03-09T19:28:04.270 INFO:tasks.workunit.client.0.vm07.stdout:9/948: write d0/db/d29/d2c/d36/f71 [1693438,46763] 0
2026-03-09T19:28:04.271 INFO:tasks.workunit.client.0.vm07.stdout:2/982: write d3/dd/f73 [1607347,65595] 0
2026-03-09T19:28:04.273 INFO:tasks.workunit.client.0.vm07.stdout:9/949: write d0/db/d29/d68/d99/fae [2250802,9954] 0
2026-03-09T19:28:04.273 INFO:tasks.workunit.client.0.vm07.stdout:0/878: dwrite d0/d6/d13/d17/dc3/fb6 [4194304,4194304] 0
2026-03-09T19:28:04.275 INFO:tasks.workunit.client.0.vm07.stdout:0/879: readlink d0/d6/d13/d17/d19/d58/dd9/l121 0
2026-03-09T19:28:04.281 INFO:tasks.workunit.client.0.vm07.stdout:1/917: write d1/d11/d37/d5d/d50/f8b [4370195,41096] 0
2026-03-09T19:28:04.285 INFO:tasks.workunit.client.0.vm07.stdout:2/983: chown d3/dd/d16/d29/d2d/l35 392 1
2026-03-09T19:28:04.285 INFO:tasks.workunit.client.0.vm07.stdout:9/950: truncate d0/d6/d57/deb/fc0 1401314 0
2026-03-09T19:28:04.285 INFO:tasks.workunit.client.0.vm07.stdout:0/880: mknod d0/d6/dc8/d99/ddc/c122 0
2026-03-09T19:28:04.289 INFO:tasks.workunit.client.0.vm07.stdout:6/870: dwrite d0/dbf/d95/feb [0,4194304] 0
2026-03-09T19:28:04.291 INFO:tasks.workunit.client.0.vm07.stdout:4/916: write d3/d11/d29/d101/d99/de7/f10d [2430613,4863] 0
2026-03-09T19:28:04.291 INFO:tasks.workunit.client.0.vm07.stdout:7/848: write d0/d4/d5/d8/d41/d64/d74/d98/dcb/f63 [3660103,80709] 0
2026-03-09T19:28:04.299 INFO:tasks.workunit.client.0.vm07.stdout:9/951: symlink d0/db/d29/d32/d5c/l140 0
2026-03-09T19:28:04.299 INFO:tasks.workunit.client.0.vm07.stdout:0/881: rename d0/d6/d13/da1 to d0/d6/d13/d1c/d52/d81/d123 0
2026-03-09T19:28:04.299 INFO:tasks.workunit.client.0.vm07.stdout:4/917: symlink d3/d11/d2b/d38/ddc/db2/l141 0
2026-03-09T19:28:04.300 INFO:tasks.workunit.client.0.vm07.stdout:6/871: unlink d0/d1/db/d52/d94/d81/lb2 0
2026-03-09T19:28:04.301 INFO:tasks.workunit.client.0.vm07.stdout:7/849: mkdir d0/d4/d5/d26/d32/dbd/d111/d114 0
2026-03-09T19:28:04.304 INFO:tasks.workunit.client.0.vm07.stdout:7/850: fdatasync d0/d4/d5/d8/f93 0
2026-03-09T19:28:04.305 INFO:tasks.workunit.client.0.vm07.stdout:4/918: fsync d3/d11/d2b/fba 0
2026-03-09T19:28:04.306 INFO:tasks.workunit.client.0.vm07.stdout:4/919: fsync d3/d11/f74 0
2026-03-09T19:28:04.310 INFO:tasks.workunit.client.0.vm07.stdout:7/851: link d0/d80/db1/de5/f62 d0/d80/db1/de5/d54/d55/d7f/f115 0
2026-03-09T19:28:04.311 INFO:tasks.workunit.client.0.vm07.stdout:4/920: fsync d3/d11/d2b/f98 0
2026-03-09T19:28:04.324 INFO:tasks.workunit.client.0.vm07.stdout:8/938: dwrite d7/d50/da6/fb8 [0,4194304] 0
2026-03-09T19:28:04.330 INFO:tasks.workunit.client.0.vm07.stdout:5/933: dwrite d3/dd/d26/d3f/d47/d71/db7/fbc [0,4194304] 0
2026-03-09T19:28:04.335 INFO:tasks.workunit.client.0.vm07.stdout:8/939: symlink d7/d9/d10/d44/d9a/l149 0
2026-03-09T19:28:04.337 INFO:tasks.workunit.client.0.vm07.stdout:2/984: dread d3/d11/f19 [0,4194304] 0
2026-03-09T19:28:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:03 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:28:04.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:03 vm08.local ceph-mon[57794]: pgmap v11: 65 pgs: 65 active+clean; 3.7 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 40 MiB/s rd, 87 MiB/s wr, 239 op/s
2026-03-09T19:28:04.353 INFO:tasks.workunit.client.0.vm07.stdout:5/934: sync
2026-03-09T19:28:04.355 INFO:tasks.workunit.client.0.vm07.stdout:2/985: dread d3/dd/d16/d30/da7/fd8 [0,4194304] 0
2026-03-09T19:28:04.356 INFO:tasks.workunit.client.0.vm07.stdout:5/935: rename d3/d1a/d28/d6c/d72/d8f/l101 to d3/dd/d26/d3f/d47/d71/dfa/l131 0
2026-03-09T19:28:04.358 INFO:tasks.workunit.client.0.vm07.stdout:2/986: mkdir d3/dd/d158/d5a/d7a/d74/dc9/d135/d15a 0
2026-03-09T19:28:04.363 INFO:tasks.workunit.client.0.vm07.stdout:1/918: write d1/d11/d37/ff2 [942015,25779] 0
2026-03-09T19:28:04.367 INFO:tasks.workunit.client.0.vm07.stdout:2/987: dwrite d3/dd/d16/d29/d2d/d45/f62 [4194304,4194304] 0
2026-03-09T19:28:04.368 INFO:tasks.workunit.client.0.vm07.stdout:2/988: truncate d3/dd/d158/da2/f10d 1656125 0
2026-03-09T19:28:04.384 INFO:tasks.workunit.client.0.vm07.stdout:5/936: dread d3/d1a/d28/d6c/d72/d8f/f91 [0,4194304] 0
2026-03-09T19:28:04.387 INFO:tasks.workunit.client.0.vm07.stdout:2/989: dwrite d3/dd/d16/d29/d2d/f130 [0,4194304] 0
2026-03-09T19:28:04.391 INFO:tasks.workunit.client.0.vm07.stdout:2/990: write d3/dd/d16/d29/d2d/f6d [8748847,115609] 0
2026-03-09T19:28:04.394 INFO:tasks.workunit.client.0.vm07.stdout:6/872: dwrite d0/d1/db/f43 [0,4194304] 0
2026-03-09T19:28:04.395 INFO:tasks.workunit.client.0.vm07.stdout:6/873: readlink d0/d1/db/d52/d94/d81/l12d 0
2026-03-09T19:28:04.432 INFO:tasks.workunit.client.0.vm07.stdout:1/919: mknod d1/d3e/db3/d9a/c130 0
2026-03-09T19:28:04.435 INFO:tasks.workunit.client.0.vm07.stdout:5/937: unlink d3/d1a/d28/d6c/d72/lec 0
2026-03-09T19:28:04.439 INFO:tasks.workunit.client.0.vm07.stdout:6/874: chown d0/dbf/l128 1877241110 1
2026-03-09T19:28:04.444 INFO:tasks.workunit.client.0.vm07.stdout:5/938: creat d3/dd/d26/d3f/d47/d71/d76/d98/d126/d92/d89/ddc/dde/f132 x:0 0 0
2026-03-09T19:28:04.447 INFO:tasks.workunit.client.0.vm07.stdout:0/882: rename d0/d6/dc8/d99/ddc/lfe to d0/d6/d13/d1c/d61/l124 0
2026-03-09T19:28:04.450 INFO:tasks.workunit.client.0.vm07.stdout:0/883: chown d0/d6/d13/d17/dc3/fb3 7299401 1
2026-03-09T19:28:04.451 INFO:tasks.workunit.client.0.vm07.stdout:5/939: creat d3/dd/d26/d3f/d47/d71/d76/d98/d126/d92/d89/ddc/dde/d113/f133 x:0 0 0
2026-03-09T19:28:04.454 INFO:tasks.workunit.client.0.vm07.stdout:5/940: mkdir d3/dd/d26/d3f/de0/d134 0
2026-03-09T19:28:04.457 INFO:tasks.workunit.client.0.vm07.stdout:0/884: link d0/d6/d13/caf d0/d6/d13/d17/d19/d57/d6a/c125 0
2026-03-09T19:28:04.462 INFO:tasks.workunit.client.0.vm07.stdout:0/885: rename d0/d6/d13/d1c/d61/ff7 to d0/d6/d13/d17/dc3/dbc/f126 0
2026-03-09T19:28:04.463 INFO:tasks.workunit.client.0.vm07.stdout:5/941: dwrite d3/d1a/fe4 [0,4194304] 0
2026-03-09T19:28:04.476 INFO:tasks.workunit.client.0.vm07.stdout:9/952: truncate d0/db/d29/d2c/d36/f71 2510315 0
2026-03-09T19:28:04.476 INFO:tasks.workunit.client.0.vm07.stdout:0/886: dread d0/d6/dc8/f94 [0,4194304] 0
2026-03-09T19:28:04.485
INFO:tasks.workunit.client.0.vm07.stdout:4/921: dwrite d3/d11/d16/f6e [0,4194304] 0 2026-03-09T19:28:04.491 INFO:tasks.workunit.client.0.vm07.stdout:0/887: mknod d0/d6/d13/d17/dc3/df6/df9/c127 0 2026-03-09T19:28:04.492 INFO:tasks.workunit.client.0.vm07.stdout:7/852: dwrite d0/d4/d5/d8/d41/fdf [0,4194304] 0 2026-03-09T19:28:04.492 INFO:tasks.workunit.client.0.vm07.stdout:7/853: stat d0/d4 0 2026-03-09T19:28:04.492 INFO:tasks.workunit.client.0.vm07.stdout:9/953: sync 2026-03-09T19:28:04.492 INFO:tasks.workunit.client.0.vm07.stdout:8/940: dwrite d7/d50/f6f [4194304,4194304] 0 2026-03-09T19:28:04.495 INFO:tasks.workunit.client.0.vm07.stdout:0/888: readlink d0/d6/dc8/lcd 0 2026-03-09T19:28:04.497 INFO:tasks.workunit.client.0.vm07.stdout:0/889: read - d0/d6/d13/d17/d19/f114 zero size 2026-03-09T19:28:04.499 INFO:tasks.workunit.client.0.vm07.stdout:0/890: truncate d0/d6/dc8/f10f 245096 0 2026-03-09T19:28:04.508 INFO:tasks.workunit.client.0.vm07.stdout:8/941: rmdir d7/d1d/d83/d9f/dd2 39 2026-03-09T19:28:04.515 INFO:tasks.workunit.client.0.vm07.stdout:8/942: rmdir d7/d1d 39 2026-03-09T19:28:04.523 INFO:tasks.workunit.client.0.vm07.stdout:9/954: dread d0/d6/d57/f59 [0,4194304] 0 2026-03-09T19:28:04.524 INFO:tasks.workunit.client.0.vm07.stdout:7/854: creat d0/d4/d5/f116 x:0 0 0 2026-03-09T19:28:04.524 INFO:tasks.workunit.client.0.vm07.stdout:7/855: write d0/d4/d5/d26/dfb/f113 [5057470,60840] 0 2026-03-09T19:28:04.524 INFO:tasks.workunit.client.0.vm07.stdout:9/955: unlink d0/d17/cc8 0 2026-03-09T19:28:04.527 INFO:tasks.workunit.client.0.vm07.stdout:9/956: mkdir d0/d6/d3a/dd3/d141 0 2026-03-09T19:28:04.528 INFO:tasks.workunit.client.0.vm07.stdout:8/943: link d7/d9/d10/dd8/dfd/fc0 d7/d1d/d83/d9f/dd2/def/f14a 0 2026-03-09T19:28:04.530 INFO:tasks.workunit.client.0.vm07.stdout:8/944: unlink d7/d9/d57/lad 0 2026-03-09T19:28:04.530 INFO:tasks.workunit.client.0.vm07.stdout:7/856: link d0/d80/db1/de5/d54/d5a/l7b d0/d4/d5/d26/d32/dbd/d111/l117 0 2026-03-09T19:28:04.531 
INFO:tasks.workunit.client.0.vm07.stdout:8/945: chown d7/d9/d37/d45/d97/dbc/de2/f12a 30 1 2026-03-09T19:28:04.531 INFO:tasks.workunit.client.0.vm07.stdout:7/857: write d0/d4/d5/d8/f93 [830986,116555] 0 2026-03-09T19:28:04.534 INFO:tasks.workunit.client.0.vm07.stdout:8/946: unlink d7/d30/d32/fba 0 2026-03-09T19:28:04.538 INFO:tasks.workunit.client.0.vm07.stdout:7/858: mknod d0/d80/db1/c118 0 2026-03-09T19:28:04.539 INFO:tasks.workunit.client.0.vm07.stdout:8/947: dwrite d7/d16/f69 [0,4194304] 0 2026-03-09T19:28:04.540 INFO:tasks.workunit.client.0.vm07.stdout:7/859: symlink d0/d4/d5/d26/db9/dc2/l119 0 2026-03-09T19:28:04.544 INFO:tasks.workunit.client.0.vm07.stdout:7/860: chown d0/d4/d5/d8/d41/fb0 683 1 2026-03-09T19:28:04.546 INFO:tasks.workunit.client.0.vm07.stdout:7/861: chown d0/d4/d5/d26/db9/dc2/c103 822439 1 2026-03-09T19:28:04.547 INFO:tasks.workunit.client.0.vm07.stdout:9/957: sync 2026-03-09T19:28:04.547 INFO:tasks.workunit.client.0.vm07.stdout:8/948: mkdir d7/d1d/d83/d9f/ded/d14b 0 2026-03-09T19:28:04.554 INFO:tasks.workunit.client.0.vm07.stdout:7/862: dread d0/d4/d5/d8/f35 [0,4194304] 0 2026-03-09T19:28:04.557 INFO:tasks.workunit.client.0.vm07.stdout:8/949: mknod d7/d9/d10/dd8/dfc/c14c 0 2026-03-09T19:28:04.559 INFO:tasks.workunit.client.0.vm07.stdout:9/958: symlink d0/d6/d3a/dd3/d141/l142 0 2026-03-09T19:28:04.567 INFO:tasks.workunit.client.0.vm07.stdout:7/863: dwrite d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/f70 [0,4194304] 0 2026-03-09T19:28:04.570 INFO:tasks.workunit.client.0.vm07.stdout:7/864: link d0/d4/d5/d8/d1a/d2a/l53 d0/d4/d5/d26/d32/dbd/d111/d114/l11a 0 2026-03-09T19:28:04.571 INFO:tasks.workunit.client.0.vm07.stdout:9/959: creat d0/d17/f143 x:0 0 0 2026-03-09T19:28:04.571 INFO:tasks.workunit.client.0.vm07.stdout:7/865: fdatasync d0/d80/db1/de5/f9a 0 2026-03-09T19:28:04.575 INFO:tasks.workunit.client.0.vm07.stdout:9/960: symlink d0/d17/l144 0 2026-03-09T19:28:04.576 INFO:tasks.workunit.client.0.vm07.stdout:9/961: chown d0/d6/d73/f12f 2006779 1 
2026-03-09T19:28:04.579 INFO:tasks.workunit.client.0.vm07.stdout:9/962: rmdir d0/db/d29/d2c/de5 39 2026-03-09T19:28:04.585 INFO:tasks.workunit.client.0.vm07.stdout:2/991: dwrite d3/dd/d16/d30/f7e [0,4194304] 0 2026-03-09T19:28:04.591 INFO:tasks.workunit.client.0.vm07.stdout:2/992: sync 2026-03-09T19:28:04.595 INFO:tasks.workunit.client.0.vm07.stdout:1/920: dwrite d1/d3/f4 [0,4194304] 0 2026-03-09T19:28:04.603 INFO:tasks.workunit.client.0.vm07.stdout:6/875: dwrite d0/d2d/dd5/d123/f8f [0,4194304] 0 2026-03-09T19:28:04.611 INFO:tasks.workunit.client.0.vm07.stdout:5/942: truncate d3/dd/f8a 65533 0 2026-03-09T19:28:04.617 INFO:tasks.workunit.client.0.vm07.stdout:5/943: creat d3/dd/d26/d3f/d47/d71/d76/d98/f135 x:0 0 0 2026-03-09T19:28:04.618 INFO:tasks.workunit.client.0.vm07.stdout:6/876: mkdir d0/d1/d141 0 2026-03-09T19:28:04.622 INFO:tasks.workunit.client.0.vm07.stdout:5/944: creat d3/dd/d26/d3f/d47/d71/d76/d98/d126/f136 x:0 0 0 2026-03-09T19:28:04.625 INFO:tasks.workunit.client.0.vm07.stdout:6/877: creat d0/d1/d28/da8/f142 x:0 0 0 2026-03-09T19:28:04.627 INFO:tasks.workunit.client.0.vm07.stdout:5/945: symlink d3/dd/d26/d3f/d47/d71/d76/d98/d126/d92/d89/ddc/dde/l137 0 2026-03-09T19:28:04.628 INFO:tasks.workunit.client.0.vm07.stdout:4/922: dwrite d3/d11/f6c [0,4194304] 0 2026-03-09T19:28:04.645 INFO:tasks.workunit.client.0.vm07.stdout:5/946: rename d3/d1a/d5a/ce1 to d3/dd/d26/d2d/d9e/df0/c138 0 2026-03-09T19:28:04.645 INFO:tasks.workunit.client.0.vm07.stdout:4/923: symlink d3/l142 0 2026-03-09T19:28:04.645 INFO:tasks.workunit.client.0.vm07.stdout:5/947: chown d3/d1a/d28/d6c/ccc 12 1 2026-03-09T19:28:04.646 INFO:tasks.workunit.client.0.vm07.stdout:4/924: chown d3/ce 50722 1 2026-03-09T19:28:04.650 INFO:tasks.workunit.client.0.vm07.stdout:5/948: link d3/dd/d26/d3f/d47/d71/d76/d98/d126/f46 d3/d1a/d28/d10c/f139 0 2026-03-09T19:28:04.651 INFO:tasks.workunit.client.0.vm07.stdout:5/949: write d3/dd/d26/d2d/d79/f108 [247860,88328] 0 2026-03-09T19:28:04.657 
INFO:tasks.workunit.client.0.vm07.stdout:0/891: write d0/d6/fcf [198821,20559] 0 2026-03-09T19:28:04.657 INFO:tasks.workunit.client.0.vm07.stdout:5/950: mkdir d3/dd/d26/d2d/d79/d13a 0 2026-03-09T19:28:04.660 INFO:tasks.workunit.client.0.vm07.stdout:1/921: dread d1/d11/d37/d5d/f8a [0,4194304] 0 2026-03-09T19:28:04.677 INFO:tasks.workunit.client.0.vm07.stdout:5/951: dwrite d3/d1a/d28/d48/f4f [4194304,4194304] 0 2026-03-09T19:28:04.677 INFO:tasks.workunit.client.0.vm07.stdout:1/922: truncate d1/d11/d37/dcb/f125 1913635 0 2026-03-09T19:28:04.677 INFO:tasks.workunit.client.0.vm07.stdout:8/950: write d7/d50/da6/dc5/f103 [2403605,14264] 0 2026-03-09T19:28:04.677 INFO:tasks.workunit.client.0.vm07.stdout:1/923: mkdir d1/d11/d37/dcb/d131 0 2026-03-09T19:28:04.677 INFO:tasks.workunit.client.0.vm07.stdout:8/951: readlink d7/d9/l3c 0 2026-03-09T19:28:04.677 INFO:tasks.workunit.client.0.vm07.stdout:8/952: readlink d7/d9/d10/l26 0 2026-03-09T19:28:04.677 INFO:tasks.workunit.client.0.vm07.stdout:8/953: mkdir d7/d9/d10/dd8/dfc/d14d 0 2026-03-09T19:28:04.679 INFO:tasks.workunit.client.0.vm07.stdout:8/954: symlink d7/d30/d75/dcc/df2/l14e 0 2026-03-09T19:28:04.681 INFO:tasks.workunit.client.0.vm07.stdout:0/892: sync 2026-03-09T19:28:04.682 INFO:tasks.workunit.client.0.vm07.stdout:8/955: rename d7/cd3 to d7/d9/d10/dd8/d10b/c14f 0 2026-03-09T19:28:04.682 INFO:tasks.workunit.client.0.vm07.stdout:0/893: dread - d0/d6/d13/d1c/d11/d56/f100 zero size 2026-03-09T19:28:04.684 INFO:tasks.workunit.client.0.vm07.stdout:0/894: truncate d0/fa8 1256499 0 2026-03-09T19:28:04.686 INFO:tasks.workunit.client.0.vm07.stdout:0/895: creat d0/d6/d13/d1c/d52/d81/f128 x:0 0 0 2026-03-09T19:28:04.686 INFO:tasks.workunit.client.0.vm07.stdout:0/896: chown d0/d6/l7e 23395568 1 2026-03-09T19:28:04.689 INFO:tasks.workunit.client.0.vm07.stdout:0/897: chown d0/d6/d13/d17/f20 84022 1 2026-03-09T19:28:04.689 INFO:tasks.workunit.client.0.vm07.stdout:5/952: sync 2026-03-09T19:28:04.690 
INFO:tasks.workunit.client.0.vm07.stdout:0/898: truncate d0/d6/d13/d17/d19/ffa 757958 0 2026-03-09T19:28:04.693 INFO:tasks.workunit.client.0.vm07.stdout:5/953: symlink d3/d1a/d28/d122/l13b 0 2026-03-09T19:28:04.694 INFO:tasks.workunit.client.0.vm07.stdout:0/899: mknod d0/c129 0 2026-03-09T19:28:04.698 INFO:tasks.workunit.client.0.vm07.stdout:5/954: creat d3/d1a/d28/d48/f13c x:0 0 0 2026-03-09T19:28:04.699 INFO:tasks.workunit.client.0.vm07.stdout:0/900: dwrite d0/d6/d13/d17/d19/f7c [0,4194304] 0 2026-03-09T19:28:04.702 INFO:tasks.workunit.client.0.vm07.stdout:0/901: stat d0/d6/dc8/d99/fdb 0 2026-03-09T19:28:04.710 INFO:tasks.workunit.client.0.vm07.stdout:0/902: symlink d0/d6/d13/d1c/d61/l12a 0 2026-03-09T19:28:04.710 INFO:tasks.workunit.client.0.vm07.stdout:5/955: mknod d3/dd/d26/d3f/d47/d71/dfa/d11a/c13d 0 2026-03-09T19:28:04.715 INFO:tasks.workunit.client.0.vm07.stdout:5/956: creat d3/d1a/d5a/f13e x:0 0 0 2026-03-09T19:28:04.717 INFO:tasks.workunit.client.0.vm07.stdout:5/957: mknod d3/dd/d26/d3f/d47/d56/c13f 0 2026-03-09T19:28:04.718 INFO:tasks.workunit.client.0.vm07.stdout:0/903: getdents d0/d6/d13/d17/dc3 0 2026-03-09T19:28:04.719 INFO:tasks.workunit.client.0.vm07.stdout:5/958: readlink d3/dd/d26/d3f/d47/d71/d76/d98/d126/lb1 0 2026-03-09T19:28:04.720 INFO:tasks.workunit.client.0.vm07.stdout:0/904: symlink d0/d6/dc8/l12b 0 2026-03-09T19:28:04.722 INFO:tasks.workunit.client.0.vm07.stdout:0/905: mknod d0/d6/d13/d1c/d52/c12c 0 2026-03-09T19:28:04.724 INFO:tasks.workunit.client.0.vm07.stdout:0/906: dread d0/d6/dc8/f10f [0,4194304] 0 2026-03-09T19:28:04.724 INFO:tasks.workunit.client.0.vm07.stdout:5/959: link d3/dd/d26/d2d/d60/c8d d3/dd/d26/c140 0 2026-03-09T19:28:04.727 INFO:tasks.workunit.client.0.vm07.stdout:0/907: creat d0/d6/d13/f12d x:0 0 0 2026-03-09T19:28:04.730 INFO:tasks.workunit.client.0.vm07.stdout:5/960: rename d3/d1a/d28/d36/l70 to d3/dd/d26/d3f/d47/d71/dfa/d11a/l141 0 2026-03-09T19:28:04.730 INFO:tasks.workunit.client.0.vm07.stdout:5/961: truncate f2 
5817862 0 2026-03-09T19:28:04.730 INFO:tasks.workunit.client.0.vm07.stdout:0/908: fdatasync d0/d6/d13/d17/d19/d58/f115 0 2026-03-09T19:28:04.740 INFO:tasks.workunit.client.0.vm07.stdout:8/956: dread d7/d9/d57/fb2 [0,4194304] 0 2026-03-09T19:28:04.750 INFO:tasks.workunit.client.0.vm07.stdout:0/909: mknod d0/d6/d13/d17/d19/d58/de0/c12e 0 2026-03-09T19:28:04.753 INFO:tasks.workunit.client.0.vm07.stdout:7/866: dwrite d0/d4/d5/d8/d41/fe0 [0,4194304] 0 2026-03-09T19:28:04.762 INFO:tasks.workunit.client.0.vm07.stdout:9/963: dwrite d0/d6/d57/d8f/ff0 [4194304,4194304] 0 2026-03-09T19:28:04.768 INFO:tasks.workunit.client.0.vm07.stdout:2/993: dwrite d3/dd/f24 [0,4194304] 0 2026-03-09T19:28:04.777 INFO:tasks.workunit.client.0.vm07.stdout:6/878: dwrite d0/d2d/f88 [0,4194304] 0 2026-03-09T19:28:04.784 INFO:tasks.workunit.client.0.vm07.stdout:0/910: truncate d0/d6/d13/f31 3748345 0 2026-03-09T19:28:04.787 INFO:tasks.workunit.client.0.vm07.stdout:4/925: truncate d3/d11/d2b/d38/ddc/d91/fb3 3435223 0 2026-03-09T19:28:04.797 INFO:tasks.workunit.client.0.vm07.stdout:0/911: sync 2026-03-09T19:28:04.802 INFO:tasks.workunit.client.0.vm07.stdout:7/867: symlink d0/d4/d5/d8/d1a/d2a/dc5/l11b 0 2026-03-09T19:28:04.804 INFO:tasks.workunit.client.0.vm07.stdout:1/924: write d1/d3e/db3/fba [868045,44522] 0 2026-03-09T19:28:04.808 INFO:tasks.workunit.client.0.vm07.stdout:0/912: fdatasync d0/d6/d13/d17/fc5 0 2026-03-09T19:28:04.819 INFO:tasks.workunit.client.0.vm07.stdout:7/868: symlink d0/d80/db1/de5/d54/d95/l11c 0 2026-03-09T19:28:04.819 INFO:tasks.workunit.client.0.vm07.stdout:6/879: truncate d0/d1/db/d1d/fcb 252069 0 2026-03-09T19:28:04.822 INFO:tasks.workunit.client.0.vm07.stdout:2/994: dread d3/dd/d16/d29/d2d/d45/d3b/dae/fda [0,4194304] 0 2026-03-09T19:28:04.824 INFO:tasks.workunit.client.0.vm07.stdout:7/869: truncate d0/d80/db1/de5/d54/f5e 4590289 0 2026-03-09T19:28:04.829 INFO:tasks.workunit.client.0.vm07.stdout:7/870: dwrite d0/d4/d5/d8/d41/d64/d74/d98/dcb/d39/ff9 [0,4194304] 0 
2026-03-09T19:28:04.836 INFO:tasks.workunit.client.0.vm07.stdout:0/913: rmdir d0/d6/d13/d1c/d52/d81/d123 39 2026-03-09T19:28:04.852 INFO:tasks.workunit.client.0.vm07.stdout:2/995: dread d3/f5 [0,4194304] 0 2026-03-09T19:28:04.852 INFO:tasks.workunit.client.0.vm07.stdout:4/926: link d3/d11/d2b/d38/ddc/cff d3/d11/d29/d34/c143 0 2026-03-09T19:28:04.857 INFO:tasks.workunit.client.0.vm07.stdout:7/871: creat d0/d4/d5/d26/d32/f11d x:0 0 0 2026-03-09T19:28:04.859 INFO:tasks.workunit.client.0.vm07.stdout:7/872: dread d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/ffd [0,4194304] 0 2026-03-09T19:28:04.862 INFO:tasks.workunit.client.0.vm07.stdout:1/925: getdents d1/d11/d37/d3f/d45/d87 0 2026-03-09T19:28:04.862 INFO:tasks.workunit.client.0.vm07.stdout:1/926: chown d1/db/f81 0 1 2026-03-09T19:28:04.866 INFO:tasks.workunit.client.0.vm07.stdout:0/914: mknod d0/d6/d13/d17/d19/d57/c12f 0 2026-03-09T19:28:04.867 INFO:tasks.workunit.client.0.vm07.stdout:1/927: fsync d1/d11/d37/d5d/f8a 0 2026-03-09T19:28:04.869 INFO:tasks.workunit.client.0.vm07.stdout:0/915: chown d0/d6/d13/d1c/d52/fdf 17403898 1 2026-03-09T19:28:04.872 INFO:tasks.workunit.client.0.vm07.stdout:9/964: write d0/db/d29/d32/d5c/d69/f96 [1039094,68307] 0 2026-03-09T19:28:04.875 INFO:tasks.workunit.client.0.vm07.stdout:5/962: dwrite d3/d1a/d28/d10c/fd7 [0,4194304] 0 2026-03-09T19:28:04.876 INFO:tasks.workunit.client.0.vm07.stdout:8/957: dwrite d7/d30/d32/f74 [0,4194304] 0 2026-03-09T19:28:04.879 INFO:tasks.workunit.client.0.vm07.stdout:4/927: unlink d3/d11/d2b/d38/ddc/d91/dd6/f119 0 2026-03-09T19:28:04.888 INFO:tasks.workunit.client.0.vm07.stdout:4/928: dwrite d3/d11/d2b/d38/d8f/f136 [0,4194304] 0 2026-03-09T19:28:04.890 INFO:tasks.workunit.client.0.vm07.stdout:9/965: symlink d0/d6/l145 0 2026-03-09T19:28:04.904 INFO:tasks.workunit.client.0.vm07.stdout:6/880: dwrite d0/dbf/d95/d31/f3c [4194304,4194304] 0 2026-03-09T19:28:04.910 INFO:tasks.workunit.client.0.vm07.stdout:4/929: chown d3/f13 55 1 2026-03-09T19:28:04.910 
INFO:tasks.workunit.client.0.vm07.stdout:8/958: symlink d7/d9/d37/d45/d4f/db1/d107/d118/l150 0 2026-03-09T19:28:04.912 INFO:tasks.workunit.client.0.vm07.stdout:4/930: truncate d3/d11/d2b/d38/ddc/fb0 756178 0 2026-03-09T19:28:04.918 INFO:tasks.workunit.client.0.vm07.stdout:9/966: dread d0/d6/d3a/dd3/ff9 [0,4194304] 0 2026-03-09T19:28:04.919 INFO:tasks.workunit.client.0.vm07.stdout:9/967: chown d0/d6/d57/c9a 125 1 2026-03-09T19:28:04.923 INFO:tasks.workunit.client.0.vm07.stdout:1/928: creat d1/d11/d37/f132 x:0 0 0 2026-03-09T19:28:04.933 INFO:tasks.workunit.client.0.vm07.stdout:0/916: creat d0/d6/d13/d1c/d50/f130 x:0 0 0 2026-03-09T19:28:04.933 INFO:tasks.workunit.client.0.vm07.stdout:6/881: truncate d0/d1/db/d1d/f47 3860117 0 2026-03-09T19:28:04.933 INFO:tasks.workunit.client.0.vm07.stdout:8/959: symlink d7/d1d/d83/d9f/dd2/def/l151 0 2026-03-09T19:28:04.933 INFO:tasks.workunit.client.0.vm07.stdout:4/931: creat d3/d4f/d56/d5f/f144 x:0 0 0 2026-03-09T19:28:04.933 INFO:tasks.workunit.client.0.vm07.stdout:9/968: chown d0/l14 135367 1 2026-03-09T19:28:04.933 INFO:tasks.workunit.client.0.vm07.stdout:9/969: readlink d0/d6/d57/ldd 0 2026-03-09T19:28:04.939 INFO:tasks.workunit.client.0.vm07.stdout:9/970: mknod d0/d6f/dc3/c146 0 2026-03-09T19:28:04.940 INFO:tasks.workunit.client.0.vm07.stdout:9/971: fdatasync d0/db/fac 0 2026-03-09T19:28:04.942 INFO:tasks.workunit.client.0.vm07.stdout:2/996: dread d3/dd/d16/d29/f42 [0,4194304] 0 2026-03-09T19:28:04.947 INFO:tasks.workunit.client.0.vm07.stdout:4/932: dread d3/d11/d2b/d38/ddc/fb0 [0,4194304] 0 2026-03-09T19:28:04.948 INFO:tasks.workunit.client.0.vm07.stdout:8/960: creat d7/d9/d10/dd8/d10b/d11a/f152 x:0 0 0 2026-03-09T19:28:04.950 INFO:tasks.workunit.client.0.vm07.stdout:2/997: truncate d3/dd/d16/d29/d2d/d45/d3b/fe5 200850 0 2026-03-09T19:28:04.954 INFO:tasks.workunit.client.0.vm07.stdout:4/933: dread - d3/d11/d2b/d37/fdd zero size 2026-03-09T19:28:04.954 INFO:tasks.workunit.client.0.vm07.stdout:2/998: dread - 
d3/dd/d158/d5a/d7a/d74/f131 zero size 2026-03-09T19:28:04.954 INFO:tasks.workunit.client.0.vm07.stdout:0/917: creat d0/d6/d13/d17/d19/d58/dd9/f131 x:0 0 0 2026-03-09T19:28:04.956 INFO:tasks.workunit.client.0.vm07.stdout:8/961: fsync d7/d50/da6/faf 0 2026-03-09T19:28:04.959 INFO:tasks.workunit.client.0.vm07.stdout:4/934: mknod d3/dfc/c145 0 2026-03-09T19:28:04.960 INFO:tasks.workunit.client.0.vm07.stdout:2/999: dwrite d3/dd/d158/da2/d126/f132 [0,4194304] 0 2026-03-09T19:28:04.961 INFO:tasks.workunit.client.0.vm07.stdout:8/962: chown d7/d9/d10/d44/fdc 356856 1 2026-03-09T19:28:04.961 INFO:tasks.workunit.client.0.vm07.stdout:1/929: dread d1/db/fb2 [0,4194304] 0 2026-03-09T19:28:04.962 INFO:tasks.workunit.client.0.vm07.stdout:7/873: write d0/d4/d5/d8/d41/d64/d74/d98/fd8 [888740,121860] 0 2026-03-09T19:28:04.963 INFO:tasks.workunit.client.0.vm07.stdout:4/935: dread - d3/fea zero size 2026-03-09T19:28:04.968 INFO:tasks.workunit.client.0.vm07.stdout:0/918: sync 2026-03-09T19:28:04.975 INFO:tasks.workunit.client.0.vm07.stdout:1/930: creat d1/d3e/dc8/f133 x:0 0 0 2026-03-09T19:28:04.978 INFO:tasks.workunit.client.0.vm07.stdout:8/963: creat d7/f153 x:0 0 0 2026-03-09T19:28:04.978 INFO:tasks.workunit.client.0.vm07.stdout:4/936: mknod d3/d11/c146 0 2026-03-09T19:28:04.980 INFO:tasks.workunit.client.0.vm07.stdout:0/919: unlink d0/d6/d13/d1c/d52/d81/d123/fea 0 2026-03-09T19:28:04.982 INFO:tasks.workunit.client.0.vm07.stdout:8/964: mkdir d7/d9/d154 0 2026-03-09T19:28:04.991 INFO:tasks.workunit.client.0.vm07.stdout:0/920: chown d0/d6/d13/d1c/d61/d69/c8e 571 1 2026-03-09T19:28:04.991 INFO:tasks.workunit.client.0.vm07.stdout:8/965: link d7/d30/d32/fa9 d7/d16/f155 0 2026-03-09T19:28:04.991 INFO:tasks.workunit.client.0.vm07.stdout:1/931: dread d1/db/d31/d4f/f77 [0,4194304] 0 2026-03-09T19:28:04.991 INFO:tasks.workunit.client.0.vm07.stdout:0/921: getdents d0/d6 0 2026-03-09T19:28:04.991 INFO:tasks.workunit.client.0.vm07.stdout:8/966: fdatasync d7/f9d 0 2026-03-09T19:28:04.992 
INFO:tasks.workunit.client.0.vm07.stdout:7/874: dread d0/d80/db1/de5/fd2 [0,4194304] 0 2026-03-09T19:28:04.994 INFO:tasks.workunit.client.0.vm07.stdout:1/932: mkdir d1/d11/d37/d5d/dc1/d107/d134 0 2026-03-09T19:28:04.994 INFO:tasks.workunit.client.0.vm07.stdout:4/937: dread d3/d11/d2b/d38/d8f/f12a [0,4194304] 0 2026-03-09T19:28:04.996 INFO:tasks.workunit.client.0.vm07.stdout:1/933: chown d1/d3/l25 182005738 1 2026-03-09T19:28:04.996 INFO:tasks.workunit.client.0.vm07.stdout:0/922: readlink d0/d6/d13/l46 0 2026-03-09T19:28:04.997 INFO:tasks.workunit.client.0.vm07.stdout:7/875: rename d0/d80/db1/de5/d54/d55 to d0/d4/d5/d26/d32/dbd/d111/d11e 0 2026-03-09T19:28:04.998 INFO:tasks.workunit.client.0.vm07.stdout:4/938: symlink d3/d4f/d56/d5f/l147 0 2026-03-09T19:28:05.000 INFO:tasks.workunit.client.0.vm07.stdout:0/923: rmdir d0/d6/dc8/d99/ddc 39 2026-03-09T19:28:05.001 INFO:tasks.workunit.client.0.vm07.stdout:1/934: truncate d1/d11/d37/d3f/f82 1040570 0 2026-03-09T19:28:05.002 INFO:tasks.workunit.client.0.vm07.stdout:8/967: creat d7/d16/f156 x:0 0 0 2026-03-09T19:28:05.003 INFO:tasks.workunit.client.0.vm07.stdout:4/939: symlink d3/d4f/d56/d11e/l148 0 2026-03-09T19:28:05.015 INFO:tasks.workunit.client.0.vm07.stdout:1/935: creat d1/d11/d37/d5d/d50/f135 x:0 0 0 2026-03-09T19:28:05.015 INFO:tasks.workunit.client.0.vm07.stdout:5/963: dwrite d3/dd/fab [0,4194304] 0 2026-03-09T19:28:05.022 INFO:tasks.workunit.client.0.vm07.stdout:9/972: write d0/db/d29/d2c/d36/fa1 [541236,36976] 0 2026-03-09T19:28:05.025 INFO:tasks.workunit.client.0.vm07.stdout:1/936: chown d1/d3e/db3/d6d/dff/fe7 0 1 2026-03-09T19:28:05.026 INFO:tasks.workunit.client.0.vm07.stdout:6/882: dwrite d0/d1/f130 [0,4194304] 0 2026-03-09T19:28:05.030 INFO:tasks.workunit.client.0.vm07.stdout:0/924: truncate d0/d6/d13/d1c/fa5 949877 0 2026-03-09T19:28:05.038 INFO:tasks.workunit.client.0.vm07.stdout:4/940: creat d3/d11/d2b/d38/d107/f149 x:0 0 0 2026-03-09T19:28:05.039 INFO:tasks.workunit.client.0.vm07.stdout:4/941: truncate 
d3/d11/d29/d101/f122 1297245 0 2026-03-09T19:28:05.042 INFO:tasks.workunit.client.0.vm07.stdout:1/937: dwrite d1/d3/f4 [4194304,4194304] 0 2026-03-09T19:28:05.043 INFO:tasks.workunit.client.0.vm07.stdout:8/968: symlink d7/d9/d37/d34/l157 0 2026-03-09T19:28:05.044 INFO:tasks.workunit.client.0.vm07.stdout:6/883: creat d0/d1/d28/da9/f143 x:0 0 0 2026-03-09T19:28:05.046 INFO:tasks.workunit.client.0.vm07.stdout:4/942: unlink d3/d4f/d56/d11e/l148 0 2026-03-09T19:28:05.048 INFO:tasks.workunit.client.0.vm07.stdout:9/973: mkdir d0/d13f/d147 0 2026-03-09T19:28:05.055 INFO:tasks.workunit.client.0.vm07.stdout:5/964: mkdir d3/dd/d26/d3f/d47/d130/d142 0 2026-03-09T19:28:05.059 INFO:tasks.workunit.client.0.vm07.stdout:8/969: chown d7/d9/ddf/c139 394614834 1 2026-03-09T19:28:05.059 INFO:tasks.workunit.client.0.vm07.stdout:0/925: dwrite d0/d6/d13/d1c/f27 [0,4194304] 0 2026-03-09T19:28:05.059 INFO:tasks.workunit.client.0.vm07.stdout:6/884: creat d0/d2d/dd5/d123/d7b/f144 x:0 0 0 2026-03-09T19:28:05.067 INFO:tasks.workunit.client.0.vm07.stdout:5/965: unlink d3/f4d 0 2026-03-09T19:28:05.072 INFO:tasks.workunit.client.0.vm07.stdout:6/885: mknod d0/d1/db/d24/da4/c145 0 2026-03-09T19:28:05.073 INFO:tasks.workunit.client.0.vm07.stdout:0/926: stat d0/d6/d13/d17/d19/d58/dd9/l106 0 2026-03-09T19:28:05.073 INFO:tasks.workunit.client.0.vm07.stdout:4/943: symlink d3/d11/d2b/d38/ddc/l14a 0 2026-03-09T19:28:05.073 INFO:tasks.workunit.client.0.vm07.stdout:9/974: truncate d0/d6/d57/deb/fc0 411700 0 2026-03-09T19:28:05.074 INFO:tasks.workunit.client.0.vm07.stdout:8/970: dwrite d7/d9/f87 [4194304,4194304] 0 2026-03-09T19:28:05.074 INFO:tasks.workunit.client.0.vm07.stdout:6/886: read - d0/d1/d28/da9/f143 zero size 2026-03-09T19:28:05.077 INFO:tasks.workunit.client.0.vm07.stdout:0/927: rmdir d0/d6/d13/d1c/d52/d81/d123 39 2026-03-09T19:28:05.078 INFO:tasks.workunit.client.0.vm07.stdout:5/966: mkdir d3/d1a/d28/d6c/d72/db5/d143 0 2026-03-09T19:28:05.087 INFO:tasks.workunit.client.0.vm07.stdout:4/944: mkdir 
d3/d11/d2b/d38/ddc/d91/d14b 0 2026-03-09T19:28:05.087 INFO:tasks.workunit.client.0.vm07.stdout:9/975: dread - d0/db/f91 zero size 2026-03-09T19:28:05.090 INFO:tasks.workunit.client.0.vm07.stdout:6/887: creat d0/d44/f146 x:0 0 0 2026-03-09T19:28:05.096 INFO:tasks.workunit.client.0.vm07.stdout:5/967: creat d3/dd/d26/d3f/d47/d71/db7/f144 x:0 0 0 2026-03-09T19:28:05.101 INFO:tasks.workunit.client.0.vm07.stdout:5/968: write d3/f10b [346451,127365] 0 2026-03-09T19:28:05.102 INFO:tasks.workunit.client.0.vm07.stdout:9/976: link d0/d6f/dc3/fff d0/d6/d57/d5d/dde/d118/f148 0 2026-03-09T19:28:05.105 INFO:tasks.workunit.client.0.vm07.stdout:6/888: creat d0/d2d/dd5/d123/d7b/f147 x:0 0 0 2026-03-09T19:28:05.106 INFO:tasks.workunit.client.0.vm07.stdout:0/928: dwrite d0/f1e [0,4194304] 0 2026-03-09T19:28:05.111 INFO:tasks.workunit.client.0.vm07.stdout:8/971: dwrite d7/d50/fa0 [0,4194304] 0 2026-03-09T19:28:05.114 INFO:tasks.workunit.client.0.vm07.stdout:0/929: stat d0/d6/d13/d17/d19/d58/dd9/l121 0 2026-03-09T19:28:05.121 INFO:tasks.workunit.client.0.vm07.stdout:4/945: dwrite d3/d11/f74 [0,4194304] 0 2026-03-09T19:28:05.128 INFO:tasks.workunit.client.0.vm07.stdout:0/930: truncate d0/d6/d13/d1c/fd7 693810 0 2026-03-09T19:28:05.128 INFO:tasks.workunit.client.0.vm07.stdout:8/972: symlink d7/d1d/d83/l158 0 2026-03-09T19:28:05.128 INFO:tasks.workunit.client.0.vm07.stdout:7/876: truncate d0/d80/db1/de5/d54/f7d 4378763 0 2026-03-09T19:28:05.128 INFO:tasks.workunit.client.0.vm07.stdout:9/977: creat d0/d6f/dc3/df8/f149 x:0 0 0 2026-03-09T19:28:05.145 INFO:tasks.workunit.client.0.vm07.stdout:8/973: unlink d7/d9/d37/d45/d4f/db1/fd6 0 2026-03-09T19:28:05.145 INFO:tasks.workunit.client.0.vm07.stdout:1/938: dwrite d1/d11/d37/d3f/f8e [0,4194304] 0 2026-03-09T19:28:05.145 INFO:tasks.workunit.client.0.vm07.stdout:8/974: read - d7/d50/da6/faf zero size 2026-03-09T19:28:05.157 INFO:tasks.workunit.client.0.vm07.stdout:9/978: symlink d0/d17/l14a 0 2026-03-09T19:28:05.158 
INFO:tasks.workunit.client.0.vm07.stdout:4/946: link d3/d11/c68 d3/dfc/c14c 0 2026-03-09T19:28:05.169 INFO:tasks.workunit.client.0.vm07.stdout:8/975: link d7/d50/fa0 d7/f159 0 2026-03-09T19:28:05.172 INFO:tasks.workunit.client.0.vm07.stdout:8/976: unlink d7/d9/d10/dd8/dfd/d62/fc2 0 2026-03-09T19:28:05.175 INFO:tasks.workunit.client.0.vm07.stdout:8/977: dwrite d7/d30/fb7 [0,4194304] 0 2026-03-09T19:28:05.190 INFO:tasks.workunit.client.0.vm07.stdout:6/889: write d0/d1/db/d17/f38 [744614,40608] 0 2026-03-09T19:28:05.192 INFO:tasks.workunit.client.0.vm07.stdout:6/890: creat d0/d4e/dae/daf/f148 x:0 0 0 2026-03-09T19:28:05.193 INFO:tasks.workunit.client.0.vm07.stdout:6/891: write d0/d2d/dd5/d123/d7b/d7d/fa7 [13187,27938] 0 2026-03-09T19:28:05.195 INFO:tasks.workunit.client.0.vm07.stdout:6/892: mkdir d0/d1/db/d52/d94/d81/d149 0 2026-03-09T19:28:05.201 INFO:tasks.workunit.client.0.vm07.stdout:9/979: sync 2026-03-09T19:28:05.201 INFO:tasks.workunit.client.0.vm07.stdout:5/969: write d3/dd/f8a [977179,123541] 0 2026-03-09T19:28:05.203 INFO:tasks.workunit.client.0.vm07.stdout:6/893: dread d0/d2d/dd5/d123/fd7 [0,4194304] 0 2026-03-09T19:28:05.205 INFO:tasks.workunit.client.0.vm07.stdout:6/894: stat d0/d2d/dd5/d123/dc7/c11c 0 2026-03-09T19:28:05.207 INFO:tasks.workunit.client.0.vm07.stdout:6/895: getdents d0/d1 0 2026-03-09T19:28:05.223 INFO:tasks.workunit.client.0.vm07.stdout:6/896: dread d0/d1/db/d1d/f27 [0,4194304] 0 2026-03-09T19:28:05.229 INFO:tasks.workunit.client.0.vm07.stdout:7/877: write d0/fda [583077,3887] 0 2026-03-09T19:28:05.232 INFO:tasks.workunit.client.0.vm07.stdout:0/931: truncate d0/d6/d13/dd0/fd5 91250 0 2026-03-09T19:28:05.233 INFO:tasks.workunit.client.0.vm07.stdout:4/947: write d3/d11/d29/f3c [4067549,30379] 0 2026-03-09T19:28:05.239 INFO:tasks.workunit.client.0.vm07.stdout:4/948: dread - d3/d11/d2b/d38/d8f/f117 zero size 2026-03-09T19:28:05.243 INFO:tasks.workunit.client.0.vm07.stdout:1/939: dwrite d1/db/f14 [0,4194304] 0 2026-03-09T19:28:05.247 
INFO:tasks.workunit.client.0.vm07.stdout:9/980: write d0/db/d29/d32/d5c/d80/fe2 [625497,65584] 0 2026-03-09T19:28:05.248 INFO:tasks.workunit.client.0.vm07.stdout:4/949: truncate d3/d11/d2b/d38/ddc/db2/d127/f132 417941 0 2026-03-09T19:28:05.253 INFO:tasks.workunit.client.0.vm07.stdout:6/897: write d0/d1/d28/da8/ffb [555907,9872] 0 2026-03-09T19:28:05.254 INFO:tasks.workunit.client.0.vm07.stdout:5/970: write d3/d1a/d28/d6c/d72/db5/fd2 [1127848,41845] 0 2026-03-09T19:28:05.255 INFO:tasks.workunit.client.0.vm07.stdout:9/981: dread - d0/db/f91 zero size 2026-03-09T19:28:05.255 INFO:tasks.workunit.client.0.vm07.stdout:0/932: dwrite d0/d6/d13/d1c/d52/d81/f128 [0,4194304] 0 2026-03-09T19:28:05.256 INFO:tasks.workunit.client.0.vm07.stdout:6/898: chown d0/d4e/dae/daf/f148 1099054 1 2026-03-09T19:28:05.258 INFO:tasks.workunit.client.0.vm07.stdout:8/978: dwrite d7/d9/d37/d45/f10d [0,4194304] 0 2026-03-09T19:28:05.259 INFO:tasks.workunit.client.0.vm07.stdout:7/878: link d0/d4/d5/d8/d41/d64/d74/d98/dcb/f63 d0/d4/d5/d8/d41/d64/d74/d98/dcb/df2/f11f 0 2026-03-09T19:28:05.283 INFO:tasks.workunit.client.0.vm07.stdout:8/979: truncate d7/d9/d10/f41 1626833 0 2026-03-09T19:28:05.283 INFO:tasks.workunit.client.0.vm07.stdout:5/971: rename d3/dd/d26/d3f/d47/d71/d76/d98/d126/d92/laf to d3/dd/d26/d3f/d47/d71/dfa/l145 0 2026-03-09T19:28:05.291 INFO:tasks.workunit.client.0.vm07.stdout:7/879: mknod d0/d4/d5/d26/c120 0 2026-03-09T19:28:05.295 INFO:tasks.workunit.client.0.vm07.stdout:1/940: creat d1/d3e/db3/f136 x:0 0 0 2026-03-09T19:28:05.295 INFO:tasks.workunit.client.0.vm07.stdout:4/950: symlink d3/d11/d2b/d38/ddc/d22/l14d 0 2026-03-09T19:28:05.295 INFO:tasks.workunit.client.0.vm07.stdout:8/980: creat d7/d9/d57/f15a x:0 0 0 2026-03-09T19:28:05.295 INFO:tasks.workunit.client.0.vm07.stdout:6/899: dread d0/d1/d28/da9/f116 [0,4194304] 0 2026-03-09T19:28:05.297 INFO:tasks.workunit.client.0.vm07.stdout:5/972: write d3/dd/d26/d2d/d79/d9f/fd0 [756980,117176] 0 2026-03-09T19:28:05.301 
INFO:tasks.workunit.client.0.vm07.stdout:8/981: symlink d7/d9/d37/d45/d4f/db1/d107/d11e/d12c/l15b 0 2026-03-09T19:28:05.305 INFO:tasks.workunit.client.0.vm07.stdout:4/951: dread d3/d11/d2b/d38/d8f/f136 [0,4194304] 0 2026-03-09T19:28:05.305 INFO:tasks.workunit.client.0.vm07.stdout:7/880: mkdir d0/d4/d5/d26/d32/dbd/d111/d11e/d7f/d121 0 2026-03-09T19:28:05.306 INFO:tasks.workunit.client.0.vm07.stdout:6/900: rename d0/d1/db/l71 to d0/d2d/dd5/l14a 0 2026-03-09T19:28:05.307 INFO:tasks.workunit.client.0.vm07.stdout:5/973: unlink d3/d1a/d28/c43 0 2026-03-09T19:28:05.308 INFO:tasks.workunit.client.0.vm07.stdout:1/941: sync 2026-03-09T19:28:05.314 INFO:tasks.workunit.client.0.vm07.stdout:8/982: mkdir d7/d9/d37/d45/d97/dbc/d13b/d15c 0 2026-03-09T19:28:05.315 INFO:tasks.workunit.client.0.vm07.stdout:6/901: write d0/d2d/dd5/d123/d7b/da0/d11e/f13f [408943,47285] 0 2026-03-09T19:28:05.316 INFO:tasks.workunit.client.0.vm07.stdout:5/974: dread - d3/d1a/d5d/dee/f110 zero size 2026-03-09T19:28:05.316 INFO:tasks.workunit.client.0.vm07.stdout:6/902: stat d0/d13/f18 0 2026-03-09T19:28:05.316 INFO:tasks.workunit.client.0.vm07.stdout:7/881: unlink d0/d4/d5/d8/d1a/de8/ce9 0 2026-03-09T19:28:05.320 INFO:tasks.workunit.client.0.vm07.stdout:4/952: rename d3/f8d to d3/d11/d2b/d38/ddc/db2/f14e 0 2026-03-09T19:28:05.324 INFO:tasks.workunit.client.0.vm07.stdout:6/903: write d0/d1/db/d91/f117 [783265,23684] 0 2026-03-09T19:28:05.325 INFO:tasks.workunit.client.0.vm07.stdout:7/882: mknod d0/d4/d5/d26/db9/dc2/c122 0 2026-03-09T19:28:05.330 INFO:tasks.workunit.client.0.vm07.stdout:9/982: write d0/dc1/f7c [1950830,67897] 0 2026-03-09T19:28:05.330 INFO:tasks.workunit.client.0.vm07.stdout:0/933: write d0/f3a [85219,93347] 0 2026-03-09T19:28:05.331 INFO:tasks.workunit.client.0.vm07.stdout:4/953: dread - d3/d11/d16/fae zero size 2026-03-09T19:28:05.334 INFO:tasks.workunit.client.0.vm07.stdout:0/934: sync 2026-03-09T19:28:05.336 INFO:tasks.workunit.client.0.vm07.stdout:5/975: read d3/d1a/f12 [3269185,96489] 
0 2026-03-09T19:28:05.338 INFO:tasks.workunit.client.0.vm07.stdout:8/983: mknod d7/d9/d37/d45/d4f/db1/d107/d118/c15d 0 2026-03-09T19:28:05.344 INFO:tasks.workunit.client.0.vm07.stdout:1/942: dread d1/d11/d37/d3f/f109 [0,4194304] 0 2026-03-09T19:28:05.349 INFO:tasks.workunit.client.0.vm07.stdout:1/943: write d1/d11/d37/d3f/d45/d87/d88/fd5 [3750453,62484] 0 2026-03-09T19:28:05.349 INFO:tasks.workunit.client.0.vm07.stdout:7/883: fsync d0/d4/d5/d8/d41/d64/d74/d98/dcb/f60 0 2026-03-09T19:28:05.358 INFO:tasks.workunit.client.0.vm07.stdout:4/954: fsync d3/f13 0 2026-03-09T19:28:05.360 INFO:tasks.workunit.client.0.vm07.stdout:9/983: rename d0/db/d29/d32/d5c/d80/ddf/d12c to d0/d13f/d147/d14b 0 2026-03-09T19:28:05.363 INFO:tasks.workunit.client.0.vm07.stdout:6/904: fdatasync d0/f8b 0 2026-03-09T19:28:05.367 INFO:tasks.workunit.client.0.vm07.stdout:6/905: sync 2026-03-09T19:28:05.371 INFO:tasks.workunit.client.0.vm07.stdout:0/935: symlink d0/d6/d13/d1c/l132 0 2026-03-09T19:28:05.371 INFO:tasks.workunit.client.0.vm07.stdout:9/984: mkdir d0/d6f/dc3/df8/dfc/d14c 0 2026-03-09T19:28:05.372 INFO:tasks.workunit.client.0.vm07.stdout:5/976: mkdir d3/dd/d26/d3f/d47/d71/d76/d98/d126/d92/d89/ddc/dde/dfb/d124/d12a/d146 0 2026-03-09T19:28:05.373 INFO:tasks.workunit.client.0.vm07.stdout:1/944: dread d1/db/f1f [0,4194304] 0 2026-03-09T19:28:05.373 INFO:tasks.workunit.client.0.vm07.stdout:7/884: dread d0/d4/d5/d8/d1a/d2a/f34 [0,4194304] 0 2026-03-09T19:28:05.373 INFO:tasks.workunit.client.0.vm07.stdout:0/936: stat d0/d6/d13/d17/f20 0 2026-03-09T19:28:05.376 INFO:tasks.workunit.client.0.vm07.stdout:8/984: getdents d7/d9/d10/dd8 0 2026-03-09T19:28:05.376 INFO:tasks.workunit.client.0.vm07.stdout:5/977: chown d3/d1a/d28/d6c/d72/df6 25 1 2026-03-09T19:28:05.378 INFO:tasks.workunit.client.0.vm07.stdout:4/955: write d3/d11/f7d [5206292,39884] 0 2026-03-09T19:28:05.389 INFO:tasks.workunit.client.0.vm07.stdout:6/906: getdents d0/d1/db/d24/da4 0 2026-03-09T19:28:05.392 
INFO:tasks.workunit.client.0.vm07.stdout:9/985: write d0/d6/d3a/d94/ff6 [111278,49564] 0 2026-03-09T19:28:05.397 INFO:tasks.workunit.client.0.vm07.stdout:4/956: dread d3/d11/d51/f121 [0,4194304] 0 2026-03-09T19:28:05.402 INFO:tasks.workunit.client.0.vm07.stdout:4/957: write d3/d4f/d56/f124 [304913,50551] 0 2026-03-09T19:28:05.412 INFO:tasks.workunit.client.0.vm07.stdout:8/985: mkdir d7/d9/d10/dd8/dfc/d15e 0 2026-03-09T19:28:05.413 INFO:tasks.workunit.client.0.vm07.stdout:0/937: creat d0/d6/d13/d1c/f133 x:0 0 0 2026-03-09T19:28:05.415 INFO:tasks.workunit.client.0.vm07.stdout:1/945: creat d1/d3e/f137 x:0 0 0 2026-03-09T19:28:05.415 INFO:tasks.workunit.client.0.vm07.stdout:1/946: dread - d1/d3e/f127 zero size 2026-03-09T19:28:05.417 INFO:tasks.workunit.client.0.vm07.stdout:6/907: truncate d0/d4e/d7f/dbe/ff2 444786 0 2026-03-09T19:28:05.420 INFO:tasks.workunit.client.0.vm07.stdout:7/885: mkdir d0/d4/d5/d8/d41/d64/d74/d98/dcb/d123 0 2026-03-09T19:28:05.420 INFO:tasks.workunit.client.0.vm07.stdout:4/958: mkdir d3/d11/d2b/d38/ddc/d22/d86/d14f 0 2026-03-09T19:28:05.421 INFO:tasks.workunit.client.0.vm07.stdout:8/986: fdatasync d7/d9/d10/fb9 0 2026-03-09T19:28:05.425 INFO:tasks.workunit.client.0.vm07.stdout:5/978: creat d3/dd/d26/f147 x:0 0 0 2026-03-09T19:28:05.426 INFO:tasks.workunit.client.0.vm07.stdout:6/908: creat d0/d44/dd3/f14b x:0 0 0 2026-03-09T19:28:05.429 INFO:tasks.workunit.client.0.vm07.stdout:8/987: mkdir d7/d9/d10/dd8/dfd/d62/d15f 0 2026-03-09T19:28:05.429 INFO:tasks.workunit.client.0.vm07.stdout:8/988: fdatasync d7/d30/fb7 0 2026-03-09T19:28:05.430 INFO:tasks.workunit.client.0.vm07.stdout:5/979: mknod d3/dd/d26/d2d/d9e/df0/c148 0 2026-03-09T19:28:05.433 INFO:tasks.workunit.client.0.vm07.stdout:4/959: dread d3/d11/d29/d34/f5c [0,4194304] 0 2026-03-09T19:28:05.434 INFO:tasks.workunit.client.0.vm07.stdout:9/986: rename d0/db/d29/d32/d5c/d69/f83 to d0/d6/d3a/d81/f14d 0 2026-03-09T19:28:05.434 INFO:tasks.workunit.client.0.vm07.stdout:4/960: chown d3/d4f/f12e 0 1 
2026-03-09T19:28:05.435 INFO:tasks.workunit.client.0.vm07.stdout:9/987: write d0/d6/d73/f12f [464428,1493] 0 2026-03-09T19:28:05.437 INFO:tasks.workunit.client.0.vm07.stdout:4/961: chown d3/d4f/d56/d5f/d125/dfe/c139 318 1 2026-03-09T19:28:05.437 INFO:tasks.workunit.client.0.vm07.stdout:9/988: chown d0/db/d29/d2c/lef 22330 1 2026-03-09T19:28:05.446 INFO:tasks.workunit.client.0.vm07.stdout:9/989: sync 2026-03-09T19:28:05.446 INFO:tasks.workunit.client.0.vm07.stdout:4/962: sync 2026-03-09T19:28:05.447 INFO:tasks.workunit.client.0.vm07.stdout:6/909: mkdir d0/d4e/d14c 0 2026-03-09T19:28:05.452 INFO:tasks.workunit.client.0.vm07.stdout:7/886: creat d0/f124 x:0 0 0 2026-03-09T19:28:05.455 INFO:tasks.workunit.client.0.vm07.stdout:5/980: mkdir d3/d1a/d28/d6c/d149 0 2026-03-09T19:28:05.455 INFO:tasks.workunit.client.0.vm07.stdout:5/981: chown d3/dd/d26/d2d/d60/d83 40 1 2026-03-09T19:28:05.462 INFO:tasks.workunit.client.0.vm07.stdout:8/989: mkdir d7/d9/d10/dd8/dfd/d160 0 2026-03-09T19:28:05.469 INFO:tasks.workunit.client.0.vm07.stdout:6/910: write d0/d2d/f4a [1661745,67629] 0 2026-03-09T19:28:05.470 INFO:tasks.workunit.client.0.vm07.stdout:7/887: chown d0/d4/d5/d26/d32/dbd/d111/d114/l11a 128 1 2026-03-09T19:28:05.471 INFO:tasks.workunit.client.0.vm07.stdout:8/990: dread d7/d50/fa0 [0,4194304] 0 2026-03-09T19:28:05.471 INFO:tasks.workunit.client.0.vm07.stdout:6/911: write d0/d1/db/d1d/d77/f129 [1046053,21699] 0 2026-03-09T19:28:05.471 INFO:tasks.workunit.client.0.vm07.stdout:5/982: mkdir d3/d1a/d28/d6c/d72/d8f/d14a 0 2026-03-09T19:28:05.473 INFO:tasks.workunit.client.0.vm07.stdout:0/938: rename d0/d6/d13/d1c/d11/l87 to d0/d6/d13/d17/d19/d57/l134 0 2026-03-09T19:28:05.477 INFO:tasks.workunit.client.0.vm07.stdout:9/990: truncate d0/d6/d57/deb/fc0 568232 0 2026-03-09T19:28:05.477 INFO:tasks.workunit.client.0.vm07.stdout:7/888: fdatasync d0/d4/d5/d26/dfb/f113 0 2026-03-09T19:28:05.477 INFO:tasks.workunit.client.0.vm07.stdout:5/983: truncate d3/f18 5236670 0 2026-03-09T19:28:05.478 
INFO:tasks.workunit.client.0.vm07.stdout:0/939: truncate d0/d6/fcf 1139241 0 2026-03-09T19:28:05.485 INFO:tasks.workunit.client.0.vm07.stdout:4/963: rename d3/d11/d2b/d38/d107 to d3/d11/d29/d101/d99/de7/d150 0 2026-03-09T19:28:05.486 INFO:tasks.workunit.client.0.vm07.stdout:6/912: dread d0/d1/d28/da8/fd1 [4194304,4194304] 0 2026-03-09T19:28:05.487 INFO:tasks.workunit.client.0.vm07.stdout:4/964: fsync d3/d11/d16/de1/f135 0 2026-03-09T19:28:05.490 INFO:tasks.workunit.client.0.vm07.stdout:7/889: mknod d0/d4/d5/d26/dfb/c125 0 2026-03-09T19:28:05.494 INFO:tasks.workunit.client.0.vm07.stdout:5/984: symlink d3/dd/d26/d3f/d47/d71/l14b 0 2026-03-09T19:28:05.496 INFO:tasks.workunit.client.0.vm07.stdout:8/991: dread d7/d50/f8f [4194304,4194304] 0 2026-03-09T19:28:05.497 INFO:tasks.workunit.client.0.vm07.stdout:9/991: symlink d0/db/d29/d68/d99/d102/l14e 0 2026-03-09T19:28:05.498 INFO:tasks.workunit.client.0.vm07.stdout:8/992: chown d7/d9/d37/d45/d4f/db1/d107/f11b 1 1 2026-03-09T19:28:05.499 INFO:tasks.workunit.client.0.vm07.stdout:7/890: mkdir d0/d4/d5/d8/d41/d64/d74/d98/d126 0 2026-03-09T19:28:05.500 INFO:tasks.workunit.client.0.vm07.stdout:5/985: rmdir d3/dd/d26/d3f/d47/d71/d76 39 2026-03-09T19:28:05.502 INFO:tasks.workunit.client.0.vm07.stdout:4/965: getdents d3/d11/d29/d34/de0 0 2026-03-09T19:28:05.502 INFO:tasks.workunit.client.0.vm07.stdout:9/992: fdatasync d0/db/d29/d32/d5c/d80/fe4 0 2026-03-09T19:28:05.503 INFO:tasks.workunit.client.0.vm07.stdout:4/966: readlink d3/d11/d2b/d38/ddc/d22/d86/led 0 2026-03-09T19:28:05.503 INFO:tasks.workunit.client.0.vm07.stdout:7/891: fdatasync d0/d4/d5/d26/d32/fa6 0 2026-03-09T19:28:05.506 INFO:tasks.workunit.client.0.vm07.stdout:8/993: rename d7/d9/d10/dd8/dfd/d67/f11f to d7/d9/d10/dd8/d10b/d11a/f161 0 2026-03-09T19:28:05.507 INFO:tasks.workunit.client.0.vm07.stdout:6/913: getdents d0/d1/d28/da9 0 2026-03-09T19:28:05.509 INFO:tasks.workunit.client.0.vm07.stdout:9/993: creat d0/d6/d3a/f14f x:0 0 0 2026-03-09T19:28:05.510 
INFO:tasks.workunit.client.0.vm07.stdout:4/967: creat d3/d4f/d56/d5f/d125/d133/f151 x:0 0 0 2026-03-09T19:28:05.510 INFO:tasks.workunit.client.0.vm07.stdout:6/914: chown d0/d1/db/d52/d94/d81/lc0 238 1 2026-03-09T19:28:05.511 INFO:tasks.workunit.client.0.vm07.stdout:6/915: fdatasync d0/d4e/d7f/fbc 0 2026-03-09T19:28:05.511 INFO:tasks.workunit.client.0.vm07.stdout:7/892: mknod d0/d4/d5/d8/d41/d64/c127 0 2026-03-09T19:28:05.512 INFO:tasks.workunit.client.0.vm07.stdout:6/916: chown d0/d1/db/d52/d94/d81 602433611 1 2026-03-09T19:28:05.515 INFO:tasks.workunit.client.0.vm07.stdout:9/994: creat d0/d6/d73/d105/f150 x:0 0 0 2026-03-09T19:28:05.515 INFO:tasks.workunit.client.0.vm07.stdout:8/994: dread d7/d1d/d83/d9f/dd2/def/f106 [0,4194304] 0 2026-03-09T19:28:05.519 INFO:tasks.workunit.client.0.vm07.stdout:4/968: dread d3/d11/d51/f121 [0,4194304] 0 2026-03-09T19:28:05.521 INFO:tasks.workunit.client.0.vm07.stdout:9/995: creat d0/d13f/d147/d14b/f151 x:0 0 0 2026-03-09T19:28:05.522 INFO:tasks.workunit.client.0.vm07.stdout:4/969: read d3/f64 [44196,12851] 0 2026-03-09T19:28:05.525 INFO:tasks.workunit.client.0.vm07.stdout:8/995: read d7/d9/d37/d45/d97/dbc/f143 [139099,97972] 0 2026-03-09T19:28:05.525 INFO:tasks.workunit.client.0.vm07.stdout:1/947: dwrite d1/d11/d37/d5d/dc1/fd6 [0,4194304] 0 2026-03-09T19:28:05.528 INFO:tasks.workunit.client.0.vm07.stdout:7/893: creat d0/d4/d5/d26/f128 x:0 0 0 2026-03-09T19:28:05.535 INFO:tasks.workunit.client.0.vm07.stdout:9/996: dwrite d0/db/d29/d68/d99/fae [0,4194304] 0 2026-03-09T19:28:05.552 INFO:tasks.workunit.client.0.vm07.stdout:6/917: link d0/d1/db/d1d/f47 d0/d2d/dd5/df5/f14d 0 2026-03-09T19:28:05.553 INFO:tasks.workunit.client.0.vm07.stdout:4/970: creat d3/d11/d2b/d38/ddc/d22/d86/f152 x:0 0 0 2026-03-09T19:28:05.556 INFO:tasks.workunit.client.0.vm07.stdout:4/971: read - d3/d11/d2b/d38/ddc/d91/f140 zero size 2026-03-09T19:28:05.557 INFO:tasks.workunit.client.0.vm07.stdout:8/996: unlink d7/d9/d37/d45/d4f/db1/d107/f11b 0 
2026-03-09T19:28:05.558 INFO:tasks.workunit.client.0.vm07.stdout:0/940: write d0/d6/d13/d1c/d11/d56/f7f [3800158,121230] 0 2026-03-09T19:28:05.560 INFO:tasks.workunit.client.0.vm07.stdout:7/894: dread d0/d4/d5/d8/d41/d64/d74/d98/dcb/f63 [0,4194304] 0 2026-03-09T19:28:05.561 INFO:tasks.workunit.client.0.vm07.stdout:1/948: rename d1/d11/d37/d3f/fd7 to d1/db/d31/d56/f138 0 2026-03-09T19:28:05.562 INFO:tasks.workunit.client.0.vm07.stdout:5/986: write d3/dd/d26/d3f/d47/f62 [2127,44026] 0 2026-03-09T19:28:05.563 INFO:tasks.workunit.client.0.vm07.stdout:7/895: chown d0/d4/d5/d26/dfb/f113 22538298 1 2026-03-09T19:28:05.563 INFO:tasks.workunit.client.0.vm07.stdout:4/972: read d3/d11/d2b/d38/ddc/fb0 [83665,58891] 0 2026-03-09T19:28:05.565 INFO:tasks.workunit.client.0.vm07.stdout:8/997: rename d7/d1d/d83/d9f/dd2 to d7/d9/d10/dd8/d10b/d162 0 2026-03-09T19:28:05.566 INFO:tasks.workunit.client.0.vm07.stdout:9/997: truncate d0/d6/d73/d10e/f120 1963310 0 2026-03-09T19:28:05.567 INFO:tasks.workunit.client.0.vm07.stdout:6/918: truncate d0/d4e/d7f/fca 749068 0 2026-03-09T19:28:05.568 INFO:tasks.workunit.client.0.vm07.stdout:0/941: fdatasync d0/f3d 0 2026-03-09T19:28:05.569 INFO:tasks.workunit.client.0.vm07.stdout:6/919: write d0/d2d/dd5/d123/f131 [396774,38883] 0 2026-03-09T19:28:05.572 INFO:tasks.workunit.client.0.vm07.stdout:9/998: symlink d0/d6f/d86/l152 0 2026-03-09T19:28:05.574 INFO:tasks.workunit.client.0.vm07.stdout:1/949: mkdir d1/d11/d37/d3f/d7e/dad/d139 0 2026-03-09T19:28:05.577 INFO:tasks.workunit.client.0.vm07.stdout:8/998: creat d7/d9/d37/d45/d97/dbc/f163 x:0 0 0 2026-03-09T19:28:05.579 INFO:tasks.workunit.client.0.vm07.stdout:6/920: creat d0/d1/db/d24/da4/f14e x:0 0 0 2026-03-09T19:28:05.583 INFO:tasks.workunit.client.0.vm07.stdout:5/987: creat d3/dd/d26/d3f/d47/f14c x:0 0 0 2026-03-09T19:28:05.585 INFO:tasks.workunit.client.0.vm07.stdout:9/999: rmdir d0/db/d29/d68/d99 39 2026-03-09T19:28:05.597 INFO:tasks.workunit.client.0.vm07.stdout:1/950: truncate d1/d3e/db3/d9a/fd9 
761487 0 2026-03-09T19:28:05.597 INFO:tasks.workunit.client.0.vm07.stdout:8/999: symlink d7/d9/d10/dd8/d10b/d162/def/l164 0 2026-03-09T19:28:05.598 INFO:tasks.workunit.client.0.vm07.stdout:0/942: unlink d0/d6/d13/d33/c9a 0 2026-03-09T19:28:05.599 INFO:tasks.workunit.client.0.vm07.stdout:4/973: link d3/d11/d2b/d38/ddc/d22/d86/cee d3/d11/d2b/d38/ddc/d22/d86/c153 0 2026-03-09T19:28:05.599 INFO:tasks.workunit.client.0.vm07.stdout:6/921: creat d0/d2d/dd5/d123/f14f x:0 0 0 2026-03-09T19:28:05.600 INFO:tasks.workunit.client.0.vm07.stdout:6/922: write d0/dbf/f138 [44881,5930] 0 2026-03-09T19:28:05.604 INFO:tasks.workunit.client.0.vm07.stdout:7/896: creat d0/d4/d5/d8/d1a/f129 x:0 0 0 2026-03-09T19:28:05.608 INFO:tasks.workunit.client.0.vm07.stdout:1/951: stat d1/db/d31/dca/l86 0 2026-03-09T19:28:05.610 INFO:tasks.workunit.client.0.vm07.stdout:4/974: symlink d3/d11/d29/d34/l154 0 2026-03-09T19:28:05.611 INFO:tasks.workunit.client.0.vm07.stdout:5/988: mknod d3/d1a/d28/d48/c14d 0 2026-03-09T19:28:05.615 INFO:tasks.workunit.client.0.vm07.stdout:0/943: fsync d0/d6/d13/dd0/fd5 0 2026-03-09T19:28:05.616 INFO:tasks.workunit.client.0.vm07.stdout:6/923: mkdir d0/d150 0 2026-03-09T19:28:05.617 INFO:tasks.workunit.client.0.vm07.stdout:0/944: read d0/f3a [819086,109470] 0 2026-03-09T19:28:05.618 INFO:tasks.workunit.client.0.vm07.stdout:1/952: write d1/db/d31/d4f/f77 [4962361,127312] 0 2026-03-09T19:28:05.619 INFO:tasks.workunit.client.0.vm07.stdout:0/945: fdatasync d0/d6/d13/d1c/d11/d56/f100 0 2026-03-09T19:28:05.620 INFO:tasks.workunit.client.0.vm07.stdout:1/953: readlink d1/db/d31/d4f/d7a/dd2/de8/leb 0 2026-03-09T19:28:05.632 INFO:tasks.workunit.client.0.vm07.stdout:1/954: truncate d1/d11/d37/d3f/f4a 2668067 0 2026-03-09T19:28:05.633 INFO:tasks.workunit.client.0.vm07.stdout:6/924: symlink d0/d4e/d7f/dbe/d12a/l151 0 2026-03-09T19:28:05.633 INFO:tasks.workunit.client.0.vm07.stdout:4/975: write d3/d4f/d56/d5f/fc2 [183487,18551] 0 2026-03-09T19:28:05.633 
INFO:tasks.workunit.client.0.vm07.stdout:1/955: dread - d1/d11/d37/f132 zero size 2026-03-09T19:28:05.635 INFO:tasks.workunit.client.0.vm07.stdout:6/925: read - d0/d1/db/d52/d94/d87/f13b zero size 2026-03-09T19:28:05.636 INFO:tasks.workunit.client.0.vm07.stdout:1/956: chown d1/f38 7355 1 2026-03-09T19:28:05.637 INFO:tasks.workunit.client.0.vm07.stdout:4/976: dread - d3/d11/d2b/d37/db6/f12c zero size 2026-03-09T19:28:05.637 INFO:tasks.workunit.client.0.vm07.stdout:5/989: symlink d3/dd/d26/d3f/d47/d71/d76/d98/d126/l14e 0 2026-03-09T19:28:05.638 INFO:tasks.workunit.client.0.vm07.stdout:7/897: dwrite d0/d80/db1/de5/d54/f7d [0,4194304] 0 2026-03-09T19:28:05.641 INFO:tasks.workunit.client.0.vm07.stdout:0/946: rename d0/d6/dc8 to d0/d6/d13/d135 0 2026-03-09T19:28:05.646 INFO:tasks.workunit.client.0.vm07.stdout:6/926: creat d0/d1/f152 x:0 0 0 2026-03-09T19:28:05.646 INFO:tasks.workunit.client.0.vm07.stdout:6/927: dread - d0/d2d/dd5/d123/d7b/f144 zero size 2026-03-09T19:28:05.651 INFO:tasks.workunit.client.0.vm07.stdout:1/957: mkdir d1/d11/d37/d3f/d6e/d9c/d13a 0 2026-03-09T19:28:05.655 INFO:tasks.workunit.client.0.vm07.stdout:7/898: read d0/d4/d5/d8/d41/d64/d74/d98/dcb/f60 [162851,110147] 0 2026-03-09T19:28:05.657 INFO:tasks.workunit.client.0.vm07.stdout:1/958: mkdir d1/d11/d37/d3f/d45/d87/d88/df4/d13b 0 2026-03-09T19:28:05.663 INFO:tasks.workunit.client.0.vm07.stdout:5/990: dwrite d3/d1a/d5d/f102 [0,4194304] 0 2026-03-09T19:28:05.664 INFO:tasks.workunit.client.0.vm07.stdout:7/899: chown d0/d4/d5/d8/d41/d64/d74/d98/f83 5323 1 2026-03-09T19:28:05.665 INFO:tasks.workunit.client.0.vm07.stdout:1/959: mknod d1/db/d31/d56/c13c 0 2026-03-09T19:28:05.668 INFO:tasks.workunit.client.0.vm07.stdout:1/960: creat d1/d11/d37/dcb/f13d x:0 0 0 2026-03-09T19:28:05.669 INFO:tasks.workunit.client.0.vm07.stdout:7/900: mkdir d0/d4/d5/d8/d41/d64/d79/d12a 0 2026-03-09T19:28:05.670 INFO:tasks.workunit.client.0.vm07.stdout:1/961: creat d1/db/d31/d4f/d7a/dd2/f13e x:0 0 0 2026-03-09T19:28:05.672 
INFO:tasks.workunit.client.0.vm07.stdout:1/962: write d1/d11/d37/d3f/d45/d87/d88/fd5 [3135449,9000] 0 2026-03-09T19:28:05.677 INFO:tasks.workunit.client.0.vm07.stdout:1/963: write d1/d11/d37/d3f/d6e/f9f [4239067,17354] 0 2026-03-09T19:28:05.687 INFO:tasks.workunit.client.0.vm07.stdout:5/991: dread d3/dd/ffe [0,4194304] 0 2026-03-09T19:28:05.789 INFO:tasks.workunit.client.0.vm07.stdout:4/977: dwrite d3/d11/d16/f82 [0,4194304] 0 2026-03-09T19:28:05.795 INFO:tasks.workunit.client.0.vm07.stdout:4/978: dread - d3/d11/d2b/d37/f95 zero size 2026-03-09T19:28:05.809 INFO:tasks.workunit.client.0.vm07.stdout:4/979: symlink d3/d11/d2b/d38/l155 0 2026-03-09T19:28:05.819 INFO:tasks.workunit.client.0.vm07.stdout:7/901: dwrite d0/d4/d5/d26/d32/dbd/d111/d11e/d7f/fad [0,4194304] 0 2026-03-09T19:28:05.825 INFO:tasks.workunit.client.0.vm07.stdout:4/980: dwrite d3/d11/d2b/d38/ddc/d22/f115 [0,4194304] 0 2026-03-09T19:28:05.836 INFO:tasks.workunit.client.0.vm07.stdout:7/902: fdatasync d0/d4/d5/d8/d41/f73 0 2026-03-09T19:28:05.845 INFO:tasks.workunit.client.0.vm07.stdout:6/928: link d0/d1/db/d52/f111 d0/f153 0 2026-03-09T19:28:05.850 INFO:tasks.workunit.client.0.vm07.stdout:4/981: mkdir d3/d11/d2b/d38/ddc/d156 0 2026-03-09T19:28:05.853 INFO:tasks.workunit.client.0.vm07.stdout:6/929: truncate d0/dbf/d95/f10e 5102445 0 2026-03-09T19:28:05.868 INFO:tasks.workunit.client.0.vm07.stdout:6/930: dwrite d0/d1/d28/da9/f116 [0,4194304] 0 2026-03-09T19:28:05.883 INFO:tasks.workunit.client.0.vm07.stdout:6/931: unlink d0/d4e/dae/ce4 0 2026-03-09T19:28:05.890 INFO:tasks.workunit.client.0.vm07.stdout:6/932: link d0/d44/dd3/f106 d0/d1/db/d24/f154 0 2026-03-09T19:28:05.891 INFO:tasks.workunit.client.0.vm07.stdout:6/933: write d0/d2d/f135 [329759,103097] 0 2026-03-09T19:28:05.896 INFO:tasks.workunit.client.0.vm07.stdout:6/934: mknod d0/d2d/dd5/d123/d7b/da0/d10f/c155 0 2026-03-09T19:28:05.926 INFO:tasks.workunit.client.0.vm07.stdout:0/947: symlink d0/d6/d13/d17/d19/d58/dd9/l136 0 2026-03-09T19:28:05.927 
INFO:tasks.workunit.client.0.vm07.stdout:4/982: creat d3/d4f/d56/d5f/f157 x:0 0 0 2026-03-09T19:28:05.935 INFO:tasks.workunit.client.0.vm07.stdout:0/948: getdents d0/d6/d13/d17 0 2026-03-09T19:28:05.942 INFO:tasks.workunit.client.0.vm07.stdout:4/983: dwrite d3/d11/d2b/d38/d8f/f136 [4194304,4194304] 0 2026-03-09T19:28:05.953 INFO:tasks.workunit.client.0.vm07.stdout:4/984: creat d3/d11/d2b/d38/f158 x:0 0 0 2026-03-09T19:28:05.960 INFO:tasks.workunit.client.0.vm07.stdout:4/985: creat d3/d11/d29/d34/de0/f159 x:0 0 0 2026-03-09T19:28:05.965 INFO:tasks.workunit.client.0.vm07.stdout:6/935: write d0/d1/db/f4b [235326,99123] 0 2026-03-09T19:28:05.970 INFO:tasks.workunit.client.0.vm07.stdout:6/936: creat d0/d1/db/d1d/d77/f156 x:0 0 0 2026-03-09T19:28:05.974 INFO:tasks.workunit.client.0.vm07.stdout:6/937: readlink d0/d1/db/d52/d94/d81/la5 0 2026-03-09T19:28:05.977 INFO:tasks.workunit.client.0.vm07.stdout:5/992: rename d3/dd/lf8 to d3/d1a/d28/d6c/l14f 0 2026-03-09T19:28:05.983 INFO:tasks.workunit.client.0.vm07.stdout:4/986: getdents d3/d4f/d56/d5f/d125/d133 0 2026-03-09T19:28:05.984 INFO:tasks.workunit.client.0.vm07.stdout:1/964: creat d1/d11/d37/f13f x:0 0 0 2026-03-09T19:28:05.984 INFO:tasks.workunit.client.0.vm07.stdout:0/949: write d0/d6/d13/d1c/d61/d69/f9c [623106,7363] 0 2026-03-09T19:28:05.985 INFO:tasks.workunit.client.0.vm07.stdout:6/938: creat d0/d2d/dd5/d123/d7b/da0/f157 x:0 0 0 2026-03-09T19:28:05.999 INFO:tasks.workunit.client.0.vm07.stdout:7/903: rename d0/d4/d5/d8/l2b to d0/l12b 0 2026-03-09T19:28:06.000 INFO:tasks.workunit.client.0.vm07.stdout:4/987: symlink d3/d11/d2b/d38/ddc/d22/d86/d14f/l15a 0 2026-03-09T19:28:06.002 INFO:tasks.workunit.client.0.vm07.stdout:1/965: getdents d1/d3e/db3/d9a 0 2026-03-09T19:28:06.004 INFO:tasks.workunit.client.0.vm07.stdout:7/904: stat d0/d80/db1/de5/d54/d5a/ff4 0 2026-03-09T19:28:06.009 INFO:tasks.workunit.client.0.vm07.stdout:5/993: sync 2026-03-09T19:28:06.014 INFO:tasks.workunit.client.0.vm07.stdout:1/966: creat 
d1/d11/d37/d5d/dc1/d107/f140 x:0 0 0 2026-03-09T19:28:06.014 INFO:tasks.workunit.client.0.vm07.stdout:7/905: symlink d0/d4/d5/d26/d32/dbd/l12c 0 2026-03-09T19:28:06.014 INFO:tasks.workunit.client.0.vm07.stdout:0/950: dread d0/d6/d13/f31 [0,4194304] 0 2026-03-09T19:28:06.014 INFO:tasks.workunit.client.0.vm07.stdout:4/988: mkdir d3/d4f/d56/d15b 0 2026-03-09T19:28:06.016 INFO:tasks.workunit.client.0.vm07.stdout:7/906: sync 2026-03-09T19:28:06.016 INFO:tasks.workunit.client.0.vm07.stdout:6/939: write d0/d1/db/f114 [648206,108857] 0 2026-03-09T19:28:06.019 INFO:tasks.workunit.client.0.vm07.stdout:7/907: chown d0/d80/db1/de5/d54/d95/l11c 139460302 1 2026-03-09T19:28:06.023 INFO:tasks.workunit.client.0.vm07.stdout:5/994: symlink d3/dd/d26/d3f/de0/d134/l150 0 2026-03-09T19:28:06.024 INFO:tasks.workunit.client.0.vm07.stdout:5/995: sync 2026-03-09T19:28:06.024 INFO:tasks.workunit.client.0.vm07.stdout:6/940: unlink d0/dbf/c7e 0 2026-03-09T19:28:06.025 INFO:tasks.workunit.client.0.vm07.stdout:5/996: stat d3/d1a/d28/d48/f13c 0 2026-03-09T19:28:06.026 INFO:tasks.workunit.client.0.vm07.stdout:6/941: write d0/d2d/dd5/d123/dc7/fce [3029657,50881] 0 2026-03-09T19:28:06.026 INFO:tasks.workunit.client.0.vm07.stdout:5/997: fdatasync d3/dd/d26/d2d/d100/fcd 0 2026-03-09T19:28:06.026 INFO:tasks.workunit.client.0.vm07.stdout:0/951: creat d0/d6/d13/d1c/d61/f137 x:0 0 0 2026-03-09T19:28:06.027 INFO:tasks.workunit.client.0.vm07.stdout:4/989: write d3/d11/d2b/d38/ddc/fb0 [263618,89560] 0 2026-03-09T19:28:06.028 INFO:tasks.workunit.client.0.vm07.stdout:7/908: read d0/d4/d5/d8/f35 [100476,109419] 0 2026-03-09T19:28:06.029 INFO:tasks.workunit.client.0.vm07.stdout:0/952: chown d0/d6 47535658 1 2026-03-09T19:28:06.030 INFO:tasks.workunit.client.0.vm07.stdout:1/967: mkdir d1/d11/d37/d3f/d45/d87/d88/df4/d13b/d141 0 2026-03-09T19:28:06.030 INFO:tasks.workunit.client.0.vm07.stdout:4/990: chown d3/d11/d16/l45 735826287 1 2026-03-09T19:28:06.036 INFO:tasks.workunit.client.0.vm07.stdout:6/942: mknod 
d0/dbf/dd9/d13d/c158 0 2026-03-09T19:28:06.036 INFO:tasks.workunit.client.0.vm07.stdout:5/998: creat d3/dd/d26/d3f/de0/d134/f151 x:0 0 0 2026-03-09T19:28:06.037 INFO:tasks.workunit.client.0.vm07.stdout:0/953: chown d0/d6/l3b 90557 1 2026-03-09T19:28:06.041 INFO:tasks.workunit.client.0.vm07.stdout:4/991: creat d3/d11/d2b/d38/ddc/d156/f15c x:0 0 0 2026-03-09T19:28:06.042 INFO:tasks.workunit.client.0.vm07.stdout:1/968: truncate d1/d3/d21/f119 1036452 0 2026-03-09T19:28:06.048 INFO:tasks.workunit.client.0.vm07.stdout:5/999: chown d3/d1a/d28/d48/c14d 1450 1 2026-03-09T19:28:06.057 INFO:tasks.workunit.client.0.vm07.stdout:0/954: mkdir d0/d6/d13/d17/d19/d58/dd9/df8/d138 0 2026-03-09T19:28:06.060 INFO:tasks.workunit.client.0.vm07.stdout:4/992: rename d3/d4f/d56/d5f/d125/dfe to d3/d11/d29/d15d 0 2026-03-09T19:28:06.069 INFO:tasks.workunit.client.0.vm07.stdout:7/909: dread d0/d4/f86 [0,4194304] 0 2026-03-09T19:28:06.071 INFO:tasks.workunit.client.0.vm07.stdout:1/969: rename d1/d11/d37/d3f/d6e/d9c/db6/cf1 to d1/db/d31/d56/c142 0 2026-03-09T19:28:06.072 INFO:tasks.workunit.client.0.vm07.stdout:0/955: mkdir d0/d6/d13/d17/d19/d58/dd9/df8/d139 0 2026-03-09T19:28:06.078 INFO:tasks.workunit.client.0.vm07.stdout:6/943: dwrite d0/d13/f18 [0,4194304] 0 2026-03-09T19:28:06.083 INFO:tasks.workunit.client.0.vm07.stdout:4/993: dwrite d3/d11/d16/f82 [0,4194304] 0 2026-03-09T19:28:06.083 INFO:tasks.workunit.client.0.vm07.stdout:1/970: write d1/d11/d37/f13f [916292,90753] 0 2026-03-09T19:28:06.085 INFO:tasks.workunit.client.0.vm07.stdout:6/944: chown d0/d1/db/d1d 209707 1 2026-03-09T19:28:06.085 INFO:tasks.workunit.client.0.vm07.stdout:7/910: readlink d0/d4/d5/d8/d1a/d2a/dc5/l11b 0 2026-03-09T19:28:06.086 INFO:tasks.workunit.client.0.vm07.stdout:7/911: write d0/d4/d5/d99/feb [562419,55571] 0 2026-03-09T19:28:06.100 INFO:tasks.workunit.client.0.vm07.stdout:0/956: write d0/d6/d13/f31 [2242654,91543] 0 2026-03-09T19:28:06.102 INFO:tasks.workunit.client.0.vm07.stdout:0/957: chown 
d0/d6/d13/d33/l48 1 1 2026-03-09T19:28:06.104 INFO:tasks.workunit.client.0.vm07.stdout:1/971: dread - d1/db/d31/d4f/fc4 zero size 2026-03-09T19:28:06.104 INFO:tasks.workunit.client.0.vm07.stdout:7/912: mknod d0/d4/d5/d26/d32/dbd/d111/d11e/d7f/c12d 0 2026-03-09T19:28:06.106 INFO:tasks.workunit.client.0.vm07.stdout:6/945: rename d0/d1/db/f43 to d0/d1/db/d17/d12e/f159 0 2026-03-09T19:28:06.110 INFO:tasks.workunit.client.0.vm07.stdout:0/958: dread d0/d6/d13/d135/f10f [0,4194304] 0 2026-03-09T19:28:06.118 INFO:tasks.workunit.client.0.vm07.stdout:0/959: mkdir d0/d6/d13/d1c/d61/d69/d13a 0 2026-03-09T19:28:06.120 INFO:tasks.workunit.client.0.vm07.stdout:4/994: dread d3/d11/f79 [0,4194304] 0 2026-03-09T19:28:06.120 INFO:tasks.workunit.client.0.vm07.stdout:7/913: dread d0/d4/d5/d8/d41/d64/dd5/ffc [0,4194304] 0 2026-03-09T19:28:06.124 INFO:tasks.workunit.client.0.vm07.stdout:4/995: rmdir d3/d4f/d56/d5f/d125/d133 39 2026-03-09T19:28:06.124 INFO:tasks.workunit.client.0.vm07.stdout:7/914: write d0/d4/d5/d8/d41/f89 [2923091,68869] 0 2026-03-09T19:28:06.127 INFO:tasks.workunit.client.0.vm07.stdout:1/972: dread d1/db/fb2 [0,4194304] 0 2026-03-09T19:28:06.132 INFO:tasks.workunit.client.0.vm07.stdout:1/973: chown d1/db/d31/d4f/d7a/lcf 12525431 1 2026-03-09T19:28:06.142 INFO:tasks.workunit.client.0.vm07.stdout:7/915: rmdir d0/d4/d5/d26/d32/dbd/d111/d114 39 2026-03-09T19:28:06.142 INFO:tasks.workunit.client.0.vm07.stdout:7/916: creat d0/d80/db1/de5/d54/d95/f12e x:0 0 0 2026-03-09T19:28:06.145 INFO:tasks.workunit.client.0.vm07.stdout:7/917: mknod d0/d4/d5/d26/d32/dbd/d111/d11e/d7f/c12f 0 2026-03-09T19:28:06.149 INFO:tasks.workunit.client.0.vm07.stdout:1/974: truncate d1/d11/d37/d3f/d6e/d9c/ff7 400585 0 2026-03-09T19:28:06.164 INFO:tasks.workunit.client.0.vm07.stdout:4/996: dread d3/fa2 [0,4194304] 0 2026-03-09T19:28:06.165 INFO:tasks.workunit.client.0.vm07.stdout:4/997: truncate d3/f7 216229 0 2026-03-09T19:28:06.173 INFO:tasks.workunit.client.0.vm07.stdout:4/998: dwrite d3/d4f/f12e 
[0,4194304] 0 2026-03-09T19:28:06.184 INFO:tasks.workunit.client.0.vm07.stdout:4/999: chown d3/d11/d51/f10c 156677 1 2026-03-09T19:28:06.248 INFO:tasks.workunit.client.0.vm07.stdout:6/946: truncate d0/d2d/f88 1819778 0 2026-03-09T19:28:06.249 INFO:tasks.workunit.client.0.vm07.stdout:6/947: symlink d0/d44/l15a 0 2026-03-09T19:28:06.249 INFO:tasks.workunit.client.0.vm07.stdout:6/948: stat d0/d2d/dd5/d123/d7b/d7d/fa7 0 2026-03-09T19:28:06.250 INFO:tasks.workunit.client.0.vm07.stdout:6/949: write d0/d44/dd3/f14b [664564,80957] 0 2026-03-09T19:28:06.252 INFO:tasks.workunit.client.0.vm07.stdout:0/960: truncate d0/d6/d13/d17/d19/d58/f77 2487494 0 2026-03-09T19:28:06.256 INFO:tasks.workunit.client.0.vm07.stdout:0/961: truncate d0/d6/d13/d1c/d50/f60 2344276 0 2026-03-09T19:28:06.260 INFO:tasks.workunit.client.0.vm07.stdout:0/962: rename d0/d6/d13/d17/l2a to d0/d6/d13/d1c/d11/l13b 0 2026-03-09T19:28:06.261 INFO:tasks.workunit.client.0.vm07.stdout:6/950: getdents d0/d2d 0 2026-03-09T19:28:06.269 INFO:tasks.workunit.client.0.vm07.stdout:6/951: mkdir d0/d44/d122/d15b 0 2026-03-09T19:28:06.269 INFO:tasks.workunit.client.0.vm07.stdout:0/963: creat d0/d6/d13/d1c/d52/d81/d123/f13c x:0 0 0 2026-03-09T19:28:06.272 INFO:tasks.workunit.client.0.vm07.stdout:1/975: write d1/d11/d37/d3f/d45/d87/fa9 [3087369,67117] 0 2026-03-09T19:28:06.273 INFO:tasks.workunit.client.0.vm07.stdout:7/918: write d0/d4/d5/f20 [2244877,7667] 0 2026-03-09T19:28:06.291 INFO:tasks.workunit.client.0.vm07.stdout:7/919: dread d0/d4/d5/d8/d41/d64/fbb [0,4194304] 0 2026-03-09T19:28:06.292 INFO:tasks.workunit.client.0.vm07.stdout:1/976: creat d1/d3e/db3/f143 x:0 0 0 2026-03-09T19:28:06.299 INFO:tasks.workunit.client.0.vm07.stdout:1/977: rmdir d1/d11/d37/d3f/d45/d87 39 2026-03-09T19:28:06.301 INFO:tasks.workunit.client.0.vm07.stdout:7/920: truncate d0/d80/db1/de5/f9a 281453 0 2026-03-09T19:28:06.303 INFO:tasks.workunit.client.0.vm07.stdout:1/978: dread d1/d3e/db3/d6d/dff/fe7 [0,4194304] 0 2026-03-09T19:28:06.304 
INFO:tasks.workunit.client.0.vm07.stdout:6/952: write d0/d13/f120 [779784,93816] 0 2026-03-09T19:28:06.307 INFO:tasks.workunit.client.0.vm07.stdout:0/964: truncate d0/d6/d13/d17/d19/d58/fa2 122977 0 2026-03-09T19:28:06.310 INFO:tasks.workunit.client.0.vm07.stdout:6/953: fdatasync d0/d1/d28/da8/fd1 0 2026-03-09T19:28:06.314 INFO:tasks.workunit.client.0.vm07.stdout:0/965: unlink d0/d6/c55 0 2026-03-09T19:28:06.315 INFO:tasks.workunit.client.0.vm07.stdout:6/954: mknod d0/d1/db/d52/d94/d81/d149/c15c 0 2026-03-09T19:28:06.318 INFO:tasks.workunit.client.0.vm07.stdout:1/979: getdents d1/d3e/db3/d6d 0 2026-03-09T19:28:06.319 INFO:tasks.workunit.client.0.vm07.stdout:7/921: dwrite d0/d80/db1/de5/db4/fdd [0,4194304] 0 2026-03-09T19:28:06.322 INFO:tasks.workunit.client.0.vm07.stdout:7/922: write d0/d4/d5/f20 [267942,22831] 0 2026-03-09T19:28:06.323 INFO:tasks.workunit.client.0.vm07.stdout:6/955: chown d0/d4e/d7f/fca 1 1 2026-03-09T19:28:06.335 INFO:tasks.workunit.client.0.vm07.stdout:1/980: creat d1/d11/f144 x:0 0 0 2026-03-09T19:28:06.337 INFO:tasks.workunit.client.0.vm07.stdout:0/966: dread d0/d6/d13/d17/d19/d57/d6a/fda [0,4194304] 0 2026-03-09T19:28:06.339 INFO:tasks.workunit.client.0.vm07.stdout:6/956: symlink d0/d1/db/d52/d94/d81/l15d 0 2026-03-09T19:28:06.340 INFO:tasks.workunit.client.0.vm07.stdout:6/957: fsync d0/dbf/f138 0 2026-03-09T19:28:06.344 INFO:tasks.workunit.client.0.vm07.stdout:6/958: rename d0/d4e/d7f/dbe/d12a to d0/dbf/dd9/d13d/d15e 0 2026-03-09T19:28:06.349 INFO:tasks.workunit.client.0.vm07.stdout:0/967: dread d0/d6/d13/d1c/d11/f29 [0,4194304] 0 2026-03-09T19:28:06.362 INFO:tasks.workunit.client.0.vm07.stdout:7/923: dwrite d0/d80/db1/de5/fd2 [0,4194304] 0 2026-03-09T19:28:06.364 INFO:tasks.workunit.client.0.vm07.stdout:1/981: dwrite d1/db/d31/dca/fa7 [0,4194304] 0 2026-03-09T19:28:06.367 INFO:tasks.workunit.client.0.vm07.stdout:7/924: readlink d0/d4/d5/d8/d41/l51 0 2026-03-09T19:28:06.371 INFO:tasks.workunit.client.0.vm07.stdout:0/968: write 
d0/d6/d13/d33/fe1 [4705700,65074] 0 2026-03-09T19:28:06.372 INFO:tasks.workunit.client.0.vm07.stdout:6/959: dwrite d0/dbf/d95/f74 [0,4194304] 0 2026-03-09T19:28:06.374 INFO:tasks.workunit.client.0.vm07.stdout:0/969: stat d0/cc4 0 2026-03-09T19:28:06.382 INFO:tasks.workunit.client.0.vm07.stdout:6/960: symlink d0/d1/db/d24/da4/dda/l15f 0 2026-03-09T19:28:06.382 INFO:tasks.workunit.client.0.vm07.stdout:7/925: write d0/d4/d5/d26/d32/dbd/d111/d11e/f6d [1587909,13809] 0 2026-03-09T19:28:06.386 INFO:tasks.workunit.client.0.vm07.stdout:1/982: creat d1/d11/d37/d3f/d11e/f145 x:0 0 0 2026-03-09T19:28:06.391 INFO:tasks.workunit.client.0.vm07.stdout:7/926: mkdir d0/d80/db1/de5/db4/d130 0 2026-03-09T19:28:06.401 INFO:tasks.workunit.client.0.vm07.stdout:0/970: mkdir d0/d6/d13/d135/d99/ddc/d13d 0 2026-03-09T19:28:06.401 INFO:tasks.workunit.client.0.vm07.stdout:1/983: stat d1/d11/d37/d3f/d45/f9d 0 2026-03-09T19:28:06.401 INFO:tasks.workunit.client.0.vm07.stdout:1/984: rename d1 to d1/d11/d37/d3f/d7e/dad/d146 22 2026-03-09T19:28:06.401 INFO:tasks.workunit.client.0.vm07.stdout:7/927: rmdir d0/d4/d5/d26/dc9 39 2026-03-09T19:28:06.401 INFO:tasks.workunit.client.0.vm07.stdout:1/985: symlink d1/d11/d37/dcb/d131/l147 0 2026-03-09T19:28:06.401 INFO:tasks.workunit.client.0.vm07.stdout:0/971: creat d0/d6/d13/d1c/d50/f13e x:0 0 0 2026-03-09T19:28:06.406 INFO:tasks.workunit.client.0.vm07.stdout:0/972: symlink d0/d6/d13/d17/dc3/l13f 0 2026-03-09T19:28:06.407 INFO:tasks.workunit.client.0.vm07.stdout:7/928: truncate d0/d4/d5/d8/d41/d64/d74/d98/dcb/d39/f97 223912 0 2026-03-09T19:28:06.413 INFO:tasks.workunit.client.0.vm07.stdout:7/929: write d0/d4/d5/d26/d32/f11d [659230,37300] 0 2026-03-09T19:28:06.419 INFO:tasks.workunit.client.0.vm07.stdout:0/973: link d0/d6/d13/d1c/d11/f80 d0/d6/d13/d17/dc3/f140 0 2026-03-09T19:28:06.419 INFO:tasks.workunit.client.0.vm07.stdout:6/961: dread d0/d1/db/d52/f111 [0,4194304] 0 2026-03-09T19:28:06.419 INFO:tasks.workunit.client.0.vm07.stdout:7/930: mknod 
d0/d4/d5/d8/c131 0 2026-03-09T19:28:06.419 INFO:tasks.workunit.client.0.vm07.stdout:0/974: creat d0/d6/d13/dd0/f141 x:0 0 0 2026-03-09T19:28:06.419 INFO:tasks.workunit.client.0.vm07.stdout:6/962: creat d0/d2d/dd5/d123/dc7/f160 x:0 0 0 2026-03-09T19:28:06.419 INFO:tasks.workunit.client.0.vm07.stdout:0/975: read - d0/d6/d13/d1c/fe8 zero size 2026-03-09T19:28:06.421 INFO:tasks.workunit.client.0.vm07.stdout:0/976: write d0/d6/d13/d1c/d52/d81/d123/f13c [8764,100200] 0 2026-03-09T19:28:06.423 INFO:tasks.workunit.client.0.vm07.stdout:6/963: symlink d0/dbf/dd9/l161 0 2026-03-09T19:28:06.424 INFO:tasks.workunit.client.0.vm07.stdout:7/931: chown d0/d4/d5/d26/db9/cd4 141432002 1 2026-03-09T19:28:06.425 INFO:tasks.workunit.client.0.vm07.stdout:0/977: symlink d0/l142 0 2026-03-09T19:28:06.428 INFO:tasks.workunit.client.0.vm07.stdout:0/978: truncate d0/d6/d13/d135/d99/fd4 736638 0 2026-03-09T19:28:06.431 INFO:tasks.workunit.client.0.vm07.stdout:0/979: truncate d0/d6/f11e 1461886 0 2026-03-09T19:28:06.437 INFO:tasks.workunit.client.0.vm07.stdout:0/980: dwrite d0/d6/d13/d33/f35 [0,4194304] 0 2026-03-09T19:28:06.447 INFO:tasks.workunit.client.0.vm07.stdout:0/981: symlink d0/d6/d13/d17/d19/d58/l143 0 2026-03-09T19:28:06.457 INFO:tasks.workunit.client.0.vm07.stdout:1/986: dwrite d1/d11/d37/dcb/f125 [0,4194304] 0 2026-03-09T19:28:06.460 INFO:tasks.workunit.client.0.vm07.stdout:6/964: write d0/d1/db/fe3 [988678,48842] 0 2026-03-09T19:28:06.464 INFO:tasks.workunit.client.0.vm07.stdout:6/965: chown d0/d1/db/d24/fc2 13650 1 2026-03-09T19:28:06.466 INFO:tasks.workunit.client.0.vm07.stdout:1/987: truncate d1/d11/d37/d5d/f8a 2130018 0 2026-03-09T19:28:06.470 INFO:tasks.workunit.client.0.vm07.stdout:7/932: dwrite d0/d4/d5/d8/d41/d64/d74/f82 [4194304,4194304] 0 2026-03-09T19:28:06.474 INFO:tasks.workunit.client.0.vm07.stdout:7/933: readlink d0/d4/d5/lb5 0 2026-03-09T19:28:06.479 INFO:tasks.workunit.client.0.vm07.stdout:7/934: readlink d0/d80/db1/de5/d54/d5a/l69 0 2026-03-09T19:28:06.491 
INFO:tasks.workunit.client.0.vm07.stdout:1/988: dread d1/d3/d21/f55 [0,4194304] 0 2026-03-09T19:28:06.491 INFO:tasks.workunit.client.0.vm07.stdout:1/989: write d1/db/d31/d56/f138 [3719268,66188] 0 2026-03-09T19:28:06.491 INFO:tasks.workunit.client.0.vm07.stdout:1/990: mkdir d1/d11/d37/d3f/d7e/dad/d148 0 2026-03-09T19:28:06.494 INFO:tasks.workunit.client.0.vm07.stdout:6/966: sync 2026-03-09T19:28:06.500 INFO:tasks.workunit.client.0.vm07.stdout:1/991: symlink d1/d11/d37/d5d/dc1/d107/l149 0 2026-03-09T19:28:06.502 INFO:tasks.workunit.client.0.vm07.stdout:6/967: link d0/d13/l59 d0/d1/db/d1d/l162 0 2026-03-09T19:28:06.508 INFO:tasks.workunit.client.0.vm07.stdout:6/968: creat d0/d2d/dd5/d123/d7b/da0/f163 x:0 0 0 2026-03-09T19:28:06.508 INFO:tasks.workunit.client.0.vm07.stdout:6/969: chown d0/dbf/d95/d31/c50 94213 1 2026-03-09T19:28:06.508 INFO:tasks.workunit.client.0.vm07.stdout:1/992: rename d1/d11/d37/d3f/d45/d87/d88/df4 to d1/d11/d37/d3f/d45/d87/d88/d14a 0 2026-03-09T19:28:06.508 INFO:tasks.workunit.client.0.vm07.stdout:6/970: symlink d0/d2d/dd5/d123/d7b/l164 0 2026-03-09T19:28:06.510 INFO:tasks.workunit.client.0.vm07.stdout:6/971: mkdir d0/d2d/dd5/d123/d7b/d7d/d165 0 2026-03-09T19:28:06.510 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:06 vm07.local ceph-mon[48545]: pgmap v12: 65 pgs: 65 active+clean; 4.0 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 65 MiB/s rd, 135 MiB/s wr, 393 op/s 2026-03-09T19:28:06.510 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:06 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:06.510 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:06 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:06.510 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:06 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:28:06.519 
INFO:tasks.workunit.client.0.vm07.stdout:6/972: creat d0/d1/f166 x:0 0 0 2026-03-09T19:28:06.520 INFO:tasks.workunit.client.0.vm07.stdout:6/973: stat d0/d4e/l110 0 2026-03-09T19:28:06.523 INFO:tasks.workunit.client.0.vm07.stdout:6/974: fdatasync d0/dbf/d95/d31/f3a 0 2026-03-09T19:28:06.525 INFO:tasks.workunit.client.0.vm07.stdout:6/975: mknod d0/d1/db/d17/d12e/c167 0 2026-03-09T19:28:06.526 INFO:tasks.workunit.client.0.vm07.stdout:6/976: readlink d0/d44/l54 0 2026-03-09T19:28:06.531 INFO:tasks.workunit.client.0.vm07.stdout:6/977: mknod d0/d1/db/d52/d94/d81/c168 0 2026-03-09T19:28:06.550 INFO:tasks.workunit.client.0.vm07.stdout:0/982: write d0/d6/d13/d17/d19/d57/f5a [312266,130246] 0 2026-03-09T19:28:06.569 INFO:tasks.workunit.client.0.vm07.stdout:0/983: rename d0/d6/d13/d17/d19/d58/dd9/df8/c110 to d0/d6/d13/d1c/d52/c144 0 2026-03-09T19:28:06.578 INFO:tasks.workunit.client.0.vm07.stdout:0/984: read d0/f3d [1061768,119596] 0 2026-03-09T19:28:06.585 INFO:tasks.workunit.client.0.vm07.stdout:7/935: dwrite d0/d4/d5/d26/f107 [0,4194304] 0 2026-03-09T19:28:06.586 INFO:tasks.workunit.client.0.vm07.stdout:1/993: dwrite d1/d11/d37/d3f/d45/f9d [0,4194304] 0 2026-03-09T19:28:06.590 INFO:tasks.workunit.client.0.vm07.stdout:6/978: write d0/d4e/d7f/fca [858051,22966] 0 2026-03-09T19:28:06.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:06 vm08.local ceph-mon[57794]: pgmap v12: 65 pgs: 65 active+clean; 4.0 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 65 MiB/s rd, 135 MiB/s wr, 393 op/s 2026-03-09T19:28:06.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:06 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:06.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:06 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:06.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:06 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' 
cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:28:06.595 INFO:tasks.workunit.client.0.vm07.stdout:6/979: write d0/d1/db/d52/d94/d87/f13b [238299,65620] 0 2026-03-09T19:28:06.605 INFO:tasks.workunit.client.0.vm07.stdout:0/985: creat d0/d6/d13/d1c/d50/f145 x:0 0 0 2026-03-09T19:28:06.605 INFO:tasks.workunit.client.0.vm07.stdout:1/994: rmdir d1/db/d31/d4f 39 2026-03-09T19:28:06.609 INFO:tasks.workunit.client.0.vm07.stdout:6/980: mknod d0/d2d/dd5/d123/d7b/da0/c169 0 2026-03-09T19:28:06.614 INFO:tasks.workunit.client.0.vm07.stdout:1/995: creat d1/d3e/db3/d9a/f14b x:0 0 0 2026-03-09T19:28:06.617 INFO:tasks.workunit.client.0.vm07.stdout:0/986: truncate d0/d6/d13/d1c/d52/fed 277785 0 2026-03-09T19:28:06.617 INFO:tasks.workunit.client.0.vm07.stdout:6/981: symlink d0/dbf/d133/l16a 0 2026-03-09T19:28:06.619 INFO:tasks.workunit.client.0.vm07.stdout:0/987: stat d0/d6/d13/d17/d19/d57/d6a/fec 0 2026-03-09T19:28:06.620 INFO:tasks.workunit.client.0.vm07.stdout:1/996: symlink d1/db/d31/d4f/d7a/l14c 0 2026-03-09T19:28:06.621 INFO:tasks.workunit.client.0.vm07.stdout:6/982: write d0/d1/d28/fc5 [1824661,25728] 0 2026-03-09T19:28:06.631 INFO:tasks.workunit.client.0.vm07.stdout:6/983: rename d0/d4e/dae to d0/d44/d122/d16b 0 2026-03-09T19:28:06.633 INFO:tasks.workunit.client.0.vm07.stdout:6/984: chown d0/d1/db/d1d/f47 0 1 2026-03-09T19:28:06.645 INFO:tasks.workunit.client.0.vm07.stdout:6/985: getdents d0/d44/d122/d16b/daf 0 2026-03-09T19:28:06.649 INFO:tasks.workunit.client.0.vm07.stdout:6/986: dread - d0/d2d/dd5/d123/d7b/f147 zero size 2026-03-09T19:28:06.654 INFO:tasks.workunit.client.0.vm07.stdout:7/936: truncate d0/d4/d5/d26/f107 1528159 0 2026-03-09T19:28:06.656 INFO:tasks.workunit.client.0.vm07.stdout:1/997: write d1/d3/d21/f10d [636462,77423] 0 2026-03-09T19:28:06.659 INFO:tasks.workunit.client.0.vm07.stdout:0/988: dread d0/d6/d13/d17/d19/f34 [0,4194304] 0 2026-03-09T19:28:06.660 INFO:tasks.workunit.client.0.vm07.stdout:0/989: fdatasync 
d0/d6/d13/d1c/d61/d69/fb9 0 2026-03-09T19:28:06.660 INFO:tasks.workunit.client.0.vm07.stdout:1/998: truncate d1/d91/ff8 331286 0 2026-03-09T19:28:06.661 INFO:tasks.workunit.client.0.vm07.stdout:0/990: truncate d0/d6/d13/dd0/f141 310542 0 2026-03-09T19:28:06.664 INFO:tasks.workunit.client.0.vm07.stdout:1/999: fdatasync d1/d3e/db3/f12a 0 2026-03-09T19:28:06.664 INFO:tasks.workunit.client.0.vm07.stdout:7/937: creat d0/d80/db1/de5/d54/d5a/d87/f132 x:0 0 0 2026-03-09T19:28:06.664 INFO:tasks.workunit.client.0.vm07.stdout:6/987: getdents d0/d1/db/d91 0 2026-03-09T19:28:06.668 INFO:tasks.workunit.client.0.vm07.stdout:7/938: chown d0/d80/db1/de5/l76 2071425 1 2026-03-09T19:28:06.671 INFO:tasks.workunit.client.0.vm07.stdout:6/988: dwrite d0/d2d/dd5/d123/dc7/f160 [0,4194304] 0 2026-03-09T19:28:06.673 INFO:tasks.workunit.client.0.vm07.stdout:0/991: symlink d0/d6/d13/d1c/d11/d8b/l146 0 2026-03-09T19:28:06.673 INFO:tasks.workunit.client.0.vm07.stdout:7/939: readlink d0/d80/db1/de5/d54/l5b 0 2026-03-09T19:28:06.682 INFO:tasks.workunit.client.0.vm07.stdout:6/989: truncate d0/dbf/f34 1830237 0 2026-03-09T19:28:06.687 INFO:tasks.workunit.client.0.vm07.stdout:6/990: rename d0/d44/d122/d16b/daf/c102 to d0/d1/db/d17/d12e/c16c 0 2026-03-09T19:28:06.689 INFO:tasks.workunit.client.0.vm07.stdout:7/940: rename d0/d80/db1/de5/db4 to d0/d4/d5/d8/d41/d64/d74/d98/dcb/d123/d133 0 2026-03-09T19:28:06.697 INFO:tasks.workunit.client.0.vm07.stdout:6/991: rename d0/d4e/d75/cbb to d0/d13/c16d 0 2026-03-09T19:28:06.705 INFO:tasks.workunit.client.0.vm07.stdout:0/992: dread d0/d6/d13/d1c/d11/f2e [0,4194304] 0 2026-03-09T19:28:06.705 INFO:tasks.workunit.client.0.vm07.stdout:0/993: chown d0/d6/d13/d33/ceb 6 1 2026-03-09T19:28:06.707 INFO:tasks.workunit.client.0.vm07.stdout:6/992: dread d0/f153 [0,4194304] 0 2026-03-09T19:28:06.708 INFO:tasks.workunit.client.0.vm07.stdout:6/993: readlink d0/d44/l15a 0 2026-03-09T19:28:06.721 INFO:tasks.workunit.client.0.vm07.stdout:6/994: mknod d0/d1/db/d52/d94/d81/c16e 0 
2026-03-09T19:28:06.729 INFO:tasks.workunit.client.0.vm07.stdout:6/995: dread d0/d4e/f78 [0,4194304] 0 2026-03-09T19:28:06.730 INFO:tasks.workunit.client.0.vm07.stdout:0/994: link d0/d6/d13/d17/d19/d58/dd9/f131 d0/d6/d13/d1c/f147 0 2026-03-09T19:28:06.733 INFO:tasks.workunit.client.0.vm07.stdout:7/941: dwrite d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/ffd [0,4194304] 0 2026-03-09T19:28:06.737 INFO:tasks.workunit.client.0.vm07.stdout:0/995: dwrite d0/d6/d13/d17/dc3/fb6 [0,4194304] 0 2026-03-09T19:28:06.758 INFO:tasks.workunit.client.0.vm07.stdout:7/942: dread - d0/d4/d5/d8/d41/d64/d74/d98/dcb/d123/d133/fd7 zero size 2026-03-09T19:28:06.758 INFO:tasks.workunit.client.0.vm07.stdout:0/996: creat d0/d6/d13/d17/d19/d57/d6a/f148 x:0 0 0 2026-03-09T19:28:06.759 INFO:tasks.workunit.client.0.vm07.stdout:7/943: write d0/d4/d5/d26/f128 [447462,103126] 0 2026-03-09T19:28:06.761 INFO:tasks.workunit.client.0.vm07.stdout:7/944: chown d0/d4/d5/d8/d41/d64/d74/d98/dcb/d123/d133/cf8 129706031 1 2026-03-09T19:28:06.764 INFO:tasks.workunit.client.0.vm07.stdout:7/945: fdatasync d0/d4/d5/d8/f35 0 2026-03-09T19:28:06.764 INFO:tasks.workunit.client.0.vm07.stdout:7/946: fdatasync d0/d4/d5/d26/dfb/f113 0 2026-03-09T19:28:06.767 INFO:tasks.workunit.client.0.vm07.stdout:7/947: rename d0/d80/db1/de5/d54/d5a/f96 to d0/d4/d5/d8/d1a/d2a/dc5/f134 0 2026-03-09T19:28:06.769 INFO:tasks.workunit.client.0.vm07.stdout:0/997: link d0/d6/d13/d17/d19/d58/dd9/l104 d0/d6/d13/d17/d19/d58/dd9/l149 0 2026-03-09T19:28:06.775 INFO:tasks.workunit.client.0.vm07.stdout:7/948: getdents d0/d80/db1/de5/d54 0 2026-03-09T19:28:06.776 INFO:tasks.workunit.client.0.vm07.stdout:7/949: mkdir d0/d4/d5/d135 0 2026-03-09T19:28:06.777 INFO:tasks.workunit.client.0.vm07.stdout:7/950: mknod d0/d4/d5/d135/c136 0 2026-03-09T19:28:06.783 INFO:tasks.workunit.client.0.vm07.stdout:6/996: write d0/d4e/d7f/dbe/fee [151084,96287] 0 2026-03-09T19:28:06.785 INFO:tasks.workunit.client.0.vm07.stdout:7/951: creat 
d0/d4/d5/d8/d41/d64/d74/d98/dcb/d123/d133/d130/f137 x:0 0 0 2026-03-09T19:28:06.791 INFO:tasks.workunit.client.0.vm07.stdout:7/952: creat d0/d4/d5/d26/dc9/f138 x:0 0 0 2026-03-09T19:28:06.793 INFO:tasks.workunit.client.0.vm07.stdout:7/953: mknod d0/d4/d5/d26/db9/c139 0 2026-03-09T19:28:06.798 INFO:tasks.workunit.client.0.vm07.stdout:7/954: symlink d0/d4/d5/d26/dc9/l13a 0 2026-03-09T19:28:06.799 INFO:tasks.workunit.client.0.vm07.stdout:0/998: truncate d0/d6/d13/d17/d19/f34 1245443 0 2026-03-09T19:28:06.802 INFO:tasks.workunit.client.0.vm07.stdout:7/955: rmdir d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58 39 2026-03-09T19:28:06.804 INFO:tasks.workunit.client.0.vm07.stdout:6/997: dwrite d0/dbf/d95/d31/f3c [0,4194304] 0 2026-03-09T19:28:06.809 INFO:tasks.workunit.client.0.vm07.stdout:6/998: write d0/dbf/d95/f105 [245611,75257] 0 2026-03-09T19:28:06.814 INFO:tasks.workunit.client.0.vm07.stdout:6/999: creat d0/d2d/f16f x:0 0 0 2026-03-09T19:28:06.816 INFO:tasks.workunit.client.0.vm07.stdout:0/999: getdents d0/d6/d13/d17/dc3 0 2026-03-09T19:28:06.825 INFO:tasks.workunit.client.0.vm07.stdout:7/956: dwrite d0/d80/db1/de5/d54/d5a/d87/f132 [0,4194304] 0 2026-03-09T19:28:06.827 INFO:tasks.workunit.client.0.vm07.stdout:7/957: write d0/d4/f12 [2210039,31792] 0 2026-03-09T19:28:06.832 INFO:tasks.workunit.client.0.vm07.stdout:7/958: mkdir d0/d4/d5/d26/d32/dbd/d111/d13b 0 2026-03-09T19:28:06.833 INFO:tasks.workunit.client.0.vm07.stdout:7/959: readlink d0/d80/db1/de5/d54/d5a/l7b 0 2026-03-09T19:28:06.842 INFO:tasks.workunit.client.0.vm07.stdout:7/960: rename d0/d4/d5/d26/d32/dbd/fba to d0/d4/f13c 0 2026-03-09T19:28:06.845 INFO:tasks.workunit.client.0.vm07.stdout:7/961: dread d0/d4/d5/d8/d41/d64/d74/d98/dcb/df2/f11f [0,4194304] 0 2026-03-09T19:28:06.850 INFO:tasks.workunit.client.0.vm07.stdout:7/962: dwrite d0/d4/d5/d8/d41/d64/d74/d98/fd9 [0,4194304] 0 2026-03-09T19:28:06.860 INFO:tasks.workunit.client.0.vm07.stdout:7/963: mkdir d0/d80/db1/de5/d13d 0 2026-03-09T19:28:06.861 
INFO:tasks.workunit.client.0.vm07.stdout:7/964: dread - d0/d80/db1/de5/d54/d5a/ff4 zero size 2026-03-09T19:28:06.861 INFO:tasks.workunit.client.0.vm07.stdout:7/965: fsync d0/d80/db1/de5/fd2 0 2026-03-09T19:28:06.863 INFO:tasks.workunit.client.0.vm07.stdout:7/966: truncate d0/d4/d5/d8/d41/d64/fee 228866 0 2026-03-09T19:28:06.884 INFO:tasks.workunit.client.0.vm07.stdout:7/967: write d0/d4/d5/f85 [4030,15666] 0 2026-03-09T19:28:06.886 INFO:tasks.workunit.client.0.vm07.stdout:7/968: mknod d0/d4/d5/d8/d41/c13e 0 2026-03-09T19:28:06.888 INFO:tasks.workunit.client.0.vm07.stdout:7/969: mknod d0/d4/d5/d26/d32/dbd/d111/d11e/d7f/c13f 0 2026-03-09T19:28:06.889 INFO:tasks.workunit.client.0.vm07.stdout:7/970: mknod d0/d4/d5/d26/d32/dbd/d111/d11e/c140 0 2026-03-09T19:28:06.893 INFO:tasks.workunit.client.0.vm07.stdout:7/971: creat d0/d4/d5/d8/d41/d64/f141 x:0 0 0 2026-03-09T19:28:06.895 INFO:tasks.workunit.client.0.vm07.stdout:7/972: rmdir d0/d80/db1/de5/d54/d5a 39 2026-03-09T19:28:06.906 INFO:tasks.workunit.client.0.vm07.stdout:7/973: dwrite d0/d80/db1/de5/f9a [0,4194304] 0 2026-03-09T19:28:06.912 INFO:tasks.workunit.client.0.vm07.stdout:7/974: mknod d0/d4/d5/d8/d41/d10b/c142 0 2026-03-09T19:28:06.914 INFO:tasks.workunit.client.0.vm07.stdout:7/975: creat d0/d4/d5/d26/d32/dbd/d111/d11e/d7f/d121/f143 x:0 0 0 2026-03-09T19:28:06.940 INFO:tasks.workunit.client.0.vm07.stdout:7/976: write d0/d4/d5/d8/d41/d64/d79/fea [134165,26314] 0 2026-03-09T19:28:06.947 INFO:tasks.workunit.client.0.vm07.stdout:7/977: dread d0/d4/d5/d26/d32/f45 [0,4194304] 0 2026-03-09T19:28:06.951 INFO:tasks.workunit.client.0.vm07.stdout:7/978: read d0/d4/d5/d8/d41/d64/d74/f88 [4362551,126763] 0 2026-03-09T19:28:06.956 INFO:tasks.workunit.client.0.vm07.stdout:7/979: dwrite d0/d4/d5/f116 [0,4194304] 0 2026-03-09T19:28:06.961 INFO:tasks.workunit.client.0.vm07.stdout:7/980: write d0/d4/d5/d26/d32/f11d [907341,122712] 0 2026-03-09T19:28:06.964 INFO:tasks.workunit.client.0.vm07.stdout:7/981: chown d0/l8a 125577673 1 
2026-03-09T19:28:06.971 INFO:tasks.workunit.client.0.vm07.stdout:7/982: creat d0/d80/db1/de5/d54/f144 x:0 0 0 2026-03-09T19:28:06.997 INFO:tasks.workunit.client.0.vm07.stdout:7/983: chown d0/d80/db1/de5/d54/d95/l11c 246418230 1 2026-03-09T19:28:07.001 INFO:tasks.workunit.client.0.vm07.stdout:7/984: creat d0/d4/d5/d26/d32/d10c/f145 x:0 0 0 2026-03-09T19:28:07.002 INFO:tasks.workunit.client.0.vm07.stdout:7/985: readlink d0/d4/d5/d8/d41/d64/d79/l90 0 2026-03-09T19:28:07.006 INFO:tasks.workunit.client.0.vm07.stdout:7/986: creat d0/d4/d5/d26/d32/dbd/d111/f146 x:0 0 0 2026-03-09T19:28:07.031 INFO:tasks.workunit.client.0.vm07.stdout:7/987: sync 2026-03-09T19:28:07.035 INFO:tasks.workunit.client.0.vm07.stdout:7/988: creat d0/d4/d5/d8/d41/d64/d79/f147 x:0 0 0 2026-03-09T19:28:07.045 INFO:tasks.workunit.client.0.vm07.stdout:7/989: truncate d0/d80/db1/ff7 586621 0 2026-03-09T19:28:07.051 INFO:tasks.workunit.client.0.vm07.stdout:7/990: link d0/d4/d5/d8/d41/f73 d0/d80/db1/de5/d54/d95/f148 0 2026-03-09T19:28:07.071 INFO:tasks.workunit.client.0.vm07.stdout:7/991: write d0/d80/db1/de5/d54/d5a/fef [5033515,38212] 0 2026-03-09T19:28:07.083 INFO:tasks.workunit.client.0.vm07.stdout:7/992: rmdir d0/d4/d5/d26/dfb 39 2026-03-09T19:28:07.087 INFO:tasks.workunit.client.0.vm07.stdout:7/993: chown d0/d4/d5/d8/d41/d64/d74/d98/dcb/d58/f70 46 1 2026-03-09T19:28:07.088 INFO:tasks.workunit.client.0.vm07.stdout:7/994: chown d0/d80/db1/de5/d54/d95 3010 1 2026-03-09T19:28:07.094 INFO:tasks.workunit.client.0.vm07.stdout:7/995: rename d0/d4/d5/d8/d41/d64/d79/d12a to d0/d4/d5/d8/d1a/d2a/dc5/d149 0 2026-03-09T19:28:07.117 INFO:tasks.workunit.client.0.vm07.stdout:7/996: getdents d0/d4/d5/d26/db9 0 2026-03-09T19:28:07.123 INFO:tasks.workunit.client.0.vm07.stdout:7/997: creat d0/d4/d5/d8/d41/d64/f14a x:0 0 0 2026-03-09T19:28:07.125 INFO:tasks.workunit.client.0.vm07.stdout:7/998: dread - d0/d4/d5/d26/ffa zero size 2026-03-09T19:28:07.126 INFO:tasks.workunit.client.0.vm07.stdout:7/999: dread - 
d0/d4/d5/d8/d41/d64/d74/d98/dcb/d123/d133/fd7 zero size 2026-03-09T19:28:07.131 INFO:tasks.workunit.client.0.vm07.stderr:+ rm -rf -- ./tmp.h9HUznJztE 2026-03-09T19:28:07.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:07 vm08.local ceph-mon[57794]: pgmap v13: 65 pgs: 65 active+clean; 4.0 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 46 MiB/s rd, 93 MiB/s wr, 269 op/s 2026-03-09T19:28:07.571 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:07 vm07.local ceph-mon[48545]: pgmap v13: 65 pgs: 65 active+clean; 4.0 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 46 MiB/s rd, 93 MiB/s wr, 269 op/s 2026-03-09T19:28:09.985 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:09 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:09.985 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:09 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:09.985 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:09 vm07.local ceph-mon[48545]: pgmap v14: 65 pgs: 65 active+clean; 4.0 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 62 MiB/s rd, 131 MiB/s wr, 386 op/s 2026-03-09T19:28:09.985 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:09 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:09.985 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:09 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:09 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:09 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:09 vm08.local ceph-mon[57794]: pgmap v14: 65 pgs: 65 active+clean; 4.0 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 62 MiB/s rd, 131 MiB/s wr, 386 
op/s 2026-03-09T19:28:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:09 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:09 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: pgmap v15: 65 pgs: 65 active+clean; 4.0 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 41 MiB/s rd, 87 MiB/s wr, 270 op/s 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mgr fail", "who": "vm08.mxylvw"}]: dispatch 2026-03-09T19:28:12.472 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mgr fail", "who": "vm08.mxylvw"}]: dispatch 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: Activating manager daemon vm07.xacuym 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd='[{"prefix": "mgr fail", "who": "vm08.mxylvw"}]': finished 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: osdmap e44: 6 total, 6 up, 6 in 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: mgrmap e25: vm07.xacuym(active, starting, since 0.0140518s) 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: Active manager daemon vm07.xacuym restarted 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: Activating manager daemon vm07.xacuym 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: osdmap e45: 6 total, 6 up, 6 in 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: mgrmap e26: vm07.xacuym(active, starting, since 0.0118069s) 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.? 
192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.xacuym/crt"}]: dispatch 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.xacuym/crt"}]: dispatch 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:28:12.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.zkmcyw"}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.zcaqju"}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local 
ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm07.xacuym", "id": "vm07.xacuym"}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.xacuym/key"}]: dispatch 2026-03-09T19:28:12.473 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:12 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: pgmap v15: 65 pgs: 65 active+clean; 4.0 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 41 MiB/s rd, 87 MiB/s wr, 270 op/s 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' 
cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14696 192.168.123.108:0/3528536194' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mgr fail", "who": "vm08.mxylvw"}]: dispatch 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "mgr fail", "who": "vm08.mxylvw"}]: dispatch 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: Activating manager daemon vm07.xacuym 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14696 ' entity='mgr.vm08.mxylvw' cmd='[{"prefix": "mgr fail", "who": "vm08.mxylvw"}]': finished 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: osdmap e44: 6 total, 6 up, 6 in 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: mgrmap e25: vm07.xacuym(active, starting, since 0.0140518s) 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: Active manager daemon vm07.xacuym restarted 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: Activating manager daemon vm07.xacuym 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local 
ceph-mon[57794]: osdmap e45: 6 total, 6 up, 6 in 2026-03-09T19:28:12.584 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: mgrmap e26: vm07.xacuym(active, starting, since 0.0118069s) 2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.xacuym/crt"}]: dispatch 2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.xacuym/crt"}]: dispatch 2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.zkmcyw"}]: dispatch 2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.zcaqju"}]: dispatch 
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm07.xacuym", "id": "vm07.xacuym"}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.xacuym/key"}]: dispatch
2026-03-09T19:28:12.585 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:12 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-09T19:28:13.480 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:13 vm07.local ceph-mon[48545]: Manager daemon vm07.xacuym is now available
2026-03-09T19:28:13.480 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:13 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:13.480 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:13 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/mirror_snapshot_schedule"}]: dispatch
2026-03-09T19:28:13.480 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:13 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:28:13.480 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:13 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/trash_purge_schedule"}]: dispatch
2026-03-09T19:28:13.577 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:13 vm08.local ceph-mon[57794]: Manager daemon vm07.xacuym is now available
2026-03-09T19:28:13.577 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:13 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:13.577 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:13 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/mirror_snapshot_schedule"}]: dispatch
2026-03-09T19:28:13.577 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:13 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:28:13.577 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:13 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/trash_purge_schedule"}]: dispatch
2026-03-09T19:28:14.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:14 vm08.local ceph-mon[57794]: mgrmap e27: vm07.xacuym(active, since 1.33859s)
2026-03-09T19:28:14.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:14 vm08.local ceph-mon[57794]: [09/Mar/2026:19:28:13] ENGINE Bus STARTING
2026-03-09T19:28:14.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:14 vm08.local ceph-mon[57794]: [09/Mar/2026:19:28:14] ENGINE Serving on https://192.168.123.107:7150
2026-03-09T19:28:14.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:14 vm08.local ceph-mon[57794]: [09/Mar/2026:19:28:14] ENGINE Client ('192.168.123.107', 50660) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-09T19:28:14.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:14 vm07.local ceph-mon[48545]: mgrmap e27: vm07.xacuym(active, since 1.33859s)
2026-03-09T19:28:14.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:14 vm07.local ceph-mon[48545]: [09/Mar/2026:19:28:13] ENGINE Bus STARTING
2026-03-09T19:28:14.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:14 vm07.local ceph-mon[48545]: [09/Mar/2026:19:28:14] ENGINE Serving on https://192.168.123.107:7150
2026-03-09T19:28:14.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:14 vm07.local ceph-mon[48545]: [09/Mar/2026:19:28:14] ENGINE Client ('192.168.123.107', 50660) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-09T19:28:15.280 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:15 vm08.local ceph-mon[57794]: [09/Mar/2026:19:28:14] ENGINE Serving on http://192.168.123.107:8765
2026-03-09T19:28:15.280 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:15 vm08.local ceph-mon[57794]: [09/Mar/2026:19:28:14] ENGINE Bus STARTED
2026-03-09T19:28:15.280 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:15 vm08.local ceph-mon[57794]: mgrmap e28: vm07.xacuym(active, since 2s)
2026-03-09T19:28:15.280 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:15 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:15.280 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:15 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:15.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:15 vm07.local ceph-mon[48545]: [09/Mar/2026:19:28:14] ENGINE Serving on http://192.168.123.107:8765
2026-03-09T19:28:15.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:15 vm07.local ceph-mon[48545]: [09/Mar/2026:19:28:14] ENGINE Bus STARTED
2026-03-09T19:28:15.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:15 vm07.local ceph-mon[48545]: mgrmap e28: vm07.xacuym(active, since 2s)
2026-03-09T19:28:15.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:15 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:15.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:15 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.425 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:16 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.713 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:16 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.713 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:16 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.713 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:16 vm08.local ceph-mon[57794]: pgmap v5: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail
2026-03-09T19:28:16.713 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:16 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.713 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:16 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.713 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:16 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.713 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:16 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.713 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:16 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:28:16.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:16 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:16 vm07.local ceph-mon[48545]: pgmap v5: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail
2026-03-09T19:28:16.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:16 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:16 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:16 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:16 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:16.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:16 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:28:17.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:17 vm07.local ceph-mon[48545]: mgrmap e29: vm07.xacuym(active, since 4s)
2026-03-09T19:28:17.784 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:17 vm08.local ceph-mon[57794]: mgrmap e29: vm07.xacuym(active, since 4s)
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: Updating vm07:/etc/ceph/ceph.conf
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: Updating vm08:/etc/ceph/ceph.conf
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: pgmap v6: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: Updating vm07:/etc/ceph/ceph.client.admin.keyring
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: Updating vm08:/etc/ceph/ceph.client.admin.keyring
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.771 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:18 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.930 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.930 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.930 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:28:18.930 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:18.930 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:28:18.930 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: Updating vm07:/etc/ceph/ceph.conf
2026-03-09T19:28:18.930 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: Updating vm08:/etc/ceph/ceph.conf
2026-03-09T19:28:18.930 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf
2026-03-09T19:28:18.930 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf
2026-03-09T19:28:18.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: pgmap v6: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail
2026-03-09T19:28:18.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: Updating vm07:/etc/ceph/ceph.client.admin.keyring
2026-03-09T19:28:18.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: Updating vm08:/etc/ceph/ceph.client.admin.keyring
2026-03-09T19:28:18.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring
2026-03-09T19:28:18.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:18.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:18 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:19.657 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:19 vm08.local ceph-mon[57794]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring
2026-03-09T19:28:19.658 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:19 vm08.local ceph-mon[57794]: Reconfiguring prometheus.vm07 (dependencies changed)...
2026-03-09T19:28:19.658 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:19 vm08.local ceph-mon[57794]: Reconfiguring daemon prometheus.vm07 on vm07
2026-03-09T19:28:19.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:19 vm07.local ceph-mon[48545]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring
2026-03-09T19:28:19.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:19 vm07.local ceph-mon[48545]: Reconfiguring prometheus.vm07 (dependencies changed)...
2026-03-09T19:28:19.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:19 vm07.local ceph-mon[48545]: Reconfiguring daemon prometheus.vm07 on vm07
2026-03-09T19:28:20.815 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:20.815 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:20.815 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T19:28:20.815 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T19:28:20.815 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:20.815 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: Standby manager daemon vm08.mxylvw started
2026-03-09T19:28:20.815 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.108:0/3022912457' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/crt"}]: dispatch
2026-03-09T19:28:20.815 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.108:0/3022912457' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-09T19:28:20.815 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.108:0/3022912457' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/key"}]: dispatch
2026-03-09T19:28:20.815 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.108:0/3022912457' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-09T19:28:20.815 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: pgmap v7: 65 pgs: 65 active+clean; 2.7 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 35 KiB/s rd, 451 KiB/s wr, 73 op/s
2026-03-09T19:28:20.816 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.mxylvw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T19:28:20.816 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T19:28:20.816 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:20 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:20.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:20.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: Standby manager daemon vm08.mxylvw started
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mgr.? 192.168.123.108:0/3022912457' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/crt"}]: dispatch
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mgr.? 192.168.123.108:0/3022912457' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mgr.? 192.168.123.108:0/3022912457' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/key"}]: dispatch
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mgr.? 192.168.123.108:0/3022912457' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: pgmap v7: 65 pgs: 65 active+clean; 2.7 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 35 KiB/s rd, 451 KiB/s wr, 73 op/s
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.mxylvw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T19:28:20.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:20 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:21 vm08.local ceph-mon[57794]: Upgrade: Updating mgr.vm08.mxylvw
2026-03-09T19:28:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:21 vm08.local ceph-mon[57794]: Deploying daemon mgr.vm08.mxylvw on vm08
2026-03-09T19:28:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:21 vm08.local ceph-mon[57794]: mgrmap e30: vm07.xacuym(active, since 8s), standbys: vm08.mxylvw
2026-03-09T19:28:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:21 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm08.mxylvw", "id": "vm08.mxylvw"}]: dispatch
2026-03-09T19:28:21.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:21 vm07.local ceph-mon[48545]: Upgrade: Updating mgr.vm08.mxylvw
2026-03-09T19:28:21.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:21 vm07.local ceph-mon[48545]: Deploying daemon mgr.vm08.mxylvw on vm08
2026-03-09T19:28:21.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:21 vm07.local ceph-mon[48545]: mgrmap e30: vm07.xacuym(active, since 8s), standbys: vm08.mxylvw
2026-03-09T19:28:21.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:21 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm08.mxylvw", "id": "vm08.mxylvw"}]: dispatch
2026-03-09T19:28:23.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:23 vm08.local ceph-mon[57794]: pgmap v8: 65 pgs: 65 active+clean; 2.7 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 27 KiB/s rd, 347 KiB/s wr, 56 op/s
2026-03-09T19:28:23.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:23 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:23.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:23 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:23.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:23 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:23.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:23 vm07.local ceph-mon[48545]: pgmap v8: 65 pgs: 65 active+clean; 2.7 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 27 KiB/s rd, 347 KiB/s wr, 56 op/s
2026-03-09T19:28:23.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:23 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:23.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:23 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:23.221 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:23 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:24 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:24 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:24 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:24 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:24 vm07.local ceph-mon[48545]: pgmap v9: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail; 30 KiB/s rd, 661 KiB/s wr, 95 op/s
2026-03-09T19:28:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:24 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:24 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:25.005 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:24 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:25.005 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:24 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:25.005 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:24 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:25.005 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:24 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:25.005 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:24 vm08.local ceph-mon[57794]: pgmap v9: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail; 30 KiB/s rd, 661 KiB/s wr, 95 op/s
2026-03-09T19:28:25.005 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:24 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:25.005 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:24 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:27.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: pgmap v10: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail; 27 KiB/s rd, 588 KiB/s wr, 84 op/s
2026-03-09T19:28:27.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:27.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:27.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:27.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:28:27.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:27.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:27.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:28:27.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:27.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm07.xacuym"}]: dispatch
2026-03-09T19:28:27.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm07.xacuym"}]': finished
2026-03-09T19:28:27.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.mxylvw"}]: dispatch
2026-03-09T19:28:27.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.mxylvw"}]': finished
2026-03-09T19:28:27.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "quorum_status"}]: dispatch
2026-03-09T19:28:27.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:27.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: pgmap v10: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail; 27 KiB/s rd, 588 KiB/s wr, 84 op/s
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym'
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm07.xacuym"}]: dispatch
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm07.xacuym"}]': finished
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.mxylvw"}]: dispatch
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.mxylvw"}]': finished
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "quorum_status"}]: dispatch
2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746
192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' 2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T19:28:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:27 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:27.457 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T19:28:27.457 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-mon[48545]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:27.457 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local systemd[1]: Stopping Ceph mon.vm07 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 
2026-03-09T19:28:27.457 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07[48541]: 2026-03-09T19:28:27.277+0000 7f917dd9e640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm07 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:28:27.457 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07[48541]: 2026-03-09T19:28:27.277+0000 7f917dd9e640 -1 mon.vm07@0(leader) e2 *** Got Signal Terminated *** 2026-03-09T19:28:27.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.456+0000 7f823a18b640 1 -- 192.168.123.107:0/1447874772 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8234075ba0 msgr2=0x7f8234075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:27.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.456+0000 7f823a18b640 1 --2- 192.168.123.107:0/1447874772 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8234075ba0 0x7f8234075fa0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f82240099b0 tx=0x7f822402f240 comp rx=0 tx=0).stop 2026-03-09T19:28:27.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.458+0000 7f823a18b640 1 -- 192.168.123.107:0/1447874772 shutdown_connections 2026-03-09T19:28:27.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.458+0000 7f823a18b640 1 --2- 192.168.123.107:0/1447874772 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8234076df0 0x7f8234077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.457 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.458+0000 7f823a18b640 1 --2- 192.168.123.107:0/1447874772 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8234075ba0 0x7f8234075fa0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.458+0000 7f823a18b640 1 -- 192.168.123.107:0/1447874772 >> 192.168.123.107:0/1447874772 conn(0x7f82340fe250 msgr2=0x7f8234100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:27.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.458+0000 7f823a18b640 1 -- 192.168.123.107:0/1447874772 shutdown_connections 2026-03-09T19:28:27.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.458+0000 7f823a18b640 1 -- 192.168.123.107:0/1447874772 wait complete. 2026-03-09T19:28:27.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.459+0000 7f823a18b640 1 Processor -- start 2026-03-09T19:28:27.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.459+0000 7f823a18b640 1 -- start start 2026-03-09T19:28:27.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.459+0000 7f823a18b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8234075ba0 0x7f823419e970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:27.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.459+0000 7f82337fe640 1 -- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8234075ba0 msgr2=0x7f823419e970 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:3300/0 2026-03-09T19:28:27.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.459+0000 7f82337fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8234075ba0 0x7f823419e970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:28:27.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.459+0000 7f823a18b640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8234076df0 0x7f823419eeb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:27.458 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.460+0000 7f823a18b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f823419f480 con 0x7f8234075ba0 2026-03-09T19:28:27.459 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.460+0000 7f823a18b640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f823419f5f0 con 0x7f8234076df0 2026-03-09T19:28:27.459 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.460+0000 7f8232ffd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8234076df0 0x7f823419eeb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:27.459 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.460+0000 7f8232ffd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8234076df0 0x7f823419eeb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:43826/0 (socket says 192.168.123.107:43826) 2026-03-09T19:28:27.459 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.460+0000 7f8232ffd640 1 -- 192.168.123.107:0/2756213089 learned_addr learned my addr 192.168.123.107:0/2756213089 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:28:27.459 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.460+0000 7f8232ffd640 1 -- 192.168.123.107:0/2756213089 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f8234075ba0 msgr2=0x7f823419e970 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:28:27.459 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.460+0000 7f8232ffd640 1 --2- 192.168.123.107:0/2756213089 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8234075ba0 0x7f823419e970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.459 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.460+0000 7f8232ffd640 1 -- 192.168.123.107:0/2756213089 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8224009660 con 0x7f8234076df0 2026-03-09T19:28:27.459 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.461+0000 7f8232ffd640 1 --2- 192.168.123.107:0/2756213089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8234076df0 0x7f823419eeb0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f822000b700 tx=0x7f822000bbd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:27.460 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.461+0000 7f8230ff9640 1 -- 192.168.123.107:0/2756213089 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8220002af0 con 0x7f8234076df0 2026-03-09T19:28:27.460 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.461+0000 7f823a18b640 1 -- 192.168.123.107:0/2756213089 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f82341a4090 con 0x7f8234076df0 2026-03-09T19:28:27.460 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.461+0000 7f823a18b640 1 -- 192.168.123.107:0/2756213089 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f82341a4660 con 0x7f8234076df0 2026-03-09T19:28:27.460 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.461+0000 
7f8230ff9640 1 -- 192.168.123.107:0/2756213089 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8220009430 con 0x7f8234076df0 2026-03-09T19:28:27.460 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.461+0000 7f8230ff9640 1 -- 192.168.123.107:0/2756213089 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f822000cb00 con 0x7f8234076df0 2026-03-09T19:28:27.461 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.462+0000 7f823a18b640 1 -- 192.168.123.107:0/2756213089 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f81f8005350 con 0x7f8234076df0 2026-03-09T19:28:27.461 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.462+0000 7f8230ff9640 1 -- 192.168.123.107:0/2756213089 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f822000cc60 con 0x7f8234076df0 2026-03-09T19:28:27.462 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.463+0000 7f8230ff9640 1 --2- 192.168.123.107:0/2756213089 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f82040775e0 0x7f8204079aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:27.462 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.463+0000 7f82337fe640 1 --2- 192.168.123.107:0/2756213089 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f82040775e0 0x7f8204079aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:27.463 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.464+0000 7f8230ff9640 1 -- 192.168.123.107:0/2756213089 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f8220098cf0 con 
0x7f8234076df0 2026-03-09T19:28:27.463 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.464+0000 7f82337fe640 1 --2- 192.168.123.107:0/2756213089 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f82040775e0 0x7f8204079aa0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f8224002f10 tx=0x7f8224031040 comp rx=0 tx=0).ready entity=mgr.14746 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:27.465 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.466+0000 7f8230ff9640 1 -- 192.168.123.107:0/2756213089 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f8220061790 con 0x7f8234076df0 2026-03-09T19:28:27.585 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.585+0000 7f823a18b640 1 -- 192.168.123.107:0/2756213089 --> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f81f8002bf0 con 0x7f82040775e0 2026-03-09T19:28:27.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.587+0000 7f8230ff9640 1 -- 192.168.123.107:0/2756213089 <== mgr.14746 v2:192.168.123.107:6800/781232384 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7f81f8002bf0 con 0x7f82040775e0 2026-03-09T19:28:27.589 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.590+0000 7f823a18b640 1 -- 192.168.123.107:0/2756213089 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f82040775e0 msgr2=0x7f8204079aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:27.590 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.590+0000 7f823a18b640 1 --2- 192.168.123.107:0/2756213089 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f82040775e0 0x7f8204079aa0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 
crypto rx=0x7f8224002f10 tx=0x7f8224031040 comp rx=0 tx=0).stop 2026-03-09T19:28:27.590 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.590+0000 7f823a18b640 1 -- 192.168.123.107:0/2756213089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8234076df0 msgr2=0x7f823419eeb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:27.590 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.590+0000 7f823a18b640 1 --2- 192.168.123.107:0/2756213089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8234076df0 0x7f823419eeb0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f822000b700 tx=0x7f822000bbd0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.590 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.591+0000 7f823a18b640 1 -- 192.168.123.107:0/2756213089 shutdown_connections 2026-03-09T19:28:27.590 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.591+0000 7f823a18b640 1 --2- 192.168.123.107:0/2756213089 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f82040775e0 0x7f8204079aa0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.590 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.591+0000 7f823a18b640 1 --2- 192.168.123.107:0/2756213089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8234076df0 0x7f823419eeb0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.590 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.591+0000 7f823a18b640 1 --2- 192.168.123.107:0/2756213089 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8234075ba0 0x7f823419e970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.590 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.591+0000 7f823a18b640 1 -- 192.168.123.107:0/2756213089 >> 
192.168.123.107:0/2756213089 conn(0x7f82340fe250 msgr2=0x7f82340ffd10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:27.590 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.591+0000 7f823a18b640 1 -- 192.168.123.107:0/2756213089 shutdown_connections 2026-03-09T19:28:27.590 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.591+0000 7f823a18b640 1 -- 192.168.123.107:0/2756213089 wait complete. 2026-03-09T19:28:27.599 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:28:27.672 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.672+0000 7f96aa2ad640 1 -- 192.168.123.107:0/1382629836 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96a4072440 msgr2=0x7f96a40771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:27.672 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.672+0000 7f96aa2ad640 1 --2- 192.168.123.107:0/1382629836 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96a4072440 0x7f96a40771b0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f969c008030 tx=0x7f969c030dc0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.672 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.673+0000 7f96aa2ad640 1 -- 192.168.123.107:0/1382629836 shutdown_connections 2026-03-09T19:28:27.672 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.673+0000 7f96aa2ad640 1 --2- 192.168.123.107:0/1382629836 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96a4072440 0x7f96a40771b0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.672 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.673+0000 7f96aa2ad640 1 --2- 192.168.123.107:0/1382629836 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96a4071a70 0x7f96a4071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.672 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.673+0000 7f96aa2ad640 1 -- 192.168.123.107:0/1382629836 >> 192.168.123.107:0/1382629836 conn(0x7f96a406d4f0 msgr2=0x7f96a406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:27.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.673+0000 7f96aa2ad640 1 -- 192.168.123.107:0/1382629836 shutdown_connections 2026-03-09T19:28:27.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.673+0000 7f96aa2ad640 1 -- 192.168.123.107:0/1382629836 wait complete. 2026-03-09T19:28:27.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.674+0000 7f96aa2ad640 1 Processor -- start 2026-03-09T19:28:27.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.674+0000 7f96aa2ad640 1 -- start start 2026-03-09T19:28:27.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.674+0000 7f96aa2ad640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96a4071a70 0x7f96a4131920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:27.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.674+0000 7f96aa2ad640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96a4072440 0x7f96a4131e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:27.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.674+0000 7f96aa2ad640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96a4133360 con 0x7f96a4071a70 2026-03-09T19:28:27.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.674+0000 7f96aa2ad640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96a41334d0 con 0x7f96a4072440 2026-03-09T19:28:27.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.674+0000 7f96a3fff640 1 -- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96a4071a70 msgr2=0x7f96a4131920 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:3300/0 2026-03-09T19:28:27.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.674+0000 7f96a3fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96a4071a70 0x7f96a4131920 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:28:27.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.674+0000 7f96a37fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96a4072440 0x7f96a4131e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:27.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.674+0000 7f96a37fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96a4072440 0x7f96a4131e60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:43844/0 (socket says 192.168.123.107:43844) 2026-03-09T19:28:27.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.674+0000 7f96a37fe640 1 -- 192.168.123.107:0/420982624 learned_addr learned my addr 192.168.123.107:0/420982624 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:28:27.674 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.675+0000 7f96a37fe640 1 -- 192.168.123.107:0/420982624 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96a4071a70 msgr2=0x7f96a4131920 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:28:27.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.675+0000 7f96a37fe640 1 --2- 192.168.123.107:0/420982624 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f96a4071a70 0x7f96a4131920 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.675+0000 7f96a37fe640 1 -- 192.168.123.107:0/420982624 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f969c007ce0 con 0x7f96a4072440 2026-03-09T19:28:27.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.675+0000 7f96a37fe640 1 --2- 192.168.123.107:0/420982624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96a4072440 0x7f96a4131e60 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f969c0043c0 tx=0x7f969c03a6a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:27.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.676+0000 7f96a17fa640 1 -- 192.168.123.107:0/420982624 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f969c002c70 con 0x7f96a4072440 2026-03-09T19:28:27.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.677+0000 7f96aa2ad640 1 -- 192.168.123.107:0/420982624 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f96a41323a0 con 0x7f96a4072440 2026-03-09T19:28:27.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.677+0000 7f96aa2ad640 1 -- 192.168.123.107:0/420982624 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f96a407fb70 con 0x7f96a4072440 2026-03-09T19:28:27.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.677+0000 7f96a17fa640 1 -- 192.168.123.107:0/420982624 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f969c004a50 con 0x7f96a4072440 2026-03-09T19:28:27.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.677+0000 7f96a17fa640 1 -- 
192.168.123.107:0/420982624 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f969c0427f0 con 0x7f96a4072440 2026-03-09T19:28:27.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.678+0000 7f9682ffd640 1 -- 192.168.123.107:0/420982624 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f96a4071e70 con 0x7f96a4072440 2026-03-09T19:28:27.678 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.679+0000 7f96a17fa640 1 -- 192.168.123.107:0/420982624 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f969c04a050 con 0x7f96a4072440 2026-03-09T19:28:27.678 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.679+0000 7f96a17fa640 1 --2- 192.168.123.107:0/420982624 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9690077610 0x7f9690079ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:27.679 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.679+0000 7f96a17fa640 1 -- 192.168.123.107:0/420982624 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f969c0bf440 con 0x7f96a4072440 2026-03-09T19:28:27.679 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.679+0000 7f96a3fff640 1 --2- 192.168.123.107:0/420982624 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9690077610 0x7f9690079ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:27.679 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.680+0000 7f96a3fff640 1 --2- 192.168.123.107:0/420982624 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9690077610 0x7f9690079ad0 secure :-1 s=READY pgs=21 cs=0 
l=1 rev1=1 crypto rx=0x7f96940028b0 tx=0x7f9694009270 comp rx=0 tx=0).ready entity=mgr.14746 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:27.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.682+0000 7f96a17fa640 1 -- 192.168.123.107:0/420982624 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f969c087e50 con 0x7f96a4072440 2026-03-09T19:28:27.708 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local podman[111613]: 2026-03-09 19:28:27.546329397 +0000 UTC m=+0.322607905 container died ccb644205fb3015552c6dee0fd883e9274481ccc83c6a2d83d43307275007c07 (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6) 2026-03-09T19:28:27.709 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local podman[111613]: 2026-03-09 19:28:27.69726516 +0000 UTC m=+0.473543657 container remove ccb644205fb3015552c6dee0fd883e9274481ccc83c6a2d83d43307275007c07 (image=quay.ceph.io/ceph-ci/ceph:reef, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, 
OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, CEPH_REF=reef, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:28:27.709 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local bash[111613]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07 2026-03-09T19:28:27.858 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.858+0000 7f9682ffd640 1 -- 192.168.123.107:0/420982624 --> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f96a40611c0 con 0x7f9690077610 2026-03-09T19:28:27.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.860+0000 7f96a17fa640 1 -- 192.168.123.107:0/420982624 <== mgr.14746 v2:192.168.123.107:6800/781232384 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7f96a40611c0 con 0x7f9690077610 2026-03-09T19:28:27.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.865+0000 7f96aa2ad640 1 -- 192.168.123.107:0/420982624 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9690077610 msgr2=0x7f9690079ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:27.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.866+0000 7f96aa2ad640 1 --2- 192.168.123.107:0/420982624 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9690077610 0x7f9690079ad0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f96940028b0 tx=0x7f9694009270 comp rx=0 tx=0).stop 2026-03-09T19:28:27.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.866+0000 7f96aa2ad640 1 -- 
192.168.123.107:0/420982624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96a4072440 msgr2=0x7f96a4131e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:27.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.866+0000 7f96aa2ad640 1 --2- 192.168.123.107:0/420982624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96a4072440 0x7f96a4131e60 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f969c0043c0 tx=0x7f969c03a6a0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.866 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.867+0000 7f96aa2ad640 1 -- 192.168.123.107:0/420982624 shutdown_connections 2026-03-09T19:28:27.866 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.868+0000 7f96aa2ad640 1 --2- 192.168.123.107:0/420982624 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9690077610 0x7f9690079ad0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.867 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.868+0000 7f96aa2ad640 1 --2- 192.168.123.107:0/420982624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96a4072440 0x7f96a4131e60 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.867 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.868+0000 7f96aa2ad640 1 --2- 192.168.123.107:0/420982624 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96a4071a70 0x7f96a4131920 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.867 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.868+0000 7f96aa2ad640 1 -- 192.168.123.107:0/420982624 >> 192.168.123.107:0/420982624 conn(0x7f96a406d4f0 msgr2=0x7f96a4075370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:27.870 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.871+0000 7f96aa2ad640 1 -- 192.168.123.107:0/420982624 shutdown_connections 2026-03-09T19:28:27.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.871+0000 7f96aa2ad640 1 -- 192.168.123.107:0/420982624 wait complete. 2026-03-09T19:28:27.967 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.968+0000 7fd17522e640 1 -- 192.168.123.107:0/479312931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd170071a50 msgr2=0x7fd170071e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:27.967 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.968+0000 7fd17522e640 1 --2- 192.168.123.107:0/479312931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd170071a50 0x7fd170071e50 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fd160007920 tx=0x7fd16002ffe0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.968+0000 7fd17522e640 1 -- 192.168.123.107:0/479312931 shutdown_connections 2026-03-09T19:28:27.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.968+0000 7fd17522e640 1 --2- 192.168.123.107:0/479312931 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd170072420 0x7fd170077190 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.968+0000 7fd17522e640 1 --2- 192.168.123.107:0/479312931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd170071a50 0x7fd170071e50 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.968+0000 7fd17522e640 1 -- 192.168.123.107:0/479312931 >> 192.168.123.107:0/479312931 conn(0x7fd17006d4f0 msgr2=0x7fd17006f930 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T19:28:27.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.968+0000 7fd17522e640 1 -- 192.168.123.107:0/479312931 shutdown_connections 2026-03-09T19:28:27.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.968+0000 7fd17522e640 1 -- 192.168.123.107:0/479312931 wait complete. 2026-03-09T19:28:27.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.969+0000 7fd17522e640 1 Processor -- start 2026-03-09T19:28:27.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.969+0000 7fd17522e640 1 -- start start 2026-03-09T19:28:27.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.969+0000 7fd17522e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd170072420 0x7fd170132090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:27.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.969+0000 7fd17522e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1701325d0 0x7fd17007fb50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:27.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.969+0000 7fd17522e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd170133620 con 0x7fd1701325d0 2026-03-09T19:28:27.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.969+0000 7fd17522e640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd170132a50 con 0x7fd170072420 2026-03-09T19:28:27.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.969+0000 7fd16f7fe640 1 -- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1701325d0 msgr2=0x7fd17007fb50 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:3300/0 2026-03-09T19:28:27.969 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.969+0000 7fd16f7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1701325d0 0x7fd17007fb50 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:28:27.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.969+0000 7fd16ffff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd170072420 0x7fd170132090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:27.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.969+0000 7fd16ffff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd170072420 0x7fd170132090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:43866/0 (socket says 192.168.123.107:43866) 2026-03-09T19:28:27.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.969+0000 7fd16ffff640 1 -- 192.168.123.107:0/224299317 learned_addr learned my addr 192.168.123.107:0/224299317 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:28:27.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.970+0000 7fd16ffff640 1 -- 192.168.123.107:0/224299317 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1701325d0 msgr2=0x7fd17007fb50 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:28:27.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.970+0000 7fd16ffff640 1 --2- 192.168.123.107:0/224299317 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1701325d0 0x7fd17007fb50 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:27.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.970+0000 
7fd16ffff640 1 -- 192.168.123.107:0/224299317 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd1600075d0 con 0x7fd170072420 2026-03-09T19:28:27.972 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.971+0000 7fd16ffff640 1 --2- 192.168.123.107:0/224299317 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd170072420 0x7fd170132090 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fd1600304f0 tx=0x7fd160030a70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:27.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.971+0000 7fd16d7fa640 1 -- 192.168.123.107:0/224299317 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd160030d70 con 0x7fd170072420 2026-03-09T19:28:27.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.971+0000 7fd16d7fa640 1 -- 192.168.123.107:0/224299317 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd160039b50 con 0x7fd170072420 2026-03-09T19:28:27.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.972+0000 7fd16d7fa640 1 -- 192.168.123.107:0/224299317 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd160038a70 con 0x7fd170072420 2026-03-09T19:28:27.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.972+0000 7fd17522e640 1 -- 192.168.123.107:0/224299317 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd170132cd0 con 0x7fd170072420 2026-03-09T19:28:27.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.972+0000 7fd17522e640 1 -- 192.168.123.107:0/224299317 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd170080370 con 0x7fd170072420 2026-03-09T19:28:27.973 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.973+0000 7fd16d7fa640 1 -- 192.168.123.107:0/224299317 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7fd160038bd0 con 0x7fd170072420 2026-03-09T19:28:27.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.974+0000 7fd16d7fa640 1 --2- 192.168.123.107:0/224299317 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fd14c077420 0x7fd14c0798e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:27.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.974+0000 7fd16d7fa640 1 -- 192.168.123.107:0/224299317 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fd1600bd360 con 0x7fd170072420 2026-03-09T19:28:27.973 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.974+0000 7fd17522e640 1 -- 192.168.123.107:0/224299317 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd170071e50 con 0x7fd170072420 2026-03-09T19:28:27.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.975+0000 7fd16f7fe640 1 --2- 192.168.123.107:0/224299317 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fd14c077420 0x7fd14c0798e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:27.978 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.977+0000 7fd16f7fe640 1 --2- 192.168.123.107:0/224299317 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fd14c077420 0x7fd14c0798e0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fd1701335b0 tx=0x7fd1680091f0 comp rx=0 tx=0).ready entity=mgr.14746 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:27.978 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:27.979+0000 7fd16d7fa640 1 -- 192.168.123.107:0/224299317 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fd160085cf0 con 0x7fd170072420 2026-03-09T19:28:28.031 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm07.service: Deactivated successfully. 2026-03-09T19:28:28.031 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local systemd[1]: Stopped Ceph mon.vm07 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 2026-03-09T19:28:28.031 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:27 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm07.service: Consumed 5.971s CPU time. 2026-03-09T19:28:28.120 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.120+0000 7fd17522e640 1 -- 192.168.123.107:0/224299317 --> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fd170076060 con 0x7fd14c077420 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.126+0000 7fd16d7fa640 1 -- 192.168.123.107:0/224299317 <== mgr.14746 v2:192.168.123.107:6800/781232384 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fd170076060 con 0x7fd14c077420 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (5m) 4s ago 6m 22.9M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (6m) 4s ago 6m 8644k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:28:28.125 
INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (5m) 4s ago 5m 9.96M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (6m) 4s ago 6m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2799ea3e4bf3 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (5m) 4s ago 5m 7616k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 81fc95c210b6 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (5m) 4s ago 5m 88.0M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (4m) 4s ago 4m 15.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (4m) 4s ago 4m 17.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (4m) 4s ago 4m 27.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (4m) 4s ago 4m 238M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (23s) 4s ago 6m 586M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (6s) 4s ago 5m 37.9M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (6m) 4s ago 6m 57.3M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ccb644205fb3 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (5m) 4s ago 5m 47.7M 2048M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 
8d7b1da9e1e2 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (6m) 4s ago 6m 14.3M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (5m) 4s ago 5m 16.1M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (5m) 4s ago 5m 357M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d7417e3377af 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (5m) 4s ago 5m 407M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (4m) 4s ago 4m 307M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (4m) 4s ago 4m 439M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (4m) 4s ago 4m 425M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (4m) 4s ago 4m 366M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:28:28.125 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (8s) 4s ago 5m 40.1M - 2.43.0 a07b618ecd1d c09450c20f5f 2026-03-09T19:28:28.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.129+0000 7fd17522e640 1 -- 192.168.123.107:0/224299317 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fd14c077420 msgr2=0x7fd14c0798e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:28.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.129+0000 7fd17522e640 1 --2- 192.168.123.107:0/224299317 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fd14c077420 
0x7fd14c0798e0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fd1701335b0 tx=0x7fd1680091f0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.129+0000 7fd17522e640 1 -- 192.168.123.107:0/224299317 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd170072420 msgr2=0x7fd170132090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:28.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.129+0000 7fd17522e640 1 --2- 192.168.123.107:0/224299317 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd170072420 0x7fd170132090 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fd1600304f0 tx=0x7fd160030a70 comp rx=0 tx=0).stop 2026-03-09T19:28:28.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.129+0000 7fd17522e640 1 -- 192.168.123.107:0/224299317 shutdown_connections 2026-03-09T19:28:28.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.129+0000 7fd17522e640 1 --2- 192.168.123.107:0/224299317 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fd14c077420 0x7fd14c0798e0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.129+0000 7fd17522e640 1 --2- 192.168.123.107:0/224299317 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1701325d0 0x7fd17007fb50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.129+0000 7fd17522e640 1 --2- 192.168.123.107:0/224299317 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd170072420 0x7fd170132090 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.129+0000 7fd17522e640 1 
-- 192.168.123.107:0/224299317 >> 192.168.123.107:0/224299317 conn(0x7fd17006d4f0 msgr2=0x7fd170075950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:28.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.130+0000 7fd17522e640 1 -- 192.168.123.107:0/224299317 shutdown_connections 2026-03-09T19:28:28.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.130+0000 7fd17522e640 1 -- 192.168.123.107:0/224299317 wait complete. 2026-03-09T19:28:28.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.210+0000 7fedab13f640 1 -- 192.168.123.107:0/3056544458 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feda4071a50 msgr2=0x7feda4071e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:28.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.210+0000 7fedab13f640 1 --2- 192.168.123.107:0/3056544458 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feda4071a50 0x7feda4071e50 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7feda0007920 tx=0x7feda002ffe0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.210+0000 7fedab13f640 1 -- 192.168.123.107:0/3056544458 shutdown_connections 2026-03-09T19:28:28.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.210+0000 7fedab13f640 1 --2- 192.168.123.107:0/3056544458 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feda4072420 0x7feda4077190 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.210+0000 7fedab13f640 1 --2- 192.168.123.107:0/3056544458 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feda4071a50 0x7feda4071e50 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.210+0000 
7fedab13f640 1 -- 192.168.123.107:0/3056544458 >> 192.168.123.107:0/3056544458 conn(0x7feda406d4f0 msgr2=0x7feda406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:28.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.212+0000 7fedab13f640 1 -- 192.168.123.107:0/3056544458 shutdown_connections 2026-03-09T19:28:28.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.212+0000 7fedab13f640 1 -- 192.168.123.107:0/3056544458 wait complete. 2026-03-09T19:28:28.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.212+0000 7fedab13f640 1 Processor -- start 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.212+0000 7fedab13f640 1 -- start start 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.212+0000 7fedab13f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feda4072420 0x7feda4084120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.212+0000 7fedab13f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feda4082770 0x7feda4082bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.212+0000 7fedab13f640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feda4084660 con 0x7feda4072420 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.212+0000 7fedab13f640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feda4083160 con 0x7feda4082770 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.213+0000 7fedaa13d640 1 -- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feda4072420 msgr2=0x7feda4084120 unknown 
:-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:3300/0 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.213+0000 7fedaa13d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feda4072420 0x7feda4084120 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.213+0000 7feda993c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feda4082770 0x7feda4082bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.213+0000 7feda993c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feda4082770 0x7feda4082bf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:43896/0 (socket says 192.168.123.107:43896) 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.213+0000 7feda993c640 1 -- 192.168.123.107:0/3878896075 learned_addr learned my addr 192.168.123.107:0/3878896075 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.213+0000 7feda993c640 1 -- 192.168.123.107:0/3878896075 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feda4072420 msgr2=0x7feda4084120 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.213+0000 7feda993c640 1 --2- 192.168.123.107:0/3878896075 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feda4072420 0x7feda4084120 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T19:28:28.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.213+0000 7feda993c640 1 -- 192.168.123.107:0/3878896075 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feda00075d0 con 0x7feda4082770 2026-03-09T19:28:28.215 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.216+0000 7feda993c640 1 --2- 192.168.123.107:0/3878896075 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feda4082770 0x7feda4082bf0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fed9c00bdf0 tx=0x7fed9c00bef0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:28.215 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.216+0000 7fed9b7fe640 1 -- 192.168.123.107:0/3878896075 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fed9c00cd70 con 0x7feda4082770 2026-03-09T19:28:28.217 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.216+0000 7fedab13f640 1 -- 192.168.123.107:0/3878896075 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feda4083440 con 0x7feda4082770 2026-03-09T19:28:28.217 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.217+0000 7fedab13f640 1 -- 192.168.123.107:0/3878896075 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feda41be620 con 0x7feda4082770 2026-03-09T19:28:28.217 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.218+0000 7fed9b7fe640 1 -- 192.168.123.107:0/3878896075 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fed9c015930 con 0x7feda4082770 2026-03-09T19:28:28.217 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.218+0000 7fed9b7fe640 1 -- 192.168.123.107:0/3878896075 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 
0 0) 0x7fed9c0046e0 con 0x7feda4082770 2026-03-09T19:28:28.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.219+0000 7fed9b7fe640 1 -- 192.168.123.107:0/3878896075 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7fed9c004900 con 0x7feda4082770 2026-03-09T19:28:28.219 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.220+0000 7fed9b7fe640 1 --2- 192.168.123.107:0/3878896075 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fed780775e0 0x7fed78079aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:28.219 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.220+0000 7fedaa13d640 1 --2- 192.168.123.107:0/3878896075 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fed780775e0 0x7fed78079aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:28.219 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.220+0000 7fed9b7fe640 1 -- 192.168.123.107:0/3878896075 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fed9c099e20 con 0x7feda4082770 2026-03-09T19:28:28.219 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.221+0000 7fedab13f640 1 -- 192.168.123.107:0/3878896075 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feda407a450 con 0x7feda4082770 2026-03-09T19:28:28.220 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.221+0000 7fedaa13d640 1 --2- 192.168.123.107:0/3878896075 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fed780775e0 0x7fed78079aa0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7feda00304f0 tx=0x7feda00023d0 comp rx=0 tx=0).ready entity=mgr.14746 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:28.225 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.226+0000 7fed9b7fe640 1 -- 192.168.123.107:0/3878896075 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7fed9c0627c0 con 0x7feda4082770 2026-03-09T19:28:28.387 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.388+0000 7fedab13f640 1 -- 192.168.123.107:0/3878896075 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7feda41be950 con 0x7feda4082770 2026-03-09T19:28:28.387 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local systemd[1]: Starting Ceph mon.vm07 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:28:28.387 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local podman[111824]: 2026-03-09 19:28:28.366640342 +0000 UTC m=+0.032959150 container create ad39140965d8dea0174b49de75fa5bc94a99c64754b25968d52baaa8f6bb0d9b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_REF=squid, OSD_FLAVOR=default) 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.392+0000 7fed9b7fe640 1 -- 
192.168.123.107:0/3878896075 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7fed9c061f10 con 0x7feda4082770 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 12, 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:28:28.391 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:28:28.391 
INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:28:28.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.395+0000 7fed997fa640 1 -- 192.168.123.107:0/3878896075 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fed780775e0 msgr2=0x7fed78079aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:28.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.395+0000 7fed997fa640 1 --2- 192.168.123.107:0/3878896075 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fed780775e0 0x7fed78079aa0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7feda00304f0 tx=0x7feda00023d0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.395+0000 7fed997fa640 1 -- 192.168.123.107:0/3878896075 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feda4082770 msgr2=0x7feda4082bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:28.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.395+0000 7fed997fa640 1 --2- 192.168.123.107:0/3878896075 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feda4082770 0x7feda4082bf0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fed9c00bdf0 tx=0x7fed9c00bef0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.395+0000 7fed997fa640 1 -- 192.168.123.107:0/3878896075 shutdown_connections 2026-03-09T19:28:28.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.395+0000 7fed997fa640 1 --2- 192.168.123.107:0/3878896075 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7fed780775e0 0x7fed78079aa0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.395+0000 7fed997fa640 1 --2- 192.168.123.107:0/3878896075 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feda4082770 0x7feda4082bf0 secure :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fed9c00bdf0 tx=0x7fed9c00bef0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.395+0000 7fed997fa640 1 --2- 192.168.123.107:0/3878896075 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feda4072420 0x7feda4084120 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.395+0000 7fed997fa640 1 -- 192.168.123.107:0/3878896075 >> 192.168.123.107:0/3878896075 conn(0x7feda406d4f0 msgr2=0x7feda4075450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:28.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.396+0000 7fed997fa640 1 -- 192.168.123.107:0/3878896075 shutdown_connections 2026-03-09T19:28:28.395 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.396+0000 7fed997fa640 1 -- 192.168.123.107:0/3878896075 wait complete. 
2026-03-09T19:28:28.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.490+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/819518830 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ec8072420 msgr2=0x7f9ec8077190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:28.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.490+0000 7f9ecd2d4640 1 --2- 192.168.123.107:0/819518830 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ec8072420 0x7f9ec8077190 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f9ec0008030 tx=0x7f9ec0030dc0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.490+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/819518830 shutdown_connections 2026-03-09T19:28:28.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.490+0000 7f9ecd2d4640 1 --2- 192.168.123.107:0/819518830 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ec8072420 0x7f9ec8077190 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.490+0000 7f9ecd2d4640 1 --2- 192.168.123.107:0/819518830 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ec8071a50 0x7f9ec8071e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.490+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/819518830 >> 192.168.123.107:0/819518830 conn(0x7f9ec806d4f0 msgr2=0x7f9ec806f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.490+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/819518830 shutdown_connections 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.491+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/819518830 wait 
complete. 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.491+0000 7f9ecd2d4640 1 Processor -- start 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ecd2d4640 1 -- start start 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ecd2d4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ec8071a50 0x7f9ec8131970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ecd2d4640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ec8133320 0x7f9ec8131eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ecd2d4640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9ec8132480 con 0x7f9ec8071a50 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ecd2d4640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9ec81325f0 con 0x7f9ec8133320 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ec7fff640 1 -- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ec8071a50 msgr2=0x7f9ec8131970 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:3300/0 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ec7fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ec8071a50 0x7f9ec8131970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:28:28.492 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ec77fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ec8133320 0x7f9ec8131eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ec77fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ec8133320 0x7f9ec8131eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:43900/0 (socket says 192.168.123.107:43900) 2026-03-09T19:28:28.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ec77fe640 1 -- 192.168.123.107:0/66557907 learned_addr learned my addr 192.168.123.107:0/66557907 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:28:28.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ec77fe640 1 -- 192.168.123.107:0/66557907 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ec8071a50 msgr2=0x7f9ec8131970 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:28:28.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ec77fe640 1 --2- 192.168.123.107:0/66557907 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ec8071a50 0x7f9ec8131970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.492+0000 7f9ec77fe640 1 -- 192.168.123.107:0/66557907 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9ec0007ce0 con 0x7f9ec8133320 2026-03-09T19:28:28.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.493+0000 7f9ec77fe640 1 --2- 192.168.123.107:0/66557907 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ec8133320 0x7f9ec8131eb0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f9ec0004270 tx=0x7f9ec0007c50 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:28.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.494+0000 7f9ec57fa640 1 -- 192.168.123.107:0/66557907 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ec0002e80 con 0x7f9ec8133320 2026-03-09T19:28:28.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.494+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/66557907 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9ec807fb50 con 0x7f9ec8133320 2026-03-09T19:28:28.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.494+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/66557907 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9ec807ffe0 con 0x7f9ec8133320 2026-03-09T19:28:28.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.495+0000 7f9ec57fa640 1 -- 192.168.123.107:0/66557907 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9ec00048f0 con 0x7f9ec8133320 2026-03-09T19:28:28.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.495+0000 7f9ec57fa640 1 -- 192.168.123.107:0/66557907 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ec00448b0 con 0x7f9ec8133320 2026-03-09T19:28:28.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.496+0000 7f9ec57fa640 1 -- 192.168.123.107:0/66557907 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f9ec0004460 con 0x7f9ec8133320 2026-03-09T19:28:28.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.497+0000 7f9ec57fa640 1 --2- 192.168.123.107:0/66557907 >> 
[v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9ea0077630 0x7f9ea0079af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:28.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.497+0000 7f9ec57fa640 1 -- 192.168.123.107:0/66557907 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9ec00c0860 con 0x7f9ec8133320 2026-03-09T19:28:28.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.497+0000 7f9ec7fff640 1 --2- 192.168.123.107:0/66557907 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9ea0077630 0x7f9ea0079af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:28.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.497+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/66557907 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9ec81327b0 con 0x7f9ec8133320 2026-03-09T19:28:28.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.498+0000 7f9ec7fff640 1 --2- 192.168.123.107:0/66557907 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9ea0077630 0x7f9ea0079af0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f9eb80028b0 tx=0x7f9eb8009270 comp rx=0 tx=0).ready entity=mgr.14746 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:28.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.505+0000 7f9ec57fa640 1 -- 192.168.123.107:0/66557907 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f9ec00891f0 con 0x7f9ec8133320 2026-03-09T19:28:28.650 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.651+0000 7f9ecd2d4640 
1 -- 192.168.123.107:0/66557907 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f9ec80802c0 con 0x7f9ec8133320 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local podman[111824]: 2026-03-09 19:28:28.438662889 +0000 UTC m=+0.104981697 container init ad39140965d8dea0174b49de75fa5bc94a99c64754b25968d52baaa8f6bb0d9b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, io.buildah.version=1.41.3) 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local podman[111824]: 2026-03-09 19:28:28.444488413 +0000 UTC m=+0.110807221 container start ad39140965d8dea0174b49de75fa5bc94a99c64754b25968d52baaa8f6bb0d9b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, 
FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3) 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local podman[111824]: 2026-03-09 19:28:28.349252031 +0000 UTC m=+0.015570839 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local bash[111824]: ad39140965d8dea0174b49de75fa5bc94a99c64754b25968d52baaa8f6bb0d9b 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local systemd[1]: Started Ceph mon.vm07 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: set uid:gid to 167:167 (ceph:ceph) 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: pidfile_write: ignore empty --pid-file 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: load: jerasure load: lrc 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: RocksDB version: 7.9.2 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Git sha 0 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Compile 
date 2026-02-25 18:11:04 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: DB SUMMARY 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: DB Session ID: M6KOHVVIEJVMSS0T3U1A 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: CURRENT file: CURRENT 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: IDENTITY file: IDENTITY 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: MANIFEST file: MANIFEST-000015 size: 775 Bytes 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm07/store.db dir, Total Num: 1, files: 000023.sst 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm07/store.db: 000021.log size: 2321806 ; 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.error_if_exists: 0 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.create_if_missing: 0 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.paranoid_checks: 1 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-09T19:28:28.652 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.env: 0x555f8b495dc0 2026-03-09T19:28:28.652 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.fs: PosixFileSystem 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.info_log: 0x555f8d1c9900 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_file_opening_threads: 16 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.statistics: (nil) 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.use_fsync: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_log_file_size: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.keep_log_file_num: 1000 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.recycle_log_file_num: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.allow_fallocate: 1 2026-03-09T19:28:28.653 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.allow_mmap_reads: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.allow_mmap_writes: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.use_direct_reads: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.create_missing_column_families: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.db_log_dir: 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.wal_dir: 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-09T19:28:28.653 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.advise_random_on_open: 1 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.db_write_buffer_size: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.write_buffer_manager: 0x555f8d1cd900 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.rate_limiter: (nil) 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.wal_recovery_mode: 2 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.enable_thread_tracking: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.enable_pipelined_write: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.unordered_write: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: 
Options.allow_concurrent_memtable_write: 1 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.row_cache: None 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.wal_filter: None 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.allow_ingest_behind: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.two_write_queues: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.manual_wal_flush: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.wal_compression: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.atomic_flush: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: 
Options.persist_stats_to_disk: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.log_readahead_size: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.best_efforts_recovery: 0 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-09T19:28:28.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.allow_data_in_errors: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.db_host_id: __hostname__ 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_background_jobs: 2 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_background_compactions: -1 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_subcompactions: 1 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 
vm07.local ceph-mon[111841]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_total_wal_size: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_open_files: -1 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bytes_per_sync: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_readahead_size: 0 2026-03-09T19:28:28.654 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_background_flushes: -1 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Compression algorithms supported: 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: kZSTD supported: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: kXpressCompression supported: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: kBZip2Compression supported: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: kLZ4Compression supported: 1 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: kZlibCompression supported: 1 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: kLZ4HCCompression supported: 1 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: kSnappyCompression supported: 1 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: 
/var/lib/ceph/mon/ceph-vm07/store.db/MANIFEST-000015 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.merge_operator: 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_filter: None 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_filter_factory: None 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.sst_partitioner_factory: None 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555f8d1c9580) 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: cache_index_and_filter_blocks: 1 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: pin_top_level_index_and_filter: 
1 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: index_type: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: data_block_index_type: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: index_shortening: 1 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: checksum: 4 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: no_block_cache: 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache: 0x555f8d1ec9b0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache_name: BinnedLRUCache 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache_options: 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: capacity : 536870912 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: num_shard_bits : 4 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: strict_capacity_limit : 0 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: high_pri_pool_ratio: 0.000 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache_compressed: (nil) 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: persistent_cache: (nil) 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_size: 4096 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_size_deviation: 10 2026-03-09T19:28:28.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_restart_interval: 16 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: index_block_restart_interval: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: metadata_block_size: 4096 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: partition_filters: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 
use_delta_encoding: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: filter_policy: bloomfilter 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: whole_key_filtering: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: verify_compression: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: read_amp_bytes_per_bit: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: format_version: 5 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: enable_index_compression: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_align: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: max_auto_readahead_size: 262144 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: prepopulate_block_cache: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: initial_auto_readahead_size: 8192 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout: num_file_reads_for_auto_readahead: 2 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.write_buffer_size: 33554432 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_write_buffer_number: 2 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compression: NoCompression 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bottommost_compression: Disabled 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.prefix_extractor: nullptr 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: 
Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.num_levels: 7 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 
vm07.local ceph-mon[111841]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compression_opts.level: 32767 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compression_opts.strategy: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compression_opts.enabled: false 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.target_file_size_base: 67108864 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.arena_block_size: 1048576 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-09T19:28:28.655 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.disable_auto_compactions: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-09T19:28:28.656 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.inplace_update_support: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: 
Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.bloom_locality: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.max_successive_merges: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.paranoid_file_checks: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.force_consistency_checks: 1 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.report_bg_io_stats: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.ttl: 2592000 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local 
ceph-mon[111841]: rocksdb: Options.enable_blob_files: false 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.min_blob_size: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.blob_file_size: 268435456 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.blob_file_starting_level: 0 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm07/store.db/MANIFEST-000015 succeeded,manifest_file_number is 15, next_file_number is 25, last_sequence is 7822, log_number is 21,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 21 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 21 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: b4b8e1cf-65f6-4df8-9145-1b98b7a045ce 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773084508511158, "job": 1, "event": "recovery_started", "wal_files": [21]} 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #21 mode 2 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773084508529697, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 26, "file_size": 2065584, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7823, "largest_seqno": 8450, "table_properties": {"data_size": 2061200, "index_size": 2511, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 7361, "raw_average_key_size": 23, "raw_value_size": 2053929, "raw_average_value_size": 6647, "num_data_blocks": 120, "num_entries": 309, "num_filter_entries": 309, "num_deletions": 2, "num_merge_operands": 0, 
"num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773084508, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b4b8e1cf-65f6-4df8-9145-1b98b7a045ce", "db_session_id": "M6KOHVVIEJVMSS0T3U1A", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773084508529790, "job": 1, "event": "recovery_finished"} 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/version_set.cc:5047] Creating manifest 28 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm07/store.db/000021.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x555f8d1eee00 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: DB pointer 0x555f8d1fe000 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: starting mon.vm07 rank 0 at public addrs [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] at bind addrs [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] mon_data /var/lib/ceph/mon/ceph-vm07 fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout: ** DB Stats ** 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T19:28:28.656 
INFO:journalctl@ceph.mon.vm07.vm07.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout: ** Compaction Stats [default] ** 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T19:28:28.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: L0 1/0 1.97 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 161.5 0.01 0.00 1 0.012 0 0 0.0 0.0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: L6 1/0 6.58 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Sum 2/0 8.55 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 161.5 0.01 0.00 1 0.012 0 0 0.0 0.0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 161.5 0.01 0.00 1 0.012 0 0 0.0 0.0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: ** Compaction Stats [default] ** 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 161.5 0.01 0.00 1 0.012 0 0 0.0 0.0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Flush(GB): cumulative 0.002, interval 0.002 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Cumulative compaction: 0.00 GB write, 70.32 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Interval compaction: 0.00 GB write, 70.32 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 
level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Block cache BinnedLRUCache@0x555f8d1ec9b0#2 capacity: 512.00 MB usage: 44.83 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 9e-06 secs_since: 0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Block cache entry stats(count,size,portion): DataBlock(3,17.23 KB,0.0032872%) FilterBlock(2,8.94 KB,0.00170469%) IndexBlock(2,18.66 KB,0.0035584%) Misc(1,0.00 KB,0%) 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: mon.vm07@-1(???) e2 preinit fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: mon.vm07@-1(???).mds e13 new map 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: mon.vm07@-1(???).mds e13 print_map 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: e13 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:28:28.657 
INFO:journalctl@ceph.mon.vm07.vm07.stdout: legacy client fscid: 1 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Filesystem 'cephfs' (1) 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: fs_name cephfs 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: epoch 13 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: tableserver 0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: root 0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: session_timeout 60 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: session_autoclose 300 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: max_file_size 1099511627776 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: max_xattr_size 65536 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: required_client_features {} 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: last_failure 0 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: last_failure_osd_epoch 41 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: max_mds 2 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: in 0,1 
2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: up {0=24279,1=24285} 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: failed 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: damaged 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: stopped 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: data_pools [3] 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: metadata_pool 2 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: inline_data disabled 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: balancer 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: bal_rank_mask -1 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: standby_count_wanted 1 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: qdb_cluster leader: 0 members: 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: [mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:28.657 INFO:journalctl@ceph.mon.vm07.vm07.stdout: [mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:28.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-09T19:28:28.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-09T19:28:28.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Standby daemons: 2026-03-09T19:28:28.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-09T19:28:28.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout: [mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:28.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 
[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:28.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: mon.vm07@-1(???).osd e45 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-09T19:28:28.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: mon.vm07@-1(???).osd e45 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T19:28:28.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: mon.vm07@-1(???).osd e45 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T19:28:28.658 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: mon.vm07@-1(???).osd e45 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.653+0000 7f9ec57fa640 1 -- 192.168.123.107:0/66557907 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1856 (secure 0 0 0) 0x7f9ec0088940 con 0x7f9ec8133320 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:28:28.658 
INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:28:28.658 
INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:28:28.658 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:28.659 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:28.659 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:28:28.659 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:28:28.659 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:28:28.659 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:28:28.659 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:28.659 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:28.659 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:28:28.660 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.661+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/66557907 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9ea0077630 msgr2=0x7f9ea0079af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:28.660 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.661+0000 7f9ecd2d4640 1 --2- 192.168.123.107:0/66557907 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9ea0077630 0x7f9ea0079af0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f9eb80028b0 tx=0x7f9eb8009270 comp rx=0 tx=0).stop 2026-03-09T19:28:28.660 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.661+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/66557907 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ec8133320 msgr2=0x7f9ec8131eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:28.660 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.661+0000 7f9ecd2d4640 1 --2- 192.168.123.107:0/66557907 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ec8133320 0x7f9ec8131eb0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f9ec0004270 tx=0x7f9ec0007c50 comp rx=0 tx=0).stop 2026-03-09T19:28:28.660 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.661+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/66557907 shutdown_connections 2026-03-09T19:28:28.660 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.661+0000 7f9ecd2d4640 1 --2- 192.168.123.107:0/66557907 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f9ea0077630 0x7f9ea0079af0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.660 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.661+0000 7f9ecd2d4640 1 --2- 192.168.123.107:0/66557907 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9ec8133320 
0x7f9ec8131eb0 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.661 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.661+0000 7f9ecd2d4640 1 --2- 192.168.123.107:0/66557907 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ec8071a50 0x7f9ec8131970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.661 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.661+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/66557907 >> 192.168.123.107:0/66557907 conn(0x7f9ec806d4f0 msgr2=0x7f9ec8070430 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:28.661 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.661+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/66557907 shutdown_connections 2026-03-09T19:28:28.661 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.661+0000 7f9ecd2d4640 1 -- 192.168.123.107:0/66557907 wait complete. 2026-03-09T19:28:28.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.734+0000 7f894c939640 1 -- 192.168.123.107:0/1427890128 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944072a20 msgr2=0x7f8944108570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:28.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.734+0000 7f894c939640 1 --2- 192.168.123.107:0/1427890128 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944072a20 0x7f8944108570 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f89380098e0 tx=0x7f893802f1d0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.735+0000 7f894c939640 1 -- 192.168.123.107:0/1427890128 shutdown_connections 2026-03-09T19:28:28.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.735+0000 7f894c939640 1 --2- 192.168.123.107:0/1427890128 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f8944072a20 0x7f8944108570 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.735+0000 7f894c939640 1 --2- 192.168.123.107:0/1427890128 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89440720e0 0x7f89440724e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.735+0000 7f894c939640 1 -- 192.168.123.107:0/1427890128 >> 192.168.123.107:0/1427890128 conn(0x7f894406d660 msgr2=0x7f894406faa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:28.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.735+0000 7f894c939640 1 -- 192.168.123.107:0/1427890128 shutdown_connections 2026-03-09T19:28:28.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.735+0000 7f894c939640 1 -- 192.168.123.107:0/1427890128 wait complete. 
2026-03-09T19:28:28.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.735+0000 7f894c939640 1 Processor -- start 2026-03-09T19:28:28.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894c939640 1 -- start start 2026-03-09T19:28:28.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894c939640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89440720e0 0x7f894410b230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:28.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894c939640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944072a20 0x7f8944109880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:28.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894c939640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f894410b800 con 0x7f89440720e0 2026-03-09T19:28:28.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894c939640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89441a4810 con 0x7f8944072a20 2026-03-09T19:28:28.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894a6ae640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89440720e0 0x7f894410b230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:28.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894a6ae640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89440720e0 0x7f894410b230 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:50848/0 (socket says 192.168.123.107:50848) 2026-03-09T19:28:28.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894a6ae640 1 -- 192.168.123.107:0/3539209232 learned_addr learned my addr 192.168.123.107:0/3539209232 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:28:28.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894a6ae640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89440720e0 msgr2=0x7f894410b230 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12 2026-03-09T19:28:28.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894a6ae640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89440720e0 msgr2=0x7f894410b230 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-09T19:28:28.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89440720e0 0x7f894410b230 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-09T19:28:28.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89440720e0 0x7f894410b230 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:28:28.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.736+0000 7f8949ead640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944072a20 0x7f8944109880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:28.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.737+0000 7f8949ead640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89440720e0 msgr2=0x7f894410b230 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:28:28.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.737+0000 7f8949ead640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89440720e0 0x7f894410b230 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:28.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.737+0000 7f8949ead640 1 -- 192.168.123.107:0/3539209232 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8938009590 con 0x7f8944072a20 2026-03-09T19:28:28.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.738+0000 7f8949ead640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944072a20 0x7f8944109880 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f8944072e60 tx=0x7f8938031c40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:28.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.738+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f893803d070 con 0x7f8944072a20 2026-03-09T19:28:28.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.738+0000 7f894c939640 1 -- 192.168.123.107:0/3539209232 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f89441a4a90 con 0x7f8944072a20 2026-03-09T19:28:28.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.738+0000 7f894c939640 
1 -- 192.168.123.107:0/3539209232 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f894410a0a0 con 0x7f8944072a20 2026-03-09T19:28:28.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.739+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8938002820 con 0x7f8944072a20 2026-03-09T19:28:28.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.739+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8938038510 con 0x7f8944072a20 2026-03-09T19:28:28.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.742+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100078+0+0 (secure 0 0 0) 0x7f8938038670 con 0x7f8944072a20 2026-03-09T19:28:28.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.742+0000 7f894c939640 1 -- 192.168.123.107:0/3539209232 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8914005350 con 0x7f8944072a20 2026-03-09T19:28:28.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.742+0000 7f893f7fe640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f8910079af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:28.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.742+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f8910079af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T19:28:28.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.742+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f89380be730 con 0x7f8944072a20 2026-03-09T19:28:28.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.743+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f8910079af0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f8930004750 tx=0x7f8930004070 comp rx=0 tx=0).ready entity=mgr.14746 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:28.748 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.748+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+193909 (secure 0 0 0) 0x7f89380870c0 con 0x7f8944072a20 2026-03-09T19:28:28.813 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.814+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mgrmap(e 31) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f8938031100 con 0x7f8944072a20 2026-03-09T19:28:28.912 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.913+0000 7f894c939640 1 -- 192.168.123.107:0/3539209232 --> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8914002bf0 con 0x7f8910077630 2026-03-09T19:28:28.912 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:28:28.912 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: from='mgr.14746 
192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:28:28.912 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:28:28.912 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: from='client.24533 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:28:28.912 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: from='client.24537 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:28:28.912 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: pgmap v11: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail; 27 KiB/s rd, 588 KiB/s wr, 84 op/s 2026-03-09T19:28:28.912 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: from='client.24541 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:28:28.912 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/3878896075' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:28:28.912 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/66557907' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: mon.vm07 calling monitor election 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: mon.vm07 is new leader, mons vm07,vm08 in quorum (ranks 0,1) 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: monmap epoch 2 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: last_changed 2026-03-09T19:22:58.690412+0000 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: created 2026-03-09T19:21:44.113262+0000 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: min_mon_release 18 (reef) 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: election_strategy: 1 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: 0: [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] mon.vm07 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:active} 2 up:standby 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: osdmap e45: 6 total, 6 up, 6 in 2026-03-09T19:28:28.913 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: mgrmap e30: vm07.xacuym(active, since 16s), standbys: vm08.mxylvw 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: overall HEALTH_OK 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: mgrmap e31: vm07.xacuym(active, since 16s), standbys: vm08.mxylvw 2026-03-09T19:28:28.913 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:28 vm07.local ceph-mon[111841]: from='mgr.14746 ' entity='' 2026-03-09T19:28:28.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.934+0000 7f894a6ae640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 msgr2=0x7f8910079af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk reading from fd=16 : -104 Unknown error -104 2026-03-09T19:28:28.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.934+0000 7f894a6ae640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 msgr2=0x7f8910079af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-09T19:28:28.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.934+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f8910079af0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f8930004750 tx=0x7f8930004070 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-09T19:28:28.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.934+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f8910079af0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto 
rx=0x7f8930004750 tx=0x7f8930004070 comp rx=0 tx=0).stop 2026-03-09T19:28:28.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.934+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 msgr2=0x7f8910079af0 unknown :-1 s=STATE_CLOSED l=1).mark_down 2026-03-09T19:28:28.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:28.934+0000 7f893f7fe640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f8910079af0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: from='mgr.14746 192.168.123.107:0/1089914834' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: from='client.24533 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: from='client.24537 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: pgmap v11: 65 
pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail; 27 KiB/s rd, 588 KiB/s wr, 84 op/s 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: from='client.24541 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/3878896075' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: from='client.? 192.168.123.107:0/66557907' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: mon.vm07 calling monitor election 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: mon.vm07 is new leader, mons vm07,vm08 in quorum (ranks 0,1) 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: monmap epoch 2 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: last_changed 2026-03-09T19:22:58.690412+0000 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: created 2026-03-09T19:21:44.113262+0000 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: min_mon_release 18 (reef) 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: election_strategy: 1 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:28:28 vm08.local ceph-mon[57794]: 0: [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] mon.vm07 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:active} 2 up:standby 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: osdmap e45: 6 total, 6 up, 6 in 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: mgrmap e30: vm07.xacuym(active, since 16s), standbys: vm08.mxylvw 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: overall HEALTH_OK 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: mgrmap e31: vm07.xacuym(active, since 16s), standbys: vm08.mxylvw 2026-03-09T19:28:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:28 vm08.local ceph-mon[57794]: from='mgr.14746 ' entity='' 2026-03-09T19:28:29.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:29.742+0000 7f893dffb640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f894410a350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:29.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:29.742+0000 7f893dffb640 1 -- 192.168.123.107:0/3539209232 --> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8944112710 con 0x7f8910077630 2026-03-09T19:28:29.742 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:29.742+0000 7f894a6ae640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 msgr2=0x7f894410a350 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/781232384 2026-03-09T19:28:29.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:29.742+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f894410a350 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T19:28:29.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:29.843+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 <== mon.1 v2:192.168.123.108:3300/0 8 ==== mgrmap(e 32) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f89380bf400 con 0x7f8944072a20 2026-03-09T19:28:29.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:29.943+0000 7f894a6ae640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 msgr2=0x7f894410a350 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/781232384 2026-03-09T19:28:29.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:29.943+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f894410a350 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-09T19:28:30.107 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-09T19:28:30.108 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1/tmp 2026-03-09T19:28:30.122 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:29 vm07.local ceph-mon[111841]: Standby 
manager daemon vm08.mxylvw restarted 2026-03-09T19:28:30.122 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:29 vm07.local ceph-mon[111841]: Standby manager daemon vm08.mxylvw started 2026-03-09T19:28:30.179 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:29 vm08.local ceph-mon[57794]: Standby manager daemon vm08.mxylvw restarted 2026-03-09T19:28:30.179 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:29 vm08.local ceph-mon[57794]: Standby manager daemon vm08.mxylvw started 2026-03-09T19:28:30.342 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:30.343+0000 7f894a6ae640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 msgr2=0x7f894410a350 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/781232384 2026-03-09T19:28:30.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:30.343+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f894410a350 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-09T19:28:31.143 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:31.144+0000 7f894a6ae640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 msgr2=0x7f894410a350 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/781232384 2026-03-09T19:28:31.143 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:31.144+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f894410a350 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-09T19:28:31.143 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:30 vm07.local ceph-mon[111841]: mgrmap e32: vm07.xacuym(active, since 17s), standbys: vm08.mxylvw 2026-03-09T19:28:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:30 vm08.local ceph-mon[57794]: mgrmap e32: vm07.xacuym(active, since 17s), standbys: vm08.mxylvw 2026-03-09T19:28:32.744 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:32.745+0000 7f894a6ae640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 msgr2=0x7f894410a350 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.107:6800/781232384 2026-03-09T19:28:32.744 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:32.745+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f894410a350 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-09T19:28:33.529 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:33.530+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 <== mon.1 v2:192.168.123.108:3300/0 9 ==== mgrmap(e 33) v1 ==== 99734+0+0 (secure 0 0 0) 0x7f8938083c00 con 0x7f8944072a20 2026-03-09T19:28:33.530 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:33.530+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 msgr2=0x7f894410a350 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:28:33.530 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:33.530+0000 7f893f7fe640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f894410a350 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:33.845 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:33 vm08.local ceph-mon[57794]: Active manager daemon vm07.xacuym restarted 2026-03-09T19:28:33.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:33 vm08.local ceph-mon[57794]: Activating manager daemon vm07.xacuym 2026-03-09T19:28:33.872 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:33 vm07.local ceph-mon[111841]: Active manager daemon vm07.xacuym restarted 2026-03-09T19:28:33.872 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:33 vm07.local ceph-mon[111841]: Activating manager daemon vm07.xacuym 2026-03-09T19:28:34.559 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.560+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 <== mon.1 v2:192.168.123.108:3300/0 10 ==== mgrmap(e 34) v1 ==== 99861+0+0 (secure 0 0 0) 0x7f89380858f0 con 0x7f8944072a20 2026-03-09T19:28:34.559 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.560+0000 7f893f7fe640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8910037fc0 0x7f891007eed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:34.560 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.561+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8944112710 con 0x7f8910037fc0 2026-03-09T19:28:34.560 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.562+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8910037fc0 0x7f891007eed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:34.561 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.562+0000 7f894a6ae640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8910037fc0 0x7f891007eed0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f8930002850 tx=0x7f8930015000 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.564+0000 7f893f7fe640 1 -- 192.168.123.107:0/3539209232 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+337 (secure 0 0 0) 0x7f8944112710 con 0x7f8910037fc0 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [ 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stdout: "mgr" 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stdout: ], 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "2/23 daemons upgraded", 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stdout: "message": "", 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:28:34.563 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.566+0000 7f894c939640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8910037fc0 msgr2=0x7f891007eed0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.566+0000 7f894c939640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8910037fc0 0x7f891007eed0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f8930002850 tx=0x7f8930015000 comp rx=0 tx=0).stop 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.567+0000 7f894c939640 1 -- 192.168.123.107:0/3539209232 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944072a20 msgr2=0x7f8944109880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.567+0000 7f894c939640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944072a20 0x7f8944109880 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f8944072e60 tx=0x7f8938031c40 comp rx=0 tx=0).stop 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.567+0000 7f894c939640 1 -- 192.168.123.107:0/3539209232 shutdown_connections 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.567+0000 7f894c939640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8910037fc0 0x7f891007eed0 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.567+0000 7f894c939640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:6800/781232384,v1:192.168.123.107:6801/781232384] conn(0x7f8910077630 0x7f894410a350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.567+0000 7f894c939640 1 --2- 
192.168.123.107:0/3539209232 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944072a20 0x7f8944109880 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.567+0000 7f894c939640 1 --2- 192.168.123.107:0/3539209232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89440720e0 0x7f894410b230 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.567+0000 7f894c939640 1 -- 192.168.123.107:0/3539209232 >> 192.168.123.107:0/3539209232 conn(0x7f894406d660 msgr2=0x7f894406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.567+0000 7f894c939640 1 -- 192.168.123.107:0/3539209232 shutdown_connections 2026-03-09T19:28:34.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.567+0000 7f894c939640 1 -- 192.168.123.107:0/3539209232 wait complete. 
2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: osdmap e46: 6 total, 6 up, 6 in 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: mgrmap e33: vm07.xacuym(active, starting, since 0.42067s), standbys: vm08.mxylvw 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.zkmcyw"}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.zcaqju"}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm07.xacuym", "id": "vm07.xacuym"}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm08.mxylvw", "id": "vm08.mxylvw"}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:28:34.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata"}]: dispatch 
2026-03-09T19:28:34.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T19:28:34.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T19:28:34.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: Manager daemon vm07.xacuym is now available 2026-03-09T19:28:34.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:28:34.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:28:34.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/mirror_snapshot_schedule"}]: dispatch 2026-03-09T19:28:34.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:34 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/trash_purge_schedule"}]: dispatch 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.678+0000 7f5344967640 1 -- 192.168.123.107:0/3011199205 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5340072420 msgr2=0x7f5340077190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:34.680 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.678+0000 7f5344967640 1 --2- 192.168.123.107:0/3011199205 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5340072420 0x7f5340077190 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f5338009040 tx=0x7f533802fc10 comp rx=0 tx=0).stop 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.678+0000 7f5344967640 1 -- 192.168.123.107:0/3011199205 shutdown_connections 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.678+0000 7f5344967640 1 --2- 192.168.123.107:0/3011199205 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5340072420 0x7f5340077190 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.678+0000 7f5344967640 1 --2- 192.168.123.107:0/3011199205 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5340071a50 0x7f5340071e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.678+0000 7f5344967640 1 -- 192.168.123.107:0/3011199205 >> 192.168.123.107:0/3011199205 conn(0x7f534006d4f0 msgr2=0x7f534006f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.679+0000 7f5344967640 1 -- 192.168.123.107:0/3011199205 shutdown_connections 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.679+0000 7f5344967640 1 -- 192.168.123.107:0/3011199205 wait complete. 
2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.679+0000 7f5344967640 1 Processor -- start 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.679+0000 7f5344967640 1 -- start start 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.679+0000 7f5344967640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5340071a50 0x7f5340084050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.679+0000 7f5344967640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5340072420 0x7f53400826a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.679+0000 7f5344967640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5340084620 con 0x7f5340071a50 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.679+0000 7f5344967640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5340082be0 con 0x7f5340072420 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.680+0000 7f533e575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5340072420 0x7f53400826a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.680+0000 7f533e575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5340072420 0x7f53400826a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.107:53954/0 (socket says 192.168.123.107:53954) 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.680+0000 7f533e575640 1 -- 192.168.123.107:0/1725410525 learned_addr learned my addr 192.168.123.107:0/1725410525 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.680+0000 7f533e575640 1 -- 192.168.123.107:0/1725410525 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5340071a50 msgr2=0x7f5340084050 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.680+0000 7f533e575640 1 --2- 192.168.123.107:0/1725410525 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5340071a50 0x7f5340084050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:34.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.680+0000 7f533e575640 1 -- 192.168.123.107:0/1725410525 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5338008cf0 con 0x7f5340072420 2026-03-09T19:28:34.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.682+0000 7f533e575640 1 --2- 192.168.123.107:0/1725410525 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5340072420 0x7f53400826a0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f5338004bd0 tx=0x7f53380046a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:34.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.682+0000 7f531ffff640 1 -- 192.168.123.107:0/1725410525 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5338041070 con 0x7f5340072420 2026-03-09T19:28:34.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.682+0000 7f5344967640 1 -- 
192.168.123.107:0/1725410525 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5340082e60 con 0x7f5340072420 2026-03-09T19:28:34.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.682+0000 7f5344967640 1 -- 192.168.123.107:0/1725410525 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5340083350 con 0x7f5340072420 2026-03-09T19:28:34.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.683+0000 7f531ffff640 1 -- 192.168.123.107:0/1725410525 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f5338031ea0 con 0x7f5340072420 2026-03-09T19:28:34.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.683+0000 7f531ffff640 1 -- 192.168.123.107:0/1725410525 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f533803bcd0 con 0x7f5340072420 2026-03-09T19:28:34.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.683+0000 7f5344967640 1 -- 192.168.123.107:0/1725410525 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f530c005350 con 0x7f5340072420 2026-03-09T19:28:34.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.686+0000 7f531ffff640 1 -- 192.168.123.107:0/1725410525 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 99861+0+0 (secure 0 0 0) 0x7f5338004200 con 0x7f5340072420 2026-03-09T19:28:34.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.686+0000 7f531ffff640 1 --2- 192.168.123.107:0/1725410525 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f53140776c0 0x7f5314079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:28:34.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.687+0000 7f531ffff640 1 -- 192.168.123.107:0/1725410525 
<== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f53380bf7c0 con 0x7f5340072420 2026-03-09T19:28:34.687 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.688+0000 7f533ed76640 1 --2- 192.168.123.107:0/1725410525 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f53140776c0 0x7f5314079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:28:34.691 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.689+0000 7f531ffff640 1 -- 192.168.123.107:0/1725410525 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f53380848d0 con 0x7f5340072420 2026-03-09T19:28:34.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.696+0000 7f533ed76640 1 --2- 192.168.123.107:0/1725410525 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f53140776c0 0x7f5314079b80 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f533000b440 tx=0x7f533000d040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: osdmap e46: 6 total, 6 up, 6 in 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: mgrmap e33: vm07.xacuym(active, starting, since 0.42067s), standbys: vm08.mxylvw 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.zkmcyw"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.zcaqju"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm07.xacuym", "id": "vm07.xacuym"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr metadata", "who": "vm08.mxylvw", "id": "vm08.mxylvw"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local 
ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: Manager daemon vm07.xacuym is now available 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local 
ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:28:34.850 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:28:34.851 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/mirror_snapshot_schedule"}]: dispatch 2026-03-09T19:28:34.851 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.xacuym/trash_purge_schedule"}]: dispatch 2026-03-09T19:28:34.894 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.894+0000 7f5344967640 1 -- 192.168.123.107:0/1725410525 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f530c0051c0 con 0x7f5340072420 2026-03-09T19:28:34.897 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.898+0000 7f531ffff640 1 -- 192.168.123.107:0/1725410525 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f53380879f0 con 0x7f5340072420 2026-03-09T19:28:34.897 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:28:34.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.901+0000 7f531dffb640 1 -- 192.168.123.107:0/1725410525 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f53140776c0 msgr2=0x7f5314079b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T19:28:34.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.901+0000 7f531dffb640 1 --2- 192.168.123.107:0/1725410525 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f53140776c0 0x7f5314079b80 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f533000b440 tx=0x7f533000d040 comp rx=0 tx=0).stop 2026-03-09T19:28:34.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.901+0000 7f531dffb640 1 -- 192.168.123.107:0/1725410525 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5340072420 msgr2=0x7f53400826a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:28:34.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.901+0000 7f531dffb640 1 --2- 192.168.123.107:0/1725410525 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5340072420 0x7f53400826a0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f5338004bd0 tx=0x7f53380046a0 comp rx=0 tx=0).stop 2026-03-09T19:28:34.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.901+0000 7f531dffb640 1 -- 192.168.123.107:0/1725410525 shutdown_connections 2026-03-09T19:28:34.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.901+0000 7f531dffb640 1 --2- 192.168.123.107:0/1725410525 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f53140776c0 0x7f5314079b80 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:34.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.901+0000 7f531dffb640 1 --2- 192.168.123.107:0/1725410525 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5340072420 0x7f53400826a0 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:34.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.901+0000 7f531dffb640 1 --2- 192.168.123.107:0/1725410525 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5340071a50 0x7f5340084050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:28:34.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.901+0000 7f531dffb640 1 -- 192.168.123.107:0/1725410525 >> 192.168.123.107:0/1725410525 conn(0x7f534006d4f0 msgr2=0x7f534006df00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:28:34.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.902+0000 7f531dffb640 1 -- 192.168.123.107:0/1725410525 shutdown_connections 2026-03-09T19:28:34.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:28:34.902+0000 7f531dffb640 1 -- 192.168.123.107:0/1725410525 wait complete. 2026-03-09T19:28:35.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:35 vm08.local ceph-mon[57794]: mgrmap e34: vm07.xacuym(active, since 1.46666s), standbys: vm08.mxylvw 2026-03-09T19:28:35.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:35 vm08.local ceph-mon[57794]: from='client.24553 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:28:35.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:35 vm08.local ceph-mon[57794]: pgmap v3: 65 pgs: 65 active+clean; 733 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail 2026-03-09T19:28:35.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:35 vm08.local ceph-mon[57794]: from='client.? 
192.168.123.107:0/1725410525' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:28:35.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:35 vm07.local ceph-mon[111841]: mgrmap e34: vm07.xacuym(active, since 1.46666s), standbys: vm08.mxylvw 2026-03-09T19:28:35.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:35 vm07.local ceph-mon[111841]: from='client.24553 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:28:35.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:35 vm07.local ceph-mon[111841]: pgmap v3: 65 pgs: 65 active+clean; 733 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail 2026-03-09T19:28:35.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:35 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1725410525' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:28:36.881 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:36 vm08.local ceph-mon[57794]: mgrmap e35: vm07.xacuym(active, since 2s), standbys: vm08.mxylvw 2026-03-09T19:28:36.882 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:36 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.882 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:36 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.882 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:36 vm08.local ceph-mon[57794]: [09/Mar/2026:19:28:35] ENGINE Bus STARTING 2026-03-09T19:28:36.882 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:36 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.882 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:36 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.882 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:36 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.882 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:36 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.882 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:36 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.882 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:36 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.882 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:36 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:28:36.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:36 vm07.local ceph-mon[111841]: mgrmap e35: vm07.xacuym(active, since 2s), standbys: vm08.mxylvw 2026-03-09T19:28:36.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:36 vm07.local ceph-mon[111841]: [09/Mar/2026:19:28:35] ENGINE Bus STARTING 2026-03-09T19:28:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:36 vm07.local 
ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:28:37.910 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:37 vm08.local ceph-mon[57794]: [09/Mar/2026:19:28:36] ENGINE Serving on https://192.168.123.107:7150 2026-03-09T19:28:37.910 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:37 vm08.local ceph-mon[57794]: [09/Mar/2026:19:28:36] ENGINE Client ('192.168.123.107', 36524) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T19:28:37.910 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:37 vm08.local ceph-mon[57794]: [09/Mar/2026:19:28:36] ENGINE Serving on http://192.168.123.107:8765 2026-03-09T19:28:37.910 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:37 vm08.local ceph-mon[57794]: [09/Mar/2026:19:28:36] ENGINE Bus STARTED 2026-03-09T19:28:37.910 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:37 vm08.local 
ceph-mon[57794]: Standby manager daemon vm08.mxylvw restarted 2026-03-09T19:28:37.910 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:37 vm08.local ceph-mon[57794]: Standby manager daemon vm08.mxylvw started 2026-03-09T19:28:37.910 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:37 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.108:0/4054110047' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/crt"}]: dispatch 2026-03-09T19:28:37.910 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:37 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.108:0/4054110047' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T19:28:37.910 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:37 vm08.local ceph-mon[57794]: from='mgr.? 192.168.123.108:0/4054110047' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/key"}]: dispatch 2026-03-09T19:28:37.910 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:37 vm08.local ceph-mon[57794]: from='mgr.? 
192.168.123.108:0/4054110047' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T19:28:37.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:37 vm07.local ceph-mon[111841]: [09/Mar/2026:19:28:36] ENGINE Serving on https://192.168.123.107:7150 2026-03-09T19:28:37.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:37 vm07.local ceph-mon[111841]: [09/Mar/2026:19:28:36] ENGINE Client ('192.168.123.107', 36524) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T19:28:37.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:37 vm07.local ceph-mon[111841]: [09/Mar/2026:19:28:36] ENGINE Serving on http://192.168.123.107:8765 2026-03-09T19:28:37.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:37 vm07.local ceph-mon[111841]: [09/Mar/2026:19:28:36] ENGINE Bus STARTED 2026-03-09T19:28:37.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:37 vm07.local ceph-mon[111841]: Standby manager daemon vm08.mxylvw restarted 2026-03-09T19:28:37.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:37 vm07.local ceph-mon[111841]: Standby manager daemon vm08.mxylvw started 2026-03-09T19:28:37.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:37 vm07.local ceph-mon[111841]: from='mgr.? 192.168.123.108:0/4054110047' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/crt"}]: dispatch 2026-03-09T19:28:37.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:37 vm07.local ceph-mon[111841]: from='mgr.? 192.168.123.108:0/4054110047' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T19:28:37.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:37 vm07.local ceph-mon[111841]: from='mgr.? 
192.168.123.108:0/4054110047' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.mxylvw/key"}]: dispatch 2026-03-09T19:28:37.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:37 vm07.local ceph-mon[111841]: from='mgr.? 192.168.123.108:0/4054110047' entity='mgr.vm08.mxylvw' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T19:28:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:38 vm07.local ceph-mon[111841]: pgmap v5: 65 pgs: 65 active+clean; 733 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail 2026-03-09T19:28:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:38 vm07.local ceph-mon[111841]: mgrmap e36: vm07.xacuym(active, since 4s), standbys: vm08.mxylvw 2026-03-09T19:28:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:28:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:28:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:38 vm07.local 
ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:38 vm08.local ceph-mon[57794]: pgmap v5: 65 pgs: 65 active+clean; 733 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail 2026-03-09T19:28:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:38 vm08.local ceph-mon[57794]: mgrmap e36: vm07.xacuym(active, since 4s), standbys: vm08.mxylvw 2026-03-09T19:28:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:38 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:38 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:38 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-09T19:28:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:38 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:38 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:28:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:38 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.095 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:38 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.903 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: Updating vm07:/etc/ceph/ceph.conf 2026-03-09T19:28:39.903 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: Updating vm08:/etc/ceph/ceph.conf 2026-03-09T19:28:39.903 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:28:39.903 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:28:39.903 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:28:39.903 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:28:39 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T19:28:39.904 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local ceph-mon[57794]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: Updating vm07:/etc/ceph/ceph.conf 2026-03-09T19:28:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: Updating 
vm08:/etc/ceph/ceph.conf 2026-03-09T19:28:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:28:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.conf 2026-03-09T19:28:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: Updating vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: Updating vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/config/ceph.client.admin.keyring 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:28:39.979 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T19:28:39.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:40.188 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:39 vm08.local systemd[1]: Stopping Ceph mon.vm08 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 
2026-03-09T19:28:40.188 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm08[57790]: 2026-03-09T19:28:40.093+0000 7f6a1f3ad640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm08 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:28:40.188 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm08[57790]: 2026-03-09T19:28:40.093+0000 7f6a1f3ad640 -1 mon.vm08@1(peon) e2 *** Got Signal Terminated *** 2026-03-09T19:28:40.473 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local podman[103296]: 2026-03-09 19:28:40.189855604 +0000 UTC m=+0.125333009 container died 8d7b1da9e1e263a5be2b49dafb6537c8ada36a7f685f10ea847dd85490b81218 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm08, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, ceph=True, org.label-schema.license=GPLv2) 2026-03-09T19:28:40.473 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local podman[103296]: 2026-03-09 19:28:40.218906476 +0000 UTC m=+0.154383881 
container remove 8d7b1da9e1e263a5be2b49dafb6537c8ada36a7f685f10ea847dd85490b81218 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm08, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6) 2026-03-09T19:28:40.473 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local bash[103296]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm08 2026-03-09T19:28:40.473 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm08.service: Deactivated successfully. 2026-03-09T19:28:40.473 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local systemd[1]: Stopped Ceph mon.vm08 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 2026-03-09T19:28:40.473 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm08.service: Consumed 3.646s CPU time. 2026-03-09T19:28:40.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local systemd[1]: Starting Ceph mon.vm08 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 
2026-03-09T19:28:40.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local podman[103406]: 2026-03-09 19:28:40.823707079 +0000 UTC m=+0.078912096 container create b4a58927ebfd3c4838a78529550b27f27d4836f9d55961adb68794a2f3bf8c2c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm08, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS) 2026-03-09T19:28:41.346 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local podman[103406]: 2026-03-09 19:28:40.867946238 +0000 UTC m=+0.123151264 container init b4a58927ebfd3c4838a78529550b27f27d4836f9d55961adb68794a2f3bf8c2c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm08, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local podman[103406]: 2026-03-09 19:28:40.87225774 +0000 UTC m=+0.127462747 container start b4a58927ebfd3c4838a78529550b27f27d4836f9d55961adb68794a2f3bf8c2c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm08, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local bash[103406]: b4a58927ebfd3c4838a78529550b27f27d4836f9d55961adb68794a2f3bf8c2c 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local podman[103406]: 2026-03-09 19:28:40.785227385 +0000 UTC m=+0.040432411 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local systemd[1]: Started Ceph mon.vm08 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 
2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: set uid:gid to 167:167 (ceph:ceph) 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: pidfile_write: ignore empty --pid-file 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: load: jerasure load: lrc 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: RocksDB version: 7.9.2 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Git sha 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: DB SUMMARY 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: DB Session ID: MER5KEOKBXC1DDZ9I19C 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: CURRENT file: CURRENT 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: IDENTITY file: IDENTITY 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: MANIFEST file: MANIFEST-000010 size: 668 Bytes 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm08/store.db 
dir, Total Num: 1, files: 000018.sst 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm08/store.db: 000016.log size: 6624319 ; 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.error_if_exists: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.create_if_missing: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.paranoid_checks: 1 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.env: 0x562e14b55dc0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.fs: PosixFileSystem 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.info_log: 0x562e169e3900 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_file_opening_threads: 16 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.statistics: (nil) 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 
09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.use_fsync: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_log_file_size: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.keep_log_file_num: 1000 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.recycle_log_file_num: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.allow_fallocate: 1 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.allow_mmap_reads: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.allow_mmap_writes: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.use_direct_reads: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.create_missing_column_families: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.db_log_dir: 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local 
ceph-mon[103420]: rocksdb: Options.wal_dir: 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.advise_random_on_open: 1 2026-03-09T19:28:41.347 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.db_write_buffer_size: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.write_buffer_manager: 0x562e169e7900 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-09T19:28:41.348 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.rate_limiter: (nil) 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.wal_recovery_mode: 2 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.enable_thread_tracking: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.enable_pipelined_write: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.unordered_write: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.row_cache: None 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.wal_filter: None 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.avoid_flush_during_recovery: 0 
2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.allow_ingest_behind: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.two_write_queues: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.manual_wal_flush: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.wal_compression: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.atomic_flush: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.log_readahead_size: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.best_efforts_recovery: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bgerror_resume_retry_interval: 
1000000 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.allow_data_in_errors: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.db_host_id: __hostname__ 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_background_jobs: 2 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_background_compactions: -1 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_subcompactions: 1 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_total_wal_size: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local 
ceph-mon[103420]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_open_files: -1 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bytes_per_sync: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_readahead_size: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_background_flushes: -1 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Compression algorithms supported: 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: kZSTD supported: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: kXpressCompression supported: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: kBZip2Compression supported: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local 
ceph-mon[103420]: rocksdb: kLZ4Compression supported: 1 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: kZlibCompression supported: 1 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: kLZ4HCCompression supported: 1 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: kSnappyCompression supported: 1 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm08/store.db/MANIFEST-000010 2026-03-09T19:28:41.348 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.merge_operator: 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_filter: None 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_filter_factory: None 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: 
Options.sst_partitioner_factory: None 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562e169e2500) 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: cache_index_and_filter_blocks: 1 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: pin_top_level_index_and_filter: 1 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_type: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: data_block_index_type: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_shortening: 1 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: checksum: 4 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: no_block_cache: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache: 0x562e16a07350 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_name: BinnedLRUCache 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_options: 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: capacity : 536870912 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: num_shard_bits : 4 
2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: strict_capacity_limit : 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: high_pri_pool_ratio: 0.000 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_compressed: (nil) 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: persistent_cache: (nil) 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_size: 4096 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_size_deviation: 10 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_restart_interval: 16 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_block_restart_interval: 1 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: metadata_block_size: 4096 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: partition_filters: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: use_delta_encoding: 1 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: filter_policy: bloomfilter 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: whole_key_filtering: 1 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: verify_compression: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: read_amp_bytes_per_bit: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: format_version: 5 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: enable_index_compression: 1 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_align: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_auto_readahead_size: 262144 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: prepopulate_block_cache: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout: initial_auto_readahead_size: 8192 2026-03-09T19:28:41.349 
INFO:journalctl@ceph.mon.vm08.vm08.stdout: num_file_reads_for_auto_readahead: 2 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.write_buffer_size: 33554432 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_write_buffer_number: 2 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compression: NoCompression 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bottommost_compression: Disabled 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.prefix_extractor: nullptr 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.num_levels: 7 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: 
Options.bottommost_compression_opts.level: 32767 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-09T19:28:41.349 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compression_opts.level: 32767 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compression_opts.strategy: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compression_opts.enabled: false 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.target_file_size_base: 67108864 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local 
ceph-mon[103420]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.arena_block_size: 1048576 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.disable_auto_compactions: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: 
rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.inplace_update_support: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.bloom_locality: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.max_successive_merges: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.paranoid_file_checks: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 
vm08.local ceph-mon[103420]: rocksdb: Options.force_consistency_checks: 1 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.report_bg_io_stats: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.ttl: 2592000 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-09T19:28:41.350 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.enable_blob_files: false 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.min_blob_size: 0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.blob_file_size: 268435456 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 
2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.blob_file_starting_level: 0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm08/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 20, last_sequence is 7760, log_number is 16,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 16 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 16 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 91fec2a6-c106-46d1-a7cc-e95775207d5a 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773084520946404, "job": 1, "event": "recovery_started", "wal_files": [16]} 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #16 mode 2 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:28:40 vm08.local ceph-mon[103420]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773084520971409, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 21, "file_size": 3920978, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7765, "largest_seqno": 8871, "table_properties": {"data_size": 3914709, "index_size": 3943, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 11884, "raw_average_key_size": 23, "raw_value_size": 3903315, "raw_average_value_size": 7791, "num_data_blocks": 187, "num_entries": 501, "num_filter_entries": 501, "num_deletions": 2, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773084520, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "91fec2a6-c106-46d1-a7cc-e95775207d5a", "db_session_id": "MER5KEOKBXC1DDZ9I19C", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}} 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773084520972537, "job": 1, "event": "recovery_finished"} 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/version_set.cc:5047] Creating manifest 23 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm08/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x562e16a08e00 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: DB pointer 0x562e16b14000 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** DB Stats ** 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** Compaction Stats [default] ** 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: L0 1/0 3.74 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 223.4 0.02 0.00 1 0.017 0 0 0.0 0.0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: L6 1/0 6.58 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Sum 2/0 10.32 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 223.4 0.02 0.00 1 0.017 0 0 0.0 0.0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 223.4 0.02 0.00 1 0.017 0 0 0.0 0.0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** Compaction Stats [default] ** 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 223.4 0.02 0.00 1 0.017 0 0 0.0 0.0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Flush(GB): cumulative 0.004, interval 0.004 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative compaction: 0.00 GB write, 101.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval compaction: 0.00 GB write, 101.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-09T19:28:41.351 
INFO:journalctl@ceph.mon.vm08.vm08.stdout: Block cache BinnedLRUCache@0x562e16a07350#2 capacity: 512.00 MB usage: 13.52 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.3e-05 secs_since: 0 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Block cache entry stats(count,size,portion): FilterBlock(2,9.44 KB,0.00180006%) IndexBlock(1,4.08 KB,0.000777841%) Misc(1,0.00 KB,0%) 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-09T19:28:41.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: starting mon.vm08 rank 1 at public addrs [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] at bind addrs [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon_data /var/lib/ceph/mon/ceph-vm08 fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: mon.vm08@-1(???) 
e2 preinit fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: mon.vm08@-1(???).mds e13 new map 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: mon.vm08@-1(???).mds e13 print_map 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: e13 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: legacy client fscid: 1 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Filesystem 'cephfs' (1) 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: fs_name cephfs 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: epoch 13 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: tableserver 0 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: root 0 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: session_timeout 60 2026-03-09T19:28:41.352 
INFO:journalctl@ceph.mon.vm08.vm08.stdout: session_autoclose 300 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_file_size 1099511627776 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_xattr_size 65536 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: required_client_features {} 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: last_failure 0 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: last_failure_osd_epoch 41 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_mds 2 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: in 0,1 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: up {0=24279,1=24285} 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: failed 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: damaged 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: stopped 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: data_pools [3] 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: metadata_pool 2 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: inline_data disabled 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: balancer 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: bal_rank_mask -1 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: standby_count_wanted 1 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: qdb_cluster leader: 0 members: 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 
[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: [mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Standby daemons: 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: [mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout: [mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: mon.vm08@-1(???).osd e46 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: mon.vm08@-1(???).osd e46 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: mon.vm08@-1(???).osd e46 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: mon.vm08@-1(???).osd e46 crush map has features 288514051259236352, adjusting msgr 
requires 2026-03-09T19:28:41.352 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:40 vm08.local ceph-mon[103420]: mon.vm08@-1(???).paxosservice(auth 1..22) refresh upgraded, format 0 -> 3 2026-03-09T19:28:41.467 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-09T19:28:41.467 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: pgmap v7: 65 pgs: 65 active+clean; 548 MiB data, 4.0 GiB used, 116 GiB / 120 GiB avail; 38 KiB/s rd, 464 KiB/s wr, 110 op/s 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: mon.vm07 calling monitor election 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: mon.vm08 calling monitor election 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: mon.vm07 is new leader, mons vm07,vm08 in quorum (ranks 0,1) 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: monmap epoch 3 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: last_changed 
2026-03-09T19:28:41.795559+0000 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: created 2026-03-09T19:21:44.113262+0000 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: min_mon_release 19 (squid) 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: election_strategy: 1 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: 0: [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] mon.vm07 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:active} 2 up:standby 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: osdmap e46: 6 total, 6 up, 6 in 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: mgrmap e36: vm07.xacuym(active, since 8s), standbys: vm08.mxylvw 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: overall HEALTH_OK 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:43.030 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:42 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:28:43.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: pgmap v7: 65 pgs: 65 active+clean; 548 MiB data, 4.0 GiB used, 116 GiB / 120 GiB avail; 38 KiB/s rd, 464 KiB/s wr, 110 op/s 2026-03-09T19:28:43.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-09T19:28:43.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: mon.vm07 calling monitor election 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: mon.vm08 calling monitor election 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: mon.vm07 is new leader, mons vm07,vm08 in quorum (ranks 0,1) 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: monmap epoch 3 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: last_changed 2026-03-09T19:28:41.795559+0000 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: created 2026-03-09T19:21:44.113262+0000 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: min_mon_release 19 (squid) 
2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: election_strategy: 1 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: 0: [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] mon.vm07 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:active} 2 up:standby 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: osdmap e46: 6 total, 6 up, 6 in 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: mgrmap e36: vm07.xacuym(active, since 8s), standbys: vm08.mxylvw 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: overall HEALTH_OK 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:43.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:42 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:28:44.782 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:44.782 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:28:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:44.782 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:44 vm08.local ceph-mon[103420]: pgmap v8: 65 pgs: 65 active+clean; 548 MiB data, 4.0 GiB used, 116 GiB / 120 GiB avail; 29 KiB/s rd, 361 KiB/s wr, 85 op/s 2026-03-09T19:28:44.782 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:44.783 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:44.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:44.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:44.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:44 vm07.local ceph-mon[111841]: pgmap v8: 65 pgs: 65 active+clean; 548 MiB data, 4.0 GiB used, 116 GiB / 120 GiB avail; 29 KiB/s rd, 361 KiB/s wr, 85 op/s 2026-03-09T19:28:44.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:44.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:46.731 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:46 vm07.local ceph-mon[111841]: pgmap v9: 65 pgs: 65 active+clean; 290 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 740 KiB/s wr, 165 op/s 2026-03-09T19:28:46.731 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:46 vm07.local 
ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:46.731 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:46.731 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:46.731 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:28:46.731 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:46.731 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T19:28:46.731 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T19:28:46.731 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:46.748 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:46 vm08.local ceph-mon[103420]: pgmap v9: 65 pgs: 65 active+clean; 290 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 740 KiB/s wr, 165 op/s 2026-03-09T19:28:46.748 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:46 
vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:46.748 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:46.749 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:46.749 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:28:46.749 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:46.749 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T19:28:46.749 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T19:28:46.749 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: Reconfiguring mon.vm07 (monmap changed)... 
2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: Reconfiguring daemon mon.vm07 on vm07 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.xacuym", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", 
"allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm07"}]: dispatch 2026-03-09T19:28:47.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:47 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: Reconfiguring mon.vm07 (monmap changed)... 
2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: Reconfiguring daemon mon.vm07 on vm07 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.xacuym", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", 
"allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm07"}]: dispatch 2026-03-09T19:28:48.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:47 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: Reconfiguring mgr.vm07.xacuym (monmap changed)... 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: Reconfiguring daemon mgr.vm07.xacuym on vm07 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: Reconfiguring ceph-exporter.vm07 (monmap changed)... 
2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: Unable to update caps for client.ceph-exporter.vm07 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: Reconfiguring daemon ceph-exporter.vm07 on vm07 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: pgmap v10: 65 pgs: 65 active+clean; 290 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 676 KiB/s wr, 151 op/s 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:48.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:28:48.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: Reconfiguring mgr.vm07.xacuym (monmap changed)... 2026-03-09T19:28:48.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: Reconfiguring daemon mgr.vm07.xacuym on vm07 2026-03-09T19:28:48.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: Reconfiguring ceph-exporter.vm07 (monmap changed)... 
2026-03-09T19:28:48.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: Unable to update caps for client.ceph-exporter.vm07 2026-03-09T19:28:48.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: Reconfiguring daemon ceph-exporter.vm07 on vm07 2026-03-09T19:28:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: pgmap v10: 65 pgs: 65 active+clean; 290 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 676 KiB/s wr, 151 op/s 2026-03-09T19:28:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T19:28:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T19:28:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: Reconfiguring crash.vm07 (monmap changed)... 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: Reconfiguring daemon crash.vm07 on vm07 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: Reconfiguring osd.0 (monmap changed)... 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: Reconfiguring daemon osd.0 on vm07 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: Reconfiguring osd.1 (monmap changed)... 
2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: Reconfiguring daemon osd.1 on vm07 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.uizncw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:28:49.697 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: Reconfiguring crash.vm07 (monmap changed)... 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: Reconfiguring daemon crash.vm07 on vm07 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: Reconfiguring osd.0 (monmap changed)... 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: Reconfiguring daemon osd.0 on vm07 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: Reconfiguring osd.1 (monmap changed)... 
2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: Reconfiguring daemon osd.1 on vm07 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.uizncw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:28:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:50.699 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: Reconfiguring osd.2 (monmap changed)... 2026-03-09T19:28:50.699 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: Reconfiguring daemon osd.2 on vm07 2026-03-09T19:28:50.699 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: pgmap v11: 65 pgs: 65 active+clean; 290 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 676 KiB/s wr, 151 op/s 2026-03-09T19:28:50.699 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: Reconfiguring mds.cephfs.vm07.uizncw (monmap changed)... 
2026-03-09T19:28:50.699 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: Reconfiguring daemon mds.cephfs.vm07.uizncw on vm07 2026-03-09T19:28:50.699 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:50.699 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:50.699 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.zkmcyw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:28:50.699 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:50.699 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:50.700 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:50.700 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:28:50.700 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:28:50.700 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T19:28:50.700 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm08"}]: dispatch 2026-03-09T19:28:50.700 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: Reconfiguring osd.2 (monmap changed)... 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: Reconfiguring daemon osd.2 on vm07 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: pgmap v11: 65 pgs: 65 active+clean; 290 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 676 KiB/s wr, 151 op/s 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: Reconfiguring mds.cephfs.vm07.uizncw (monmap changed)... 
2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: Reconfiguring daemon mds.cephfs.vm07.uizncw on vm07 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.zkmcyw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T19:28:51.050 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm08"}]: dispatch 2026-03-09T19:28:51.051 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:51.799 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: Reconfiguring mds.cephfs.vm07.zkmcyw (monmap changed)... 2026-03-09T19:28:51.799 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: Reconfiguring daemon mds.cephfs.vm07.zkmcyw on vm07 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: Reconfiguring ceph-exporter.vm08 (monmap changed)... 
2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: Unable to update caps for client.ceph-exporter.vm08 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: Reconfiguring daemon ceph-exporter.vm08 on vm08 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: Reconfiguring crash.vm08 (monmap changed)... 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: Reconfiguring daemon crash.vm08 on vm08 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 
vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.mxylvw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T19:28:51.800 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:28:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: Reconfiguring mds.cephfs.vm07.zkmcyw (monmap changed)... 
2026-03-09T19:28:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: Reconfiguring daemon mds.cephfs.vm07.zkmcyw on vm07
2026-03-09T19:28:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: Reconfiguring ceph-exporter.vm08 (monmap changed)...
2026-03-09T19:28:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: Unable to update caps for client.ceph-exporter.vm08
2026-03-09T19:28:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: Reconfiguring daemon ceph-exporter.vm08 on vm08
2026-03-09T19:28:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: Reconfiguring crash.vm08 (monmap changed)...
2026-03-09T19:28:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: Reconfiguring daemon crash.vm08 on vm08
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.mxylvw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-09T19:28:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: Reconfiguring mgr.vm08.mxylvw (monmap changed)...
2026-03-09T19:28:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: Reconfiguring daemon mgr.vm08.mxylvw on vm08
2026-03-09T19:28:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: pgmap v12: 65 pgs: 65 active+clean; 293 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 42 KiB/s rd, 1.2 MiB/s wr, 267 op/s
2026-03-09T19:28:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: Reconfiguring mon.vm08 (monmap changed)...
2026-03-09T19:28:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: Reconfiguring daemon mon.vm08 on vm08
2026-03-09T19:28:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-09T19:28:53.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:53.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch
2026-03-09T19:28:53.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:53.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch
2026-03-09T19:28:53.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:52 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: Reconfiguring mgr.vm08.mxylvw (monmap changed)...
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: Reconfiguring daemon mgr.vm08.mxylvw on vm08
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: pgmap v12: 65 pgs: 65 active+clean; 293 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 42 KiB/s rd, 1.2 MiB/s wr, 267 op/s
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: Reconfiguring mon.vm08 (monmap changed)...
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: Reconfiguring daemon mon.vm08 on vm08
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch
2026-03-09T19:28:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:52 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:54.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:54 vm08.local ceph-mon[103420]: Reconfiguring osd.3 (monmap changed)...
2026-03-09T19:28:54.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:54 vm08.local ceph-mon[103420]: Reconfiguring daemon osd.3 on vm08
2026-03-09T19:28:54.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:54 vm08.local ceph-mon[103420]: Reconfiguring osd.4 (monmap changed)...
2026-03-09T19:28:54.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:54 vm08.local ceph-mon[103420]: Reconfiguring daemon osd.4 on vm08
2026-03-09T19:28:54.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:54 vm08.local ceph-mon[103420]: Reconfiguring osd.5 (monmap changed)...
2026-03-09T19:28:54.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:54 vm08.local ceph-mon[103420]: Reconfiguring daemon osd.5 on vm08
2026-03-09T19:28:54.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:54 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:54.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:54 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:54.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:54 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.zcaqju", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T19:28:54.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:54 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:54 vm07.local ceph-mon[111841]: Reconfiguring osd.3 (monmap changed)...
2026-03-09T19:28:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:54 vm07.local ceph-mon[111841]: Reconfiguring daemon osd.3 on vm08
2026-03-09T19:28:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:54 vm07.local ceph-mon[111841]: Reconfiguring osd.4 (monmap changed)...
2026-03-09T19:28:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:54 vm07.local ceph-mon[111841]: Reconfiguring daemon osd.4 on vm08
2026-03-09T19:28:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:54 vm07.local ceph-mon[111841]: Reconfiguring osd.5 (monmap changed)...
2026-03-09T19:28:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:54 vm07.local ceph-mon[111841]: Reconfiguring daemon osd.5 on vm08
2026-03-09T19:28:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:54 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:54 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:54 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.zcaqju", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T19:28:54.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:54 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:55.465 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: Reconfiguring mds.cephfs.vm08.zcaqju (monmap changed)...
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: Reconfiguring daemon mds.cephfs.vm08.zcaqju on vm08
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: pgmap v13: 65 pgs: 65 active+clean; 293 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 937 KiB/s wr, 203 op/s
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: Reconfiguring mds.cephfs.vm08.jwsqrf (monmap changed)...
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.jwsqrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: Reconfiguring daemon mds.cephfs.vm08.jwsqrf on vm08
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: Upgrade: Setting container_image for all mon
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm07"}]: dispatch
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm07"}]': finished
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm08"}]: dispatch
2026-03-09T19:28:55.466 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:55 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm08"}]': finished
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: Reconfiguring mds.cephfs.vm08.zcaqju (monmap changed)...
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: Reconfiguring daemon mds.cephfs.vm08.zcaqju on vm08
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: pgmap v13: 65 pgs: 65 active+clean; 293 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 937 KiB/s wr, 203 op/s
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: Reconfiguring mds.cephfs.vm08.jwsqrf (monmap changed)...
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.jwsqrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: Reconfiguring daemon mds.cephfs.vm08.jwsqrf on vm08
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: Upgrade: Setting container_image for all mon
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm07"}]: dispatch
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm07"}]': finished
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm08"}]: dispatch
2026-03-09T19:28:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:55 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm08"}]': finished
2026-03-09T19:28:56.457 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:56 vm07.local ceph-mon[111841]: Upgrade: Updating crash.vm07 (1/2)
2026-03-09T19:28:56.458 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:56 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:56.458 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:56 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T19:28:56.458 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:56 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:56.458 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:56 vm07.local ceph-mon[111841]: Deploying daemon crash.vm07 on vm07
2026-03-09T19:28:56.458 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:56 vm07.local ceph-mon[111841]: pgmap v14: 65 pgs: 65 active+clean; 294 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.3 MiB/s wr, 330 op/s
2026-03-09T19:28:56.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:56 vm08.local ceph-mon[103420]: Upgrade: Updating crash.vm07 (1/2)
2026-03-09T19:28:56.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:56 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:56.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:56 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T19:28:56.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:56 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:56.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:56 vm08.local ceph-mon[103420]: Deploying daemon crash.vm07 on vm07
2026-03-09T19:28:56.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:56 vm08.local ceph-mon[103420]: pgmap v14: 65 pgs: 65 active+clean; 294 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.3 MiB/s wr, 330 op/s
2026-03-09T19:28:57.657 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:57 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:57.657 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:57 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:57.657 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:57 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:57.657 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:57 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T19:28:57.657 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:57 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:57 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:57 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:57 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:28:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:57 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T19:28:57.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:57 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:28:58.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:58 vm08.local ceph-mon[103420]: Upgrade: Updating crash.vm08 (2/2)
2026-03-09T19:28:58.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:58 vm08.local ceph-mon[103420]: Deploying daemon crash.vm08 on vm08
2026-03-09T19:28:58.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:28:58 vm08.local ceph-mon[103420]: pgmap v15: 65 pgs: 65 active+clean; 294 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 912 KiB/s wr, 242 op/s
2026-03-09T19:28:58.862 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:58 vm07.local ceph-mon[111841]: Upgrade: Updating crash.vm08 (2/2)
2026-03-09T19:28:58.862 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:58 vm07.local ceph-mon[111841]: Deploying daemon crash.vm08 on vm08
2026-03-09T19:28:58.862 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:28:58 vm07.local ceph-mon[111841]: pgmap v15: 65 pgs: 65 active+clean; 294 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 912 KiB/s wr, 242 op/s
2026-03-09T19:29:00.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:00 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:00.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:00 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:00.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:00 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:29:00.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:00 vm08.local ceph-mon[103420]: pgmap v16: 65 pgs: 65 active+clean; 294 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 912 KiB/s wr, 242 op/s
2026-03-09T19:29:00.110 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:00 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:00.110 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:00 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:00.111 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:00 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:29:00.111 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:00 vm07.local ceph-mon[111841]: pgmap v16: 65 pgs: 65 active+clean; 294 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 912 KiB/s wr, 242 op/s
2026-03-09T19:29:01.926 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.926 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.926 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.926 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.926 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.926 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.926 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.926 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.936 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.937 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.937 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.937 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.937 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.937 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.937 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:01.937 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:02.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:02 vm07.local ceph-mon[111841]: pgmap v17: 65 pgs: 65 active+clean; 296 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 1.3 MiB/s wr, 358 op/s
2026-03-09T19:29:02.967 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:02 vm08.local ceph-mon[103420]: pgmap v17: 65 pgs: 65 active+clean; 296 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 1.3 MiB/s wr, 358 op/s
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: Upgrade: Setting container_image for all crash
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm07"}]: dispatch
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm07"}]': finished
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]: dispatch
2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]': finished
2026-03-09T19:29:04.315 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: Upgrade: osd.0 is safe to restart 2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: pgmap v18: 65 pgs: 65 active+clean; 296 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 768 KiB/s wr, 242 op/s 2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: Upgrade: Updating osd.0 2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 
2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:04.345 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: Upgrade: Setting container_image for all crash 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm07"}]: dispatch 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm07"}]': finished 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]: dispatch 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]': finished 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: Upgrade: osd.0 is safe to restart 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: pgmap v18: 65 pgs: 65 active+clean; 296 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 768 KiB/s wr, 242 op/s 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: Upgrade: Updating osd.0 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:04.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T19:29:05.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.043+0000 7faffda8e640 1 -- 192.168.123.107:0/3228695976 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faff8075ba0 msgr2=0x7faff8075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.043+0000 7faffda8e640 1 --2- 192.168.123.107:0/3228695976 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faff8075ba0 0x7faff8075fa0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fafe0009a00 tx=0x7fafe002f290 comp rx=0 tx=0).stop 2026-03-09T19:29:05.044 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.045+0000 
7faffda8e640 1 -- 192.168.123.107:0/3228695976 shutdown_connections 2026-03-09T19:29:05.044 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.045+0000 7faffda8e640 1 --2- 192.168.123.107:0/3228695976 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faff8076df0 0x7faff8077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.044 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.045+0000 7faffda8e640 1 --2- 192.168.123.107:0/3228695976 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faff8075ba0 0x7faff8075fa0 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.044 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.045+0000 7faffda8e640 1 -- 192.168.123.107:0/3228695976 >> 192.168.123.107:0/3228695976 conn(0x7faff80fe040 msgr2=0x7faff8100460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:05.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.045+0000 7faffda8e640 1 -- 192.168.123.107:0/3228695976 shutdown_connections 2026-03-09T19:29:05.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.045+0000 7faffda8e640 1 -- 192.168.123.107:0/3228695976 wait complete. 
2026-03-09T19:29:05.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.046+0000 7faffda8e640 1 Processor -- start 2026-03-09T19:29:05.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.046+0000 7faffda8e640 1 -- start start 2026-03-09T19:29:05.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.046+0000 7faffda8e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faff8075ba0 0x7faff8071670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.046+0000 7faffda8e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faff8076df0 0x7faff8071bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.046+0000 7faffda8e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faff80730b0 con 0x7faff8076df0 2026-03-09T19:29:05.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.046+0000 7faffda8e640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faff8073220 con 0x7faff8075ba0 2026-03-09T19:29:05.046 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.047+0000 7fafeffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faff8076df0 0x7faff8071bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:05.046 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.047+0000 7fafeffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faff8076df0 0x7faff8071bb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:43830/0 (socket says 192.168.123.107:43830) 2026-03-09T19:29:05.046 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.047+0000 7fafeffff640 1 -- 192.168.123.107:0/3563177982 learned_addr learned my addr 192.168.123.107:0/3563177982 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:05.046 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.047+0000 7fafeffff640 1 -- 192.168.123.107:0/3563177982 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faff8075ba0 msgr2=0x7faff8071670 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:29:05.047 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.047+0000 7fafeffff640 1 --2- 192.168.123.107:0/3563177982 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faff8075ba0 0x7faff8071670 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.047 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.047+0000 7fafeffff640 1 -- 192.168.123.107:0/3563177982 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fafe0009660 con 0x7faff8076df0 2026-03-09T19:29:05.047 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.048+0000 7fafeffff640 1 --2- 192.168.123.107:0/3563177982 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faff8076df0 0x7faff8071bb0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7fafe800b750 tx=0x7fafe800bc20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:05.047 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.048+0000 7fafedffb640 1 -- 192.168.123.107:0/3563177982 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fafe8004070 con 0x7faff8076df0 2026-03-09T19:29:05.048 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.048+0000 7faffda8e640 1 -- 
192.168.123.107:0/3563177982 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faff8072290 con 0x7faff8076df0 2026-03-09T19:29:05.048 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.048+0000 7faffda8e640 1 -- 192.168.123.107:0/3563177982 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faff819bbb0 con 0x7faff8076df0 2026-03-09T19:29:05.048 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.049+0000 7fafedffb640 1 -- 192.168.123.107:0/3563177982 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fafe8002780 con 0x7faff8076df0 2026-03-09T19:29:05.048 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.049+0000 7fafedffb640 1 -- 192.168.123.107:0/3563177982 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fafe800ca70 con 0x7faff8076df0 2026-03-09T19:29:05.052 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.053+0000 7fafedffb640 1 -- 192.168.123.107:0/3563177982 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fafe800ccb0 con 0x7faff8076df0 2026-03-09T19:29:05.052 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.053+0000 7faffda8e640 1 -- 192.168.123.107:0/3563177982 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fafc4005350 con 0x7faff8076df0 2026-03-09T19:29:05.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.056+0000 7fafedffb640 1 --2- 192.168.123.107:0/3563177982 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd0077a00 0x7fafd0079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.056+0000 7fafedffb640 1 -- 
192.168.123.107:0/3563177982 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fafe8099210 con 0x7faff8076df0 2026-03-09T19:29:05.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.057+0000 7faffca8c640 1 --2- 192.168.123.107:0/3563177982 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd0077a00 0x7fafd0079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:05.063 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.064+0000 7fafedffb640 1 -- 192.168.123.107:0/3563177982 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fafe8061800 con 0x7faff8076df0 2026-03-09T19:29:05.068 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.066+0000 7faffca8c640 1 --2- 192.168.123.107:0/3563177982 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd0077a00 0x7fafd0079ec0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fafe0002c80 tx=0x7fafe00023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:05.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:05 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:29:05.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:05 vm07.local ceph-mon[111841]: Deploying daemon osd.0 on vm07 2026-03-09T19:29:05.315 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.315+0000 7faffda8e640 1 -- 192.168.123.107:0/3563177982 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7fafc4002bf0 con 0x7fafd0077a00 2026-03-09T19:29:05.317 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.319+0000 7fafedffb640 1 -- 192.168.123.107:0/3563177982 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fafc4002bf0 con 0x7fafd0077a00 2026-03-09T19:29:05.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.322+0000 7fafcf7fe640 1 -- 192.168.123.107:0/3563177982 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd0077a00 msgr2=0x7fafd0079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.322+0000 7fafcf7fe640 1 --2- 192.168.123.107:0/3563177982 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd0077a00 0x7fafd0079ec0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fafe0002c80 tx=0x7fafe00023d0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.322+0000 7fafcf7fe640 1 -- 192.168.123.107:0/3563177982 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faff8076df0 msgr2=0x7faff8071bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.322+0000 7fafcf7fe640 1 --2- 192.168.123.107:0/3563177982 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faff8076df0 0x7faff8071bb0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7fafe800b750 tx=0x7fafe800bc20 comp rx=0 tx=0).stop 2026-03-09T19:29:05.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.323+0000 7fafcf7fe640 1 -- 192.168.123.107:0/3563177982 shutdown_connections 2026-03-09T19:29:05.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.323+0000 7fafcf7fe640 1 --2- 192.168.123.107:0/3563177982 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd0077a00 0x7fafd0079ec0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.323+0000 7fafcf7fe640 1 --2- 192.168.123.107:0/3563177982 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faff8076df0 0x7faff8071bb0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.323+0000 7fafcf7fe640 1 --2- 192.168.123.107:0/3563177982 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faff8075ba0 0x7faff8071670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.323+0000 7fafcf7fe640 1 -- 192.168.123.107:0/3563177982 >> 192.168.123.107:0/3563177982 conn(0x7faff80fe040 msgr2=0x7faff80ffb60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:05.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.323+0000 7fafcf7fe640 1 -- 192.168.123.107:0/3563177982 shutdown_connections 2026-03-09T19:29:05.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.323+0000 7fafcf7fe640 1 -- 192.168.123.107:0/3563177982 wait complete. 
2026-03-09T19:29:05.333 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:29:05.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:05 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:29:05.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:05 vm08.local ceph-mon[103420]: Deploying daemon osd.0 on vm07 2026-03-09T19:29:05.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.417+0000 7fd4795e7640 1 -- 192.168.123.107:0/1744522957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd474072440 msgr2=0x7fd4740771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.417+0000 7fd4795e7640 1 --2- 192.168.123.107:0/1744522957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd474072440 0x7fd4740771b0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fd46c009040 tx=0x7fd46c031a90 comp rx=0 tx=0).stop 2026-03-09T19:29:05.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.418+0000 7fd4795e7640 1 -- 192.168.123.107:0/1744522957 shutdown_connections 2026-03-09T19:29:05.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.418+0000 7fd4795e7640 1 --2- 192.168.123.107:0/1744522957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd474072440 0x7fd4740771b0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.418+0000 7fd4795e7640 1 --2- 192.168.123.107:0/1744522957 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd474071a70 0x7fd474071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.418+0000 7fd4795e7640 
1 -- 192.168.123.107:0/1744522957 >> 192.168.123.107:0/1744522957 conn(0x7fd47406d4f0 msgr2=0x7fd47406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.418+0000 7fd4795e7640 1 -- 192.168.123.107:0/1744522957 shutdown_connections 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.418+0000 7fd4795e7640 1 -- 192.168.123.107:0/1744522957 wait complete. 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.418+0000 7fd4795e7640 1 Processor -- start 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.418+0000 7fd4795e7640 1 -- start start 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.419+0000 7fd4795e7640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd474071a70 0x7fd4740840d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.419+0000 7fd4795e7640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd474082720 0x7fd474082ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.419+0000 7fd4795e7640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd474084610 con 0x7fd474071a70 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.419+0000 7fd4795e7640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd4740830e0 con 0x7fd474082720 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.419+0000 7fd472ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd474071a70 0x7fd4740840d0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.419+0000 7fd472ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd474071a70 0x7fd4740840d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43856/0 (socket says 192.168.123.107:43856) 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.419+0000 7fd472ffd640 1 -- 192.168.123.107:0/1479527227 learned_addr learned my addr 192.168.123.107:0/1479527227 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:05.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.419+0000 7fd4727fc640 1 --2- 192.168.123.107:0/1479527227 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd474082720 0x7fd474082ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:05.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.420+0000 7fd4727fc640 1 -- 192.168.123.107:0/1479527227 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd474071a70 msgr2=0x7fd4740840d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.420+0000 7fd4727fc640 1 --2- 192.168.123.107:0/1479527227 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd474071a70 0x7fd4740840d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.420+0000 7fd4727fc640 1 -- 192.168.123.107:0/1479527227 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd46c008cf0 con 0x7fd474082720 2026-03-09T19:29:05.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.420+0000 7fd4727fc640 1 --2- 192.168.123.107:0/1479527227 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd474082720 0x7fd474082ba0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fd4740730e0 tx=0x7fd46c004060 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:05.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.420+0000 7fd453fff640 1 -- 192.168.123.107:0/1479527227 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd46c0045d0 con 0x7fd474082720 2026-03-09T19:29:05.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.420+0000 7fd4795e7640 1 -- 192.168.123.107:0/1479527227 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd474083330 con 0x7fd474082720 2026-03-09T19:29:05.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.420+0000 7fd4795e7640 1 -- 192.168.123.107:0/1479527227 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd47412ef70 con 0x7fd474082720 2026-03-09T19:29:05.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.421+0000 7fd453fff640 1 -- 192.168.123.107:0/1479527227 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd46c037a40 con 0x7fd474082720 2026-03-09T19:29:05.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.421+0000 7fd453fff640 1 -- 192.168.123.107:0/1479527227 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd46c00de20 con 0x7fd474082720 2026-03-09T19:29:05.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.423+0000 7fd453fff640 1 -- 192.168.123.107:0/1479527227 <== mon.1 
v2:192.168.123.108:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fd46c03e070 con 0x7fd474082720 2026-03-09T19:29:05.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.423+0000 7fd453fff640 1 --2- 192.168.123.107:0/1479527227 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd454077a00 0x7fd454079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.423+0000 7fd453fff640 1 -- 192.168.123.107:0/1479527227 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fd46c033040 con 0x7fd474082720 2026-03-09T19:29:05.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.424+0000 7fd472ffd640 1 --2- 192.168.123.107:0/1479527227 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd454077a00 0x7fd454079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:05.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.424+0000 7fd472ffd640 1 --2- 192.168.123.107:0/1479527227 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd454077a00 0x7fd454079ec0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fd464005fd0 tx=0x7fd4640074e0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:05.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.424+0000 7fd4795e7640 1 -- 192.168.123.107:0/1479527227 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd474072440 con 0x7fd474082720 2026-03-09T19:29:05.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.428+0000 7fd453fff640 1 -- 192.168.123.107:0/1479527227 <== mon.1 
v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd46c089900 con 0x7fd474082720 2026-03-09T19:29:05.595 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.594+0000 7fd4795e7640 1 -- 192.168.123.107:0/1479527227 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd474075a60 con 0x7fd454077a00 2026-03-09T19:29:05.595 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:05 vm07.local systemd[1]: Stopping Ceph osd.0 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:29:05.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.600+0000 7fd453fff640 1 -- 192.168.123.107:0/1479527227 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fd474075a60 con 0x7fd454077a00 2026-03-09T19:29:05.601 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.602+0000 7fd4795e7640 1 -- 192.168.123.107:0/1479527227 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd454077a00 msgr2=0x7fd454079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.601 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.602+0000 7fd4795e7640 1 --2- 192.168.123.107:0/1479527227 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd454077a00 0x7fd454079ec0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fd464005fd0 tx=0x7fd4640074e0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.601 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.602+0000 7fd4795e7640 1 -- 192.168.123.107:0/1479527227 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd474082720 msgr2=0x7fd474082ba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.601 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.602+0000 7fd4795e7640 1 --2- 192.168.123.107:0/1479527227 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd474082720 0x7fd474082ba0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fd4740730e0 tx=0x7fd46c004060 comp rx=0 tx=0).stop 2026-03-09T19:29:05.601 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.602+0000 7fd4795e7640 1 -- 192.168.123.107:0/1479527227 shutdown_connections 2026-03-09T19:29:05.601 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.602+0000 7fd4795e7640 1 --2- 192.168.123.107:0/1479527227 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd454077a00 0x7fd454079ec0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.601 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.602+0000 7fd4795e7640 1 --2- 192.168.123.107:0/1479527227 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd474082720 0x7fd474082ba0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.602 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.602+0000 7fd4795e7640 1 --2- 192.168.123.107:0/1479527227 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd474071a70 0x7fd4740840d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.602 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.602+0000 7fd4795e7640 1 -- 192.168.123.107:0/1479527227 >> 192.168.123.107:0/1479527227 conn(0x7fd47406d4f0 msgr2=0x7fd474073150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:05.602 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.603+0000 7fd4795e7640 1 -- 192.168.123.107:0/1479527227 shutdown_connections 2026-03-09T19:29:05.602 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.603+0000 7fd4795e7640 1 -- 
192.168.123.107:0/1479527227 wait complete. 2026-03-09T19:29:05.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.682+0000 7f3d96375640 1 -- 192.168.123.107:0/654789361 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d90102a60 msgr2=0x7f3d90102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.682+0000 7f3d96375640 1 --2- 192.168.123.107:0/654789361 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d90102a60 0x7f3d90102e60 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f3d7c0099b0 tx=0x7f3d7c02f240 comp rx=0 tx=0).stop 2026-03-09T19:29:05.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.682+0000 7f3d96375640 1 -- 192.168.123.107:0/654789361 shutdown_connections 2026-03-09T19:29:05.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.682+0000 7f3d96375640 1 --2- 192.168.123.107:0/654789361 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d90103c60 0x7f3d901040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.682+0000 7f3d96375640 1 --2- 192.168.123.107:0/654789361 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d90102a60 0x7f3d90102e60 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.682+0000 7f3d96375640 1 -- 192.168.123.107:0/654789361 >> 192.168.123.107:0/654789361 conn(0x7f3d900fe250 msgr2=0x7f3d90100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:05.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.682+0000 7f3d96375640 1 -- 192.168.123.107:0/654789361 shutdown_connections 2026-03-09T19:29:05.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.682+0000 7f3d96375640 
1 -- 192.168.123.107:0/654789361 wait complete. 2026-03-09T19:29:05.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.683+0000 7f3d96375640 1 Processor -- start 2026-03-09T19:29:05.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.683+0000 7f3d96375640 1 -- start start 2026-03-09T19:29:05.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.684+0000 7f3d96375640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d90102a60 0x7f3d90078fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.684+0000 7f3d8ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d90102a60 0x7f3d90078fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:05.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.684+0000 7f3d96375640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d90103c60 0x7f3d900794e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.684+0000 7f3d8ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d90102a60 0x7f3d90078fa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43870/0 (socket says 192.168.123.107:43870) 2026-03-09T19:29:05.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.684+0000 7f3d8ffff640 1 -- 192.168.123.107:0/569975957 learned_addr learned my addr 192.168.123.107:0/569975957 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:05.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.684+0000 7f3d96375640 1 -- 
192.168.123.107:0/569975957 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d90075a00 con 0x7f3d90102a60 2026-03-09T19:29:05.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.684+0000 7f3d96375640 1 -- 192.168.123.107:0/569975957 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d90075b70 con 0x7f3d90103c60 2026-03-09T19:29:05.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.684+0000 7f3d8f7fe640 1 --2- 192.168.123.107:0/569975957 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d90103c60 0x7f3d900794e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:05.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.685+0000 7f3d8ffff640 1 -- 192.168.123.107:0/569975957 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d90103c60 msgr2=0x7f3d900794e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.685+0000 7f3d8ffff640 1 --2- 192.168.123.107:0/569975957 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d90103c60 0x7f3d900794e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.685+0000 7f3d8ffff640 1 -- 192.168.123.107:0/569975957 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d7c009660 con 0x7f3d90102a60 2026-03-09T19:29:05.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.685+0000 7f3d8ffff640 1 --2- 192.168.123.107:0/569975957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d90102a60 0x7f3d90078fa0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f3d7c031cf0 
tx=0x7f3d7c031d20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:05.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.686+0000 7f3d8d7fa640 1 -- 192.168.123.107:0/569975957 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d7c03d070 con 0x7f3d90102a60 2026-03-09T19:29:05.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.686+0000 7f3d8d7fa640 1 -- 192.168.123.107:0/569975957 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3d7c002e00 con 0x7f3d90102a60 2026-03-09T19:29:05.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.686+0000 7f3d8d7fa640 1 -- 192.168.123.107:0/569975957 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d7c031070 con 0x7f3d90102a60 2026-03-09T19:29:05.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.686+0000 7f3d96375640 1 -- 192.168.123.107:0/569975957 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d90075df0 con 0x7f3d90102a60 2026-03-09T19:29:05.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.686+0000 7f3d96375640 1 -- 192.168.123.107:0/569975957 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d90076360 con 0x7f3d90102a60 2026-03-09T19:29:05.687 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.688+0000 7f3d96375640 1 -- 192.168.123.107:0/569975957 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d54005350 con 0x7f3d90102a60 2026-03-09T19:29:05.691 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.689+0000 7f3d8d7fa640 1 -- 192.168.123.107:0/569975957 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f3d7c038830 con 
0x7f3d90102a60 2026-03-09T19:29:05.691 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.690+0000 7f3d8d7fa640 1 --2- 192.168.123.107:0/569975957 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d64077ad0 0x7f3d64079f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.691 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.690+0000 7f3d8d7fa640 1 -- 192.168.123.107:0/569975957 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f3d7c0be7d0 con 0x7f3d90102a60 2026-03-09T19:29:05.691 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.692+0000 7f3d8f7fe640 1 --2- 192.168.123.107:0/569975957 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d64077ad0 0x7f3d64079f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:05.691 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.692+0000 7f3d8d7fa640 1 -- 192.168.123.107:0/569975957 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3d7c086de0 con 0x7f3d90102a60 2026-03-09T19:29:05.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.693+0000 7f3d8f7fe640 1 --2- 192.168.123.107:0/569975957 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d64077ad0 0x7f3d64079f90 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f3d90076f20 tx=0x7f3d80009210 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:05.813 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.812+0000 7f3d96375640 1 -- 192.168.123.107:0/569975957 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: 
{"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f3d54002bf0 con 0x7f3d64077ad0 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (5m) 4s ago 6m 23.0M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (6m) 4s ago 6m 8925k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (6m) 5s ago 6m 10.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (9s) 4s ago 6m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (7s) 5s ago 6m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (5m) 4s ago 6m 88.5M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (4m) 4s ago 4m 16.6M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (4m) 4s ago 4m 18.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (4m) 5s ago 4m 28.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (4m) 5s ago 4m 239M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 
*:8443,9283,8765 running (61s) 4s ago 7m 590M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (43s) 5s ago 6m 489M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (37s) 4s ago 7m 56.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (25s) 5s ago 6m 46.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (6m) 4s ago 6m 14.2M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (6m) 5s ago 6m 16.6M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (5m) 4s ago 5m 360M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d7417e3377af 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (5m) 4s ago 5m 387M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (5m) 4s ago 5m 315M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (5m) 5s ago 5m 441M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (5m) 5s ago 5m 446M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (5m) 5s ago 5m 348M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (46s) 4s ago 6m 
44.3M - 2.43.0 a07b618ecd1d c09450c20f5f 2026-03-09T19:29:05.819 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.819+0000 7f3d8d7fa640 1 -- 192.168.123.107:0/569975957 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f3d54002bf0 con 0x7f3d64077ad0 2026-03-09T19:29:05.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.822+0000 7f3d96375640 1 -- 192.168.123.107:0/569975957 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d64077ad0 msgr2=0x7f3d64079f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.822+0000 7f3d96375640 1 --2- 192.168.123.107:0/569975957 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d64077ad0 0x7f3d64079f90 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f3d90076f20 tx=0x7f3d80009210 comp rx=0 tx=0).stop 2026-03-09T19:29:05.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.822+0000 7f3d96375640 1 -- 192.168.123.107:0/569975957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d90102a60 msgr2=0x7f3d90078fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.822+0000 7f3d96375640 1 --2- 192.168.123.107:0/569975957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d90102a60 0x7f3d90078fa0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f3d7c031cf0 tx=0x7f3d7c031d20 comp rx=0 tx=0).stop 2026-03-09T19:29:05.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.822+0000 7f3d96375640 1 -- 192.168.123.107:0/569975957 shutdown_connections 2026-03-09T19:29:05.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.822+0000 7f3d96375640 1 --2- 192.168.123.107:0/569975957 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d64077ad0 0x7f3d64079f90 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.822+0000 7f3d96375640 1 --2- 192.168.123.107:0/569975957 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d90103c60 0x7f3d900794e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.822+0000 7f3d96375640 1 --2- 192.168.123.107:0/569975957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d90102a60 0x7f3d90078fa0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.822+0000 7f3d96375640 1 -- 192.168.123.107:0/569975957 >> 192.168.123.107:0/569975957 conn(0x7f3d900fe250 msgr2=0x7f3d900ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:05.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.822+0000 7f3d96375640 1 -- 192.168.123.107:0/569975957 shutdown_connections 2026-03-09T19:29:05.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.823+0000 7f3d96375640 1 -- 192.168.123.107:0/569975957 wait complete. 
2026-03-09T19:29:05.904 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:05 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[68186]: 2026-03-09T19:29:05.655+0000 7f94399e3640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:29:05.904 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:05 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[68186]: 2026-03-09T19:29:05.655+0000 7f94399e3640 -1 osd.0 46 *** Got signal Terminated *** 2026-03-09T19:29:05.904 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:05 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[68186]: 2026-03-09T19:29:05.655+0000 7f94399e3640 -1 osd.0 46 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:29:05.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.903+0000 7fb007fff640 1 -- 192.168.123.107:0/1419928459 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb008103a20 msgr2=0x7fb008105e10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.903+0000 7fb007fff640 1 --2- 192.168.123.107:0/1419928459 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb008103a20 0x7fb008105e10 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7faff40099b0 tx=0x7faff402f220 comp rx=0 tx=0).stop 2026-03-09T19:29:05.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.903+0000 7fb007fff640 1 -- 192.168.123.107:0/1419928459 shutdown_connections 2026-03-09T19:29:05.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.903+0000 7fb007fff640 1 --2- 192.168.123.107:0/1419928459 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb008103a20 0x7fb008105e10 secure :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7faff40099b0 
tx=0x7faff402f220 comp rx=0 tx=0).stop 2026-03-09T19:29:05.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.903+0000 7fb007fff640 1 --2- 192.168.123.107:0/1419928459 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0081010f0 0x7fb0081034e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.903+0000 7fb007fff640 1 -- 192.168.123.107:0/1419928459 >> 192.168.123.107:0/1419928459 conn(0x7fb0080fac90 msgr2=0x7fb0080fd0b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:05.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.907+0000 7fb007fff640 1 -- 192.168.123.107:0/1419928459 shutdown_connections 2026-03-09T19:29:05.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.907+0000 7fb007fff640 1 -- 192.168.123.107:0/1419928459 wait complete. 2026-03-09T19:29:05.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.907+0000 7fb007fff640 1 Processor -- start 2026-03-09T19:29:05.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.907+0000 7fb007fff640 1 -- start start 2026-03-09T19:29:05.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.908+0000 7fb007fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb0081010f0 0x7fb008195ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.908+0000 7fb007fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb008196530 0x7fb00819b5a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.908+0000 7fb007fff640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb0081969b0 con 0x7fb0081010f0 
2026-03-09T19:29:05.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.908+0000 7fb007fff640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb008196b20 con 0x7fb008196530 2026-03-09T19:29:05.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.908+0000 7fb0067fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb008196530 0x7fb00819b5a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:05.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.908+0000 7fb0067fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb008196530 0x7fb00819b5a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:52804/0 (socket says 192.168.123.107:52804) 2026-03-09T19:29:05.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.908+0000 7fb0067fc640 1 -- 192.168.123.107:0/382504461 learned_addr learned my addr 192.168.123.107:0/382504461 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:05.908 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.909+0000 7fb006ffd640 1 --2- 192.168.123.107:0/382504461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb0081010f0 0x7fb008195ff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:05.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.909+0000 7fb006ffd640 1 -- 192.168.123.107:0/382504461 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb008196530 msgr2=0x7fb00819b5a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:05.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.909+0000 
7fb006ffd640 1 --2- 192.168.123.107:0/382504461 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb008196530 0x7fb00819b5a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:05.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.910+0000 7fb006ffd640 1 -- 192.168.123.107:0/382504461 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faff4009660 con 0x7fb0081010f0 2026-03-09T19:29:05.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.910+0000 7fb006ffd640 1 --2- 192.168.123.107:0/382504461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb0081010f0 0x7fb008195ff0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7faffc00d6b0 tx=0x7faffc00db80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:05.910 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.911+0000 7fafebfff640 1 -- 192.168.123.107:0/382504461 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faffc004280 con 0x7fb0081010f0 2026-03-09T19:29:05.910 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.911+0000 7fafebfff640 1 -- 192.168.123.107:0/382504461 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7faffc004d60 con 0x7fb0081010f0 2026-03-09T19:29:05.910 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.911+0000 7fafebfff640 1 -- 192.168.123.107:0/382504461 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faffc00bea0 con 0x7fb0081010f0 2026-03-09T19:29:05.910 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.911+0000 7fb007fff640 1 -- 192.168.123.107:0/382504461 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb00819bb40 con 0x7fb0081010f0 
2026-03-09T19:29:05.911 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.911+0000 7fb007fff640 1 -- 192.168.123.107:0/382504461 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb00819c040 con 0x7fb0081010f0 2026-03-09T19:29:05.913 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.914+0000 7fafebfff640 1 -- 192.168.123.107:0/382504461 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7faffc005230 con 0x7fb0081010f0 2026-03-09T19:29:05.913 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.914+0000 7fb007fff640 1 -- 192.168.123.107:0/382504461 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fafd0005350 con 0x7fb0081010f0 2026-03-09T19:29:05.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.915+0000 7fafebfff640 1 --2- 192.168.123.107:0/382504461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd8077ad0 0x7fafd8079f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:05.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.915+0000 7fafebfff640 1 -- 192.168.123.107:0/382504461 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6480+0+0 (secure 0 0 0) 0x7faffc099ad0 con 0x7fb0081010f0 2026-03-09T19:29:05.917 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.919+0000 7fb0067fc640 1 --2- 192.168.123.107:0/382504461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd8077ad0 0x7fafd8079f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:05.918 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.919+0000 7fafebfff640 1 -- 192.168.123.107:0/382504461 <== mon.0 
v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7faffc062fa0 con 0x7fb0081010f0 2026-03-09T19:29:05.918 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:05.919+0000 7fb0067fc640 1 --2- 192.168.123.107:0/382504461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd8077ad0 0x7fafd8079f90 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7faff4002410 tx=0x7faff403a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:06.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.102+0000 7fb007fff640 1 -- 192.168.123.107:0/382504461 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fafd0005e10 con 0x7fb0081010f0 2026-03-09T19:29:06.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.111+0000 7fafebfff640 1 -- 192.168.123.107:0/382504461 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7faffc014300 con 0x7fb0081010f0 2026-03-09T19:29:06.110 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:29:06.111 
INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 6 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 10, 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:29:06.111 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:29:06.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.114+0000 7fb007fff640 1 -- 192.168.123.107:0/382504461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd8077ad0 msgr2=0x7fafd8079f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.114+0000 7fb007fff640 1 --2- 192.168.123.107:0/382504461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd8077ad0 0x7fafd8079f90 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7faff4002410 tx=0x7faff403a040 comp rx=0 tx=0).stop 2026-03-09T19:29:06.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.114+0000 7fb007fff640 1 -- 192.168.123.107:0/382504461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb0081010f0 msgr2=0x7fb008195ff0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.114+0000 7fb007fff640 1 --2- 192.168.123.107:0/382504461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb0081010f0 0x7fb008195ff0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7faffc00d6b0 tx=0x7faffc00db80 comp rx=0 tx=0).stop 2026-03-09T19:29:06.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.114+0000 7fb007fff640 1 -- 192.168.123.107:0/382504461 shutdown_connections 2026-03-09T19:29:06.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.114+0000 7fb007fff640 1 --2- 192.168.123.107:0/382504461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fafd8077ad0 0x7fafd8079f90 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.114+0000 7fb007fff640 1 --2- 192.168.123.107:0/382504461 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb008196530 0x7fb00819b5a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.114 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.114+0000 7fb007fff640 1 --2- 192.168.123.107:0/382504461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb0081010f0 0x7fb008195ff0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.114 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.114+0000 7fb007fff640 1 -- 192.168.123.107:0/382504461 >> 192.168.123.107:0/382504461 conn(0x7fb0080fac90 msgr2=0x7fb0080ff680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:06.114 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.114+0000 7fb007fff640 1 -- 192.168.123.107:0/382504461 shutdown_connections 2026-03-09T19:29:06.114 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.114+0000 7fb007fff640 1 -- 192.168.123.107:0/382504461 wait complete. 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.393+0000 7f37a7381640 1 -- 192.168.123.107:0/1858440702 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37a0072710 msgr2=0x7f37a010c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.393+0000 7f37a7381640 1 --2- 192.168.123.107:0/1858440702 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37a0072710 0x7f37a010c590 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f379800b0a0 tx=0x7f379802f4c0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.394+0000 7f37a7381640 1 -- 192.168.123.107:0/1858440702 shutdown_connections 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.394+0000 7f37a7381640 1 --2- 192.168.123.107:0/1858440702 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37a0072710 0x7f37a010c590 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.394+0000 7f37a7381640 1 --2- 192.168.123.107:0/1858440702 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37a0071d40 0x7f37a0072140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.394+0000 7f37a7381640 1 -- 192.168.123.107:0/1858440702 >> 192.168.123.107:0/1858440702 conn(0x7f37a006d660 msgr2=0x7f37a006faa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.394+0000 7f37a7381640 1 -- 192.168.123.107:0/1858440702 shutdown_connections 
2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.394+0000 7f37a7381640 1 -- 192.168.123.107:0/1858440702 wait complete. 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.394+0000 7f37a7381640 1 Processor -- start 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.394+0000 7f37a7381640 1 -- start start 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.395+0000 7f37a7381640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37a0071d40 0x7f37a01a7210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.395+0000 7f37a7381640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37a0072710 0x7f37a01a7750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:06.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.395+0000 7f37a7381640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f37a01a7d20 con 0x7f37a0072710 2026-03-09T19:29:06.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.395+0000 7f37a7381640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f37a01a7e90 con 0x7f37a0071d40 2026-03-09T19:29:06.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.395+0000 7f37a48f5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37a0072710 0x7f37a01a7750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:06.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.395+0000 7f37a48f5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37a0072710 
0x7f37a01a7750 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43912/0 (socket says 192.168.123.107:43912) 2026-03-09T19:29:06.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.395+0000 7f37a48f5640 1 -- 192.168.123.107:0/1844732499 learned_addr learned my addr 192.168.123.107:0/1844732499 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:06.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.395+0000 7f37a48f5640 1 -- 192.168.123.107:0/1844732499 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37a0071d40 msgr2=0x7f37a01a7210 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.395+0000 7f37a48f5640 1 --2- 192.168.123.107:0/1844732499 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37a0071d40 0x7f37a01a7210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.394 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.395+0000 7f37a48f5640 1 -- 192.168.123.107:0/1844732499 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3798009d00 con 0x7f37a0072710 2026-03-09T19:29:06.395 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.396+0000 7f37a48f5640 1 --2- 192.168.123.107:0/1844732499 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37a0072710 0x7f37a01a7750 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f379802f9d0 tx=0x7f3798002d10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:06.395 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.396+0000 7f378e7fc640 1 -- 192.168.123.107:0/1844732499 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 
0 0 0) 0x7f37980096e0 con 0x7f37a0072710 2026-03-09T19:29:06.395 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.396+0000 7f37a7381640 1 -- 192.168.123.107:0/1844732499 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f37a01ac880 con 0x7f37a0072710 2026-03-09T19:29:06.395 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.396+0000 7f37a7381640 1 -- 192.168.123.107:0/1844732499 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f37a01acdd0 con 0x7f37a0072710 2026-03-09T19:29:06.396 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.397+0000 7f378e7fc640 1 -- 192.168.123.107:0/1844732499 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f37980093a0 con 0x7f37a0072710 2026-03-09T19:29:06.396 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.397+0000 7f378e7fc640 1 -- 192.168.123.107:0/1844732499 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f37980407e0 con 0x7f37a0072710 2026-03-09T19:29:06.396 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.398+0000 7f37a7381640 1 -- 192.168.123.107:0/1844732499 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3768005350 con 0x7f37a0072710 2026-03-09T19:29:06.415 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.416+0000 7f378e7fc640 1 -- 192.168.123.107:0/1844732499 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f37980046a0 con 0x7f37a0072710 2026-03-09T19:29:06.415 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.416+0000 7f378e7fc640 1 --2- 192.168.123.107:0/1844732499 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3774077a00 0x7f3774079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).connect 2026-03-09T19:29:06.415 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.416+0000 7f378e7fc640 1 -- 192.168.123.107:0/1844732499 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(46..46 src has 1..46) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f37980bf180 con 0x7f37a0072710 2026-03-09T19:29:06.417 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.419+0000 7f37a50f6640 1 --2- 192.168.123.107:0/1844732499 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3774077a00 0x7f3774079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:06.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.420+0000 7f37a50f6640 1 --2- 192.168.123.107:0/1844732499 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3774077a00 0x7f3774079ec0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f379c0059c0 tx=0x7f379c005950 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:06.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.420+0000 7f378e7fc640 1 -- 192.168.123.107:0/1844732499 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f37980877a0 con 0x7f37a0072710 2026-03-09T19:29:06.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.561+0000 7f37a7381640 1 -- 192.168.123.107:0/1844732499 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f37680058d0 con 0x7f37a0072710 2026-03-09T19:29:06.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.562+0000 7f378e7fc640 1 -- 192.168.123.107:0/1844732499 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1926 (secure 0 0 0) 
0x7f3798086ef0 con 0x7f37a0072710 2026-03-09T19:29:06.562 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:29:06.562 INFO:teuthology.orchestra.run.vm07.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T19:29:06.562 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:29:06.562 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:29:06.563 
INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 0 members: 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat 
{c=[1],r=[1],i=[7ff]}] 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:29:06.563 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:29:06.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.566+0000 7f37a7381640 1 -- 192.168.123.107:0/1844732499 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3774077a00 msgr2=0x7f3774079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.566+0000 7f37a7381640 1 --2- 192.168.123.107:0/1844732499 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3774077a00 0x7f3774079ec0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f379c0059c0 tx=0x7f379c005950 comp rx=0 tx=0).stop 2026-03-09T19:29:06.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.566+0000 7f37a7381640 1 -- 192.168.123.107:0/1844732499 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37a0072710 msgr2=0x7f37a01a7750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.566+0000 7f37a7381640 1 --2- 192.168.123.107:0/1844732499 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37a0072710 0x7f37a01a7750 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f379802f9d0 tx=0x7f3798002d10 comp rx=0 tx=0).stop 2026-03-09T19:29:06.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.566+0000 7f37a7381640 1 -- 192.168.123.107:0/1844732499 shutdown_connections 2026-03-09T19:29:06.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.566+0000 7f37a7381640 1 --2- 192.168.123.107:0/1844732499 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3774077a00 0x7f3774079ec0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.566+0000 7f37a7381640 1 --2- 192.168.123.107:0/1844732499 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37a0072710 0x7f37a01a7750 secure :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f379802f9d0 tx=0x7f3798002d10 comp rx=0 tx=0).stop 2026-03-09T19:29:06.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.566+0000 7f37a7381640 1 --2- 192.168.123.107:0/1844732499 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f37a0071d40 0x7f37a01a7210 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.565 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.566+0000 7f37a7381640 1 -- 192.168.123.107:0/1844732499 >> 192.168.123.107:0/1844732499 conn(0x7f37a006d660 msgr2=0x7f37a010a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:06.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.567+0000 7f37a7381640 1 -- 192.168.123.107:0/1844732499 shutdown_connections 2026-03-09T19:29:06.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.567+0000 7f37a7381640 1 -- 192.168.123.107:0/1844732499 wait complete. 
2026-03-09T19:29:06.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:06 vm07.local ceph-mon[111841]: from='client.34124 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:06.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:06 vm07.local ceph-mon[111841]: pgmap v19: 65 pgs: 65 active+clean; 295 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.2 MiB/s wr, 324 op/s 2026-03-09T19:29:06.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:06 vm07.local ceph-mon[111841]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:06.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:06 vm07.local ceph-mon[111841]: osd.0 marked itself down and dead 2026-03-09T19:29:06.667 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:06 vm07.local ceph-mon[111841]: from='client.34132 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:06.667 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.668+0000 7fded90ca640 1 -- 192.168.123.107:0/4166705743 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fded4075ba0 msgr2=0x7fded4075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.667 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.668+0000 7fded90ca640 1 --2- 192.168.123.107:0/4166705743 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fded4075ba0 0x7fded4075fa0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fdec80099b0 tx=0x7fdec802f240 comp rx=0 tx=0).stop 2026-03-09T19:29:06.667 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.668+0000 7fded90ca640 1 -- 192.168.123.107:0/4166705743 shutdown_connections 2026-03-09T19:29:06.667 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.668+0000 7fded90ca640 1 --2- 
192.168.123.107:0/4166705743 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fded4076df0 0x7fded4077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.668+0000 7fded90ca640 1 --2- 192.168.123.107:0/4166705743 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fded4075ba0 0x7fded4075fa0 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.668+0000 7fded90ca640 1 -- 192.168.123.107:0/4166705743 >> 192.168.123.107:0/4166705743 conn(0x7fded40fe250 msgr2=0x7fded4100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.668+0000 7fded90ca640 1 -- 192.168.123.107:0/4166705743 shutdown_connections 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.669+0000 7fded90ca640 1 -- 192.168.123.107:0/4166705743 wait complete. 
2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.669+0000 7fded90ca640 1 Processor -- start 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.669+0000 7fded90ca640 1 -- start start 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.669+0000 7fded90ca640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fded4075ba0 0x7fded419eba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.669+0000 7fded90ca640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fded4076df0 0x7fded419f0e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.669+0000 7fded90ca640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fded419f6b0 con 0x7fded4076df0 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.669+0000 7fded90ca640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fded419f820 con 0x7fded4075ba0 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.669+0000 7fded2d76640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fded4075ba0 0x7fded419eba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.669+0000 7fded2d76640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fded4075ba0 0x7fded419eba0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.107:52840/0 (socket says 192.168.123.107:52840) 2026-03-09T19:29:06.668 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.669+0000 7fded2d76640 1 -- 192.168.123.107:0/3318704166 learned_addr learned my addr 192.168.123.107:0/3318704166 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:06.669 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.670+0000 7fded2d76640 1 -- 192.168.123.107:0/3318704166 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fded4076df0 msgr2=0x7fded419f0e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.669 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.670+0000 7fded2d76640 1 --2- 192.168.123.107:0/3318704166 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fded4076df0 0x7fded419f0e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.669 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.670+0000 7fded2d76640 1 -- 192.168.123.107:0/3318704166 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdec8009660 con 0x7fded4075ba0 2026-03-09T19:29:06.671 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.671+0000 7fded2d76640 1 --2- 192.168.123.107:0/3318704166 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fded4075ba0 0x7fded419eba0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fdec8002410 tx=0x7fdec8002c60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:06.671 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.671+0000 7fdeb3fff640 1 -- 192.168.123.107:0/3318704166 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdec803d070 con 0x7fded4075ba0 2026-03-09T19:29:06.671 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.671+0000 7fded90ca640 1 -- 
192.168.123.107:0/3318704166 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fded406aad0 con 0x7fded4075ba0 2026-03-09T19:29:06.671 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.671+0000 7fded90ca640 1 -- 192.168.123.107:0/3318704166 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fded406afc0 con 0x7fded4075ba0 2026-03-09T19:29:06.671 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.671+0000 7fdeb3fff640 1 -- 192.168.123.107:0/3318704166 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fdec8002e10 con 0x7fded4075ba0 2026-03-09T19:29:06.671 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.671+0000 7fdeb3fff640 1 -- 192.168.123.107:0/3318704166 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdec8031520 con 0x7fded4075ba0 2026-03-09T19:29:06.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.673+0000 7fdeb3fff640 1 -- 192.168.123.107:0/3318704166 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fdec8049050 con 0x7fded4075ba0 2026-03-09T19:29:06.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.674+0000 7fdeb3fff640 1 --2- 192.168.123.107:0/3318704166 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdea8077a00 0x7fdea8079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:06.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.674+0000 7fdeb3fff640 1 -- 192.168.123.107:0/3318704166 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(47..47 src has 1..47) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fdec80bf480 con 0x7fded4075ba0 2026-03-09T19:29:06.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.674+0000 7fded90ca640 1 -- 192.168.123.107:0/3318704166 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fded410fb80 con 0x7fded4075ba0 2026-03-09T19:29:06.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.676+0000 7fded2575640 1 --2- 192.168.123.107:0/3318704166 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdea8077a00 0x7fdea8079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:06.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.676+0000 7fded2575640 1 --2- 192.168.123.107:0/3318704166 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdea8077a00 0x7fdea8079ec0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fdebc0059c0 tx=0x7fdebc005950 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:06.679 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.678+0000 7fdeb3fff640 1 -- 192.168.123.107:0/3318704166 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdec8087960 con 0x7fded4075ba0 2026-03-09T19:29:06.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:06 vm08.local ceph-mon[103420]: from='client.34124 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:06.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:06 vm08.local ceph-mon[103420]: pgmap v19: 65 pgs: 65 active+clean; 295 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.2 MiB/s wr, 324 op/s 2026-03-09T19:29:06.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:06 vm08.local ceph-mon[103420]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 
2026-03-09T19:29:06.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:06 vm08.local ceph-mon[103420]: osd.0 marked itself down and dead 2026-03-09T19:29:06.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:06 vm08.local ceph-mon[103420]: from='client.34132 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:06.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.850+0000 7fded90ca640 1 -- 192.168.123.107:0/3318704166 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fded410ded0 con 0x7fdea8077a00 2026-03-09T19:29:06.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.852+0000 7fdeb3fff640 1 -- 192.168.123.107:0/3318704166 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fded410ded0 con 0x7fdea8077a00 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [ 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout: "mgr", 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout: "crash", 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout: "mon" 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout: ], 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "6/23 daemons upgraded", 2026-03-09T19:29:06.852 
INFO:teuthology.orchestra.run.vm07.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:29:06.852 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:29:06.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.856+0000 7fdeb1ffb640 1 -- 192.168.123.107:0/3318704166 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdea8077a00 msgr2=0x7fdea8079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.856+0000 7fdeb1ffb640 1 --2- 192.168.123.107:0/3318704166 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdea8077a00 0x7fdea8079ec0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fdebc0059c0 tx=0x7fdebc005950 comp rx=0 tx=0).stop 2026-03-09T19:29:06.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.856+0000 7fdeb1ffb640 1 -- 192.168.123.107:0/3318704166 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fded4075ba0 msgr2=0x7fded419eba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.856+0000 7fdeb1ffb640 1 --2- 192.168.123.107:0/3318704166 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fded4075ba0 0x7fded419eba0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fdec8002410 tx=0x7fdec8002c60 comp rx=0 tx=0).stop 2026-03-09T19:29:06.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.856+0000 7fdeb1ffb640 1 -- 192.168.123.107:0/3318704166 shutdown_connections 2026-03-09T19:29:06.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.856+0000 7fdeb1ffb640 1 --2- 192.168.123.107:0/3318704166 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdea8077a00 0x7fdea8079ec0 unknown :-1 s=CLOSED pgs=27 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.856+0000 7fdeb1ffb640 1 --2- 192.168.123.107:0/3318704166 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fded4076df0 0x7fded419f0e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.856+0000 7fdeb1ffb640 1 --2- 192.168.123.107:0/3318704166 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fded4075ba0 0x7fded419eba0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.856+0000 7fdeb1ffb640 1 -- 192.168.123.107:0/3318704166 >> 192.168.123.107:0/3318704166 conn(0x7fded40fe250 msgr2=0x7fded40fffb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:06.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.856+0000 7fdeb1ffb640 1 -- 192.168.123.107:0/3318704166 shutdown_connections 2026-03-09T19:29:06.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.856+0000 7fdeb1ffb640 1 -- 192.168.123.107:0/3318704166 wait complete. 
2026-03-09T19:29:06.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.941+0000 7f4726eb4640 1 -- 192.168.123.107:0/316284708 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4720071a50 msgr2=0x7f4720071e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.941+0000 7f4726eb4640 1 --2- 192.168.123.107:0/316284708 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4720071a50 0x7f4720071e50 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f471c010a30 tx=0x7f471c037840 comp rx=0 tx=0).stop 2026-03-09T19:29:06.942 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:06 vm07.local podman[119105]: 2026-03-09 19:29:06.70202844 +0000 UTC m=+1.080626449 container died d7417e3377af17dfb77afbe9eda431b9702755a2cc5932fcac285af783114b7a (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=reef) 2026-03-09T19:29:06.942 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:06 vm07.local podman[119105]: 2026-03-09 19:29:06.730238198 +0000 UTC m=+1.108836207 container remove d7417e3377af17dfb77afbe9eda431b9702755a2cc5932fcac285af783114b7a 
(image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.build-date=20260223) 2026-03-09T19:29:06.942 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:06 vm07.local bash[119105]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0 2026-03-09T19:29:06.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.941+0000 7f4726eb4640 1 -- 192.168.123.107:0/316284708 shutdown_connections 2026-03-09T19:29:06.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.941+0000 7f4726eb4640 1 --2- 192.168.123.107:0/316284708 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4720072420 0x7f4720077190 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.941+0000 7f4726eb4640 1 --2- 192.168.123.107:0/316284708 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4720071a50 0x7f4720071e50 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.941+0000 7f4726eb4640 1 -- 192.168.123.107:0/316284708 >> 192.168.123.107:0/316284708 conn(0x7f472006d4f0 msgr2=0x7f472006f930 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T19:29:06.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.947+0000 7f4726eb4640 1 -- 192.168.123.107:0/316284708 shutdown_connections 2026-03-09T19:29:06.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.947+0000 7f4726eb4640 1 -- 192.168.123.107:0/316284708 wait complete. 2026-03-09T19:29:06.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.947+0000 7f4726eb4640 1 Processor -- start 2026-03-09T19:29:06.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.947+0000 7f4726eb4640 1 -- start start 2026-03-09T19:29:06.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.947+0000 7f4726eb4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4720072420 0x7f4720084080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:06.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.947+0000 7f4726eb4640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f47200826d0 0x7f4720082b50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:06.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.947+0000 7f4726eb4640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f47200845c0 con 0x7f4720072420 2026-03-09T19:29:06.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.947+0000 7f4726eb4640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4720083090 con 0x7f47200826d0 2026-03-09T19:29:06.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.948+0000 7f47256b1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f47200826d0 0x7f4720082b50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:06.947 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.948+0000 7f47256b1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f47200826d0 0x7f4720082b50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:52860/0 (socket says 192.168.123.107:52860) 2026-03-09T19:29:06.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.948+0000 7f47256b1640 1 -- 192.168.123.107:0/1870099244 learned_addr learned my addr 192.168.123.107:0/1870099244 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:06.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.949+0000 7f4725eb2640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4720072420 0x7f4720084080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:06.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.949+0000 7f47256b1640 1 -- 192.168.123.107:0/1870099244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4720072420 msgr2=0x7f4720084080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:06.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.949+0000 7f47256b1640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4720072420 0x7f4720084080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:06.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.949+0000 7f47256b1640 1 -- 192.168.123.107:0/1870099244 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f471c0106e0 con 0x7f47200826d0 2026-03-09T19:29:06.948 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.949+0000 7f4725eb2640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4720072420 0x7f4720084080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:29:06.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.949+0000 7f47256b1640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f47200826d0 0x7f4720082b50 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f471800f3e0 tx=0x7f4718007f90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:06.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.949+0000 7f4716ffd640 1 -- 192.168.123.107:0/1870099244 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4718010040 con 0x7f47200826d0 2026-03-09T19:29:06.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.949+0000 7f4726eb4640 1 -- 192.168.123.107:0/1870099244 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4720083310 con 0x7f47200826d0 2026-03-09T19:29:06.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.949+0000 7f4726eb4640 1 -- 192.168.123.107:0/1870099244 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f47201b5bc0 con 0x7f47200826d0 2026-03-09T19:29:06.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.949+0000 7f4716ffd640 1 -- 192.168.123.107:0/1870099244 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f471800b490 con 0x7f47200826d0 2026-03-09T19:29:06.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.949+0000 7f4716ffd640 1 -- 192.168.123.107:0/1870099244 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f47180040d0 con 0x7f47200826d0 2026-03-09T19:29:06.949 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.950+0000 7f4726eb4640 1 -- 192.168.123.107:0/1870099244 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f472007a810 con 0x7f47200826d0 2026-03-09T19:29:06.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.951+0000 7f4716ffd640 1 -- 192.168.123.107:0/1870099244 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f4718013050 con 0x7f47200826d0 2026-03-09T19:29:06.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.951+0000 7f4716ffd640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f46f4077a00 0x7f46f4079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:06.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.951+0000 7f4716ffd640 1 -- 192.168.123.107:0/1870099244 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(47..47 src has 1..47) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f4718099b70 con 0x7f47200826d0 2026-03-09T19:29:06.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.952+0000 7f4725eb2640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f46f4077a00 0x7f46f4079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:06.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.952+0000 7f4725eb2640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f46f4077a00 0x7f46f4079ec0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f471c0106b0 tx=0x7f471c010620 comp rx=0 
tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:06.952 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:06.953+0000 7f4716ffd640 1 -- 192.168.123.107:0/1870099244 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f47180620f0 con 0x7f47200826d0 2026-03-09T19:29:07.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.132+0000 7f4726eb4640 1 -- 192.168.123.107:0/1870099244 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f472007aa50 con 0x7f47200826d0 2026-03-09T19:29:07.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.137+0000 7f4716ffd640 1 -- 192.168.123.107:0/1870099244 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+95 (secure 0 0 0) 0x7f4718061840 con 0x7f47200826d0 2026-03-09T19:29:07.137 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_WARN 1 osds down 2026-03-09T19:29:07.137 INFO:teuthology.orchestra.run.vm07.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-09T19:29:07.137 INFO:teuthology.orchestra.run.vm07.stdout: osd.0 (root=default,host=vm07) is down 2026-03-09T19:29:07.140 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.141+0000 7f4714ff9640 1 -- 192.168.123.107:0/1870099244 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f46f4077a00 msgr2=0x7f46f4079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:07.140 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.141+0000 7f4714ff9640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f46f4077a00 0x7f46f4079ec0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f471c0106b0 tx=0x7f471c010620 comp rx=0 tx=0).stop 2026-03-09T19:29:07.140 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.141+0000 7f4714ff9640 1 -- 192.168.123.107:0/1870099244 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f47200826d0 msgr2=0x7f4720082b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:07.140 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.141+0000 7f4714ff9640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f47200826d0 0x7f4720082b50 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f471800f3e0 tx=0x7f4718007f90 comp rx=0 tx=0).stop 2026-03-09T19:29:07.140 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.142+0000 7f4714ff9640 1 -- 192.168.123.107:0/1870099244 shutdown_connections 2026-03-09T19:29:07.140 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.142+0000 7f4714ff9640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f46f4077a00 0x7f46f4079ec0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:07.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.142+0000 7f4714ff9640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f47200826d0 0x7f4720082b50 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:07.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.142+0000 7f4714ff9640 1 --2- 192.168.123.107:0/1870099244 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4720072420 0x7f4720084080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:07.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.142+0000 7f4714ff9640 1 -- 192.168.123.107:0/1870099244 >> 192.168.123.107:0/1870099244 conn(0x7f472006d4f0 msgr2=0x7f47200753c0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T19:29:07.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.142+0000 7f4714ff9640 1 -- 192.168.123.107:0/1870099244 shutdown_connections 2026-03-09T19:29:07.141 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:07.142+0000 7f4714ff9640 1 -- 192.168.123.107:0/1870099244 wait complete. 2026-03-09T19:29:07.195 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:06 vm07.local podman[119258]: 2026-03-09 19:29:06.96048827 +0000 UTC m=+0.031839663 container create 81d3a79b6932baca2ab2fd4b968376068db87a07eb9940fdbabe84fe73d9e8a3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS) 2026-03-09T19:29:07.195 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local podman[119258]: 2026-03-09 19:29:07.012901958 +0000 UTC m=+0.084253351 container init 81d3a79b6932baca2ab2fd4b968376068db87a07eb9940fdbabe84fe73d9e8a3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default) 2026-03-09T19:29:07.195 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local podman[119258]: 2026-03-09 19:29:07.016654102 +0000 UTC m=+0.088005495 container start 81d3a79b6932baca2ab2fd4b968376068db87a07eb9940fdbabe84fe73d9e8a3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0) 2026-03-09T19:29:07.195 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local podman[119258]: 2026-03-09 19:29:07.022707643 +0000 UTC m=+0.094059026 container attach 81d3a79b6932baca2ab2fd4b968376068db87a07eb9940fdbabe84fe73d9e8a3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-deactivate, ceph=True, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, 
FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T19:29:07.195 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local podman[119258]: 2026-03-09 19:29:06.942432401 +0000 UTC m=+0.013783783 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:29:07.195 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local podman[119258]: 2026-03-09 19:29:07.178820854 +0000 UTC m=+0.250172247 container died 81d3a79b6932baca2ab2fd4b968376068db87a07eb9940fdbabe84fe73d9e8a3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:29:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:29:07 vm07.local ceph-mon[111841]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T19:29:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:07 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/382504461' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:07 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1844732499' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:29:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:07 vm07.local ceph-mon[111841]: osdmap e47: 6 total, 5 up, 6 in 2026-03-09T19:29:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:07 vm07.local ceph-mon[111841]: from='client.44113 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:07 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/1870099244' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:29:07.479 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local podman[119258]: 2026-03-09 19:29:07.197257516 +0000 UTC m=+0.268608909 container remove 81d3a79b6932baca2ab2fd4b968376068db87a07eb9940fdbabe84fe73d9e8a3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20260223, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:29:07.479 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.0.service: Deactivated successfully. 2026-03-09T19:29:07.479 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.0.service: Unit process 119289 (conmon) remains running after unit stopped. 2026-03-09T19:29:07.479 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.0.service: Unit process 119300 (podman) remains running after unit stopped. 2026-03-09T19:29:07.479 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local systemd[1]: Stopped Ceph osd.0 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 
2026-03-09T19:29:07.479 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.0.service: Consumed 30.110s CPU time, 761.3M memory peak. 2026-03-09T19:29:07.479 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local systemd[1]: Starting Ceph osd.0 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:29:07.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:07 vm08.local ceph-mon[103420]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T19:29:07.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:07 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/382504461' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:07.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:07 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/1844732499' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:29:07.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:07 vm08.local ceph-mon[103420]: osdmap e47: 6 total, 5 up, 6 in 2026-03-09T19:29:07.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:07 vm08.local ceph-mon[103420]: from='client.44113 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:07.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:07 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/1870099244' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:29:07.979 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local podman[119384]: 2026-03-09 19:29:07.558991244 +0000 UTC m=+0.023947571 container create dfb9106bf1346f2274a31670153d59a46249b39a1149ad6bed13f01678cd8c9c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0) 2026-03-09T19:29:07.979 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local podman[119384]: 2026-03-09 19:29:07.611939583 +0000 UTC m=+0.076895909 container init dfb9106bf1346f2274a31670153d59a46249b39a1149ad6bed13f01678cd8c9c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T19:29:07.979 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local podman[119384]: 2026-03-09 19:29:07.622653608 +0000 UTC m=+0.087609934 container start dfb9106bf1346f2274a31670153d59a46249b39a1149ad6bed13f01678cd8c9c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) 2026-03-09T19:29:07.979 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local podman[119384]: 2026-03-09 19:29:07.625922327 +0000 UTC m=+0.090878653 container attach dfb9106bf1346f2274a31670153d59a46249b39a1149ad6bed13f01678cd8c9c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:29:07.979 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local podman[119384]: 2026-03-09 19:29:07.548546722 +0000 UTC m=+0.013503048 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:29:07.979 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:29:07.979 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local bash[119384]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:29:07.979 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:29:07.979 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:07 vm07.local bash[119384]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:29:08.623 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T19:29:08.623 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:29:08.623 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local bash[119384]: --> Failed to activate via 
raw: did not find any matching OSD to activate 2026-03-09T19:29:08.623 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local bash[119384]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:29:08.623 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:29:08.623 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local bash[119384]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:29:08.623 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T19:29:08.623 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local bash[119384]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T19:29:08.623 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-b9d1c2c0-0493-49ed-90f6-a23ceb5626d0/osd-block-133448fe-3146-488b-ab63-557fcf7f955d --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-09T19:29:08.623 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local bash[119384]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-b9d1c2c0-0493-49ed-90f6-a23ceb5626d0/osd-block-133448fe-3146-488b-ab63-557fcf7f955d --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-09T19:29:08.944 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:08 vm08.local ceph-mon[103420]: pgmap v21: 65 pgs: 9 stale+active+clean, 56 active+clean; 295 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 968 KiB/s wr, 237 op/s 2026-03-09T19:29:08.944 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:08 vm08.local ceph-mon[103420]: osdmap e48: 6 total, 5 up, 6 in 2026-03-09T19:29:08.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:08 vm07.local ceph-mon[111841]: pgmap v21: 65 pgs: 9 stale+active+clean, 56 active+clean; 295 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 968 KiB/s wr, 237 op/s 2026-03-09T19:29:08.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:08 vm07.local ceph-mon[111841]: osdmap e48: 6 total, 5 up, 6 in 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: Running command: /usr/bin/ln -snf /dev/ceph-b9d1c2c0-0493-49ed-90f6-a23ceb5626d0/osd-block-133448fe-3146-488b-ab63-557fcf7f955d /var/lib/ceph/osd/ceph-0/block 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local bash[119384]: Running command: /usr/bin/ln -snf /dev/ceph-b9d1c2c0-0493-49ed-90f6-a23ceb5626d0/osd-block-133448fe-3146-488b-ab63-557fcf7f955d /var/lib/ceph/osd/ceph-0/block 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local bash[119384]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local bash[119384]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local 
ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local bash[119384]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate[119395]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local bash[119384]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local podman[119384]: 2026-03-09 19:29:08.769956672 +0000 UTC m=+1.234912998 container died dfb9106bf1346f2274a31670153d59a46249b39a1149ad6bed13f01678cd8c9c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2) 2026-03-09T19:29:08.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:08 vm07.local podman[119384]: 2026-03-09 19:29:08.808867063 +0000 UTC m=+1.273823389 container remove dfb9106bf1346f2274a31670153d59a46249b39a1149ad6bed13f01678cd8c9c 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-activate, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:29:09.338 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:09 vm07.local podman[119645]: 2026-03-09 19:29:09.064872619 +0000 UTC m=+0.104721908 container create a203aa2416565f2bebebbedece3a291fe2772c83ac9d0f5d18fed2fc70ffd81e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T19:29:09.338 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:09 vm07.local podman[119645]: 2026-03-09 19:29:08.973015554 +0000 UTC m=+0.012864854 
image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:29:09.338 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:09 vm07.local podman[119645]: 2026-03-09 19:29:09.125554103 +0000 UTC m=+0.165403392 container init a203aa2416565f2bebebbedece3a291fe2772c83ac9d0f5d18fed2fc70ffd81e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) 2026-03-09T19:29:09.338 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:09 vm07.local podman[119645]: 2026-03-09 19:29:09.129141598 +0000 UTC m=+0.168990887 container start a203aa2416565f2bebebbedece3a291fe2772c83ac9d0f5d18fed2fc70ffd81e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base 
Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2) 2026-03-09T19:29:09.338 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:09 vm07.local bash[119645]: a203aa2416565f2bebebbedece3a291fe2772c83ac9d0f5d18fed2fc70ffd81e 2026-03-09T19:29:09.338 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:09 vm07.local systemd[1]: Started Ceph osd.0 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 2026-03-09T19:29:09.980 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:09 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[119655]: 2026-03-09T19:29:09.784+0000 7f6b16878740 -1 Falling back to public interface 2026-03-09T19:29:10.590 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:10 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:10.590 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:10 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:10.590 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:10 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:29:10.590 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:10 vm07.local ceph-mon[111841]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 295 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 631 KiB/s wr, 123 op/s 2026-03-09T19:29:10.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:10 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:10.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:10 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:10.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:10 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:29:10.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:10 vm08.local ceph-mon[103420]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 295 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 631 KiB/s wr, 123 op/s 2026-03-09T19:29:11.656 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:11 vm07.local ceph-mon[111841]: Health check failed: Degraded data redundancy: 6318/42732 objects degraded (14.785%), 34 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:12.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:11 vm08.local ceph-mon[103420]: Health check failed: Degraded data redundancy: 6318/42732 objects degraded (14.785%), 34 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:13.061 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:12 vm07.local ceph-mon[111841]: pgmap v24: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 295 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 36 KiB/s rd, 1.7 MiB/s wr, 324 op/s; 6318/42732 objects degraded (14.785%) 2026-03-09T19:29:13.061 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:12 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:13.061 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:12 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:13.061 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:12 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:13.061 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:12 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:13.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:12 vm08.local ceph-mon[103420]: pgmap v24: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 295 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 36 KiB/s rd, 1.7 MiB/s wr, 324 op/s; 6318/42732 objects degraded (14.785%) 2026-03-09T19:29:13.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:13.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:13.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:13.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:14 vm08.local ceph-mon[103420]: pgmap v25: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 295 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.0 MiB/s wr, 201 op/s; 6318/42732 objects degraded (14.785%) 2026-03-09T19:29:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-09T19:29:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:29:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:29:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:29:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:14 vm07.local ceph-mon[111841]: pgmap v25: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 295 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.0 MiB/s wr, 201 op/s; 6318/42732 objects degraded (14.785%) 2026-03-09T19:29:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:14 vm07.local 
ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:29:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:29:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:29:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:29:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:15.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:14 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:29:15.978 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:15 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[119655]: 2026-03-09T19:29:15.816+0000 7f6b16878740 -1 osd.0 46 log_to_monitors true 2026-03-09T19:29:16.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:16 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:29:16.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:16 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-09T19:29:16.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:16 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:29:16.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:16 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-09T19:29:17.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:17 vm08.local ceph-mon[103420]: pgmap v26: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 292 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.5 MiB/s wr, 944 op/s; 6101/40944 objects degraded (14.901%) 2026-03-09T19:29:17.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:17 vm08.local ceph-mon[103420]: from='osd.0 [v2:192.168.123.107:6802/3476558232,v1:192.168.123.107:6803/3476558232]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T19:29:17.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:17 vm08.local ceph-mon[103420]: from='osd.0 [v2:192.168.123.107:6802/3476558232,v1:192.168.123.107:6803/3476558232]' entity='osd.0' cmd='[{"prefix": "osd crush 
set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-09T19:29:17.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:17 vm08.local ceph-mon[103420]: osdmap e49: 6 total, 5 up, 6 in 2026-03-09T19:29:17.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:17 vm08.local ceph-mon[103420]: from='osd.0 [v2:192.168.123.107:6802/3476558232,v1:192.168.123.107:6803/3476558232]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-09T19:29:17.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:17 vm07.local ceph-mon[111841]: pgmap v26: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 292 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.5 MiB/s wr, 944 op/s; 6101/40944 objects degraded (14.901%) 2026-03-09T19:29:17.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:17 vm07.local ceph-mon[111841]: from='osd.0 [v2:192.168.123.107:6802/3476558232,v1:192.168.123.107:6803/3476558232]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T19:29:17.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:17 vm07.local ceph-mon[111841]: from='osd.0 [v2:192.168.123.107:6802/3476558232,v1:192.168.123.107:6803/3476558232]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-09T19:29:17.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:17 vm07.local ceph-mon[111841]: osdmap e49: 6 total, 5 up, 6 in 2026-03-09T19:29:17.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:17 vm07.local ceph-mon[111841]: from='osd.0 [v2:192.168.123.107:6802/3476558232,v1:192.168.123.107:6803/3476558232]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-09T19:29:17.979 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:29:17 
vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[119655]: 2026-03-09T19:29:17.918+0000 7f6b0de11640 -1 osd.0 46 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-09T19:29:19.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:19 vm08.local ceph-mon[103420]: pgmap v28: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 292 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 1.3 MiB/s wr, 851 op/s; 6101/40944 objects degraded (14.901%)
2026-03-09T19:29:19.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:19 vm08.local ceph-mon[103420]: from='osd.0 [v2:192.168.123.107:6802/3476558232,v1:192.168.123.107:6803/3476558232]' entity='osd.0'
2026-03-09T19:29:19.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:19 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 6101/40944 objects degraded (14.901%), 34 pgs degraded (PG_DEGRADED)
2026-03-09T19:29:19.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:19 vm07.local ceph-mon[111841]: pgmap v28: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 292 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 1.3 MiB/s wr, 851 op/s; 6101/40944 objects degraded (14.901%)
2026-03-09T19:29:19.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:19 vm07.local ceph-mon[111841]: from='osd.0 [v2:192.168.123.107:6802/3476558232,v1:192.168.123.107:6803/3476558232]' entity='osd.0'
2026-03-09T19:29:19.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:19 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 6101/40944 objects degraded (14.901%), 34 pgs degraded (PG_DEGRADED)
2026-03-09T19:29:20.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:20 vm08.local ceph-mon[103420]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-09T19:29:20.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:20 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:20.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:20 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:29:20.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:20 vm08.local ceph-mon[103420]: osd.0 [v2:192.168.123.107:6802/3476558232,v1:192.168.123.107:6803/3476558232] boot
2026-03-09T19:29:20.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:20 vm08.local ceph-mon[103420]: osdmap e50: 6 total, 6 up, 6 in
2026-03-09T19:29:20.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:20 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T19:29:20.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:20 vm08.local ceph-mon[103420]: pgmap v30: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 292 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 11 KiB/s rd, 612 KiB/s wr, 857 op/s; 6101/40944 objects degraded (14.901%)
2026-03-09T19:29:20.649 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:20 vm07.local ceph-mon[111841]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-09T19:29:20.649 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:20 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:20.649 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:20 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:29:20.649 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:20 vm07.local ceph-mon[111841]: osd.0 [v2:192.168.123.107:6802/3476558232,v1:192.168.123.107:6803/3476558232] boot
2026-03-09T19:29:20.649 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:20 vm07.local ceph-mon[111841]: osdmap e50: 6 total, 6 up, 6 in
2026-03-09T19:29:20.649 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:20 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T19:29:20.649 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:20 vm07.local ceph-mon[111841]: pgmap v30: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 292 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 11 KiB/s rd, 612 KiB/s wr, 857 op/s; 6101/40944 objects degraded (14.901%)
2026-03-09T19:29:21.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:21 vm07.local ceph-mon[111841]: osdmap e51: 6 total, 6 up, 6 in
2026-03-09T19:29:21.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:21 vm08.local ceph-mon[103420]: osdmap e51: 6 total, 6 up, 6 in
2026-03-09T19:29:22.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:22 vm07.local ceph-mon[111841]: osdmap e52: 6 total, 6 up, 6 in
2026-03-09T19:29:22.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:22 vm07.local ceph-mon[111841]: pgmap v33: 65 pgs: 1 unknown, 24 peering, 6 active+undersized+degraded, 34 active+clean; 290 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 1.1 MiB/s wr, 439 op/s; 2479/37965 objects degraded (6.530%); 6 B/s, 49 keys/s, 7 objects/s recovering
2026-03-09T19:29:22.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:22 vm08.local ceph-mon[103420]: osdmap e52: 6 total, 6 up, 6 in
2026-03-09T19:29:22.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:22 vm08.local ceph-mon[103420]: pgmap v33: 65 pgs: 1 unknown, 24 peering, 6 active+undersized+degraded, 34 active+clean; 290 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 1.1 MiB/s wr, 439 op/s; 2479/37965 objects degraded (6.530%); 6 B/s, 49 keys/s, 7 objects/s recovering
2026-03-09T19:29:23.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:23 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 2479/37965 objects degraded (6.530%), 6 pgs degraded (PG_DEGRADED)
2026-03-09T19:29:24.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:23 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 2479/37965 objects degraded (6.530%), 6 pgs degraded (PG_DEGRADED)
2026-03-09T19:29:25.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:24 vm08.local ceph-mon[103420]: pgmap v34: 65 pgs: 1 unknown, 24 peering, 6 active+undersized+degraded, 34 active+clean; 290 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 947 KiB/s wr, 380 op/s; 2479/37965 objects degraded (6.530%); 5 B/s, 42 keys/s, 6 objects/s recovering
2026-03-09T19:29:25.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:24 vm07.local ceph-mon[111841]: pgmap v34: 65 pgs: 1 unknown, 24 peering, 6 active+undersized+degraded, 34 active+clean; 290 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 947 KiB/s wr, 380 op/s; 2479/37965 objects degraded (6.530%); 5 B/s, 42 keys/s, 6 objects/s recovering
2026-03-09T19:29:27.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:26 vm07.local ceph-mon[111841]: pgmap v35: 65 pgs: 15 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 9 active+recovery_wait+degraded, 40 active+clean; 285 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 44 KiB/s rd, 2.0 MiB/s wr, 844 op/s; 2532/33270 objects degraded (7.610%); 54 KiB/s, 169 keys/s, 22 objects/s recovering
2026-03-09T19:29:27.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:26 vm08.local ceph-mon[103420]: pgmap v35: 65 pgs: 15 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 9 active+recovery_wait+degraded, 40 active+clean; 285 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 44 KiB/s rd, 2.0 MiB/s wr, 844 op/s; 2532/33270 objects degraded (7.610%); 54 KiB/s, 169 keys/s, 22 objects/s recovering
2026-03-09T19:29:28.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:28 vm08.local ceph-mon[103420]: pgmap v36: 65 pgs: 15 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 9 active+recovery_wait+degraded, 40 active+clean; 285 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.5 MiB/s wr, 669 op/s; 2532/33252 objects degraded (7.615%); 42 KiB/s, 133 keys/s, 17 objects/s recovering
2026-03-09T19:29:28.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:28 vm07.local ceph-mon[111841]: pgmap v36: 65 pgs: 15 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 9 active+recovery_wait+degraded, 40 active+clean; 285 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.5 MiB/s wr, 669 op/s; 2532/33252 objects degraded (7.615%); 42 KiB/s, 133 keys/s, 17 objects/s recovering
2026-03-09T19:29:29.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:29 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 2532/33252 objects degraded (7.615%), 25 pgs degraded (PG_DEGRADED)
2026-03-09T19:29:29.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:29 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 2532/33252 objects degraded (7.615%), 25 pgs degraded (PG_DEGRADED)
2026-03-09T19:29:31.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:31 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T19:29:31.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:31 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T19:29:31.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:31 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline)
2026-03-09T19:29:31.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:31 vm07.local ceph-mon[111841]: pgmap v37: 65 pgs: 15 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 9 active+recovery_wait+degraded, 40 active+clean; 285 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 770 KiB/s wr, 339 op/s; 2532/33252 objects degraded (7.615%); 36 KiB/s, 115 keys/s, 15 objects/s recovering
2026-03-09T19:29:31.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:31 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T19:29:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:31 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T19:29:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:31 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline)
2026-03-09T19:29:31.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:31 vm08.local ceph-mon[103420]: pgmap v37: 65 pgs: 15 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 9 active+recovery_wait+degraded, 40 active+clean; 285 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 770 KiB/s wr, 339 op/s; 2532/33252 objects degraded (7.615%); 36 KiB/s, 115 keys/s, 15 objects/s recovering
2026-03-09T19:29:32.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:32 vm08.local ceph-mon[103420]: pgmap v38: 65 pgs: 15 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 5 active+recovery_wait+degraded, 44 active+clean; 267 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.2 MiB/s wr, 604 op/s; 2464/28065 objects degraded (8.780%); 33 KiB/s, 144 keys/s, 20 objects/s recovering
2026-03-09T19:29:32.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:32 vm07.local ceph-mon[111841]: pgmap v38: 65 pgs: 15 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 5 active+recovery_wait+degraded, 44 active+clean; 267 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.2 MiB/s wr, 604 op/s; 2464/28065 objects degraded (8.780%); 33 KiB/s, 144 keys/s, 20 objects/s recovering
2026-03-09T19:29:33.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:33 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 2464/28065 objects degraded (8.780%), 21 pgs degraded (PG_DEGRADED)
2026-03-09T19:29:33.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:33.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:29:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:33 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 2464/28065 objects degraded (8.780%), 21 pgs degraded (PG_DEGRADED)
2026-03-09T19:29:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:29:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:29:34.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:34 vm08.local ceph-mon[103420]: pgmap v39: 65 pgs: 15 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 5 active+recovery_wait+degraded, 44 active+clean; 267 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1013 KiB/s wr, 507 op/s; 2464/28065 objects degraded (8.780%); 28 KiB/s, 101 keys/s, 14 objects/s recovering
2026-03-09T19:29:34.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:34 vm07.local ceph-mon[111841]: pgmap v39: 65 pgs: 15 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 5 active+recovery_wait+degraded, 44 active+clean; 267 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1013 KiB/s wr, 507 op/s; 2464/28065 objects degraded (8.780%); 28 KiB/s, 101 keys/s, 14 objects/s recovering
2026-03-09T19:29:35.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:35 vm07.local ceph-mon[111841]: osdmap e53: 6 total, 6 up, 6 in
2026-03-09T19:29:36.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:35 vm08.local ceph-mon[103420]: osdmap e53: 6 total, 6 up, 6 in
2026-03-09T19:29:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:36 vm08.local ceph-mon[103420]: pgmap v41: 65 pgs: 1 active+recovering+degraded, 1 active+undersized+remapped, 14 active+recovery_wait+undersized+degraded+remapped, 49 active+clean; 267 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.1 MiB/s wr, 498 op/s; 2413/24078 objects degraded (10.022%); 410 KiB/s, 59 keys/s, 11 objects/s recovering
2026-03-09T19:29:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:36 vm08.local ceph-mon[103420]: osdmap e54: 6 total, 6 up, 6 in
2026-03-09T19:29:37.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:36 vm07.local ceph-mon[111841]: pgmap v41: 65 pgs: 1 active+recovering+degraded, 1 active+undersized+remapped, 14 active+recovery_wait+undersized+degraded+remapped, 49 active+clean; 267 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.1 MiB/s wr, 498 op/s; 2413/24078 objects degraded (10.022%); 410 KiB/s, 59 keys/s, 11 objects/s recovering
2026-03-09T19:29:37.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:36 vm07.local ceph-mon[111841]: osdmap e54: 6 total, 6 up, 6 in
2026-03-09T19:29:37.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.239+0000 7f32633e5640  1 -- 192.168.123.107:0/1498592504 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3254006df0 msgr2=0x7f3254007250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:29:37.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.239+0000 7f32633e5640  1 --2- 192.168.123.107:0/1498592504 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3254006df0 0x7f3254007250 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f32500099b0 tx=0x7f325002f240 comp rx=0 tx=0).stop
2026-03-09T19:29:37.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.244+0000 7f32633e5640  1 -- 192.168.123.107:0/1498592504 shutdown_connections
2026-03-09T19:29:37.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.244+0000 7f32633e5640  1 --2- 192.168.123.107:0/1498592504 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3254006df0 0x7f3254007250 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:29:37.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.244+0000 7f32633e5640  1 --2- 192.168.123.107:0/1498592504 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3254008450 0x7f32540068b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:29:37.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.244+0000 7f32633e5640  1 -- 192.168.123.107:0/1498592504 >> 192.168.123.107:0/1498592504 conn(0x7f3254090d00 msgr2=0x7f3254093160 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:29:37.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.244+0000 7f32633e5640  1 -- 192.168.123.107:0/1498592504 shutdown_connections
2026-03-09T19:29:37.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.244+0000 7f32633e5640  1 -- 192.168.123.107:0/1498592504 wait complete.
2026-03-09T19:29:37.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.245+0000 7f32633e5640  1 Processor -- start
2026-03-09T19:29:37.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.245+0000 7f32633e5640  1 -- start start
2026-03-09T19:29:37.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.245+0000 7f32633e5640  1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3254006df0 0x7f32541336f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:29:37.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.245+0000 7f32633e5640  1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3254008450 0x7f3254133c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:29:37.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.245+0000 7f326115a640  1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3254006df0 0x7f32541336f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:29:37.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.245+0000 7f326115a640  1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3254006df0 0x7f32541336f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:55168/0 (socket says 192.168.123.107:55168)
2026-03-09T19:29:37.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.245+0000 7f326115a640  1 -- 192.168.123.107:0/638083276 learned_addr learned my addr 192.168.123.107:0/638083276 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:29:37.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.245+0000 7f32633e5640  1 -- 192.168.123.107:0/638083276 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3254134200 con 0x7f3254006df0
2026-03-09T19:29:37.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.245+0000 7f32633e5640  1 -- 192.168.123.107:0/638083276 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3254134370 con 0x7f3254008450
2026-03-09T19:29:37.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.246+0000 7f3260959640  1 --2- 192.168.123.107:0/638083276 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3254008450 0x7f3254133c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:29:37.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.246+0000 7f326115a640  1 -- 192.168.123.107:0/638083276 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3254008450 msgr2=0x7f3254133c30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:29:37.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.246+0000 7f326115a640  1 --2- 192.168.123.107:0/638083276 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3254008450 0x7f3254133c30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:29:37.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.246+0000 7f326115a640  1 -- 192.168.123.107:0/638083276 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3250009660 con 0x7f3254006df0
2026-03-09T19:29:37.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.247+0000 7f326115a640  1 --2- 192.168.123.107:0/638083276 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3254006df0 0x7f32541336f0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f324c00e9c0 tx=0x7f324c00ee90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:29:37.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.247+0000 7f324a7fc640  1 -- 192.168.123.107:0/638083276 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f324c00cd60 con 0x7f3254006df0
2026-03-09T19:29:37.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.247+0000 7f324a7fc640  1 -- 192.168.123.107:0/638083276 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f324c00cec0 con 0x7f3254006df0
2026-03-09T19:29:37.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.247+0000 7f32633e5640  1 -- 192.168.123.107:0/638083276 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3254138e10 con 0x7f3254006df0
2026-03-09T19:29:37.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.247+0000 7f32633e5640  1 -- 192.168.123.107:0/638083276 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3254139270 con 0x7f3254006df0
2026-03-09T19:29:37.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.249+0000 7f324a7fc640  1 -- 192.168.123.107:0/638083276 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f324c010640 con 0x7f3254006df0
2026-03-09T19:29:37.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.249+0000 7f324a7fc640  1 -- 192.168.123.107:0/638083276 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f324c010820 con 0x7f3254006df0
2026-03-09T19:29:37.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.249+0000 7f324a7fc640  1 --2- 192.168.123.107:0/638083276 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f322c077ba0 0x7f322c07a060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:29:37.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.250+0000 7f3260959640  1 --2- 192.168.123.107:0/638083276 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f322c077ba0 0x7f322c07a060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:29:37.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.250+0000 7f324a7fc640  1 -- 192.168.123.107:0/638083276 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6857+0+0 (secure 0 0 0) 0x7f324c014070 con 0x7f3254006df0
2026-03-09T19:29:37.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.251+0000 7f322bfff640  1 -- 192.168.123.107:0/638083276 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3254007250 con 0x7f3254006df0
2026-03-09T19:29:37.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.251+0000 7f3260959640  1 --2- 192.168.123.107:0/638083276 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f322c077ba0 0x7f322c07a060 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f3254134c10 tx=0x7f325003a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:29:37.254 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.254+0000 7f324a7fc640  1 -- 192.168.123.107:0/638083276 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f324c0639f0 con 0x7f3254006df0
2026-03-09T19:29:37.392 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.393+0000 7f322bfff640  1 -- 192.168.123.107:0/638083276 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f32540a4920 con 0x7f322c077ba0
2026-03-09T19:29:37.393 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.394+0000 7f324a7fc640  1 -- 192.168.123.107:0/638083276 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f32540a4920 con 0x7f322c077ba0
2026-03-09T19:29:37.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.399+0000 7f32633e5640  1 -- 192.168.123.107:0/638083276 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f322c077ba0 msgr2=0x7f322c07a060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:29:37.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.399+0000 7f32633e5640  1 --2- 192.168.123.107:0/638083276 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f322c077ba0 0x7f322c07a060 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f3254134c10 tx=0x7f325003a040 comp rx=0 tx=0).stop
2026-03-09T19:29:37.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.399+0000 7f32633e5640  1 -- 192.168.123.107:0/638083276 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3254006df0 msgr2=0x7f32541336f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:29:37.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.399+0000 7f32633e5640  1 --2- 192.168.123.107:0/638083276 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3254006df0 0x7f32541336f0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f324c00e9c0 tx=0x7f324c00ee90 comp rx=0 tx=0).stop
2026-03-09T19:29:37.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.399+0000 7f32633e5640  1 -- 192.168.123.107:0/638083276 shutdown_connections
2026-03-09T19:29:37.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.399+0000 7f32633e5640  1 --2- 192.168.123.107:0/638083276 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f322c077ba0 0x7f322c07a060 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:29:37.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.399+0000 7f32633e5640  1 --2- 192.168.123.107:0/638083276 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3254008450 0x7f3254133c30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:29:37.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.399+0000 7f32633e5640  1 --2- 192.168.123.107:0/638083276 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3254006df0 0x7f32541336f0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:29:37.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.399+0000 7f32633e5640  1 -- 192.168.123.107:0/638083276 >> 192.168.123.107:0/638083276 conn(0x7f3254090d00 msgr2=0x7f32540928b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:29:37.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.399+0000 7f32633e5640  1 -- 192.168.123.107:0/638083276 shutdown_connections
2026-03-09T19:29:37.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.399+0000 7f32633e5640  1 -- 192.168.123.107:0/638083276 wait complete.
2026-03-09T19:29:37.410 INFO:teuthology.orchestra.run.vm07.stdout:true
2026-03-09T19:29:37.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.492+0000 7f0c943ff640  1 -- 192.168.123.107:0/2246674899 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c8c072440 msgr2=0x7f0c8c0771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:29:37.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.492+0000 7f0c943ff640  1 --2- 192.168.123.107:0/2246674899 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c8c072440 0x7f0c8c0771b0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f0c8400b600 tx=0x7f0c84030670 comp rx=0 tx=0).stop
2026-03-09T19:29:37.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.492+0000 7f0c943ff640  1 -- 192.168.123.107:0/2246674899 shutdown_connections
2026-03-09T19:29:37.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.492+0000 7f0c943ff640  1 --2- 192.168.123.107:0/2246674899 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c8c072440 0x7f0c8c0771b0 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:29:37.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.492+0000 7f0c943ff640  1 --2- 192.168.123.107:0/2246674899 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c8c071a70 0x7f0c8c071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:29:37.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.492+0000 7f0c943ff640  1 -- 192.168.123.107:0/2246674899 >> 192.168.123.107:0/2246674899 conn(0x7f0c8c06d4f0 msgr2=0x7f0c8c06f930 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.492+0000 7f0c943ff640  1 -- 192.168.123.107:0/2246674899 shutdown_connections
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.492+0000 7f0c943ff640  1 -- 192.168.123.107:0/2246674899 wait complete.
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.495+0000 7f0c943ff640  1 Processor -- start
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.495+0000 7f0c943ff640  1 -- start start
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.496+0000 7f0c943ff640  1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c8c071a70 0x7f0c8c1319a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.496+0000 7f0c943ff640  1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c8c133350 0x7f0c8c131ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.496+0000 7f0c943ff640  1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c8c1324b0 con 0x7f0c8c071a70
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.496+0000 7f0c943ff640  1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c8c132620 con 0x7f0c8c133350
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.496+0000 7f0c92174640  1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c8c071a70 0x7f0c8c1319a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.496+0000 7f0c92174640  1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c8c071a70 0x7f0c8c1319a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:55180/0 (socket says 192.168.123.107:55180)
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.496+0000 7f0c92174640  1 -- 192.168.123.107:0/3563905383 learned_addr learned my addr 192.168.123.107:0/3563905383 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.496+0000 7f0c91973640  1 --2- 192.168.123.107:0/3563905383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c8c133350 0x7f0c8c131ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.496+0000 7f0c91973640  1 -- 192.168.123.107:0/3563905383 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c8c071a70 msgr2=0x7f0c8c1319a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.496+0000 7f0c91973640  1 --2- 192.168.123.107:0/3563905383 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c8c071a70 0x7f0c8c1319a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:29:37.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.496+0000 7f0c91973640  1 -- 192.168.123.107:0/3563905383 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0c84009d00 con 0x7f0c8c133350
2026-03-09T19:29:37.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.497+0000 7f0c91973640  1 --2- 192.168.123.107:0/3563905383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c8c133350 0x7f0c8c131ee0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f0c8c0730e0 tx=0x7f0c84009c10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:29:37.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.497+0000 7f0c837fe640  1 -- 192.168.123.107:0/3563905383 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0c84007ba0 con 0x7f0c8c133350
2026-03-09T19:29:37.498 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.497+0000 7f0c943ff640  1 -- 192.168.123.107:0/3563905383 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0c8c07fad0 con 0x7f0c8c133350
2026-03-09T19:29:37.498 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.497+0000 7f0c943ff640  1 -- 192.168.123.107:0/3563905383 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0c8c080020 con 0x7f0c8c133350
2026-03-09T19:29:37.498 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.499+0000 7f0c837fe640  1 -- 192.168.123.107:0/3563905383 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0c84033040 con 0x7f0c8c133350
2026-03-09T19:29:37.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.499+0000 7f0c837fe640  1 -- 192.168.123.107:0/3563905383 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0c84038910 con 0x7f0c8c133350
2026-03-09T19:29:37.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.501+0000 7f0c837fe640  1 -- 192.168.123.107:0/3563905383 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f0c84038a70 con 0x7f0c8c133350
2026-03-09T19:29:37.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.501+0000 7f0c837fe640  1 --2- 192.168.123.107:0/3563905383 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0c6c077a00 0x7f0c6c079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:29:37.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.501+0000 7f0c837fe640  1 -- 192.168.123.107:0/3563905383 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6857+0+0 (secure 0 0 0) 0x7f0c840bf7e0 con 0x7f0c8c133350
2026-03-09T19:29:37.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.502+0000 7f0c943ff640  1 -- 192.168.123.107:0/3563905383 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0c8c1327e0 con 0x7f0c8c133350
2026-03-09T19:29:37.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.503+0000 7f0c92174640  1 --2- 192.168.123.107:0/3563905383 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0c6c077a00 0x7f0c6c079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:29:37.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.503+0000 7f0c92174640  1 --2- 192.168.123.107:0/3563905383 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0c6c077a00 0x7f0c6c079ec0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f0c88009e10 tx=0x7f0c88009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:29:37.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.506+0000 7f0c837fe640  1 -- 192.168.123.107:0/3563905383 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0c84087bd0 con 0x7f0c8c133350
2026-03-09T19:29:37.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.679+0000 7f0c943ff640  1 -- 192.168.123.107:0/3563905383 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0c8c0761e0 con 0x7f0c6c077a00
2026-03-09T19:29:37.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.681+0000 7f0c837fe640  1 -- 192.168.123.107:0/3563905383 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f0c8c0761e0 con 0x7f0c6c077a00
2026-03-09T19:29:37.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.684+0000 7f0c943ff640  1 -- 192.168.123.107:0/3563905383 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0c6c077a00 msgr2=0x7f0c6c079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:29:37.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.684+0000 7f0c943ff640  1 --2- 192.168.123.107:0/3563905383 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0c6c077a00 0x7f0c6c079ec0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f0c88009e10 tx=0x7f0c88009290 comp rx=0 tx=0).stop
2026-03-09T19:29:37.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.684+0000 7f0c943ff640  1 -- 192.168.123.107:0/3563905383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c8c133350 msgr2=0x7f0c8c131ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:29:37.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.684+0000 7f0c943ff640  1 --2- 192.168.123.107:0/3563905383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c8c133350 0x7f0c8c131ee0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f0c8c0730e0 tx=0x7f0c84009c10 comp rx=0 tx=0).stop
2026-03-09T19:29:37.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.684+0000 7f0c943ff640  1 -- 192.168.123.107:0/3563905383 shutdown_connections
2026-03-09T19:29:37.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.684+0000 7f0c943ff640  1 --2-
192.168.123.107:0/3563905383 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0c6c077a00 0x7f0c6c079ec0 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.684+0000 7f0c943ff640 1 --2- 192.168.123.107:0/3563905383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c8c133350 0x7f0c8c131ee0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.684+0000 7f0c943ff640 1 --2- 192.168.123.107:0/3563905383 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c8c071a70 0x7f0c8c1319a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.684+0000 7f0c943ff640 1 -- 192.168.123.107:0/3563905383 >> 192.168.123.107:0/3563905383 conn(0x7f0c8c06d4f0 msgr2=0x7f0c8c070410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:37.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.684+0000 7f0c943ff640 1 -- 192.168.123.107:0/3563905383 shutdown_connections 2026-03-09T19:29:37.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.684+0000 7f0c943ff640 1 -- 192.168.123.107:0/3563905383 wait complete. 
2026-03-09T19:29:37.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.755+0000 7f560ab66640 1 -- 192.168.123.107:0/3644441293 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5604072440 msgr2=0x7f56040771b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:37.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.755+0000 7f560ab66640 1 --2- 192.168.123.107:0/3644441293 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5604072440 0x7f56040771b0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f55fc008030 tx=0x7f55fc031e80 comp rx=0 tx=0).stop 2026-03-09T19:29:37.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.756+0000 7f560ab66640 1 -- 192.168.123.107:0/3644441293 shutdown_connections 2026-03-09T19:29:37.756 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.756+0000 7f560ab66640 1 --2- 192.168.123.107:0/3644441293 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5604072440 0x7f56040771b0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.756 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.756+0000 7f560ab66640 1 --2- 192.168.123.107:0/3644441293 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5604071a70 0x7f5604071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.756 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.756+0000 7f560ab66640 1 -- 192.168.123.107:0/3644441293 >> 192.168.123.107:0/3644441293 conn(0x7f560406d4f0 msgr2=0x7f560406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:37.760 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.757+0000 7f560ab66640 1 -- 192.168.123.107:0/3644441293 shutdown_connections 2026-03-09T19:29:37.760 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.757+0000 7f560ab66640 1 -- 192.168.123.107:0/3644441293 
wait complete. 2026-03-09T19:29:37.760 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.758+0000 7f560ab66640 1 Processor -- start 2026-03-09T19:29:37.760 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.759+0000 7f560ab66640 1 -- start start 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.759+0000 7f560ab66640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5604071a70 0x7f5604084100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.759+0000 7f560ab66640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5604082750 0x7f5604082bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.759+0000 7f560ab66640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5604084640 con 0x7f5604071a70 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.759+0000 7f560ab66640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5604083110 con 0x7f5604082750 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.761+0000 7f56088db640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5604071a70 0x7f5604084100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.761+0000 7f5603fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5604082750 0x7f5604082bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.761+0000 7f5603fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5604082750 0x7f5604082bd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:57338/0 (socket says 192.168.123.107:57338) 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.761+0000 7f5603fff640 1 -- 192.168.123.107:0/2472467774 learned_addr learned my addr 192.168.123.107:0/2472467774 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.761+0000 7f5603fff640 1 -- 192.168.123.107:0/2472467774 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5604071a70 msgr2=0x7f5604084100 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.761+0000 7f5603fff640 1 --2- 192.168.123.107:0/2472467774 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5604071a70 0x7f5604084100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.761+0000 7f5603fff640 1 -- 192.168.123.107:0/2472467774 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f55fc007ce0 con 0x7f5604082750 2026-03-09T19:29:37.761 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.761+0000 7f5603fff640 1 --2- 192.168.123.107:0/2472467774 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5604082750 0x7f5604082bd0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f55fc004a70 tx=0x7f55fc002ea0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:37.764 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.763+0000 7f5601ffb640 1 -- 192.168.123.107:0/2472467774 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f55fc0102e0 con 0x7f5604082750 2026-03-09T19:29:37.764 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.763+0000 7f560ab66640 1 -- 192.168.123.107:0/2472467774 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5604083390 con 0x7f5604082750 2026-03-09T19:29:37.764 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.763+0000 7f560ab66640 1 -- 192.168.123.107:0/2472467774 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f560412ef70 con 0x7f5604082750 2026-03-09T19:29:37.764 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.764+0000 7f5601ffb640 1 -- 192.168.123.107:0/2472467774 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f55fc032ba0 con 0x7f5604082750 2026-03-09T19:29:37.764 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.764+0000 7f5601ffb640 1 -- 192.168.123.107:0/2472467774 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f55fc03bdb0 con 0x7f5604082750 2026-03-09T19:29:37.764 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.765+0000 7f560ab66640 1 -- 192.168.123.107:0/2472467774 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f55cc005350 con 0x7f5604082750 2026-03-09T19:29:37.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.766+0000 7f5601ffb640 1 -- 192.168.123.107:0/2472467774 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f55fc04b050 con 0x7f5604082750 2026-03-09T19:29:37.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.767+0000 7f5601ffb640 1 --2- 
192.168.123.107:0/2472467774 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f55e4077a00 0x7f55e4079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:37.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.767+0000 7f5601ffb640 1 -- 192.168.123.107:0/2472467774 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6857+0+0 (secure 0 0 0) 0x7f55fc053080 con 0x7f5604082750 2026-03-09T19:29:37.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.767+0000 7f56088db640 1 --2- 192.168.123.107:0/2472467774 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f55e4077a00 0x7f55e4079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:37.767 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.768+0000 7f56088db640 1 --2- 192.168.123.107:0/2472467774 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f55e4077a00 0x7f55e4079ec0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f55f4009ba0 tx=0x7f55f4009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:37.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.769+0000 7f5601ffb640 1 -- 192.168.123.107:0/2472467774 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f55fc088010 con 0x7f5604082750 2026-03-09T19:29:37.878 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.879+0000 7f560ab66640 1 -- 192.168.123.107:0/2472467774 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f55cc002bf0 con 0x7f55e4077a00 2026-03-09T19:29:37.883 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.884+0000 7f5601ffb640 1 -- 192.168.123.107:0/2472467774 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f55cc002bf0 con 0x7f55e4077a00 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (6m) 26s ago 7m 23.2M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (7m) 26s ago 7m 8938k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (6m) 37s ago 6m 10.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (41s) 26s ago 7m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (39s) 37s ago 6m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (6m) 26s ago 6m 88.6M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (5m) 26s ago 5m 16.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (5m) 26s ago 5m 18.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (5m) 37s ago 5m 28.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:29:37.883 
INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (5m) 37s ago 5m 239M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (93s) 26s ago 7m 593M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (76s) 37s ago 6m 489M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (69s) 26s ago 7m 56.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (57s) 37s ago 6m 46.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (7m) 26s ago 7m 14.3M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:29:37.883 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (6m) 37s ago 6m 16.6M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:29:37.884 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (28s) 26s ago 6m 30.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656 2026-03-09T19:29:37.884 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (6m) 26s ago 6m 394M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:29:37.884 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (6m) 26s ago 6m 318M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:29:37.884 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (5m) 37s ago 5m 441M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:29:37.884 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (5m) 37s ago 5m 446M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:29:37.884 
INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (5m) 37s ago 5m 348M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:29:37.884 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (78s) 26s ago 6m 44.5M - 2.43.0 a07b618ecd1d c09450c20f5f 2026-03-09T19:29:37.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.887+0000 7f560ab66640 1 -- 192.168.123.107:0/2472467774 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f55e4077a00 msgr2=0x7f55e4079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:37.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.887+0000 7f560ab66640 1 --2- 192.168.123.107:0/2472467774 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f55e4077a00 0x7f55e4079ec0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f55f4009ba0 tx=0x7f55f4009290 comp rx=0 tx=0).stop 2026-03-09T19:29:37.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.887+0000 7f560ab66640 1 -- 192.168.123.107:0/2472467774 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5604082750 msgr2=0x7f5604082bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:37.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.887+0000 7f560ab66640 1 --2- 192.168.123.107:0/2472467774 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5604082750 0x7f5604082bd0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f55fc004a70 tx=0x7f55fc002ea0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.887+0000 7f560ab66640 1 -- 192.168.123.107:0/2472467774 shutdown_connections 2026-03-09T19:29:37.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.887+0000 7f560ab66640 1 --2- 192.168.123.107:0/2472467774 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] 
conn(0x7f55e4077a00 0x7f55e4079ec0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.887+0000 7f560ab66640 1 --2- 192.168.123.107:0/2472467774 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5604082750 0x7f5604082bd0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.887+0000 7f560ab66640 1 --2- 192.168.123.107:0/2472467774 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5604071a70 0x7f5604084100 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.887+0000 7f560ab66640 1 -- 192.168.123.107:0/2472467774 >> 192.168.123.107:0/2472467774 conn(0x7f560406d4f0 msgr2=0x7f5604073150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:37.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.887+0000 7f560ab66640 1 -- 192.168.123.107:0/2472467774 shutdown_connections 2026-03-09T19:29:37.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.887+0000 7f560ab66640 1 -- 192.168.123.107:0/2472467774 wait complete. 
2026-03-09T19:29:37.959 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.959+0000 7fc8135f0640 1 -- 192.168.123.107:0/3140566573 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc80c102a60 msgr2=0x7fc80c102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:37.959 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.959+0000 7fc8135f0640 1 --2- 192.168.123.107:0/3140566573 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc80c102a60 0x7fc80c102e60 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fc7f40099b0 tx=0x7fc7f402f220 comp rx=0 tx=0).stop 2026-03-09T19:29:37.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.961+0000 7fc8135f0640 1 -- 192.168.123.107:0/3140566573 shutdown_connections 2026-03-09T19:29:37.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.961+0000 7fc8135f0640 1 --2- 192.168.123.107:0/3140566573 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc80c103c60 0x7fc80c1040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.961+0000 7fc8135f0640 1 --2- 192.168.123.107:0/3140566573 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc80c102a60 0x7fc80c102e60 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.961+0000 7fc8135f0640 1 -- 192.168.123.107:0/3140566573 >> 192.168.123.107:0/3140566573 conn(0x7fc80c0fe250 msgr2=0x7fc80c100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:37.962 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.964+0000 7fc8135f0640 1 -- 192.168.123.107:0/3140566573 shutdown_connections 2026-03-09T19:29:37.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.964+0000 7fc8135f0640 1 -- 192.168.123.107:0/3140566573 
wait complete. 2026-03-09T19:29:37.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.964+0000 7fc8135f0640 1 Processor -- start 2026-03-09T19:29:37.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.964+0000 7fc8135f0640 1 -- start start 2026-03-09T19:29:37.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.965+0000 7fc8135f0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc80c102a60 0x7fc80c071650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:37.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.965+0000 7fc8135f0640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc80c103c60 0x7fc80c071b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:37.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.965+0000 7fc8135f0640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc80c073090 con 0x7fc80c102a60 2026-03-09T19:29:37.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.965+0000 7fc8135f0640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc80c073200 con 0x7fc80c103c60 2026-03-09T19:29:37.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.965+0000 7fc811365640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc80c102a60 0x7fc80c071650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:37.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.965+0000 7fc811365640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc80c102a60 0x7fc80c071650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.107:55224/0 (socket says 192.168.123.107:55224) 2026-03-09T19:29:37.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.965+0000 7fc811365640 1 -- 192.168.123.107:0/2147054967 learned_addr learned my addr 192.168.123.107:0/2147054967 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:37.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.965+0000 7fc811365640 1 -- 192.168.123.107:0/2147054967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc80c103c60 msgr2=0x7fc80c071b90 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:29:37.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.965+0000 7fc811365640 1 --2- 192.168.123.107:0/2147054967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc80c103c60 0x7fc80c071b90 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:37.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.965+0000 7fc811365640 1 -- 192.168.123.107:0/2147054967 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc7fc009590 con 0x7fc80c102a60 2026-03-09T19:29:37.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.965+0000 7fc811365640 1 --2- 192.168.123.107:0/2147054967 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc80c102a60 0x7fc80c071650 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fc7f4005bb0 tx=0x7fc7f4002f60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:37.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.966+0000 7fc8027fc640 1 -- 192.168.123.107:0/2147054967 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc7f403d070 con 0x7fc80c102a60 2026-03-09T19:29:37.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.966+0000 7fc8027fc640 1 -- 
192.168.123.107:0/2147054967 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc7f4038730 con 0x7fc80c102a60 2026-03-09T19:29:37.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.966+0000 7fc8027fc640 1 -- 192.168.123.107:0/2147054967 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc7f4041820 con 0x7fc80c102a60 2026-03-09T19:29:37.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.966+0000 7fc8135f0640 1 -- 192.168.123.107:0/2147054967 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc7f4009660 con 0x7fc80c102a60 2026-03-09T19:29:37.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.966+0000 7fc8135f0640 1 -- 192.168.123.107:0/2147054967 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc80c0720d0 con 0x7fc80c102a60 2026-03-09T19:29:37.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.968+0000 7fc8027fc640 1 -- 192.168.123.107:0/2147054967 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fc7f4038cb0 con 0x7fc80c102a60 2026-03-09T19:29:37.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.968+0000 7fc8027fc640 1 --2- 192.168.123.107:0/2147054967 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc7e8077ad0 0x7fc7e8079f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:37.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.968+0000 7fc8027fc640 1 -- 192.168.123.107:0/2147054967 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6828+0+0 (secure 0 0 0) 0x7fc7f40be7f0 con 0x7fc80c102a60 2026-03-09T19:29:37.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.968+0000 7fc810b64640 1 --2- 192.168.123.107:0/2147054967 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc7e8077ad0 0x7fc7e8079f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:37.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.969+0000 7fc810b64640 1 --2- 192.168.123.107:0/2147054967 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc7e8077ad0 0x7fc7e8079f90 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fc80c072d80 tx=0x7fc7fc009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:37.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.969+0000 7fc8135f0640 1 -- 192.168.123.107:0/2147054967 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc7d4005350 con 0x7fc80c102a60 2026-03-09T19:29:37.971 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:37.972+0000 7fc8027fc640 1 -- 192.168.123.107:0/2147054967 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc7f4086bd0 con 0x7fc80c102a60 2026-03-09T19:29:38.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:37 vm08.local ceph-mon[103420]: osdmap e55: 6 total, 6 up, 6 in 2026-03-09T19:29:38.189 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.190+0000 7fc8135f0640 1 -- 192.168.123.107:0/2147054967 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fc7d40058d0 con 0x7fc80c102a60 2026-03-09T19:29:38.189 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:37 vm07.local ceph-mon[111841]: osdmap e55: 6 total, 6 up, 6 in 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.190+0000 7fc8027fc640 1 -- 192.168.123.107:0/2147054967 
<== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+844 (secure 0 0 0) 0x7fc7f4046070 con 0x7fc80c102a60 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 5, 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 9, 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb 
(e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:29:38.190 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:29:38.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.193+0000 7fc8135f0640 1 -- 192.168.123.107:0/2147054967 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc7e8077ad0 msgr2=0x7fc7e8079f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.193+0000 7fc8135f0640 1 --2- 192.168.123.107:0/2147054967 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc7e8077ad0 0x7fc7e8079f90 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fc80c072d80 tx=0x7fc7fc009290 comp rx=0 tx=0).stop 2026-03-09T19:29:38.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.193+0000 7fc8135f0640 1 -- 192.168.123.107:0/2147054967 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc80c102a60 msgr2=0x7fc80c071650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.193+0000 7fc8135f0640 1 --2- 192.168.123.107:0/2147054967 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc80c102a60 0x7fc80c071650 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fc7f4005bb0 tx=0x7fc7f4002f60 comp rx=0 tx=0).stop 2026-03-09T19:29:38.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.194+0000 7fc8135f0640 1 -- 192.168.123.107:0/2147054967 shutdown_connections 2026-03-09T19:29:38.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.194+0000 7fc8135f0640 1 --2- 192.168.123.107:0/2147054967 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc7e8077ad0 0x7fc7e8079f90 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T19:29:38.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.194+0000 7fc8135f0640 1 --2- 192.168.123.107:0/2147054967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc80c103c60 0x7fc80c071b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.194+0000 7fc8135f0640 1 --2- 192.168.123.107:0/2147054967 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc80c102a60 0x7fc80c071650 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.194+0000 7fc8135f0640 1 -- 192.168.123.107:0/2147054967 >> 192.168.123.107:0/2147054967 conn(0x7fc80c0fe250 msgr2=0x7fc80c0ffd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:38.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.194+0000 7fc8135f0640 1 -- 192.168.123.107:0/2147054967 shutdown_connections 2026-03-09T19:29:38.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.194+0000 7fc8135f0640 1 -- 192.168.123.107:0/2147054967 wait complete. 
2026-03-09T19:29:38.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.262+0000 7f1dc3577640 1 -- 192.168.123.107:0/3469071363 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1dc40fe210 msgr2=0x7f1dc40fe610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.262+0000 7f1dc3577640 1 --2- 192.168.123.107:0/3469071363 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1dc40fe210 0x7f1dc40fe610 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f1db80099e0 tx=0x7f1db802f260 comp rx=0 tx=0).stop 2026-03-09T19:29:38.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.263+0000 7f1dc3577640 1 -- 192.168.123.107:0/3469071363 shutdown_connections 2026-03-09T19:29:38.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.263+0000 7f1dc3577640 1 --2- 192.168.123.107:0/3469071363 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1dc40feb50 0x7f1dc4105ec0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.263+0000 7f1dc3577640 1 --2- 192.168.123.107:0/3469071363 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1dc40fe210 0x7f1dc40fe610 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.263+0000 7f1dc3577640 1 -- 192.168.123.107:0/3469071363 >> 192.168.123.107:0/3469071363 conn(0x7f1dc40f9f80 msgr2=0x7f1dc40fc3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:38.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.263+0000 7f1dc3577640 1 -- 192.168.123.107:0/3469071363 shutdown_connections 2026-03-09T19:29:38.262 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.263+0000 7f1dc3577640 1 -- 192.168.123.107:0/3469071363 
wait complete. 2026-03-09T19:29:38.263 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.264+0000 7f1dc3577640 1 Processor -- start 2026-03-09T19:29:38.263 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.264+0000 7f1dc3577640 1 -- start start 2026-03-09T19:29:38.263 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.264+0000 7f1dc3577640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1dc40fe210 0x7f1dc419a390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:38.263 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.264+0000 7f1dc3577640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1dc40feb50 0x7f1dc419a8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:38.263 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.264+0000 7f1dc3577640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1dc419aea0 con 0x7f1dc40feb50 2026-03-09T19:29:38.263 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.264+0000 7f1dc3577640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1dc419b010 con 0x7f1dc40fe210 2026-03-09T19:29:38.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.265+0000 7f1dc2575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1dc40fe210 0x7f1dc419a390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:38.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.265+0000 7f1dc2575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1dc40fe210 0x7f1dc419a390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.107:57370/0 (socket says 192.168.123.107:57370) 2026-03-09T19:29:38.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.265+0000 7f1dc2575640 1 -- 192.168.123.107:0/1862926854 learned_addr learned my addr 192.168.123.107:0/1862926854 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:38.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.265+0000 7f1dc1d74640 1 --2- 192.168.123.107:0/1862926854 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1dc40feb50 0x7f1dc419a8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:38.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.265+0000 7f1dc1d74640 1 -- 192.168.123.107:0/1862926854 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1dc40fe210 msgr2=0x7f1dc419a390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.265+0000 7f1dc1d74640 1 --2- 192.168.123.107:0/1862926854 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1dc40fe210 0x7f1dc419a390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.265+0000 7f1dc1d74640 1 -- 192.168.123.107:0/1862926854 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1db8009660 con 0x7f1dc40feb50 2026-03-09T19:29:38.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.266+0000 7f1dc1d74640 1 --2- 192.168.123.107:0/1862926854 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1dc40feb50 0x7f1dc419a8d0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f1dac00b4d0 tx=0x7f1dac00b9a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:29:38.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.266+0000 7f1dab7fe640 1 -- 192.168.123.107:0/1862926854 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1dac004280 con 0x7f1dc40feb50 2026-03-09T19:29:38.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.266+0000 7f1dab7fe640 1 -- 192.168.123.107:0/1862926854 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1dac0043e0 con 0x7f1dc40feb50 2026-03-09T19:29:38.266 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.266+0000 7f1dc3577640 1 -- 192.168.123.107:0/1862926854 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1dc419fa60 con 0x7f1dc40feb50 2026-03-09T19:29:38.266 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.266+0000 7f1dc3577640 1 -- 192.168.123.107:0/1862926854 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1dc41a0030 con 0x7f1dc40feb50 2026-03-09T19:29:38.267 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.267+0000 7f1dab7fe640 1 -- 192.168.123.107:0/1862926854 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1dac010b50 con 0x7f1dc40feb50 2026-03-09T19:29:38.267 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.268+0000 7f1dc3577640 1 -- 192.168.123.107:0/1862926854 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1d88005350 con 0x7f1dc40feb50 2026-03-09T19:29:38.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.269+0000 7f1dab7fe640 1 -- 192.168.123.107:0/1862926854 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f1dac0026e0 con 0x7f1dc40feb50 2026-03-09T19:29:38.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.270+0000 
7f1dab7fe640 1 --2- 192.168.123.107:0/1862926854 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1d980779b0 0x7f1d98079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:38.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.270+0000 7f1dab7fe640 1 -- 192.168.123.107:0/1862926854 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6828+0+0 (secure 0 0 0) 0x7f1dac099300 con 0x7f1dc40feb50 2026-03-09T19:29:38.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.270+0000 7f1dc2575640 1 --2- 192.168.123.107:0/1862926854 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1d980779b0 0x7f1d98079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:38.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.271+0000 7f1dc2575640 1 --2- 192.168.123.107:0/1862926854 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1d980779b0 0x7f1d98079e70 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f1db80099b0 tx=0x7f1db803a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:38.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.271+0000 7f1dab7fe640 1 -- 192.168.123.107:0/1862926854 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1dac060f10 con 0x7f1dc40feb50 2026-03-09T19:29:38.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.401+0000 7f1dc3577640 1 -- 192.168.123.107:0/1862926854 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f1d880058d0 con 0x7f1dc40feb50 2026-03-09T19:29:38.401 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.402+0000 7f1dab7fe640 1 -- 192.168.123.107:0/1862926854 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1926 (secure 0 0 0) 0x7f1dac060d30 con 0x7f1dc40feb50 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:29:38.401 
INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:29:38.401 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 0 members: 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr 
[v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:29:38.402 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:29:38.404 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.405+0000 7f1dc3577640 1 -- 192.168.123.107:0/1862926854 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1d980779b0 msgr2=0x7f1d98079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.404 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.405+0000 7f1dc3577640 1 --2- 192.168.123.107:0/1862926854 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1d980779b0 0x7f1d98079e70 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f1db80099b0 tx=0x7f1db803a040 comp rx=0 tx=0).stop 2026-03-09T19:29:38.404 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.405+0000 7f1dc3577640 1 -- 
192.168.123.107:0/1862926854 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1dc40feb50 msgr2=0x7f1dc419a8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.404 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.405+0000 7f1dc3577640 1 --2- 192.168.123.107:0/1862926854 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1dc40feb50 0x7f1dc419a8d0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f1dac00b4d0 tx=0x7f1dac00b9a0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.404 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.405+0000 7f1dc3577640 1 -- 192.168.123.107:0/1862926854 shutdown_connections 2026-03-09T19:29:38.404 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.405+0000 7f1dc3577640 1 --2- 192.168.123.107:0/1862926854 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1d980779b0 0x7f1d98079e70 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.404 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.406+0000 7f1dc3577640 1 --2- 192.168.123.107:0/1862926854 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1dc40feb50 0x7f1dc419a8d0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.405 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.406+0000 7f1dc3577640 1 --2- 192.168.123.107:0/1862926854 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1dc40fe210 0x7f1dc419a390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.405 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.406+0000 7f1dc3577640 1 -- 192.168.123.107:0/1862926854 >> 192.168.123.107:0/1862926854 conn(0x7f1dc40f9f80 msgr2=0x7f1dc41021c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:38.405 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.406+0000 7f1dc3577640 1 -- 192.168.123.107:0/1862926854 shutdown_connections 2026-03-09T19:29:38.405 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.406+0000 7f1dc3577640 1 -- 192.168.123.107:0/1862926854 wait complete. 2026-03-09T19:29:38.472 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.473+0000 7eff1538a640 1 -- 192.168.123.107:0/147786058 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff10102a60 msgr2=0x7eff10102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.472 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.473+0000 7eff1538a640 1 --2- 192.168.123.107:0/147786058 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff10102a60 0x7eff10102e60 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7efef8009a00 tx=0x7efef802f280 comp rx=0 tx=0).stop 2026-03-09T19:29:38.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.478+0000 7eff1538a640 1 -- 192.168.123.107:0/147786058 shutdown_connections 2026-03-09T19:29:38.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.478+0000 7eff1538a640 1 --2- 192.168.123.107:0/147786058 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7eff10103c60 0x7eff101040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.478+0000 7eff1538a640 1 --2- 192.168.123.107:0/147786058 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff10102a60 0x7eff10102e60 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.478+0000 7eff1538a640 1 -- 192.168.123.107:0/147786058 >> 192.168.123.107:0/147786058 conn(0x7eff100fe250 msgr2=0x7eff10100670 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T19:29:38.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.478+0000 7eff1538a640 1 -- 192.168.123.107:0/147786058 shutdown_connections 2026-03-09T19:29:38.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.478+0000 7eff1538a640 1 -- 192.168.123.107:0/147786058 wait complete. 2026-03-09T19:29:38.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.478+0000 7eff1538a640 1 Processor -- start 2026-03-09T19:29:38.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.479+0000 7eff1538a640 1 -- start start 2026-03-09T19:29:38.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.479+0000 7eff1538a640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7eff10102a60 0x7eff1019a460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:38.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.479+0000 7eff1538a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff10103c60 0x7eff1019a9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:38.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.479+0000 7eff0e7fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff10103c60 0x7eff1019a9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:38.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.479+0000 7eff0e7fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff10103c60 0x7eff1019a9a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:55256/0 (socket says 192.168.123.107:55256) 2026-03-09T19:29:38.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.479+0000 
7eff0e7fc640 1 -- 192.168.123.107:0/3086775685 learned_addr learned my addr 192.168.123.107:0/3086775685 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:38.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.479+0000 7eff0effd640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7eff10102a60 0x7eff1019a460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:38.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.480+0000 7eff1538a640 1 -- 192.168.123.107:0/3086775685 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7eff1019af70 con 0x7eff10103c60 2026-03-09T19:29:38.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.480+0000 7eff1538a640 1 -- 192.168.123.107:0/3086775685 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7eff1019b0e0 con 0x7eff10102a60 2026-03-09T19:29:38.479 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.480+0000 7eff0e7fc640 1 -- 192.168.123.107:0/3086775685 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7eff10102a60 msgr2=0x7eff1019a460 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.479 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.480+0000 7eff0e7fc640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7eff10102a60 0x7eff1019a460 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.479 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.480+0000 7eff0e7fc640 1 -- 192.168.123.107:0/3086775685 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7eff04009590 con 0x7eff10103c60 2026-03-09T19:29:38.479 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.480+0000 7eff0e7fc640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff10103c60 0x7eff1019a9a0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7eff040029c0 tx=0x7eff04002e90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:38.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.481+0000 7eff0effd640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7eff10102a60 0x7eff1019a460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:29:38.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.481+0000 7efeeffff640 1 -- 192.168.123.107:0/3086775685 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7eff0400ebd0 con 0x7eff10103c60 2026-03-09T19:29:38.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.481+0000 7efeeffff640 1 -- 192.168.123.107:0/3086775685 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7eff0400ed30 con 0x7eff10103c60 2026-03-09T19:29:38.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.481+0000 7efeeffff640 1 -- 192.168.123.107:0/3086775685 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7eff040187b0 con 0x7eff10103c60 2026-03-09T19:29:38.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.481+0000 7eff1538a640 1 -- 192.168.123.107:0/3086775685 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efef8009660 con 0x7eff10103c60 2026-03-09T19:29:38.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.481+0000 7eff1538a640 1 -- 192.168.123.107:0/3086775685 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
-- mon_subscribe({osdmap=0}) v3 -- 0x7eff1019ff40 con 0x7eff10103c60 2026-03-09T19:29:38.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.482+0000 7eff1538a640 1 -- 192.168.123.107:0/3086775685 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7eff10102e60 con 0x7eff10103c60 2026-03-09T19:29:38.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.483+0000 7efeeffff640 1 -- 192.168.123.107:0/3086775685 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7eff04016020 con 0x7eff10103c60 2026-03-09T19:29:38.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.483+0000 7efeeffff640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efee40779b0 0x7efee4079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:38.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.483+0000 7efeeffff640 1 -- 192.168.123.107:0/3086775685 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6828+0+0 (secure 0 0 0) 0x7eff0409a620 con 0x7eff10103c60 2026-03-09T19:29:38.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.487+0000 7efeeffff640 1 -- 192.168.123.107:0/3086775685 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7eff04062980 con 0x7eff10103c60 2026-03-09T19:29:38.487 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.488+0000 7eff0effd640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efee40779b0 0x7efee4079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:38.487 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.488+0000 7eff0effd640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efee40779b0 0x7efee4079e70 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7efef8002c80 tx=0x7efef80023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:38.618 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.619+0000 7eff1538a640 1 -- 192.168.123.107:0/3086775685 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7eff10107ea0 con 0x7efee40779b0 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.621+0000 7efeeffff640 1 -- 192.168.123.107:0/3086775685 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7eff10107ea0 con 0x7efee40779b0 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [ 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout: "mgr", 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout: "crash", 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout: "mon" 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout: ], 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "7/23 daemons upgraded", 
2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:29:38.620 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:29:38.623 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.624+0000 7efeedffb640 1 -- 192.168.123.107:0/3086775685 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efee40779b0 msgr2=0x7efee4079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.623 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.624+0000 7efeedffb640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efee40779b0 0x7efee4079e70 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7efef8002c80 tx=0x7efef80023d0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.623 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.624+0000 7efeedffb640 1 -- 192.168.123.107:0/3086775685 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff10103c60 msgr2=0x7eff1019a9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.623 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.624+0000 7efeedffb640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff10103c60 0x7eff1019a9a0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7eff040029c0 tx=0x7eff04002e90 comp rx=0 tx=0).stop 2026-03-09T19:29:38.623 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.624+0000 7efeedffb640 1 -- 192.168.123.107:0/3086775685 shutdown_connections 2026-03-09T19:29:38.623 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.624+0000 7efeedffb640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efee40779b0 0x7efee4079e70 unknown 
:-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.623 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.624+0000 7efeedffb640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff10103c60 0x7eff1019a9a0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.623 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.624+0000 7efeedffb640 1 --2- 192.168.123.107:0/3086775685 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7eff10102a60 0x7eff1019a460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.623 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.624+0000 7efeedffb640 1 -- 192.168.123.107:0/3086775685 >> 192.168.123.107:0/3086775685 conn(0x7eff100fe250 msgr2=0x7eff100ffd60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:38.624 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.625+0000 7efeedffb640 1 -- 192.168.123.107:0/3086775685 shutdown_connections 2026-03-09T19:29:38.624 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.625+0000 7efeedffb640 1 -- 192.168.123.107:0/3086775685 wait complete. 
2026-03-09T19:29:38.694 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.695+0000 7fbf16efe640 1 -- 192.168.123.107:0/1327756304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf080989c0 msgr2=0x7fbf0809adb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.695+0000 7fbf16efe640 1 --2- 192.168.123.107:0/1327756304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf080989c0 0x7fbf0809adb0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fbf040098e0 tx=0x7fbf0402f1b0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.696+0000 7fbf16efe640 1 -- 192.168.123.107:0/1327756304 shutdown_connections 2026-03-09T19:29:38.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.696+0000 7fbf16efe640 1 --2- 192.168.123.107:0/1327756304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf080989c0 0x7fbf0809adb0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.696+0000 7fbf16efe640 1 --2- 192.168.123.107:0/1327756304 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbf08096090 0x7fbf08098480 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.696+0000 7fbf16efe640 1 -- 192.168.123.107:0/1327756304 >> 192.168.123.107:0/1327756304 conn(0x7fbf0808fbf0 msgr2=0x7fbf08092050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:38.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.696+0000 7fbf16efe640 1 -- 192.168.123.107:0/1327756304 shutdown_connections 2026-03-09T19:29:38.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.696+0000 7fbf16efe640 1 -- 192.168.123.107:0/1327756304 
wait complete. 2026-03-09T19:29:38.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.697+0000 7fbf16efe640 1 Processor -- start 2026-03-09T19:29:38.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.697+0000 7fbf16efe640 1 -- start start 2026-03-09T19:29:38.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.697+0000 7fbf16efe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf08096090 0x7fbf0812f180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:38.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.697+0000 7fbf16efe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbf080989c0 0x7fbf0812f6c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:38.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.697+0000 7fbf16efe640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf0812fc90 con 0x7fbf080989c0 2026-03-09T19:29:38.700 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.697+0000 7fbf16efe640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf0812fe00 con 0x7fbf08096090 2026-03-09T19:29:38.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.697+0000 7fbf156fb640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbf080989c0 0x7fbf0812f6c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:38.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.697+0000 7fbf156fb640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbf080989c0 0x7fbf0812f6c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.107:42166/0 (socket says 192.168.123.107:42166) 2026-03-09T19:29:38.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.697+0000 7fbf156fb640 1 -- 192.168.123.107:0/664220283 learned_addr learned my addr 192.168.123.107:0/664220283 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:29:38.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.698+0000 7fbf156fb640 1 -- 192.168.123.107:0/664220283 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf08096090 msgr2=0x7fbf0812f180 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T19:29:38.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.698+0000 7fbf156fb640 1 --2- 192.168.123.107:0/664220283 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf08096090 0x7fbf0812f180 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.698+0000 7fbf156fb640 1 -- 192.168.123.107:0/664220283 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbf04009590 con 0x7fbf080989c0 2026-03-09T19:29:38.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.698+0000 7fbf156fb640 1 --2- 192.168.123.107:0/664220283 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbf080989c0 0x7fbf0812f6c0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fbf04004580 tx=0x7fbf040045b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:38.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.698+0000 7fbefeffd640 1 -- 192.168.123.107:0/664220283 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf04042e30 con 0x7fbf080989c0 2026-03-09T19:29:38.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.698+0000 7fbefeffd640 1 -- 
192.168.123.107:0/664220283 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fbf0402fe90 con 0x7fbf080989c0 2026-03-09T19:29:38.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.698+0000 7fbf16efe640 1 -- 192.168.123.107:0/664220283 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbf081260d0 con 0x7fbf080989c0 2026-03-09T19:29:38.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.698+0000 7fbf16efe640 1 -- 192.168.123.107:0/664220283 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbf081265c0 con 0x7fbf080989c0 2026-03-09T19:29:38.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.699+0000 7fbf16efe640 1 -- 192.168.123.107:0/664220283 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbf0809e1e0 con 0x7fbf080989c0 2026-03-09T19:29:38.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.702+0000 7fbefeffd640 1 -- 192.168.123.107:0/664220283 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf0404a5f0 con 0x7fbf080989c0 2026-03-09T19:29:38.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.708+0000 7fbefeffd640 1 -- 192.168.123.107:0/664220283 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fbf04052050 con 0x7fbf080989c0 2026-03-09T19:29:38.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.709+0000 7fbefeffd640 1 --2- 192.168.123.107:0/664220283 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fbeec077ad0 0x7fbeec079f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:29:38.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.709+0000 7fbefeffd640 1 -- 192.168.123.107:0/664220283 <== 
mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6828+0+0 (secure 0 0 0) 0x7fbf04059080 con 0x7fbf080989c0 2026-03-09T19:29:38.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.709+0000 7fbefeffd640 1 -- 192.168.123.107:0/664220283 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbf040c7a40 con 0x7fbf080989c0 2026-03-09T19:29:38.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.709+0000 7fbf15efc640 1 --2- 192.168.123.107:0/664220283 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fbeec077ad0 0x7fbeec079f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:29:38.712 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.713+0000 7fbf15efc640 1 --2- 192.168.123.107:0/664220283 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fbeec077ad0 0x7fbeec079f90 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fbf00009b70 tx=0x7fbf00009340 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:29:38.896 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.897+0000 7fbf16efe640 1 -- 192.168.123.107:0/664220283 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fbf0809e400 con 0x7fbf080989c0 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.899+0000 7fbefeffd640 1 -- 192.168.123.107:0/664220283 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1351 (secure 0 0 0) 0x7fbf0408fb60 con 0x7fbf080989c0 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_WARN Degraded data redundancy: 2413/24048 
objects degraded (10.034%), 15 pgs degraded 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 2413/24048 objects degraded (10.034%), 15 pgs degraded 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.12 is active+recovering+degraded, acting [3,1,0] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.1 is active+recovery_wait+undersized+degraded+remapped, acting [2,4] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.3 is active+recovery_wait+undersized+degraded+remapped, acting [4,3] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.c is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.f is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.12 is active+recovery_wait+undersized+degraded+remapped, acting [1,3] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.17 is active+recovery_wait+undersized+degraded+remapped, acting [2,5] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.18 is 
active+recovery_wait+undersized+degraded+remapped, acting [2,1] 2026-03-09T19:29:38.898 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.1b is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T19:29:38.899 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.1f is active+recovery_wait+undersized+degraded+remapped, acting [2,3] 2026-03-09T19:29:38.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.903+0000 7fbefcff9640 1 -- 192.168.123.107:0/664220283 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fbeec077ad0 msgr2=0x7fbeec079f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.904+0000 7fbefcff9640 1 --2- 192.168.123.107:0/664220283 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fbeec077ad0 0x7fbeec079f90 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fbf00009b70 tx=0x7fbf00009340 comp rx=0 tx=0).stop 2026-03-09T19:29:38.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.905+0000 7fbefcff9640 1 -- 192.168.123.107:0/664220283 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbf080989c0 msgr2=0x7fbf0812f6c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:29:38.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.905+0000 7fbefcff9640 1 --2- 192.168.123.107:0/664220283 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbf080989c0 0x7fbf0812f6c0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fbf04004580 tx=0x7fbf040045b0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.905+0000 7fbefcff9640 1 -- 192.168.123.107:0/664220283 shutdown_connections 2026-03-09T19:29:38.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.906+0000 7fbefcff9640 1 --2- 192.168.123.107:0/664220283 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fbeec077ad0 0x7fbeec079f90 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.906+0000 7fbefcff9640 1 --2- 192.168.123.107:0/664220283 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fbf080989c0 0x7fbf0812f6c0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.906+0000 7fbefcff9640 1 --2- 192.168.123.107:0/664220283 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf08096090 0x7fbf0812f180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:29:38.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.906+0000 7fbefcff9640 1 -- 192.168.123.107:0/664220283 >> 192.168.123.107:0/664220283 conn(0x7fbf0808fbf0 msgr2=0x7fbf080940a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:29:38.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.907+0000 7fbefcff9640 1 -- 192.168.123.107:0/664220283 shutdown_connections 2026-03-09T19:29:38.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:29:38.908+0000 7fbefcff9640 1 -- 192.168.123.107:0/664220283 wait complete. 
2026-03-09T19:29:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:38 vm08.local ceph-mon[103420]: from='client.34152 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:38 vm08.local ceph-mon[103420]: pgmap v44: 65 pgs: 1 active+recovering+degraded, 1 active+undersized+remapped, 14 active+recovery_wait+undersized+degraded+remapped, 49 active+clean; 267 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.1 MiB/s wr, 325 op/s; 2413/24048 objects degraded (10.034%); 682 KiB/s, 31 keys/s, 8 objects/s recovering 2026-03-09T19:29:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:38 vm08.local ceph-mon[103420]: from='client.44129 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:38 vm08.local ceph-mon[103420]: osdmap e56: 6 total, 6 up, 6 in 2026-03-09T19:29:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:38 vm08.local ceph-mon[103420]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:38 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/2147054967' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:38 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/1862926854' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:29:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:38 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 2413/24048 objects degraded (10.034%), 15 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:38 vm07.local ceph-mon[111841]: from='client.34152 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:38 vm07.local ceph-mon[111841]: pgmap v44: 65 pgs: 1 active+recovering+degraded, 1 active+undersized+remapped, 14 active+recovery_wait+undersized+degraded+remapped, 49 active+clean; 267 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.1 MiB/s wr, 325 op/s; 2413/24048 objects degraded (10.034%); 682 KiB/s, 31 keys/s, 8 objects/s recovering 2026-03-09T19:29:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:38 vm07.local ceph-mon[111841]: from='client.44129 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:38 vm07.local ceph-mon[111841]: osdmap e56: 6 total, 6 up, 6 in 2026-03-09T19:29:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:38 vm07.local ceph-mon[111841]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:38 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/2147054967' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:29:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:38 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/1862926854' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:29:39.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:38 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 2413/24048 objects degraded (10.034%), 15 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:40 vm08.local ceph-mon[103420]: from='client.34170 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:40 vm08.local ceph-mon[103420]: osdmap e57: 6 total, 6 up, 6 in 2026-03-09T19:29:40.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:40 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/664220283' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:29:40.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:40 vm07.local ceph-mon[111841]: from='client.34170 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:29:40.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:40 vm07.local ceph-mon[111841]: osdmap e57: 6 total, 6 up, 6 in 2026-03-09T19:29:40.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:40 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/664220283' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:29:41.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:41 vm07.local ceph-mon[111841]: pgmap v47: 65 pgs: 1 active+recovering+degraded, 1 active+undersized+remapped, 14 active+recovery_wait+undersized+degraded+remapped, 49 active+clean; 267 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 0 B/s wr, 2 op/s; 2413/24048 objects degraded (10.034%) 2026-03-09T19:29:41.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:41 vm07.local ceph-mon[111841]: osdmap e58: 6 total, 6 up, 6 in 2026-03-09T19:29:41.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:41 vm08.local ceph-mon[103420]: pgmap v47: 65 pgs: 1 active+recovering+degraded, 1 active+undersized+remapped, 14 active+recovery_wait+undersized+degraded+remapped, 49 active+clean; 267 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 0 B/s wr, 2 op/s; 2413/24048 objects degraded (10.034%) 2026-03-09T19:29:41.346 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:41 vm08.local ceph-mon[103420]: osdmap e58: 6 total, 6 up, 6 in 2026-03-09T19:29:42.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:42 vm07.local ceph-mon[111841]: osdmap e59: 6 total, 6 up, 6 in 2026-03-09T19:29:42.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:42 vm07.local ceph-mon[111841]: pgmap v50: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 262 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.1 MiB/s wr, 416 op/s; 1773/20028 objects degraded (8.853%); 0 B/s, 0 keys/s, 135 objects/s recovering 2026-03-09T19:29:42.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:42 vm08.local ceph-mon[103420]: osdmap e59: 6 total, 6 up, 6 in 2026-03-09T19:29:42.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:42 vm08.local ceph-mon[103420]: pgmap v50: 65 pgs: 1 
active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 262 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.1 MiB/s wr, 416 op/s; 1773/20028 objects degraded (8.853%); 0 B/s, 0 keys/s, 135 objects/s recovering 2026-03-09T19:29:43.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:43 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 1773/20028 objects degraded (8.853%), 11 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:44.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:43 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 1773/20028 objects degraded (8.853%), 11 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:44.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:44 vm07.local ceph-mon[111841]: pgmap v51: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 262 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 759 KiB/s wr, 288 op/s; 1773/20028 objects degraded (8.853%); 0 B/s, 0 keys/s, 94 objects/s recovering 2026-03-09T19:29:44.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:29:45.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:44 vm08.local ceph-mon[103420]: pgmap v51: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 262 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 759 KiB/s wr, 288 op/s; 1773/20028 objects degraded (8.853%); 0 B/s, 0 keys/s, 94 objects/s recovering 2026-03-09T19:29:45.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:29:46.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:45 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:29:46.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:45 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-09T19:29:46.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:45 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:29:46.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:45 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-09T19:29:47.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:46 vm08.local ceph-mon[103420]: pgmap v52: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 40 KiB/s rd, 1.5 MiB/s wr, 467 op/s; 1773/16611 objects degraded (10.674%); 0 B/s, 0 keys/s, 86 objects/s recovering 2026-03-09T19:29:47.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:46 vm07.local ceph-mon[111841]: pgmap v52: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 40 KiB/s rd, 1.5 MiB/s wr, 467 op/s; 1773/16611 objects degraded (10.674%); 0 B/s, 0 keys/s, 86 objects/s recovering 2026-03-09T19:29:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:48 vm08.local ceph-mon[103420]: pgmap v53: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 254 MiB 
data, 3.1 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.2 MiB/s wr, 401 op/s; 1773/16470 objects degraded (10.765%); 0 B/s, 0 keys/s, 72 objects/s recovering 2026-03-09T19:29:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:48 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 1773/16470 objects degraded (10.765%), 11 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:29:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:48 vm07.local ceph-mon[111841]: pgmap v53: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.2 MiB/s wr, 401 op/s; 1773/16470 objects degraded (10.765%); 0 B/s, 0 keys/s, 72 objects/s recovering 2026-03-09T19:29:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:48 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 1773/16470 objects degraded (10.765%), 11 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:49.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:29:50.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:50 vm07.local ceph-mon[111841]: pgmap v54: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 604 KiB/s wr, 163 op/s; 1773/16470 objects degraded (10.765%); 0 B/s, 4 objects/s recovering 2026-03-09T19:29:50.595 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:50 vm08.local ceph-mon[103420]: pgmap v54: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 254 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 604 KiB/s wr, 163 op/s; 1773/16470 objects degraded (10.765%); 0 B/s, 4 objects/s recovering 2026-03-09T19:29:53.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:52 vm08.local ceph-mon[103420]: pgmap v55: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 248 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.1 MiB/s wr, 295 op/s; 1770/13077 objects degraded (13.535%); 0 B/s, 8 objects/s recovering 2026-03-09T19:29:53.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:52 vm07.local ceph-mon[111841]: pgmap v55: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 248 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.1 MiB/s wr, 295 op/s; 1770/13077 objects degraded (13.535%); 0 B/s, 8 objects/s recovering 2026-03-09T19:29:54.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:53 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 1770/13077 objects degraded (13.535%), 11 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:53 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 1770/13077 objects degraded (13.535%), 11 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:55.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:54 vm08.local ceph-mon[103420]: pgmap v56: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 248 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 945 KiB/s wr, 257 
op/s; 1770/13077 objects degraded (13.535%); 0 B/s, 7 objects/s recovering 2026-03-09T19:29:55.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:54 vm07.local ceph-mon[111841]: pgmap v56: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 248 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 945 KiB/s wr, 257 op/s; 1770/13077 objects degraded (13.535%); 0 B/s, 7 objects/s recovering 2026-03-09T19:29:56.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:56 vm08.local ceph-mon[103420]: pgmap v57: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 245 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.4 MiB/s wr, 362 op/s; 1741/10449 objects degraded (16.662%); 0 B/s, 10 objects/s recovering 2026-03-09T19:29:56.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:56 vm07.local ceph-mon[111841]: pgmap v57: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 245 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.4 MiB/s wr, 362 op/s; 1741/10449 objects degraded (16.662%); 0 B/s, 10 objects/s recovering 2026-03-09T19:29:58.311 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.1... 
2026-03-09T19:29:58.311 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.1 /home/ubuntu/cephtest/clone.client.1 2026-03-09T19:29:58.761 DEBUG:teuthology.parallel:result is None 2026-03-09T19:29:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:58 vm07.local ceph-mon[111841]: pgmap v58: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 241 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 926 KiB/s wr, 240 op/s; 1741/10326 objects degraded (16.860%); 0 B/s, 7 objects/s recovering 2026-03-09T19:29:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:58 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 1741/10326 objects degraded (16.860%), 11 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:59.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:58 vm08.local ceph-mon[103420]: pgmap v58: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 241 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 926 KiB/s wr, 240 op/s; 1741/10326 objects degraded (16.860%); 0 B/s, 7 objects/s recovering 2026-03-09T19:29:59.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:58 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 1741/10326 objects degraded (16.860%), 11 pgs degraded (PG_DEGRADED) 2026-03-09T19:29:59.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:59 vm07.local ceph-mon[111841]: osdmap e60: 6 total, 6 up, 6 in 2026-03-09T19:29:59.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:59 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:29:59.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:29:59 vm07.local 
ceph-mon[111841]: osdmap e61: 6 total, 6 up, 6 in 2026-03-09T19:30:00.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:59 vm08.local ceph-mon[103420]: osdmap e60: 6 total, 6 up, 6 in 2026-03-09T19:30:00.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:59 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:00.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:29:59 vm08.local ceph-mon[103420]: osdmap e61: 6 total, 6 up, 6 in 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pgmap v60: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 241 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.1 MiB/s wr, 284 op/s; 1741/10326 objects degraded (16.860%); 0 B/s, 8 objects/s recovering 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: Health detail: HEALTH_WARN Degraded data redundancy: 1741/10326 objects degraded (16.860%), 11 pgs degraded 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: [WRN] PG_DEGRADED: Degraded data redundancy: 1741/10326 objects degraded (16.860%), 11 pgs degraded 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pg 3.1 is active+recovering+undersized+degraded+remapped, acting [2,4] 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pg 3.f is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1] 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 
2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pg 3.17 is active+recovery_wait+undersized+degraded+remapped, acting [2,5] 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pg 3.18 is active+recovery_wait+undersized+degraded+remapped, acting [2,1] 2026-03-09T19:30:01.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pg 3.1b is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T19:30:01.173 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:00 vm07.local ceph-mon[111841]: pg 3.1f is active+recovery_wait+undersized+degraded+remapped, acting [2,3] 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pgmap v60: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 241 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.1 MiB/s wr, 284 op/s; 1741/10326 objects degraded (16.860%); 0 B/s, 8 objects/s recovering 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: Health detail: HEALTH_WARN Degraded data redundancy: 1741/10326 objects degraded (16.860%), 11 pgs degraded 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: [WRN] PG_DEGRADED: Degraded data redundancy: 1741/10326 objects degraded (16.860%), 11 pgs degraded 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pg 3.1 is active+recovering+undersized+degraded+remapped, acting [2,4] 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pg 3.f is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1] 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 
2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pg 3.17 is active+recovery_wait+undersized+degraded+remapped, acting [2,5] 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pg 3.18 is active+recovery_wait+undersized+degraded+remapped, acting [2,1] 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pg 3.1b is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T19:30:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:00 vm08.local ceph-mon[103420]: pg 3.1f is active+recovery_wait+undersized+degraded+remapped, acting [2,3] 2026-03-09T19:30:03.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:03 vm07.local ceph-mon[111841]: pgmap v62: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 236 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.4 MiB/s wr, 370 op/s; 1607/6555 objects degraded (24.516%); 0 B/s, 10 objects/s recovering 2026-03-09T19:30:03.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:03 vm08.local ceph-mon[103420]: pgmap v62: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 236 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.4 MiB/s wr, 370 op/s; 1607/6555 objects degraded (24.516%); 0 B/s, 10 objects/s recovering 2026-03-09T19:30:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:04 vm07.local ceph-mon[111841]: pgmap v63: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 236 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 743 KiB/s wr, 213 op/s; 1607/6555 objects degraded (24.516%); 0 B/s, 5 objects/s recovering 2026-03-09T19:30:04.478 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:04 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 1607/6555 objects degraded (24.516%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T19:30:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:04 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:30:04.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:04 vm08.local ceph-mon[103420]: pgmap v63: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 236 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 743 KiB/s wr, 213 op/s; 1607/6555 objects degraded (24.516%); 0 B/s, 5 objects/s recovering 2026-03-09T19:30:04.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:04 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 1607/6555 objects degraded (24.516%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T19:30:04.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:04 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:30:05.947 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.0... 
2026-03-09T19:30:05.947 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-09T19:30:06.349 DEBUG:teuthology.parallel:result is None 2026-03-09T19:30:06.349 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-09T19:30:06.395 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-09T19:30:06.395 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1 2026-03-09T19:30:06.425 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.1/client.1 2026-03-09T19:30:06.425 DEBUG:teuthology.parallel:result is None 2026-03-09T19:30:07.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:06 vm08.local ceph-mon[103420]: mgrmap e37: vm07.xacuym(active, since 92s), standbys: vm08.mxylvw 2026-03-09T19:30:07.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:06 vm08.local ceph-mon[103420]: pgmap v64: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 233 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 1.4 MiB/s wr, 399 op/s; 1607/3432 objects degraded (46.824%); 0 B/s, 11 objects/s recovering 2026-03-09T19:30:07.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:06 vm07.local ceph-mon[111841]: mgrmap e37: vm07.xacuym(active, since 92s), standbys: vm08.mxylvw 2026-03-09T19:30:07.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:06 vm07.local ceph-mon[111841]: pgmap v64: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 233 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 1.4 MiB/s wr, 399 op/s; 1607/3432 objects degraded (46.824%); 0 B/s, 11 objects/s recovering 2026-03-09T19:30:08.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.979+0000 7efcede13640 1 -- 
192.168.123.107:0/2781123827 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efce8076df0 msgr2=0x7efce8077250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:08.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.979+0000 7efcede13640 1 --2- 192.168.123.107:0/2781123827 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efce8076df0 0x7efce8077250 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7efcd4009a00 tx=0x7efcd402f280 comp rx=0 tx=0).stop 2026-03-09T19:30:08.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.980+0000 7efcede13640 1 -- 192.168.123.107:0/2781123827 shutdown_connections 2026-03-09T19:30:08.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.980+0000 7efcede13640 1 --2- 192.168.123.107:0/2781123827 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efce8076df0 0x7efce8077250 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:08.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.980+0000 7efcede13640 1 --2- 192.168.123.107:0/2781123827 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efce8075ba0 0x7efce8075fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:08.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.980+0000 7efcede13640 1 -- 192.168.123.107:0/2781123827 >> 192.168.123.107:0/2781123827 conn(0x7efce80fe250 msgr2=0x7efce8100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:08.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.980+0000 7efcede13640 1 -- 192.168.123.107:0/2781123827 shutdown_connections 2026-03-09T19:30:08.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.980+0000 7efcede13640 1 -- 192.168.123.107:0/2781123827 wait complete. 
2026-03-09T19:30:08.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.981+0000 7efcede13640 1 Processor -- start 2026-03-09T19:30:08.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.981+0000 7efcede13640 1 -- start start 2026-03-09T19:30:08.980 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.981+0000 7efcede13640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efce8075ba0 0x7efce819e900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:08.980 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.981+0000 7efcede13640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efce8076df0 0x7efce819ee40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:08.980 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.981+0000 7efcede13640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efce819f410 con 0x7efce8076df0 2026-03-09T19:30:08.980 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.981+0000 7efcede13640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efce819f580 con 0x7efce8075ba0 2026-03-09T19:30:08.980 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.981+0000 7efce77fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efce8075ba0 0x7efce819e900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:08.980 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.981+0000 7efce6ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efce8076df0 0x7efce819ee40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T19:30:08.980 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.981+0000 7efce6ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efce8076df0 0x7efce819ee40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:42734/0 (socket says 192.168.123.107:42734) 2026-03-09T19:30:08.980 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.981+0000 7efce6ffd640 1 -- 192.168.123.107:0/2882347322 learned_addr learned my addr 192.168.123.107:0/2882347322 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:08.981 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.982+0000 7efce6ffd640 1 -- 192.168.123.107:0/2882347322 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efce8075ba0 msgr2=0x7efce819e900 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:08.981 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.982+0000 7efce6ffd640 1 --2- 192.168.123.107:0/2882347322 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efce8075ba0 0x7efce819e900 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:08.981 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.982+0000 7efce6ffd640 1 -- 192.168.123.107:0/2882347322 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efcd4009660 con 0x7efce8076df0 2026-03-09T19:30:08.981 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.982+0000 7efce6ffd640 1 --2- 192.168.123.107:0/2882347322 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efce8076df0 0x7efce819ee40 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7efcd4004400 tx=0x7efcd4004430 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:08.981 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.982+0000 7efce4ff9640 1 -- 192.168.123.107:0/2882347322 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efcd4031b80 con 0x7efce8076df0 2026-03-09T19:30:08.981 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.982+0000 7efce4ff9640 1 -- 192.168.123.107:0/2882347322 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7efcd4031ce0 con 0x7efce8076df0 2026-03-09T19:30:08.982 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.983+0000 7efce4ff9640 1 -- 192.168.123.107:0/2882347322 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efcd4031280 con 0x7efce8076df0 2026-03-09T19:30:08.983 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.984+0000 7efcede13640 1 -- 192.168.123.107:0/2882347322 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efce81a3fc0 con 0x7efce8076df0 2026-03-09T19:30:08.983 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.984+0000 7efcede13640 1 -- 192.168.123.107:0/2882347322 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efce81a4370 con 0x7efce8076df0 2026-03-09T19:30:08.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.985+0000 7efcede13640 1 -- 192.168.123.107:0/2882347322 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efcac005350 con 0x7efce8076df0 2026-03-09T19:30:08.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.987+0000 7efce4ff9640 1 -- 192.168.123.107:0/2882347322 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7efcd403f070 con 0x7efce8076df0 2026-03-09T19:30:08.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.987+0000 7efce4ff9640 1 --2- 
192.168.123.107:0/2882347322 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efcbc077720 0x7efcbc079be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:08.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.987+0000 7efce77fe640 1 --2- 192.168.123.107:0/2882347322 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efcbc077720 0x7efcbc079be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:08.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.988+0000 7efce77fe640 1 --2- 192.168.123.107:0/2882347322 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efcbc077720 0x7efcbc079be0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7efcd8004500 tx=0x7efcd8009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:08.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.988+0000 7efce4ff9640 1 -- 192.168.123.107:0/2882347322 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6770+0+0 (secure 0 0 0) 0x7efcd40bea80 con 0x7efce8076df0 2026-03-09T19:30:08.987 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:08.988+0000 7efce4ff9640 1 -- 192.168.123.107:0/2882347322 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7efcd4087160 con 0x7efce8076df0 2026-03-09T19:30:09.090 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.091+0000 7efcede13640 1 -- 192.168.123.107:0/2882347322 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7efcac002bf0 con 0x7efcbc077720 2026-03-09T19:30:09.090 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:08 vm07.local ceph-mon[111841]: pgmap v65: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 233 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.3 MiB/s wr, 367 op/s; 1607/3273 objects degraded (49.099%); 0 B/s, 10 objects/s recovering 2026-03-09T19:30:09.090 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:08 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 1607/3273 objects degraded (49.099%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T19:30:09.093 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.094+0000 7efce4ff9640 1 -- 192.168.123.107:0/2882347322 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7efcac002bf0 con 0x7efcbc077720 2026-03-09T19:30:09.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:08 vm08.local ceph-mon[103420]: pgmap v65: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 233 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.3 MiB/s wr, 367 op/s; 1607/3273 objects degraded (49.099%); 0 B/s, 10 objects/s recovering 2026-03-09T19:30:09.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:08 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 1607/3273 objects degraded (49.099%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T19:30:09.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.097+0000 7efcede13640 1 -- 192.168.123.107:0/2882347322 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efcbc077720 msgr2=0x7efcbc079be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.097+0000 7efcede13640 1 --2- 192.168.123.107:0/2882347322 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efcbc077720 0x7efcbc079be0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7efcd8004500 tx=0x7efcd8009290 comp rx=0 tx=0).stop 2026-03-09T19:30:09.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.097+0000 7efcede13640 1 -- 192.168.123.107:0/2882347322 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efce8076df0 msgr2=0x7efce819ee40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.097+0000 7efcede13640 1 --2- 192.168.123.107:0/2882347322 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efce8076df0 0x7efce819ee40 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7efcd4004400 tx=0x7efcd4004430 comp rx=0 tx=0).stop 2026-03-09T19:30:09.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.098+0000 7efcede13640 1 -- 192.168.123.107:0/2882347322 shutdown_connections 2026-03-09T19:30:09.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.098+0000 7efcede13640 1 --2- 192.168.123.107:0/2882347322 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efcbc077720 0x7efcbc079be0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.098+0000 7efcede13640 1 --2- 192.168.123.107:0/2882347322 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efce8076df0 0x7efce819ee40 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.098+0000 7efcede13640 1 --2- 192.168.123.107:0/2882347322 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efce8075ba0 0x7efce819e900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T19:30:09.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.098+0000 7efcede13640 1 -- 192.168.123.107:0/2882347322 >> 192.168.123.107:0/2882347322 conn(0x7efce80fe250 msgr2=0x7efce80ffa30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:09.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.098+0000 7efcede13640 1 -- 192.168.123.107:0/2882347322 shutdown_connections 2026-03-09T19:30:09.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.098+0000 7efcede13640 1 -- 192.168.123.107:0/2882347322 wait complete. 2026-03-09T19:30:09.105 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:30:09.156 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.157+0000 7fc208135640 1 -- 192.168.123.107:0/1585755892 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc200103c60 msgr2=0x7fc2001040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.156 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.157+0000 7fc208135640 1 --2- 192.168.123.107:0/1585755892 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc200103c60 0x7fc2001040e0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fc1f0009a00 tx=0x7fc1f002f280 comp rx=0 tx=0).stop 2026-03-09T19:30:09.156 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.157+0000 7fc208135640 1 -- 192.168.123.107:0/1585755892 shutdown_connections 2026-03-09T19:30:09.156 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.157+0000 7fc208135640 1 --2- 192.168.123.107:0/1585755892 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc200103c60 0x7fc2001040e0 secure :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fc1f0009a00 tx=0x7fc1f002f280 comp rx=0 tx=0).stop 2026-03-09T19:30:09.156 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.157+0000 7fc208135640 1 --2- 192.168.123.107:0/1585755892 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7fc200102a60 0x7fc200102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.156 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.157+0000 7fc208135640 1 -- 192.168.123.107:0/1585755892 >> 192.168.123.107:0/1585755892 conn(0x7fc2000fe250 msgr2=0x7fc200100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:09.157 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.158+0000 7fc208135640 1 -- 192.168.123.107:0/1585755892 shutdown_connections 2026-03-09T19:30:09.157 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.158+0000 7fc208135640 1 -- 192.168.123.107:0/1585755892 wait complete. 2026-03-09T19:30:09.157 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.158+0000 7fc208135640 1 Processor -- start 2026-03-09T19:30:09.157 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.158+0000 7fc208135640 1 -- start start 2026-03-09T19:30:09.157 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.158+0000 7fc208135640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc200102a60 0x7fc20019a670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.157 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.159+0000 7fc208135640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc20019abb0 0x7fc20019fc20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.159+0000 7fc208135640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc20019b030 con 0x7fc200102a60 2026-03-09T19:30:09.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.159+0000 7fc208135640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc20019b1a0 con 0x7fc20019abb0 
2026-03-09T19:30:09.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.159+0000 7fc205eaa640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc200102a60 0x7fc20019a670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.159+0000 7fc205eaa640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc200102a60 0x7fc20019a670 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:42754/0 (socket says 192.168.123.107:42754) 2026-03-09T19:30:09.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.159+0000 7fc205eaa640 1 -- 192.168.123.107:0/255245189 learned_addr learned my addr 192.168.123.107:0/255245189 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:09.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.159+0000 7fc205eaa640 1 -- 192.168.123.107:0/255245189 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc20019abb0 msgr2=0x7fc20019fc20 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T19:30:09.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.159+0000 7fc205eaa640 1 --2- 192.168.123.107:0/255245189 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc20019abb0 0x7fc20019fc20 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.159+0000 7fc205eaa640 1 -- 192.168.123.107:0/255245189 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc1f0009660 con 0x7fc200102a60 2026-03-09T19:30:09.158 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.159+0000 7fc205eaa640 1 --2- 
192.168.123.107:0/255245189 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc200102a60 0x7fc20019a670 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fc1f400d900 tx=0x7fc1f400ddd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:09.159 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.160+0000 7fc1eeffd640 1 -- 192.168.123.107:0/255245189 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc1f4004490 con 0x7fc200102a60 2026-03-09T19:30:09.159 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.160+0000 7fc1eeffd640 1 -- 192.168.123.107:0/255245189 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc1f4007620 con 0x7fc200102a60 2026-03-09T19:30:09.159 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.160+0000 7fc208135640 1 -- 192.168.123.107:0/255245189 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc2001a01c0 con 0x7fc200102a60 2026-03-09T19:30:09.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.161+0000 7fc208135640 1 -- 192.168.123.107:0/255245189 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc2001a06c0 con 0x7fc200102a60 2026-03-09T19:30:09.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.161+0000 7fc1eeffd640 1 -- 192.168.123.107:0/255245189 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc1f4002e60 con 0x7fc200102a60 2026-03-09T19:30:09.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.161+0000 7fc1eeffd640 1 -- 192.168.123.107:0/255245189 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc1f4010550 con 0x7fc200102a60 2026-03-09T19:30:09.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.162+0000 7fc1eeffd640 
1 --2- 192.168.123.107:0/255245189 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc1dc077680 0x7fc1dc079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.162+0000 7fc1eeffd640 1 -- 192.168.123.107:0/255245189 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6770+0+0 (secure 0 0 0) 0x7fc1f409a280 con 0x7fc200102a60 2026-03-09T19:30:09.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.162+0000 7fc2056a9640 1 --2- 192.168.123.107:0/255245189 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc1dc077680 0x7fc1dc079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.162+0000 7fc208135640 1 -- 192.168.123.107:0/255245189 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc2001040e0 con 0x7fc200102a60 2026-03-09T19:30:09.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.165+0000 7fc2056a9640 1 --2- 192.168.123.107:0/255245189 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc1dc077680 0x7fc1dc079b40 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fc1f0002c80 tx=0x7fc1f00023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:09.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.165+0000 7fc1eeffd640 1 -- 192.168.123.107:0/255245189 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc1f409f050 con 0x7fc200102a60 2026-03-09T19:30:09.272 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.273+0000 7fc208135640 1 -- 192.168.123.107:0/255245189 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc200107ea0 con 0x7fc1dc077680 2026-03-09T19:30:09.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.274+0000 7fc1eeffd640 1 -- 192.168.123.107:0/255245189 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fc200107ea0 con 0x7fc1dc077680 2026-03-09T19:30:09.276 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.277+0000 7fc208135640 1 -- 192.168.123.107:0/255245189 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc1dc077680 msgr2=0x7fc1dc079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.276 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.277+0000 7fc208135640 1 --2- 192.168.123.107:0/255245189 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc1dc077680 0x7fc1dc079b40 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fc1f0002c80 tx=0x7fc1f00023d0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.276 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.277+0000 7fc208135640 1 -- 192.168.123.107:0/255245189 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc200102a60 msgr2=0x7fc20019a670 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.276 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.277+0000 7fc208135640 1 --2- 192.168.123.107:0/255245189 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc200102a60 0x7fc20019a670 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fc1f400d900 tx=0x7fc1f400ddd0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.276 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.277+0000 7fc208135640 1 -- 192.168.123.107:0/255245189 shutdown_connections 2026-03-09T19:30:09.276 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.277+0000 7fc208135640 1 --2- 192.168.123.107:0/255245189 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc1dc077680 0x7fc1dc079b40 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.276 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.277+0000 7fc208135640 1 --2- 192.168.123.107:0/255245189 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc20019abb0 0x7fc20019fc20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.277+0000 7fc208135640 1 --2- 192.168.123.107:0/255245189 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc200102a60 0x7fc20019a670 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.277+0000 7fc208135640 1 -- 192.168.123.107:0/255245189 >> 192.168.123.107:0/255245189 conn(0x7fc2000fe250 msgr2=0x7fc2000ffec0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:09.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.278+0000 7fc208135640 1 -- 192.168.123.107:0/255245189 shutdown_connections 2026-03-09T19:30:09.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.278+0000 7fc208135640 1 -- 192.168.123.107:0/255245189 wait complete. 
2026-03-09T19:30:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.340+0000 7f4d0345c640 1 -- 192.168.123.107:0/1420491500 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cfc069a50 msgr2=0x7f4cfc10c5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.340+0000 7f4d0345c640 1 --2- 192.168.123.107:0/1420491500 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cfc069a50 0x7f4cfc10c5d0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f4cf00099b0 tx=0x7f4cf002f220 comp rx=0 tx=0).stop 2026-03-09T19:30:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.340+0000 7f4d0345c640 1 -- 192.168.123.107:0/1420491500 shutdown_connections 2026-03-09T19:30:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.340+0000 7f4d0345c640 1 --2- 192.168.123.107:0/1420491500 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cfc069a50 0x7f4cfc10c5d0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.340+0000 7f4d0345c640 1 --2- 192.168.123.107:0/1420491500 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cfc069080 0x7f4cfc069480 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.340+0000 7f4d0345c640 1 -- 192.168.123.107:0/1420491500 >> 192.168.123.107:0/1420491500 conn(0x7f4cfc06e680 msgr2=0x7f4cfc070ac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.340+0000 7f4d0345c640 1 -- 192.168.123.107:0/1420491500 shutdown_connections 2026-03-09T19:30:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.340+0000 7f4d0345c640 1 -- 192.168.123.107:0/1420491500 
wait complete. 2026-03-09T19:30:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.341+0000 7f4d0345c640 1 Processor -- start 2026-03-09T19:30:09.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.341+0000 7f4d0345c640 1 -- start start 2026-03-09T19:30:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.341+0000 7f4d0345c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cfc069080 0x7f4cfc1a7320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.341+0000 7f4d0345c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cfc069a50 0x7f4cfc1a7860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.341+0000 7f4d0345c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cfc1a7e30 con 0x7f4cfc069a50 2026-03-09T19:30:09.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.341+0000 7f4d0345c640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cfc1a7fa0 con 0x7f4cfc069080 2026-03-09T19:30:09.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.341+0000 7f4d009d0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cfc069a50 0x7f4cfc1a7860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.341+0000 7f4d011d1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cfc069080 0x7f4cfc1a7320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.341+0000 7f4d011d1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cfc069080 0x7f4cfc1a7320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:50294/0 (socket says 192.168.123.107:50294) 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.341+0000 7f4d011d1640 1 -- 192.168.123.107:0/2622394287 learned_addr learned my addr 192.168.123.107:0/2622394287 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.342+0000 7f4d011d1640 1 -- 192.168.123.107:0/2622394287 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cfc069a50 msgr2=0x7f4cfc1a7860 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.343+0000 7f4d011d1640 1 --2- 192.168.123.107:0/2622394287 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cfc069a50 0x7f4cfc1a7860 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.343+0000 7f4d011d1640 1 -- 192.168.123.107:0/2622394287 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4cec009590 con 0x7f4cfc069080 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.343+0000 7f4d009d0640 1 --2- 192.168.123.107:0/2622394287 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cfc069a50 0x7f4cfc1a7860 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.343+0000 7f4d011d1640 1 --2- 192.168.123.107:0/2622394287 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cfc069080 0x7f4cfc1a7320 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f4cec002760 tx=0x7f4cec002c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.345+0000 7f4cea7fc640 1 -- 192.168.123.107:0/2622394287 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4cec00ecf0 con 0x7f4cfc069080 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.345+0000 7f4d0345c640 1 -- 192.168.123.107:0/2622394287 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4cf0009660 con 0x7f4cfc069080 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.345+0000 7f4d0345c640 1 -- 192.168.123.107:0/2622394287 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4cfc1ace00 con 0x7f4cfc069080 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.345+0000 7f4cea7fc640 1 -- 192.168.123.107:0/2622394287 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4cec002e90 con 0x7f4cfc069080 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.345+0000 7f4cea7fc640 1 -- 192.168.123.107:0/2622394287 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4cec018740 con 0x7f4cfc069080 2026-03-09T19:30:09.345 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.346+0000 7f4cea7fc640 1 -- 192.168.123.107:0/2622394287 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4cec016020 con 
0x7f4cfc069080 2026-03-09T19:30:09.346 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.347+0000 7f4cea7fc640 1 --2- 192.168.123.107:0/2622394287 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4ce00779b0 0x7f4ce0079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.346 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.347+0000 7f4d009d0640 1 --2- 192.168.123.107:0/2622394287 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4ce00779b0 0x7f4ce0079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.346 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.347+0000 7f4cea7fc640 1 -- 192.168.123.107:0/2622394287 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6770+0+0 (secure 0 0 0) 0x7f4cec09b200 con 0x7f4cfc069080 2026-03-09T19:30:09.346 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.348+0000 7f4d009d0640 1 --2- 192.168.123.107:0/2622394287 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4ce00779b0 0x7f4ce0079e70 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f4cf0002410 tx=0x7f4cf0005990 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:09.347 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.348+0000 7f4d0345c640 1 -- 192.168.123.107:0/2622394287 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4cc8005350 con 0x7f4cfc069080 2026-03-09T19:30:09.351 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.352+0000 7f4cea7fc640 1 -- 192.168.123.107:0/2622394287 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f4cec063830 con 0x7f4cfc069080 2026-03-09T19:30:09.457 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.458+0000 7f4d0345c640 1 -- 192.168.123.107:0/2622394287 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f4cc8002bf0 con 0x7f4ce00779b0 2026-03-09T19:30:09.462 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.463+0000 7f4cea7fc640 1 -- 192.168.123.107:0/2622394287 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7f4cc8002bf0 con 0x7f4ce00779b0 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (7m) 57s ago 7m 23.2M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (7m) 57s ago 7m 8938k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (7m) 68s ago 7m 10.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (73s) 57s ago 7m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (70s) 68s ago 7m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (7m) 57s ago 7m 88.6M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (5m) 57s ago 5m 16.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 
2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (5m) 57s ago 5m 18.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (5m) 68s ago 5m 28.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (5m) 68s ago 5m 239M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (2m) 57s ago 8m 593M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (107s) 68s ago 7m 489M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (101s) 57s ago 8m 56.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (88s) 68s ago 7m 46.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (7m) 57s ago 7m 14.3M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (7m) 68s ago 7m 16.6M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (60s) 57s ago 6m 30.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (6m) 57s ago 6m 394M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (6m) 57s ago 6m 318M 4096M 18.2.7-1055-gab47f43c 
b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (6m) 68s ago 6m 441M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (6m) 68s ago 6m 446M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (6m) 68s ago 6m 348M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:30:09.463 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (109s) 57s ago 7m 44.5M - 2.43.0 a07b618ecd1d c09450c20f5f 2026-03-09T19:30:09.465 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.467+0000 7f4d0345c640 1 -- 192.168.123.107:0/2622394287 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4ce00779b0 msgr2=0x7f4ce0079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.465 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.467+0000 7f4d0345c640 1 --2- 192.168.123.107:0/2622394287 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4ce00779b0 0x7f4ce0079e70 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f4cf0002410 tx=0x7f4cf0005990 comp rx=0 tx=0).stop 2026-03-09T19:30:09.466 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.467+0000 7f4d0345c640 1 -- 192.168.123.107:0/2622394287 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cfc069080 msgr2=0x7f4cfc1a7320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.466 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.467+0000 7f4d0345c640 1 --2- 192.168.123.107:0/2622394287 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cfc069080 0x7f4cfc1a7320 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f4cec002760 tx=0x7f4cec002c30 comp rx=0 tx=0).stop 
2026-03-09T19:30:09.466 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.467+0000 7f4d0345c640 1 -- 192.168.123.107:0/2622394287 shutdown_connections 2026-03-09T19:30:09.466 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.467+0000 7f4d0345c640 1 --2- 192.168.123.107:0/2622394287 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4ce00779b0 0x7f4ce0079e70 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.466 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.467+0000 7f4d0345c640 1 --2- 192.168.123.107:0/2622394287 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cfc069a50 0x7f4cfc1a7860 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.466 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.467+0000 7f4d0345c640 1 --2- 192.168.123.107:0/2622394287 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cfc069080 0x7f4cfc1a7320 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.466 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.467+0000 7f4d0345c640 1 -- 192.168.123.107:0/2622394287 >> 192.168.123.107:0/2622394287 conn(0x7f4cfc06e680 msgr2=0x7f4cfc10a800 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:09.466 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.467+0000 7f4d0345c640 1 -- 192.168.123.107:0/2622394287 shutdown_connections 2026-03-09T19:30:09.466 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.467+0000 7f4d0345c640 1 -- 192.168.123.107:0/2622394287 wait complete. 
2026-03-09T19:30:09.524 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.525+0000 7f91d6caa640 1 -- 192.168.123.107:0/1674997735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91d0073b40 msgr2=0x7f91d0073fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.524 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.525+0000 7f91d6caa640 1 --2- 192.168.123.107:0/1674997735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91d0073b40 0x7f91d0073fa0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f91c0009a00 tx=0x7f91c002f290 comp rx=0 tx=0).stop 2026-03-09T19:30:09.524 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.525+0000 7f91d6caa640 1 -- 192.168.123.107:0/1674997735 shutdown_connections 2026-03-09T19:30:09.524 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.525+0000 7f91d6caa640 1 --2- 192.168.123.107:0/1674997735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91d0073b40 0x7f91d0073fa0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.524 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.525+0000 7f91d6caa640 1 --2- 192.168.123.107:0/1674997735 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91d00751a0 0x7f91d0073600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.524 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.525+0000 7f91d6caa640 1 -- 192.168.123.107:0/1674997735 >> 192.168.123.107:0/1674997735 conn(0x7f91d00fbfb0 msgr2=0x7f91d00fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:09.524 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.525+0000 7f91d6caa640 1 -- 192.168.123.107:0/1674997735 shutdown_connections 2026-03-09T19:30:09.524 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.525+0000 7f91d6caa640 1 -- 192.168.123.107:0/1674997735 
wait complete. 2026-03-09T19:30:09.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.526+0000 7f91d6caa640 1 Processor -- start 2026-03-09T19:30:09.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.526+0000 7f91d6caa640 1 -- start start 2026-03-09T19:30:09.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.526+0000 7f91d6caa640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91d0073b40 0x7f91d019a4d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.526+0000 7f91d6caa640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91d00751a0 0x7f91d019aa10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.525 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.526+0000 7f91d6caa640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91d019afe0 con 0x7f91d00751a0 2026-03-09T19:30:09.526 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.527+0000 7f91d4a1f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91d0073b40 0x7f91d019a4d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.526 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.527+0000 7f91c7fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91d00751a0 0x7f91d019aa10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.526 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.527+0000 7f91d4a1f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91d0073b40 0x7f91d019a4d0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:50308/0 (socket says 192.168.123.107:50308) 2026-03-09T19:30:09.526 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.527+0000 7f91d4a1f640 1 -- 192.168.123.107:0/2069766050 learned_addr learned my addr 192.168.123.107:0/2069766050 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:09.526 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.527+0000 7f91d6caa640 1 -- 192.168.123.107:0/2069766050 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91d019b150 con 0x7f91d0073b40 2026-03-09T19:30:09.526 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.527+0000 7f91c7fff640 1 -- 192.168.123.107:0/2069766050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91d0073b40 msgr2=0x7f91d019a4d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.526 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.527+0000 7f91c7fff640 1 --2- 192.168.123.107:0/2069766050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91d0073b40 0x7f91d019a4d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.526 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.528+0000 7f91c7fff640 1 -- 192.168.123.107:0/2069766050 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f91c0009660 con 0x7f91d00751a0 2026-03-09T19:30:09.527 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.528+0000 7f91c7fff640 1 --2- 192.168.123.107:0/2069766050 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91d00751a0 0x7f91d019aa10 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f91c0002bf0 tx=0x7f91c0031d40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-09T19:30:09.528 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.528+0000 7f91c5ffb640 1 -- 192.168.123.107:0/2069766050 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91c0031ee0 con 0x7f91d00751a0 2026-03-09T19:30:09.528 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.528+0000 7f91c5ffb640 1 -- 192.168.123.107:0/2069766050 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f91c00043d0 con 0x7f91d00751a0 2026-03-09T19:30:09.528 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.528+0000 7f91c5ffb640 1 -- 192.168.123.107:0/2069766050 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91c0031280 con 0x7f91d00751a0 2026-03-09T19:30:09.528 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.528+0000 7f91d6caa640 1 -- 192.168.123.107:0/2069766050 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f91d019fb90 con 0x7f91d00751a0 2026-03-09T19:30:09.528 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.528+0000 7f91d6caa640 1 -- 192.168.123.107:0/2069766050 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f91d01005a0 con 0x7f91d00751a0 2026-03-09T19:30:09.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.530+0000 7f91c5ffb640 1 -- 192.168.123.107:0/2069766050 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f91c003f070 con 0x7f91d00751a0 2026-03-09T19:30:09.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.530+0000 7f91d6caa640 1 -- 192.168.123.107:0/2069766050 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9198005350 con 0x7f91d00751a0 2026-03-09T19:30:09.532 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.530+0000 7f91c5ffb640 1 --2- 192.168.123.107:0/2069766050 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f91ac0778e0 0x7f91ac079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.531+0000 7f91c5ffb640 1 -- 192.168.123.107:0/2069766050 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6770+0+0 (secure 0 0 0) 0x7f91c00be520 con 0x7f91d00751a0 2026-03-09T19:30:09.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.533+0000 7f91d4a1f640 1 --2- 192.168.123.107:0/2069766050 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f91ac0778e0 0x7f91ac079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.534+0000 7f91d4a1f640 1 --2- 192.168.123.107:0/2069766050 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f91ac0778e0 0x7f91ac079da0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f91b8004660 tx=0x7f91b8009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:09.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.533+0000 7f91c5ffb640 1 -- 192.168.123.107:0/2069766050 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f91c0086c00 con 0x7f91d00751a0 2026-03-09T19:30:09.671 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.670+0000 7f91d6caa640 1 -- 192.168.123.107:0/2069766050 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f9198005e10 con 
0x7f91d00751a0 2026-03-09T19:30:09.671 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.671+0000 7f91c5ffb640 1 -- 192.168.123.107:0/2069766050 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+844 (secure 0 0 0) 0x7f91c0086350 con 0x7f91d00751a0 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 5, 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c 
(ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 9, 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:30:09.672 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:30:09.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.676+0000 7f91d6caa640 1 -- 192.168.123.107:0/2069766050 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f91ac0778e0 msgr2=0x7f91ac079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.676+0000 7f91d6caa640 1 --2- 192.168.123.107:0/2069766050 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f91ac0778e0 0x7f91ac079da0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f91b8004660 tx=0x7f91b8009290 comp rx=0 tx=0).stop 2026-03-09T19:30:09.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.676+0000 7f91d6caa640 1 -- 192.168.123.107:0/2069766050 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91d00751a0 msgr2=0x7f91d019aa10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.676+0000 7f91d6caa640 1 --2- 192.168.123.107:0/2069766050 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91d00751a0 0x7f91d019aa10 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f91c0002bf0 tx=0x7f91c0031d40 comp rx=0 tx=0).stop 2026-03-09T19:30:09.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.676+0000 7f91d6caa640 1 -- 192.168.123.107:0/2069766050 shutdown_connections 2026-03-09T19:30:09.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.676+0000 7f91d6caa640 1 --2- 192.168.123.107:0/2069766050 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f91ac0778e0 0x7f91ac079da0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.676+0000 7f91d6caa640 1 --2- 192.168.123.107:0/2069766050 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91d00751a0 0x7f91d019aa10 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.676+0000 7f91d6caa640 1 --2- 192.168.123.107:0/2069766050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91d0073b40 0x7f91d019a4d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.676+0000 7f91d6caa640 1 -- 192.168.123.107:0/2069766050 >> 192.168.123.107:0/2069766050 conn(0x7f91d00fbfb0 msgr2=0x7f91d00fd770 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:09.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.677+0000 7f91d6caa640 1 -- 192.168.123.107:0/2069766050 shutdown_connections 2026-03-09T19:30:09.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.677+0000 7f91d6caa640 1 -- 192.168.123.107:0/2069766050 wait complete. 
2026-03-09T19:30:09.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.734+0000 7fb18ecb8640 1 -- 192.168.123.107:0/1446467954 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb188103c80 msgr2=0x7fb188104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.734+0000 7fb18ecb8640 1 --2- 192.168.123.107:0/1446467954 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb188103c80 0x7fb188104100 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb178009a00 tx=0x7fb17802f290 comp rx=0 tx=0).stop 2026-03-09T19:30:09.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.734+0000 7fb18ecb8640 1 -- 192.168.123.107:0/1446467954 shutdown_connections 2026-03-09T19:30:09.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.734+0000 7fb18ecb8640 1 --2- 192.168.123.107:0/1446467954 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb188103c80 0x7fb188104100 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.734+0000 7fb18ecb8640 1 --2- 192.168.123.107:0/1446467954 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb188102a80 0x7fb188102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.734+0000 7fb18ecb8640 1 -- 192.168.123.107:0/1446467954 >> 192.168.123.107:0/1446467954 conn(0x7fb1880fe250 msgr2=0x7fb188100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:09.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.734+0000 7fb18ecb8640 1 -- 192.168.123.107:0/1446467954 shutdown_connections 2026-03-09T19:30:09.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.735+0000 7fb18ecb8640 1 -- 192.168.123.107:0/1446467954 
wait complete. 2026-03-09T19:30:09.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.735+0000 7fb18ecb8640 1 Processor -- start 2026-03-09T19:30:09.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.735+0000 7fb18ecb8640 1 -- start start 2026-03-09T19:30:09.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.736+0000 7fb18ecb8640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb188102a80 0x7fb18819a720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.736+0000 7fb18ecb8640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb18819ac60 0x7fb18819fcd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.736+0000 7fb18ecb8640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb18819b0e0 con 0x7fb188102a80 2026-03-09T19:30:09.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.736+0000 7fb18ecb8640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb18819b250 con 0x7fb18819ac60 2026-03-09T19:30:09.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.736+0000 7fb18ca2d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb188102a80 0x7fb18819a720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.736+0000 7fb17ffff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb18819ac60 0x7fb18819fcd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T19:30:09.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.737+0000 7fb17ffff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb18819ac60 0x7fb18819fcd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:50322/0 (socket says 192.168.123.107:50322) 2026-03-09T19:30:09.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.737+0000 7fb17ffff640 1 -- 192.168.123.107:0/295758432 learned_addr learned my addr 192.168.123.107:0/295758432 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:09.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.737+0000 7fb17ffff640 1 -- 192.168.123.107:0/295758432 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb188102a80 msgr2=0x7fb18819a720 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.737+0000 7fb17ffff640 1 --2- 192.168.123.107:0/295758432 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb188102a80 0x7fb18819a720 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.737+0000 7fb17ffff640 1 -- 192.168.123.107:0/295758432 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb178009660 con 0x7fb18819ac60 2026-03-09T19:30:09.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.737+0000 7fb18ca2d640 1 --2- 192.168.123.107:0/295758432 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb188102a80 0x7fb18819a720 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-09T19:30:09.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.738+0000 7fb17ffff640 1 --2- 192.168.123.107:0/295758432 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb18819ac60 0x7fb18819fcd0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fb17802f7a0 tx=0x7fb178031d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:09.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.738+0000 7fb17dffb640 1 -- 192.168.123.107:0/295758432 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb17802faf0 con 0x7fb18819ac60 2026-03-09T19:30:09.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.738+0000 7fb18ecb8640 1 -- 192.168.123.107:0/295758432 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb1881a0210 con 0x7fb18819ac60 2026-03-09T19:30:09.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.738+0000 7fb18ecb8640 1 -- 192.168.123.107:0/295758432 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb1881a0730 con 0x7fb18819ac60 2026-03-09T19:30:09.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.739+0000 7fb17dffb640 1 -- 192.168.123.107:0/295758432 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb17802fc50 con 0x7fb18819ac60 2026-03-09T19:30:09.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.739+0000 7fb17dffb640 1 -- 192.168.123.107:0/295758432 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb178038710 con 0x7fb18819ac60 2026-03-09T19:30:09.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.740+0000 7fb17dffb640 1 -- 192.168.123.107:0/295758432 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb178048050 con 
0x7fb18819ac60 2026-03-09T19:30:09.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.740+0000 7fb18ecb8640 1 -- 192.168.123.107:0/295758432 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb18810b750 con 0x7fb18819ac60 2026-03-09T19:30:09.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.740+0000 7fb17dffb640 1 --2- 192.168.123.107:0/295758432 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb160077890 0x7fb160079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.741+0000 7fb18ca2d640 1 --2- 192.168.123.107:0/295758432 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb160077890 0x7fb160079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.741+0000 7fb17dffb640 1 -- 192.168.123.107:0/295758432 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6770+0+0 (secure 0 0 0) 0x7fb1780be970 con 0x7fb18819ac60 2026-03-09T19:30:09.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.741+0000 7fb18ca2d640 1 --2- 192.168.123.107:0/295758432 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb160077890 0x7fb160079d50 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fb188103ae0 tx=0x7fb170009210 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:09.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.743+0000 7fb17dffb640 1 -- 192.168.123.107:0/295758432 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7fb178087f70 con 0x7fb18819ac60 2026-03-09T19:30:09.867 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.868+0000 7fb18ecb8640 1 -- 192.168.123.107:0/295758432 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb18810b940 con 0x7fb18819ac60 2026-03-09T19:30:09.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.869+0000 7fb17dffb640 1 -- 192.168.123.107:0/295758432 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1926 (secure 0 0 0) 0x7fb17803d7a0 con 0x7fb18819ac60 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:modified 
2026-03-09T19:24:32.867256+0000 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:30:09.869 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:30:09.869 
INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:30:09.870 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:30:09.870 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 0 members: 2026-03-09T19:30:09.870 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:30:09.870 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:30:09.870 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:30:09.870 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:30:09.870 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:30:09.870 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:30:09.870 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:30:09.870 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:30:09.870 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:30:09.871 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.873+0000 7fb18ecb8640 1 -- 192.168.123.107:0/295758432 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb160077890 msgr2=0x7fb160079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.873+0000 
7fb18ecb8640 1 --2- 192.168.123.107:0/295758432 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb160077890 0x7fb160079d50 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fb188103ae0 tx=0x7fb170009210 comp rx=0 tx=0).stop 2026-03-09T19:30:09.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.873+0000 7fb18ecb8640 1 -- 192.168.123.107:0/295758432 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb18819ac60 msgr2=0x7fb18819fcd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.873+0000 7fb18ecb8640 1 --2- 192.168.123.107:0/295758432 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb18819ac60 0x7fb18819fcd0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fb17802f7a0 tx=0x7fb178031d40 comp rx=0 tx=0).stop 2026-03-09T19:30:09.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.873+0000 7fb18ecb8640 1 -- 192.168.123.107:0/295758432 shutdown_connections 2026-03-09T19:30:09.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.873+0000 7fb18ecb8640 1 --2- 192.168.123.107:0/295758432 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb160077890 0x7fb160079d50 secure :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fb188103ae0 tx=0x7fb170009210 comp rx=0 tx=0).stop 2026-03-09T19:30:09.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.873+0000 7fb18ecb8640 1 --2- 192.168.123.107:0/295758432 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb18819ac60 0x7fb18819fcd0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.873+0000 7fb18ecb8640 1 --2- 192.168.123.107:0/295758432 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb188102a80 0x7fb18819a720 unknown :-1 s=CLOSED pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.873+0000 7fb18ecb8640 1 -- 192.168.123.107:0/295758432 >> 192.168.123.107:0/295758432 conn(0x7fb1880fe250 msgr2=0x7fb1880ffb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:09.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.873+0000 7fb18ecb8640 1 -- 192.168.123.107:0/295758432 shutdown_connections 2026-03-09T19:30:09.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.873+0000 7fb18ecb8640 1 -- 192.168.123.107:0/295758432 wait complete. 2026-03-09T19:30:09.933 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:09 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/2069766050' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:30:09.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.934+0000 7f903638f640 1 -- 192.168.123.107:0/4285765090 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9030105d80 msgr2=0x7f9030108170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.934+0000 7f903638f640 1 --2- 192.168.123.107:0/4285765090 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9030105d80 0x7f9030108170 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f901c0099b0 tx=0x7f901c02f240 comp rx=0 tx=0).stop 2026-03-09T19:30:09.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.939+0000 7f903638f640 1 -- 192.168.123.107:0/4285765090 shutdown_connections 2026-03-09T19:30:09.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.939+0000 7f903638f640 1 --2- 192.168.123.107:0/4285765090 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9030105d80 0x7f9030108170 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.938 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.939+0000 7f903638f640 1 --2- 192.168.123.107:0/4285765090 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9030069930 0x7f9030105840 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.939+0000 7f903638f640 1 -- 192.168.123.107:0/4285765090 >> 192.168.123.107:0/4285765090 conn(0x7f90300fae50 msgr2=0x7f90300fd290 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:09.938 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.939+0000 7f903638f640 1 -- 192.168.123.107:0/4285765090 shutdown_connections 2026-03-09T19:30:09.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.940+0000 7f903638f640 1 -- 192.168.123.107:0/4285765090 wait complete. 2026-03-09T19:30:09.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.940+0000 7f903638f640 1 Processor -- start 2026-03-09T19:30:09.939 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.940+0000 7f903638f640 1 -- start start 2026-03-09T19:30:09.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.941+0000 7f903638f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9030069930 0x7f903019a430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.941+0000 7f903638f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9030105d80 0x7f903019a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.941+0000 7f903638f640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f903019af40 con 0x7f9030069930 2026-03-09T19:30:09.940 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.941+0000 7f903638f640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f903019b0b0 con 0x7f9030105d80 2026-03-09T19:30:09.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.941+0000 7f902ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9030069930 0x7f903019a430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.941+0000 7f902ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9030069930 0x7f903019a430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:42834/0 (socket says 192.168.123.107:42834) 2026-03-09T19:30:09.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.941+0000 7f902ffff640 1 -- 192.168.123.107:0/3563221457 learned_addr learned my addr 192.168.123.107:0/3563221457 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:09.940 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.942+0000 7f902f7fe640 1 --2- 192.168.123.107:0/3563221457 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9030105d80 0x7f903019a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.942+0000 7f902ffff640 1 -- 192.168.123.107:0/3563221457 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9030105d80 msgr2=0x7f903019a970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:09.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.942+0000 7f902ffff640 1 --2- 
192.168.123.107:0/3563221457 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9030105d80 0x7f903019a970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:09.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.942+0000 7f902ffff640 1 -- 192.168.123.107:0/3563221457 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f901c009660 con 0x7f9030069930 2026-03-09T19:30:09.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.942+0000 7f902ffff640 1 --2- 192.168.123.107:0/3563221457 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9030069930 0x7f903019a430 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f902000e9c0 tx=0x7f902000ee90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:09.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.942+0000 7f902d7fa640 1 -- 192.168.123.107:0/3563221457 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f902000cd60 con 0x7f9030069930 2026-03-09T19:30:09.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.942+0000 7f902d7fa640 1 -- 192.168.123.107:0/3563221457 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f902000cec0 con 0x7f9030069930 2026-03-09T19:30:09.943 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.942+0000 7f902d7fa640 1 -- 192.168.123.107:0/3563221457 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9020010640 con 0x7f9030069930 2026-03-09T19:30:09.943 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.942+0000 7f903638f640 1 -- 192.168.123.107:0/3563221457 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f903019fb50 con 0x7f9030069930 2026-03-09T19:30:09.943 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.943+0000 7f903638f640 1 -- 192.168.123.107:0/3563221457 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f90301a00a0 con 0x7f9030069930 2026-03-09T19:30:09.943 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.944+0000 7f903638f640 1 -- 192.168.123.107:0/3563221457 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8ff4005350 con 0x7f9030069930 2026-03-09T19:30:09.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.945+0000 7f902d7fa640 1 -- 192.168.123.107:0/3563221457 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9020002900 con 0x7f9030069930 2026-03-09T19:30:09.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.946+0000 7f902d7fa640 1 --2- 192.168.123.107:0/3563221457 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f90040779b0 0x7f9004079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:09.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.946+0000 7f902d7fa640 1 -- 192.168.123.107:0/3563221457 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6770+0+0 (secure 0 0 0) 0x7f902001d070 con 0x7f9030069930 2026-03-09T19:30:09.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.946+0000 7f902f7fe640 1 --2- 192.168.123.107:0/3563221457 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f90040779b0 0x7f9004079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:09.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.947+0000 7f902f7fe640 1 --2- 192.168.123.107:0/3563221457 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f90040779b0 0x7f9004079e70 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f901c002c30 tx=0x7f901c03a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:09.948 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:09.949+0000 7f902d7fa640 1 -- 192.168.123.107:0/3563221457 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9020061fa0 con 0x7f9030069930 2026-03-09T19:30:10.055 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.056+0000 7f903638f640 1 -- 192.168.123.107:0/3563221457 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8ff4002bf0 con 0x7f90040779b0 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.058+0000 7f902d7fa640 1 -- 192.168.123.107:0/3563221457 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f8ff4002bf0 con 0x7f90040779b0 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [ 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout: "mgr", 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout: "crash", 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout: "mon" 
2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout: ], 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "7/23 daemons upgraded", 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:30:10.057 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:30:10.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.061+0000 7f903638f640 1 -- 192.168.123.107:0/3563221457 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f90040779b0 msgr2=0x7f9004079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:10.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.061+0000 7f903638f640 1 --2- 192.168.123.107:0/3563221457 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f90040779b0 0x7f9004079e70 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f901c002c30 tx=0x7f901c03a040 comp rx=0 tx=0).stop 2026-03-09T19:30:10.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.061+0000 7f903638f640 1 -- 192.168.123.107:0/3563221457 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9030069930 msgr2=0x7f903019a430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:10.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.061+0000 7f903638f640 1 --2- 192.168.123.107:0/3563221457 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9030069930 0x7f903019a430 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f902000e9c0 tx=0x7f902000ee90 comp rx=0 tx=0).stop 2026-03-09T19:30:10.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.061+0000 7f903638f640 1 -- 192.168.123.107:0/3563221457 shutdown_connections 2026-03-09T19:30:10.060 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.061+0000 7f903638f640 1 --2- 192.168.123.107:0/3563221457 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f90040779b0 0x7f9004079e70 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:10.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.062+0000 7f903638f640 1 --2- 192.168.123.107:0/3563221457 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9030105d80 0x7f903019a970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:10.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.062+0000 7f903638f640 1 --2- 192.168.123.107:0/3563221457 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9030069930 0x7f903019a430 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:10.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.062+0000 7f903638f640 1 -- 192.168.123.107:0/3563221457 >> 192.168.123.107:0/3563221457 conn(0x7f90300fae50 msgr2=0x7f90300fd290 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:10.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.062+0000 7f903638f640 1 -- 192.168.123.107:0/3563221457 shutdown_connections 2026-03-09T19:30:10.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.062+0000 7f903638f640 1 -- 192.168.123.107:0/3563221457 wait complete. 
2026-03-09T19:30:10.123 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.124+0000 7f5597647640 1 -- 192.168.123.107:0/3021558865 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55900ff5a0 msgr2=0x7f55900ffa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:10.123 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.124+0000 7f5597647640 1 --2- 192.168.123.107:0/3021558865 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55900ff5a0 0x7f55900ffa00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f55880099e0 tx=0x7f558802f280 comp rx=0 tx=0).stop 2026-03-09T19:30:10.123 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.124+0000 7f5597647640 1 -- 192.168.123.107:0/3021558865 shutdown_connections 2026-03-09T19:30:10.123 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.124+0000 7f5597647640 1 --2- 192.168.123.107:0/3021558865 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55900ff5a0 0x7f55900ffa00 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:10.123 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.124+0000 7f5597647640 1 --2- 192.168.123.107:0/3021558865 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f55900fe4b0 0x7f55900fe8b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:10.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.124+0000 7f5597647640 1 -- 192.168.123.107:0/3021558865 >> 192.168.123.107:0/3021558865 conn(0x7f55900f9f80 msgr2=0x7f55900fc3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:10.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.125+0000 7f5597647640 1 -- 192.168.123.107:0/3021558865 shutdown_connections 2026-03-09T19:30:10.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.125+0000 7f5597647640 1 -- 192.168.123.107:0/3021558865 
wait complete. 2026-03-09T19:30:10.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.125+0000 7f5597647640 1 Processor -- start 2026-03-09T19:30:10.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.125+0000 7f5597647640 1 -- start start 2026-03-09T19:30:10.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.126+0000 7f5597647640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f55900fe4b0 0x7f559019a2a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:10.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.126+0000 7f5597647640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55900ff5a0 0x7f559019a7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:10.125 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.126+0000 7f5597647640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f559019adb0 con 0x7f55900fe4b0 2026-03-09T19:30:10.125 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.126+0000 7f5597647640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f559019af20 con 0x7f55900ff5a0 2026-03-09T19:30:10.125 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.126+0000 7f5596645640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f55900fe4b0 0x7f559019a2a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:10.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.126+0000 7f5596645640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f55900fe4b0 0x7f559019a2a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.107:42854/0 (socket says 192.168.123.107:42854) 2026-03-09T19:30:10.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.126+0000 7f5596645640 1 -- 192.168.123.107:0/876013047 learned_addr learned my addr 192.168.123.107:0/876013047 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:10.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.126+0000 7f5595e44640 1 --2- 192.168.123.107:0/876013047 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55900ff5a0 0x7f559019a7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:10.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.126+0000 7f5596645640 1 -- 192.168.123.107:0/876013047 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55900ff5a0 msgr2=0x7f559019a7e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:10.129 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.126+0000 7f5596645640 1 --2- 192.168.123.107:0/876013047 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55900ff5a0 0x7f559019a7e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:10.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.126+0000 7f5596645640 1 -- 192.168.123.107:0/876013047 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f557c009590 con 0x7f55900fe4b0 2026-03-09T19:30:10.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.138+0000 7f5596645640 1 --2- 192.168.123.107:0/876013047 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f55900fe4b0 0x7f559019a2a0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f557c0029c0 tx=0x7f557c002e90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:30:10.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.139+0000 7f557b7fe640 1 -- 192.168.123.107:0/876013047 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f557c00ebd0 con 0x7f55900fe4b0 2026-03-09T19:30:10.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.139+0000 7f557b7fe640 1 -- 192.168.123.107:0/876013047 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f557c00ed30 con 0x7f55900fe4b0 2026-03-09T19:30:10.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.139+0000 7f557b7fe640 1 -- 192.168.123.107:0/876013047 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f557c00f710 con 0x7f55900fe4b0 2026-03-09T19:30:10.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.139+0000 7f5597647640 1 -- 192.168.123.107:0/876013047 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5588009660 con 0x7f55900fe4b0 2026-03-09T19:30:10.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.139+0000 7f5597647640 1 -- 192.168.123.107:0/876013047 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f559019fd10 con 0x7f55900fe4b0 2026-03-09T19:30:10.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.140+0000 7f5597647640 1 -- 192.168.123.107:0/876013047 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f559010b4d0 con 0x7f55900fe4b0 2026-03-09T19:30:10.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.143+0000 7f557b7fe640 1 -- 192.168.123.107:0/876013047 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f557c016020 con 0x7f55900fe4b0 2026-03-09T19:30:10.144 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.145+0000 
7f557b7fe640 1 --2- 192.168.123.107:0/876013047 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f55640779b0 0x7f5564079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:10.144 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.145+0000 7f557b7fe640 1 -- 192.168.123.107:0/876013047 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6770+0+0 (secure 0 0 0) 0x7f557c099f80 con 0x7f55900fe4b0 2026-03-09T19:30:10.144 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.145+0000 7f557b7fe640 1 -- 192.168.123.107:0/876013047 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f557c09a360 con 0x7f55900fe4b0 2026-03-09T19:30:10.149 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.148+0000 7f5595e44640 1 --2- 192.168.123.107:0/876013047 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f55640779b0 0x7f5564079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:10.153 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.154+0000 7f5595e44640 1 --2- 192.168.123.107:0/876013047 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f55640779b0 0x7f5564079e70 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f559019b750 tx=0x7f558803a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:10.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.280+0000 7f5597647640 1 -- 192.168.123.107:0/876013047 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f559010b720 con 0x7f55900fe4b0 2026-03-09T19:30:10.280 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.281+0000 7f557b7fe640 1 -- 192.168.123.107:0/876013047 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+975 (secure 0 0 0) 0x7f557c0625b0 con 0x7f55900fe4b0 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_WARN Degraded data redundancy: 1607/3273 objects degraded (49.099%), 10 pgs degraded 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 1607/3273 objects degraded (49.099%), 10 pgs degraded 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.f is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1] 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.17 is active+recovery_wait+undersized+degraded+remapped, acting [2,5] 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.18 is active+recovery_wait+undersized+degraded+remapped, acting [2,1] 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.1b is active+recovering+undersized+degraded+remapped, acting [3,4] 2026-03-09T19:30:10.281 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.1f is 
active+recovery_wait+undersized+degraded+remapped, acting [2,3] 2026-03-09T19:30:10.282 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.283+0000 7f5597647640 1 -- 192.168.123.107:0/876013047 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f55640779b0 msgr2=0x7f5564079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:10.282 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.283+0000 7f5597647640 1 --2- 192.168.123.107:0/876013047 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f55640779b0 0x7f5564079e70 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f559019b750 tx=0x7f558803a040 comp rx=0 tx=0).stop 2026-03-09T19:30:10.282 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.284+0000 7f5597647640 1 -- 192.168.123.107:0/876013047 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f55900fe4b0 msgr2=0x7f559019a2a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:10.283 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.284+0000 7f5597647640 1 --2- 192.168.123.107:0/876013047 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f55900fe4b0 0x7f559019a2a0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f557c0029c0 tx=0x7f557c002e90 comp rx=0 tx=0).stop 2026-03-09T19:30:10.283 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.284+0000 7f5597647640 1 -- 192.168.123.107:0/876013047 shutdown_connections 2026-03-09T19:30:10.283 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.284+0000 7f5597647640 1 --2- 192.168.123.107:0/876013047 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f55640779b0 0x7f5564079e70 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:10.283 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.284+0000 7f5597647640 1 --2- 
192.168.123.107:0/876013047 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55900ff5a0 0x7f559019a7e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:10.283 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.285+0000 7f5597647640 1 --2- 192.168.123.107:0/876013047 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f55900fe4b0 0x7f559019a2a0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:10.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.285+0000 7f5597647640 1 -- 192.168.123.107:0/876013047 >> 192.168.123.107:0/876013047 conn(0x7f55900f9f80 msgr2=0x7f55900fbab0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:10.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.285+0000 7f5597647640 1 -- 192.168.123.107:0/876013047 shutdown_connections 2026-03-09T19:30:10.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:10.285+0000 7f5597647640 1 -- 192.168.123.107:0/876013047 wait complete. 2026-03-09T19:30:10.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:09 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/2069766050' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:30:11.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:10 vm07.local ceph-mon[111841]: from='client.34178 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:11.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:10 vm07.local ceph-mon[111841]: from='client.34182 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:11.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:10 vm07.local ceph-mon[111841]: from='client.44151 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:11.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:10 vm07.local ceph-mon[111841]: pgmap v66: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 233 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 30 KiB/s rd, 1.1 MiB/s wr, 325 op/s; 1607/3273 objects degraded (49.099%); 0 B/s, 8 objects/s recovering 2026-03-09T19:30:11.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:10 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/295758432' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:30:11.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:10 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/876013047' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:30:11.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:10 vm08.local ceph-mon[103420]: from='client.34178 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:11.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:10 vm08.local ceph-mon[103420]: from='client.34182 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:11.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:10 vm08.local ceph-mon[103420]: from='client.44151 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:11.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:10 vm08.local ceph-mon[103420]: pgmap v66: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 233 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 30 KiB/s rd, 1.1 MiB/s wr, 325 op/s; 1607/3273 objects degraded (49.099%); 0 B/s, 8 objects/s recovering 2026-03-09T19:30:11.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:10 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/295758432' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:30:11.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:10 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/876013047' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:30:12.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:11 vm07.local ceph-mon[111841]: from='client.34196 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:12.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:11 vm08.local ceph-mon[103420]: from='client.34196 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:13.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:12 vm07.local ceph-mon[111841]: pgmap v67: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 222 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 567 KiB/s wr, 231 op/s; 1607/300 objects degraded (535.667%); 0 B/s, 10 objects/s recovering 2026-03-09T19:30:13.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:12 vm08.local ceph-mon[103420]: pgmap v67: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 222 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 567 KiB/s wr, 231 op/s; 1607/300 objects degraded (535.667%); 0 B/s, 10 objects/s recovering 2026-03-09T19:30:14.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:13 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 1607/300 objects degraded (535.667%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T19:30:14.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:13 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 1607/300 objects degraded (535.667%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T19:30:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:14 vm07.local ceph-mon[111841]: pgmap v68: 65 pgs: 1 
active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 222 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 560 KiB/s wr, 220 op/s; 1607/300 objects degraded (535.667%); 0 B/s, 7 objects/s recovering 2026-03-09T19:30:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:30:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:30:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:30:15.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:30:15.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:30:15.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:30:15.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:30:15.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:14 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:30:15.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:14 vm08.local ceph-mon[103420]: pgmap v68: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 222 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 560 KiB/s wr, 220 op/s; 1607/300 objects degraded (535.667%); 0 B/s, 7 objects/s recovering 2026-03-09T19:30:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:30:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:30:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:30:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:30:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:30:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:14 
vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:30:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:30:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:30:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:16.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:15 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:16.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:15 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-09T19:30:16.246 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:15 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:16.246 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:15 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-09T19:30:17.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:16 vm07.local ceph-mon[111841]: pgmap v69: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 594 KiB/s wr, 222 op/s; 1560/291 objects degraded (536.082%); 0 B/s, 11 objects/s recovering 2026-03-09T19:30:17.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:16 vm08.local ceph-mon[103420]: pgmap v69: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 594 KiB/s wr, 222 op/s; 1560/291 objects degraded (536.082%); 0 B/s, 11 objects/s recovering 2026-03-09T19:30:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:18 vm07.local ceph-mon[111841]: pgmap v70: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 8.2 KiB/s rd, 145 KiB/s wr, 94 op/s; 1560/291 objects degraded (536.082%); 0 B/s, 7 objects/s recovering 2026-03-09T19:30:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:18 vm07.local ceph-mon[111841]: osdmap e62: 6 total, 6 up, 6 in 2026-03-09T19:30:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:18 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 1560/291 objects degraded (536.082%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T19:30:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:18 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:30:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:18 vm08.local ceph-mon[103420]: pgmap v70: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 8.2 KiB/s rd, 145 KiB/s wr, 94 op/s; 1560/291 objects degraded (536.082%); 0 B/s, 7 objects/s recovering 2026-03-09T19:30:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:18 vm08.local ceph-mon[103420]: osdmap e62: 6 total, 6 up, 6 in 2026-03-09T19:30:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:18 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 1560/291 objects degraded (536.082%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T19:30:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:30:20.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:19 vm08.local ceph-mon[103420]: osdmap e63: 6 total, 6 up, 6 in 2026-03-09T19:30:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:19 vm07.local ceph-mon[111841]: osdmap e63: 6 total, 6 up, 6 in 2026-03-09T19:30:21.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:20 vm08.local ceph-mon[103420]: pgmap v73: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 255 B/s rd, 52 KiB/s wr, 2 op/s; 1560/291 objects degraded (536.082%); 0 B/s, 6 objects/s recovering 2026-03-09T19:30:21.356 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:20 vm07.local ceph-mon[111841]: pgmap v73: 65 pgs: 1 
active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 255 B/s rd, 52 KiB/s wr, 2 op/s; 1560/291 objects degraded (536.082%); 0 B/s, 6 objects/s recovering 2026-03-09T19:30:22.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:22 vm08.local ceph-mon[103420]: pgmap v74: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 255 B/s rd, 53 KiB/s wr, 2 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 52 objects/s recovering 2026-03-09T19:30:22.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:22 vm07.local ceph-mon[111841]: pgmap v74: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 255 B/s rd, 53 KiB/s wr, 2 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 52 objects/s recovering 2026-03-09T19:30:23.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:23 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 1088/291 objects degraded (373.883%), 7 pgs degraded, 7 pgs undersized (PG_DEGRADED) 2026-03-09T19:30:24.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:23 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 1088/291 objects degraded (373.883%), 7 pgs degraded, 7 pgs undersized (PG_DEGRADED) 2026-03-09T19:30:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:24 vm07.local ceph-mon[111841]: pgmap v75: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 511 B/s wr, 0 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 46 objects/s recovering 2026-03-09T19:30:25.095 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:24 vm08.local ceph-mon[103420]: pgmap v75: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 511 B/s wr, 0 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 46 objects/s recovering 2026-03-09T19:30:26.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:26 vm07.local ceph-mon[111841]: pgmap v76: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 511 B/s wr, 0 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 52 objects/s recovering 2026-03-09T19:30:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:26 vm08.local ceph-mon[103420]: pgmap v76: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 511 B/s wr, 0 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 52 objects/s recovering 2026-03-09T19:30:28.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:28 vm07.local ceph-mon[111841]: pgmap v77: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 427 B/s wr, 0 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 44 objects/s recovering 2026-03-09T19:30:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:28 vm08.local ceph-mon[103420]: pgmap v77: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 427 B/s wr, 0 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 44 objects/s recovering 2026-03-09T19:30:30.978 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:30 vm07.local ceph-mon[111841]: pgmap v78: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 386 B/s wr, 0 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 39 objects/s recovering 2026-03-09T19:30:30.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:30 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:30.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:30 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:30.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:30 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-09T19:30:31.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:30 vm08.local ceph-mon[103420]: pgmap v78: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 386 B/s wr, 0 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 39 objects/s recovering 2026-03-09T19:30:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:30 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:30 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:30 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-09T19:30:32.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:32 vm07.local ceph-mon[111841]: pgmap v79: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 39 objects/s recovering 2026-03-09T19:30:33.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:32 vm08.local ceph-mon[103420]: pgmap v79: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 1088/291 objects degraded (373.883%); 0 B/s, 39 objects/s recovering 2026-03-09T19:30:33.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:30:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:30:34.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:34 vm07.local ceph-mon[111841]: pgmap v80: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1088/291 objects degraded (373.883%); 0 B/s, 8 objects/s recovering 2026-03-09T19:30:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:34 
vm08.local ceph-mon[103420]: pgmap v80: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1088/291 objects degraded (373.883%); 0 B/s, 8 objects/s recovering 2026-03-09T19:30:35.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:35 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 1039/291 objects degraded (357.045%), 7 pgs degraded, 7 pgs undersized (PG_DEGRADED) 2026-03-09T19:30:36.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:35 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 1039/291 objects degraded (357.045%), 7 pgs degraded, 7 pgs undersized (PG_DEGRADED) 2026-03-09T19:30:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:36 vm08.local ceph-mon[103420]: pgmap v81: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1039/291 objects degraded (357.045%); 0 B/s, 12 objects/s recovering 2026-03-09T19:30:37.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:36 vm07.local ceph-mon[111841]: pgmap v81: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1039/291 objects degraded (357.045%); 0 B/s, 12 objects/s recovering 2026-03-09T19:30:38.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:37 vm08.local ceph-mon[103420]: osdmap e64: 6 total, 6 up, 6 in 2026-03-09T19:30:38.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:37 vm07.local ceph-mon[111841]: osdmap e64: 6 total, 6 up, 6 in 2026-03-09T19:30:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:38 vm08.local ceph-mon[103420]: pgmap v83: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 
active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1039/291 objects degraded (357.045%); 0 B/s, 9 objects/s recovering 2026-03-09T19:30:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:38 vm08.local ceph-mon[103420]: osdmap e65: 6 total, 6 up, 6 in 2026-03-09T19:30:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:38 vm07.local ceph-mon[111841]: pgmap v83: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1039/291 objects degraded (357.045%); 0 B/s, 9 objects/s recovering 2026-03-09T19:30:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:38 vm07.local ceph-mon[111841]: osdmap e65: 6 total, 6 up, 6 in 2026-03-09T19:30:40.350 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.351+0000 7fca3737c640 1 -- 192.168.123.107:0/466295870 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca30102a60 msgr2=0x7fca30102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.351 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.351+0000 7fca3737c640 1 --2- 192.168.123.107:0/466295870 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca30102a60 0x7fca30102e60 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fca180099b0 tx=0x7fca1802f220 comp rx=0 tx=0).stop 2026-03-09T19:30:40.351 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.352+0000 7fca3737c640 1 -- 192.168.123.107:0/466295870 shutdown_connections 2026-03-09T19:30:40.351 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.352+0000 7fca3737c640 1 --2- 192.168.123.107:0/466295870 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca30103c60 0x7fca301040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.351 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.352+0000 7fca3737c640 1 --2- 192.168.123.107:0/466295870 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca30102a60 0x7fca30102e60 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.351 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.352+0000 7fca3737c640 1 -- 192.168.123.107:0/466295870 >> 192.168.123.107:0/466295870 conn(0x7fca300fe250 msgr2=0x7fca30100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:40.351 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.352+0000 7fca3737c640 1 -- 192.168.123.107:0/466295870 shutdown_connections 2026-03-09T19:30:40.351 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.352+0000 7fca3737c640 1 -- 192.168.123.107:0/466295870 wait complete. 2026-03-09T19:30:40.352 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.353+0000 7fca3737c640 1 Processor -- start 2026-03-09T19:30:40.352 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.353+0000 7fca3737c640 1 -- start start 2026-03-09T19:30:40.352 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.353+0000 7fca3737c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca30102a60 0x7fca3019a440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:40.352 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.353+0000 7fca3737c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca30103c60 0x7fca3019a980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:40.353 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.353+0000 7fca3737c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca3019af50 con 0x7fca30103c60 2026-03-09T19:30:40.353 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.353+0000 7fca3737c640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca3019b0c0 con 0x7fca30102a60 2026-03-09T19:30:40.353 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.354+0000 7fca350f1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca30102a60 0x7fca3019a440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.353 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.354+0000 7fca350f1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca30102a60 0x7fca3019a440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:40978/0 (socket says 192.168.123.107:40978) 2026-03-09T19:30:40.353 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.354+0000 7fca350f1640 1 -- 192.168.123.107:0/2474890471 learned_addr learned my addr 192.168.123.107:0/2474890471 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:40.353 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.354+0000 7fca350f1640 1 -- 192.168.123.107:0/2474890471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca30103c60 msgr2=0x7fca3019a980 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:30:40.354 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.354+0000 7fca348f0640 1 --2- 192.168.123.107:0/2474890471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca30103c60 0x7fca3019a980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.354 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.355+0000 7fca350f1640 1 --2- 
192.168.123.107:0/2474890471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca30103c60 0x7fca3019a980 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.354 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.355+0000 7fca350f1640 1 -- 192.168.123.107:0/2474890471 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fca20009590 con 0x7fca30102a60 2026-03-09T19:30:40.354 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.355+0000 7fca348f0640 1 --2- 192.168.123.107:0/2474890471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca30103c60 0x7fca3019a980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:30:40.354 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.356+0000 7fca350f1640 1 --2- 192.168.123.107:0/2474890471 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca30102a60 0x7fca3019a440 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fca18002c20 tx=0x7fca18002910 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:40.355 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.356+0000 7fca267fc640 1 -- 192.168.123.107:0/2474890471 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca1803d070 con 0x7fca30102a60 2026-03-09T19:30:40.356 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.356+0000 7fca3737c640 1 -- 192.168.123.107:0/2474890471 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fca18009660 con 0x7fca30102a60 2026-03-09T19:30:40.356 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.356+0000 7fca267fc640 1 -- 192.168.123.107:0/2474890471 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 
(secure 0 0 0) 0x7fca18002e20 con 0x7fca30102a60 2026-03-09T19:30:40.356 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.356+0000 7fca267fc640 1 -- 192.168.123.107:0/2474890471 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca18041740 con 0x7fca30102a60 2026-03-09T19:30:40.356 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.356+0000 7fca3737c640 1 -- 192.168.123.107:0/2474890471 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fca3019fe60 con 0x7fca30102a60 2026-03-09T19:30:40.357 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.357+0000 7fca267fc640 1 -- 192.168.123.107:0/2474890471 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fca18038730 con 0x7fca30102a60 2026-03-09T19:30:40.357 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.358+0000 7fca3737c640 1 -- 192.168.123.107:0/2474890471 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc9f8005350 con 0x7fca30102a60 2026-03-09T19:30:40.357 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.358+0000 7fca267fc640 1 --2- 192.168.123.107:0/2474890471 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fca0c077890 0x7fca0c079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:40.357 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.359+0000 7fca348f0640 1 --2- 192.168.123.107:0/2474890471 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fca0c077890 0x7fca0c079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.358 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.359+0000 7fca267fc640 1 -- 
192.168.123.107:0/2474890471 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6654+0+0 (secure 0 0 0) 0x7fca180be3c0 con 0x7fca30102a60 2026-03-09T19:30:40.358 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.359+0000 7fca348f0640 1 --2- 192.168.123.107:0/2474890471 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fca0c077890 0x7fca0c079d50 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fca3019b960 tx=0x7fca20009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:40.360 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.361+0000 7fca267fc640 1 -- 192.168.123.107:0/2474890471 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fca18086b40 con 0x7fca30102a60 2026-03-09T19:30:40.466 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.466+0000 7fca3737c640 1 -- 192.168.123.107:0/2474890471 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc9f8002bf0 con 0x7fca0c077890 2026-03-09T19:30:40.467 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.468+0000 7fca267fc640 1 -- 192.168.123.107:0/2474890471 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fc9f8002bf0 con 0x7fca0c077890 2026-03-09T19:30:40.469 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.470+0000 7fca3737c640 1 -- 192.168.123.107:0/2474890471 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fca0c077890 msgr2=0x7fca0c079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.469 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.470+0000 7fca3737c640 1 --2- 192.168.123.107:0/2474890471 
>> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fca0c077890 0x7fca0c079d50 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fca3019b960 tx=0x7fca20009290 comp rx=0 tx=0).stop 2026-03-09T19:30:40.469 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.470+0000 7fca3737c640 1 -- 192.168.123.107:0/2474890471 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca30102a60 msgr2=0x7fca3019a440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.469 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.470+0000 7fca3737c640 1 --2- 192.168.123.107:0/2474890471 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca30102a60 0x7fca3019a440 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fca18002c20 tx=0x7fca18002910 comp rx=0 tx=0).stop 2026-03-09T19:30:40.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.471+0000 7fca3737c640 1 -- 192.168.123.107:0/2474890471 shutdown_connections 2026-03-09T19:30:40.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.471+0000 7fca3737c640 1 --2- 192.168.123.107:0/2474890471 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fca0c077890 0x7fca0c079d50 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.471+0000 7fca3737c640 1 --2- 192.168.123.107:0/2474890471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca30103c60 0x7fca3019a980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.471+0000 7fca3737c640 1 --2- 192.168.123.107:0/2474890471 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca30102a60 0x7fca3019a440 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T19:30:40.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.471+0000 7fca3737c640 1 -- 192.168.123.107:0/2474890471 >> 192.168.123.107:0/2474890471 conn(0x7fca300fe250 msgr2=0x7fca300ffd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:40.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.471+0000 7fca3737c640 1 -- 192.168.123.107:0/2474890471 shutdown_connections 2026-03-09T19:30:40.470 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.471+0000 7fca3737c640 1 -- 192.168.123.107:0/2474890471 wait complete. 2026-03-09T19:30:40.478 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:30:40.530 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.531+0000 7fdd75e9a640 1 -- 192.168.123.107:0/2803177781 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdd7010c6a0 msgr2=0x7fdd7010cb00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.530 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.531+0000 7fdd75e9a640 1 --2- 192.168.123.107:0/2803177781 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdd7010c6a0 0x7fdd7010cb00 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fdd6000b0a0 tx=0x7fdd6002f4c0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.530 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.531+0000 7fdd75e9a640 1 -- 192.168.123.107:0/2803177781 shutdown_connections 2026-03-09T19:30:40.530 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.531+0000 7fdd75e9a640 1 --2- 192.168.123.107:0/2803177781 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdd7010c6a0 0x7fdd7010cb00 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.530 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.531+0000 7fdd75e9a640 1 --2- 192.168.123.107:0/2803177781 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd7010b920 
0x7fdd7010bd20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.530 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.531+0000 7fdd75e9a640 1 -- 192.168.123.107:0/2803177781 >> 192.168.123.107:0/2803177781 conn(0x7fdd7006a890 msgr2=0x7fdd7006acc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:40.530 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.531+0000 7fdd75e9a640 1 -- 192.168.123.107:0/2803177781 shutdown_connections 2026-03-09T19:30:40.530 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.531+0000 7fdd75e9a640 1 -- 192.168.123.107:0/2803177781 wait complete. 2026-03-09T19:30:40.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.532+0000 7fdd75e9a640 1 Processor -- start 2026-03-09T19:30:40.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.532+0000 7fdd75e9a640 1 -- start start 2026-03-09T19:30:40.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.532+0000 7fdd75e9a640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdd7010b920 0x7fdd701a2bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:40.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.533+0000 7fdd75e9a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd7010c6a0 0x7fdd701a30f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:40.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.533+0000 7fdd75e9a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdd701a36c0 con 0x7fdd7010c6a0 2026-03-09T19:30:40.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.533+0000 7fdd75e9a640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdd701a3830 con 0x7fdd7010b920 2026-03-09T19:30:40.532 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.533+0000 7fdd6f7fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdd7010b920 0x7fdd701a2bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.533+0000 7fdd6effd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd7010c6a0 0x7fdd701a30f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.533+0000 7fdd6f7fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdd7010b920 0x7fdd701a2bb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:40994/0 (socket says 192.168.123.107:40994) 2026-03-09T19:30:40.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.533+0000 7fdd6f7fe640 1 -- 192.168.123.107:0/2354515677 learned_addr learned my addr 192.168.123.107:0/2354515677 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:40.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.534+0000 7fdd6effd640 1 -- 192.168.123.107:0/2354515677 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdd7010b920 msgr2=0x7fdd701a2bb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.534+0000 7fdd6effd640 1 --2- 192.168.123.107:0/2354515677 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdd7010b920 0x7fdd701a2bb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.533 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.534+0000 7fdd6effd640 1 -- 192.168.123.107:0/2354515677 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdd58009590 con 0x7fdd7010c6a0 2026-03-09T19:30:40.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.534+0000 7fdd6effd640 1 --2- 192.168.123.107:0/2354515677 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd7010c6a0 0x7fdd701a30f0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fdd6000b0a0 tx=0x7fdd60004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:40.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.534+0000 7fdd6cff9640 1 -- 192.168.123.107:0/2354515677 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdd600093a0 con 0x7fdd7010c6a0 2026-03-09T19:30:40.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.534+0000 7fdd6cff9640 1 -- 192.168.123.107:0/2354515677 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fdd60007b70 con 0x7fdd7010c6a0 2026-03-09T19:30:40.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.534+0000 7fdd6cff9640 1 -- 192.168.123.107:0/2354515677 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdd60038cc0 con 0x7fdd7010c6a0 2026-03-09T19:30:40.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.534+0000 7fdd75e9a640 1 -- 192.168.123.107:0/2354515677 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdd60009d00 con 0x7fdd7010c6a0 2026-03-09T19:30:40.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.535+0000 7fdd75e9a640 1 -- 192.168.123.107:0/2354515677 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdd701a8630 con 
0x7fdd7010c6a0 2026-03-09T19:30:40.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.535+0000 7fdd4e7fc640 1 -- 192.168.123.107:0/2354515677 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdd34005350 con 0x7fdd7010c6a0 2026-03-09T19:30:40.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.539+0000 7fdd6cff9640 1 -- 192.168.123.107:0/2354515677 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fdd60048050 con 0x7fdd7010c6a0 2026-03-09T19:30:40.539 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.540+0000 7fdd6cff9640 1 --2- 192.168.123.107:0/2354515677 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdd440778e0 0x7fdd44079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:40.539 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.540+0000 7fdd6f7fe640 1 --2- 192.168.123.107:0/2354515677 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdd440778e0 0x7fdd44079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.539 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.540+0000 7fdd6cff9640 1 -- 192.168.123.107:0/2354515677 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6654+0+0 (secure 0 0 0) 0x7fdd6004f080 con 0x7fdd7010c6a0 2026-03-09T19:30:40.539 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.540+0000 7fdd6f7fe640 1 --2- 192.168.123.107:0/2354515677 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdd440778e0 0x7fdd44079da0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fdd580098d0 tx=0x7fdd58009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-09T19:30:40.539 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.541+0000 7fdd6cff9640 1 -- 192.168.123.107:0/2354515677 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdd600ee9f0 con 0x7fdd7010c6a0 2026-03-09T19:30:40.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.648+0000 7fdd4e7fc640 1 -- 192.168.123.107:0/2354515677 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fdd34002bf0 con 0x7fdd440778e0 2026-03-09T19:30:40.648 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.649+0000 7fdd6cff9640 1 -- 192.168.123.107:0/2354515677 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fdd34002bf0 con 0x7fdd440778e0 2026-03-09T19:30:40.653 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.654+0000 7fdd4e7fc640 1 -- 192.168.123.107:0/2354515677 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdd440778e0 msgr2=0x7fdd44079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.653 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.654+0000 7fdd4e7fc640 1 --2- 192.168.123.107:0/2354515677 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdd440778e0 0x7fdd44079da0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fdd580098d0 tx=0x7fdd58009290 comp rx=0 tx=0).stop 2026-03-09T19:30:40.654 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.655+0000 7fdd4e7fc640 1 -- 192.168.123.107:0/2354515677 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd7010c6a0 msgr2=0x7fdd701a30f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.654 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.655+0000 7fdd4e7fc640 1 --2- 192.168.123.107:0/2354515677 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd7010c6a0 0x7fdd701a30f0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fdd6000b0a0 tx=0x7fdd60004990 comp rx=0 tx=0).stop 2026-03-09T19:30:40.654 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.655+0000 7fdd4e7fc640 1 -- 192.168.123.107:0/2354515677 shutdown_connections 2026-03-09T19:30:40.654 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.655+0000 7fdd4e7fc640 1 --2- 192.168.123.107:0/2354515677 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdd440778e0 0x7fdd44079da0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.654 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.655+0000 7fdd4e7fc640 1 --2- 192.168.123.107:0/2354515677 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd7010c6a0 0x7fdd701a30f0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.654 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.656+0000 7fdd4e7fc640 1 --2- 192.168.123.107:0/2354515677 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdd7010b920 0x7fdd701a2bb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.655 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.656+0000 7fdd4e7fc640 1 -- 192.168.123.107:0/2354515677 >> 192.168.123.107:0/2354515677 conn(0x7fdd7006a890 msgr2=0x7fdd7010a370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:40.655 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.656+0000 7fdd4e7fc640 1 -- 192.168.123.107:0/2354515677 shutdown_connections 2026-03-09T19:30:40.655 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.656+0000 7fdd4e7fc640 1 -- 
192.168.123.107:0/2354515677 wait complete. 2026-03-09T19:30:40.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.724+0000 7fba49975640 1 -- 192.168.123.107:0/1237284484 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba44101b00 msgr2=0x7fba44101f80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.724+0000 7fba49975640 1 --2- 192.168.123.107:0/1237284484 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba44101b00 0x7fba44101f80 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fba380099b0 tx=0x7fba3802f220 comp rx=0 tx=0).stop 2026-03-09T19:30:40.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.724+0000 7fba49975640 1 -- 192.168.123.107:0/1237284484 shutdown_connections 2026-03-09T19:30:40.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.724+0000 7fba49975640 1 --2- 192.168.123.107:0/1237284484 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba44101b00 0x7fba44101f80 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.724+0000 7fba49975640 1 --2- 192.168.123.107:0/1237284484 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fba440feaa0 0x7fba440feea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.724+0000 7fba49975640 1 -- 192.168.123.107:0/1237284484 >> 192.168.123.107:0/1237284484 conn(0x7fba440fa5b0 msgr2=0x7fba440fc9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:40.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.724+0000 7fba49975640 1 -- 192.168.123.107:0/1237284484 shutdown_connections 2026-03-09T19:30:40.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.725+0000 
7fba49975640 1 -- 192.168.123.107:0/1237284484 wait complete. 2026-03-09T19:30:40.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.726+0000 7fba49975640 1 Processor -- start 2026-03-09T19:30:40.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.726+0000 7fba49975640 1 -- start start 2026-03-09T19:30:40.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.726+0000 7fba49975640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fba440feaa0 0x7fba441013e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:40.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.726+0000 7fba49975640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba44101b00 0x7fba440ffa30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:40.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.726+0000 7fba49975640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fba441019b0 con 0x7fba440feaa0 2026-03-09T19:30:40.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.726+0000 7fba49975640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fba440fffa0 con 0x7fba44101b00 2026-03-09T19:30:40.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.726+0000 7fba42ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fba440feaa0 0x7fba441013e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.726+0000 7fba42ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fba440feaa0 0x7fba441013e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:58698/0 (socket says 192.168.123.107:58698) 2026-03-09T19:30:40.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.726+0000 7fba42ffd640 1 -- 192.168.123.107:0/4059813927 learned_addr learned my addr 192.168.123.107:0/4059813927 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:40.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.726+0000 7fba427fc640 1 --2- 192.168.123.107:0/4059813927 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba44101b00 0x7fba440ffa30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.727+0000 7fba427fc640 1 -- 192.168.123.107:0/4059813927 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fba440feaa0 msgr2=0x7fba441013e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.727+0000 7fba427fc640 1 --2- 192.168.123.107:0/4059813927 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fba440feaa0 0x7fba441013e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.727+0000 7fba427fc640 1 -- 192.168.123.107:0/4059813927 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fba38009660 con 0x7fba44101b00 2026-03-09T19:30:40.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.727+0000 7fba427fc640 1 --2- 192.168.123.107:0/4059813927 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba44101b00 0x7fba440ffa30 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fba38002af0 tx=0x7fba38002910 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:40.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.727+0000 7fba48973640 1 -- 192.168.123.107:0/4059813927 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fba3803d070 con 0x7fba44101b00 2026-03-09T19:30:40.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.727+0000 7fba49975640 1 -- 192.168.123.107:0/4059813927 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fba44100220 con 0x7fba44101b00 2026-03-09T19:30:40.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.727+0000 7fba49975640 1 -- 192.168.123.107:0/4059813927 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fba44100790 con 0x7fba44101b00 2026-03-09T19:30:40.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.727+0000 7fba48973640 1 -- 192.168.123.107:0/4059813927 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fba3802fd50 con 0x7fba44101b00 2026-03-09T19:30:40.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.727+0000 7fba48973640 1 -- 192.168.123.107:0/4059813927 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fba380418c0 con 0x7fba44101b00 2026-03-09T19:30:40.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.729+0000 7fba48973640 1 -- 192.168.123.107:0/4059813927 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fba3804b430 con 0x7fba44101b00 2026-03-09T19:30:40.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.729+0000 7fba48973640 1 --2- 192.168.123.107:0/4059813927 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fba140778e0 0x7fba14079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-09T19:30:40.729 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.729+0000 7fba48973640 1 -- 192.168.123.107:0/4059813927 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6654+0+0 (secure 0 0 0) 0x7fba380be730 con 0x7fba44101b00 2026-03-09T19:30:40.729 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.729+0000 7fba42ffd640 1 --2- 192.168.123.107:0/4059813927 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fba140778e0 0x7fba14079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.729 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.730+0000 7fba42ffd640 1 --2- 192.168.123.107:0/4059813927 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fba140778e0 0x7fba14079da0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fba2c013fd0 tx=0x7fba2c015040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:40.729 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.730+0000 7fba49975640 1 -- 192.168.123.107:0/4059813927 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fba08005350 con 0x7fba44101b00 2026-03-09T19:30:40.732 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.733+0000 7fba48973640 1 -- 192.168.123.107:0/4059813927 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fba38087d60 con 0x7fba44101b00 2026-03-09T19:30:40.857 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.858+0000 7fba49975640 1 -- 192.168.123.107:0/4059813927 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", 
""]}) v1 -- 0x7fba08002bf0 con 0x7fba140778e0 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.867+0000 7fba48973640 1 -- 192.168.123.107:0/4059813927 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7fba08002bf0 con 0x7fba140778e0 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (7m) 89s ago 8m 23.2M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (8m) 89s ago 8m 8938k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (7m) 100s ago 7m 10.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (104s) 89s ago 8m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (102s) 100s ago 7m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (7m) 89s ago 8m 88.6M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (6m) 89s ago 6m 16.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (6m) 89s ago 6m 18.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (6m) 100s ago 6m 28.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 
2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (6m) 100s ago 6m 239M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (2m) 89s ago 8m 593M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (2m) 100s ago 7m 489M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (2m) 89s ago 8m 56.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (2m) 100s ago 7m 46.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (8m) 89s ago 8m 14.3M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (7m) 100s ago 7m 16.6M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (91s) 89s ago 7m 30.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (7m) 89s ago 7m 394M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (7m) 89s ago 7m 318M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (6m) 100s ago 6m 441M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (6m) 100s ago 6m 446M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 
2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (6m) 100s ago 6m 348M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:30:40.866 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (2m) 89s ago 7m 44.5M - 2.43.0 a07b618ecd1d c09450c20f5f 2026-03-09T19:30:40.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.869+0000 7fba49975640 1 -- 192.168.123.107:0/4059813927 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fba140778e0 msgr2=0x7fba14079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.869+0000 7fba49975640 1 --2- 192.168.123.107:0/4059813927 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fba140778e0 0x7fba14079da0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fba2c013fd0 tx=0x7fba2c015040 comp rx=0 tx=0).stop 2026-03-09T19:30:40.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.869+0000 7fba49975640 1 -- 192.168.123.107:0/4059813927 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba44101b00 msgr2=0x7fba440ffa30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.869+0000 7fba49975640 1 --2- 192.168.123.107:0/4059813927 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba44101b00 0x7fba440ffa30 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fba38002af0 tx=0x7fba38002910 comp rx=0 tx=0).stop 2026-03-09T19:30:40.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.869+0000 7fba49975640 1 -- 192.168.123.107:0/4059813927 shutdown_connections 2026-03-09T19:30:40.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.869+0000 7fba49975640 1 --2- 192.168.123.107:0/4059813927 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fba140778e0 0x7fba14079da0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.869+0000 7fba49975640 1 --2- 192.168.123.107:0/4059813927 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba44101b00 0x7fba440ffa30 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.869+0000 7fba49975640 1 --2- 192.168.123.107:0/4059813927 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fba440feaa0 0x7fba441013e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.869+0000 7fba49975640 1 -- 192.168.123.107:0/4059813927 >> 192.168.123.107:0/4059813927 conn(0x7fba440fa5b0 msgr2=0x7fba440fc0e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:40.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.869+0000 7fba49975640 1 -- 192.168.123.107:0/4059813927 shutdown_connections 2026-03-09T19:30:40.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.869+0000 7fba49975640 1 -- 192.168.123.107:0/4059813927 wait complete. 
2026-03-09T19:30:40.928 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:40 vm07.local ceph-mon[111841]: pgmap v85: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1039/291 objects degraded (357.045%); 0 B/s, 6 objects/s recovering 2026-03-09T19:30:40.928 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.928+0000 7f3d8807f640 1 -- 192.168.123.107:0/4232112414 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d80102a80 msgr2=0x7f3d80102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:40.928 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.928+0000 7f3d8807f640 1 --2- 192.168.123.107:0/4232112414 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d80102a80 0x7f3d80102e80 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f3d700099b0 tx=0x7f3d7002f220 comp rx=0 tx=0).stop 2026-03-09T19:30:40.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.932+0000 7f3d8807f640 1 -- 192.168.123.107:0/4232112414 shutdown_connections 2026-03-09T19:30:40.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.932+0000 7f3d8807f640 1 --2- 192.168.123.107:0/4232112414 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d80103c80 0x7f3d80104100 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.932+0000 7f3d8807f640 1 --2- 192.168.123.107:0/4232112414 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d80102a80 0x7f3d80102e80 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.932+0000 7f3d8807f640 1 -- 192.168.123.107:0/4232112414 >> 192.168.123.107:0/4232112414 
conn(0x7f3d800fe250 msgr2=0x7f3d80100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:40.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.932+0000 7f3d8807f640 1 -- 192.168.123.107:0/4232112414 shutdown_connections 2026-03-09T19:30:40.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.933+0000 7f3d8807f640 1 -- 192.168.123.107:0/4232112414 wait complete. 2026-03-09T19:30:40.932 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.933+0000 7f3d8807f640 1 Processor -- start 2026-03-09T19:30:40.932 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.933+0000 7f3d8807f640 1 -- start start 2026-03-09T19:30:40.932 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.933+0000 7f3d8807f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d80103c80 0x7f3d8019a670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:40.932 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.933+0000 7f3d8807f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d8019abb0 0x7f3d8019fc20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:40.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.933+0000 7f3d8807f640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d8019b030 con 0x7f3d8019abb0 2026-03-09T19:30:40.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.933+0000 7f3d8807f640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d8019b1a0 con 0x7f3d80103c80 2026-03-09T19:30:40.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.934+0000 7f3d85df4640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d80103c80 0x7f3d8019a670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.934+0000 7f3d85df4640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d80103c80 0x7f3d8019a670 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:41028/0 (socket says 192.168.123.107:41028) 2026-03-09T19:30:40.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.934+0000 7f3d85df4640 1 -- 192.168.123.107:0/1344807694 learned_addr learned my addr 192.168.123.107:0/1344807694 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:40.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.934+0000 7f3d85df4640 1 -- 192.168.123.107:0/1344807694 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d8019abb0 msgr2=0x7f3d8019fc20 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T19:30:40.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.934+0000 7f3d855f3640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d8019abb0 0x7f3d8019fc20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.934+0000 7f3d85df4640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d8019abb0 0x7f3d8019fc20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:40.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.934+0000 7f3d85df4640 1 -- 192.168.123.107:0/1344807694 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d68009590 con 0x7f3d80103c80 
2026-03-09T19:30:40.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.934+0000 7f3d85df4640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d80103c80 0x7f3d8019a670 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f3d70002fd0 tx=0x7f3d700029a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:40.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.934+0000 7f3d855f3640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d8019abb0 0x7f3d8019fc20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:30:40.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.935+0000 7f3d76ffd640 1 -- 192.168.123.107:0/1344807694 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d7003d070 con 0x7f3d80103c80 2026-03-09T19:30:40.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.935+0000 7f3d76ffd640 1 -- 192.168.123.107:0/1344807694 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3d70031d30 con 0x7f3d80103c80 2026-03-09T19:30:40.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.935+0000 7f3d8807f640 1 -- 192.168.123.107:0/1344807694 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d70009660 con 0x7f3d80103c80 2026-03-09T19:30:40.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.935+0000 7f3d8807f640 1 -- 192.168.123.107:0/1344807694 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d801a0440 con 0x7f3d80103c80 2026-03-09T19:30:40.935 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.936+0000 7f3d76ffd640 1 -- 192.168.123.107:0/1344807694 <== mon.1 
v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d700318f0 con 0x7f3d80103c80 2026-03-09T19:30:40.935 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.936+0000 7f3d76ffd640 1 -- 192.168.123.107:0/1344807694 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3d70038470 con 0x7f3d80103c80 2026-03-09T19:30:40.935 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.936+0000 7f3d8807f640 1 -- 192.168.123.107:0/1344807694 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d48005350 con 0x7f3d80103c80 2026-03-09T19:30:40.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.937+0000 7f3d76ffd640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d58077680 0x7f3d58079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:40.937 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.938+0000 7f3d855f3640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d58077680 0x7f3d58079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:40.937 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.938+0000 7f3d76ffd640 1 -- 192.168.123.107:0/1344807694 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6654+0+0 (secure 0 0 0) 0x7f3d700be2c0 con 0x7f3d80103c80 2026-03-09T19:30:40.937 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.938+0000 7f3d855f3640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d58077680 0x7f3d58079b40 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto 
rx=0x7f3d8019bb90 tx=0x7f3d68009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:40.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:40.942+0000 7f3d76ffd640 1 -- 192.168.123.107:0/1344807694 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3d70087aa0 con 0x7f3d80103c80 2026-03-09T19:30:41.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.073+0000 7f3d8807f640 1 -- 192.168.123.107:0/1344807694 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f3d48005e10 con 0x7f3d80103c80 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.074+0000 7f3d76ffd640 1 -- 192.168.123.107:0/1344807694 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+844 (secure 0 0 0) 0x7f3d700871f0 con 0x7f3d80103c80 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 5, 2026-03-09T19:30:41.073 
INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 9, 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-09T19:30:41.073 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:30:41.074 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:30:41.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.077+0000 7f3d8807f640 1 -- 192.168.123.107:0/1344807694 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d58077680 msgr2=0x7f3d58079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.077+0000 7f3d8807f640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d58077680 0x7f3d58079b40 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f3d8019bb90 tx=0x7f3d68009290 comp rx=0 tx=0).stop 2026-03-09T19:30:41.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.077+0000 7f3d8807f640 1 -- 192.168.123.107:0/1344807694 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d80103c80 msgr2=0x7f3d8019a670 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.077+0000 7f3d8807f640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d80103c80 0x7f3d8019a670 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f3d70002fd0 tx=0x7f3d700029a0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.077+0000 7f3d8807f640 1 -- 192.168.123.107:0/1344807694 shutdown_connections 2026-03-09T19:30:41.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.077+0000 7f3d8807f640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3d58077680 0x7f3d58079b40 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.077+0000 7f3d8807f640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d8019abb0 0x7f3d8019fc20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.077+0000 7f3d8807f640 1 --2- 192.168.123.107:0/1344807694 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d80103c80 0x7f3d8019a670 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.077+0000 7f3d8807f640 1 -- 192.168.123.107:0/1344807694 >> 192.168.123.107:0/1344807694 conn(0x7f3d800fe250 msgr2=0x7f3d800ffb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:41.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.078+0000 7f3d8807f640 1 -- 192.168.123.107:0/1344807694 shutdown_connections 2026-03-09T19:30:41.076 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.078+0000 7f3d8807f640 1 -- 192.168.123.107:0/1344807694 wait complete. 2026-03-09T19:30:41.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:40 vm08.local ceph-mon[103420]: pgmap v85: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1039/291 objects degraded (357.045%); 0 B/s, 6 objects/s recovering 2026-03-09T19:30:41.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.137+0000 7f9bd4017640 1 -- 192.168.123.107:0/2625227452 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9bcc075ba0 msgr2=0x7f9bcc075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.137+0000 7f9bd4017640 1 --2- 192.168.123.107:0/2625227452 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9bcc075ba0 0x7f9bcc075fa0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f9bbc0099b0 tx=0x7f9bbc02f220 comp rx=0 tx=0).stop 2026-03-09T19:30:41.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.138+0000 7f9bd4017640 1 -- 192.168.123.107:0/2625227452 shutdown_connections 2026-03-09T19:30:41.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.138+0000 7f9bd4017640 1 --2- 192.168.123.107:0/2625227452 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9bcc076df0 0x7f9bcc077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.138+0000 7f9bd4017640 1 --2- 192.168.123.107:0/2625227452 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9bcc075ba0 0x7f9bcc075fa0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.137 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.138+0000 7f9bd4017640 1 -- 192.168.123.107:0/2625227452 >> 192.168.123.107:0/2625227452 conn(0x7f9bcc0fe0c0 msgr2=0x7f9bcc1004e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:41.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.138+0000 7f9bd4017640 1 -- 192.168.123.107:0/2625227452 shutdown_connections 2026-03-09T19:30:41.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.138+0000 7f9bd4017640 1 -- 192.168.123.107:0/2625227452 wait complete. 2026-03-09T19:30:41.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.138+0000 7f9bd4017640 1 Processor -- start 2026-03-09T19:30:41.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.138+0000 7f9bd4017640 1 -- start start 2026-03-09T19:30:41.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.139+0000 7f9bd4017640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9bcc075ba0 0x7f9bcc19e790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:41.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.139+0000 7f9bd4017640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9bcc076df0 0x7f9bcc19ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:41.138 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.139+0000 7f9bd1d8c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9bcc075ba0 0x7f9bcc19e790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:41.138 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.139+0000 7f9bd1d8c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9bcc075ba0 0x7f9bcc19e790 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:58738/0 (socket says 192.168.123.107:58738) 2026-03-09T19:30:41.138 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.139+0000 7f9bd1d8c640 1 -- 192.168.123.107:0/247161637 learned_addr learned my addr 192.168.123.107:0/247161637 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:41.138 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.139+0000 7f9bd4017640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9bcc19f210 con 0x7f9bcc075ba0 2026-03-09T19:30:41.138 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.139+0000 7f9bd4017640 1 -- 192.168.123.107:0/247161637 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9bcc19f380 con 0x7f9bcc076df0 2026-03-09T19:30:41.138 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.139+0000 7f9bd158b640 1 --2- 192.168.123.107:0/247161637 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9bcc076df0 0x7f9bcc19ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:41.138 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.139+0000 7f9bd1d8c640 1 -- 192.168.123.107:0/247161637 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9bcc076df0 msgr2=0x7f9bcc19ecd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.138 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.139+0000 7f9bd1d8c640 1 --2- 192.168.123.107:0/247161637 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9bcc076df0 0x7f9bcc19ecd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.138 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.140+0000 7f9bd1d8c640 1 -- 
192.168.123.107:0/247161637 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9bbc009660 con 0x7f9bcc075ba0 2026-03-09T19:30:41.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.140+0000 7f9bd158b640 1 --2- 192.168.123.107:0/247161637 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9bcc076df0 0x7f9bcc19ecd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:30:41.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.140+0000 7f9bd1d8c640 1 --2- 192.168.123.107:0/247161637 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9bcc075ba0 0x7f9bcc19e790 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f9bbc02f730 tx=0x7f9bbc002980 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:41.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.140+0000 7f9bbaffd640 1 -- 192.168.123.107:0/247161637 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9bbc03d070 con 0x7f9bcc075ba0 2026-03-09T19:30:41.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.140+0000 7f9bbaffd640 1 -- 192.168.123.107:0/247161637 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9bbc02fd50 con 0x7f9bcc075ba0 2026-03-09T19:30:41.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.140+0000 7f9bd4017640 1 -- 192.168.123.107:0/247161637 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9bcc1a3e00 con 0x7f9bcc075ba0 2026-03-09T19:30:41.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.140+0000 7f9bbaffd640 1 -- 192.168.123.107:0/247161637 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9bbc041a30 con 0x7f9bcc075ba0 
2026-03-09T19:30:41.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.140+0000 7f9bd4017640 1 -- 192.168.123.107:0/247161637 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9bcc1a4240 con 0x7f9bcc075ba0 2026-03-09T19:30:41.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.142+0000 7f9bbaffd640 1 -- 192.168.123.107:0/247161637 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9bbc04b430 con 0x7f9bcc075ba0 2026-03-09T19:30:41.142 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.143+0000 7f9bd4017640 1 -- 192.168.123.107:0/247161637 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9b94005350 con 0x7f9bcc075ba0 2026-03-09T19:30:41.144 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.143+0000 7f9bbaffd640 1 --2- 192.168.123.107:0/247161637 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ba40778e0 0x7f9ba4079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:41.145 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.146+0000 7f9bbaffd640 1 -- 192.168.123.107:0/247161637 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6654+0+0 (secure 0 0 0) 0x7f9bbc0bf8d0 con 0x7f9bcc075ba0 2026-03-09T19:30:41.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.146+0000 7f9bd158b640 1 --2- 192.168.123.107:0/247161637 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ba40778e0 0x7f9ba4079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:41.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.147+0000 7f9bd158b640 1 --2- 192.168.123.107:0/247161637 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ba40778e0 0x7f9ba4079da0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f9bcc19fcb0 tx=0x7f9bc0005e90 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:41.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.147+0000 7f9bbaffd640 1 -- 192.168.123.107:0/247161637 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9bbc087e80 con 0x7f9bcc075ba0 2026-03-09T19:30:41.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.261+0000 7f9bd4017640 1 -- 192.168.123.107:0/247161637 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f9b94005e10 con 0x7f9bcc075ba0 2026-03-09T19:30:41.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.261+0000 7f9bbaffd640 1 -- 192.168.123.107:0/247161637 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1926 (secure 0 0 0) 0x7f9bbc0875d0 con 0x7f9bcc075ba0 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:30:41.261 
INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:30:41.261 
INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 0 members: 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:30:41.261 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:30:41.261 
INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:30:41.263 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.265+0000 7f9bd4017640 1 -- 192.168.123.107:0/247161637 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ba40778e0 msgr2=0x7f9ba4079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.265+0000 7f9bd4017640 1 --2- 192.168.123.107:0/247161637 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ba40778e0 0x7f9ba4079da0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f9bcc19fcb0 tx=0x7f9bc0005e90 comp rx=0 tx=0).stop 2026-03-09T19:30:41.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.265+0000 7f9bd4017640 1 -- 192.168.123.107:0/247161637 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9bcc075ba0 msgr2=0x7f9bcc19e790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.265+0000 7f9bd4017640 1 --2- 192.168.123.107:0/247161637 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9bcc075ba0 0x7f9bcc19e790 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f9bbc02f730 tx=0x7f9bbc002980 comp rx=0 tx=0).stop 2026-03-09T19:30:41.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.265+0000 7f9bd4017640 1 -- 192.168.123.107:0/247161637 shutdown_connections 2026-03-09T19:30:41.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.265+0000 7f9bd4017640 1 --2- 192.168.123.107:0/247161637 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ba40778e0 0x7f9ba4079da0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.266+0000 7f9bd4017640 1 --2- 
192.168.123.107:0/247161637 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9bcc076df0 0x7f9bcc19ecd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.266+0000 7f9bd4017640 1 --2- 192.168.123.107:0/247161637 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9bcc075ba0 0x7f9bcc19e790 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.266+0000 7f9bd4017640 1 -- 192.168.123.107:0/247161637 >> 192.168.123.107:0/247161637 conn(0x7f9bcc0fe0c0 msgr2=0x7f9bcc0ffc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:41.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.266+0000 7f9bd4017640 1 -- 192.168.123.107:0/247161637 shutdown_connections 2026-03-09T19:30:41.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.266+0000 7f9bd4017640 1 -- 192.168.123.107:0/247161637 wait complete. 
2026-03-09T19:30:41.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.324+0000 7fd7bc67a640 1 -- 192.168.123.107:0/3482610484 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7b4069a50 msgr2=0x7fd7b410c5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.324+0000 7fd7bc67a640 1 --2- 192.168.123.107:0/3482610484 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7b4069a50 0x7fd7b410c5d0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fd7a4009a00 tx=0x7fd7a402f290 comp rx=0 tx=0).stop 2026-03-09T19:30:41.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.324+0000 7fd7bc67a640 1 -- 192.168.123.107:0/3482610484 shutdown_connections 2026-03-09T19:30:41.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.324+0000 7fd7bc67a640 1 --2- 192.168.123.107:0/3482610484 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7b4069a50 0x7fd7b410c5d0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.324+0000 7fd7bc67a640 1 --2- 192.168.123.107:0/3482610484 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd7b4069080 0x7fd7b4069480 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.324+0000 7fd7bc67a640 1 -- 192.168.123.107:0/3482610484 >> 192.168.123.107:0/3482610484 conn(0x7fd7b406e680 msgr2=0x7fd7b4070ac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:41.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.324+0000 7fd7bc67a640 1 -- 192.168.123.107:0/3482610484 shutdown_connections 2026-03-09T19:30:41.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.324+0000 7fd7bc67a640 1 -- 192.168.123.107:0/3482610484 
wait complete. 2026-03-09T19:30:41.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.325+0000 7fd7bc67a640 1 Processor -- start 2026-03-09T19:30:41.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.325+0000 7fd7bc67a640 1 -- start start 2026-03-09T19:30:41.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.325+0000 7fd7bc67a640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd7b4069080 0x7fd7b41a7350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:41.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.325+0000 7fd7bc67a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7b4069a50 0x7fd7b41a7890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:41.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.325+0000 7fd7ba3ef640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd7b4069080 0x7fd7b41a7350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:41.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.326+0000 7fd7ba3ef640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd7b4069080 0x7fd7b41a7350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:41064/0 (socket says 192.168.123.107:41064) 2026-03-09T19:30:41.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.326+0000 7fd7ba3ef640 1 -- 192.168.123.107:0/663548140 learned_addr learned my addr 192.168.123.107:0/663548140 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:41.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.326+0000 7fd7b9bee640 1 --2- 192.168.123.107:0/663548140 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7b4069a50 0x7fd7b41a7890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:41.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.326+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd7b41a7e60 con 0x7fd7b4069a50 2026-03-09T19:30:41.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.326+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd7b41a7fd0 con 0x7fd7b4069080 2026-03-09T19:30:41.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.326+0000 7fd7b9bee640 1 -- 192.168.123.107:0/663548140 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd7b4069080 msgr2=0x7fd7b41a7350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.326+0000 7fd7b9bee640 1 --2- 192.168.123.107:0/663548140 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd7b4069080 0x7fd7b41a7350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.326+0000 7fd7b9bee640 1 -- 192.168.123.107:0/663548140 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd7a4009660 con 0x7fd7b4069a50 2026-03-09T19:30:41.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.327+0000 7fd7b9bee640 1 --2- 192.168.123.107:0/663548140 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7b4069a50 0x7fd7b41a7890 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fd7a4005bb0 tx=0x7fd7a40042b0 comp rx=0 tx=0).ready 
entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:41.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.327+0000 7fd79b7fe640 1 -- 192.168.123.107:0/663548140 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd7a402faf0 con 0x7fd7b4069a50 2026-03-09T19:30:41.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.327+0000 7fd79b7fe640 1 -- 192.168.123.107:0/663548140 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd7a402fc50 con 0x7fd7b4069a50 2026-03-09T19:30:41.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.327+0000 7fd79b7fe640 1 -- 192.168.123.107:0/663548140 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd7a4041860 con 0x7fd7b4069a50 2026-03-09T19:30:41.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.327+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd7b41aca10 con 0x7fd7b4069a50 2026-03-09T19:30:41.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.328+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd7b41ace80 con 0x7fd7b4069a50 2026-03-09T19:30:41.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.329+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd7b4069480 con 0x7fd7b4069a50 2026-03-09T19:30:41.331 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.329+0000 7fd79b7fe640 1 -- 192.168.123.107:0/663548140 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd7a403f070 con 0x7fd7b4069a50 2026-03-09T19:30:41.331 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.329+0000 7fd79b7fe640 1 --2- 192.168.123.107:0/663548140 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd7900776d0 0x7fd790079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:41.331 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.330+0000 7fd79b7fe640 1 -- 192.168.123.107:0/663548140 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6654+0+0 (secure 0 0 0) 0x7fd7a40be390 con 0x7fd7b4069a50 2026-03-09T19:30:41.331 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.332+0000 7fd7ba3ef640 1 --2- 192.168.123.107:0/663548140 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd7900776d0 0x7fd790079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:41.331 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.332+0000 7fd79b7fe640 1 -- 192.168.123.107:0/663548140 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd7a4086af0 con 0x7fd7b4069a50 2026-03-09T19:30:41.333 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.333+0000 7fd7ba3ef640 1 --2- 192.168.123.107:0/663548140 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd7900776d0 0x7fd790079b90 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fd7a80059c0 tx=0x7fd7a8009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:41.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.439+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}) v1 -- 0x7fd7b410b930 con 0x7fd7900776d0 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.440+0000 7fd79b7fe640 1 -- 192.168.123.107:0/663548140 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fd7b410b930 con 0x7fd7900776d0 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [ 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout: "mgr", 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout: "crash", 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout: "mon" 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout: ], 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "7/23 daemons upgraded", 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:30:41.439 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:30:41.441 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.442+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd7900776d0 msgr2=0x7fd790079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.441 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.442+0000 7fd7bc67a640 1 --2- 
192.168.123.107:0/663548140 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd7900776d0 0x7fd790079b90 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fd7a80059c0 tx=0x7fd7a8009290 comp rx=0 tx=0).stop 2026-03-09T19:30:41.441 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.443+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7b4069a50 msgr2=0x7fd7b41a7890 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.441 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.443+0000 7fd7bc67a640 1 --2- 192.168.123.107:0/663548140 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7b4069a50 0x7fd7b41a7890 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fd7a4005bb0 tx=0x7fd7a40042b0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.443+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 shutdown_connections 2026-03-09T19:30:41.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.443+0000 7fd7bc67a640 1 --2- 192.168.123.107:0/663548140 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd7900776d0 0x7fd790079b90 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.443+0000 7fd7bc67a640 1 --2- 192.168.123.107:0/663548140 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7b4069a50 0x7fd7b41a7890 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.443+0000 7fd7bc67a640 1 --2- 192.168.123.107:0/663548140 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd7b4069080 0x7fd7b41a7350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T19:30:41.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.443+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 >> 192.168.123.107:0/663548140 conn(0x7fd7b406e680 msgr2=0x7fd7b40708d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:41.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.443+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 shutdown_connections 2026-03-09T19:30:41.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.443+0000 7fd7bc67a640 1 -- 192.168.123.107:0/663548140 wait complete. 2026-03-09T19:30:41.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.500+0000 7f9e0f29d640 1 -- 192.168.123.107:0/149289623 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e08103c80 msgr2=0x7f9e08104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.500+0000 7f9e0f29d640 1 --2- 192.168.123.107:0/149289623 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e08103c80 0x7f9e08104100 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f9df8009a00 tx=0x7f9df802f290 comp rx=0 tx=0).stop 2026-03-09T19:30:41.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.500+0000 7f9e0f29d640 1 -- 192.168.123.107:0/149289623 shutdown_connections 2026-03-09T19:30:41.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.500+0000 7f9e0f29d640 1 --2- 192.168.123.107:0/149289623 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e08103c80 0x7f9e08104100 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.500+0000 7f9e0f29d640 1 --2- 192.168.123.107:0/149289623 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e08102a80 0x7f9e08102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-09T19:30:41.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.500+0000 7f9e0f29d640 1 -- 192.168.123.107:0/149289623 >> 192.168.123.107:0/149289623 conn(0x7f9e080fe250 msgr2=0x7f9e08100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:41.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.500+0000 7f9e0f29d640 1 -- 192.168.123.107:0/149289623 shutdown_connections 2026-03-09T19:30:41.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.501+0000 7f9e0f29d640 1 -- 192.168.123.107:0/149289623 wait complete. 2026-03-09T19:30:41.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.501+0000 7f9e0f29d640 1 Processor -- start 2026-03-09T19:30:41.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.501+0000 7f9e0f29d640 1 -- start start 2026-03-09T19:30:41.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.501+0000 7f9e0f29d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e08102a80 0x7f9e0819a460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:41.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.501+0000 7f9e0f29d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e08103c80 0x7f9e0819a9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:41.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.502+0000 7f9e0d012640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e08102a80 0x7f9e0819a460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:41.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.502+0000 7f9e0f29d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e0819af70 con 0x7f9e08103c80 
2026-03-09T19:30:41.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.502+0000 7f9e0f29d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e0819b0e0 con 0x7f9e08102a80 2026-03-09T19:30:41.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.502+0000 7f9e0c811640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e08103c80 0x7f9e0819a9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:41.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.502+0000 7f9e0d012640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e08102a80 0x7f9e0819a460 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:41090/0 (socket says 192.168.123.107:41090) 2026-03-09T19:30:41.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.502+0000 7f9e0d012640 1 -- 192.168.123.107:0/854811862 learned_addr learned my addr 192.168.123.107:0/854811862 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:30:41.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.502+0000 7f9e0c811640 1 -- 192.168.123.107:0/854811862 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e08102a80 msgr2=0x7f9e0819a460 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.502+0000 7f9e0c811640 1 --2- 192.168.123.107:0/854811862 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e08102a80 0x7f9e0819a460 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.501 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.503+0000 7f9e0c811640 1 -- 192.168.123.107:0/854811862 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9df8009660 con 0x7f9e08103c80 2026-03-09T19:30:41.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.503+0000 7f9e0c811640 1 --2- 192.168.123.107:0/854811862 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e08103c80 0x7f9e0819a9a0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f9df8005bb0 tx=0x7f9df8004320 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:41.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.503+0000 7f9df67fc640 1 -- 192.168.123.107:0/854811862 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9df802faf0 con 0x7f9e08103c80 2026-03-09T19:30:41.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.503+0000 7f9df67fc640 1 -- 192.168.123.107:0/854811862 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9df802fc50 con 0x7f9e08103c80 2026-03-09T19:30:41.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.504+0000 7f9df67fc640 1 -- 192.168.123.107:0/854811862 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9df80418e0 con 0x7f9e08103c80 2026-03-09T19:30:41.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.504+0000 7f9e0f29d640 1 -- 192.168.123.107:0/854811862 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9e0819fb20 con 0x7f9e08103c80 2026-03-09T19:30:41.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.504+0000 7f9e0f29d640 1 -- 192.168.123.107:0/854811862 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9e0819ff90 con 0x7f9e08103c80 2026-03-09T19:30:41.505 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.505+0000 7f9e0f29d640 1 -- 
192.168.123.107:0/854811862 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9e08102e80 con 0x7f9e08103c80 2026-03-09T19:30:41.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.505+0000 7f9df67fc640 1 -- 192.168.123.107:0/854811862 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9df803f070 con 0x7f9e08103c80 2026-03-09T19:30:41.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.506+0000 7f9df67fc640 1 --2- 192.168.123.107:0/854811862 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9de40776d0 0x7f9de4079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:30:41.506 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.506+0000 7f9df67fc640 1 -- 192.168.123.107:0/854811862 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6654+0+0 (secure 0 0 0) 0x7f9df80be3f0 con 0x7f9e08103c80 2026-03-09T19:30:41.507 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.508+0000 7f9e0d012640 1 --2- 192.168.123.107:0/854811862 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9de40776d0 0x7f9de4079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:30:41.507 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.508+0000 7f9df67fc640 1 -- 192.168.123.107:0/854811862 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9df8086ad0 con 0x7f9e08103c80 2026-03-09T19:30:41.508 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.509+0000 7f9e0d012640 1 --2- 192.168.123.107:0/854811862 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] 
conn(0x7f9de40776d0 0x7f9de4079b90 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f9dfc0059c0 tx=0x7f9dfc009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:30:41.652 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.653+0000 7f9e0f29d640 1 -- 192.168.123.107:0/854811862 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f9e081a0330 con 0x7f9e08103c80 2026-03-09T19:30:41.653 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.654+0000 7f9df67fc640 1 -- 192.168.123.107:0/854811862 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1089 (secure 0 0 0) 0x7f9df8086220 con 0x7f9e08103c80 2026-03-09T19:30:41.653 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_WARN Degraded data redundancy: 1039/291 objects degraded (357.045%), 7 pgs degraded, 7 pgs undersized 2026-03-09T19:30:41.653 INFO:teuthology.orchestra.run.vm07.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 1039/291 objects degraded (357.045%), 7 pgs degraded, 7 pgs undersized 2026-03-09T19:30:41.653 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.b is stuck undersized for 78s, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,4] 2026-03-09T19:30:41.653 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.10 is stuck undersized for 78s, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,1] 2026-03-09T19:30:41.653 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.11 is stuck undersized for 78s, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-09T19:30:41.653 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.15 is stuck undersized for 78s, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-09T19:30:41.653 INFO:teuthology.orchestra.run.vm07.stdout: pg 
3.17 is stuck undersized for 78s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,5] 2026-03-09T19:30:41.653 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.18 is stuck undersized for 78s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,1] 2026-03-09T19:30:41.653 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.1f is stuck undersized for 78s, current state active+recovering+undersized+degraded+remapped, last acting [2,3] 2026-03-09T19:30:41.655 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.656+0000 7f9e0f29d640 1 -- 192.168.123.107:0/854811862 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9de40776d0 msgr2=0x7f9de4079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.655 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.656+0000 7f9e0f29d640 1 --2- 192.168.123.107:0/854811862 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9de40776d0 0x7f9de4079b90 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f9dfc0059c0 tx=0x7f9dfc009290 comp rx=0 tx=0).stop 2026-03-09T19:30:41.655 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.657+0000 7f9e0f29d640 1 -- 192.168.123.107:0/854811862 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e08103c80 msgr2=0x7f9e0819a9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:30:41.655 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.657+0000 7f9e0f29d640 1 --2- 192.168.123.107:0/854811862 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e08103c80 0x7f9e0819a9a0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f9df8005bb0 tx=0x7f9df8004320 comp rx=0 tx=0).stop 2026-03-09T19:30:41.656 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.657+0000 7f9e0f29d640 1 -- 192.168.123.107:0/854811862 shutdown_connections 2026-03-09T19:30:41.656 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.657+0000 7f9e0f29d640 1 --2- 192.168.123.107:0/854811862 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9de40776d0 0x7f9de4079b90 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.656 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.657+0000 7f9e0f29d640 1 --2- 192.168.123.107:0/854811862 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e08103c80 0x7f9e0819a9a0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.656 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.657+0000 7f9e0f29d640 1 --2- 192.168.123.107:0/854811862 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e08102a80 0x7f9e0819a460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:30:41.656 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.657+0000 7f9e0f29d640 1 -- 192.168.123.107:0/854811862 >> 192.168.123.107:0/854811862 conn(0x7f9e080fe250 msgr2=0x7f9e080ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:30:41.656 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.657+0000 7f9e0f29d640 1 -- 192.168.123.107:0/854811862 shutdown_connections 2026-03-09T19:30:41.656 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:30:41.657+0000 7f9e0f29d640 1 -- 192.168.123.107:0/854811862 wait complete. 
2026-03-09T19:30:42.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:41 vm08.local ceph-mon[103420]: from='client.44167 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:42.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:41 vm08.local ceph-mon[103420]: from='client.34206 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:42.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:41 vm08.local ceph-mon[103420]: from='client.44175 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:42.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:41 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/1344807694' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:30:42.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:41 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/247161637' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:30:42.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:41 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/854811862' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:30:42.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:41 vm07.local ceph-mon[111841]: from='client.44167 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:42.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:41 vm07.local ceph-mon[111841]: from='client.34206 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:42.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:41 vm07.local ceph-mon[111841]: from='client.44175 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:42.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:41 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1344807694' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:30:42.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:41 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/247161637' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:30:42.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:41 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/854811862' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:30:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:42 vm08.local ceph-mon[103420]: from='client.34218 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:42 vm08.local ceph-mon[103420]: pgmap v86: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 941/291 objects degraded (323.368%); 0 B/s, 11 objects/s recovering 2026-03-09T19:30:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:42 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 941/291 objects degraded (323.368%), 6 pgs degraded, 6 pgs undersized (PG_DEGRADED) 2026-03-09T19:30:43.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:42 vm07.local ceph-mon[111841]: from='client.34218 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:30:43.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:42 vm07.local ceph-mon[111841]: pgmap v86: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 941/291 objects degraded (323.368%); 0 B/s, 11 objects/s recovering 2026-03-09T19:30:43.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:42 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 941/291 objects degraded (323.368%), 6 pgs degraded, 6 pgs undersized (PG_DEGRADED) 2026-03-09T19:30:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:44 vm07.local ceph-mon[111841]: pgmap v87: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 
active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 941/291 objects degraded (323.368%); 0 B/s, 5 objects/s recovering 2026-03-09T19:30:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:44 vm08.local ceph-mon[103420]: pgmap v87: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 941/291 objects degraded (323.368%); 0 B/s, 5 objects/s recovering 2026-03-09T19:30:46.129 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:45 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:46.129 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:45 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:46.129 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:45 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-09T19:30:46.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:45 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:46.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:45 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:30:46.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:45 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-09T19:30:47.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:46 vm07.local ceph-mon[111841]: pgmap v88: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 941/291 objects degraded (323.368%); 0 B/s, 10 objects/s recovering 2026-03-09T19:30:47.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:46 vm08.local ceph-mon[103420]: pgmap v88: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 941/291 objects degraded (323.368%); 0 B/s, 10 objects/s recovering 2026-03-09T19:30:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:48 vm07.local ceph-mon[111841]: pgmap v89: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 941/291 objects degraded (323.368%); 0 B/s, 9 objects/s recovering 2026-03-09T19:30:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:30:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:48 vm08.local ceph-mon[103420]: pgmap v89: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 941/291 objects degraded (323.368%); 0 B/s, 9 objects/s recovering 
2026-03-09T19:30:49.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:30:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:50 vm07.local ceph-mon[111841]: pgmap v90: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 941/291 objects degraded (323.368%); 0 B/s, 7 objects/s recovering 2026-03-09T19:30:51.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:50 vm08.local ceph-mon[103420]: pgmap v90: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 941/291 objects degraded (323.368%); 0 B/s, 7 objects/s recovering 2026-03-09T19:30:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:51 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 897/291 objects degraded (308.247%), 6 pgs degraded, 6 pgs undersized (PG_DEGRADED) 2026-03-09T19:30:52.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:51 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 897/291 objects degraded (308.247%), 6 pgs degraded, 6 pgs undersized (PG_DEGRADED) 2026-03-09T19:30:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:52 vm07.local ceph-mon[111841]: pgmap v91: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 897/291 objects degraded (308.247%); 0 B/s, 11 objects/s recovering 2026-03-09T19:30:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:52 vm08.local ceph-mon[103420]: pgmap v91: 65 pgs: 1 
active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 897/291 objects degraded (308.247%); 0 B/s, 11 objects/s recovering 2026-03-09T19:30:54.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:53 vm07.local ceph-mon[111841]: osdmap e66: 6 total, 6 up, 6 in 2026-03-09T19:30:54.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:53 vm08.local ceph-mon[103420]: osdmap e66: 6 total, 6 up, 6 in 2026-03-09T19:30:55.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:54 vm07.local ceph-mon[111841]: pgmap v93: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 897/291 objects degraded (308.247%); 0 B/s, 9 objects/s recovering 2026-03-09T19:30:55.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:54 vm07.local ceph-mon[111841]: osdmap e67: 6 total, 6 up, 6 in 2026-03-09T19:30:55.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:54 vm08.local ceph-mon[103420]: pgmap v93: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 897/291 objects degraded (308.247%); 0 B/s, 9 objects/s recovering 2026-03-09T19:30:55.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:54 vm08.local ceph-mon[103420]: osdmap e67: 6 total, 6 up, 6 in 2026-03-09T19:30:57.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:56 vm07.local ceph-mon[111841]: pgmap v95: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 400/291 objects degraded (137.457%); 0 B/s, 57 objects/s recovering 2026-03-09T19:30:57.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:30:56 vm08.local ceph-mon[103420]: pgmap v95: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 400/291 objects degraded (137.457%); 0 B/s, 57 objects/s recovering 2026-03-09T19:30:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:58 vm07.local ceph-mon[111841]: pgmap v96: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 400/291 objects degraded (137.457%); 0 B/s, 57 objects/s recovering 2026-03-09T19:30:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:30:58 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 400/291 objects degraded (137.457%), 3 pgs degraded, 3 pgs undersized (PG_DEGRADED) 2026-03-09T19:30:59.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:58 vm08.local ceph-mon[103420]: pgmap v96: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 400/291 objects degraded (137.457%); 0 B/s, 57 objects/s recovering 2026-03-09T19:30:59.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:30:58 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 400/291 objects degraded (137.457%), 3 pgs degraded, 3 pgs undersized (PG_DEGRADED) 2026-03-09T19:31:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:00 vm07.local ceph-mon[111841]: pgmap v97: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 400/291 objects degraded (137.457%); 0 B/s, 51 objects/s recovering 2026-03-09T19:31:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:00 vm07.local 
ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:31:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:00 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:31:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:00 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline) 2026-03-09T19:31:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:00 vm08.local ceph-mon[103420]: pgmap v97: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 400/291 objects degraded (137.457%); 0 B/s, 51 objects/s recovering 2026-03-09T19:31:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:00 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:31:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:00 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:31:01.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:00 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline) 2026-03-09T19:31:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:02 vm08.local ceph-mon[103420]: pgmap v98: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 400/291 objects degraded (137.457%); 0 B/s, 53 objects/s recovering 2026-03-09T19:31:02.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:02 vm07.local ceph-mon[111841]: pgmap v98: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 400/291 objects degraded (137.457%); 0 B/s, 53 objects/s recovering 2026-03-09T19:31:03.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:31:04.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:31:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:04 vm07.local ceph-mon[111841]: pgmap v99: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 400/291 objects degraded (137.457%); 0 B/s, 46 objects/s recovering 2026-03-09T19:31:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:04 vm08.local ceph-mon[103420]: pgmap v99: 65 pgs: 1 
active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 400/291 objects degraded (137.457%); 0 B/s, 46 objects/s recovering 2026-03-09T19:31:05.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:05 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 390/291 objects degraded (134.021%), 3 pgs degraded, 3 pgs undersized (PG_DEGRADED) 2026-03-09T19:31:06.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:05 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 390/291 objects degraded (134.021%), 3 pgs degraded, 3 pgs undersized (PG_DEGRADED) 2026-03-09T19:31:06.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:06 vm07.local ceph-mon[111841]: pgmap v100: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 390/291 objects degraded (134.021%); 0 B/s, 43 objects/s recovering 2026-03-09T19:31:07.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:06 vm08.local ceph-mon[103420]: pgmap v100: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 390/291 objects degraded (134.021%); 0 B/s, 43 objects/s recovering 2026-03-09T19:31:08.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:08 vm07.local ceph-mon[111841]: pgmap v101: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 390/291 objects degraded (134.021%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:09.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:08 vm08.local ceph-mon[103420]: pgmap v101: 65 pgs: 1 
active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 390/291 objects degraded (134.021%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:11.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:11 vm08.local ceph-mon[103420]: pgmap v102: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 390/291 objects degraded (134.021%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:11.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:11 vm08.local ceph-mon[103420]: osdmap e68: 6 total, 6 up, 6 in 2026-03-09T19:31:11.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:11 vm07.local ceph-mon[111841]: pgmap v102: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 390/291 objects degraded (134.021%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:11.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:11 vm07.local ceph-mon[111841]: osdmap e68: 6 total, 6 up, 6 in 2026-03-09T19:31:11.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.722+0000 7fcd6d231640 1 -- 192.168.123.107:0/2992316208 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68102a60 msgr2=0x7fcd68102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:11.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.722+0000 7fcd6d231640 1 --2- 192.168.123.107:0/2992316208 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68102a60 0x7fcd68102e60 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fcd540099b0 tx=0x7fcd5402f220 comp rx=0 tx=0).stop 2026-03-09T19:31:11.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.723+0000 
7fcd6d231640 1 -- 192.168.123.107:0/2992316208 shutdown_connections 2026-03-09T19:31:11.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.723+0000 7fcd6d231640 1 --2- 192.168.123.107:0/2992316208 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcd68103c60 0x7fcd681040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:11.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.723+0000 7fcd6d231640 1 --2- 192.168.123.107:0/2992316208 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68102a60 0x7fcd68102e60 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:11.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.723+0000 7fcd6d231640 1 -- 192.168.123.107:0/2992316208 >> 192.168.123.107:0/2992316208 conn(0x7fcd680fe250 msgr2=0x7fcd68100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:11.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.723+0000 7fcd6d231640 1 -- 192.168.123.107:0/2992316208 shutdown_connections 2026-03-09T19:31:11.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.723+0000 7fcd6d231640 1 -- 192.168.123.107:0/2992316208 wait complete. 
2026-03-09T19:31:11.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd6d231640 1 Processor -- start 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd6d231640 1 -- start start 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd6d231640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcd68102a60 0x7fcd68195f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd6d231640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68103c60 0x7fcd68196480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd6d231640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd681969c0 con 0x7fcd68103c60 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd6d231640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd68196b30 con 0x7fcd68102a60 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd66575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68103c60 0x7fcd68196480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd66575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68103c60 0x7fcd68196480 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:49230/0 (socket says 192.168.123.107:49230) 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd66575640 1 -- 192.168.123.107:0/4078515641 learned_addr learned my addr 192.168.123.107:0/4078515641 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd66575640 1 -- 192.168.123.107:0/4078515641 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcd68102a60 msgr2=0x7fcd68195f40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd66575640 1 --2- 192.168.123.107:0/4078515641 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcd68102a60 0x7fcd68195f40 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd66575640 1 -- 192.168.123.107:0/4078515641 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcd54009660 con 0x7fcd68103c60 2026-03-09T19:31:11.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.724+0000 7fcd66575640 1 --2- 192.168.123.107:0/4078515641 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68103c60 0x7fcd68196480 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fcd5c00cc60 tx=0x7fcd5c007590 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:11.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.725+0000 7fcd43fff640 1 -- 192.168.123.107:0/4078515641 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd5c007e00 con 0x7fcd68103c60 2026-03-09T19:31:11.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.725+0000 7fcd6d231640 1 -- 
192.168.123.107:0/4078515641 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcd6819b610 con 0x7fcd68103c60 2026-03-09T19:31:11.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.725+0000 7fcd6d231640 1 -- 192.168.123.107:0/4078515641 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcd6819bb10 con 0x7fcd68103c60 2026-03-09T19:31:11.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.726+0000 7fcd6d231640 1 -- 192.168.123.107:0/4078515641 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcd2c005350 con 0x7fcd68103c60 2026-03-09T19:31:11.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.726+0000 7fcd43fff640 1 -- 192.168.123.107:0/4078515641 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fcd5c00ce80 con 0x7fcd68103c60 2026-03-09T19:31:11.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.727+0000 7fcd43fff640 1 -- 192.168.123.107:0/4078515641 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd5c002ea0 con 0x7fcd68103c60 2026-03-09T19:31:11.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.727+0000 7fcd43fff640 1 -- 192.168.123.107:0/4078515641 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcd5c00f760 con 0x7fcd68103c60 2026-03-09T19:31:11.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.727+0000 7fcd43fff640 1 --2- 192.168.123.107:0/4078515641 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcd38077a80 0x7fcd38079f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:11.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.727+0000 7fcd66d76640 1 --2- 
192.168.123.107:0/4078515641 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcd38077a80 0x7fcd38079f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:11.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.728+0000 7fcd66d76640 1 --2- 192.168.123.107:0/4078515641 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcd38077a80 0x7fcd38079f40 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fcd54002af0 tx=0x7fcd540023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:11.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.728+0000 7fcd43fff640 1 -- 192.168.123.107:0/4078515641 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6538+0+0 (secure 0 0 0) 0x7fcd5c09a0f0 con 0x7fcd68103c60 2026-03-09T19:31:11.729 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.730+0000 7fcd43fff640 1 -- 192.168.123.107:0/4078515641 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcd5c09f050 con 0x7fcd68103c60 2026-03-09T19:31:11.838 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.839+0000 7fcd6d231640 1 -- 192.168.123.107:0/4078515641 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fcd2c002bf0 con 0x7fcd38077a80 2026-03-09T19:31:11.839 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.840+0000 7fcd43fff640 1 -- 192.168.123.107:0/4078515641 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fcd2c002bf0 con 0x7fcd38077a80 2026-03-09T19:31:11.842 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.843+0000 7fcd41ffb640 1 -- 192.168.123.107:0/4078515641 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcd38077a80 msgr2=0x7fcd38079f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:11.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.843+0000 7fcd41ffb640 1 --2- 192.168.123.107:0/4078515641 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcd38077a80 0x7fcd38079f40 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fcd54002af0 tx=0x7fcd540023d0 comp rx=0 tx=0).stop 2026-03-09T19:31:11.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.843+0000 7fcd41ffb640 1 -- 192.168.123.107:0/4078515641 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68103c60 msgr2=0x7fcd68196480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:11.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.843+0000 7fcd41ffb640 1 --2- 192.168.123.107:0/4078515641 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68103c60 0x7fcd68196480 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fcd5c00cc60 tx=0x7fcd5c007590 comp rx=0 tx=0).stop 2026-03-09T19:31:11.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.843+0000 7fcd41ffb640 1 -- 192.168.123.107:0/4078515641 shutdown_connections 2026-03-09T19:31:11.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.843+0000 7fcd41ffb640 1 --2- 192.168.123.107:0/4078515641 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcd38077a80 0x7fcd38079f40 secure :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fcd54002af0 tx=0x7fcd540023d0 comp rx=0 tx=0).stop 2026-03-09T19:31:11.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.843+0000 7fcd41ffb640 1 --2- 192.168.123.107:0/4078515641 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68103c60 0x7fcd68196480 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:11.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.843+0000 7fcd41ffb640 1 --2- 192.168.123.107:0/4078515641 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcd68102a60 0x7fcd68195f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:11.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.843+0000 7fcd41ffb640 1 -- 192.168.123.107:0/4078515641 >> 192.168.123.107:0/4078515641 conn(0x7fcd680fe250 msgr2=0x7fcd680ffa00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:11.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.843+0000 7fcd41ffb640 1 -- 192.168.123.107:0/4078515641 shutdown_connections 2026-03-09T19:31:11.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.844+0000 7fcd41ffb640 1 -- 192.168.123.107:0/4078515641 wait complete. 
2026-03-09T19:31:11.854 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:31:11.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.906+0000 7f438b9da640 1 -- 192.168.123.107:0/3239805823 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4384075ba0 msgr2=0x7f4384075fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:11.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.906+0000 7f438b9da640 1 --2- 192.168.123.107:0/3239805823 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4384075ba0 0x7f4384075fa0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f4374009a00 tx=0x7f437402f280 comp rx=0 tx=0).stop 2026-03-09T19:31:11.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.906+0000 7f438b9da640 1 -- 192.168.123.107:0/3239805823 shutdown_connections 2026-03-09T19:31:11.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.906+0000 7f438b9da640 1 --2- 192.168.123.107:0/3239805823 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4384076df0 0x7f4384077250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:11.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.906+0000 7f438b9da640 1 --2- 192.168.123.107:0/3239805823 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4384075ba0 0x7f4384075fa0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:11.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.906+0000 7f438b9da640 1 -- 192.168.123.107:0/3239805823 >> 192.168.123.107:0/3239805823 conn(0x7f43840fe250 msgr2=0x7f4384100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:11.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.907+0000 7f438b9da640 1 -- 192.168.123.107:0/3239805823 shutdown_connections 2026-03-09T19:31:11.906 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.907+0000 7f438b9da640 1 -- 192.168.123.107:0/3239805823 wait complete.
2026-03-09T19:31:11.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.907+0000 7f438b9da640 1 Processor -- start
2026-03-09T19:31:11.906 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.908+0000 7f438b9da640 1 -- start start
2026-03-09T19:31:11.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.908+0000 7f438b9da640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4384075ba0 0x7f438410d120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:11.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.908+0000 7f438b9da640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4384076df0 0x7f438410d660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:11.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.908+0000 7f438b9da640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f438410eb60 con 0x7f4384076df0
2026-03-09T19:31:11.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.908+0000 7f438b9da640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f438410ecd0 con 0x7f4384075ba0
2026-03-09T19:31:11.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.908+0000 7f438974f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4384075ba0 0x7f438410d120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:11.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.908+0000 7f4388f4e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4384076df0 0x7f438410d660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:11.907 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.908+0000 7f4388f4e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4384076df0 0x7f438410d660 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:49246/0 (socket says 192.168.123.107:49246)
2026-03-09T19:31:11.908 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.908+0000 7f4388f4e640 1 -- 192.168.123.107:0/865822380 learned_addr learned my addr 192.168.123.107:0/865822380 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:31:11.908 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.909+0000 7f4388f4e640 1 -- 192.168.123.107:0/865822380 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4384075ba0 msgr2=0x7f438410d120 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:11.908 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.909+0000 7f4388f4e640 1 --2- 192.168.123.107:0/865822380 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4384075ba0 0x7f438410d120 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:11.908 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.909+0000 7f4388f4e640 1 -- 192.168.123.107:0/865822380 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4374009660 con 0x7f4384076df0
2026-03-09T19:31:11.908 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.909+0000 7f4388f4e640 1 --2- 192.168.123.107:0/865822380 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4384076df0 0x7f438410d660 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f436c00e9b0 tx=0x7f436c00ee80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:31:11.908 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.909+0000 7f437a7fc640 1 -- 192.168.123.107:0/865822380 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f436c00cd90 con 0x7f4384076df0
2026-03-09T19:31:11.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.910+0000 7f438b9da640 1 -- 192.168.123.107:0/865822380 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f438410dcc0 con 0x7f4384076df0
2026-03-09T19:31:11.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.910+0000 7f438b9da640 1 -- 192.168.123.107:0/865822380 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f43841ad090 con 0x7f4384076df0
2026-03-09T19:31:11.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.910+0000 7f437a7fc640 1 -- 192.168.123.107:0/865822380 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f436c004590 con 0x7f4384076df0
2026-03-09T19:31:11.909 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.910+0000 7f437a7fc640 1 -- 192.168.123.107:0/865822380 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f436c010640 con 0x7f4384076df0
2026-03-09T19:31:11.911 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.912+0000 7f438b9da640 1 -- 192.168.123.107:0/865822380 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f434c005350 con 0x7f4384076df0
2026-03-09T19:31:11.915 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.913+0000 7f437a7fc640 1 -- 192.168.123.107:0/865822380 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f436c0040d0 con 0x7f4384076df0
2026-03-09T19:31:11.915 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.913+0000 7f437a7fc640 1 --2- 192.168.123.107:0/865822380 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4358077890 0x7f4358079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:11.915 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.913+0000 7f438974f640 1 --2- 192.168.123.107:0/865822380 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4358077890 0x7f4358079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:11.915 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.914+0000 7f438974f640 1 --2- 192.168.123.107:0/865822380 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4358077890 0x7f4358079d50 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f4374005ec0 tx=0x7f4374005e50 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:31:11.915 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.916+0000 7f437a7fc640 1 -- 192.168.123.107:0/865822380 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6538+0+0 (secure 0 0 0) 0x7f436c014070 con 0x7f4384076df0
2026-03-09T19:31:11.915 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:11.916+0000 7f437a7fc640 1 -- 192.168.123.107:0/865822380 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f436c09e050 con 0x7f4384076df0
2026-03-09T19:31:12.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.025+0000 7f438b9da640 1 -- 192.168.123.107:0/865822380 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f434c002bf0 con 0x7f4358077890
2026-03-09T19:31:12.025 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.026+0000 7f437a7fc640 1 -- 192.168.123.107:0/865822380 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f434c002bf0 con 0x7f4358077890
2026-03-09T19:31:12.028 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.029+0000 7f438b9da640 1 -- 192.168.123.107:0/865822380 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4358077890 msgr2=0x7f4358079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:12.028 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.029+0000 7f438b9da640 1 --2- 192.168.123.107:0/865822380 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4358077890 0x7f4358079d50 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f4374005ec0 tx=0x7f4374005e50 comp rx=0 tx=0).stop
2026-03-09T19:31:12.028 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.029+0000 7f438b9da640 1 -- 192.168.123.107:0/865822380 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4384076df0 msgr2=0x7f438410d660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:12.028 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.029+0000 7f438b9da640 1 --2- 192.168.123.107:0/865822380 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4384076df0 0x7f438410d660 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f436c00e9b0 tx=0x7f436c00ee80 comp rx=0 tx=0).stop
2026-03-09T19:31:12.028 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.029+0000 7f438b9da640 1 -- 192.168.123.107:0/865822380 shutdown_connections
2026-03-09T19:31:12.028 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.029+0000 7f438b9da640 1 --2- 192.168.123.107:0/865822380 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4358077890 0x7f4358079d50 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.028 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.029+0000 7f438b9da640 1 --2- 192.168.123.107:0/865822380 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4384076df0 0x7f438410d660 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.028 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.029+0000 7f438b9da640 1 --2- 192.168.123.107:0/865822380 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4384075ba0 0x7f438410d120 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.028 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.030+0000 7f438b9da640 1 -- 192.168.123.107:0/865822380 >> 192.168.123.107:0/865822380 conn(0x7f43840fe250 msgr2=0x7f4384100000 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:31:12.028 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.030+0000 7f438b9da640 1 -- 192.168.123.107:0/865822380 shutdown_connections
2026-03-09T19:31:12.029 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.030+0000 7f438b9da640 1 -- 192.168.123.107:0/865822380 wait complete.
2026-03-09T19:31:12.094 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.094+0000 7f8ca0841640 1 -- 192.168.123.107:0/3886223177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c98103c60 msgr2=0x7f8c981040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:12.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.094+0000 7f8ca0841640 1 --2- 192.168.123.107:0/3886223177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c98103c60 0x7f8c981040e0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f8c900099b0 tx=0x7f8c9002f220 comp rx=0 tx=0).stop
2026-03-09T19:31:12.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.095+0000 7f8ca0841640 1 -- 192.168.123.107:0/3886223177 shutdown_connections
2026-03-09T19:31:12.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.095+0000 7f8ca0841640 1 --2- 192.168.123.107:0/3886223177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c98103c60 0x7f8c981040e0 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.095+0000 7f8ca0841640 1 --2- 192.168.123.107:0/3886223177 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c98102a60 0x7f8c98102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.095+0000 7f8ca0841640 1 -- 192.168.123.107:0/3886223177 >> 192.168.123.107:0/3886223177 conn(0x7f8c980fe250 msgr2=0x7f8c98100670 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:31:12.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.095+0000 7f8ca0841640 1 -- 192.168.123.107:0/3886223177 shutdown_connections
2026-03-09T19:31:12.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.095+0000 7f8ca0841640 1 -- 192.168.123.107:0/3886223177 wait complete.
2026-03-09T19:31:12.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.095+0000 7f8ca0841640 1 Processor -- start
2026-03-09T19:31:12.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.096+0000 7f8ca0841640 1 -- start start
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.096+0000 7f8ca0841640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c98102a60 0x7f8c9819a430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.096+0000 7f8c9e5b6640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c98102a60 0x7f8c9819a430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.096+0000 7f8c9e5b6640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c98102a60 0x7f8c9819a430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:49272/0 (socket says 192.168.123.107:49272)
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.096+0000 7f8ca0841640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c98103c60 0x7f8c9819a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.096+0000 7f8c9e5b6640 1 -- 192.168.123.107:0/1003373299 learned_addr learned my addr 192.168.123.107:0/1003373299 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.096+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8c9819aeb0 con 0x7f8c98102a60
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.096+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8c9819b020 con 0x7f8c98103c60
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.096+0000 7f8c9ddb5640 1 --2- 192.168.123.107:0/1003373299 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c98103c60 0x7f8c9819a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.097+0000 7f8c9e5b6640 1 -- 192.168.123.107:0/1003373299 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c98103c60 msgr2=0x7f8c9819a970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.097+0000 7f8c9e5b6640 1 --2- 192.168.123.107:0/1003373299 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c98103c60 0x7f8c9819a970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.097+0000 7f8c9e5b6640 1 -- 192.168.123.107:0/1003373299 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8c90009660 con 0x7f8c98102a60
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.097+0000 7f8c9e5b6640 1 --2- 192.168.123.107:0/1003373299 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c98102a60 0x7f8c9819a430 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f8c8800cbf0 tx=0x7f8c88007590 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.097+0000 7f8c877fe640 1 -- 192.168.123.107:0/1003373299 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8c88007c30 con 0x7f8c98102a60
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.097+0000 7f8c877fe640 1 -- 192.168.123.107:0/1003373299 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8c88004530 con 0x7f8c98102a60
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.097+0000 7f8c877fe640 1 -- 192.168.123.107:0/1003373299 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8c8800f450 con 0x7f8c98102a60
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.097+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8c9819fb00 con 0x7f8c98102a60
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.097+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8c981a0000 con 0x7f8c98102a60
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.099+0000 7f8c877fe640 1 -- 192.168.123.107:0/1003373299 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8c880027e0 con 0x7f8c98102a60
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.099+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8c68005350 con 0x7f8c98102a60
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.099+0000 7f8c877fe640 1 --2- 192.168.123.107:0/1003373299 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c78077890 0x7f8c78079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.099+0000 7f8c877fe640 1 -- 192.168.123.107:0/1003373299 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6538+0+0 (secure 0 0 0) 0x7f8c880999c0 con 0x7f8c98102a60
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.099+0000 7f8c9ddb5640 1 --2- 192.168.123.107:0/1003373299 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c78077890 0x7f8c78079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.101+0000 7f8c9ddb5640 1 --2- 192.168.123.107:0/1003373299 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c78077890 0x7f8c78079d50 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f8c9819b950 tx=0x7f8c9003a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:31:12.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.102+0000 7f8c877fe640 1 -- 192.168.123.107:0/1003373299 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8c88062110 con 0x7f8c98102a60
2026-03-09T19:31:12.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.200+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f8c68002bf0 con 0x7f8c78077890
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.205+0000 7f8c877fe640 1 -- 192.168.123.107:0/1003373299 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f8c68002bf0 con 0x7f8c78077890
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (8m) 2m ago 8m 23.2M - 0.25.0 c8568f914cd2 976a1914d389
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (8m) 2m ago 8m 8938k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (8m) 2m ago 8m 10.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (2m) 2m ago 8m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (2m) 2m ago 8m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (8m) 2m ago 8m 88.6M - 9.4.7 954c08fa6188 34d173509259
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (6m) 2m ago 6m 16.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (6m) 2m ago 6m 18.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (6m) 2m ago 6m 28.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (6m) 2m ago 6m 239M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (3m) 2m ago 9m 593M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (2m) 2m ago 8m 489M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (2m) 2m ago 9m 56.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (2m) 2m ago 8m 46.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (8m) 2m ago 8m 14.3M - 1.5.0 0da6a335fe13 80bb004b27b8
2026-03-09T19:31:12.212 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (8m) 2m ago 8m 16.6M - 1.5.0 0da6a335fe13 fbd23e6c240f
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (2m) 2m ago 7m 30.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (7m) 2m ago 7m 394M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 2b3c7dd92144
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (7m) 2m ago 7m 318M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (7m) 2m ago 7m 441M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (7m) 2m ago 7m 446M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (7m) 2m ago 7m 348M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (2m) 2m ago 8m 44.5M - 2.43.0 a07b618ecd1d c09450c20f5f
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.210+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c78077890 msgr2=0x7f8c78079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.210+0000 7f8ca0841640 1 --2- 192.168.123.107:0/1003373299 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c78077890 0x7f8c78079d50 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f8c9819b950 tx=0x7f8c9003a040 comp rx=0 tx=0).stop
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.210+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c98102a60 msgr2=0x7f8c9819a430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.210+0000 7f8ca0841640 1 --2- 192.168.123.107:0/1003373299 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c98102a60 0x7f8c9819a430 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f8c8800cbf0 tx=0x7f8c88007590 comp rx=0 tx=0).stop
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.210+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 shutdown_connections
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.210+0000 7f8ca0841640 1 --2- 192.168.123.107:0/1003373299 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c78077890 0x7f8c78079d50 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.210+0000 7f8ca0841640 1 --2- 192.168.123.107:0/1003373299 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c98103c60 0x7f8c9819a970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.210+0000 7f8ca0841640 1 --2- 192.168.123.107:0/1003373299 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c98102a60 0x7f8c9819a430 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.210+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 >> 192.168.123.107:0/1003373299 conn(0x7f8c980fe250 msgr2=0x7f8c980ffd50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.210+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 shutdown_connections
2026-03-09T19:31:12.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.211+0000 7f8ca0841640 1 -- 192.168.123.107:0/1003373299 wait complete.
2026-03-09T19:31:12.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.272+0000 7f379ef76640 1 -- 192.168.123.107:0/353271131 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3798103c80 msgr2=0x7f3798104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:12.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.272+0000 7f379ef76640 1 --2- 192.168.123.107:0/353271131 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3798103c80 0x7f3798104100 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f37880099b0 tx=0x7f378802f220 comp rx=0 tx=0).stop
2026-03-09T19:31:12.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.272+0000 7f379ef76640 1 -- 192.168.123.107:0/353271131 shutdown_connections
2026-03-09T19:31:12.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.272+0000 7f379ef76640 1 --2- 192.168.123.107:0/353271131 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3798103c80 0x7f3798104100 secure :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f37880099b0 tx=0x7f378802f220 comp rx=0 tx=0).stop
2026-03-09T19:31:12.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.272+0000 7f379ef76640 1 --2- 192.168.123.107:0/353271131 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3798102a80 0x7f3798102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.272+0000 7f379ef76640 1 -- 192.168.123.107:0/353271131 >> 192.168.123.107:0/353271131 conn(0x7f37980fe250 msgr2=0x7f3798100670 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:31:12.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.273+0000 7f379ef76640 1 -- 192.168.123.107:0/353271131 shutdown_connections
2026-03-09T19:31:12.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.273+0000 7f379ef76640 1 -- 192.168.123.107:0/353271131 wait complete.
2026-03-09T19:31:12.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.273+0000 7f379ef76640 1 Processor -- start
2026-03-09T19:31:12.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.273+0000 7f379ef76640 1 -- start start
2026-03-09T19:31:12.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.273+0000 7f379ef76640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3798102a80 0x7f37980717f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:12.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.273+0000 7f379ef76640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3798071d30 0x7f37980721b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:12.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.273+0000 7f379ef76640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f37980731a0 con 0x7f3798102a80
2026-03-09T19:31:12.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.273+0000 7f379ef76640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3798079650 con 0x7f3798071d30
2026-03-09T19:31:12.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.274+0000 7f379cceb640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3798102a80 0x7f37980717f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:12.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.274+0000 7f379cceb640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3798102a80 0x7f37980717f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:49282/0 (socket says 192.168.123.107:49282)
2026-03-09T19:31:12.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.274+0000 7f379cceb640 1 -- 192.168.123.107:0/2063755179 learned_addr learned my addr 192.168.123.107:0/2063755179 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:31:12.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.274+0000 7f378ffff640 1 --2- 192.168.123.107:0/2063755179 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3798071d30 0x7f37980721b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:12.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.274+0000 7f379cceb640 1 -- 192.168.123.107:0/2063755179 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3798071d30 msgr2=0x7f37980721b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:12.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.274+0000 7f379cceb640 1 --2- 192.168.123.107:0/2063755179 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3798071d30 0x7f37980721b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.274+0000 7f379cceb640 1 -- 192.168.123.107:0/2063755179 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3788009660 con 0x7f3798102a80
2026-03-09T19:31:12.274 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.275+0000 7f379cceb640 1 --2- 192.168.123.107:0/2063755179 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3798102a80 0x7f37980717f0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f378000cc20 tx=0x7f3780007590 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:31:12.274 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.275+0000 7f378dffb640 1 -- 192.168.123.107:0/2063755179 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3780007cb0 con 0x7f3798102a80
2026-03-09T19:31:12.274 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.275+0000 7f378dffb640 1 -- 192.168.123.107:0/2063755179 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3780007e10 con 0x7f3798102a80
2026-03-09T19:31:12.276 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.275+0000 7f378dffb640 1 -- 192.168.123.107:0/2063755179 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f378000f660 con 0x7f3798102a80
2026-03-09T19:31:12.276 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.275+0000 7f379ef76640 1 -- 192.168.123.107:0/2063755179 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3798079790 con 0x7f3798102a80
2026-03-09T19:31:12.276 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.275+0000 7f379ef76640 1 -- 192.168.123.107:0/2063755179 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f37981a4b80 con 0x7f3798102a80
2026-03-09T19:31:12.280 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.281+0000 7f379ef76640 1 -- 192.168.123.107:0/2063755179 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3764005350 con 0x7f3798102a80
2026-03-09T19:31:12.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.284+0000 7f378dffb640 1 -- 192.168.123.107:0/2063755179 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3780002870 con 0x7f3798102a80
2026-03-09T19:31:12.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.284+0000 7f378dffb640 1 --2- 192.168.123.107:0/2063755179 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f37680779b0 0x7f3768079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:12.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.284+0000 7f378dffb640 1 -- 192.168.123.107:0/2063755179 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6538+0+0 (secure 0 0 0) 0x7f378001d030 con 0x7f3798102a80
2026-03-09T19:31:12.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.285+0000 7f378ffff640 1 --2- 192.168.123.107:0/2063755179 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f37680779b0 0x7f3768079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:12.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.285+0000 7f378dffb640 1 -- 192.168.123.107:0/2063755179 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3780062a30 con 0x7f3798102a80
2026-03-09T19:31:12.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.285+0000 7f378ffff640 1 --2- 192.168.123.107:0/2063755179 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f37680779b0 0x7f3768079e70 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f3798072f20 tx=0x7f378803a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:31:12.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.421+0000 7f379ef76640 1 -- 192.168.123.107:0/2063755179 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f37640058d0 con 0x7f3798102a80
2026-03-09T19:31:12.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.422+0000 7f378dffb640 1 -- 192.168.123.107:0/2063755179 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+844 (secure 0 0 0) 0x7f3780062180 con 0x7f3798102a80
2026-03-09T19:31:12.421 INFO:teuthology.orchestra.run.vm07.stdout:{
2026-03-09T19:31:12.421 INFO:teuthology.orchestra.run.vm07.stdout: "mon": {
2026-03-09T19:31:12.421 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T19:31:12.421 INFO:teuthology.orchestra.run.vm07.stdout: },
2026-03-09T19:31:12.421 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": {
2026-03-09T19:31:12.421 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T19:31:12.421 INFO:teuthology.orchestra.run.vm07.stdout: },
2026-03-09T19:31:12.421 INFO:teuthology.orchestra.run.vm07.stdout: "osd": {
2026-03-09T19:31:12.421 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 5,
2026-03-09T19:31:12.422 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-09T19:31:12.422 INFO:teuthology.orchestra.run.vm07.stdout: },
2026-03-09T19:31:12.422 INFO:teuthology.orchestra.run.vm07.stdout: "mds": {
2026-03-09T19:31:12.422 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4
2026-03-09T19:31:12.422 INFO:teuthology.orchestra.run.vm07.stdout: },
2026-03-09T19:31:12.422 INFO:teuthology.orchestra.run.vm07.stdout: "overall": {
2026-03-09T19:31:12.422 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c
(ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 9, 2026-03-09T19:31:12.422 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-09T19:31:12.422 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:31:12.422 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:31:12.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.425+0000 7f379ef76640 1 -- 192.168.123.107:0/2063755179 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f37680779b0 msgr2=0x7f3768079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:12.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.425+0000 7f379ef76640 1 --2- 192.168.123.107:0/2063755179 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f37680779b0 0x7f3768079e70 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f3798072f20 tx=0x7f378803a040 comp rx=0 tx=0).stop 2026-03-09T19:31:12.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.425+0000 7f379ef76640 1 -- 192.168.123.107:0/2063755179 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3798102a80 msgr2=0x7f37980717f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:12.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.425+0000 7f379ef76640 1 --2- 192.168.123.107:0/2063755179 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3798102a80 0x7f37980717f0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f378000cc20 tx=0x7f3780007590 comp rx=0 tx=0).stop 2026-03-09T19:31:12.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.425+0000 7f379ef76640 1 -- 192.168.123.107:0/2063755179 shutdown_connections 2026-03-09T19:31:12.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.425+0000 7f379ef76640 1 --2- 192.168.123.107:0/2063755179 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f37680779b0 0x7f3768079e70 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.426+0000 7f379ef76640 1 --2- 192.168.123.107:0/2063755179 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3798071d30 0x7f37980721b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.426+0000 7f379ef76640 1 --2- 192.168.123.107:0/2063755179 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3798102a80 0x7f37980717f0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.426+0000 7f379ef76640 1 -- 192.168.123.107:0/2063755179 >> 192.168.123.107:0/2063755179 conn(0x7f37980fe250 msgr2=0x7f37980ffde0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:12.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.426+0000 7f379ef76640 1 -- 192.168.123.107:0/2063755179 shutdown_connections 2026-03-09T19:31:12.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.426+0000 7f379ef76640 1 -- 192.168.123.107:0/2063755179 wait complete. 
2026-03-09T19:31:12.487 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.488+0000 7f1fd734a640 1 -- 192.168.123.107:0/190495868 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1fd00751a0 msgr2=0x7f1fd0073600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:12.487 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.488+0000 7f1fd734a640 1 --2- 192.168.123.107:0/190495868 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1fd00751a0 0x7f1fd0073600 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f1fc40099b0 tx=0x7f1fc402f240 comp rx=0 tx=0).stop 2026-03-09T19:31:12.487 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.489+0000 7f1fd734a640 1 -- 192.168.123.107:0/190495868 shutdown_connections 2026-03-09T19:31:12.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.489+0000 7f1fd734a640 1 --2- 192.168.123.107:0/190495868 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1fd0073b40 0x7f1fd0073fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.489+0000 7f1fd734a640 1 --2- 192.168.123.107:0/190495868 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1fd00751a0 0x7f1fd0073600 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.489+0000 7f1fd734a640 1 -- 192.168.123.107:0/190495868 >> 192.168.123.107:0/190495868 conn(0x7f1fd00fbf80 msgr2=0x7f1fd00fe3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:12.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.489+0000 7f1fd734a640 1 -- 192.168.123.107:0/190495868 shutdown_connections 2026-03-09T19:31:12.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.489+0000 7f1fd734a640 1 -- 192.168.123.107:0/190495868 wait 
complete. 2026-03-09T19:31:12.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.490+0000 7f1fd734a640 1 Processor -- start 2026-03-09T19:31:12.489 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.490+0000 7f1fd734a640 1 -- start start 2026-03-09T19:31:12.489 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.490+0000 7f1fd734a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1fd0073b40 0x7f1fd019e8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:12.489 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.490+0000 7f1fd50bf640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1fd0073b40 0x7f1fd019e8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:12.489 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.490+0000 7f1fd50bf640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1fd0073b40 0x7f1fd019e8f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:49294/0 (socket says 192.168.123.107:49294) 2026-03-09T19:31:12.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.490+0000 7f1fd734a640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1fd00751a0 0x7f1fd019ee30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:12.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.490+0000 7f1fd734a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1fd019f400 con 0x7f1fd0073b40 2026-03-09T19:31:12.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.490+0000 7f1fd734a640 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1fd019f570 con 0x7f1fd00751a0 2026-03-09T19:31:12.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.490+0000 7f1fd50bf640 1 -- 192.168.123.107:0/2589332439 learned_addr learned my addr 192.168.123.107:0/2589332439 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:31:12.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.491+0000 7f1fd48be640 1 --2- 192.168.123.107:0/2589332439 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1fd00751a0 0x7f1fd019ee30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:12.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.491+0000 7f1fd50bf640 1 -- 192.168.123.107:0/2589332439 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1fd00751a0 msgr2=0x7f1fd019ee30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:12.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.491+0000 7f1fd50bf640 1 --2- 192.168.123.107:0/2589332439 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1fd00751a0 0x7f1fd019ee30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.491+0000 7f1fd50bf640 1 -- 192.168.123.107:0/2589332439 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1fc4009660 con 0x7f1fd0073b40 2026-03-09T19:31:12.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.491+0000 7f1fd50bf640 1 --2- 192.168.123.107:0/2589332439 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1fd0073b40 0x7f1fd019e8f0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f1fc4004290 tx=0x7f1fc40042c0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:12.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.491+0000 7f1fbe7fc640 1 -- 192.168.123.107:0/2589332439 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1fc403d070 con 0x7f1fd0073b40 2026-03-09T19:31:12.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.491+0000 7f1fd734a640 1 -- 192.168.123.107:0/2589332439 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1fd01a3fb0 con 0x7f1fd0073b40 2026-03-09T19:31:12.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.491+0000 7f1fd734a640 1 -- 192.168.123.107:0/2589332439 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1fd01a4420 con 0x7f1fd0073b40 2026-03-09T19:31:12.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.491+0000 7f1fbe7fc640 1 -- 192.168.123.107:0/2589332439 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1fc402fcb0 con 0x7f1fd0073b40 2026-03-09T19:31:12.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.491+0000 7f1fbe7fc640 1 -- 192.168.123.107:0/2589332439 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1fc4041770 con 0x7f1fd0073b40 2026-03-09T19:31:12.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.493+0000 7f1fbe7fc640 1 -- 192.168.123.107:0/2589332439 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1fc4049050 con 0x7f1fd0073b40 2026-03-09T19:31:12.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.493+0000 7f1fbe7fc640 1 --2- 192.168.123.107:0/2589332439 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1fa40778e0 0x7f1fa4079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-09T19:31:12.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.494+0000 7f1fd48be640 1 --2- 192.168.123.107:0/2589332439 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1fa40778e0 0x7f1fa4079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:12.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.494+0000 7f1fbe7fc640 1 -- 192.168.123.107:0/2589332439 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6538+0+0 (secure 0 0 0) 0x7f1fc40bf3f0 con 0x7f1fd0073b40 2026-03-09T19:31:12.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.494+0000 7f1fd48be640 1 --2- 192.168.123.107:0/2589332439 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1fa40778e0 0x7f1fa4079da0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f1fd019fe10 tx=0x7f1fc0009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:12.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.495+0000 7f1fd734a640 1 -- 192.168.123.107:0/2589332439 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1f98005350 con 0x7f1fd0073b40 2026-03-09T19:31:12.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.497+0000 7f1fbe7fc640 1 -- 192.168.123.107:0/2589332439 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1fc4087a90 con 0x7f1fd0073b40 2026-03-09T19:31:12.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:12 vm08.local ceph-mon[103420]: osdmap e69: 6 total, 6 up, 6 in 2026-03-09T19:31:12.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:12 vm08.local ceph-mon[103420]: pgmap v105: 65 pgs: 1 
active+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 11 objects/s recovering 2026-03-09T19:31:12.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:12 vm08.local ceph-mon[103420]: from='client.34226 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:31:12.612 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:12 vm07.local ceph-mon[111841]: osdmap e69: 6 total, 6 up, 6 in 2026-03-09T19:31:12.612 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:12 vm07.local ceph-mon[111841]: pgmap v105: 65 pgs: 1 active+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 11 objects/s recovering 2026-03-09T19:31:12.612 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:12 vm07.local ceph-mon[111841]: from='client.34226 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:31:12.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.613+0000 7f1fd734a640 1 -- 192.168.123.107:0/2589332439 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f1f98005e10 con 0x7f1fd0073b40 2026-03-09T19:31:12.616 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.617+0000 7f1fbe7fc640 1 -- 192.168.123.107:0/2589332439 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1926 (secure 0 0 0) 0x7f1fc40871e0 con 0x7f1fd0073b40 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:31:12.617 
INFO:teuthology.orchestra.run.vm07.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:31:12.617 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:31:12.618 
INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 0 members: 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:31:12.618 
INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:31:12.618 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:31:12.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.621+0000 7f1fd734a640 1 -- 192.168.123.107:0/2589332439 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1fa40778e0 msgr2=0x7f1fa4079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:12.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.621+0000 7f1fd734a640 1 --2- 192.168.123.107:0/2589332439 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1fa40778e0 0x7f1fa4079da0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f1fd019fe10 tx=0x7f1fc0009290 comp rx=0 tx=0).stop 2026-03-09T19:31:12.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.621+0000 7f1fd734a640 1 -- 192.168.123.107:0/2589332439 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1fd0073b40 msgr2=0x7f1fd019e8f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:12.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.621+0000 7f1fd734a640 1 --2- 192.168.123.107:0/2589332439 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1fd0073b40 0x7f1fd019e8f0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f1fc4004290 tx=0x7f1fc40042c0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.621+0000 7f1fd734a640 1 -- 192.168.123.107:0/2589332439 shutdown_connections 2026-03-09T19:31:12.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.621+0000 7f1fd734a640 1 --2- 192.168.123.107:0/2589332439 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1fa40778e0 0x7f1fa4079da0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.621+0000 7f1fd734a640 1 --2- 192.168.123.107:0/2589332439 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1fd00751a0 0x7f1fd019ee30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.621+0000 7f1fd734a640 1 --2- 192.168.123.107:0/2589332439 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1fd0073b40 0x7f1fd019e8f0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.621+0000 7f1fd734a640 1 -- 192.168.123.107:0/2589332439 >> 192.168.123.107:0/2589332439 conn(0x7f1fd00fbf80 msgr2=0x7f1fd00fd940 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:12.621 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.621+0000 7f1fd734a640 1 -- 192.168.123.107:0/2589332439 shutdown_connections 2026-03-09T19:31:12.621 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.621+0000 7f1fd734a640 1 -- 192.168.123.107:0/2589332439 wait complete. 
2026-03-09T19:31:12.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.683+0000 7fb7cbf69640 1 -- 192.168.123.107:0/3829148319 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb7c40fece0 msgr2=0x7fb7c4106050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:12.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.683+0000 7fb7cbf69640 1 --2- 192.168.123.107:0/3829148319 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb7c40fece0 0x7fb7c4106050 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fb7b4009a00 tx=0x7fb7b402f280 comp rx=0 tx=0).stop 2026-03-09T19:31:12.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.683+0000 7fb7cbf69640 1 -- 192.168.123.107:0/3829148319 shutdown_connections 2026-03-09T19:31:12.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.683+0000 7fb7cbf69640 1 --2- 192.168.123.107:0/3829148319 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb7c40fece0 0x7fb7c4106050 secure :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fb7b4009a00 tx=0x7fb7b402f280 comp rx=0 tx=0).stop 2026-03-09T19:31:12.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.683+0000 7fb7cbf69640 1 --2- 192.168.123.107:0/3829148319 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7c40fe3a0 0x7fb7c40fe7a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.683+0000 7fb7cbf69640 1 -- 192.168.123.107:0/3829148319 >> 192.168.123.107:0/3829148319 conn(0x7fb7c40fa150 msgr2=0x7fb7c40fc570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:12.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.684+0000 7fb7cbf69640 1 -- 192.168.123.107:0/3829148319 shutdown_connections 2026-03-09T19:31:12.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.684+0000 7fb7cbf69640 1 -- 
192.168.123.107:0/3829148319 wait complete. 2026-03-09T19:31:12.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.684+0000 7fb7cbf69640 1 Processor -- start 2026-03-09T19:31:12.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.684+0000 7fb7cbf69640 1 -- start start 2026-03-09T19:31:12.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.684+0000 7fb7cbf69640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7c40fe3a0 0x7fb7c4196170 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:12.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.684+0000 7fb7cbf69640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb7c41966b0 0x7fb7c419b720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:12.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.684+0000 7fb7cbf69640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb7c4196b30 con 0x7fb7c41966b0 2026-03-09T19:31:12.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.684+0000 7fb7cbf69640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb7c4196ca0 con 0x7fb7c40fe3a0 2026-03-09T19:31:12.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.685+0000 7fb7c9cde640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7c40fe3a0 0x7fb7c4196170 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:12.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.685+0000 7fb7c9cde640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7c40fe3a0 0x7fb7c4196170 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:46988/0 (socket says 192.168.123.107:46988) 2026-03-09T19:31:12.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.685+0000 7fb7c9cde640 1 -- 192.168.123.107:0/93950069 learned_addr learned my addr 192.168.123.107:0/93950069 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:31:12.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.685+0000 7fb7c9cde640 1 -- 192.168.123.107:0/93950069 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb7c41966b0 msgr2=0x7fb7c419b720 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:31:12.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.685+0000 7fb7c94dd640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb7c41966b0 0x7fb7c419b720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:12.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.686+0000 7fb7c9cde640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb7c41966b0 0x7fb7c419b720 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:12.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.686+0000 7fb7c9cde640 1 -- 192.168.123.107:0/93950069 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb7b4009660 con 0x7fb7c40fe3a0 2026-03-09T19:31:12.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.686+0000 7fb7c9cde640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7c40fe3a0 0x7fb7c4196170 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb7ac003990 tx=0x7fb7ac003e60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:31:12.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.686+0000 7fb7c94dd640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb7c41966b0 0x7fb7c419b720 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T19:31:12.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.686+0000 7fb7baffd640 1 -- 192.168.123.107:0/93950069 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb7ac01c070 con 0x7fb7c40fe3a0
2026-03-09T19:31:12.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.686+0000 7fb7baffd640 1 -- 192.168.123.107:0/93950069 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb7ac009db0 con 0x7fb7c40fe3a0
2026-03-09T19:31:12.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.687+0000 7fb7cbf69640 1 -- 192.168.123.107:0/93950069 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb7c419bcc0 con 0x7fb7c40fe3a0
2026-03-09T19:31:12.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.687+0000 7fb7cbf69640 1 -- 192.168.123.107:0/93950069 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb7c419c190 con 0x7fb7c40fe3a0
2026-03-09T19:31:12.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.687+0000 7fb7cbf69640 1 -- 192.168.123.107:0/93950069 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb790005350 con 0x7fb7c40fe3a0
2026-03-09T19:31:12.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.687+0000 7fb7baffd640 1 -- 192.168.123.107:0/93950069 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb7ac014040 con 0x7fb7c40fe3a0
2026-03-09T19:31:12.688 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.689+0000 7fb7baffd640 1 -- 192.168.123.107:0/93950069 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb7ac005000 con 0x7fb7c40fe3a0
2026-03-09T19:31:12.688 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.689+0000 7fb7baffd640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb7a00776d0 0x7fb7a0079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:12.688 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.689+0000 7fb7baffd640 1 -- 192.168.123.107:0/93950069 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6538+0+0 (secure 0 0 0) 0x7fb7ac09dbd0 con 0x7fb7c40fe3a0
2026-03-09T19:31:12.688 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.690+0000 7fb7c94dd640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb7a00776d0 0x7fb7a0079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:12.689 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.690+0000 7fb7c94dd640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb7a00776d0 0x7fb7a0079b90 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb7c4197690 tx=0x7fb7b40023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:31:12.690 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.691+0000 7fb7baffd640 1 -- 192.168.123.107:0/93950069 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb7ac066320 con 0x7fb7c40fe3a0
2026-03-09T19:31:12.799 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.800+0000 7fb7cbf69640 1 -- 192.168.123.107:0/93950069 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb790002bf0 con 0x7fb7a00776d0
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.801+0000 7fb7baffd640 1 -- 192.168.123.107:0/93950069 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fb790002bf0 con 0x7fb7a00776d0
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stdout:{
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true,
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stdout: "mgr",
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stdout: "crash",
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stdout: "mon"
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stdout: ],
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "7/23 daemons upgraded",
2026-03-09T19:31:12.800 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Currently upgrading osd daemons",
2026-03-09T19:31:12.801 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false
2026-03-09T19:31:12.801 INFO:teuthology.orchestra.run.vm07.stdout:}
2026-03-09T19:31:12.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.804+0000 7fb7cbf69640 1 -- 192.168.123.107:0/93950069 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb7a00776d0 msgr2=0x7fb7a0079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:12.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.804+0000 7fb7cbf69640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb7a00776d0 0x7fb7a0079b90 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb7c4197690 tx=0x7fb7b40023d0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.804+0000 7fb7cbf69640 1 -- 192.168.123.107:0/93950069 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7c40fe3a0 msgr2=0x7fb7c4196170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:12.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.804+0000 7fb7cbf69640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7c40fe3a0 0x7fb7c4196170 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb7ac003990 tx=0x7fb7ac003e60 comp rx=0 tx=0).stop
2026-03-09T19:31:12.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.804+0000 7fb7cbf69640 1 -- 192.168.123.107:0/93950069 shutdown_connections
2026-03-09T19:31:12.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.804+0000 7fb7cbf69640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb7a00776d0 0x7fb7a0079b90 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.804+0000 7fb7cbf69640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb7c41966b0 0x7fb7c419b720 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.804+0000 7fb7cbf69640 1 --2- 192.168.123.107:0/93950069 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7c40fe3a0 0x7fb7c4196170 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.804 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.804+0000 7fb7cbf69640 1 -- 192.168.123.107:0/93950069 >> 192.168.123.107:0/93950069 conn(0x7fb7c40fa150 msgr2=0x7fb7c4104460 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:31:12.804 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.804+0000 7fb7cbf69640 1 -- 192.168.123.107:0/93950069 shutdown_connections
2026-03-09T19:31:12.804 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.804+0000 7fb7cbf69640 1 -- 192.168.123.107:0/93950069 wait complete.
2026-03-09T19:31:12.862 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.863+0000 7f618a76e640 1 -- 192.168.123.107:0/2020401234 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6184103c60 msgr2=0x7f61841040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:12.862 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.863+0000 7f618a76e640 1 --2- 192.168.123.107:0/2020401234 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6184103c60 0x7f61841040e0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f6170009a00 tx=0x7f617002f280 comp rx=0 tx=0).stop
2026-03-09T19:31:12.862 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.863+0000 7f618a76e640 1 -- 192.168.123.107:0/2020401234 shutdown_connections
2026-03-09T19:31:12.862 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.863+0000 7f618a76e640 1 --2- 192.168.123.107:0/2020401234 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6184103c60 0x7f61841040e0 secure :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f6170009a00 tx=0x7f617002f280 comp rx=0 tx=0).stop
2026-03-09T19:31:12.862 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.863+0000 7f618a76e640 1 --2- 192.168.123.107:0/2020401234 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6184102a60 0x7f6184102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.862 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.863+0000 7f618a76e640 1 -- 192.168.123.107:0/2020401234 >> 192.168.123.107:0/2020401234 conn(0x7f61840fe250 msgr2=0x7f6184100670 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:31:12.867 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.868+0000 7f618a76e640 1 -- 192.168.123.107:0/2020401234 shutdown_connections
2026-03-09T19:31:12.867 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.868+0000 7f618a76e640 1 -- 192.168.123.107:0/2020401234 wait complete.
2026-03-09T19:31:12.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.869+0000 7f618a76e640 1 Processor -- start
2026-03-09T19:31:12.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.869+0000 7f618a76e640 1 -- start start
2026-03-09T19:31:12.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.869+0000 7f618a76e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6184102a60 0x7f618419a680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:12.868 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.869+0000 7f618a76e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f618419abc0 0x7f618419fc30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:12.869 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.869+0000 7f618a76e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f618419b040 con 0x7f6184102a60
2026-03-09T19:31:12.869 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.869+0000 7f618a76e640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f618419b1b0 con 0x7f618419abc0
2026-03-09T19:31:12.869 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.869+0000 7f6183fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6184102a60 0x7f618419a680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:12.869 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.869+0000 7f6183fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6184102a60 0x7f618419a680 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:49336/0 (socket says 192.168.123.107:49336)
2026-03-09T19:31:12.869 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.869+0000 7f6183fff640 1 -- 192.168.123.107:0/1865308765 learned_addr learned my addr 192.168.123.107:0/1865308765 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:31:12.869 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.870+0000 7f6183fff640 1 -- 192.168.123.107:0/1865308765 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f618419abc0 msgr2=0x7f618419fc30 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-09T19:31:12.869 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.870+0000 7f6183fff640 1 --2- 192.168.123.107:0/1865308765 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f618419abc0 0x7f618419fc30 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:12.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.870+0000 7f6183fff640 1 -- 192.168.123.107:0/1865308765 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6170009660 con 0x7f6184102a60
2026-03-09T19:31:12.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.870+0000 7f6183fff640 1 --2- 192.168.123.107:0/1865308765 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6184102a60 0x7f618419a680 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f61740059c0 tx=0x7f617400bcf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:31:12.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.871+0000 7f61817fa640 1 -- 192.168.123.107:0/1865308765 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6174004490 con 0x7f6184102a60
2026-03-09T19:31:12.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.871+0000 7f61817fa640 1 -- 192.168.123.107:0/1865308765 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6174009ce0 con 0x7f6184102a60
2026-03-09T19:31:12.871 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.872+0000 7f61817fa640 1 -- 192.168.123.107:0/1865308765 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f617401d410 con 0x7f6184102a60
2026-03-09T19:31:12.872 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.872+0000 7f618a76e640 1 -- 192.168.123.107:0/1865308765 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6184075690 con 0x7f6184102a60
2026-03-09T19:31:12.874 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.872+0000 7f618a76e640 1 -- 192.168.123.107:0/1865308765 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6184075b10 con 0x7f6184102a60
2026-03-09T19:31:12.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.873+0000 7f618a76e640 1 -- 192.168.123.107:0/1865308765 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6148005350 con 0x7f6184102a60
2026-03-09T19:31:12.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.875+0000 7f61817fa640 1 -- 192.168.123.107:0/1865308765 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6174009820 con 0x7f6184102a60
2026-03-09T19:31:12.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.875+0000 7f61817fa640 1 --2- 192.168.123.107:0/1865308765 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6158077720 0x7f6158079be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:12.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.875+0000 7f61817fa640 1 -- 192.168.123.107:0/1865308765 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6538+0+0 (secure 0 0 0) 0x7f61740a20c0 con 0x7f6184102a60
2026-03-09T19:31:12.875 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.876+0000 7f61837fe640 1 --2- 192.168.123.107:0/1865308765 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6158077720 0x7f6158079be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:12.876 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.877+0000 7f61817fa640 1 -- 192.168.123.107:0/1865308765 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f617406a750 con 0x7f6184102a60
2026-03-09T19:31:12.876 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:12.877+0000 7f61837fe640 1 --2- 192.168.123.107:0/1865308765 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6158077720 0x7f6158079be0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f6170002c80 tx=0x7f61700023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:31:13.002 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.003+0000 7f618a76e640 1 -- 192.168.123.107:0/1865308765 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f6148005600 con 0x7f6184102a60
2026-03-09T19:31:13.002 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.003+0000 7f61817fa640 1 -- 192.168.123.107:0/1865308765 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+570 (secure 0 0 0) 0x7f6174069ea0 con 0x7f6184102a60
2026-03-09T19:31:13.002 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_WARN Degraded data redundancy: 229/291 objects degraded (78.694%), 2 pgs degraded, 3 pgs undersized
2026-03-09T19:31:13.002 INFO:teuthology.orchestra.run.vm07.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 229/291 objects degraded (78.694%), 2 pgs degraded, 3 pgs undersized
2026-03-09T19:31:13.002 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.11 is stuck undersized for 110s, current state active+recovering+undersized+degraded+remapped, last acting [3,4]
2026-03-09T19:31:13.002 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.17 is stuck undersized for 110s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,5]
2026-03-09T19:31:13.002 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.18 is stuck undersized for 110s, current state active+undersized+remapped, last acting [2,1]
2026-03-09T19:31:13.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.006+0000 7f618a76e640 1 -- 192.168.123.107:0/1865308765 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6158077720 msgr2=0x7f6158079be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:13.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.006+0000 7f618a76e640 1 --2- 192.168.123.107:0/1865308765 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6158077720 0x7f6158079be0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f6170002c80 tx=0x7f61700023d0 comp rx=0 tx=0).stop
2026-03-09T19:31:13.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.006+0000 7f618a76e640 1 -- 192.168.123.107:0/1865308765 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6184102a60 msgr2=0x7f618419a680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:13.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.006+0000 7f618a76e640 1 --2- 192.168.123.107:0/1865308765 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6184102a60 0x7f618419a680 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f61740059c0 tx=0x7f617400bcf0 comp rx=0 tx=0).stop
2026-03-09T19:31:13.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.006+0000 7f618a76e640 1 -- 192.168.123.107:0/1865308765 shutdown_connections
2026-03-09T19:31:13.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.006+0000 7f618a76e640 1 --2- 192.168.123.107:0/1865308765 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6158077720 0x7f6158079be0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:13.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.007+0000 7f618a76e640 1 --2- 192.168.123.107:0/1865308765 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f618419abc0 0x7f618419fc30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:13.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.007+0000 7f618a76e640 1 --2- 192.168.123.107:0/1865308765 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6184102a60 0x7f618419a680 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:13.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.007+0000 7f618a76e640 1 -- 192.168.123.107:0/1865308765 >> 192.168.123.107:0/1865308765 conn(0x7f61840fe250 msgr2=0x7f61840ffb30 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:31:13.006 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.007+0000 7f618a76e640 1 -- 192.168.123.107:0/1865308765 shutdown_connections
2026-03-09T19:31:13.006 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:13.007+0000 7f618a76e640 1 -- 192.168.123.107:0/1865308765 wait complete.
2026-03-09T19:31:13.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:13 vm08.local ceph-mon[103420]: from='client.34230 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:31:13.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:13 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 229/291 objects degraded (78.694%), 2 pgs degraded, 3 pgs undersized (PG_DEGRADED)
2026-03-09T19:31:13.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:13 vm08.local ceph-mon[103420]: from='client.34234 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:31:13.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:13 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/2063755179' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:13.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:13 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/2589332439' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T19:31:13.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:13 vm08.local ceph-mon[103420]: from='client.44203 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:31:13.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:13 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/1865308765' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T19:31:13.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:13 vm07.local ceph-mon[111841]: from='client.34230 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:31:13.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:13 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 229/291 objects degraded (78.694%), 2 pgs degraded, 3 pgs undersized (PG_DEGRADED)
2026-03-09T19:31:13.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:13 vm07.local ceph-mon[111841]: from='client.34234 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:31:13.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:13 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/2063755179' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:13.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:13 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/2589332439' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T19:31:13.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:13 vm07.local ceph-mon[111841]: from='client.44203 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T19:31:13.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:13 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1865308765' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T19:31:14.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:14 vm08.local ceph-mon[103420]: pgmap v106: 65 pgs: 1 active+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 5 objects/s recovering
2026-03-09T19:31:14.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:14 vm07.local ceph-mon[111841]: pgmap v106: 65 pgs: 1 active+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 5 objects/s recovering
2026-03-09T19:31:15.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:15 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:31:15.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:15 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:31:15.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:15 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:31:15.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:15 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:15.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:15 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:31:15.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:15 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:31:15.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:15 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:31:15.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:15 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:16.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:16 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:31:16.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:16 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:16.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:16 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:16.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:16 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:16.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:16 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T19:31:16.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:16 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T19:31:16.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:16 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline)
2026-03-09T19:31:16.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:16 vm08.local ceph-mon[103420]: pgmap v107: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 11 objects/s recovering
2026-03-09T19:31:16.629 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:16 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:31:16.629 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:16 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:16.629 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:16 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:16.629 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:16 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:16.629 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:16 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T19:31:16.629 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:16 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T19:31:16.629 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:16 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline)
2026-03-09T19:31:16.629 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:16 vm07.local ceph-mon[111841]: pgmap v107: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 11 objects/s recovering
2026-03-09T19:31:18.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:18 vm07.local ceph-mon[111841]: pgmap v108: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 11 objects/s recovering
2026-03-09T19:31:18.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:18 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 229/291 objects degraded (78.694%), 2 pgs degraded, 2 pgs undersized (PG_DEGRADED)
2026-03-09T19:31:18.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:31:19.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:18 vm08.local ceph-mon[103420]: pgmap v108: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 11 objects/s recovering
2026-03-09T19:31:19.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:18 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 229/291 objects degraded (78.694%), 2 pgs degraded, 2 pgs undersized (PG_DEGRADED)
2026-03-09T19:31:19.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:31:20.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:20 vm07.local ceph-mon[111841]: pgmap v109: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 4 objects/s recovering
2026-03-09T19:31:21.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:20 vm08.local ceph-mon[103420]: pgmap v109: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 4 objects/s recovering
2026-03-09T19:31:22.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:22 vm07.local ceph-mon[111841]: pgmap v110: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 8 objects/s recovering
2026-03-09T19:31:23.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:22 vm08.local ceph-mon[103420]: pgmap v110:
65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 8 objects/s recovering 2026-03-09T19:31:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:24 vm07.local ceph-mon[111841]: pgmap v111: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:25.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:24 vm08.local ceph-mon[103420]: pgmap v111: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 229/291 objects degraded (78.694%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:26.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:25 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 201/291 objects degraded (69.072%), 2 pgs degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-09T19:31:26.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:25 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 201/291 objects degraded (69.072%), 2 pgs degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-09T19:31:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:26 vm08.local ceph-mon[103420]: pgmap v112: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 201/291 objects degraded (69.072%); 0 B/s, 11 objects/s recovering 2026-03-09T19:31:27.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:26 vm07.local ceph-mon[111841]: pgmap v112: 65 pgs: 1 
active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 201/291 objects degraded (69.072%); 0 B/s, 11 objects/s recovering 2026-03-09T19:31:28.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:28 vm07.local ceph-mon[111841]: pgmap v113: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 201/291 objects degraded (69.072%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:28 vm08.local ceph-mon[103420]: pgmap v113: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 201/291 objects degraded (69.072%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:30.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:30 vm07.local ceph-mon[111841]: pgmap v114: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 201/291 objects degraded (69.072%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:30.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:30 vm07.local ceph-mon[111841]: osdmap e70: 6 total, 6 up, 6 in 2026-03-09T19:31:30.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:30 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:31:30.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:30 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:30.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:31:30 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T19:31:30.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:30 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:31:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:30 vm08.local ceph-mon[103420]: pgmap v114: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 201/291 objects degraded (69.072%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:30 vm08.local ceph-mon[103420]: osdmap e70: 6 total, 6 up, 6 in 2026-03-09T19:31:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:30 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:31:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:30 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:30 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T19:31:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:30 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:31:31.792 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:31 vm07.local systemd[1]: Stopping Ceph osd.1 for 
17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:31:31.792 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:31 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[75486]: 2026-03-09T19:31:31.685+0000 7f5755204640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:31:31.792 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:31 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[75486]: 2026-03-09T19:31:31.685+0000 7f5755204640 -1 osd.1 71 *** Got signal Terminated *** 2026-03-09T19:31:31.793 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:31 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[75486]: 2026-03-09T19:31:31.685+0000 7f5755204640 -1 osd.1 71 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:31:32.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:31 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:31:32.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:31 vm08.local ceph-mon[103420]: Upgrade: osd.1 is safe to restart 2026-03-09T19:31:32.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:31 vm08.local ceph-mon[103420]: Upgrade: Updating osd.1 2026-03-09T19:31:32.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:31 vm08.local ceph-mon[103420]: Deploying daemon osd.1 on vm07 2026-03-09T19:31:32.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:31 vm08.local ceph-mon[103420]: osdmap e71: 6 total, 6 up, 6 in 2026-03-09T19:31:32.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:31 vm08.local ceph-mon[103420]: osd.1 marked itself down and dead 2026-03-09T19:31:32.129 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:31 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T19:31:32.129 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:31 vm07.local ceph-mon[111841]: Upgrade: osd.1 is safe to restart 2026-03-09T19:31:32.129 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:31 vm07.local ceph-mon[111841]: Upgrade: Updating osd.1 2026-03-09T19:31:32.129 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:31 vm07.local ceph-mon[111841]: Deploying daemon osd.1 on vm07 2026-03-09T19:31:32.129 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:31 vm07.local ceph-mon[111841]: osdmap e71: 6 total, 6 up, 6 in 2026-03-09T19:31:32.129 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:31 vm07.local ceph-mon[111841]: osd.1 marked itself down and dead 2026-03-09T19:31:32.129 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:31 vm07.local podman[125792]: 2026-03-09 19:31:31.906067249 +0000 UTC m=+0.237979968 container died 2b3c7dd9214428b779e6f3407d546babe6b54c934ce19db842242751f6b311cd (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, ceph=True, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default) 2026-03-09T19:31:32.129 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:31 vm07.local podman[125792]: 2026-03-09 19:31:31.934726629 +0000 UTC m=+0.266639348 container remove 
2b3c7dd9214428b779e6f3407d546babe6b54c934ce19db842242751f6b311cd (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T19:31:32.129 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:31 vm07.local bash[125792]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1 2026-03-09T19:31:32.381 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125856]: 2026-03-09 19:31:32.131280768 +0000 UTC m=+0.019431686 container create c1e87577bdfa11bb3c4a7454077a6c58e4bdebb8728d46e019e089a8b0f961f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-deactivate, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, 
OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:31:32.381 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125856]: 2026-03-09 19:31:32.18096262 +0000 UTC m=+0.069113538 container init c1e87577bdfa11bb3c4a7454077a6c58e4bdebb8728d46e019e089a8b0f961f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-deactivate, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T19:31:32.381 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125856]: 2026-03-09 19:31:32.184050842 +0000 UTC m=+0.072201760 container start c1e87577bdfa11bb3c4a7454077a6c58e4bdebb8728d46e019e089a8b0f961f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-deactivate, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20260223, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T19:31:32.381 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125856]: 2026-03-09 19:31:32.192520616 +0000 UTC m=+0.080671525 container attach c1e87577bdfa11bb3c4a7454077a6c58e4bdebb8728d46e019e089a8b0f961f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) 2026-03-09T19:31:32.381 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125856]: 2026-03-09 19:31:32.123769738 +0000 UTC m=+0.011920656 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:31:32.381 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125856]: 2026-03-09 19:31:32.328549987 +0000 UTC m=+0.216700905 container died c1e87577bdfa11bb3c4a7454077a6c58e4bdebb8728d46e019e089a8b0f961f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-deactivate, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid) 2026-03-09T19:31:32.381 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125856]: 2026-03-09 19:31:32.369670432 +0000 UTC m=+0.257821350 container remove c1e87577bdfa11bb3c4a7454077a6c58e4bdebb8728d46e019e089a8b0f961f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-deactivate, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:31:32.728 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.1.service: Deactivated successfully. 
2026-03-09T19:31:32.729 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local systemd[1]: Stopped Ceph osd.1 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 2026-03-09T19:31:32.729 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.1.service: Consumed 51.369s CPU time. 2026-03-09T19:31:32.729 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local systemd[1]: Starting Ceph osd.1 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:31:33.019 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:32 vm07.local ceph-mon[111841]: pgmap v117: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+undersized+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 51/291 objects degraded (17.526%); 0 B/s, 11 objects/s recovering 2026-03-09T19:31:33.019 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:32 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 51/291 objects degraded (17.526%), 1 pg degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-09T19:31:33.019 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:32 vm07.local ceph-mon[111841]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T19:31:33.019 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:32 vm07.local ceph-mon[111841]: osdmap e72: 6 total, 5 up, 6 in 2026-03-09T19:31:33.019 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125962]: 2026-03-09 19:31:32.759193912 +0000 UTC m=+0.029219268 container create 1b6ed8d57e193dd99d81a2f701cdd97b2cfadae733262e4a2b720b205ca581b3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, 
org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T19:31:33.019 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125962]: 2026-03-09 19:31:32.744649495 +0000 UTC m=+0.014674862 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:31:33.020 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125962]: 2026-03-09 19:31:32.85156186 +0000 UTC m=+0.121587216 container init 1b6ed8d57e193dd99d81a2f701cdd97b2cfadae733262e4a2b720b205ca581b3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_REF=squid) 2026-03-09T19:31:33.020 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125962]: 2026-03-09 19:31:32.860623604 +0000 UTC m=+0.130648960 container 
start 1b6ed8d57e193dd99d81a2f701cdd97b2cfadae733262e4a2b720b205ca581b3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, CEPH_REF=squid) 2026-03-09T19:31:33.020 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local podman[125962]: 2026-03-09 19:31:32.884578395 +0000 UTC m=+0.154603751 container attach 1b6ed8d57e193dd99d81a2f701cdd97b2cfadae733262e4a2b720b205ca581b3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3) 2026-03-09T19:31:33.020 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 
vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:33.020 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:32 vm07.local bash[125962]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:33.020 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:32 vm08.local ceph-mon[103420]: pgmap v117: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+undersized+remapped, 63 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 51/291 objects degraded (17.526%); 0 B/s, 11 objects/s recovering 2026-03-09T19:31:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:32 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 51/291 objects degraded (17.526%), 1 pg degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-09T19:31:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:32 vm08.local ceph-mon[103420]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T19:31:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:32 vm08.local ceph-mon[103420]: osdmap e72: 6 total, 5 up, 6 in 2026-03-09T19:31:33.478 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local bash[125962]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:33.888 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T19:31:33.888 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: Running command: /usr/bin/ceph-authtool --gen-print-key 
2026-03-09T19:31:33.888 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local bash[125962]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T19:31:33.888 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local bash[125962]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:33.888 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:33.888 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local bash[125962]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:33.888 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-09T19:31:33.888 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local bash[125962]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-09T19:31:33.888 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-6d60fa05-2322-44fa-b7ab-7e4d57ce1bb7/osd-block-0f0316a2-1b3a-4bd0-b463-b3d326b0fb51 --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-09T19:31:33.888 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local bash[125962]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-6d60fa05-2322-44fa-b7ab-7e4d57ce1bb7/osd-block-0f0316a2-1b3a-4bd0-b463-b3d326b0fb51 --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-09T19:31:33.888 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: Running command: 
/usr/bin/ln -snf /dev/ceph-6d60fa05-2322-44fa-b7ab-7e4d57ce1bb7/osd-block-0f0316a2-1b3a-4bd0-b463-b3d326b0fb51 /var/lib/ceph/osd/ceph-1/block
2026-03-09T19:31:34.166 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-mon[111841]: osdmap e73: 6 total, 5 up, 6 in
2026-03-09T19:31:34.166 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:34.166 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local bash[125962]: Running command: /usr/bin/ln -snf /dev/ceph-6d60fa05-2322-44fa-b7ab-7e4d57ce1bb7/osd-block-0f0316a2-1b3a-4bd0-b463-b3d326b0fb51 /var/lib/ceph/osd/ceph-1/block
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local bash[125962]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local bash[125962]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local bash[125962]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate[125976]: --> ceph-volume lvm activate successful for osd ID: 1
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local bash[125962]: --> ceph-volume lvm activate successful for osd ID: 1
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:33 vm07.local podman[125962]: 2026-03-09 19:31:33.924738705 +0000 UTC m=+1.194764061 container died 1b6ed8d57e193dd99d81a2f701cdd97b2cfadae733262e4a2b720b205ca581b3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:34 vm07.local podman[125962]: 2026-03-09 19:31:34.049001549 +0000 UTC m=+1.319026905 container remove 1b6ed8d57e193dd99d81a2f701cdd97b2cfadae733262e4a2b720b205ca581b3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-activate, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
2026-03-09T19:31:34.167 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:34 vm07.local podman[126233]: 2026-03-09 19:31:34.168798784 +0000 UTC m=+0.020977561 container create 36b65f1069e55fbe78505c35f7e379b4cd8768055022b8ec0c7e226e3b04371c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/)
2026-03-09T19:31:34.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:34 vm08.local ceph-mon[103420]: osdmap e73: 6 total, 5 up, 6 in
2026-03-09T19:31:34.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:34 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:34.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:34 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:31:34.479 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:34 vm07.local podman[126233]: 2026-03-09 19:31:34.208031626 +0000 UTC m=+0.060210403 container init 36b65f1069e55fbe78505c35f7e379b4cd8768055022b8ec0c7e226e3b04371c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df)
2026-03-09T19:31:34.479 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:34 vm07.local podman[126233]: 2026-03-09 19:31:34.212513124 +0000 UTC m=+0.064691901 container start 36b65f1069e55fbe78505c35f7e379b4cd8768055022b8ec0c7e226e3b04371c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df)
2026-03-09T19:31:34.479 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:34 vm07.local bash[126233]: 36b65f1069e55fbe78505c35f7e379b4cd8768055022b8ec0c7e226e3b04371c
2026-03-09T19:31:34.479 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:34 vm07.local podman[126233]: 2026-03-09 19:31:34.160679835 +0000 UTC m=+0.012858622 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T19:31:34.479 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:34 vm07.local systemd[1]: Started Ceph osd.1 for 17715774-1bed-11f1-9ad8-1bc9d74ff594.
2026-03-09T19:31:34.876 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:34 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[126243]: 2026-03-09T19:31:34.799+0000 7ff222d2e740 -1 Falling back to public interface
2026-03-09T19:31:35.132 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:34 vm07.local ceph-mon[111841]: pgmap v120: 65 pgs: 12 stale+active+clean, 1 active+recovering+undersized+degraded+remapped, 1 active+undersized+remapped, 51 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 51/291 objects degraded (17.526%); 0 B/s, 11 objects/s recovering
2026-03-09T19:31:35.132 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:35.132 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:35.132 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:31:35.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:35 vm08.local ceph-mon[103420]: pgmap v120: 65 pgs: 12 stale+active+clean, 1 active+recovering+undersized+degraded+remapped, 1 active+undersized+remapped, 51 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 51/291 objects degraded (17.526%); 0 B/s, 11 objects/s recovering
2026-03-09T19:31:35.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:35 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:35.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:35 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:35.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:35 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:31:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:36 vm07.local ceph-mon[111841]: pgmap v121: 65 pgs: 3 stale+active+clean, 1 active+recovering+undersized+degraded+remapped, 16 active+undersized, 13 active+undersized+degraded, 32 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 88/291 objects degraded (30.241%); 0 B/s, 7 objects/s recovering
2026-03-09T19:31:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:36.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:36 vm08.local ceph-mon[103420]: pgmap v121: 65 pgs: 3 stale+active+clean, 1 active+recovering+undersized+degraded+remapped, 16 active+undersized, 13 active+undersized+degraded, 32 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 88/291 objects degraded (30.241%); 0 B/s, 7 objects/s recovering
2026-03-09T19:31:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:36 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:36 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:36 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: pgmap v122: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 19 active+undersized, 15 active+undersized+degraded, 30 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 91/291 objects degraded (31.271%)
2026-03-09T19:31:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:31:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:31:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:31:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T19:31:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T19:31:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (13 PGs are or would become offline)
2026-03-09T19:31:38.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:38 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 91/291 objects degraded (31.271%), 16 pgs degraded, 1 pg undersized (PG_DEGRADED)
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: pgmap v122: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 19 active+undersized, 15 active+undersized+degraded, 30 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 91/291 objects degraded (31.271%)
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (13 PGs are or would become offline)
2026-03-09T19:31:39.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:38 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 91/291 objects degraded (31.271%), 16 pgs degraded, 1 pg undersized (PG_DEGRADED)
2026-03-09T19:31:39.728 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:39 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[126243]: 2026-03-09T19:31:39.340+0000 7ff222d2e740 -1 osd.1 71 log_to_monitors true
2026-03-09T19:31:40.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:39 vm08.local ceph-mon[103420]: from='osd.1 [v2:192.168.123.107:6810/2870896909,v1:192.168.123.107:6811/2870896909]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
2026-03-09T19:31:40.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:39 vm07.local ceph-mon[111841]: from='osd.1 [v2:192.168.123.107:6810/2870896909,v1:192.168.123.107:6811/2870896909]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
2026-03-09T19:31:40.228 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:31:39 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[126243]: 2026-03-09T19:31:39.762+0000 7ff21aac8640 -1 osd.1 71 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-09T19:31:41.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:40 vm08.local ceph-mon[103420]: pgmap v123: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 19 active+undersized, 15 active+undersized+degraded, 30 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 91/291 objects degraded (31.271%)
2026-03-09T19:31:41.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:40 vm08.local ceph-mon[103420]: from='osd.1 [v2:192.168.123.107:6810/2870896909,v1:192.168.123.107:6811/2870896909]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
2026-03-09T19:31:41.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:40 vm08.local ceph-mon[103420]: osdmap e74: 6 total, 5 up, 6 in
2026-03-09T19:31:41.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:40 vm08.local ceph-mon[103420]: from='osd.1 [v2:192.168.123.107:6810/2870896909,v1:192.168.123.107:6811/2870896909]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-09T19:31:41.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:40 vm07.local ceph-mon[111841]: pgmap v123: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 19 active+undersized, 15 active+undersized+degraded, 30 active+clean; 218 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 91/291 objects degraded (31.271%)
2026-03-09T19:31:41.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:40 vm07.local ceph-mon[111841]: from='osd.1 [v2:192.168.123.107:6810/2870896909,v1:192.168.123.107:6811/2870896909]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
2026-03-09T19:31:41.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:40 vm07.local ceph-mon[111841]: osdmap e74: 6 total, 5 up, 6 in
2026-03-09T19:31:41.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:40 vm07.local ceph-mon[111841]: from='osd.1 [v2:192.168.123.107:6810/2870896909,v1:192.168.123.107:6811/2870896909]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-09T19:31:42.089 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:41 vm07.local ceph-mon[111841]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-09T19:31:42.089 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:41 vm07.local ceph-mon[111841]: osd.1 [v2:192.168.123.107:6810/2870896909,v1:192.168.123.107:6811/2870896909] boot
2026-03-09T19:31:42.089 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:41 vm07.local ceph-mon[111841]: osdmap e75: 6 total, 6 up, 6 in
2026-03-09T19:31:42.089 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:41 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T19:31:42.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:41 vm08.local ceph-mon[103420]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-09T19:31:42.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:41 vm08.local ceph-mon[103420]: osd.1 [v2:192.168.123.107:6810/2870896909,v1:192.168.123.107:6811/2870896909] boot
2026-03-09T19:31:42.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:41 vm08.local ceph-mon[103420]: osdmap e75: 6 total, 6 up, 6 in
2026-03-09T19:31:42.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:41 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T19:31:43.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:42 vm07.local ceph-mon[111841]: pgmap v126: 65 pgs: 2 peering, 1 active+recovering+undersized+degraded+remapped, 17 active+undersized, 15 active+undersized+degraded, 30 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 91/291 objects degraded (31.271%)
2026-03-09T19:31:43.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:42 vm07.local ceph-mon[111841]: osdmap e76: 6 total, 6 up, 6 in
2026-03-09T19:31:43.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.071+0000 7f1443bd6640 1 -- 192.168.123.107:0/3151772634 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f143c0ff6e0 msgr2=0x7f143c0ffae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:43.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.071+0000 7f1443bd6640 1 --2- 192.168.123.107:0/3151772634 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f143c0ff6e0 0x7f143c0ffae0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f14300099b0 tx=0x7f143002f220 comp rx=0 tx=0).stop
2026-03-09T19:31:43.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.072+0000 7f1443bd6640 1 -- 192.168.123.107:0/3151772634 shutdown_connections
2026-03-09T19:31:43.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.072+0000 7f1443bd6640 1 --2- 192.168.123.107:0/3151772634 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f143c1008e0 0x7f143c100d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:43.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.072+0000 7f1443bd6640 1 --2- 192.168.123.107:0/3151772634 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f143c0ff6e0 0x7f143c0ffae0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:43.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.073+0000 7f1443bd6640 1 -- 192.168.123.107:0/3151772634 >> 192.168.123.107:0/3151772634 conn(0x7f143c0fae50 msgr2=0x7f143c0fd2b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:31:43.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.073+0000 7f1443bd6640 1 -- 192.168.123.107:0/3151772634 shutdown_connections
2026-03-09T19:31:43.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.073+0000 7f1443bd6640 1 -- 192.168.123.107:0/3151772634 wait complete.
2026-03-09T19:31:43.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.074+0000 7f1443bd6640 1 Processor -- start
2026-03-09T19:31:43.073 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.074+0000 7f1443bd6640 1 -- start start
2026-03-09T19:31:43.073 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.074+0000 7f1443bd6640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f143c0ff6e0 0x7f143c198230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:43.073 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.074+0000 7f1443bd6640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f143c1008e0 0x7f143c198770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:43.073 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.074+0000 7f1443bd6640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f143c198d40 con 0x7f143c0ff6e0
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.074+0000 7f1443bd6640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f143c198eb0 con 0x7f143c1008e0
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.074+0000 7f144194b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f143c0ff6e0 0x7f143c198230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.074+0000 7f144194b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f143c0ff6e0 0x7f143c198230 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:49432/0 (socket says 192.168.123.107:49432)
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.074+0000 7f144194b640 1 -- 192.168.123.107:0/569037445 learned_addr learned my addr 192.168.123.107:0/569037445 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.075+0000 7f144194b640 1 -- 192.168.123.107:0/569037445 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f143c1008e0 msgr2=0x7f143c198770 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.075+0000 7f144114a640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f143c1008e0 0x7f143c198770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.075+0000 7f144194b640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f143c1008e0 0x7f143c198770 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.075+0000 7f144194b640 1 -- 192.168.123.107:0/569037445 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f142c009590 con 0x7f143c0ff6e0
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.075+0000 7f144114a640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f143c1008e0 0x7f143c198770 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.075+0000 7f144194b640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f143c0ff6e0 0x7f143c198230 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f143002f730 tx=0x7f14300043d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.076+0000 7f142affd640 1 -- 192.168.123.107:0/569037445 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f143002fd10 con 0x7f143c0ff6e0
2026-03-09T19:31:43.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.076+0000 7f1443bd6640 1 -- 192.168.123.107:0/569037445 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1430009660 con 0x7f143c0ff6e0
2026-03-09T19:31:43.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.076+0000 7f1443bd6640 1 -- 192.168.123.107:0/569037445 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f143c105e80 con 0x7f143c0ff6e0
2026-03-09T19:31:43.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.076+0000 7f142affd640 1 -- 192.168.123.107:0/569037445 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f143002fe70 con 0x7f143c0ff6e0
2026-03-09T19:31:43.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.076+0000 7f142affd640 1 -- 192.168.123.107:0/569037445 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1430038af0 con 0x7f143c0ff6e0
2026-03-09T19:31:43.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.077+0000 7f142affd640 1 -- 192.168.123.107:0/569037445 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1430038c50 con 0x7f143c0ff6e0
2026-03-09T19:31:43.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.078+0000 7f142affd640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f14100779b0 0x7f1410079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:31:43.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.078+0000 7f142affd640 1 -- 192.168.123.107:0/569037445 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(76..76 src has 1..76) v4 ==== 6509+0+0 (secure 0 0 0) 0x7f14300c2fa0 con 0x7f143c0ff6e0
2026-03-09T19:31:43.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.078+0000 7f144114a640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f14100779b0 0x7f1410079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:31:43.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.078+0000 7f1443bd6640 1 -- 192.168.123.107:0/569037445 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1404005350 con 0x7f143c0ff6e0
2026-03-09T19:31:43.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.081+0000 7f144114a640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f14100779b0 0x7f1410079e70 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f143c199750 tx=0x7f142c009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:31:43.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.081+0000 7f142affd640 1 -- 192.168.123.107:0/569037445 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f143008c620 con 0x7f143c0ff6e0
2026-03-09T19:31:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:42 vm08.local ceph-mon[103420]: pgmap v126: 65 pgs: 2 peering, 1 active+recovering+undersized+degraded+remapped, 17 active+undersized, 15 active+undersized+degraded, 30 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 91/291 objects degraded (31.271%)
2026-03-09T19:31:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:42 vm08.local ceph-mon[103420]: osdmap e76: 6 total, 6 up, 6 in
2026-03-09T19:31:43.206 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.207+0000 7f1443bd6640 1 -- 192.168.123.107:0/569037445 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1404002bf0 con 0x7f14100779b0
2026-03-09T19:31:43.208 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.209+0000 7f142affd640 1 -- 192.168.123.107:0/569037445 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f1404002bf0 con 0x7f14100779b0
2026-03-09T19:31:43.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.212+0000 7f1428ff9640 1 -- 192.168.123.107:0/569037445 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f14100779b0 msgr2=0x7f1410079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:43.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.212+0000 7f1428ff9640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f14100779b0 0x7f1410079e70 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f143c199750 tx=0x7f142c009290 comp rx=0 tx=0).stop
2026-03-09T19:31:43.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.212+0000 7f1428ff9640 1 -- 192.168.123.107:0/569037445 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f143c0ff6e0 msgr2=0x7f143c198230 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:43.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.212+0000 7f1428ff9640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f143c0ff6e0 0x7f143c198230 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f143002f730 tx=0x7f14300043d0 comp rx=0 tx=0).stop
2026-03-09T19:31:43.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.213+0000 7f1428ff9640 1 -- 192.168.123.107:0/569037445 shutdown_connections
2026-03-09T19:31:43.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.213+0000 7f1428ff9640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f14100779b0 0x7f1410079e70 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:43.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.213+0000 7f1428ff9640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f143c1008e0 0x7f143c198770 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:43.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.213+0000 7f1428ff9640 1 --2- 192.168.123.107:0/569037445 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f143c0ff6e0 0x7f143c198230 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:43.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.213+0000 7f1428ff9640 1 -- 192.168.123.107:0/569037445 >> 192.168.123.107:0/569037445 conn(0x7f143c0fae50 msgr2=0x7f143c0fc680 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:31:43.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.213+0000 7f1428ff9640 1 -- 192.168.123.107:0/569037445 shutdown_connections
2026-03-09T19:31:43.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.213+0000 7f1428ff9640 1 -- 192.168.123.107:0/569037445 wait complete.
2026-03-09T19:31:43.227 INFO:teuthology.orchestra.run.vm07.stdout:true
2026-03-09T19:31:43.297 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.298+0000 7f041bc1d640 1 -- 192.168.123.107:0/3169773625 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0414073b40 msgr2=0x7f0414073fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:31:43.297 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.298+0000 7f041bc1d640 1 --2- 192.168.123.107:0/3169773625 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0414073b40 0x7f0414073fa0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f04040099b0 tx=0x7f040402f220 comp rx=0 tx=0).stop
2026-03-09T19:31:43.297 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.298+0000 7f041bc1d640 1 -- 192.168.123.107:0/3169773625 shutdown_connections
2026-03-09T19:31:43.297 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.298+0000 7f041bc1d640 1 --2- 192.168.123.107:0/3169773625 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0414073b40 0x7f0414073fa0 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:43.297 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.298+0000 7f041bc1d640 1 --2- 192.168.123.107:0/3169773625 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f04140751a0 0x7f0414073600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:31:43.297 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.298+0000 7f041bc1d640 1 -- 192.168.123.107:0/3169773625 >> 192.168.123.107:0/3169773625 conn(0x7f04140fbf80 msgr2=0x7f04140fe3e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:31:43.297
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.298+0000 7f041bc1d640 1 -- 192.168.123.107:0/3169773625 shutdown_connections 2026-03-09T19:31:43.297 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.298+0000 7f041bc1d640 1 -- 192.168.123.107:0/3169773625 wait complete. 2026-03-09T19:31:43.298 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.299+0000 7f041bc1d640 1 Processor -- start 2026-03-09T19:31:43.298 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.299+0000 7f041bc1d640 1 -- start start 2026-03-09T19:31:43.298 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.299+0000 7f041bc1d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0414073b40 0x7f041419a470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.298 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.299+0000 7f041bc1d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f04140751a0 0x7f041419a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.298 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.299+0000 7f041bc1d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f041419af80 con 0x7f0414073b40 2026-03-09T19:31:43.298 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.299+0000 7f041bc1d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f041419b0f0 con 0x7f04140751a0 2026-03-09T19:31:43.298 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.299+0000 7f0419992640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0414073b40 0x7f041419a470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:43.298 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.300+0000 7f0419191640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f04140751a0 0x7f041419a9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:43.298 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.300+0000 7f0419191640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f04140751a0 0x7f041419a9b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:53912/0 (socket says 192.168.123.107:53912) 2026-03-09T19:31:43.299 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.300+0000 7f0419191640 1 -- 192.168.123.107:0/3489594187 learned_addr learned my addr 192.168.123.107:0/3489594187 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:31:43.299 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.300+0000 7f0419191640 1 -- 192.168.123.107:0/3489594187 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0414073b40 msgr2=0x7f041419a470 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.299 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.300+0000 7f0419191640 1 --2- 192.168.123.107:0/3489594187 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0414073b40 0x7f041419a470 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.299 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.300+0000 7f0419191640 1 -- 192.168.123.107:0/3489594187 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f03fc009590 con 0x7f04140751a0 2026-03-09T19:31:43.299 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.300+0000 7f0419992640 1 --2- 
192.168.123.107:0/3489594187 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0414073b40 0x7f041419a470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:31:43.299 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.300+0000 7f0419191640 1 --2- 192.168.123.107:0/3489594187 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f04140751a0 0x7f041419a9b0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f0404002c20 tx=0x7f0404002910 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:43.300 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.301+0000 7f040affd640 1 -- 192.168.123.107:0/3489594187 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f040403d070 con 0x7f04140751a0 2026-03-09T19:31:43.300 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.301+0000 7f041bc1d640 1 -- 192.168.123.107:0/3489594187 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0404009660 con 0x7f04140751a0 2026-03-09T19:31:43.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.301+0000 7f041bc1d640 1 -- 192.168.123.107:0/3489594187 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f041419fe90 con 0x7f04140751a0 2026-03-09T19:31:43.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.301+0000 7f040affd640 1 -- 192.168.123.107:0/3489594187 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0404002e20 con 0x7f04140751a0 2026-03-09T19:31:43.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.301+0000 7f040affd640 1 -- 192.168.123.107:0/3489594187 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0404041770 con 0x7f04140751a0 2026-03-09T19:31:43.303 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.304+0000 7f040affd640 1 -- 192.168.123.107:0/3489594187 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0404038730 con 0x7f04140751a0 2026-03-09T19:31:43.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.304+0000 7f040affd640 1 --2- 192.168.123.107:0/3489594187 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f03e40779b0 0x7f03e4079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.305+0000 7f0419992640 1 --2- 192.168.123.107:0/3489594187 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f03e40779b0 0x7f03e4079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:43.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.305+0000 7f040affd640 1 -- 192.168.123.107:0/3489594187 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(76..76 src has 1..76) v4 ==== 6509+0+0 (secure 0 0 0) 0x7f04040be3d0 con 0x7f04140751a0 2026-03-09T19:31:43.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.305+0000 7f0419992640 1 --2- 192.168.123.107:0/3489594187 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f03e40779b0 0x7f03e4079e70 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f03fc004750 tx=0x7f03fc009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:43.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.305+0000 7f041bc1d640 1 -- 192.168.123.107:0/3489594187 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f03d8005350 con 0x7f04140751a0 2026-03-09T19:31:43.308 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.309+0000 7f040affd640 1 -- 192.168.123.107:0/3489594187 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0404086bc0 con 0x7f04140751a0 2026-03-09T19:31:43.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.424+0000 7f041bc1d640 1 -- 192.168.123.107:0/3489594187 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f03d8002bf0 con 0x7f03e40779b0 2026-03-09T19:31:43.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.426+0000 7f040affd640 1 -- 192.168.123.107:0/3489594187 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f03d8002bf0 con 0x7f03e40779b0 2026-03-09T19:31:43.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.430+0000 7f041bc1d640 1 -- 192.168.123.107:0/3489594187 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f03e40779b0 msgr2=0x7f03e4079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.430+0000 7f041bc1d640 1 --2- 192.168.123.107:0/3489594187 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f03e40779b0 0x7f03e4079e70 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f03fc004750 tx=0x7f03fc009290 comp rx=0 tx=0).stop 2026-03-09T19:31:43.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.430+0000 7f041bc1d640 1 -- 192.168.123.107:0/3489594187 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f04140751a0 msgr2=0x7f041419a9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.430+0000 7f041bc1d640 
1 --2- 192.168.123.107:0/3489594187 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f04140751a0 0x7f041419a9b0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f0404002c20 tx=0x7f0404002910 comp rx=0 tx=0).stop 2026-03-09T19:31:43.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.431+0000 7f041bc1d640 1 -- 192.168.123.107:0/3489594187 shutdown_connections 2026-03-09T19:31:43.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.431+0000 7f041bc1d640 1 --2- 192.168.123.107:0/3489594187 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f03e40779b0 0x7f03e4079e70 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.431+0000 7f041bc1d640 1 --2- 192.168.123.107:0/3489594187 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f04140751a0 0x7f041419a9b0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.431 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.431+0000 7f041bc1d640 1 --2- 192.168.123.107:0/3489594187 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0414073b40 0x7f041419a470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.431 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.431+0000 7f041bc1d640 1 -- 192.168.123.107:0/3489594187 >> 192.168.123.107:0/3489594187 conn(0x7f04140fbf80 msgr2=0x7f04140fd790 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:43.431 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.431+0000 7f041bc1d640 1 -- 192.168.123.107:0/3489594187 shutdown_connections 2026-03-09T19:31:43.431 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.431+0000 7f041bc1d640 1 -- 192.168.123.107:0/3489594187 wait complete. 
2026-03-09T19:31:43.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.500+0000 7ff7c0870640 1 -- 192.168.123.107:0/497701540 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7b4098a40 msgr2=0x7ff7b4098ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.500+0000 7ff7c0870640 1 --2- 192.168.123.107:0/497701540 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7b4098a40 0x7ff7b4098ec0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7ff7a80099b0 tx=0x7ff7a802f220 comp rx=0 tx=0).stop 2026-03-09T19:31:43.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.501+0000 7ff7c0870640 1 -- 192.168.123.107:0/497701540 shutdown_connections 2026-03-09T19:31:43.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.501+0000 7ff7c0870640 1 --2- 192.168.123.107:0/497701540 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7b4098a40 0x7ff7b4098ec0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.501+0000 7ff7c0870640 1 --2- 192.168.123.107:0/497701540 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff7b4097840 0x7ff7b4097c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.501+0000 7ff7c0870640 1 -- 192.168.123.107:0/497701540 >> 192.168.123.107:0/497701540 conn(0x7ff7b4092ff0 msgr2=0x7ff7b4095410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:43.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.501+0000 7ff7c0870640 1 -- 192.168.123.107:0/497701540 shutdown_connections 2026-03-09T19:31:43.500 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.501+0000 7ff7c0870640 1 -- 192.168.123.107:0/497701540 wait 
complete. 2026-03-09T19:31:43.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.503+0000 7ff7c0870640 1 Processor -- start 2026-03-09T19:31:43.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.503+0000 7ff7c0870640 1 -- start start 2026-03-09T19:31:43.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.503+0000 7ff7c0870640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7b4097840 0x7ff7b412f160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.504+0000 7ff7bb7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7b4097840 0x7ff7b412f160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:43.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.504+0000 7ff7bb7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7b4097840 0x7ff7b412f160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:49466/0 (socket says 192.168.123.107:49466) 2026-03-09T19:31:43.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.504+0000 7ff7c0870640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff7b4098a40 0x7ff7b412f6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.504+0000 7ff7bb7fe640 1 -- 192.168.123.107:0/380116796 learned_addr learned my addr 192.168.123.107:0/380116796 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:31:43.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.504+0000 7ff7baffd640 1 --2- 192.168.123.107:0/380116796 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff7b4098a40 0x7ff7b412f6a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:43.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.504+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff7b412fc70 con 0x7ff7b4097840 2026-03-09T19:31:43.503 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.504+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff7b412fde0 con 0x7ff7b4098a40 2026-03-09T19:31:43.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.505+0000 7ff7bb7fe640 1 -- 192.168.123.107:0/380116796 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff7b4098a40 msgr2=0x7ff7b412f6a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.505+0000 7ff7bb7fe640 1 --2- 192.168.123.107:0/380116796 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff7b4098a40 0x7ff7b412f6a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.505+0000 7ff7bb7fe640 1 -- 192.168.123.107:0/380116796 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff7a8009660 con 0x7ff7b4097840 2026-03-09T19:31:43.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.505+0000 7ff7baffd640 1 --2- 192.168.123.107:0/380116796 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff7b4098a40 0x7ff7b412f6a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-09T19:31:43.504 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.506+0000 7ff7bb7fe640 1 --2- 192.168.123.107:0/380116796 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7b4097840 0x7ff7b412f160 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7ff7a000da70 tx=0x7ff7a000df40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:43.510 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.506+0000 7ff7b8ff9640 1 -- 192.168.123.107:0/380116796 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff7a0004280 con 0x7ff7b4097840 2026-03-09T19:31:43.510 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.506+0000 7ff7b8ff9640 1 -- 192.168.123.107:0/380116796 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff7a000bdf0 con 0x7ff7b4097840 2026-03-09T19:31:43.510 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.506+0000 7ff7b8ff9640 1 -- 192.168.123.107:0/380116796 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff7a0010460 con 0x7ff7b4097840 2026-03-09T19:31:43.510 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.506+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff7b4134880 con 0x7ff7b4097840 2026-03-09T19:31:43.510 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.506+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff7b4008b70 con 0x7ff7b4097840 2026-03-09T19:31:43.513 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.512+0000 7ff7b8ff9640 1 -- 192.168.123.107:0/380116796 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff7a00043e0 con 
0x7ff7b4097840 2026-03-09T19:31:43.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.513+0000 7ff7b8ff9640 1 --2- 192.168.123.107:0/380116796 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff790077890 0x7ff790079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.513+0000 7ff7b8ff9640 1 -- 192.168.123.107:0/380116796 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(76..76 src has 1..76) v4 ==== 6509+0+0 (secure 0 0 0) 0x7ff7a0099b90 con 0x7ff7b4097840 2026-03-09T19:31:43.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.513+0000 7ff7baffd640 1 --2- 192.168.123.107:0/380116796 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff790077890 0x7ff790079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:43.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.514+0000 7ff7baffd640 1 --2- 192.168.123.107:0/380116796 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff790077890 0x7ff790079d50 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7ff7b4130680 tx=0x7ff7a803a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:43.514 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.515+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff780005350 con 0x7ff7b4097840 2026-03-09T19:31:43.518 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.519+0000 7ff7b8ff9640 1 -- 192.168.123.107:0/380116796 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7ff7a0062300 con 0x7ff7b4097840 2026-03-09T19:31:43.623 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.624+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7ff780002bf0 con 0x7ff790077890 2026-03-09T19:31:43.629 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.630+0000 7ff7b8ff9640 1 -- 192.168.123.107:0/380116796 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7ff780002bf0 con 0x7ff790077890 2026-03-09T19:31:43.629 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:31:43.629 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (8m) 7s ago 9m 23.2M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:31:43.629 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (9m) 7s ago 9m 9529k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:31:43.629 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (8m) 2m ago 8m 10.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:31:43.629 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (2m) 7s ago 9m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:31:43.629 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (2m) 2m ago 8m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:31:43.629 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (8m) 7s ago 9m 89.2M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:31:43.629 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (7m) 7s ago 7m 17.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 
2026-03-09T19:31:43.629 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (7m) 7s ago 7m 19.3M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:31:43.629 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (7m) 2m ago 7m 28.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (7m) 2m ago 7m 239M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (3m) 7s ago 9m 598M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (3m) 2m ago 8m 489M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (3m) 7s ago 9m 62.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (3m) 2m ago 8m 46.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (9m) 7s ago 9m 15.3M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (8m) 2m ago 8m 16.6M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (2m) 7s ago 8m 223M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (9s) 7s ago 8m 13.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 36b65f1069e5 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (8m) 7s ago 8m 370M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 67f7c4b96ef8 
2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (7m) 2m ago 7m 441M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (7m) 2m ago 7m 446M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (7m) 2m ago 7m 348M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:31:43.630 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (3m) 7s ago 8m 58.9M - 2.43.0 a07b618ecd1d c09450c20f5f 2026-03-09T19:31:43.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.634+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff790077890 msgr2=0x7ff790079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.634+0000 7ff7c0870640 1 --2- 192.168.123.107:0/380116796 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff790077890 0x7ff790079d50 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7ff7b4130680 tx=0x7ff7a803a040 comp rx=0 tx=0).stop 2026-03-09T19:31:43.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.634+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7b4097840 msgr2=0x7ff7b412f160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.634+0000 7ff7c0870640 1 --2- 192.168.123.107:0/380116796 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7b4097840 0x7ff7b412f160 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7ff7a000da70 tx=0x7ff7a000df40 comp rx=0 tx=0).stop 2026-03-09T19:31:43.633 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.634+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 shutdown_connections 2026-03-09T19:31:43.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.634+0000 7ff7c0870640 1 --2- 192.168.123.107:0/380116796 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff790077890 0x7ff790079d50 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.634+0000 7ff7c0870640 1 --2- 192.168.123.107:0/380116796 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff7b4098a40 0x7ff7b412f6a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.634+0000 7ff7c0870640 1 --2- 192.168.123.107:0/380116796 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7b4097840 0x7ff7b412f160 secure :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7ff7a000da70 tx=0x7ff7a000df40 comp rx=0 tx=0).stop 2026-03-09T19:31:43.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.634+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 >> 192.168.123.107:0/380116796 conn(0x7ff7b4092ff0 msgr2=0x7ff7b4094a80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:43.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.634+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 shutdown_connections 2026-03-09T19:31:43.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.634+0000 7ff7c0870640 1 -- 192.168.123.107:0/380116796 wait complete. 
2026-03-09T19:31:43.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.697+0000 7f22f3899640 1 -- 192.168.123.107:0/1240567230 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22ec103c60 msgr2=0x7f22ec1040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.697+0000 7f22f3899640 1 --2- 192.168.123.107:0/1240567230 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22ec103c60 0x7f22ec1040e0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f22dc0099b0 tx=0x7f22dc02f220 comp rx=0 tx=0).stop 2026-03-09T19:31:43.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.698+0000 7f22f3899640 1 -- 192.168.123.107:0/1240567230 shutdown_connections 2026-03-09T19:31:43.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.698+0000 7f22f3899640 1 --2- 192.168.123.107:0/1240567230 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22ec103c60 0x7f22ec1040e0 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.698+0000 7f22f3899640 1 --2- 192.168.123.107:0/1240567230 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22ec102a60 0x7f22ec102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.698+0000 7f22f3899640 1 -- 192.168.123.107:0/1240567230 >> 192.168.123.107:0/1240567230 conn(0x7f22ec0fe250 msgr2=0x7f22ec100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:43.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.702+0000 7f22f3899640 1 -- 192.168.123.107:0/1240567230 shutdown_connections 2026-03-09T19:31:43.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.702+0000 7f22f3899640 1 -- 192.168.123.107:0/1240567230 
wait complete. 2026-03-09T19:31:43.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.702+0000 7f22f3899640 1 Processor -- start 2026-03-09T19:31:43.701 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.703+0000 7f22f3899640 1 -- start start 2026-03-09T19:31:43.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.703+0000 7f22f3899640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22ec102a60 0x7f22ec19e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.703+0000 7f22f3899640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22ec103c60 0x7f22ec19ee70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.703+0000 7f22f3899640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f22ec19f440 con 0x7f22ec102a60 2026-03-09T19:31:43.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.703+0000 7f22f3899640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f22ec19f5b0 con 0x7f22ec103c60 2026-03-09T19:31:43.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.704+0000 7f22f0e0d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22ec103c60 0x7f22ec19ee70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:43.703 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.704+0000 7f22f0e0d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22ec103c60 0x7f22ec19ee70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.107:53936/0 (socket says 192.168.123.107:53936) 2026-03-09T19:31:43.703 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.704+0000 7f22f0e0d640 1 -- 192.168.123.107:0/418050767 learned_addr learned my addr 192.168.123.107:0/418050767 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:31:43.703 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.704+0000 7f22f160e640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22ec102a60 0x7f22ec19e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:43.704 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.705+0000 7f22f0e0d640 1 -- 192.168.123.107:0/418050767 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22ec102a60 msgr2=0x7f22ec19e930 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.704 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.705+0000 7f22f0e0d640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22ec102a60 0x7f22ec19e930 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.704 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.705+0000 7f22f0e0d640 1 -- 192.168.123.107:0/418050767 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f22dc009660 con 0x7f22ec103c60 2026-03-09T19:31:43.704 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.705+0000 7f22f160e640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22ec102a60 0x7f22ec19e930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-09T19:31:43.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.706+0000 7f22f0e0d640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22ec103c60 0x7f22ec19ee70 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f22dc002c20 tx=0x7f22dc0028f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:43.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.706+0000 7f22da7fc640 1 -- 192.168.123.107:0/418050767 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f22dc03d070 con 0x7f22ec103c60 2026-03-09T19:31:43.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.706+0000 7f22f3899640 1 -- 192.168.123.107:0/418050767 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f22ec1a3ff0 con 0x7f22ec103c60 2026-03-09T19:31:43.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.706+0000 7f22f3899640 1 -- 192.168.123.107:0/418050767 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f22ec1a44b0 con 0x7f22ec103c60 2026-03-09T19:31:43.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.706+0000 7f22da7fc640 1 -- 192.168.123.107:0/418050767 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f22dc02fc90 con 0x7f22ec103c60 2026-03-09T19:31:43.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.706+0000 7f22da7fc640 1 -- 192.168.123.107:0/418050767 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f22dc041770 con 0x7f22ec103c60 2026-03-09T19:31:43.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.707+0000 7f22f3899640 1 -- 192.168.123.107:0/418050767 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7f22ec10b6b0 con 0x7f22ec103c60 2026-03-09T19:31:43.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.709+0000 7f22da7fc640 1 -- 192.168.123.107:0/418050767 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f22dc038730 con 0x7f22ec103c60 2026-03-09T19:31:43.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.709+0000 7f22da7fc640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f22c80778e0 0x7f22c8079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.709+0000 7f22da7fc640 1 -- 192.168.123.107:0/418050767 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(76..76 src has 1..76) v4 ==== 6509+0+0 (secure 0 0 0) 0x7f22dc0bea30 con 0x7f22ec103c60 2026-03-09T19:31:43.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.710+0000 7f22f160e640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f22c80778e0 0x7f22c8079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:43.710 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.711+0000 7f22da7fc640 1 -- 192.168.123.107:0/418050767 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f22dc087220 con 0x7f22ec103c60 2026-03-09T19:31:43.710 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.711+0000 7f22f160e640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f22c80778e0 0x7f22c8079da0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f22ec103ac0 tx=0x7f22e00073d0 comp rx=0 tx=0).ready entity=mgr.24557 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:43.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.851+0000 7f22f3899640 1 -- 192.168.123.107:0/418050767 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f22ec1a4760 con 0x7f22ec103c60 2026-03-09T19:31:43.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.852+0000 7f22da7fc640 1 -- 192.168.123.107:0/418050767 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+844 (secure 0 0 0) 0x7f22dc086970 con 0x7f22ec103c60 2026-03-09T19:31:43.852 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4, 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c 
(ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 8, 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:31:43.853 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:31:43.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.856+0000 7f22f3899640 1 -- 192.168.123.107:0/418050767 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f22c80778e0 msgr2=0x7f22c8079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.857+0000 7f22f3899640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f22c80778e0 0x7f22c8079da0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f22ec103ac0 tx=0x7f22e00073d0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.857+0000 7f22f3899640 1 -- 192.168.123.107:0/418050767 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22ec103c60 msgr2=0x7f22ec19ee70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.857+0000 7f22f3899640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22ec103c60 0x7f22ec19ee70 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f22dc002c20 tx=0x7f22dc0028f0 comp rx=0 tx=0).stop 
2026-03-09T19:31:43.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.857+0000 7f22f3899640 1 -- 192.168.123.107:0/418050767 shutdown_connections 2026-03-09T19:31:43.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.857+0000 7f22f3899640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f22c80778e0 0x7f22c8079da0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.857+0000 7f22f3899640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f22ec103c60 0x7f22ec19ee70 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.857+0000 7f22f3899640 1 --2- 192.168.123.107:0/418050767 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22ec102a60 0x7f22ec19e930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.857+0000 7f22f3899640 1 -- 192.168.123.107:0/418050767 >> 192.168.123.107:0/418050767 conn(0x7f22ec0fe250 msgr2=0x7f22ec0ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:43.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.857+0000 7f22f3899640 1 -- 192.168.123.107:0/418050767 shutdown_connections 2026-03-09T19:31:43.857 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.858+0000 7f22f3899640 1 -- 192.168.123.107:0/418050767 wait complete. 
2026-03-09T19:31:43.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.924+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/1676880116 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1b18102a60 msgr2=0x7f1b18102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:43.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.924+0000 7f1b1ec7b640 1 --2- 192.168.123.107:0/1676880116 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1b18102a60 0x7f1b18102e60 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f1b000099b0 tx=0x7f1b0002f220 comp rx=0 tx=0).stop 2026-03-09T19:31:43.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.924+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/1676880116 shutdown_connections 2026-03-09T19:31:43.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.924+0000 7f1b1ec7b640 1 --2- 192.168.123.107:0/1676880116 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b18103c60 0x7f1b181040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.924+0000 7f1b1ec7b640 1 --2- 192.168.123.107:0/1676880116 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1b18102a60 0x7f1b18102e60 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.924+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/1676880116 >> 192.168.123.107:0/1676880116 conn(0x7f1b180fe250 msgr2=0x7f1b18100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:43.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.925+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/1676880116 shutdown_connections 2026-03-09T19:31:43.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.925+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/1676880116 
wait complete. 2026-03-09T19:31:43.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.925+0000 7f1b1ec7b640 1 Processor -- start 2026-03-09T19:31:43.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.925+0000 7f1b1ec7b640 1 -- start start 2026-03-09T19:31:43.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.926+0000 7f1b1ec7b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1b18102a60 0x7f1b18078fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.926+0000 7f1b1ec7b640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b18103c60 0x7f1b180794e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.926+0000 7f1b1ec7b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b18075a00 con 0x7f1b18102a60 2026-03-09T19:31:43.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.926+0000 7f1b1ec7b640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b18075b70 con 0x7f1b18103c60 2026-03-09T19:31:43.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.926+0000 7f1b1c9f0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1b18102a60 0x7f1b18078fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:43.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.926+0000 7f1b1c9f0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1b18102a60 0x7f1b18078fa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.107:49518/0 (socket says 192.168.123.107:49518) 2026-03-09T19:31:43.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.926+0000 7f1b1c9f0640 1 -- 192.168.123.107:0/141323169 learned_addr learned my addr 192.168.123.107:0/141323169 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:31:43.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.926+0000 7f1b1c9f0640 1 -- 192.168.123.107:0/141323169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b18103c60 msgr2=0x7f1b180794e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:31:43.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.926+0000 7f1b1c9f0640 1 --2- 192.168.123.107:0/141323169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b18103c60 0x7f1b180794e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:43.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.926+0000 7f1b1c9f0640 1 -- 192.168.123.107:0/141323169 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1b00009660 con 0x7f1b18102a60 2026-03-09T19:31:43.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.927+0000 7f1b1c9f0640 1 --2- 192.168.123.107:0/141323169 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1b18102a60 0x7f1b18078fa0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f1b00002fc0 tx=0x7f1b000026e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:43.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.927+0000 7f1b0dffb640 1 -- 192.168.123.107:0/141323169 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b0003d070 con 0x7f1b18102a60 2026-03-09T19:31:43.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.927+0000 7f1b0dffb640 1 -- 
192.168.123.107:0/141323169 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1b000028c0 con 0x7f1b18102a60 2026-03-09T19:31:43.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.928+0000 7f1b0dffb640 1 -- 192.168.123.107:0/141323169 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b000418f0 con 0x7f1b18102a60 2026-03-09T19:31:43.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.928+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/141323169 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1b18075df0 con 0x7f1b18102a60 2026-03-09T19:31:43.927 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.928+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/141323169 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1b18076260 con 0x7f1b18102a60 2026-03-09T19:31:43.929 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.930+0000 7f1b0dffb640 1 -- 192.168.123.107:0/141323169 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1b0002fc90 con 0x7f1b18102a60 2026-03-09T19:31:43.929 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.930+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/141323169 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1ae4005350 con 0x7f1b18102a60 2026-03-09T19:31:43.933 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.933+0000 7f1b0dffb640 1 --2- 192.168.123.107:0/141323169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1ae80776d0 0x7f1ae8079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:43.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.933+0000 7f1b0dffb640 1 -- 192.168.123.107:0/141323169 <== 
mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(76..76 src has 1..76) v4 ==== 6509+0+0 (secure 0 0 0) 0x7f1b000be480 con 0x7f1b18102a60 2026-03-09T19:31:43.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.934+0000 7f1b0dffb640 1 -- 192.168.123.107:0/141323169 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1b00086bc0 con 0x7f1b18102a60 2026-03-09T19:31:43.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.935+0000 7f1b0ffff640 1 --2- 192.168.123.107:0/141323169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1ae80776d0 0x7f1ae8079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:43.934 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:43.935+0000 7f1b0ffff640 1 --2- 192.168.123.107:0/141323169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1ae80776d0 0x7f1ae8079b90 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f1b08005fd0 tx=0x7f1b08009450 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:44.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.079+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/141323169 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f1ae4005e10 con 0x7f1b18102a60 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.080+0000 7f1b0dffb640 1 -- 192.168.123.107:0/141323169 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1926 (secure 0 0 0) 0x7f1b00086310 con 0x7f1b18102a60 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:31:44.079 
INFO:teuthology.orchestra.run.vm07.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:31:44.079 
INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:31:44.079 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:31:44.080 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:31:44.080 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:31:44.080 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 0 members: 2026-03-09T19:31:44.080 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:31:44.080 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:31:44.080 
INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:31:44.080 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:31:44.080 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:31:44.080 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:31:44.080 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:31:44.080 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:31:44.080 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:31:44.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.083+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/141323169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1ae80776d0 msgr2=0x7f1ae8079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:44.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.083+0000 7f1b1ec7b640 1 --2- 192.168.123.107:0/141323169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1ae80776d0 0x7f1ae8079b90 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f1b08005fd0 tx=0x7f1b08009450 comp rx=0 tx=0).stop 2026-03-09T19:31:44.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.083+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/141323169 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1b18102a60 msgr2=0x7f1b18078fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:44.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.083+0000 7f1b1ec7b640 1 --2- 192.168.123.107:0/141323169 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f1b18102a60 0x7f1b18078fa0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f1b00002fc0 tx=0x7f1b000026e0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.084+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/141323169 shutdown_connections 2026-03-09T19:31:44.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.084+0000 7f1b1ec7b640 1 --2- 192.168.123.107:0/141323169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1ae80776d0 0x7f1ae8079b90 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.084+0000 7f1b1ec7b640 1 --2- 192.168.123.107:0/141323169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b18103c60 0x7f1b180794e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.084+0000 7f1b1ec7b640 1 --2- 192.168.123.107:0/141323169 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1b18102a60 0x7f1b18078fa0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.084+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/141323169 >> 192.168.123.107:0/141323169 conn(0x7f1b180fe250 msgr2=0x7f1b180ff9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:44.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.085+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/141323169 shutdown_connections 2026-03-09T19:31:44.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.085+0000 7f1b1ec7b640 1 -- 192.168.123.107:0/141323169 wait complete. 
2026-03-09T19:31:44.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.147+0000 7fdb3aa39640 1 -- 192.168.123.107:0/953812033 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdb34102a60 msgr2=0x7fdb34102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:44.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.147+0000 7fdb3aa39640 1 --2- 192.168.123.107:0/953812033 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdb34102a60 0x7fdb34102e60 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fdb180098e0 tx=0x7fdb1802f190 comp rx=0 tx=0).stop 2026-03-09T19:31:44.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.147+0000 7fdb3aa39640 1 -- 192.168.123.107:0/953812033 shutdown_connections 2026-03-09T19:31:44.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.147+0000 7fdb3aa39640 1 --2- 192.168.123.107:0/953812033 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb34103c60 0x7fdb341040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.147+0000 7fdb3aa39640 1 --2- 192.168.123.107:0/953812033 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdb34102a60 0x7fdb34102e60 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.147+0000 7fdb3aa39640 1 -- 192.168.123.107:0/953812033 >> 192.168.123.107:0/953812033 conn(0x7fdb340fe250 msgr2=0x7fdb34100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:44.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.148+0000 7fdb3aa39640 1 -- 192.168.123.107:0/953812033 shutdown_connections 2026-03-09T19:31:44.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.148+0000 7fdb3aa39640 1 -- 192.168.123.107:0/953812033 wait 
complete. 2026-03-09T19:31:44.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.148+0000 7fdb3aa39640 1 Processor -- start 2026-03-09T19:31:44.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.148+0000 7fdb3aa39640 1 -- start start 2026-03-09T19:31:44.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.148+0000 7fdb3aa39640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb34102a60 0x7fdb3419a430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:44.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.148+0000 7fdb3aa39640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdb34103c60 0x7fdb3419a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:44.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.148+0000 7fdb3aa39640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb3419af40 con 0x7fdb34102a60 2026-03-09T19:31:44.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.148+0000 7fdb3aa39640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb3419b0b0 con 0x7fdb34103c60 2026-03-09T19:31:44.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.149+0000 7fdb33fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb34102a60 0x7fdb3419a430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:44.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.149+0000 7fdb33fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb34102a60 0x7fdb3419a430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I 
am v2:192.168.123.107:49548/0 (socket says 192.168.123.107:49548) 2026-03-09T19:31:44.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.149+0000 7fdb33fff640 1 -- 192.168.123.107:0/422637939 learned_addr learned my addr 192.168.123.107:0/422637939 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:31:44.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.149+0000 7fdb33fff640 1 -- 192.168.123.107:0/422637939 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdb34103c60 msgr2=0x7fdb3419a970 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:31:44.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.149+0000 7fdb337fe640 1 --2- 192.168.123.107:0/422637939 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdb34103c60 0x7fdb3419a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:44.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.149+0000 7fdb33fff640 1 --2- 192.168.123.107:0/422637939 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdb34103c60 0x7fdb3419a970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.149+0000 7fdb33fff640 1 -- 192.168.123.107:0/422637939 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdb20009660 con 0x7fdb34102a60 2026-03-09T19:31:44.149 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.149+0000 7fdb33fff640 1 --2- 192.168.123.107:0/422637939 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb34102a60 0x7fdb3419a430 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fdb1802fe30 tx=0x7fdb1802fe60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:44.149 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.150+0000 7fdb317fa640 1 -- 192.168.123.107:0/422637939 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdb1803c040 con 0x7fdb34102a60 2026-03-09T19:31:44.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.150+0000 7fdb3aa39640 1 -- 192.168.123.107:0/422637939 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdb18009590 con 0x7fdb34102a60 2026-03-09T19:31:44.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.150+0000 7fdb3aa39640 1 -- 192.168.123.107:0/422637939 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdb3419fe50 con 0x7fdb34102a60 2026-03-09T19:31:44.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.150+0000 7fdb317fa640 1 -- 192.168.123.107:0/422637939 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fdb180044b0 con 0x7fdb34102a60 2026-03-09T19:31:44.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.150+0000 7fdb317fa640 1 -- 192.168.123.107:0/422637939 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdb180388b0 con 0x7fdb34102a60 2026-03-09T19:31:44.151 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.152+0000 7fdb317fa640 1 -- 192.168.123.107:0/422637939 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fdb18038a10 con 0x7fdb34102a60 2026-03-09T19:31:44.151 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.152+0000 7fdb317fa640 1 --2- 192.168.123.107:0/422637939 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdb040778e0 0x7fdb04079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:44.151 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.152+0000 7fdb317fa640 1 -- 192.168.123.107:0/422637939 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(76..76 src has 1..76) v4 ==== 6509+0+0 (secure 0 0 0) 0x7fdb180bf720 con 0x7fdb34102a60 2026-03-09T19:31:44.151 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.153+0000 7fdb337fe640 1 --2- 192.168.123.107:0/422637939 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdb040778e0 0x7fdb04079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:44.152 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.153+0000 7fdb3aa39640 1 -- 192.168.123.107:0/422637939 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdaf8005350 con 0x7fdb34102a60 2026-03-09T19:31:44.154 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.154+0000 7fdb337fe640 1 --2- 192.168.123.107:0/422637939 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdb040778e0 0x7fdb04079da0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fdb3419b950 tx=0x7fdb20009340 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:44.155 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.156+0000 7fdb317fa640 1 -- 192.168.123.107:0/422637939 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdb18087e90 con 0x7fdb34102a60 2026-03-09T19:31:44.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.266+0000 7fdb3aa39640 1 -- 192.168.123.107:0/422637939 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7fdaf8002bf0 con 0x7fdb040778e0 2026-03-09T19:31:44.266 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.267+0000 7fdb317fa640 1 -- 192.168.123.107:0/422637939 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fdaf8002bf0 con 0x7fdb040778e0 2026-03-09T19:31:44.266 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [ 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout: "mgr", 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout: "crash", 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout: "mon" 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout: ], 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "8/23 daemons upgraded", 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:31:44.267 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:31:44.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.270+0000 7fdb3aa39640 1 -- 192.168.123.107:0/422637939 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdb040778e0 msgr2=0x7fdb04079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:44.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.270+0000 7fdb3aa39640 1 --2- 192.168.123.107:0/422637939 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdb040778e0 0x7fdb04079da0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fdb3419b950 tx=0x7fdb20009340 comp rx=0 tx=0).stop 2026-03-09T19:31:44.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.270+0000 7fdb3aa39640 1 -- 192.168.123.107:0/422637939 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb34102a60 msgr2=0x7fdb3419a430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:44.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.270+0000 7fdb3aa39640 1 --2- 192.168.123.107:0/422637939 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb34102a60 0x7fdb3419a430 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fdb1802fe30 tx=0x7fdb1802fe60 comp rx=0 tx=0).stop 2026-03-09T19:31:44.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.271+0000 7fdb3aa39640 1 -- 192.168.123.107:0/422637939 shutdown_connections 2026-03-09T19:31:44.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.271+0000 7fdb3aa39640 1 --2- 192.168.123.107:0/422637939 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fdb040778e0 0x7fdb04079da0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.271+0000 7fdb3aa39640 1 --2- 192.168.123.107:0/422637939 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdb34103c60 0x7fdb3419a970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.271+0000 7fdb3aa39640 1 --2- 192.168.123.107:0/422637939 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb34102a60 0x7fdb3419a430 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.270 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.271+0000 7fdb3aa39640 1 -- 192.168.123.107:0/422637939 >> 192.168.123.107:0/422637939 conn(0x7fdb340fe250 msgr2=0x7fdb340ffd30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:44.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.271+0000 7fdb3aa39640 1 -- 192.168.123.107:0/422637939 shutdown_connections 2026-03-09T19:31:44.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.271+0000 7fdb3aa39640 1 -- 192.168.123.107:0/422637939 wait complete. 2026-03-09T19:31:44.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.327+0000 7efd56851640 1 -- 192.168.123.107:0/826104825 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd50103c60 msgr2=0x7efd501040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:44.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.327+0000 7efd56851640 1 --2- 192.168.123.107:0/826104825 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd50103c60 0x7efd501040e0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7efd3c009a00 tx=0x7efd3c02f280 comp rx=0 tx=0).stop 2026-03-09T19:31:44.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.328+0000 7efd56851640 1 -- 192.168.123.107:0/826104825 shutdown_connections 2026-03-09T19:31:44.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.328+0000 7efd56851640 1 --2- 192.168.123.107:0/826104825 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd50103c60 0x7efd501040e0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.328+0000 7efd56851640 1 --2- 192.168.123.107:0/826104825 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efd50102a60 0x7efd50102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T19:31:44.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.328+0000 7efd56851640 1 -- 192.168.123.107:0/826104825 >> 192.168.123.107:0/826104825 conn(0x7efd500fe250 msgr2=0x7efd50100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:44.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.328+0000 7efd56851640 1 -- 192.168.123.107:0/826104825 shutdown_connections 2026-03-09T19:31:44.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.328+0000 7efd56851640 1 -- 192.168.123.107:0/826104825 wait complete. 2026-03-09T19:31:44.327 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.329+0000 7efd56851640 1 Processor -- start 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.329+0000 7efd56851640 1 -- start start 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.332+0000 7efd56851640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efd50102a60 0x7efd5019a3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.332+0000 7efd56851640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd50103c60 0x7efd5019a930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.332+0000 7efd56851640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd5019af50 con 0x7efd50103c60 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.332+0000 7efd56851640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd5019b0c0 con 0x7efd50102a60 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.332+0000 7efd4f7fe640 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd50103c60 0x7efd5019a930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.332+0000 7efd4f7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd50103c60 0x7efd5019a930 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:49574/0 (socket says 192.168.123.107:49574) 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.332+0000 7efd4f7fe640 1 -- 192.168.123.107:0/3566001058 learned_addr learned my addr 192.168.123.107:0/3566001058 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.332+0000 7efd4f7fe640 1 -- 192.168.123.107:0/3566001058 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efd50102a60 msgr2=0x7efd5019a3f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.332+0000 7efd4f7fe640 1 --2- 192.168.123.107:0/3566001058 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efd50102a60 0x7efd5019a3f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.332+0000 7efd4f7fe640 1 -- 192.168.123.107:0/3566001058 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efd3c009660 con 0x7efd50103c60 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.332+0000 7efd4f7fe640 1 --2- 192.168.123.107:0/3566001058 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd50103c60 
0x7efd5019a930 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7efd3c0043d0 tx=0x7efd3c004400 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.333+0000 7efd4d7fa640 1 -- 192.168.123.107:0/3566001058 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd3c02fae0 con 0x7efd50103c60 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.333+0000 7efd4d7fa640 1 -- 192.168.123.107:0/3566001058 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7efd3c02fc40 con 0x7efd50103c60 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.333+0000 7efd4d7fa640 1 -- 192.168.123.107:0/3566001058 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd3c041970 con 0x7efd50103c60 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.333+0000 7efd56851640 1 -- 192.168.123.107:0/3566001058 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efd50079760 con 0x7efd50103c60 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.333+0000 7efd56851640 1 -- 192.168.123.107:0/3566001058 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efd501a45a0 con 0x7efd50103c60 2026-03-09T19:31:44.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.334+0000 7efd56851640 1 -- 192.168.123.107:0/3566001058 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efd1c005350 con 0x7efd50103c60 2026-03-09T19:31:44.337 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.338+0000 7efd4d7fa640 1 -- 192.168.123.107:0/3566001058 <== mon.0 
v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7efd3c03f070 con 0x7efd50103c60 2026-03-09T19:31:44.337 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.338+0000 7efd4d7fa640 1 --2- 192.168.123.107:0/3566001058 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efd240779b0 0x7efd24079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:31:44.337 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.339+0000 7efd4ffff640 1 --2- 192.168.123.107:0/3566001058 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efd240779b0 0x7efd24079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:31:44.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.339+0000 7efd4d7fa640 1 -- 192.168.123.107:0/3566001058 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(76..76 src has 1..76) v4 ==== 6509+0+0 (secure 0 0 0) 0x7efd3c0be960 con 0x7efd50103c60 2026-03-09T19:31:44.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.339+0000 7efd4ffff640 1 --2- 192.168.123.107:0/3566001058 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efd240779b0 0x7efd24079e70 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7efd5019b950 tx=0x7efd40009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:31:44.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.339+0000 7efd4d7fa640 1 -- 192.168.123.107:0/3566001058 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7efd3c0870a0 con 0x7efd50103c60 2026-03-09T19:31:44.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.482+0000 7efd56851640 1 -- 
192.168.123.107:0/3566001058 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7efd1c0058d0 con 0x7efd50103c60 2026-03-09T19:31:44.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.483+0000 7efd4d7fa640 1 -- 192.168.123.107:0/3566001058 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1169 (secure 0 0 0) 0x7efd3c0867f0 con 0x7efd50103c60 2026-03-09T19:31:44.482 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_WARN Degraded data redundancy: 91/291 objects degraded (31.271%), 16 pgs degraded, 1 pg undersized 2026-03-09T19:31:44.482 INFO:teuthology.orchestra.run.vm07.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 91/291 objects degraded (31.271%), 16 pgs degraded, 1 pg undersized 2026-03-09T19:31:44.482 INFO:teuthology.orchestra.run.vm07.stdout: pg 1.0 is active+undersized+degraded, acting [3,0] 2026-03-09T19:31:44.482 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.0 is active+undersized+degraded, acting [3,0] 2026-03-09T19:31:44.482 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.1 is active+undersized+degraded, acting [2,0] 2026-03-09T19:31:44.482 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.2 is active+undersized+degraded, acting [5,0] 2026-03-09T19:31:44.482 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.3 is active+undersized+degraded, acting [5,2] 2026-03-09T19:31:44.483 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.4 is active+undersized+degraded, acting [0,4] 2026-03-09T19:31:44.483 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.6 is active+undersized+degraded, acting [3,4] 2026-03-09T19:31:44.483 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.9 is active+undersized+degraded, acting [4,0] 2026-03-09T19:31:44.483 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.a is active+undersized+degraded, acting [4,3] 2026-03-09T19:31:44.483 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.d is 
active+undersized+degraded, acting [3,2] 2026-03-09T19:31:44.483 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.10 is active+undersized+degraded, acting [2,0] 2026-03-09T19:31:44.483 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.12 is active+undersized+degraded, acting [3,0] 2026-03-09T19:31:44.483 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.15 is active+undersized+degraded, acting [3,0] 2026-03-09T19:31:44.483 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.17 is active+undersized+degraded, acting [5,2] 2026-03-09T19:31:44.483 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.1b is active+undersized+degraded, acting [0,5] 2026-03-09T19:31:44.483 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.17 is stuck undersized for 2m, current state active+recovering+undersized+degraded+remapped, last acting [2,5] 2026-03-09T19:31:44.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.486+0000 7efd56851640 1 -- 192.168.123.107:0/3566001058 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efd240779b0 msgr2=0x7efd24079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:44.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.486+0000 7efd56851640 1 --2- 192.168.123.107:0/3566001058 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efd240779b0 0x7efd24079e70 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7efd5019b950 tx=0x7efd40009290 comp rx=0 tx=0).stop 2026-03-09T19:31:44.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.486+0000 7efd56851640 1 -- 192.168.123.107:0/3566001058 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd50103c60 msgr2=0x7efd5019a930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:31:44.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.486+0000 7efd56851640 1 --2- 192.168.123.107:0/3566001058 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7efd50103c60 0x7efd5019a930 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7efd3c0043d0 tx=0x7efd3c004400 comp rx=0 tx=0).stop 2026-03-09T19:31:44.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.487+0000 7efd56851640 1 -- 192.168.123.107:0/3566001058 shutdown_connections 2026-03-09T19:31:44.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.487+0000 7efd56851640 1 --2- 192.168.123.107:0/3566001058 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efd240779b0 0x7efd24079e70 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.487+0000 7efd56851640 1 --2- 192.168.123.107:0/3566001058 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efd50103c60 0x7efd5019a930 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.487+0000 7efd56851640 1 --2- 192.168.123.107:0/3566001058 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efd50102a60 0x7efd5019a3f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:31:44.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.487+0000 7efd56851640 1 -- 192.168.123.107:0/3566001058 >> 192.168.123.107:0/3566001058 conn(0x7efd500fe250 msgr2=0x7efd500ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:31:44.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.487+0000 7efd56851640 1 -- 192.168.123.107:0/3566001058 shutdown_connections 2026-03-09T19:31:44.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:31:44.487+0000 7efd56851640 1 -- 192.168.123.107:0/3566001058 wait complete. 
2026-03-09T19:31:45.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:44 vm08.local ceph-mon[103420]: from='client.34256 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:31:45.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:44 vm08.local ceph-mon[103420]: from='client.44215 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:31:45.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:44 vm08.local ceph-mon[103420]: pgmap v128: 65 pgs: 2 peering, 1 active+recovering+undersized+degraded+remapped, 17 active+undersized, 15 active+undersized+degraded, 30 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 91/291 objects degraded (31.271%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:45.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:44 vm08.local ceph-mon[103420]: from='client.34264 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:31:45.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:44 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/418050767' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:31:45.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:44 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/141323169' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:31:45.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:44 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/3566001058' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:31:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:44 vm07.local ceph-mon[111841]: from='client.34256 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:31:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:44 vm07.local ceph-mon[111841]: from='client.44215 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:31:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:44 vm07.local ceph-mon[111841]: pgmap v128: 65 pgs: 2 peering, 1 active+recovering+undersized+degraded+remapped, 17 active+undersized, 15 active+undersized+degraded, 30 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 91/291 objects degraded (31.271%); 0 B/s, 7 objects/s recovering 2026-03-09T19:31:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:44 vm07.local ceph-mon[111841]: from='client.34264 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:31:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:44 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/418050767' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:31:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:44 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/141323169' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:31:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:44 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/3566001058' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:31:46.093 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:45 vm08.local ceph-mon[103420]: from='client.34276 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:31:46.130 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:45 vm07.local ceph-mon[111841]: from='client.34276 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:31:47.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:46 vm08.local ceph-mon[103420]: pgmap v129: 65 pgs: 2 peering, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 22/291 objects degraded (7.560%); 0 B/s, 6 objects/s recovering 2026-03-09T19:31:47.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:46 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 22/291 objects degraded (7.560%), 1 pg degraded, 1 pg undersized (PG_DEGRADED) 2026-03-09T19:31:47.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:46 vm07.local ceph-mon[111841]: pgmap v129: 65 pgs: 2 peering, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 22/291 objects degraded (7.560%); 0 B/s, 6 objects/s recovering 2026-03-09T19:31:47.141 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:46 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 22/291 objects degraded (7.560%), 1 pg degraded, 1 pg undersized (PG_DEGRADED) 2026-03-09T19:31:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:48 vm08.local ceph-mon[103420]: pgmap v130: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 22/291 objects degraded (7.560%); 0 B/s, 5 
objects/s recovering 2026-03-09T19:31:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:48 vm08.local ceph-mon[103420]: osdmap e77: 6 total, 6 up, 6 in 2026-03-09T19:31:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:31:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:48 vm07.local ceph-mon[111841]: pgmap v130: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 22/291 objects degraded (7.560%); 0 B/s, 5 objects/s recovering 2026-03-09T19:31:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:48 vm07.local ceph-mon[111841]: osdmap e77: 6 total, 6 up, 6 in 2026-03-09T19:31:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:31:50.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:49 vm08.local ceph-mon[103420]: osdmap e78: 6 total, 6 up, 6 in 2026-03-09T19:31:50.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:49 vm07.local ceph-mon[111841]: osdmap e78: 6 total, 6 up, 6 in 2026-03-09T19:31:51.145 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:50 vm08.local ceph-mon[103420]: pgmap v133: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB 
/ 120 GiB avail; 22/291 objects degraded (7.560%); 0 B/s, 5 objects/s recovering 2026-03-09T19:31:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:50 vm07.local ceph-mon[111841]: pgmap v133: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 22/291 objects degraded (7.560%); 0 B/s, 5 objects/s recovering 2026-03-09T19:31:52.191 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:51 vm07.local ceph-mon[111841]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 22/291 objects degraded (7.560%), 1 pg degraded, 1 pg undersized) 2026-03-09T19:31:52.191 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:51 vm07.local ceph-mon[111841]: Cluster is now healthy 2026-03-09T19:31:52.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:51 vm08.local ceph-mon[103420]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 22/291 objects degraded (7.560%), 1 pg degraded, 1 pg undersized) 2026-03-09T19:31:52.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:51 vm08.local ceph-mon[103420]: Cluster is now healthy 2026-03-09T19:31:53.280 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:53 vm07.local ceph-mon[111841]: pgmap v134: 65 pgs: 65 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 7 objects/s recovering 2026-03-09T19:31:53.280 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:53 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T19:31:53.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:53 vm08.local ceph-mon[103420]: pgmap v134: 65 pgs: 65 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 7 objects/s recovering 2026-03-09T19:31:53.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:53 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T19:31:54.100 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:54 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T19:31:54.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:54 vm07.local ceph-mon[111841]: Upgrade: osd.2 is safe to restart 2026-03-09T19:31:54.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:54 vm07.local ceph-mon[111841]: Upgrade: Updating osd.2 2026-03-09T19:31:54.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:54 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:54.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:54 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T19:31:54.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:54 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:31:54.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:54 vm07.local ceph-mon[111841]: Deploying daemon osd.2 on vm07 2026-03-09T19:31:54.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:54 vm07.local ceph-mon[111841]: pgmap v135: 65 pgs: 65 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 2 objects/s recovering 2026-03-09T19:31:54.479 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:54 vm07.local systemd[1]: Stopping Ceph osd.2 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 
2026-03-09T19:31:54.479 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:54 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[83028]: 2026-03-09T19:31:54.239+0000 7f9a1a27d640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:31:54.479 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:54 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[83028]: 2026-03-09T19:31:54.239+0000 7f9a1a27d640 -1 osd.2 78 *** Got signal Terminated *** 2026-03-09T19:31:54.479 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:54 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[83028]: 2026-03-09T19:31:54.239+0000 7f9a1a27d640 -1 osd.2 78 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:31:54.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:54 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T19:31:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:54 vm08.local ceph-mon[103420]: Upgrade: osd.2 is safe to restart 2026-03-09T19:31:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:54 vm08.local ceph-mon[103420]: Upgrade: Updating osd.2 2026-03-09T19:31:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:54 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:54 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T19:31:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:54 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:31:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:54 vm08.local ceph-mon[103420]: Deploying daemon osd.2 on vm07 2026-03-09T19:31:54.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:54 vm08.local ceph-mon[103420]: pgmap v135: 65 pgs: 65 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 2 objects/s recovering 2026-03-09T19:31:55.371 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:55 vm07.local ceph-mon[111841]: osd.2 marked itself down and dead 2026-03-09T19:31:55.371 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132009]: 2026-03-09 19:31:55.181055195 +0000 UTC m=+0.957920332 container died 67f7c4b96ef86c0614b902eb1c1ae0f5090790fc9b8a5e239b2d7c720432778c (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20260223, 
FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T19:31:55.371 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132009]: 2026-03-09 19:31:55.200569944 +0000 UTC m=+0.977435081 container remove 67f7c4b96ef86c0614b902eb1c1ae0f5090790fc9b8a5e239b2d7c720432778c (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default) 2026-03-09T19:31:55.371 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local bash[132009]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2 2026-03-09T19:31:55.371 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132076]: 2026-03-09 19:31:55.350376945 +0000 UTC m=+0.015821356 container create 30135cbf394f40da63a9ce461ee2152320a3301187d8c04dd2c954be40faa183 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3) 2026-03-09T19:31:55.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:55 vm08.local ceph-mon[103420]: osd.2 marked itself down and dead 2026-03-09T19:31:55.623 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132076]: 2026-03-09 19:31:55.38312722 +0000 UTC m=+0.048571651 container init 30135cbf394f40da63a9ce461ee2152320a3301187d8c04dd2c954be40faa183 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-deactivate, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) 
2026-03-09T19:31:55.623 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132076]: 2026-03-09 19:31:55.392035937 +0000 UTC m=+0.057480348 container start 30135cbf394f40da63a9ce461ee2152320a3301187d8c04dd2c954be40faa183 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:31:55.623 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132076]: 2026-03-09 19:31:55.392936122 +0000 UTC m=+0.058380533 container attach 30135cbf394f40da63a9ce461ee2152320a3301187d8c04dd2c954be40faa183 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, 
FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:31:55.623 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132076]: 2026-03-09 19:31:55.343614417 +0000 UTC m=+0.009058828 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:31:55.623 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132076]: 2026-03-09 19:31:55.516825016 +0000 UTC m=+0.182269427 container died 30135cbf394f40da63a9ce461ee2152320a3301187d8c04dd2c954be40faa183 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T19:31:55.623 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132076]: 2026-03-09 19:31:55.530952592 +0000 UTC m=+0.196397003 container remove 30135cbf394f40da63a9ce461ee2152320a3301187d8c04dd2c954be40faa183 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, 
org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:31:55.623 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.2.service: Deactivated successfully. 2026-03-09T19:31:55.623 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.2.service: Unit process 132087 (conmon) remains running after unit stopped. 2026-03-09T19:31:55.623 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.2.service: Unit process 132095 (podman) remains running after unit stopped. 2026-03-09T19:31:55.623 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local systemd[1]: Stopped Ceph osd.2 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 2026-03-09T19:31:55.623 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.2.service: Consumed 42.365s CPU time, 932.8M memory peak. 2026-03-09T19:31:55.994 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local systemd[1]: Starting Ceph osd.2 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 
2026-03-09T19:31:55.994 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132176]: 2026-03-09 19:31:55.850005359 +0000 UTC m=+0.017867076 container create 73deb717ddabb7f934649544f5f840b420a70498816f3167300c15d56d0170d4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T19:31:55.994 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132176]: 2026-03-09 19:31:55.894322977 +0000 UTC m=+0.062184694 container init 73deb717ddabb7f934649544f5f840b420a70498816f3167300c15d56d0170d4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223) 2026-03-09T19:31:55.994 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132176]: 2026-03-09 19:31:55.897963652 +0000 UTC m=+0.065825369 container start 73deb717ddabb7f934649544f5f840b420a70498816f3167300c15d56d0170d4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid) 2026-03-09T19:31:55.994 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132176]: 2026-03-09 19:31:55.899474901 +0000 UTC m=+0.067336618 container attach 73deb717ddabb7f934649544f5f840b420a70498816f3167300c15d56d0170d4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:31:55.994 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local podman[132176]: 2026-03-09 19:31:55.841999494 +0000 UTC m=+0.009861220 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:31:55.994 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:55.994 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local bash[132176]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:55.994 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:55.994 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:55 vm07.local bash[132176]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:56.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-mon[111841]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T19:31:56.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-mon[111841]: osdmap e79: 6 total, 5 up, 6 in 2026-03-09T19:31:56.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-mon[111841]: pgmap v137: 65 pgs: 9 stale+active+clean, 56 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 2 objects/s recovering 2026-03-09T19:31:56.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:56 vm08.local ceph-mon[103420]: Health check failed: 1 osds down 
(OSD_DOWN) 2026-03-09T19:31:56.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:56 vm08.local ceph-mon[103420]: osdmap e79: 6 total, 5 up, 6 in 2026-03-09T19:31:56.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:56 vm08.local ceph-mon[103420]: pgmap v137: 65 pgs: 9 stale+active+clean, 56 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 2 objects/s recovering 2026-03-09T19:31:56.856 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T19:31:56.856 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:56.856 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local bash[132176]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T19:31:56.856 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local bash[132176]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:56.856 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:56.856 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local bash[132176]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:31:56.856 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T19:31:56.856 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local bash[132176]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T19:31:56.856 
INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-c6d41fb0-4dac-4206-8e78-c5a8fdd39414/osd-block-8ea4ceda-8f60-4699-976a-464d32f7e944 --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-09T19:31:56.856 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local bash[132176]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-c6d41fb0-4dac-4206-8e78-c5a8fdd39414/osd-block-8ea4ceda-8f60-4699-976a-464d32f7e944 --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-09T19:31:57.115 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: Running command: /usr/bin/ln -snf /dev/ceph-c6d41fb0-4dac-4206-8e78-c5a8fdd39414/osd-block-8ea4ceda-8f60-4699-976a-464d32f7e944 /var/lib/ceph/osd/ceph-2/block 2026-03-09T19:31:57.115 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local bash[132176]: Running command: /usr/bin/ln -snf /dev/ceph-c6d41fb0-4dac-4206-8e78-c5a8fdd39414/osd-block-8ea4ceda-8f60-4699-976a-464d32f7e944 /var/lib/ceph/osd/ceph-2/block 2026-03-09T19:31:57.115 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-09T19:31:57.115 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local bash[132176]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-09T19:31:57.115 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T19:31:57.115 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local bash[132176]: Running 
command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T19:31:57.115 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T19:31:57.115 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local bash[132176]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T19:31:57.115 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate[132188]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-09T19:31:57.115 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local bash[132176]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-09T19:31:57.115 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local conmon[132188]: conmon 73deb717ddabb7f93464 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-73deb717ddabb7f934649544f5f840b420a70498816f3167300c15d56d0170d4.scope/container/memory.events 2026-03-09T19:31:57.116 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local podman[132176]: 2026-03-09 19:31:56.900173177 +0000 UTC m=+1.068034894 container died 73deb717ddabb7f934649544f5f840b420a70498816f3167300c15d56d0170d4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:31:57.116 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:56 vm07.local podman[132176]: 2026-03-09 19:31:56.918543865 +0000 UTC m=+1.086405572 container remove 73deb717ddabb7f934649544f5f840b420a70498816f3167300c15d56d0170d4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3) 2026-03-09T19:31:57.116 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:57 vm07.local podman[132434]: 2026-03-09 19:31:57.003602575 +0000 UTC m=+0.018605199 container create 1b8bc1f96eb7dc5de3d08e7e26cff9d5eb54199781ba7059f7faf425b02c5d60 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2, org.label-schema.build-date=20260223, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T19:31:57.116 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:57 vm07.local podman[132434]: 2026-03-09 19:31:57.039191553 +0000 UTC m=+0.054194177 container init 1b8bc1f96eb7dc5de3d08e7e26cff9d5eb54199781ba7059f7faf425b02c5d60 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default) 2026-03-09T19:31:57.116 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:57 vm07.local podman[132434]: 2026-03-09 19:31:57.045265483 +0000 UTC m=+0.060268107 container start 1b8bc1f96eb7dc5de3d08e7e26cff9d5eb54199781ba7059f7faf425b02c5d60 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, 
org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) 2026-03-09T19:31:57.116 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:57 vm07.local bash[132434]: 1b8bc1f96eb7dc5de3d08e7e26cff9d5eb54199781ba7059f7faf425b02c5d60 2026-03-09T19:31:57.116 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:57 vm07.local podman[132434]: 2026-03-09 19:31:56.996233801 +0000 UTC m=+0.011236434 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:31:57.116 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:57 vm07.local systemd[1]: Started Ceph osd.2 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 
2026-03-09T19:31:57.382 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:57 vm07.local ceph-mon[111841]: osdmap e80: 6 total, 5 up, 6 in 2026-03-09T19:31:57.382 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:57 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:57.382 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:57 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:57.382 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:57 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:31:57.382 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:31:57 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[132445]: 2026-03-09T19:31:57.379+0000 7fcd2aceb740 -1 Falling back to public interface 2026-03-09T19:31:57.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:57 vm08.local ceph-mon[103420]: osdmap e80: 6 total, 5 up, 6 in 2026-03-09T19:31:57.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:57 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:57.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:57 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:57.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:57 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:31:58.163 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:58 vm07.local ceph-mon[111841]: pgmap v139: 65 pgs: 4 active+undersized, 6 stale+active+clean, 4 active+undersized+degraded, 51 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 
10/291 objects degraded (3.436%) 2026-03-09T19:31:58.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:58 vm08.local ceph-mon[103420]: pgmap v139: 65 pgs: 4 active+undersized, 6 stale+active+clean, 4 active+undersized+degraded, 51 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 10/291 objects degraded (3.436%) 2026-03-09T19:31:59.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:59 vm07.local ceph-mon[111841]: Health check failed: Degraded data redundancy: 10/291 objects degraded (3.436%), 4 pgs degraded (PG_DEGRADED) 2026-03-09T19:31:59.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:59 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:59.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:31:59 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:59.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:59 vm08.local ceph-mon[103420]: Health check failed: Degraded data redundancy: 10/291 objects degraded (3.436%), 4 pgs degraded (PG_DEGRADED) 2026-03-09T19:31:59.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:59 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:31:59.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:31:59 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:00.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:00 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:00.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:00 vm07.local ceph-mon[111841]: pgmap v140: 65 pgs: 4 active+undersized, 6 stale+active+clean, 4 active+undersized+degraded, 51 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 10/291 objects degraded (3.436%) 
2026-03-09T19:32:00.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:00 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:00.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:00 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:00.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:00 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:00.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:00 vm08.local ceph-mon[103420]: pgmap v140: 65 pgs: 4 active+undersized, 6 stale+active+clean, 4 active+undersized+degraded, 51 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 10/291 objects degraded (3.436%) 2026-03-09T19:32:00.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:00 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:00.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:00 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:01.479 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:32:01 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[132445]: 2026-03-09T19:32:01.284+0000 7fcd2aceb740 -1 osd.2 78 log_to_monitors true 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='osd.2 [v2:192.168.123.107:6818/2210390190,v1:192.168.123.107:6819/2210390190]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T19:32:02.595 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-09T19:32:02.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:02 vm08.local ceph-mon[103420]: pgmap v141: 65 pgs: 14 active+undersized, 13 active+undersized+degraded, 38 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 38/291 objects degraded (13.058%) 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='osd.2 [v2:192.168.123.107:6818/2210390190,v1:192.168.123.107:6819/2210390190]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 
2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-09T19:32:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-mon[111841]: pgmap v141: 65 pgs: 14 active+undersized, 13 active+undersized+degraded, 38 active+clean; 218 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 38/291 objects degraded (13.058%) 2026-03-09T19:32:02.729 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:32:02 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[132445]: 2026-03-09T19:32:02.278+0000 7fcd22a85640 -1 osd.2 78 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T19:32:03.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:03 vm08.local ceph-mon[103420]: from='osd.2 [v2:192.168.123.107:6818/2210390190,v1:192.168.123.107:6819/2210390190]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T19:32:03.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:03 vm08.local ceph-mon[103420]: osdmap e81: 6 total, 5 up, 6 in 2026-03-09T19:32:03.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:03 vm08.local ceph-mon[103420]: from='osd.2 [v2:192.168.123.107:6818/2210390190,v1:192.168.123.107:6819/2210390190]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-09T19:32:03.730 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:03 vm07.local ceph-mon[111841]: from='osd.2 [v2:192.168.123.107:6818/2210390190,v1:192.168.123.107:6819/2210390190]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T19:32:03.730 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:32:03 vm07.local ceph-mon[111841]: osdmap e81: 6 total, 5 up, 6 in 2026-03-09T19:32:03.730 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:03 vm07.local ceph-mon[111841]: from='osd.2 [v2:192.168.123.107:6818/2210390190,v1:192.168.123.107:6819/2210390190]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-09T19:32:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:04 vm08.local ceph-mon[103420]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T19:32:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:04 vm08.local ceph-mon[103420]: osd.2 [v2:192.168.123.107:6818/2210390190,v1:192.168.123.107:6819/2210390190] boot 2026-03-09T19:32:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:04 vm08.local ceph-mon[103420]: osdmap e82: 6 total, 6 up, 6 in 2026-03-09T19:32:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:04 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:32:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:04 vm08.local ceph-mon[103420]: pgmap v144: 65 pgs: 14 active+undersized, 13 active+undersized+degraded, 38 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 38/291 objects degraded (13.058%) 2026-03-09T19:32:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:04 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:32:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:04 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 38/291 objects degraded (13.058%), 13 pgs degraded (PG_DEGRADED) 2026-03-09T19:32:05.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:04 vm07.local 
ceph-mon[111841]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T19:32:05.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:04 vm07.local ceph-mon[111841]: osd.2 [v2:192.168.123.107:6818/2210390190,v1:192.168.123.107:6819/2210390190] boot 2026-03-09T19:32:05.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:04 vm07.local ceph-mon[111841]: osdmap e82: 6 total, 6 up, 6 in 2026-03-09T19:32:05.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:04 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T19:32:05.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:04 vm07.local ceph-mon[111841]: pgmap v144: 65 pgs: 14 active+undersized, 13 active+undersized+degraded, 38 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 38/291 objects degraded (13.058%) 2026-03-09T19:32:05.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:04 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:32:05.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:04 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 38/291 objects degraded (13.058%), 13 pgs degraded (PG_DEGRADED) 2026-03-09T19:32:06.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:05 vm08.local ceph-mon[103420]: osdmap e83: 6 total, 6 up, 6 in 2026-03-09T19:32:06.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:05 vm07.local ceph-mon[111841]: osdmap e83: 6 total, 6 up, 6 in 2026-03-09T19:32:07.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:06 vm08.local ceph-mon[103420]: pgmap v146: 65 pgs: 7 activating, 7 active+undersized, 7 active+undersized+degraded, 44 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 20/291 objects degraded (6.873%) 
2026-03-09T19:32:07.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:06 vm07.local ceph-mon[111841]: pgmap v146: 65 pgs: 7 activating, 7 active+undersized, 7 active+undersized+degraded, 44 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 20/291 objects degraded (6.873%) 2026-03-09T19:32:08.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:07 vm08.local ceph-mon[103420]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 20/291 objects degraded (6.873%), 7 pgs degraded) 2026-03-09T19:32:08.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:07 vm08.local ceph-mon[103420]: Cluster is now healthy 2026-03-09T19:32:08.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:07 vm07.local ceph-mon[111841]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 20/291 objects degraded (6.873%), 7 pgs degraded) 2026-03-09T19:32:08.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:07 vm07.local ceph-mon[111841]: Cluster is now healthy 2026-03-09T19:32:09.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:08 vm08.local ceph-mon[103420]: pgmap v147: 65 pgs: 7 activating, 58 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:09.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:08 vm07.local ceph-mon[111841]: pgmap v147: 65 pgs: 7 activating, 58 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:11.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:10 vm07.local ceph-mon[111841]: pgmap v148: 65 pgs: 7 activating, 58 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:11.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:10 vm08.local ceph-mon[103420]: pgmap v148: 65 pgs: 7 activating, 58 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:13.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:12 vm07.local ceph-mon[111841]: pgmap v149: 
65 pgs: 65 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:13.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:12 vm08.local ceph-mon[103420]: pgmap v149: 65 pgs: 65 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:14.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.548+0000 7ff626b14640 1 -- 192.168.123.107:0/2380641430 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff620103c80 msgr2=0x7ff620104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:14.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.548+0000 7ff626b14640 1 --2- 192.168.123.107:0/2380641430 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff620103c80 0x7ff620104100 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7ff610009a00 tx=0x7ff61002f290 comp rx=0 tx=0).stop 2026-03-09T19:32:14.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.550+0000 7ff626b14640 1 -- 192.168.123.107:0/2380641430 shutdown_connections 2026-03-09T19:32:14.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.550+0000 7ff626b14640 1 --2- 192.168.123.107:0/2380641430 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff620103c80 0x7ff620104100 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.550+0000 7ff626b14640 1 --2- 192.168.123.107:0/2380641430 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff620102a80 0x7ff620102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.550+0000 7ff626b14640 1 -- 192.168.123.107:0/2380641430 >> 192.168.123.107:0/2380641430 conn(0x7ff6200fe250 msgr2=0x7ff620100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:14.549 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.550+0000 7ff626b14640 1 -- 192.168.123.107:0/2380641430 shutdown_connections 2026-03-09T19:32:14.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.550+0000 7ff626b14640 1 -- 192.168.123.107:0/2380641430 wait complete. 2026-03-09T19:32:14.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.550+0000 7ff626b14640 1 Processor -- start 2026-03-09T19:32:14.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.551+0000 7ff626b14640 1 -- start start 2026-03-09T19:32:14.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.551+0000 7ff626b14640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff620102a80 0x7ff620078fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:14.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.551+0000 7ff626b14640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff620103c80 0x7ff6200794e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:14.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.551+0000 7ff626b14640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff620075a00 con 0x7ff620103c80 2026-03-09T19:32:14.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.551+0000 7ff626b14640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff620075b70 con 0x7ff620102a80 2026-03-09T19:32:14.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.551+0000 7ff617fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff620103c80 0x7ff6200794e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:14.550 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.551+0000 7ff617fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff620103c80 0x7ff6200794e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:60696/0 (socket says 192.168.123.107:60696) 2026-03-09T19:32:14.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.551+0000 7ff617fff640 1 -- 192.168.123.107:0/3032063971 learned_addr learned my addr 192.168.123.107:0/3032063971 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:14.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.551+0000 7ff617fff640 1 -- 192.168.123.107:0/3032063971 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff620102a80 msgr2=0x7ff620078fa0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:14.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.551+0000 7ff617fff640 1 --2- 192.168.123.107:0/3032063971 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff620102a80 0x7ff620078fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.551+0000 7ff617fff640 1 -- 192.168.123.107:0/3032063971 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff610009660 con 0x7ff620103c80 2026-03-09T19:32:14.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.552+0000 7ff617fff640 1 --2- 192.168.123.107:0/3032063971 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff620103c80 0x7ff6200794e0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7ff61002f7a0 tx=0x7ff610031b10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:14.551 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.552+0000 7ff615ffb640 1 -- 192.168.123.107:0/3032063971 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff610031cb0 con 0x7ff620103c80 2026-03-09T19:32:14.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.552+0000 7ff626b14640 1 -- 192.168.123.107:0/3032063971 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff620075df0 con 0x7ff620103c80 2026-03-09T19:32:14.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.552+0000 7ff626b14640 1 -- 192.168.123.107:0/3032063971 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff620076360 con 0x7ff620103c80 2026-03-09T19:32:14.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.552+0000 7ff615ffb640 1 -- 192.168.123.107:0/3032063971 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff610031e10 con 0x7ff620103c80 2026-03-09T19:32:14.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.552+0000 7ff615ffb640 1 -- 192.168.123.107:0/3032063971 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff610031280 con 0x7ff620103c80 2026-03-09T19:32:14.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.553+0000 7ff615ffb640 1 -- 192.168.123.107:0/3032063971 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff61003f070 con 0x7ff620103c80 2026-03-09T19:32:14.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.554+0000 7ff615ffb640 1 --2- 192.168.123.107:0/3032063971 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5f00779b0 0x7ff5f0079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:14.553 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.554+0000 7ff626b14640 1 -- 192.168.123.107:0/3032063971 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff5e8005350 con 0x7ff620103c80 2026-03-09T19:32:14.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.554+0000 7ff624889640 1 --2- 192.168.123.107:0/3032063971 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5f00779b0 0x7ff5f0079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:14.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.554+0000 7ff624889640 1 --2- 192.168.123.107:0/3032063971 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5f00779b0 0x7ff5f0079e70 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7ff608006fd0 tx=0x7ff608006d40 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:14.556 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.555+0000 7ff615ffb640 1 -- 192.168.123.107:0/3032063971 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6480+0+0 (secure 0 0 0) 0x7ff6100be510 con 0x7ff620103c80 2026-03-09T19:32:14.556 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.558+0000 7ff615ffb640 1 -- 192.168.123.107:0/3032063971 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff610086d40 con 0x7ff620103c80 2026-03-09T19:32:14.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.663+0000 7ff626b14640 1 -- 192.168.123.107:0/3032063971 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7ff5e8002bf0 con 0x7ff5f00779b0 2026-03-09T19:32:14.663 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.664+0000 7ff615ffb640 1 -- 192.168.123.107:0/3032063971 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7ff5e8002bf0 con 0x7ff5f00779b0 2026-03-09T19:32:14.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.667+0000 7ff626b14640 1 -- 192.168.123.107:0/3032063971 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5f00779b0 msgr2=0x7ff5f0079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:14.665 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.667+0000 7ff626b14640 1 --2- 192.168.123.107:0/3032063971 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5f00779b0 0x7ff5f0079e70 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7ff608006fd0 tx=0x7ff608006d40 comp rx=0 tx=0).stop 2026-03-09T19:32:14.666 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.667+0000 7ff626b14640 1 -- 192.168.123.107:0/3032063971 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff620103c80 msgr2=0x7ff6200794e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:14.666 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.667+0000 7ff626b14640 1 --2- 192.168.123.107:0/3032063971 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff620103c80 0x7ff6200794e0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7ff61002f7a0 tx=0x7ff610031b10 comp rx=0 tx=0).stop 2026-03-09T19:32:14.666 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.667+0000 7ff626b14640 1 -- 192.168.123.107:0/3032063971 shutdown_connections 2026-03-09T19:32:14.666 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.667+0000 7ff626b14640 1 --2- 192.168.123.107:0/3032063971 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5f00779b0 0x7ff5f0079e70 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.666 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.667+0000 7ff626b14640 1 --2- 192.168.123.107:0/3032063971 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff620103c80 0x7ff6200794e0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.666 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.667+0000 7ff626b14640 1 --2- 192.168.123.107:0/3032063971 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff620102a80 0x7ff620078fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.666 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.667+0000 7ff626b14640 1 -- 192.168.123.107:0/3032063971 >> 192.168.123.107:0/3032063971 conn(0x7ff6200fe250 msgr2=0x7ff6200ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:14.666 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.667+0000 7ff626b14640 1 -- 192.168.123.107:0/3032063971 shutdown_connections 2026-03-09T19:32:14.666 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.668+0000 7ff626b14640 1 -- 192.168.123.107:0/3032063971 wait complete. 
2026-03-09T19:32:14.675 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:32:14.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.726+0000 7ffa97fff640 1 -- 192.168.123.107:0/3615599367 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa981028b0 msgr2=0x7ffa98102cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:14.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.726+0000 7ffa97fff640 1 --2- 192.168.123.107:0/3615599367 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa981028b0 0x7ffa98102cb0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7ffa8c0099b0 tx=0x7ffa8c02f260 comp rx=0 tx=0).stop 2026-03-09T19:32:14.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.726+0000 7ffa97fff640 1 -- 192.168.123.107:0/3615599367 shutdown_connections 2026-03-09T19:32:14.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.726+0000 7ffa97fff640 1 --2- 192.168.123.107:0/3615599367 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa98103ab0 0x7ffa98103f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.726+0000 7ffa97fff640 1 --2- 192.168.123.107:0/3615599367 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa981028b0 0x7ffa98102cb0 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.726+0000 7ffa97fff640 1 -- 192.168.123.107:0/3615599367 >> 192.168.123.107:0/3615599367 conn(0x7ffa980fe040 msgr2=0x7ffa98100480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:14.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.726+0000 7ffa97fff640 1 -- 192.168.123.107:0/3615599367 shutdown_connections 2026-03-09T19:32:14.725 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.726+0000 7ffa97fff640 1 -- 192.168.123.107:0/3615599367 wait complete. 2026-03-09T19:32:14.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.727+0000 7ffa97fff640 1 Processor -- start 2026-03-09T19:32:14.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.727+0000 7ffa97fff640 1 -- start start 2026-03-09T19:32:14.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.727+0000 7ffa97fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa981028b0 0x7ffa9819a5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:14.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.727+0000 7ffa97fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa98103ab0 0x7ffa9819ab10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:14.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.727+0000 7ffa97fff640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffa9819b050 con 0x7ffa98103ab0 2026-03-09T19:32:14.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.727+0000 7ffa97fff640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffa9819b1c0 con 0x7ffa981028b0 2026-03-09T19:32:14.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.728+0000 7ffa96ffd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa981028b0 0x7ffa9819a5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:14.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.728+0000 7ffa967fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa98103ab0 0x7ffa9819ab10 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:14.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.728+0000 7ffa96ffd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa981028b0 0x7ffa9819a5d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:38430/0 (socket says 192.168.123.107:38430) 2026-03-09T19:32:14.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.728+0000 7ffa96ffd640 1 -- 192.168.123.107:0/3452663871 learned_addr learned my addr 192.168.123.107:0/3452663871 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:14.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.728+0000 7ffa967fc640 1 -- 192.168.123.107:0/3452663871 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa981028b0 msgr2=0x7ffa9819a5d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:14.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.728+0000 7ffa967fc640 1 --2- 192.168.123.107:0/3452663871 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa981028b0 0x7ffa9819a5d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.728+0000 7ffa967fc640 1 -- 192.168.123.107:0/3452663871 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ffa8c009660 con 0x7ffa98103ab0 2026-03-09T19:32:14.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.728+0000 7ffa967fc640 1 --2- 192.168.123.107:0/3452663871 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa98103ab0 0x7ffa9819ab10 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7ffa8000cc60 
tx=0x7ffa80007590 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:14.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.729+0000 7ffa77fff640 1 -- 192.168.123.107:0/3452663871 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffa80007cb0 con 0x7ffa98103ab0 2026-03-09T19:32:14.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.729+0000 7ffa77fff640 1 -- 192.168.123.107:0/3452663871 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ffa80007e10 con 0x7ffa98103ab0 2026-03-09T19:32:14.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.729+0000 7ffa77fff640 1 -- 192.168.123.107:0/3452663871 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffa8000f450 con 0x7ffa98103ab0 2026-03-09T19:32:14.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.729+0000 7ffa97fff640 1 -- 192.168.123.107:0/3452663871 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ffa9819fca0 con 0x7ffa98103ab0 2026-03-09T19:32:14.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.729+0000 7ffa97fff640 1 -- 192.168.123.107:0/3452663871 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ffa981a01a0 con 0x7ffa98103ab0 2026-03-09T19:32:14.729 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.730+0000 7ffa97fff640 1 -- 192.168.123.107:0/3452663871 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ffa64005350 con 0x7ffa98103ab0 2026-03-09T19:32:14.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.731+0000 7ffa77fff640 1 -- 192.168.123.107:0/3452663871 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ffa800040a0 con 
0x7ffa98103ab0 2026-03-09T19:32:14.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.732+0000 7ffa77fff640 1 --2- 192.168.123.107:0/3452663871 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ffa700778e0 0x7ffa70079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:14.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.732+0000 7ffa77fff640 1 -- 192.168.123.107:0/3452663871 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6480+0+0 (secure 0 0 0) 0x7ffa8001d030 con 0x7ffa98103ab0 2026-03-09T19:32:14.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.734+0000 7ffa96ffd640 1 --2- 192.168.123.107:0/3452663871 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ffa700778e0 0x7ffa70079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:14.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.734+0000 7ffa77fff640 1 -- 192.168.123.107:0/3452663871 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ffa80062280 con 0x7ffa98103ab0 2026-03-09T19:32:14.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.735+0000 7ffa96ffd640 1 --2- 192.168.123.107:0/3452663871 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ffa700778e0 0x7ffa70079da0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7ffa8c002c20 tx=0x7ffa8c03a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:14.847 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.847+0000 7ffa97fff640 1 -- 192.168.123.107:0/3452663871 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 
0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ffa64002bf0 con 0x7ffa700778e0 2026-03-09T19:32:14.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.853+0000 7ffa77fff640 1 -- 192.168.123.107:0/3452663871 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7ffa64002bf0 con 0x7ffa700778e0 2026-03-09T19:32:14.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.855+0000 7ffa97fff640 1 -- 192.168.123.107:0/3452663871 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ffa700778e0 msgr2=0x7ffa70079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:14.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.855+0000 7ffa97fff640 1 --2- 192.168.123.107:0/3452663871 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ffa700778e0 0x7ffa70079da0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7ffa8c002c20 tx=0x7ffa8c03a040 comp rx=0 tx=0).stop 2026-03-09T19:32:14.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.855+0000 7ffa97fff640 1 -- 192.168.123.107:0/3452663871 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa98103ab0 msgr2=0x7ffa9819ab10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:14.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.855+0000 7ffa97fff640 1 --2- 192.168.123.107:0/3452663871 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa98103ab0 0x7ffa9819ab10 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7ffa8000cc60 tx=0x7ffa80007590 comp rx=0 tx=0).stop 2026-03-09T19:32:14.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.856+0000 7ffa97fff640 1 -- 192.168.123.107:0/3452663871 shutdown_connections 2026-03-09T19:32:14.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.856+0000 7ffa97fff640 1 --2- 
192.168.123.107:0/3452663871 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ffa700778e0 0x7ffa70079da0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.856+0000 7ffa97fff640 1 --2- 192.168.123.107:0/3452663871 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa98103ab0 0x7ffa9819ab10 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.856+0000 7ffa97fff640 1 --2- 192.168.123.107:0/3452663871 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa981028b0 0x7ffa9819a5d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.856+0000 7ffa97fff640 1 -- 192.168.123.107:0/3452663871 >> 192.168.123.107:0/3452663871 conn(0x7ffa980fe040 msgr2=0x7ffa98106ce0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:14.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.856+0000 7ffa97fff640 1 -- 192.168.123.107:0/3452663871 shutdown_connections 2026-03-09T19:32:14.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.856+0000 7ffa97fff640 1 -- 192.168.123.107:0/3452663871 wait complete. 
2026-03-09T19:32:14.919 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.920+0000 7f789aff1640 1 -- 192.168.123.107:0/3280291443 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7894101a10 msgr2=0x7f7894101e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:14.919 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.920+0000 7f789aff1640 1 --2- 192.168.123.107:0/3280291443 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7894101a10 0x7f7894101e90 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f7884009a00 tx=0x7f788402f290 comp rx=0 tx=0).stop 2026-03-09T19:32:14.919 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.921+0000 7f789aff1640 1 -- 192.168.123.107:0/3280291443 shutdown_connections 2026-03-09T19:32:14.919 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.921+0000 7f789aff1640 1 --2- 192.168.123.107:0/3280291443 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7894101a10 0x7f7894101e90 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.919 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.921+0000 7f789aff1640 1 --2- 192.168.123.107:0/3280291443 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7894100810 0x7f7894100c10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.920 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.921+0000 7f789aff1640 1 -- 192.168.123.107:0/3280291443 >> 192.168.123.107:0/3280291443 conn(0x7f78940fbf80 msgr2=0x7f78940fe3e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:14.920 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.921+0000 7f789aff1640 1 -- 192.168.123.107:0/3280291443 shutdown_connections 2026-03-09T19:32:14.920 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.921+0000 7f789aff1640 1 -- 192.168.123.107:0/3280291443 
wait complete. 2026-03-09T19:32:14.920 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.921+0000 7f789aff1640 1 Processor -- start 2026-03-09T19:32:14.920 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.922+0000 7f789aff1640 1 -- start start 2026-03-09T19:32:14.921 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.922+0000 7f789aff1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7894100810 0x7f7894075080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:14.921 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.922+0000 7f789aff1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7894071ae0 0x7f7894071f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:14.921 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.922+0000 7f789aff1640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f78940724a0 con 0x7f7894100810 2026-03-09T19:32:14.921 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.922+0000 7f789aff1640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7894072610 con 0x7f7894071ae0 2026-03-09T19:32:14.921 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.922+0000 7f7898d66640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7894100810 0x7f7894075080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:14.921 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.922+0000 7f788bfff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7894071ae0 0x7f7894071f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T19:32:14.921 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.922+0000 7f788bfff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7894071ae0 0x7f7894071f60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:38440/0 (socket says 192.168.123.107:38440) 2026-03-09T19:32:14.921 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.922+0000 7f788bfff640 1 -- 192.168.123.107:0/3563113857 learned_addr learned my addr 192.168.123.107:0/3563113857 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:14.922 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.923+0000 7f788bfff640 1 -- 192.168.123.107:0/3563113857 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7894100810 msgr2=0x7f7894075080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:14.922 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.923+0000 7f788bfff640 1 --2- 192.168.123.107:0/3563113857 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7894100810 0x7f7894075080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:14.922 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.923+0000 7f788bfff640 1 -- 192.168.123.107:0/3563113857 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7884009660 con 0x7f7894071ae0 2026-03-09T19:32:14.922 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.923+0000 7f7898d66640 1 --2- 192.168.123.107:0/3563113857 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7894100810 0x7f7894075080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-09T19:32:14.922 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.924+0000 7f788bfff640 1 --2- 192.168.123.107:0/3563113857 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7894071ae0 0x7f7894071f60 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f788402f7a0 tx=0x7f7884031d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:14.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.924+0000 7f7889ffb640 1 -- 192.168.123.107:0/3563113857 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f788402faf0 con 0x7f7894071ae0 2026-03-09T19:32:14.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.924+0000 7f789aff1640 1 -- 192.168.123.107:0/3563113857 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f78941a47f0 con 0x7f7894071ae0 2026-03-09T19:32:14.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.924+0000 7f789aff1640 1 -- 192.168.123.107:0/3563113857 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f78941a4d00 con 0x7f7894071ae0 2026-03-09T19:32:14.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.925+0000 7f7889ffb640 1 -- 192.168.123.107:0/3563113857 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f788402fc50 con 0x7f7894071ae0 2026-03-09T19:32:14.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.925+0000 7f7889ffb640 1 -- 192.168.123.107:0/3563113857 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7884038710 con 0x7f7894071ae0 2026-03-09T19:32:14.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.925+0000 7f7889ffb640 1 -- 192.168.123.107:0/3563113857 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7884048050 con 
0x7f7894071ae0 2026-03-09T19:32:14.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.926+0000 7f7889ffb640 1 --2- 192.168.123.107:0/3563113857 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7870077890 0x7f7870079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:14.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.926+0000 7f7889ffb640 1 -- 192.168.123.107:0/3563113857 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f788402fdc0 con 0x7f7894071ae0 2026-03-09T19:32:14.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.926+0000 7f7898d66640 1 --2- 192.168.123.107:0/3563113857 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7870077890 0x7f7870079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:14.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.927+0000 7f7898d66640 1 --2- 192.168.123.107:0/3563113857 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7870077890 0x7f7870079d50 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f7894101870 tx=0x7f787c009210 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:14.926 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.927+0000 7f789aff1640 1 -- 192.168.123.107:0/3563113857 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7894101e90 con 0x7f7894071ae0 2026-03-09T19:32:14.929 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:14.930+0000 7f7889ffb640 1 -- 192.168.123.107:0/3563113857 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f7884084760 con 0x7f7894071ae0 2026-03-09T19:32:15.031 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.032+0000 7f789aff1640 1 -- 192.168.123.107:0/3563113857 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f7894105fa0 con 0x7f7870077890 2026-03-09T19:32:15.036 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.037+0000 7f7889ffb640 1 -- 192.168.123.107:0/3563113857 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f7894105fa0 con 0x7f7870077890 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (9m) 16s ago 9m 23.2M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (9m) 16s ago 9m 9773k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (9m) 3m ago 9m 10.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (3m) 16s ago 9m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (3m) 3m ago 9m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (9m) 16s ago 9m 89.5M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (7m) 16s ago 7m 17.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 
2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (7m) 16s ago 7m 19.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (7m) 3m ago 7m 28.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (7m) 3m ago 7m 239M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (4m) 16s ago 10m 602M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (3m) 3m ago 9m 489M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (3m) 16s ago 10m 62.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (3m) 3m ago 9m 46.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (9m) 16s ago 9m 15.1M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (9m) 3m ago 9m 16.6M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (3m) 16s ago 8m 226M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (40s) 16s ago 8m 108M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 36b65f1069e5 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (18s) 16s ago 8m 13.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 
1b8bc1f96eb7 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (8m) 3m ago 8m 441M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 740e44caf4fc 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (8m) 3m ago 8m 446M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 d929d31f8a58 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (8m) 3m ago 8m 348M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:32:15.037 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (3m) 16s ago 9m 59.1M - 2.43.0 a07b618ecd1d c09450c20f5f 2026-03-09T19:32:15.039 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.040+0000 7f789aff1640 1 -- 192.168.123.107:0/3563113857 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7870077890 msgr2=0x7f7870079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.039 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.041+0000 7f789aff1640 1 --2- 192.168.123.107:0/3563113857 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7870077890 0x7f7870079d50 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f7894101870 tx=0x7f787c009210 comp rx=0 tx=0).stop 2026-03-09T19:32:15.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.041+0000 7f789aff1640 1 -- 192.168.123.107:0/3563113857 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7894071ae0 msgr2=0x7f7894071f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.041+0000 7f789aff1640 1 --2- 192.168.123.107:0/3563113857 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7894071ae0 0x7f7894071f60 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f788402f7a0 tx=0x7f7884031d40 comp rx=0 tx=0).stop 2026-03-09T19:32:15.040 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.041+0000 7f789aff1640 1 -- 192.168.123.107:0/3563113857 shutdown_connections 2026-03-09T19:32:15.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.041+0000 7f789aff1640 1 --2- 192.168.123.107:0/3563113857 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7870077890 0x7f7870079d50 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.041+0000 7f789aff1640 1 --2- 192.168.123.107:0/3563113857 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7894071ae0 0x7f7894071f60 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.040 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.042+0000 7f789aff1640 1 --2- 192.168.123.107:0/3563113857 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7894100810 0x7f7894075080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.041 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.042+0000 7f789aff1640 1 -- 192.168.123.107:0/3563113857 >> 192.168.123.107:0/3563113857 conn(0x7f78940fbf80 msgr2=0x7f78940fd900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:15.041 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.042+0000 7f789aff1640 1 -- 192.168.123.107:0/3563113857 shutdown_connections 2026-03-09T19:32:15.041 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.042+0000 7f789aff1640 1 -- 192.168.123.107:0/3563113857 wait complete. 
2026-03-09T19:32:15.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.101+0000 7f825c406640 1 -- 192.168.123.107:0/4089689695 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8254103c80 msgr2=0x7f8254104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.101 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.101+0000 7f825c406640 1 --2- 192.168.123.107:0/4089689695 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8254103c80 0x7f8254104100 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f8244009a00 tx=0x7f824402f290 comp rx=0 tx=0).stop 2026-03-09T19:32:15.101 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:14 vm07.local ceph-mon[111841]: pgmap v150: 65 pgs: 65 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:15.102 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.102+0000 7f825c406640 1 -- 192.168.123.107:0/4089689695 shutdown_connections 2026-03-09T19:32:15.102 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.102+0000 7f825c406640 1 --2- 192.168.123.107:0/4089689695 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8254103c80 0x7f8254104100 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.102 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.102+0000 7f825c406640 1 --2- 192.168.123.107:0/4089689695 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8254102a80 0x7f8254102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.102 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.103+0000 7f825c406640 1 -- 192.168.123.107:0/4089689695 >> 192.168.123.107:0/4089689695 conn(0x7f82540fe250 msgr2=0x7f8254100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:15.104 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.105+0000 7f825c406640 
1 -- 192.168.123.107:0/4089689695 shutdown_connections 2026-03-09T19:32:15.105 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.107+0000 7f825c406640 1 -- 192.168.123.107:0/4089689695 wait complete. 2026-03-09T19:32:15.106 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.107+0000 7f825c406640 1 Processor -- start 2026-03-09T19:32:15.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.108+0000 7f825c406640 1 -- start start 2026-03-09T19:32:15.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.108+0000 7f825c406640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8254102a80 0x7f825419a4d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.109+0000 7f825a17b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8254102a80 0x7f825419a4d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.109+0000 7f825a17b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8254102a80 0x7f825419a4d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:60758/0 (socket says 192.168.123.107:60758) 2026-03-09T19:32:15.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.108+0000 7f825c406640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8254103c80 0x7f825419aa10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.110+0000 7f825c406640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f825419afe0 con 0x7f8254102a80 2026-03-09T19:32:15.108 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.110+0000 7f825a17b640 1 -- 192.168.123.107:0/3009411012 learned_addr learned my addr 192.168.123.107:0/3009411012 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:15.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.110+0000 7f825c406640 1 -- 192.168.123.107:0/3009411012 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f825419b150 con 0x7f8254103c80 2026-03-09T19:32:15.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.110+0000 7f825997a640 1 --2- 192.168.123.107:0/3009411012 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8254103c80 0x7f825419aa10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.110+0000 7f825a17b640 1 -- 192.168.123.107:0/3009411012 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8254103c80 msgr2=0x7f825419aa10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.110+0000 7f825a17b640 1 --2- 192.168.123.107:0/3009411012 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8254103c80 0x7f825419aa10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.110+0000 7f825a17b640 1 -- 192.168.123.107:0/3009411012 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8244009660 con 0x7f8254102a80 2026-03-09T19:32:15.109 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.111+0000 7f825a17b640 1 --2- 192.168.123.107:0/3009411012 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8254102a80 0x7f825419a4d0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f824800b700 tx=0x7f824800bbd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:15.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.111+0000 7f82437fe640 1 -- 192.168.123.107:0/3009411012 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f824800be90 con 0x7f8254102a80 2026-03-09T19:32:15.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.111+0000 7f82437fe640 1 -- 192.168.123.107:0/3009411012 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f82480027a0 con 0x7f8254102a80 2026-03-09T19:32:15.111 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.112+0000 7f82437fe640 1 -- 192.168.123.107:0/3009411012 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f824800ca00 con 0x7f8254102a80 2026-03-09T19:32:15.111 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.112+0000 7f825c406640 1 -- 192.168.123.107:0/3009411012 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f825419fbf0 con 0x7f8254102a80 2026-03-09T19:32:15.112 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.112+0000 7f825c406640 1 -- 192.168.123.107:0/3009411012 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f82541a0110 con 0x7f8254102a80 2026-03-09T19:32:15.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.114+0000 7f82437fe640 1 -- 192.168.123.107:0/3009411012 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f824800cb60 con 0x7f8254102a80 2026-03-09T19:32:15.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.114+0000 7f82437fe640 1 --2- 
192.168.123.107:0/3009411012 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8234077680 0x7f8234079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.114+0000 7f82437fe640 1 -- 192.168.123.107:0/3009411012 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f8248098fd0 con 0x7f8254102a80 2026-03-09T19:32:15.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.114+0000 7f825997a640 1 --2- 192.168.123.107:0/3009411012 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8234077680 0x7f8234079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.113 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.114+0000 7f825c406640 1 -- 192.168.123.107:0/3009411012 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8224005350 con 0x7f8254102a80 2026-03-09T19:32:15.114 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.115+0000 7f825997a640 1 --2- 192.168.123.107:0/3009411012 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8234077680 0x7f8234079b40 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f8244038660 tx=0x7f8244038470 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:15.117 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.118+0000 7f82437fe640 1 -- 192.168.123.107:0/3009411012 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f82480617e0 con 0x7f8254102a80 2026-03-09T19:32:15.255 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.256+0000 7f825c406640 1 -- 192.168.123.107:0/3009411012 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f8224005e10 con 0x7f8254102a80 2026-03-09T19:32:15.256 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.257+0000 7f82437fe640 1 -- 192.168.123.107:0/3009411012 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+844 (secure 0 0 0) 0x7f8248060f30 con 0x7f8254102a80 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 3, 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:32:15.257 
INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 7, 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 7 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:32:15.257 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:32:15.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.261+0000 7f825c406640 1 -- 192.168.123.107:0/3009411012 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8234077680 msgr2=0x7f8234079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.261+0000 7f825c406640 1 --2- 192.168.123.107:0/3009411012 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8234077680 0x7f8234079b40 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f8244038660 tx=0x7f8244038470 comp rx=0 tx=0).stop 2026-03-09T19:32:15.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.261+0000 7f825c406640 1 -- 192.168.123.107:0/3009411012 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8254102a80 msgr2=0x7f825419a4d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.260 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.261+0000 7f825c406640 1 --2- 192.168.123.107:0/3009411012 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8254102a80 0x7f825419a4d0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f824800b700 tx=0x7f824800bbd0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.261 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.262+0000 
7f825c406640 1 -- 192.168.123.107:0/3009411012 shutdown_connections 2026-03-09T19:32:15.261 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.262+0000 7f825c406640 1 --2- 192.168.123.107:0/3009411012 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8234077680 0x7f8234079b40 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.261 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.262+0000 7f825c406640 1 --2- 192.168.123.107:0/3009411012 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8254103c80 0x7f825419aa10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.261 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.262+0000 7f825c406640 1 --2- 192.168.123.107:0/3009411012 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8254102a80 0x7f825419a4d0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.261 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.262+0000 7f825c406640 1 -- 192.168.123.107:0/3009411012 >> 192.168.123.107:0/3009411012 conn(0x7f82540fe250 msgr2=0x7f82540ffd30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:15.261 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.262+0000 7f825c406640 1 -- 192.168.123.107:0/3009411012 shutdown_connections 2026-03-09T19:32:15.261 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.262+0000 7f825c406640 1 -- 192.168.123.107:0/3009411012 wait complete. 
2026-03-09T19:32:15.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.324+0000 7ff38730c640 1 -- 192.168.123.107:0/2794463575 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff380102a80 msgr2=0x7ff380102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.324+0000 7ff38730c640 1 --2- 192.168.123.107:0/2794463575 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff380102a80 0x7ff380102e80 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7ff3700099b0 tx=0x7ff37002f240 comp rx=0 tx=0).stop 2026-03-09T19:32:15.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.324+0000 7ff38730c640 1 -- 192.168.123.107:0/2794463575 shutdown_connections 2026-03-09T19:32:15.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.324+0000 7ff38730c640 1 --2- 192.168.123.107:0/2794463575 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff380103c80 0x7ff380104100 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.324+0000 7ff38730c640 1 --2- 192.168.123.107:0/2794463575 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff380102a80 0x7ff380102e80 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.324+0000 7ff38730c640 1 -- 192.168.123.107:0/2794463575 >> 192.168.123.107:0/2794463575 conn(0x7ff3800fe250 msgr2=0x7ff380100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:15.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.324+0000 7ff38730c640 1 -- 192.168.123.107:0/2794463575 shutdown_connections 2026-03-09T19:32:15.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.324+0000 7ff38730c640 1 -- 192.168.123.107:0/2794463575 
wait complete. 2026-03-09T19:32:15.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.325+0000 7ff38730c640 1 Processor -- start 2026-03-09T19:32:15.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.325+0000 7ff38730c640 1 -- start start 2026-03-09T19:32:15.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.326+0000 7ff38730c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff380102a80 0x7ff38019a450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.326+0000 7ff385081640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff380102a80 0x7ff38019a450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.326+0000 7ff385081640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff380102a80 0x7ff38019a450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:60770/0 (socket says 192.168.123.107:60770) 2026-03-09T19:32:15.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.326+0000 7ff38730c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff380103c80 0x7ff38019a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.326+0000 7ff38730c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff38019af60 con 0x7ff380102a80 2026-03-09T19:32:15.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.326+0000 7ff38730c640 1 -- 192.168.123.107:0/3566803157 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff38019b0d0 con 0x7ff380103c80 2026-03-09T19:32:15.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.326+0000 7ff385081640 1 -- 192.168.123.107:0/3566803157 learned_addr learned my addr 192.168.123.107:0/3566803157 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:15.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.326+0000 7ff384880640 1 --2- 192.168.123.107:0/3566803157 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff380103c80 0x7ff38019a990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.327+0000 7ff385081640 1 -- 192.168.123.107:0/3566803157 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff380103c80 msgr2=0x7ff38019a990 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.327+0000 7ff385081640 1 --2- 192.168.123.107:0/3566803157 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff380103c80 0x7ff38019a990 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.327+0000 7ff385081640 1 -- 192.168.123.107:0/3566803157 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff370009660 con 0x7ff380102a80 2026-03-09T19:32:15.326 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.327+0000 7ff385081640 1 --2- 192.168.123.107:0/3566803157 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff380102a80 0x7ff38019a450 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7ff370002fd0 tx=0x7ff370004290 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:15.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.328+0000 7ff36e7fc640 1 -- 192.168.123.107:0/3566803157 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff37003d070 con 0x7ff380102a80 2026-03-09T19:32:15.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.328+0000 7ff36e7fc640 1 -- 192.168.123.107:0/3566803157 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff370004440 con 0x7ff380102a80 2026-03-09T19:32:15.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.328+0000 7ff36e7fc640 1 -- 192.168.123.107:0/3566803157 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff3700418e0 con 0x7ff380102a80 2026-03-09T19:32:15.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.328+0000 7ff38730c640 1 -- 192.168.123.107:0/3566803157 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff38019fb10 con 0x7ff380102a80 2026-03-09T19:32:15.328 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.328+0000 7ff38730c640 1 -- 192.168.123.107:0/3566803157 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff380075910 con 0x7ff380102a80 2026-03-09T19:32:15.329 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.330+0000 7ff38730c640 1 -- 192.168.123.107:0/3566803157 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff348005350 con 0x7ff380102a80 2026-03-09T19:32:15.331 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.330+0000 7ff36e7fc640 1 -- 192.168.123.107:0/3566803157 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff370049050 con 0x7ff380102a80 2026-03-09T19:32:15.331 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.331+0000 7ff36e7fc640 1 --2- 192.168.123.107:0/3566803157 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff35c077890 0x7ff35c079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.331 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.331+0000 7ff36e7fc640 1 -- 192.168.123.107:0/3566803157 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6480+0+0 (secure 0 0 0) 0x7ff3700be450 con 0x7ff380102a80 2026-03-09T19:32:15.332 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.333+0000 7ff384880640 1 --2- 192.168.123.107:0/3566803157 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff35c077890 0x7ff35c079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.333 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.334+0000 7ff384880640 1 --2- 192.168.123.107:0/3566803157 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff35c077890 0x7ff35c079d50 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7ff38019b970 tx=0x7ff3740073d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:15.333 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.334+0000 7ff36e7fc640 1 -- 192.168.123.107:0/3566803157 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff370086b50 con 0x7ff380102a80 2026-03-09T19:32:15.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:14 vm08.local ceph-mon[103420]: pgmap v150: 65 pgs: 65 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:15.450 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.449+0000 7ff38730c640 1 -- 192.168.123.107:0/3566803157 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7ff348005e10 con 0x7ff380102a80 2026-03-09T19:32:15.451 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.452+0000 7ff36e7fc640 1 -- 192.168.123.107:0/3566803157 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1926 (secure 0 0 0) 0x7ff37002fcb0 con 0x7ff380102a80 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 
2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:32:15.453 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 
2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 0 members: 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.455+0000 7ff38730c640 1 -- 192.168.123.107:0/3566803157 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff35c077890 msgr2=0x7ff35c079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.454 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.455+0000 7ff38730c640 1 --2- 192.168.123.107:0/3566803157 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff35c077890 0x7ff35c079d50 
secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7ff38019b970 tx=0x7ff3740073d0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.455 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.456+0000 7ff38730c640 1 -- 192.168.123.107:0/3566803157 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff380102a80 msgr2=0x7ff38019a450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.455 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.456+0000 7ff38730c640 1 --2- 192.168.123.107:0/3566803157 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff380102a80 0x7ff38019a450 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7ff370002fd0 tx=0x7ff370004290 comp rx=0 tx=0).stop 2026-03-09T19:32:15.455 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.456+0000 7ff38730c640 1 -- 192.168.123.107:0/3566803157 shutdown_connections 2026-03-09T19:32:15.455 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.456+0000 7ff38730c640 1 --2- 192.168.123.107:0/3566803157 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff35c077890 0x7ff35c079d50 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.455 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.456+0000 7ff38730c640 1 --2- 192.168.123.107:0/3566803157 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff380103c80 0x7ff38019a990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.455 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.456+0000 7ff38730c640 1 --2- 192.168.123.107:0/3566803157 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff380102a80 0x7ff38019a450 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.455 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.456+0000 7ff38730c640 1 -- 
192.168.123.107:0/3566803157 >> 192.168.123.107:0/3566803157 conn(0x7ff3800fe250 msgr2=0x7ff3800ffa00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:15.455 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.456+0000 7ff38730c640 1 -- 192.168.123.107:0/3566803157 shutdown_connections 2026-03-09T19:32:15.455 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.456+0000 7ff38730c640 1 -- 192.168.123.107:0/3566803157 wait complete. 2026-03-09T19:32:15.518 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.518+0000 7f0033ea5640 1 -- 192.168.123.107:0/4111892460 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f002c102a80 msgr2=0x7f002c102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.518 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.518+0000 7f0033ea5640 1 --2- 192.168.123.107:0/4111892460 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f002c102a80 0x7f002c102e80 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f00200099b0 tx=0x7f002002f220 comp rx=0 tx=0).stop 2026-03-09T19:32:15.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.519+0000 7f0033ea5640 1 -- 192.168.123.107:0/4111892460 shutdown_connections 2026-03-09T19:32:15.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.519+0000 7f0033ea5640 1 --2- 192.168.123.107:0/4111892460 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f002c103c80 0x7f002c104100 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.519+0000 7f0033ea5640 1 --2- 192.168.123.107:0/4111892460 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f002c102a80 0x7f002c102e80 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.519+0000 
7f0033ea5640 1 -- 192.168.123.107:0/4111892460 >> 192.168.123.107:0/4111892460 conn(0x7f002c0fe250 msgr2=0x7f002c100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:15.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.519+0000 7f0033ea5640 1 -- 192.168.123.107:0/4111892460 shutdown_connections 2026-03-09T19:32:15.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.519+0000 7f0033ea5640 1 -- 192.168.123.107:0/4111892460 wait complete. 2026-03-09T19:32:15.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.520+0000 7f0033ea5640 1 Processor -- start 2026-03-09T19:32:15.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.520+0000 7f0033ea5640 1 -- start start 2026-03-09T19:32:15.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.520+0000 7f0033ea5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f002c102a80 0x7f002c112890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.520+0000 7f0033ea5640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f002c103c80 0x7f002c10f2f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.520+0000 7f0033ea5640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f002c10f830 con 0x7f002c102a80 2026-03-09T19:32:15.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.520+0000 7f0033ea5640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f002c10f9a0 con 0x7f002c103c80 2026-03-09T19:32:15.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.520+0000 7f0031c1a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f002c102a80 0x7f002c112890 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.521+0000 7f0031419640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f002c103c80 0x7f002c10f2f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.521+0000 7f0031c1a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f002c102a80 0x7f002c112890 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:60792/0 (socket says 192.168.123.107:60792) 2026-03-09T19:32:15.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.521+0000 7f0031c1a640 1 -- 192.168.123.107:0/1675875892 learned_addr learned my addr 192.168.123.107:0/1675875892 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:15.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.521+0000 7f0031419640 1 -- 192.168.123.107:0/1675875892 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f002c102a80 msgr2=0x7f002c112890 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.521+0000 7f0031419640 1 --2- 192.168.123.107:0/1675875892 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f002c102a80 0x7f002c112890 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.521+0000 7f0031419640 1 -- 192.168.123.107:0/1675875892 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f0020009660 con 0x7f002c103c80 2026-03-09T19:32:15.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.521+0000 7f0031c1a640 1 --2- 192.168.123.107:0/1675875892 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f002c102a80 0x7f002c112890 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:32:15.521 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.522+0000 7f0031419640 1 --2- 192.168.123.107:0/1675875892 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f002c103c80 0x7f002c10f2f0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f001c00d8d0 tx=0x7f001c00dda0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:15.521 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.522+0000 7f001affd640 1 -- 192.168.123.107:0/1675875892 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f001c004490 con 0x7f002c103c80 2026-03-09T19:32:15.521 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.522+0000 7f0033ea5640 1 -- 192.168.123.107:0/1675875892 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f002c10fc80 con 0x7f002c103c80 2026-03-09T19:32:15.521 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.522+0000 7f0033ea5640 1 -- 192.168.123.107:0/1675875892 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f002c1ad150 con 0x7f002c103c80 2026-03-09T19:32:15.521 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.522+0000 7f001affd640 1 -- 192.168.123.107:0/1675875892 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f001c007630 con 0x7f002c103c80 2026-03-09T19:32:15.521 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.522+0000 7f001affd640 1 -- 192.168.123.107:0/1675875892 
<== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f001c002da0 con 0x7f002c103c80 2026-03-09T19:32:15.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.524+0000 7f001affd640 1 -- 192.168.123.107:0/1675875892 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f001c00b840 con 0x7f002c103c80 2026-03-09T19:32:15.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.524+0000 7f001affd640 1 --2- 192.168.123.107:0/1675875892 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f00000778e0 0x7f0000079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.523 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.524+0000 7f001affd640 1 -- 192.168.123.107:0/1675875892 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f001c099af0 con 0x7f002c103c80 2026-03-09T19:32:15.524 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.525+0000 7f0031c1a640 1 --2- 192.168.123.107:0/1675875892 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f00000778e0 0x7f0000079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.524 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.525+0000 7f0031c1a640 1 --2- 192.168.123.107:0/1675875892 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f00000778e0 0x7f0000079da0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f0020002410 tx=0x7f002003a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:15.524 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.525+0000 7f0033ea5640 1 -- 192.168.123.107:0/1675875892 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efff4005350 con 0x7f002c103c80 2026-03-09T19:32:15.528 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.529+0000 7f001affd640 1 -- 192.168.123.107:0/1675875892 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f001c0622a0 con 0x7f002c103c80 2026-03-09T19:32:15.638 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.638+0000 7f0033ea5640 1 -- 192.168.123.107:0/1675875892 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7efff4002bf0 con 0x7f00000778e0 2026-03-09T19:32:15.638 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.639+0000 7f001affd640 1 -- 192.168.123.107:0/1675875892 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7efff4002bf0 con 0x7f00000778e0 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [ 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout: "mgr", 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout: "crash", 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout: "mon" 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout: ], 2026-03-09T19:32:15.639 
INFO:teuthology.orchestra.run.vm07.stdout: "progress": "9/23 daemons upgraded", 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:32:15.639 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:32:15.641 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.642+0000 7f0033ea5640 1 -- 192.168.123.107:0/1675875892 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f00000778e0 msgr2=0x7f0000079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.641 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.642+0000 7f0033ea5640 1 --2- 192.168.123.107:0/1675875892 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f00000778e0 0x7f0000079da0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f0020002410 tx=0x7f002003a040 comp rx=0 tx=0).stop 2026-03-09T19:32:15.641 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.642+0000 7f0033ea5640 1 -- 192.168.123.107:0/1675875892 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f002c103c80 msgr2=0x7f002c10f2f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.641 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.642+0000 7f0033ea5640 1 --2- 192.168.123.107:0/1675875892 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f002c103c80 0x7f002c10f2f0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f001c00d8d0 tx=0x7f001c00dda0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.641 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.643+0000 7f0033ea5640 1 -- 192.168.123.107:0/1675875892 shutdown_connections 2026-03-09T19:32:15.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.643+0000 7f0033ea5640 1 --2- 192.168.123.107:0/1675875892 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f00000778e0 0x7f0000079da0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.643+0000 7f0033ea5640 1 --2- 192.168.123.107:0/1675875892 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f002c103c80 0x7f002c10f2f0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.643+0000 7f0033ea5640 1 --2- 192.168.123.107:0/1675875892 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f002c102a80 0x7f002c112890 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.643+0000 7f0033ea5640 1 -- 192.168.123.107:0/1675875892 >> 192.168.123.107:0/1675875892 conn(0x7f002c0fe250 msgr2=0x7f002c104ea0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:15.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.643+0000 7f0033ea5640 1 -- 192.168.123.107:0/1675875892 shutdown_connections 2026-03-09T19:32:15.642 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.643+0000 7f0033ea5640 1 -- 192.168.123.107:0/1675875892 wait complete. 
2026-03-09T19:32:15.703 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.703+0000 7f7678e62640 1 -- 192.168.123.107:0/1129739355 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7674103c80 msgr2=0x7f7674104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.703 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.703+0000 7f7678e62640 1 --2- 192.168.123.107:0/1129739355 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7674103c80 0x7f7674104100 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f76600099b0 tx=0x7f766002f220 comp rx=0 tx=0).stop 2026-03-09T19:32:15.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.705+0000 7f7678e62640 1 -- 192.168.123.107:0/1129739355 shutdown_connections 2026-03-09T19:32:15.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.705+0000 7f7678e62640 1 --2- 192.168.123.107:0/1129739355 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7674103c80 0x7f7674104100 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.705+0000 7f7678e62640 1 --2- 192.168.123.107:0/1129739355 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7674102a80 0x7f7674102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.705+0000 7f7678e62640 1 -- 192.168.123.107:0/1129739355 >> 192.168.123.107:0/1129739355 conn(0x7f76740fe250 msgr2=0x7f7674100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:15.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.708+0000 7f7678e62640 1 -- 192.168.123.107:0/1129739355 shutdown_connections 2026-03-09T19:32:15.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.708+0000 7f7678e62640 1 -- 192.168.123.107:0/1129739355 
wait complete. 2026-03-09T19:32:15.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.710+0000 7f7678e62640 1 Processor -- start 2026-03-09T19:32:15.709 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.710+0000 7f7678e62640 1 -- start start 2026-03-09T19:32:15.710 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.711+0000 7f7678e62640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7674102a80 0x7f767419a460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.711 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.711+0000 7f7678e62640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7674103c80 0x7f767419a9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.711 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.711+0000 7f7678e62640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f767419af70 con 0x7f7674102a80 2026-03-09T19:32:15.711 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.711+0000 7f7678e62640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f767419b0e0 con 0x7f7674103c80 2026-03-09T19:32:15.711 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.711+0000 7f7671d74640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7674103c80 0x7f767419a9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.711 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.712+0000 7f7671d74640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7674103c80 0x7f767419a9a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.107:38506/0 (socket says 192.168.123.107:38506) 2026-03-09T19:32:15.711 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.712+0000 7f7671d74640 1 -- 192.168.123.107:0/360019985 learned_addr learned my addr 192.168.123.107:0/360019985 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:15.711 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.711+0000 7f7672575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7674102a80 0x7f767419a460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.712 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.713+0000 7f7671d74640 1 -- 192.168.123.107:0/360019985 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7674102a80 msgr2=0x7f767419a460 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.712 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.713+0000 7f7671d74640 1 --2- 192.168.123.107:0/360019985 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7674102a80 0x7f767419a460 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.712 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.713+0000 7f7671d74640 1 -- 192.168.123.107:0/360019985 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7660009660 con 0x7f7674103c80 2026-03-09T19:32:15.712 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.713+0000 7f7672575640 1 --2- 192.168.123.107:0/360019985 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7674102a80 0x7f767419a460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-09T19:32:15.713 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.714+0000 7f7671d74640 1 --2- 192.168.123.107:0/360019985 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7674103c80 0x7f767419a9a0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f766002f730 tx=0x7f76600028f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:15.713 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.714+0000 7f765f7fe640 1 -- 192.168.123.107:0/360019985 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f766003d070 con 0x7f7674103c80 2026-03-09T19:32:15.713 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.714+0000 7f765f7fe640 1 -- 192.168.123.107:0/360019985 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f766002fc90 con 0x7f7674103c80 2026-03-09T19:32:15.713 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.714+0000 7f765f7fe640 1 -- 192.168.123.107:0/360019985 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7660041880 con 0x7f7674103c80 2026-03-09T19:32:15.714 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.714+0000 7f7678e62640 1 -- 192.168.123.107:0/360019985 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f767419fb20 con 0x7f7674103c80 2026-03-09T19:32:15.714 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.714+0000 7f7678e62640 1 -- 192.168.123.107:0/360019985 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f767419ffc0 con 0x7f7674103c80 2026-03-09T19:32:15.714 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.715+0000 7f7678e62640 1 -- 192.168.123.107:0/360019985 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7f767410b6b0 con 0x7f7674103c80 2026-03-09T19:32:15.715 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.717+0000 7f765f7fe640 1 -- 192.168.123.107:0/360019985 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f76600419e0 con 0x7f7674103c80 2026-03-09T19:32:15.716 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.717+0000 7f765f7fe640 1 --2- 192.168.123.107:0/360019985 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f76400776d0 0x7f7640079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:15.716 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.717+0000 7f7672575640 1 --2- 192.168.123.107:0/360019985 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f76400776d0 0x7f7640079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:15.716 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.718+0000 7f765f7fe640 1 -- 192.168.123.107:0/360019985 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f76600bea80 con 0x7f7674103c80 2026-03-09T19:32:15.717 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.718+0000 7f7672575640 1 --2- 192.168.123.107:0/360019985 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f76400776d0 0x7f7640079b90 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f7674103ae0 tx=0x7f76680073d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:15.718 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.719+0000 7f765f7fe640 1 -- 192.168.123.107:0/360019985 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7f76600872b0 con 0x7f7674103c80 2026-03-09T19:32:15.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.860+0000 7f7678e62640 1 -- 192.168.123.107:0/360019985 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f767410b8f0 con 0x7f7674103c80 2026-03-09T19:32:15.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.861+0000 7f765f7fe640 1 -- 192.168.123.107:0/360019985 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f7660086a00 con 0x7f7674103c80 2026-03-09T19:32:15.860 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:32:15.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.864+0000 7f7678e62640 1 -- 192.168.123.107:0/360019985 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f76400776d0 msgr2=0x7f7640079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.864+0000 7f7678e62640 1 --2- 192.168.123.107:0/360019985 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f76400776d0 0x7f7640079b90 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f7674103ae0 tx=0x7f76680073d0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.864+0000 7f7678e62640 1 -- 192.168.123.107:0/360019985 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7674103c80 msgr2=0x7f767419a9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:15.864 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.864+0000 7f7678e62640 1 --2- 192.168.123.107:0/360019985 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7674103c80 0x7f767419a9a0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto 
rx=0x7f766002f730 tx=0x7f76600028f0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.864 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.865+0000 7f7678e62640 1 -- 192.168.123.107:0/360019985 shutdown_connections 2026-03-09T19:32:15.864 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.865+0000 7f7678e62640 1 --2- 192.168.123.107:0/360019985 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f76400776d0 0x7f7640079b90 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.864 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.865+0000 7f7678e62640 1 --2- 192.168.123.107:0/360019985 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7674103c80 0x7f767419a9a0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.864 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.865+0000 7f7678e62640 1 --2- 192.168.123.107:0/360019985 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7674102a80 0x7f767419a460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:15.864 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.865+0000 7f7678e62640 1 -- 192.168.123.107:0/360019985 >> 192.168.123.107:0/360019985 conn(0x7f76740fe250 msgr2=0x7f76740ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:15.864 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.865+0000 7f7678e62640 1 -- 192.168.123.107:0/360019985 shutdown_connections 2026-03-09T19:32:15.864 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:15.865+0000 7f7678e62640 1 -- 192.168.123.107:0/360019985 wait complete. 
2026-03-09T19:32:15.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:15 vm07.local ceph-mon[111841]: from='client.34288 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:15.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:15 vm07.local ceph-mon[111841]: from='client.34292 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:15.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:15 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/3009411012' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:15.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:15 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/3566803157' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:32:15.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:15 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/360019985' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:32:16.246 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:15 vm08.local ceph-mon[103420]: from='client.34288 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:16.246 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:15 vm08.local ceph-mon[103420]: from='client.34292 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:16.246 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:15 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/3009411012' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:16.246 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:15 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/3566803157' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:32:16.246 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:15 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/360019985' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:32:17.324 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:17 vm08.local ceph-mon[103420]: from='client.44235 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:17.324 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:17 vm08.local ceph-mon[103420]: pgmap v151: 65 pgs: 65 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:17.324 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:17 vm08.local ceph-mon[103420]: from='client.44245 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:17.324 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:17 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T19:32:17.435 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:17 vm07.local ceph-mon[111841]: from='client.44235 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:17.435 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:17 vm07.local ceph-mon[111841]: pgmap v151: 65 pgs: 65 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:17.435 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:17 vm07.local ceph-mon[111841]: from='client.44245 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:17.435 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:17 vm07.local ceph-mon[111841]: 
from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T19:32:18.166 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:18 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T19:32:18.166 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:18 vm08.local ceph-mon[103420]: Upgrade: osd.3 is safe to restart 2026-03-09T19:32:18.166 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:18 vm08.local ceph-mon[103420]: Upgrade: Updating osd.3 2026-03-09T19:32:18.166 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:18.166 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T19:32:18.166 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:32:18.166 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:18 vm08.local ceph-mon[103420]: Deploying daemon osd.3 on vm08 2026-03-09T19:32:18.166 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:18 vm08.local ceph-mon[103420]: pgmap v152: 65 pgs: 65 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:18.432 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:18 vm08.local systemd[1]: Stopping Ceph osd.3 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:32:18.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:18 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T19:32:18.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:18 vm07.local ceph-mon[111841]: Upgrade: osd.3 is safe to restart 2026-03-09T19:32:18.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:18 vm07.local ceph-mon[111841]: Upgrade: Updating osd.3 2026-03-09T19:32:18.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:18.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T19:32:18.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:32:18.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:18 vm07.local ceph-mon[111841]: Deploying daemon osd.3 on vm08 2026-03-09T19:32:18.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:18 vm07.local ceph-mon[111841]: pgmap v152: 65 pgs: 65 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:18.845 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:18 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[63647]: 2026-03-09T19:32:18.433+0000 7f42ee614640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:32:18.845 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:18 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[63647]: 2026-03-09T19:32:18.433+0000 7f42ee614640 -1 osd.3 83 *** Got signal Terminated *** 
2026-03-09T19:32:18.845 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:18 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[63647]: 2026-03-09T19:32:18.433+0000 7f42ee614640 -1 osd.3 83 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:32:19.429 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:19 vm08.local ceph-mon[103420]: osd.3 marked itself down and dead 2026-03-09T19:32:19.429 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:19.429 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:32:19.429 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:19.429 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local podman[112518]: 2026-03-09 19:32:19.251550576 +0000 UTC m=+0.834761180 container died 740e44caf4fc546325a02baec062909ad8064260c9d2a7f649d6ac1f05f295fb (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, io.buildah.version=1.41.3, 
org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS) 2026-03-09T19:32:19.429 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local podman[112518]: 2026-03-09 19:32:19.268598761 +0000 UTC m=+0.851809365 container remove 740e44caf4fc546325a02baec062909ad8064260c9d2a7f649d6ac1f05f295fb (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef) 2026-03-09T19:32:19.429 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local bash[112518]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3 2026-03-09T19:32:19.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:19 vm07.local ceph-mon[111841]: osd.3 marked itself down and dead 2026-03-09T19:32:19.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:19.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:32:19.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' 2026-03-09T19:32:19.720 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local podman[112586]: 2026-03-09 19:32:19.48527219 +0000 UTC m=+0.028197918 container create ba04d6ce51429ee9a0ca193f404582bcad94f347c4cbe7973b873b2874974dfd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-deactivate, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default) 2026-03-09T19:32:19.720 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local podman[112586]: 2026-03-09 19:32:19.540696185 +0000 UTC m=+0.083621913 container init ba04d6ce51429ee9a0ca193f404582bcad94f347c4cbe7973b873b2874974dfd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:32:19.720 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local podman[112586]: 2026-03-09 19:32:19.544904283 +0000 UTC m=+0.087830011 container start ba04d6ce51429ee9a0ca193f404582bcad94f347c4cbe7973b873b2874974dfd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-deactivate, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default) 2026-03-09T19:32:19.720 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local podman[112586]: 2026-03-09 19:32:19.547820994 +0000 UTC m=+0.090746722 container attach ba04d6ce51429ee9a0ca193f404582bcad94f347c4cbe7973b873b2874974dfd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-deactivate, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:32:19.720 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local podman[112586]: 2026-03-09 19:32:19.473790707 +0000 UTC m=+0.016716435 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:32:19.980 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local podman[112586]: 2026-03-09 19:32:19.721858727 +0000 UTC m=+0.264784455 container died ba04d6ce51429ee9a0ca193f404582bcad94f347c4cbe7973b873b2874974dfd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default) 2026-03-09T19:32:19.980 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local podman[112586]: 2026-03-09 19:32:19.759259884 +0000 UTC m=+0.302185602 container remove ba04d6ce51429ee9a0ca193f404582bcad94f347c4cbe7973b873b2874974dfd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-deactivate, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2) 2026-03-09T19:32:19.980 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.3.service: Deactivated successfully. 2026-03-09T19:32:19.980 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local systemd[1]: Stopped Ceph osd.3 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 2026-03-09T19:32:19.980 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.3.service: Consumed 59.107s CPU time. 2026-03-09T19:32:20.235 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:20 vm08.local ceph-mon[103420]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T19:32:20.235 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:20 vm08.local ceph-mon[103420]: osdmap e84: 6 total, 5 up, 6 in 2026-03-09T19:32:20.235 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:20 vm08.local ceph-mon[103420]: pgmap v154: 65 pgs: 9 peering, 13 stale+active+clean, 43 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:20.235 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:19 vm08.local systemd[1]: Starting Ceph osd.3 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 
2026-03-09T19:32:20.236 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local podman[112698]: 2026-03-09 19:32:20.100278051 +0000 UTC m=+0.019259837 container create 31d5f39d24a7351cf4fd46eda5fea5d5c911392539474b4dfa71d23c1f5e45ab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:32:20.236 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local podman[112698]: 2026-03-09 19:32:20.145632516 +0000 UTC m=+0.064614292 container init 31d5f39d24a7351cf4fd46eda5fea5d5c911392539474b4dfa71d23c1f5e45ab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223) 2026-03-09T19:32:20.236 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local podman[112698]: 2026-03-09 19:32:20.14878663 +0000 UTC m=+0.067768416 container start 31d5f39d24a7351cf4fd46eda5fea5d5c911392539474b4dfa71d23c1f5e45ab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223) 2026-03-09T19:32:20.236 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local podman[112698]: 2026-03-09 19:32:20.150800972 +0000 UTC m=+0.069782758 container attach 31d5f39d24a7351cf4fd46eda5fea5d5c911392539474b4dfa71d23c1f5e45ab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, 
org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:32:20.236 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local podman[112698]: 2026-03-09 19:32:20.092799079 +0000 UTC m=+0.011780876 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:32:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:20 vm07.local ceph-mon[111841]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T19:32:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:20 vm07.local ceph-mon[111841]: osdmap e84: 6 total, 5 up, 6 in 2026-03-09T19:32:20.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:20 vm07.local ceph-mon[111841]: pgmap v154: 65 pgs: 9 peering, 13 stale+active+clean, 43 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:20.595 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:20.595 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local bash[112698]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:20.595 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:20.595 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local bash[112698]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:21.066 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: 
--> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T19:32:21.066 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:21.066 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local bash[112698]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T19:32:21.066 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local bash[112698]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:21.066 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:21.066 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local bash[112698]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:21.066 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T19:32:21.066 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local bash[112698]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T19:32:21.066 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-276f479b-6113-4db2-a4bc-6e32e9d723b4/osd-block-175a772b-2920-452f-9d34-5c2a70bb1cb1 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-09T19:32:21.066 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:20 vm08.local bash[112698]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev 
/dev/ceph-276f479b-6113-4db2-a4bc-6e32e9d723b4/osd-block-175a772b-2920-452f-9d34-5c2a70bb1cb1 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-09T19:32:21.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:21 vm08.local ceph-mon[103420]: Health check failed: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-09T19:32:21.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:21 vm08.local ceph-mon[103420]: osdmap e85: 6 total, 5 up, 6 in 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: Running command: /usr/bin/ln -snf /dev/ceph-276f479b-6113-4db2-a4bc-6e32e9d723b4/osd-block-175a772b-2920-452f-9d34-5c2a70bb1cb1 /var/lib/ceph/osd/ceph-3/block 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local bash[112698]: Running command: /usr/bin/ln -snf /dev/ceph-276f479b-6113-4db2-a4bc-6e32e9d723b4/osd-block-175a772b-2920-452f-9d34-5c2a70bb1cb1 /var/lib/ceph/osd/ceph-3/block 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local bash[112698]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local bash[112698]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: Running 
command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local bash[112698]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate[112711]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local bash[112698]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local podman[112919]: 2026-03-09 19:32:21.121198205 +0000 UTC m=+0.012900311 container died 31d5f39d24a7351cf4fd46eda5fea5d5c911392539474b4dfa71d23c1f5e45ab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:32:21.345 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local podman[112919]: 2026-03-09 19:32:21.139020669 +0000 UTC m=+0.030722785 container remove 31d5f39d24a7351cf4fd46eda5fea5d5c911392539474b4dfa71d23c1f5e45ab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS) 2026-03-09T19:32:21.346 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local podman[112957]: 2026-03-09 19:32:21.241439922 +0000 UTC m=+0.018315349 container create bde783ff786f7d70024e8d83200249af84676e292292afdc9a39c62370406acc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:32:21.346 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local podman[112957]: 2026-03-09 19:32:21.281308246 +0000 UTC m=+0.058183683 container init bde783ff786f7d70024e8d83200249af84676e292292afdc9a39c62370406acc 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223) 2026-03-09T19:32:21.346 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local podman[112957]: 2026-03-09 19:32:21.284882968 +0000 UTC m=+0.061758395 container start bde783ff786f7d70024e8d83200249af84676e292292afdc9a39c62370406acc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_REF=squid, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS) 2026-03-09T19:32:21.346 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local bash[112957]: bde783ff786f7d70024e8d83200249af84676e292292afdc9a39c62370406acc 
2026-03-09T19:32:21.346 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local podman[112957]: 2026-03-09 19:32:21.234450537 +0000 UTC m=+0.011325973 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:32:21.346 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:21 vm08.local systemd[1]: Started Ceph osd.3 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 2026-03-09T19:32:21.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:21 vm07.local ceph-mon[111841]: Health check failed: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-09T19:32:21.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:21 vm07.local ceph-mon[111841]: osdmap e85: 6 total, 5 up, 6 in 2026-03-09T19:32:22.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:22 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:22.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:22 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:22.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:22 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:32:22.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:22 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:22.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:22 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:22.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:22 vm08.local ceph-mon[103420]: pgmap v156: 65 pgs: 4 active+undersized, 27 peering, 1 stale+active+clean, 3 
active+undersized+degraded, 30 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 9/291 objects degraded (3.093%) 2026-03-09T19:32:22.595 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:22 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[112967]: 2026-03-09T19:32:22.144+0000 7f7669755740 -1 Falling back to public interface 2026-03-09T19:32:22.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:22 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:22.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:22 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:22.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:22 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:32:22.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:22 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:22.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:22 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:22.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:22 vm07.local ceph-mon[111841]: pgmap v156: 65 pgs: 4 active+undersized, 27 peering, 1 stale+active+clean, 3 active+undersized+degraded, 30 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 9/291 objects degraded (3.093%) 2026-03-09T19:32:23.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:23 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:23.736 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:23 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' 2026-03-09T19:32:23.736 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:23 vm07.local ceph-mon[111841]: Health check failed: Degraded data redundancy: 9/291 objects degraded (3.093%), 3 pgs degraded (PG_DEGRADED) 2026-03-09T19:32:23.736 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:23 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:23.736 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:23 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:23.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:23 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:23.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:23 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:23.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:23 vm08.local ceph-mon[103420]: Health check failed: Degraded data redundancy: 9/291 objects degraded (3.093%), 3 pgs degraded (PG_DEGRADED) 2026-03-09T19:32:23.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:23 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:23.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:23 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:24.393 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:24 vm07.local ceph-mon[111841]: pgmap v157: 65 pgs: 6 active+undersized, 27 peering, 5 active+undersized+degraded, 27 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 13/291 objects degraded (4.467%) 2026-03-09T19:32:24.393 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:24 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:24.393 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:24.393 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:32:24.393 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:32:24.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:24.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:32:24.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:24.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:24.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:24.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T19:32:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:24 vm08.local ceph-mon[103420]: pgmap v157: 65 pgs: 6 active+undersized, 27 peering, 5 active+undersized+degraded, 27 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 13/291 objects degraded (4.467%) 2026-03-09T19:32:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:32:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:32:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:32:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:24 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:24.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T19:32:25.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:25 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T19:32:25.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:25 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (18 PGs are or would become offline) 2026-03-09T19:32:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:25 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T19:32:25.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:25 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (18 PGs are or would become offline) 2026-03-09T19:32:26.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:26 vm07.local ceph-mon[111841]: pgmap v158: 65 pgs: 19 active+undersized, 19 active+undersized+degraded, 27 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 64/291 objects degraded (21.993%) 2026-03-09T19:32:26.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:26 vm08.local ceph-mon[103420]: pgmap v158: 65 pgs: 19 active+undersized, 19 active+undersized+degraded, 27 active+clean; 218 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 64/291 objects degraded (21.993%) 2026-03-09T19:32:27.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:27 vm08.local ceph-mon[103420]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 12 pgs peering) 2026-03-09T19:32:27.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:27 vm08.local ceph-mon[103420]: from='osd.3 [v2:192.168.123.108:6800/3730448920,v1:192.168.123.108:6801/3730448920]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T19:32:27.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:27 vm08.local ceph-mon[103420]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T19:32:27.595 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:27 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[112967]: 2026-03-09T19:32:27.345+0000 7f7669755740 -1 osd.3 83 log_to_monitors true 2026-03-09T19:32:27.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:27 vm07.local ceph-mon[111841]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 12 pgs peering) 
2026-03-09T19:32:27.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:27 vm07.local ceph-mon[111841]: from='osd.3 [v2:192.168.123.108:6800/3730448920,v1:192.168.123.108:6801/3730448920]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T19:32:27.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:27 vm07.local ceph-mon[111841]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T19:32:28.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:28 vm07.local ceph-mon[111841]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T19:32:28.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:28 vm07.local ceph-mon[111841]: osdmap e86: 6 total, 5 up, 6 in 2026-03-09T19:32:28.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:28 vm07.local ceph-mon[111841]: from='osd.3 [v2:192.168.123.108:6800/3730448920,v1:192.168.123.108:6801/3730448920]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:32:28.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:28 vm07.local ceph-mon[111841]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:32:28.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:28 vm07.local ceph-mon[111841]: pgmap v160: 65 pgs: 19 active+undersized, 19 active+undersized+degraded, 27 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 64/291 objects degraded (21.993%) 2026-03-09T19:32:28.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:28 vm07.local ceph-mon[111841]: from='osd.3 ' entity='osd.3' 2026-03-09T19:32:28.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:28 
vm08.local ceph-mon[103420]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T19:32:28.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:28 vm08.local ceph-mon[103420]: osdmap e86: 6 total, 5 up, 6 in 2026-03-09T19:32:28.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:28 vm08.local ceph-mon[103420]: from='osd.3 [v2:192.168.123.108:6800/3730448920,v1:192.168.123.108:6801/3730448920]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:32:28.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:28 vm08.local ceph-mon[103420]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:32:28.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:28 vm08.local ceph-mon[103420]: pgmap v160: 65 pgs: 19 active+undersized, 19 active+undersized+degraded, 27 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 64/291 objects degraded (21.993%) 2026-03-09T19:32:28.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:28 vm08.local ceph-mon[103420]: from='osd.3 ' entity='osd.3' 2026-03-09T19:32:28.845 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:32:28 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[112967]: 2026-03-09T19:32:28.421+0000 7f7660cee640 -1 osd.3 83 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T19:32:29.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:29 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 64/291 objects degraded (21.993%), 19 pgs degraded (PG_DEGRADED) 2026-03-09T19:32:29.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:29 vm07.local ceph-mon[111841]: Health check cleared: OSD_DOWN (was: 1 osds down) 
2026-03-09T19:32:29.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:29 vm07.local ceph-mon[111841]: osd.3 [v2:192.168.123.108:6800/3730448920,v1:192.168.123.108:6801/3730448920] boot 2026-03-09T19:32:29.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:29 vm07.local ceph-mon[111841]: osdmap e87: 6 total, 6 up, 6 in 2026-03-09T19:32:29.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:29 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:32:29.746 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:29 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 64/291 objects degraded (21.993%), 19 pgs degraded (PG_DEGRADED) 2026-03-09T19:32:29.746 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:29 vm08.local ceph-mon[103420]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T19:32:29.746 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:29 vm08.local ceph-mon[103420]: osd.3 [v2:192.168.123.108:6800/3730448920,v1:192.168.123.108:6801/3730448920] boot 2026-03-09T19:32:29.746 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:29 vm08.local ceph-mon[103420]: osdmap e87: 6 total, 6 up, 6 in 2026-03-09T19:32:29.746 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:29 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T19:32:30.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:30 vm07.local ceph-mon[111841]: pgmap v162: 65 pgs: 6 peering, 15 active+undersized, 17 active+undersized+degraded, 27 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 54/291 objects degraded (18.557%) 2026-03-09T19:32:30.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:30 vm08.local ceph-mon[103420]: pgmap v162: 65 pgs: 6 peering, 15 active+undersized, 17 
active+undersized+degraded, 27 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 54/291 objects degraded (18.557%) 2026-03-09T19:32:31.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:31 vm07.local ceph-mon[111841]: osdmap e88: 6 total, 6 up, 6 in 2026-03-09T19:32:31.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:31 vm08.local ceph-mon[103420]: osdmap e88: 6 total, 6 up, 6 in 2026-03-09T19:32:32.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:32 vm07.local ceph-mon[111841]: pgmap v164: 65 pgs: 15 peering, 7 active+undersized, 12 active+undersized+degraded, 31 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 36/291 objects degraded (12.371%) 2026-03-09T19:32:32.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:32 vm08.local ceph-mon[103420]: pgmap v164: 65 pgs: 15 peering, 7 active+undersized, 12 active+undersized+degraded, 31 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 36/291 objects degraded (12.371%) 2026-03-09T19:32:34.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:34 vm07.local ceph-mon[111841]: pgmap v165: 65 pgs: 15 peering, 50 active+clean; 218 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:34.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:34.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:32:34.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:34 vm07.local ceph-mon[111841]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 36/291 objects degraded (12.371%), 12 pgs degraded) 2026-03-09T19:32:34.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:34 vm07.local ceph-mon[111841]: 
Cluster is now healthy 2026-03-09T19:32:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:34 vm08.local ceph-mon[103420]: pgmap v165: 65 pgs: 15 peering, 50 active+clean; 218 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:34 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:34 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:32:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:34 vm08.local ceph-mon[103420]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 36/291 objects degraded (12.371%), 12 pgs degraded) 2026-03-09T19:32:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:34 vm08.local ceph-mon[103420]: Cluster is now healthy 2026-03-09T19:32:36.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:36 vm07.local ceph-mon[111841]: pgmap v166: 65 pgs: 65 active+clean; 218 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:37.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:36 vm08.local ceph-mon[103420]: pgmap v166: 65 pgs: 65 active+clean; 218 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:38 vm07.local ceph-mon[111841]: pgmap v167: 65 pgs: 65 active+clean; 218 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:39.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:38 vm08.local ceph-mon[103420]: pgmap v167: 65 pgs: 65 active+clean; 218 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:39.635 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T19:32:39.635 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:39.635 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T19:32:39.635 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:32:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T19:32:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T19:32:39.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:32:40.595 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:40 vm08.local systemd[1]: Stopping Ceph osd.4 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 
2026-03-09T19:32:40.595 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:40 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[69823]: 2026-03-09T19:32:40.399+0000 7f8c37d09640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:32:40.595 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:40 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[69823]: 2026-03-09T19:32:40.400+0000 7f8c37d09640 -1 osd.4 88 *** Got signal Terminated *** 2026-03-09T19:32:40.595 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:40 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[69823]: 2026-03-09T19:32:40.400+0000 7f8c37d09640 -1 osd.4 88 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:32:40.941 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:40 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T19:32:40.941 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:40 vm08.local ceph-mon[103420]: Upgrade: osd.4 is safe to restart 2026-03-09T19:32:40.941 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:40 vm08.local ceph-mon[103420]: Upgrade: Updating osd.4 2026-03-09T19:32:40.941 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:40 vm08.local ceph-mon[103420]: Deploying daemon osd.4 on vm08 2026-03-09T19:32:40.941 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:40 vm08.local ceph-mon[103420]: pgmap v168: 65 pgs: 65 active+clean; 218 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:40.941 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:40 vm08.local ceph-mon[103420]: osd.4 marked itself down and dead 2026-03-09T19:32:40.941 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:40 vm08.local podman[118265]: 2026-03-09 19:32:40.722192421 +0000 UTC m=+0.336292772 container died d929d31f8a58964fd155780c974008e46a426998fc9cea39befac848ba1ec665 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) 2026-03-09T19:32:40.941 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:40 vm08.local podman[118265]: 2026-03-09 
19:32:40.75204773 +0000 UTC m=+0.366148081 container remove d929d31f8a58964fd155780c974008e46a426998fc9cea39befac848ba1ec665 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True) 2026-03-09T19:32:40.941 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:40 vm08.local bash[118265]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4 2026-03-09T19:32:40.941 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:40 vm08.local podman[118330]: 2026-03-09 19:32:40.943028101 +0000 UTC m=+0.017158872 container create 66f72d91508e4a4f24cfe649aa7cb6c14de68746a587a9cb2c600c86c2bad65c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, ceph=True) 2026-03-09T19:32:40.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:40 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T19:32:40.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:40 vm07.local ceph-mon[111841]: Upgrade: osd.4 is safe to restart 2026-03-09T19:32:40.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:40 vm07.local ceph-mon[111841]: Upgrade: Updating osd.4 2026-03-09T19:32:40.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:40 vm07.local ceph-mon[111841]: Deploying daemon osd.4 on vm08 2026-03-09T19:32:40.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:40 vm07.local ceph-mon[111841]: pgmap v168: 65 pgs: 65 active+clean; 218 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:40.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:40 vm07.local ceph-mon[111841]: osd.4 marked itself down and dead 2026-03-09T19:32:41.254 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:40 vm08.local podman[118330]: 2026-03-09 19:32:40.986053126 +0000 UTC m=+0.060183907 container init 66f72d91508e4a4f24cfe649aa7cb6c14de68746a587a9cb2c600c86c2bad65c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0) 2026-03-09T19:32:41.254 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:40 vm08.local podman[118330]: 2026-03-09 19:32:40.989606197 +0000 UTC m=+0.063736959 container start 66f72d91508e4a4f24cfe649aa7cb6c14de68746a587a9cb2c600c86c2bad65c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-deactivate, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True) 2026-03-09T19:32:41.254 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:40 vm08.local podman[118330]: 2026-03-09 19:32:40.991808039 +0000 UTC m=+0.065938810 container attach 66f72d91508e4a4f24cfe649aa7cb6c14de68746a587a9cb2c600c86c2bad65c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid) 2026-03-09T19:32:41.254 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local podman[118330]: 2026-03-09 19:32:40.93618981 +0000 UTC m=+0.010320591 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:32:41.254 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local conmon[118341]: conmon 66f72d91508e4a4f24cf : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-66f72d91508e4a4f24cfe649aa7cb6c14de68746a587a9cb2c600c86c2bad65c.scope/container/memory.events 2026-03-09T19:32:41.254 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local podman[118330]: 2026-03-09 19:32:41.13728702 +0000 UTC m=+0.211417791 container died 66f72d91508e4a4f24cfe649aa7cb6c14de68746a587a9cb2c600c86c2bad65c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-deactivate, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.41.3, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 
9 Base Image) 2026-03-09T19:32:41.254 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local podman[118330]: 2026-03-09 19:32:41.156777637 +0000 UTC m=+0.230908408 container remove 66f72d91508e4a4f24cfe649aa7cb6c14de68746a587a9cb2c600c86c2bad65c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS) 2026-03-09T19:32:41.254 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.4.service: Deactivated successfully. 2026-03-09T19:32:41.254 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.4.service: Unit process 118341 (conmon) remains running after unit stopped. 2026-03-09T19:32:41.254 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local systemd[1]: Stopped Ceph osd.4 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 2026-03-09T19:32:41.254 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.4.service: Consumed 52.796s CPU time, 694.2M memory peak. 
2026-03-09T19:32:41.607 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local systemd[1]: Starting Ceph osd.4 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:32:41.607 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local podman[118431]: 2026-03-09 19:32:41.472116562 +0000 UTC m=+0.018502699 container create f1f102db33f7a2696c7b3fcad7033e8612cc5ec58b521f77f73e1e70a84a99ef (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T19:32:41.607 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local podman[118431]: 2026-03-09 19:32:41.519452557 +0000 UTC m=+0.065838703 container init f1f102db33f7a2696c7b3fcad7033e8612cc5ec58b521f77f73e1e70a84a99ef (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph 
Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) 2026-03-09T19:32:41.607 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local podman[118431]: 2026-03-09 19:32:41.52280392 +0000 UTC m=+0.069190057 container start f1f102db33f7a2696c7b3fcad7033e8612cc5ec58b521f77f73e1e70a84a99ef (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.build-date=20260223, io.buildah.version=1.41.3) 2026-03-09T19:32:41.607 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local podman[118431]: 2026-03-09 19:32:41.525816539 +0000 UTC m=+0.072202676 container attach f1f102db33f7a2696c7b3fcad7033e8612cc5ec58b521f77f73e1e70a84a99ef (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T19:32:41.607 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local podman[118431]: 2026-03-09 19:32:41.465519402 +0000 UTC m=+0.011905539 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:32:41.607 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:41.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:41 vm07.local ceph-mon[111841]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T19:32:41.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:41 vm07.local ceph-mon[111841]: osdmap e89: 6 total, 5 up, 6 in 2026-03-09T19:32:42.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:41 vm08.local ceph-mon[103420]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T19:32:42.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:41 vm08.local ceph-mon[103420]: osdmap e89: 6 total, 5 up, 6 in 2026-03-09T19:32:42.096 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local bash[118431]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:42.096 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:42.096 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:41 vm08.local bash[118431]: Running command: 
/usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:42.465 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T19:32:42.465 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local bash[118431]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T19:32:42.465 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local bash[118431]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:42.465 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:42.465 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:42.465 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local bash[118431]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:32:42.465 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T19:32:42.465 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local bash[118431]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T19:32:42.465 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-a41701ec-ed74-40bd-ae64-dcb091b208be/osd-block-a265b553-1e86-4bab-beff-db9f81381120 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-09T19:32:42.465 
INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local bash[118431]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-a41701ec-ed74-40bd-ae64-dcb091b208be/osd-block-a265b553-1e86-4bab-beff-db9f81381120 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-09T19:32:42.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-mon[103420]: pgmap v170: 65 pgs: 8 peering, 4 stale+active+clean, 53 active+clean; 218 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:42.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-mon[103420]: Health check failed: Reduced data availability: 2 pgs peering (PG_AVAILABILITY) 2026-03-09T19:32:42.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-mon[103420]: osdmap e90: 6 total, 5 up, 6 in 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: Running command: /usr/bin/ln -snf /dev/ceph-a41701ec-ed74-40bd-ae64-dcb091b208be/osd-block-a265b553-1e86-4bab-beff-db9f81381120 /var/lib/ceph/osd/ceph-4/block 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local bash[118431]: Running command: /usr/bin/ln -snf /dev/ceph-a41701ec-ed74-40bd-ae64-dcb091b208be/osd-block-a265b553-1e86-4bab-beff-db9f81381120 /var/lib/ceph/osd/ceph-4/block 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local bash[118431]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local 
ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local bash[118431]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local bash[118431]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate[118442]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local bash[118431]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local podman[118431]: 2026-03-09 19:32:42.499422601 +0000 UTC m=+1.045808738 container died f1f102db33f7a2696c7b3fcad7033e8612cc5ec58b521f77f73e1e70a84a99ef (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local podman[118431]: 2026-03-09 19:32:42.526291681 +0000 UTC m=+1.072677818 container remove f1f102db33f7a2696c7b3fcad7033e8612cc5ec58b521f77f73e1e70a84a99ef (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-activate, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0) 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local podman[118698]: 2026-03-09 19:32:42.622389128 +0000 UTC m=+0.019410698 container create 588104e3b774b502dcbbb887e9f5d42504fb76d3bf824b6c834e6e5bcb7360aa (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.41.3) 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local podman[118698]: 2026-03-09 19:32:42.661966909 +0000 UTC m=+0.058988479 container init 588104e3b774b502dcbbb887e9f5d42504fb76d3bf824b6c834e6e5bcb7360aa (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local podman[118698]: 2026-03-09 19:32:42.6652956 +0000 UTC m=+0.062317170 container start 588104e3b774b502dcbbb887e9f5d42504fb76d3bf824b6c834e6e5bcb7360aa (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, 
org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local bash[118698]: 588104e3b774b502dcbbb887e9f5d42504fb76d3bf824b6c834e6e5bcb7360aa 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local podman[118698]: 2026-03-09 19:32:42.614841679 +0000 UTC m=+0.011863258 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:32:42.847 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:42 vm08.local systemd[1]: Started Ceph osd.4 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 
2026-03-09T19:32:42.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:42 vm07.local ceph-mon[111841]: pgmap v170: 65 pgs: 8 peering, 4 stale+active+clean, 53 active+clean; 218 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T19:32:42.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:42 vm07.local ceph-mon[111841]: Health check failed: Reduced data availability: 2 pgs peering (PG_AVAILABILITY) 2026-03-09T19:32:42.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:42 vm07.local ceph-mon[111841]: osdmap e90: 6 total, 5 up, 6 in 2026-03-09T19:32:43.595 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:43 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[118709]: 2026-03-09T19:32:43.506+0000 7fb853225740 -1 Falling back to public interface 2026-03-09T19:32:43.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:43 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:43.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:43 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:43.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:43 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:32:43.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:43 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:44.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:43 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:44.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:43 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:44.095 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:43 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:32:44.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:43 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:44 vm08.local ceph-mon[103420]: pgmap v172: 65 pgs: 8 active+undersized, 8 peering, 2 stale+active+clean, 5 active+undersized+degraded, 42 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 23/291 objects degraded (7.904%) 2026-03-09T19:32:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:44 vm08.local ceph-mon[103420]: Health check failed: Degraded data redundancy: 23/291 objects degraded (7.904%), 5 pgs degraded (PG_DEGRADED) 2026-03-09T19:32:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:44.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:44.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:44 vm07.local ceph-mon[111841]: pgmap v172: 65 pgs: 8 active+undersized, 8 peering, 2 stale+active+clean, 5 active+undersized+degraded, 42 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 23/291 objects degraded (7.904%) 
2026-03-09T19:32:44.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:44 vm07.local ceph-mon[111841]: Health check failed: Degraded data redundancy: 23/291 objects degraded (7.904%), 5 pgs degraded (PG_DEGRADED) 2026-03-09T19:32:44.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:44.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:44.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:44.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:45.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.941+0000 7f5190c91640 1 -- 192.168.123.107:0/1136520848 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f518c06ab10 msgr2=0x7f518c06af70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:45.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.941+0000 7f5190c91640 1 --2- 192.168.123.107:0/1136520848 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f518c06ab10 0x7f518c06af70 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f51800099b0 tx=0x7f518002f240 comp rx=0 tx=0).stop 2026-03-09T19:32:45.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.942+0000 7f5190c91640 1 -- 192.168.123.107:0/1136520848 shutdown_connections 2026-03-09T19:32:45.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.942+0000 7f5190c91640 1 --2- 192.168.123.107:0/1136520848 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f518c06ab10 0x7f518c06af70 unknown :-1 
s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:45.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.942+0000 7f5190c91640 1 --2- 192.168.123.107:0/1136520848 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f518c06a1d0 0x7f518c06a5d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:45.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.942+0000 7f5190c91640 1 -- 192.168.123.107:0/1136520848 >> 192.168.123.107:0/1136520848 conn(0x7f518c06f7b0 msgr2=0x7f518c071bf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:45.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.942+0000 7f5190c91640 1 -- 192.168.123.107:0/1136520848 shutdown_connections 2026-03-09T19:32:45.941 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.942+0000 7f5190c91640 1 -- 192.168.123.107:0/1136520848 wait complete. 2026-03-09T19:32:45.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.943+0000 7f5190c91640 1 Processor -- start 2026-03-09T19:32:45.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.943+0000 7f5190c91640 1 -- start start 2026-03-09T19:32:45.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.943+0000 7f5190c91640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f518c06a1d0 0x7f518c1a7300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:45.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.943+0000 7f5190c91640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f518c06ab10 0x7f518c1a7840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:45.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.943+0000 7f5190c91640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f518c1a7e10 con 0x7f518c06a1d0 2026-03-09T19:32:45.942 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.943+0000 7f5190c91640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f518c1a7f80 con 0x7f518c06ab10 2026-03-09T19:32:45.943 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.944+0000 7f518a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f518c06a1d0 0x7f518c1a7300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:45.943 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.944+0000 7f518a575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f518c06a1d0 0x7f518c1a7300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:45096/0 (socket says 192.168.123.107:45096) 2026-03-09T19:32:45.943 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.944+0000 7f518a575640 1 -- 192.168.123.107:0/1672475434 learned_addr learned my addr 192.168.123.107:0/1672475434 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:45.943 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.944+0000 7f5189d74640 1 --2- 192.168.123.107:0/1672475434 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f518c06ab10 0x7f518c1a7840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:45.943 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.944+0000 7f518a575640 1 -- 192.168.123.107:0/1672475434 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f518c06ab10 msgr2=0x7f518c1a7840 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:45.943 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.944+0000 7f518a575640 1 --2- 192.168.123.107:0/1672475434 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f518c06ab10 0x7f518c1a7840 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:45.943 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.944+0000 7f518a575640 1 -- 192.168.123.107:0/1672475434 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5180009660 con 0x7f518c06a1d0 2026-03-09T19:32:45.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.945+0000 7f518a575640 1 --2- 192.168.123.107:0/1672475434 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f518c06a1d0 0x7f518c1a7300 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f517400d900 tx=0x7f517400ddd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:45.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.945+0000 7f51737fe640 1 -- 192.168.123.107:0/1672475434 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5174004490 con 0x7f518c06a1d0 2026-03-09T19:32:45.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.945+0000 7f51737fe640 1 -- 192.168.123.107:0/1672475434 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f51740076c0 con 0x7f518c06a1d0 2026-03-09T19:32:45.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.945+0000 7f5190c91640 1 -- 192.168.123.107:0/1672475434 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f518c1aca20 con 0x7f518c06a1d0 2026-03-09T19:32:45.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.945+0000 7f51737fe640 1 -- 192.168.123.107:0/1672475434 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 
==== 327+0+0 (secure 0 0 0) 0x7f5174010460 con 0x7f518c06a1d0 2026-03-09T19:32:45.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.945+0000 7f5190c91640 1 -- 192.168.123.107:0/1672475434 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f518c1acf70 con 0x7f518c06a1d0 2026-03-09T19:32:45.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.947+0000 7f51737fe640 1 -- 192.168.123.107:0/1672475434 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f51740105c0 con 0x7f518c06a1d0 2026-03-09T19:32:45.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.947+0000 7f51737fe640 1 --2- 192.168.123.107:0/1672475434 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f51600779b0 0x7f5160079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:45.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.947+0000 7f51737fe640 1 -- 192.168.123.107:0/1672475434 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(90..90 src has 1..90) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f517409abe0 con 0x7f518c06a1d0 2026-03-09T19:32:45.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.947+0000 7f5189d74640 1 --2- 192.168.123.107:0/1672475434 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f51600779b0 0x7f5160079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:45.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.948+0000 7f5189d74640 1 --2- 192.168.123.107:0/1672475434 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f51600779b0 0x7f5160079e70 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f518c1a8820 tx=0x7f518003a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:45.947 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.949+0000 7f5190c91640 1 -- 192.168.123.107:0/1672475434 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5150005350 con 0x7f518c06a1d0 2026-03-09T19:32:45.952 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:45.953+0000 7f51737fe640 1 -- 192.168.123.107:0/1672475434 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5174063410 con 0x7f518c06a1d0 2026-03-09T19:32:46.062 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.063+0000 7f5190c91640 1 -- 192.168.123.107:0/1672475434 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f5150002bf0 con 0x7f51600779b0 2026-03-09T19:32:46.063 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.064+0000 7f51737fe640 1 -- 192.168.123.107:0/1672475434 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f5150002bf0 con 0x7f51600779b0 2026-03-09T19:32:46.065 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.066+0000 7f5190c91640 1 -- 192.168.123.107:0/1672475434 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f51600779b0 msgr2=0x7f5160079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.065 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.066+0000 7f5190c91640 1 --2- 192.168.123.107:0/1672475434 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f51600779b0 0x7f5160079e70 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f518c1a8820 tx=0x7f518003a040 comp rx=0 tx=0).stop 2026-03-09T19:32:46.065 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.066+0000 7f5190c91640 1 -- 192.168.123.107:0/1672475434 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f518c06a1d0 msgr2=0x7f518c1a7300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.065 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.066+0000 7f5190c91640 1 --2- 192.168.123.107:0/1672475434 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f518c06a1d0 0x7f518c1a7300 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f517400d900 tx=0x7f517400ddd0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.067+0000 7f5190c91640 1 -- 192.168.123.107:0/1672475434 shutdown_connections 2026-03-09T19:32:46.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.067+0000 7f5190c91640 1 --2- 192.168.123.107:0/1672475434 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f51600779b0 0x7f5160079e70 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.067+0000 7f5190c91640 1 --2- 192.168.123.107:0/1672475434 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f518c06ab10 0x7f518c1a7840 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.067+0000 7f5190c91640 1 --2- 192.168.123.107:0/1672475434 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f518c06a1d0 0x7f518c1a7300 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.067+0000 7f5190c91640 1 -- 192.168.123.107:0/1672475434 >> 192.168.123.107:0/1672475434 conn(0x7f518c06f7b0 msgr2=0x7f518c108a30 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T19:32:46.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.067+0000 7f5190c91640 1 -- 192.168.123.107:0/1672475434 shutdown_connections 2026-03-09T19:32:46.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.067+0000 7f5190c91640 1 -- 192.168.123.107:0/1672475434 wait complete. 2026-03-09T19:32:46.074 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:32:46.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.131+0000 7f00554db640 1 -- 192.168.123.107:0/3748401316 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f005010b920 msgr2=0x7f005010bd20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.131+0000 7f00554db640 1 --2- 192.168.123.107:0/3748401316 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f005010b920 0x7f005010bd20 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f00380099b0 tx=0x7f003802f240 comp rx=0 tx=0).stop 2026-03-09T19:32:46.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.131+0000 7f00554db640 1 -- 192.168.123.107:0/3748401316 shutdown_connections 2026-03-09T19:32:46.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.131+0000 7f00554db640 1 --2- 192.168.123.107:0/3748401316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f005010c6a0 0x7f005010cb00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.131+0000 7f00554db640 1 --2- 192.168.123.107:0/3748401316 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f005010b920 0x7f005010bd20 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.131+0000 7f00554db640 1 -- 192.168.123.107:0/3748401316 >> 
192.168.123.107:0/3748401316 conn(0x7f005006a890 msgr2=0x7f005006acc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:46.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.132+0000 7f00554db640 1 -- 192.168.123.107:0/3748401316 shutdown_connections 2026-03-09T19:32:46.130 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.132+0000 7f00554db640 1 -- 192.168.123.107:0/3748401316 wait complete. 2026-03-09T19:32:46.131 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.132+0000 7f00554db640 1 Processor -- start 2026-03-09T19:32:46.131 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.132+0000 7f00554db640 1 -- start start 2026-03-09T19:32:46.131 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.133+0000 7f00554db640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f005010b920 0x7f00501a2e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.131 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.133+0000 7f00554db640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f005010c6a0 0x7f00501a3360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.131 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.133+0000 7f00554db640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00501a3930 con 0x7f005010b920 2026-03-09T19:32:46.131 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.133+0000 7f00554db640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00501a3aa0 con 0x7f005010c6a0 2026-03-09T19:32:46.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.133+0000 7f004effd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f005010b920 0x7f00501a2e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.133+0000 7f004effd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f005010b920 0x7f00501a2e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:45126/0 (socket says 192.168.123.107:45126) 2026-03-09T19:32:46.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.133+0000 7f004effd640 1 -- 192.168.123.107:0/3872405453 learned_addr learned my addr 192.168.123.107:0/3872405453 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:46.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.133+0000 7f004e7fc640 1 --2- 192.168.123.107:0/3872405453 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f005010c6a0 0x7f00501a3360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.133 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.134+0000 7f004e7fc640 1 -- 192.168.123.107:0/3872405453 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f005010b920 msgr2=0x7f00501a2e20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.133 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.134+0000 7f004e7fc640 1 --2- 192.168.123.107:0/3872405453 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f005010b920 0x7f00501a2e20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.133 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.134+0000 7f004e7fc640 1 -- 192.168.123.107:0/3872405453 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0038009660 con 
0x7f005010c6a0 2026-03-09T19:32:46.133 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.134+0000 7f004e7fc640 1 --2- 192.168.123.107:0/3872405453 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f005010c6a0 0x7f00501a3360 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f004400d6b0 tx=0x7f004400db80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:46.133 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.134+0000 7f002ffff640 1 -- 192.168.123.107:0/3872405453 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0044004280 con 0x7f005010c6a0 2026-03-09T19:32:46.133 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.135+0000 7f00554db640 1 -- 192.168.123.107:0/3872405453 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f00501a8540 con 0x7f005010c6a0 2026-03-09T19:32:46.134 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.135+0000 7f00554db640 1 -- 192.168.123.107:0/3872405453 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f00501a8a90 con 0x7f005010c6a0 2026-03-09T19:32:46.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.135+0000 7f002ffff640 1 -- 192.168.123.107:0/3872405453 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0044004d60 con 0x7f005010c6a0 2026-03-09T19:32:46.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.135+0000 7f002ffff640 1 -- 192.168.123.107:0/3872405453 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0044005020 con 0x7f005010c6a0 2026-03-09T19:32:46.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.136+0000 7f002dffb640 1 -- 192.168.123.107:0/3872405453 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f00501140a0 con 0x7f005010c6a0 2026-03-09T19:32:46.136 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.137+0000 7f002ffff640 1 -- 192.168.123.107:0/3872405453 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0044005250 con 0x7f005010c6a0 2026-03-09T19:32:46.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.138+0000 7f002ffff640 1 --2- 192.168.123.107:0/3872405453 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0028077890 0x7f0028079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.138+0000 7f002ffff640 1 -- 192.168.123.107:0/3872405453 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(90..90 src has 1..90) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f0044099c00 con 0x7f005010c6a0 2026-03-09T19:32:46.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.138+0000 7f004effd640 1 --2- 192.168.123.107:0/3872405453 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0028077890 0x7f0028079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.137 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.139+0000 7f004effd640 1 --2- 192.168.123.107:0/3872405453 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0028077890 0x7f0028079d50 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f0038002c30 tx=0x7f003803a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:46.139 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.140+0000 7f002ffff640 1 -- 192.168.123.107:0/3872405453 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0044062430 con 0x7f005010c6a0 2026-03-09T19:32:46.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.244+0000 7f002dffb640 1 -- 192.168.123.107:0/3872405453 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f00501142e0 con 0x7f0028077890 2026-03-09T19:32:46.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.245+0000 7f002ffff640 1 -- 192.168.123.107:0/3872405453 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f00501142e0 con 0x7f0028077890 2026-03-09T19:32:46.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.248+0000 7f002dffb640 1 -- 192.168.123.107:0/3872405453 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0028077890 msgr2=0x7f0028079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.248+0000 7f002dffb640 1 --2- 192.168.123.107:0/3872405453 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0028077890 0x7f0028079d50 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f0038002c30 tx=0x7f003803a040 comp rx=0 tx=0).stop 2026-03-09T19:32:46.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.248+0000 7f002dffb640 1 -- 192.168.123.107:0/3872405453 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f005010c6a0 msgr2=0x7f00501a3360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.248+0000 7f002dffb640 1 --2- 192.168.123.107:0/3872405453 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f005010c6a0 0x7f00501a3360 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto 
rx=0x7f004400d6b0 tx=0x7f004400db80 comp rx=0 tx=0).stop 2026-03-09T19:32:46.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.248+0000 7f002dffb640 1 -- 192.168.123.107:0/3872405453 shutdown_connections 2026-03-09T19:32:46.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.248+0000 7f002dffb640 1 --2- 192.168.123.107:0/3872405453 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0028077890 0x7f0028079d50 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.248+0000 7f002dffb640 1 --2- 192.168.123.107:0/3872405453 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f005010c6a0 0x7f00501a3360 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.249+0000 7f002dffb640 1 --2- 192.168.123.107:0/3872405453 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f005010b920 0x7f00501a2e20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.249+0000 7f002dffb640 1 -- 192.168.123.107:0/3872405453 >> 192.168.123.107:0/3872405453 conn(0x7f005006a890 msgr2=0x7f005010a6e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:46.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.249+0000 7f002dffb640 1 -- 192.168.123.107:0/3872405453 shutdown_connections 2026-03-09T19:32:46.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.249+0000 7f002dffb640 1 -- 192.168.123.107:0/3872405453 wait complete. 
2026-03-09T19:32:46.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.307+0000 7f8047f0d640 1 -- 192.168.123.107:0/490758542 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8040073b40 msgr2=0x7f8040073fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.307+0000 7f8047f0d640 1 --2- 192.168.123.107:0/490758542 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8040073b40 0x7f8040073fa0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f80300099b0 tx=0x7f803002f240 comp rx=0 tx=0).stop 2026-03-09T19:32:46.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.307+0000 7f8047f0d640 1 -- 192.168.123.107:0/490758542 shutdown_connections 2026-03-09T19:32:46.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.307+0000 7f8047f0d640 1 --2- 192.168.123.107:0/490758542 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8040073b40 0x7f8040073fa0 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.307+0000 7f8047f0d640 1 --2- 192.168.123.107:0/490758542 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80400751a0 0x7f8040073600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.307+0000 7f8047f0d640 1 -- 192.168.123.107:0/490758542 >> 192.168.123.107:0/490758542 conn(0x7f80400fbfb0 msgr2=0x7f80400fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:46.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.308+0000 7f8047f0d640 1 -- 192.168.123.107:0/490758542 shutdown_connections 2026-03-09T19:32:46.307 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.308+0000 7f8047f0d640 1 -- 192.168.123.107:0/490758542 wait 
complete. 2026-03-09T19:32:46.307 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.308+0000 7f8047f0d640 1 Processor -- start 2026-03-09T19:32:46.307 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.308+0000 7f8047f0d640 1 -- start start 2026-03-09T19:32:46.307 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.309+0000 7f8047f0d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8040073b40 0x7f804019e910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.307 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.309+0000 7f8047f0d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80400751a0 0x7f804019ee50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.307 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.309+0000 7f8047f0d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f804019f420 con 0x7f8040073b40 2026-03-09T19:32:46.308 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.309+0000 7f8047f0d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f804019f590 con 0x7f80400751a0 2026-03-09T19:32:46.308 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.309+0000 7f8045c82640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8040073b40 0x7f804019e910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.308 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.309+0000 7f8045c82640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8040073b40 0x7f804019e910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I 
am v2:192.168.123.107:45142/0 (socket says 192.168.123.107:45142) 2026-03-09T19:32:46.308 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.309+0000 7f8045c82640 1 -- 192.168.123.107:0/4000459000 learned_addr learned my addr 192.168.123.107:0/4000459000 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:46.308 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.309+0000 7f8045481640 1 --2- 192.168.123.107:0/4000459000 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80400751a0 0x7f804019ee50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.308 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.309+0000 7f8045481640 1 -- 192.168.123.107:0/4000459000 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8040073b40 msgr2=0x7f804019e910 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.308 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.309+0000 7f8045481640 1 --2- 192.168.123.107:0/4000459000 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8040073b40 0x7f804019e910 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.308 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.309+0000 7f8045481640 1 -- 192.168.123.107:0/4000459000 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8030009660 con 0x7f80400751a0 2026-03-09T19:32:46.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.310+0000 7f8045481640 1 --2- 192.168.123.107:0/4000459000 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80400751a0 0x7f804019ee50 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f803002f750 tx=0x7f8030004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:32:46.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.310+0000 7f802effd640 1 -- 192.168.123.107:0/4000459000 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f803003d070 con 0x7f80400751a0 2026-03-09T19:32:46.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.310+0000 7f8047f0d640 1 -- 192.168.123.107:0/4000459000 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f80401a3fd0 con 0x7f80400751a0 2026-03-09T19:32:46.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.310+0000 7f8047f0d640 1 -- 192.168.123.107:0/4000459000 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f80401a4490 con 0x7f80400751a0 2026-03-09T19:32:46.310 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.311+0000 7f802effd640 1 -- 192.168.123.107:0/4000459000 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f80300043c0 con 0x7f80400751a0 2026-03-09T19:32:46.310 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.311+0000 7f802effd640 1 -- 192.168.123.107:0/4000459000 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80300416d0 con 0x7f80400751a0 2026-03-09T19:32:46.310 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.312+0000 7f802cff9640 1 -- 192.168.123.107:0/4000459000 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f804010fb90 con 0x7f80400751a0 2026-03-09T19:32:46.312 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.313+0000 7f802effd640 1 -- 192.168.123.107:0/4000459000 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8030038730 con 0x7f80400751a0 2026-03-09T19:32:46.312 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.313+0000 
7f802effd640 1 --2- 192.168.123.107:0/4000459000 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8018077890 0x7f8018079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.312 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.313+0000 7f802effd640 1 -- 192.168.123.107:0/4000459000 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(90..90 src has 1..90) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f80300be960 con 0x7f80400751a0 2026-03-09T19:32:46.312 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.313+0000 7f8045c82640 1 --2- 192.168.123.107:0/4000459000 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8018077890 0x7f8018079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.313 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.314+0000 7f8045c82640 1 --2- 192.168.123.107:0/4000459000 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8018077890 0x7f8018079d50 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f8034005fd0 tx=0x7f80340074e0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:46.314 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.315+0000 7f802effd640 1 -- 192.168.123.107:0/4000459000 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8030087190 con 0x7f80400751a0 2026-03-09T19:32:46.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.413+0000 7f802cff9640 1 -- 192.168.123.107:0/4000459000 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f804010fd20 con 0x7f8018077890 
2026-03-09T19:32:46.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.419+0000 7f802effd640 1 -- 192.168.123.107:0/4000459000 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f804010fd20 con 0x7f8018077890 2026-03-09T19:32:46.418 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:32:46.418 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (9m) 47s ago 10m 23.2M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:32:46.418 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (10m) 47s ago 10m 9773k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (9m) 2s ago 9m 10.4M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (3m) 47s ago 10m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (3m) 2s ago 9m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (9m) 47s ago 10m 89.5M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (8m) 47s ago 8m 17.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (8m) 47s ago 8m 19.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (8m) 2s ago 8m 29.0M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:32:46.419 
INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (8m) 2s ago 8m 178M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (4m) 47s ago 10m 602M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (4m) 2s ago 9m 493M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (4m) 47s ago 11m 62.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (4m) 2s ago 9m 55.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (10m) 47s ago 10m 15.1M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (9m) 2s ago 9m 16.5M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (3m) 47s ago 9m 226M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (72s) 47s ago 9m 108M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 36b65f1069e5 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (49s) 47s ago 9m 13.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 1b8bc1f96eb7 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (25s) 2s ago 9m 180M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bde783ff786f 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (3s) 2s ago 8m 13.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 588104e3b774 2026-03-09T19:32:46.419 
INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (8m) 2s ago 8m 423M 4096M 18.2.7-1055-gab47f43c b6fe7eb6a9d0 3b5f87af08af 2026-03-09T19:32:46.419 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (4m) 47s ago 10m 59.1M - 2.43.0 a07b618ecd1d c09450c20f5f 2026-03-09T19:32:46.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.422+0000 7f802cff9640 1 -- 192.168.123.107:0/4000459000 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8018077890 msgr2=0x7f8018079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.422+0000 7f802cff9640 1 --2- 192.168.123.107:0/4000459000 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8018077890 0x7f8018079d50 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f8034005fd0 tx=0x7f80340074e0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.422+0000 7f802cff9640 1 -- 192.168.123.107:0/4000459000 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80400751a0 msgr2=0x7f804019ee50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.422+0000 7f802cff9640 1 --2- 192.168.123.107:0/4000459000 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80400751a0 0x7f804019ee50 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f803002f750 tx=0x7f8030004290 comp rx=0 tx=0).stop 2026-03-09T19:32:46.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.422+0000 7f802cff9640 1 -- 192.168.123.107:0/4000459000 shutdown_connections 2026-03-09T19:32:46.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.422+0000 7f802cff9640 1 --2- 192.168.123.107:0/4000459000 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] 
conn(0x7f8018077890 0x7f8018079d50 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.423+0000 7f802cff9640 1 --2- 192.168.123.107:0/4000459000 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80400751a0 0x7f804019ee50 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.423+0000 7f802cff9640 1 --2- 192.168.123.107:0/4000459000 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8040073b40 0x7f804019e910 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.423+0000 7f802cff9640 1 -- 192.168.123.107:0/4000459000 >> 192.168.123.107:0/4000459000 conn(0x7f80400fbfb0 msgr2=0x7f80400fd790 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:46.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.423+0000 7f802cff9640 1 -- 192.168.123.107:0/4000459000 shutdown_connections 2026-03-09T19:32:46.422 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.423+0000 7f802cff9640 1 -- 192.168.123.107:0/4000459000 wait complete. 
2026-03-09T19:32:46.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.483+0000 7f950675b640 1 -- 192.168.123.107:0/2183477822 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9500103ab0 msgr2=0x7f9500103f30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.483+0000 7f950675b640 1 --2- 192.168.123.107:0/2183477822 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9500103ab0 0x7f9500103f30 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f94f00099b0 tx=0x7f94f002f220 comp rx=0 tx=0).stop 2026-03-09T19:32:46.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.484+0000 7f950675b640 1 -- 192.168.123.107:0/2183477822 shutdown_connections 2026-03-09T19:32:46.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.484+0000 7f950675b640 1 --2- 192.168.123.107:0/2183477822 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9500103ab0 0x7f9500103f30 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.484+0000 7f950675b640 1 --2- 192.168.123.107:0/2183477822 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95001028b0 0x7f9500102cb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.484+0000 7f950675b640 1 -- 192.168.123.107:0/2183477822 >> 192.168.123.107:0/2183477822 conn(0x7f95000fe060 msgr2=0x7f9500100480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:46.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.484+0000 7f950675b640 1 -- 192.168.123.107:0/2183477822 shutdown_connections 2026-03-09T19:32:46.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.484+0000 7f950675b640 1 -- 192.168.123.107:0/2183477822 
wait complete. 2026-03-09T19:32:46.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.484+0000 7f950675b640 1 Processor -- start 2026-03-09T19:32:46.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.485+0000 7f950675b640 1 -- start start 2026-03-09T19:32:46.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.485+0000 7f950675b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95001028b0 0x7f950019a2e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.485+0000 7f950675b640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9500103ab0 0x7f950019a820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.485+0000 7f950675b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f950019ad60 con 0x7f95001028b0 2026-03-09T19:32:46.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.485+0000 7f950675b640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f950019aed0 con 0x7f9500103ab0 2026-03-09T19:32:46.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.485+0000 7f9504f58640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9500103ab0 0x7f950019a820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.485+0000 7f9504f58640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9500103ab0 0x7f950019a820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.107:56202/0 (socket says 192.168.123.107:56202) 2026-03-09T19:32:46.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.485+0000 7f9504f58640 1 -- 192.168.123.107:0/3477946203 learned_addr learned my addr 192.168.123.107:0/3477946203 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:46.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.485+0000 7f9505759640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95001028b0 0x7f950019a2e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.485+0000 7f9504f58640 1 -- 192.168.123.107:0/3477946203 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95001028b0 msgr2=0x7f950019a2e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.485+0000 7f9504f58640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95001028b0 0x7f950019a2e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.486+0000 7f9504f58640 1 -- 192.168.123.107:0/3477946203 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f94f0009660 con 0x7f9500103ab0 2026-03-09T19:32:46.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.486+0000 7f9505759640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95001028b0 0x7f950019a2e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:32:46.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.486+0000 7f9504f58640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9500103ab0 0x7f950019a820 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f94f0002410 tx=0x7f94f0002980 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:46.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.486+0000 7f94ee7fc640 1 -- 192.168.123.107:0/3477946203 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f94f003d070 con 0x7f9500103ab0 2026-03-09T19:32:46.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.486+0000 7f950675b640 1 -- 192.168.123.107:0/3477946203 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f950019f950 con 0x7f9500103ab0 2026-03-09T19:32:46.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.486+0000 7f950675b640 1 -- 192.168.123.107:0/3477946203 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f950019fdf0 con 0x7f9500103ab0 2026-03-09T19:32:46.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.487+0000 7f94ee7fc640 1 -- 192.168.123.107:0/3477946203 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f94f002fd50 con 0x7f9500103ab0 2026-03-09T19:32:46.486 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.487+0000 7f94ee7fc640 1 -- 192.168.123.107:0/3477946203 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f94f0041a00 con 0x7f9500103ab0 2026-03-09T19:32:46.487 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.488+0000 7f94ee7fc640 1 -- 192.168.123.107:0/3477946203 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f94f004b430 con 
0x7f9500103ab0 2026-03-09T19:32:46.487 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.488+0000 7f950675b640 1 -- 192.168.123.107:0/3477946203 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f94c8005350 con 0x7f9500103ab0 2026-03-09T19:32:46.487 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.488+0000 7f94ee7fc640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f94dc077680 0x7f94dc079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.487 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.489+0000 7f9505759640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f94dc077680 0x7f94dc079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.489+0000 7f94ee7fc640 1 -- 192.168.123.107:0/3477946203 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(90..90 src has 1..90) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f94f00be7c0 con 0x7f9500103ab0 2026-03-09T19:32:46.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.489+0000 7f9505759640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f94dc077680 0x7f94dc079b40 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f9500103910 tx=0x7f94f4005eb0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:46.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.491+0000 7f94ee7fc640 1 -- 192.168.123.107:0/3477946203 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f94f0086e40 con 0x7f9500103ab0 2026-03-09T19:32:46.634 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.635+0000 7f950675b640 1 -- 192.168.123.107:0/3477946203 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f94c8005e10 con 0x7f9500103ab0 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.636+0000 7f94ee7fc640 1 -- 192.168.123.107:0/3477946203 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+844 (secure 0 0 0) 0x7f94f0086590 con 0x7f9500103ab0 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 1, 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c 
(ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 5, 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 8 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:32:46.635 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:32:46.637 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.638+0000 7f950675b640 1 -- 192.168.123.107:0/3477946203 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f94dc077680 msgr2=0x7f94dc079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.637 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.638+0000 7f950675b640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f94dc077680 0x7f94dc079b40 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f9500103910 tx=0x7f94f4005eb0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.637 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.638+0000 7f950675b640 1 -- 192.168.123.107:0/3477946203 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9500103ab0 msgr2=0x7f950019a820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.637 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.638+0000 7f950675b640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9500103ab0 0x7f950019a820 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f94f0002410 tx=0x7f94f0002980 comp rx=0 tx=0).stop 
2026-03-09T19:32:46.638 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.639+0000 7f950675b640 1 -- 192.168.123.107:0/3477946203 shutdown_connections 2026-03-09T19:32:46.638 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.639+0000 7f950675b640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f94dc077680 0x7f94dc079b40 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.638 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.639+0000 7f950675b640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9500103ab0 0x7f950019a820 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.638 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.639+0000 7f950675b640 1 --2- 192.168.123.107:0/3477946203 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95001028b0 0x7f950019a2e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.638 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.639+0000 7f950675b640 1 -- 192.168.123.107:0/3477946203 >> 192.168.123.107:0/3477946203 conn(0x7f95000fe060 msgr2=0x7f95000ffbb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:46.638 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.639+0000 7f950675b640 1 -- 192.168.123.107:0/3477946203 shutdown_connections 2026-03-09T19:32:46.638 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.639+0000 7f950675b640 1 -- 192.168.123.107:0/3477946203 wait complete. 
2026-03-09T19:32:46.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.697+0000 7f71501dd640 1 -- 192.168.123.107:0/4260391256 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f714810c6a0 msgr2=0x7f714810cb00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.697+0000 7f71501dd640 1 --2- 192.168.123.107:0/4260391256 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f714810c6a0 0x7f714810cb00 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f71380099b0 tx=0x7f713802f240 comp rx=0 tx=0).stop 2026-03-09T19:32:46.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.697+0000 7f71501dd640 1 -- 192.168.123.107:0/4260391256 shutdown_connections 2026-03-09T19:32:46.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.697+0000 7f71501dd640 1 --2- 192.168.123.107:0/4260391256 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f714810c6a0 0x7f714810cb00 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.697+0000 7f71501dd640 1 --2- 192.168.123.107:0/4260391256 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f714810b920 0x7f714810bd20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.697+0000 7f71501dd640 1 -- 192.168.123.107:0/4260391256 >> 192.168.123.107:0/4260391256 conn(0x7f714806a890 msgr2=0x7f714806acc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:46.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.698+0000 7f71501dd640 1 -- 192.168.123.107:0/4260391256 shutdown_connections 2026-03-09T19:32:46.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.698+0000 7f71501dd640 1 -- 192.168.123.107:0/4260391256 
wait complete. 2026-03-09T19:32:46.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.698+0000 7f71501dd640 1 Processor -- start 2026-03-09T19:32:46.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.698+0000 7f71501dd640 1 -- start start 2026-03-09T19:32:46.697 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.699+0000 7f71501dd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f714810b920 0x7f71481a2df0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.699+0000 7f71501dd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f714810c6a0 0x7f71481a3330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.699+0000 7f71501dd640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f71481a3900 con 0x7f714810b920 2026-03-09T19:32:46.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.699+0000 7f71501dd640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f71481a3a70 con 0x7f714810c6a0 2026-03-09T19:32:46.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.699+0000 7f714df52640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f714810b920 0x7f71481a2df0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.699+0000 7f714d751640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f714810c6a0 0x7f71481a3330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T19:32:46.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.699+0000 7f714d751640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f714810c6a0 0x7f71481a3330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:56230/0 (socket says 192.168.123.107:56230) 2026-03-09T19:32:46.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.699+0000 7f714d751640 1 -- 192.168.123.107:0/714473091 learned_addr learned my addr 192.168.123.107:0/714473091 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:46.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.700+0000 7f714d751640 1 -- 192.168.123.107:0/714473091 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f714810b920 msgr2=0x7f71481a2df0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.698 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.700+0000 7f714d751640 1 --2- 192.168.123.107:0/714473091 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f714810b920 0x7f71481a2df0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.700+0000 7f714d751640 1 -- 192.168.123.107:0/714473091 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7138009660 con 0x7f714810c6a0 2026-03-09T19:32:46.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.700+0000 7f714d751640 1 --2- 192.168.123.107:0/714473091 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f714810c6a0 0x7f71481a3330 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f7138009630 tx=0x7f7138004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:46.699 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.700+0000 7f7136ffd640 1 -- 192.168.123.107:0/714473091 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f713803d070 con 0x7f714810c6a0 2026-03-09T19:32:46.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.700+0000 7f71501dd640 1 -- 192.168.123.107:0/714473091 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f71481a84b0 con 0x7f714810c6a0 2026-03-09T19:32:46.699 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.700+0000 7f71501dd640 1 -- 192.168.123.107:0/714473091 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f71481a89a0 con 0x7f714810c6a0 2026-03-09T19:32:46.700 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.700+0000 7f7136ffd640 1 -- 192.168.123.107:0/714473091 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f7138004440 con 0x7f714810c6a0 2026-03-09T19:32:46.700 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.700+0000 7f7136ffd640 1 -- 192.168.123.107:0/714473091 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7138041690 con 0x7f714810c6a0 2026-03-09T19:32:46.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.702+0000 7f7136ffd640 1 -- 192.168.123.107:0/714473091 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f71380417f0 con 0x7f714810c6a0 2026-03-09T19:32:46.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.702+0000 7f71501dd640 1 -- 192.168.123.107:0/714473091 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7110005350 con 0x7f714810c6a0 2026-03-09T19:32:46.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.703+0000 7f7136ffd640 1 --2- 
192.168.123.107:0/714473091 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f71200778e0 0x7f7120079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.703+0000 7f7136ffd640 1 -- 192.168.123.107:0/714473091 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(90..90 src has 1..90) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f71380be4b0 con 0x7f714810c6a0 2026-03-09T19:32:46.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.703+0000 7f714df52640 1 --2- 192.168.123.107:0/714473091 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f71200778e0 0x7f7120079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.703+0000 7f714df52640 1 --2- 192.168.123.107:0/714473091 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f71200778e0 0x7f7120079da0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f713c007c70 tx=0x7f713c0073d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:46.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.706+0000 7f7136ffd640 1 -- 192.168.123.107:0/714473091 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7138087c00 con 0x7f714810c6a0 2026-03-09T19:32:46.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.822+0000 7f71501dd640 1 -- 192.168.123.107:0/714473091 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f7110005e10 con 0x7f714810c6a0 2026-03-09T19:32:46.823 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.824+0000 7f7136ffd640 1 -- 192.168.123.107:0/714473091 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1926 (secure 0 0 0) 0x7f7138087350 con 0x7f714810c6a0 2026-03-09T19:32:46.823 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:46.823 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:46.823 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:32:46.823 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:32:46.823 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:46.824 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: pgmap v173: 65 pgs: 15 active+undersized, 8 peering, 11 active+undersized+degraded, 31 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 39/291 objects degraded (13.402%) 2026-03-09T19:32:46.824 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:32:46.824 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:46.824 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:46.824 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:46.824 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T19:32:46.824 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T19:32:46.824 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:46 vm07.local ceph-mon[111841]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:e13 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:32:46.825 
INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:epoch 13 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:24:32.867256+0000 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:32:46.825 
INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 0 members: 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:32:46.825 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:32:46.825 
INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:32:46.828 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.829+0000 7f71501dd640 1 -- 192.168.123.107:0/714473091 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f71200778e0 msgr2=0x7f7120079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.828 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.829+0000 7f71501dd640 1 --2- 192.168.123.107:0/714473091 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f71200778e0 0x7f7120079da0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f713c007c70 tx=0x7f713c0073d0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.828 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.829+0000 7f71501dd640 1 -- 192.168.123.107:0/714473091 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f714810c6a0 msgr2=0x7f71481a3330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.828 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.829+0000 7f71501dd640 1 --2- 192.168.123.107:0/714473091 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f714810c6a0 0x7f71481a3330 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f7138009630 tx=0x7f7138004290 comp rx=0 tx=0).stop 2026-03-09T19:32:46.828 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.830+0000 7f71501dd640 1 -- 192.168.123.107:0/714473091 shutdown_connections 2026-03-09T19:32:46.829 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.830+0000 7f71501dd640 1 --2- 192.168.123.107:0/714473091 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f71200778e0 0x7f7120079da0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.829 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.830+0000 7f71501dd640 1 --2- 
192.168.123.107:0/714473091 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f714810c6a0 0x7f71481a3330 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.829 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.830+0000 7f71501dd640 1 --2- 192.168.123.107:0/714473091 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f714810b920 0x7f71481a2df0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.829 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.830+0000 7f71501dd640 1 -- 192.168.123.107:0/714473091 >> 192.168.123.107:0/714473091 conn(0x7f714806a890 msgr2=0x7f714810a670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:46.829 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.830+0000 7f71501dd640 1 -- 192.168.123.107:0/714473091 shutdown_connections 2026-03-09T19:32:46.829 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.830+0000 7f71501dd640 1 -- 192.168.123.107:0/714473091 wait complete. 
2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: pgmap v173: 65 pgs: 15 active+undersized, 8 peering, 11 active+undersized+degraded, 31 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 39/291 objects degraded (13.402%) 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T19:32:46.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:46 vm08.local ceph-mon[103420]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-09T19:32:46.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.890+0000 7f6624845640 1 -- 192.168.123.107:0/3665090422 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f661c1007f0 msgr2=0x7f661c100bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.890+0000 7f6624845640 1 --2- 192.168.123.107:0/3665090422 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f661c1007f0 0x7f661c100bf0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f66140099b0 tx=0x7f661402f240 comp rx=0 tx=0).stop 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.890+0000 7f6624845640 1 -- 192.168.123.107:0/3665090422 shutdown_connections 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.890+0000 7f6624845640 1 --2- 192.168.123.107:0/3665090422 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f661c1019f0 0x7f661c101e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.890+0000 7f6624845640 1 --2- 192.168.123.107:0/3665090422 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f661c1007f0 0x7f661c100bf0 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.890+0000 7f6624845640 1 -- 192.168.123.107:0/3665090422 >> 192.168.123.107:0/3665090422 conn(0x7f661c0fbf80 msgr2=0x7f661c0fe3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.891+0000 7f6624845640 1 -- 192.168.123.107:0/3665090422 shutdown_connections 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.891+0000 7f6624845640 1 -- 192.168.123.107:0/3665090422 wait complete. 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.891+0000 7f6624845640 1 Processor -- start 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.891+0000 7f6624845640 1 -- start start 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.891+0000 7f6624845640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f661c1007f0 0x7f661c19a4a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.891+0000 7f6624845640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f661c1019f0 0x7f661c19a9e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.892+0000 7f6624845640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f661c19afb0 con 0x7f661c1019f0 
2026-03-09T19:32:46.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.892+0000 7f6624845640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f661c19b120 con 0x7f661c1007f0 2026-03-09T19:32:46.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.892+0000 7f6621db9640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f661c1019f0 0x7f661c19a9e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.892+0000 7f66225ba640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f661c1007f0 0x7f661c19a4a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.892+0000 7f66225ba640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f661c1007f0 0x7f661c19a4a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:56250/0 (socket says 192.168.123.107:56250) 2026-03-09T19:32:46.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.892+0000 7f66225ba640 1 -- 192.168.123.107:0/2505724746 learned_addr learned my addr 192.168.123.107:0/2505724746 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:46.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.892+0000 7f6621db9640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f661c1019f0 0x7f661c19a9e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:45174/0 (socket says 192.168.123.107:45174) 
2026-03-09T19:32:46.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.892+0000 7f66225ba640 1 -- 192.168.123.107:0/2505724746 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f661c1019f0 msgr2=0x7f661c19a9e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:46.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.892+0000 7f66225ba640 1 --2- 192.168.123.107:0/2505724746 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f661c1019f0 0x7f661c19a9e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:46.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.892+0000 7f66225ba640 1 -- 192.168.123.107:0/2505724746 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6614009660 con 0x7f661c1007f0 2026-03-09T19:32:46.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.893+0000 7f66225ba640 1 --2- 192.168.123.107:0/2505724746 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f661c1007f0 0x7f661c19a4a0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f6614002a90 tx=0x7f6614004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:46.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.893+0000 7f66077fe640 1 -- 192.168.123.107:0/2505724746 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f661403d070 con 0x7f661c1007f0 2026-03-09T19:32:46.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.893+0000 7f6624845640 1 -- 192.168.123.107:0/2505724746 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f661c19faf0 con 0x7f661c1007f0 2026-03-09T19:32:46.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.893+0000 7f6624845640 1 -- 192.168.123.107:0/2505724746 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f661c19ffe0 con 0x7f661c1007f0 2026-03-09T19:32:46.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.894+0000 7f66077fe640 1 -- 192.168.123.107:0/2505724746 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6614004440 con 0x7f661c1007f0 2026-03-09T19:32:46.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.894+0000 7f66077fe640 1 -- 192.168.123.107:0/2505724746 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6614041740 con 0x7f661c1007f0 2026-03-09T19:32:46.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.894+0000 7f66077fe640 1 -- 192.168.123.107:0/2505724746 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f661404b430 con 0x7f661c1007f0 2026-03-09T19:32:46.893 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.894+0000 7f6624845640 1 -- 192.168.123.107:0/2505724746 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f65ec005350 con 0x7f661c1007f0 2026-03-09T19:32:46.894 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.895+0000 7f66077fe640 1 --2- 192.168.123.107:0/2505724746 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65fc0778e0 0x7f65fc079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:46.894 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.895+0000 7f6621db9640 1 --2- 192.168.123.107:0/2505724746 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65fc0778e0 0x7f65fc079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:46.895 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.896+0000 7f66077fe640 1 -- 192.168.123.107:0/2505724746 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(90..90 src has 1..90) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f66140be3d0 con 0x7f661c1007f0 2026-03-09T19:32:46.895 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.896+0000 7f6621db9640 1 --2- 192.168.123.107:0/2505724746 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65fc0778e0 0x7f65fc079da0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f661c19b950 tx=0x7f6608009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:46.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:46.899+0000 7f66077fe640 1 -- 192.168.123.107:0/2505724746 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6614087bc0 con 0x7f661c1007f0 2026-03-09T19:32:47.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.005+0000 7f6624845640 1 -- 192.168.123.107:0/2505724746 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f65ec002bf0 con 0x7f65fc0778e0 2026-03-09T19:32:47.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.006+0000 7f66077fe640 1 -- 192.168.123.107:0/2505724746 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f65ec002bf0 con 0x7f65fc0778e0 2026-03-09T19:32:47.005 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:32:47.005 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:32:47.005 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:32:47.005 
INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:32:47.005 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [ 2026-03-09T19:32:47.005 INFO:teuthology.orchestra.run.vm07.stdout: "mgr", 2026-03-09T19:32:47.006 INFO:teuthology.orchestra.run.vm07.stdout: "crash", 2026-03-09T19:32:47.006 INFO:teuthology.orchestra.run.vm07.stdout: "mon" 2026-03-09T19:32:47.006 INFO:teuthology.orchestra.run.vm07.stdout: ], 2026-03-09T19:32:47.006 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "11/23 daemons upgraded", 2026-03-09T19:32:47.006 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T19:32:47.006 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:32:47.006 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:32:47.008 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.009+0000 7f6624845640 1 -- 192.168.123.107:0/2505724746 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65fc0778e0 msgr2=0x7f65fc079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:47.008 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.009+0000 7f6624845640 1 --2- 192.168.123.107:0/2505724746 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65fc0778e0 0x7f65fc079da0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f661c19b950 tx=0x7f6608009290 comp rx=0 tx=0).stop 2026-03-09T19:32:47.008 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.010+0000 7f6624845640 1 -- 192.168.123.107:0/2505724746 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f661c1007f0 msgr2=0x7f661c19a4a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:47.008 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.010+0000 7f6624845640 1 --2- 192.168.123.107:0/2505724746 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f661c1007f0 0x7f661c19a4a0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f6614002a90 tx=0x7f6614004290 comp rx=0 tx=0).stop 2026-03-09T19:32:47.009 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.010+0000 7f6624845640 1 -- 192.168.123.107:0/2505724746 shutdown_connections 2026-03-09T19:32:47.009 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.010+0000 7f6624845640 1 --2- 192.168.123.107:0/2505724746 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65fc0778e0 0x7f65fc079da0 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:47.009 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.010+0000 7f6624845640 1 --2- 192.168.123.107:0/2505724746 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f661c1019f0 0x7f661c19a9e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:47.009 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.010+0000 7f6624845640 1 --2- 192.168.123.107:0/2505724746 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f661c1007f0 0x7f661c19a4a0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:47.009 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.010+0000 7f6624845640 1 -- 192.168.123.107:0/2505724746 >> 192.168.123.107:0/2505724746 conn(0x7f661c0fbf80 msgr2=0x7f661c0fdb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:47.009 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.010+0000 7f6624845640 1 -- 192.168.123.107:0/2505724746 shutdown_connections 2026-03-09T19:32:47.009 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.010+0000 7f6624845640 1 -- 192.168.123.107:0/2505724746 wait complete. 
2026-03-09T19:32:47.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.071+0000 7fc8eb611640 1 -- 192.168.123.107:0/2654861030 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc8e4102a60 msgr2=0x7fc8e4102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:47.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.071+0000 7fc8eb611640 1 --2- 192.168.123.107:0/2654861030 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc8e4102a60 0x7fc8e4102e60 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fc8d40099b0 tx=0x7fc8d402f220 comp rx=0 tx=0).stop 2026-03-09T19:32:47.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.071+0000 7fc8eb611640 1 -- 192.168.123.107:0/2654861030 shutdown_connections 2026-03-09T19:32:47.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.071+0000 7fc8eb611640 1 --2- 192.168.123.107:0/2654861030 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8e4103c60 0x7fc8e41040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:47.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.071+0000 7fc8eb611640 1 --2- 192.168.123.107:0/2654861030 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc8e4102a60 0x7fc8e4102e60 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:47.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.071+0000 7fc8eb611640 1 -- 192.168.123.107:0/2654861030 >> 192.168.123.107:0/2654861030 conn(0x7fc8e40fe250 msgr2=0x7fc8e4100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:47.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.071+0000 7fc8eb611640 1 -- 192.168.123.107:0/2654861030 shutdown_connections 2026-03-09T19:32:47.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.071+0000 7fc8eb611640 1 -- 192.168.123.107:0/2654861030 
wait complete. 2026-03-09T19:32:47.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.072+0000 7fc8eb611640 1 Processor -- start 2026-03-09T19:32:47.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.072+0000 7fc8eb611640 1 -- start start 2026-03-09T19:32:47.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.072+0000 7fc8eb611640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc8e4102a60 0x7fc8e419a430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:47.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.072+0000 7fc8eb611640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8e4103c60 0x7fc8e419a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:47.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.072+0000 7fc8eb611640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc8e419af40 con 0x7fc8e4102a60 2026-03-09T19:32:47.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.072+0000 7fc8eb611640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc8e419b0b0 con 0x7fc8e4103c60 2026-03-09T19:32:47.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.072+0000 7fc8e9386640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc8e4102a60 0x7fc8e419a430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:47.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.072+0000 7fc8e9386640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc8e4102a60 0x7fc8e419a430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.107:45192/0 (socket says 192.168.123.107:45192) 2026-03-09T19:32:47.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.073+0000 7fc8e9386640 1 -- 192.168.123.107:0/3006766070 learned_addr learned my addr 192.168.123.107:0/3006766070 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:32:47.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.073+0000 7fc8e8b85640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8e4103c60 0x7fc8e419a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:47.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.080+0000 7fc8e9386640 1 -- 192.168.123.107:0/3006766070 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8e4103c60 msgr2=0x7fc8e419a970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:47.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.080+0000 7fc8e9386640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8e4103c60 0x7fc8e419a970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:47.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.080+0000 7fc8e9386640 1 -- 192.168.123.107:0/3006766070 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc8d4009660 con 0x7fc8e4102a60 2026-03-09T19:32:47.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.081+0000 7fc8e8b85640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8e4103c60 0x7fc8e419a970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:32:47.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.081+0000 7fc8e9386640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc8e4102a60 0x7fc8e419a430 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fc8d4002710 tx=0x7fc8d4002740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:47.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.081+0000 7fc8da7fc640 1 -- 192.168.123.107:0/3006766070 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc8d403d070 con 0x7fc8e4102a60 2026-03-09T19:32:47.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.082+0000 7fc8da7fc640 1 -- 192.168.123.107:0/3006766070 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc8d402fc90 con 0x7fc8e4102a60 2026-03-09T19:32:47.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.083+0000 7fc8da7fc640 1 -- 192.168.123.107:0/3006766070 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc8d40418a0 con 0x7fc8e4102a60 2026-03-09T19:32:47.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.084+0000 7fc8eb611640 1 -- 192.168.123.107:0/3006766070 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc8e419faf0 con 0x7fc8e4102a60 2026-03-09T19:32:47.084 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.084+0000 7fc8eb611640 1 -- 192.168.123.107:0/3006766070 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc8e419ff60 con 0x7fc8e4102a60 2026-03-09T19:32:47.084 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.085+0000 7fc8eb611640 1 -- 192.168.123.107:0/3006766070 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fc8b4005350 con 0x7fc8e4102a60 2026-03-09T19:32:47.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.089+0000 7fc8da7fc640 1 -- 192.168.123.107:0/3006766070 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc8d4041a00 con 0x7fc8e4102a60 2026-03-09T19:32:47.092 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.090+0000 7fc8da7fc640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc8bc0776d0 0x7fc8bc079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:32:47.092 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.090+0000 7fc8da7fc640 1 -- 192.168.123.107:0/3006766070 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(90..90 src has 1..90) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fc8d40be6a0 con 0x7fc8e4102a60 2026-03-09T19:32:47.092 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.093+0000 7fc8e8b85640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc8bc0776d0 0x7fc8bc079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:32:47.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.094+0000 7fc8da7fc640 1 -- 192.168.123.107:0/3006766070 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc8d4086e20 con 0x7fc8e4102a60 2026-03-09T19:32:47.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.094+0000 7fc8e8b85640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc8bc0776d0 0x7fc8bc079b90 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fc8e419b950 tx=0x7fc8cc005f50 
comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:32:47.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.228+0000 7fc8eb611640 1 -- 192.168.123.107:0/3006766070 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fc8b4005600 con 0x7fc8e4102a60 2026-03-09T19:32:47.241 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.242+0000 7fc8da7fc640 1 -- 192.168.123.107:0/3006766070 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1138 (secure 0 0 0) 0x7fc8d4086570 con 0x7fc8e4102a60 2026-03-09T19:32:47.241 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_WARN 1 osds down; Reduced data availability: 2 pgs peering; Degraded data redundancy: 39/291 objects degraded (13.402%), 11 pgs degraded 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: osd.4 (root=default,host=vm08) is down 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout:[WRN] PG_AVAILABILITY: Reduced data availability: 2 pgs peering 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.f is stuck peering for 2m, current state peering, last acting [0,5] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 3.3 is stuck peering for 2m, current state peering, last acting [0,3] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 39/291 objects degraded (13.402%), 11 pgs degraded 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.4 is active+undersized+degraded, acting [1,0] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.5 is active+undersized+degraded, acting [3,0] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.6 
is active+undersized+degraded, acting [1,3] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.7 is active+undersized+degraded, acting [3,2] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.9 is active+undersized+degraded, acting [1,0] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.a is active+undersized+degraded, acting [1,3] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.b is active+undersized+degraded, acting [3,5] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.14 is active+undersized+degraded, acting [3,5] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.18 is active+undersized+degraded, acting [5,3] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.1a is active+undersized+degraded, acting [3,5] 2026-03-09T19:32:47.242 INFO:teuthology.orchestra.run.vm07.stdout: pg 2.1c is active+undersized+degraded, acting [5,2] 2026-03-09T19:32:47.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.245+0000 7fc8affff640 1 -- 192.168.123.107:0/3006766070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc8bc0776d0 msgr2=0x7fc8bc079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:47.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.245+0000 7fc8affff640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc8bc0776d0 0x7fc8bc079b90 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fc8e419b950 tx=0x7fc8cc005f50 comp rx=0 tx=0).stop 2026-03-09T19:32:47.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.246+0000 7fc8affff640 1 -- 192.168.123.107:0/3006766070 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc8e4102a60 msgr2=0x7fc8e419a430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:32:47.245 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.246+0000 7fc8affff640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc8e4102a60 0x7fc8e419a430 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fc8d4002710 tx=0x7fc8d4002740 comp rx=0 tx=0).stop 2026-03-09T19:32:47.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.246+0000 7fc8affff640 1 -- 192.168.123.107:0/3006766070 shutdown_connections 2026-03-09T19:32:47.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.246+0000 7fc8affff640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc8bc0776d0 0x7fc8bc079b90 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:47.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.246+0000 7fc8affff640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8e4103c60 0x7fc8e419a970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:47.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.246+0000 7fc8affff640 1 --2- 192.168.123.107:0/3006766070 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc8e4102a60 0x7fc8e419a430 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:32:47.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.247+0000 7fc8affff640 1 -- 192.168.123.107:0/3006766070 >> 192.168.123.107:0/3006766070 conn(0x7fc8e40fe250 msgr2=0x7fc8e40ff9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:32:47.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.249+0000 7fc8affff640 1 -- 192.168.123.107:0/3006766070 shutdown_connections 2026-03-09T19:32:47.248 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:32:47.249+0000 7fc8affff640 1 -- 
192.168.123.107:0/3006766070 wait complete. 2026-03-09T19:32:47.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:47 vm07.local ceph-mon[111841]: from='client.34320 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:47.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:47 vm07.local ceph-mon[111841]: from='client.44263 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:47.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:47 vm07.local ceph-mon[111841]: from='client.44267 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:47.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:47 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/3477946203' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:47.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:47 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/714473091' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:32:47.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:47 vm07.local ceph-mon[111841]: from='client.44277 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:47.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:47 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/3006766070' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:32:47.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:47 vm08.local ceph-mon[103420]: from='client.34320 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:47.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:47 vm08.local ceph-mon[103420]: from='client.44263 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:47.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:47 vm08.local ceph-mon[103420]: from='client.44267 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:47.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:47 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/3477946203' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:32:47.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:47 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/714473091' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:32:47.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:47 vm08.local ceph-mon[103420]: from='client.44277 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:32:47.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:47 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/3006766070' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:32:48.584 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:48 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[118709]: 2026-03-09T19:32:48.319+0000 7fb853225740 -1 osd.4 88 log_to_monitors true 2026-03-09T19:32:48.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:48 vm08.local ceph-mon[103420]: pgmap v174: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 50/291 objects degraded (17.182%) 2026-03-09T19:32:48.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:48 vm08.local ceph-mon[103420]: from='osd.4 [v2:192.168.123.108:6808/2759032272,v1:192.168.123.108:6809/2759032272]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T19:32:48.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:48 vm08.local ceph-mon[103420]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T19:32:48.845 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:32:48 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[118709]: 2026-03-09T19:32:48.607+0000 7fb84a7be640 -1 osd.4 88 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T19:32:48.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:48 vm07.local ceph-mon[111841]: pgmap v174: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 218 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 50/291 objects degraded (17.182%) 2026-03-09T19:32:48.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:48 vm07.local ceph-mon[111841]: from='osd.4 [v2:192.168.123.108:6808/2759032272,v1:192.168.123.108:6809/2759032272]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": 
"hdd", "ids": ["4"]}]: dispatch 2026-03-09T19:32:48.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:48 vm07.local ceph-mon[111841]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T19:32:49.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:49 vm08.local ceph-mon[103420]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 2 pgs peering) 2026-03-09T19:32:49.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:49 vm08.local ceph-mon[103420]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T19:32:49.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:49 vm08.local ceph-mon[103420]: osdmap e91: 6 total, 5 up, 6 in 2026-03-09T19:32:49.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:49 vm08.local ceph-mon[103420]: from='osd.4 [v2:192.168.123.108:6808/2759032272,v1:192.168.123.108:6809/2759032272]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:32:49.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:49 vm08.local ceph-mon[103420]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:32:49.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:49.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:32:49.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:49 vm07.local ceph-mon[111841]: Health check cleared: PG_AVAILABILITY 
(was: Reduced data availability: 2 pgs peering) 2026-03-09T19:32:49.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:49 vm07.local ceph-mon[111841]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T19:32:49.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:49 vm07.local ceph-mon[111841]: osdmap e91: 6 total, 5 up, 6 in 2026-03-09T19:32:49.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:49 vm07.local ceph-mon[111841]: from='osd.4 [v2:192.168.123.108:6808/2759032272,v1:192.168.123.108:6809/2759032272]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:32:49.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:49 vm07.local ceph-mon[111841]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-09T19:32:49.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:32:49.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:32:50.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:50 vm07.local ceph-mon[111841]: pgmap v176: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 50/291 objects degraded (17.182%) 2026-03-09T19:32:50.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:50 vm07.local ceph-mon[111841]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T19:32:50.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:50 vm07.local ceph-mon[111841]: 
osd.4 [v2:192.168.123.108:6808/2759032272,v1:192.168.123.108:6809/2759032272] boot 2026-03-09T19:32:50.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:50 vm07.local ceph-mon[111841]: osdmap e92: 6 total, 6 up, 6 in 2026-03-09T19:32:50.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:32:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:50 vm08.local ceph-mon[103420]: pgmap v176: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 50/291 objects degraded (17.182%) 2026-03-09T19:32:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:50 vm08.local ceph-mon[103420]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T19:32:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:50 vm08.local ceph-mon[103420]: osd.4 [v2:192.168.123.108:6808/2759032272,v1:192.168.123.108:6809/2759032272] boot 2026-03-09T19:32:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:50 vm08.local ceph-mon[103420]: osdmap e92: 6 total, 6 up, 6 in 2026-03-09T19:32:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T19:32:51.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:51 vm07.local ceph-mon[111841]: osdmap e93: 6 total, 6 up, 6 in 2026-03-09T19:32:52.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:51 vm08.local ceph-mon[103420]: osdmap e93: 6 total, 6 up, 6 in 2026-03-09T19:32:52.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:52 vm07.local ceph-mon[111841]: pgmap v179: 65 pgs: 7 peering, 13 active+undersized, 8 active+undersized+degraded, 37 active+clean; 218 MiB data, 1.3 GiB 
used, 119 GiB / 120 GiB avail; 33/291 objects degraded (11.340%) 2026-03-09T19:32:52.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:52 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 33/291 objects degraded (11.340%), 8 pgs degraded (PG_DEGRADED) 2026-03-09T19:32:53.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:52 vm08.local ceph-mon[103420]: pgmap v179: 65 pgs: 7 peering, 13 active+undersized, 8 active+undersized+degraded, 37 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 33/291 objects degraded (11.340%) 2026-03-09T19:32:53.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:52 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 33/291 objects degraded (11.340%), 8 pgs degraded (PG_DEGRADED) 2026-03-09T19:32:54.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:54 vm07.local ceph-mon[111841]: pgmap v180: 65 pgs: 7 peering, 7 active+undersized, 3 active+undersized+degraded, 48 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 10/291 objects degraded (3.436%) 2026-03-09T19:32:55.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:54 vm08.local ceph-mon[103420]: pgmap v180: 65 pgs: 7 peering, 7 active+undersized, 3 active+undersized+degraded, 48 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 10/291 objects degraded (3.436%) 2026-03-09T19:32:56.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:55 vm08.local ceph-mon[103420]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 10/291 objects degraded (3.436%), 3 pgs degraded) 2026-03-09T19:32:56.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:55 vm08.local ceph-mon[103420]: Cluster is now healthy 2026-03-09T19:32:56.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:55 vm07.local ceph-mon[111841]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 10/291 objects degraded (3.436%), 3 pgs degraded) 
2026-03-09T19:32:56.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:55 vm07.local ceph-mon[111841]: Cluster is now healthy 2026-03-09T19:32:57.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:56 vm08.local ceph-mon[103420]: pgmap v181: 65 pgs: 65 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-09T19:32:57.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:56 vm07.local ceph-mon[111841]: pgmap v181: 65 pgs: 65 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-09T19:32:58.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:32:58 vm07.local ceph-mon[111841]: pgmap v182: 65 pgs: 65 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-09T19:32:59.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:32:58 vm08.local ceph-mon[103420]: pgmap v182: 65 pgs: 65 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:01.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:00 vm08.local ceph-mon[103420]: pgmap v183: 65 pgs: 65 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:01.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:00 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T19:33:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:00 vm07.local ceph-mon[111841]: pgmap v183: 65 pgs: 65 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:00 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T19:33:01.973 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:01 vm08.local systemd[1]: Stopping Ceph osd.5 for 
17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:33:01.973 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:01 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[76487]: 2026-03-09T19:33:01.828+0000 7f96e9372640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:33:01.973 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:01 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[76487]: 2026-03-09T19:33:01.828+0000 7f96e9372640 -1 osd.5 93 *** Got signal Terminated *** 2026-03-09T19:33:01.973 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:01 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[76487]: 2026-03-09T19:33:01.828+0000 7f96e9372640 -1 osd.5 93 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:33:02.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:01 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T19:33:02.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:01 vm07.local ceph-mon[111841]: Upgrade: osd.5 is safe to restart 2026-03-09T19:33:02.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:01 vm07.local ceph-mon[111841]: Upgrade: Updating osd.5 2026-03-09T19:33:02.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:02.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T19:33:02.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:02.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:01 vm07.local ceph-mon[111841]: Deploying daemon osd.5 on vm08 2026-03-09T19:33:02.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:01 vm07.local ceph-mon[111841]: osd.5 marked itself down and dead 2026-03-09T19:33:02.247 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124025]: 2026-03-09 19:33:02.04137764 +0000 UTC m=+0.229773072 container died 3b5f87af08af6a9e57e460f93b94fffe91604a425150d855a8692ed2640a7ec5 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, 
FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T19:33:02.247 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124025]: 2026-03-09 19:33:02.06461928 +0000 UTC m=+0.253014712 container remove 3b5f87af08af6a9e57e460f93b94fffe91604a425150d855a8692ed2640a7ec5 (image=quay.ceph.io/ceph-ci/ceph@sha256:5c4977cb71fa562f9eeef885de8392360f108b57ddc9fb513207223fafb4cf40, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=ab47f43c099b2cbae6e21342fe673ce251da54d6, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:33:02.247 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local bash[124025]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5 2026-03-09T19:33:02.247 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124091]: 2026-03-09 19:33:02.221647397 +0000 UTC m=+0.018235119 container create d5b62899dd38e719348623068ca83f10ffe2c7c2b4c1a0b317c88e0f5ec9b8a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:33:02.248 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:01 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T19:33:02.248 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:01 vm08.local ceph-mon[103420]: Upgrade: osd.5 is safe to restart 2026-03-09T19:33:02.248 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:01 vm08.local ceph-mon[103420]: Upgrade: Updating osd.5 2026-03-09T19:33:02.248 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:02.248 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T19:33:02.248 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:02.248 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:01 vm08.local ceph-mon[103420]: Deploying daemon osd.5 on vm08 2026-03-09T19:33:02.248 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:01 vm08.local 
ceph-mon[103420]: osd.5 marked itself down and dead 2026-03-09T19:33:02.524 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124091]: 2026-03-09 19:33:02.265570022 +0000 UTC m=+0.062157755 container init d5b62899dd38e719348623068ca83f10ffe2c7c2b4c1a0b317c88e0f5ec9b8a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T19:33:02.524 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124091]: 2026-03-09 19:33:02.269311387 +0000 UTC m=+0.065899109 container start d5b62899dd38e719348623068ca83f10ffe2c7c2b4c1a0b317c88e0f5ec9b8a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, 
org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T19:33:02.525 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124091]: 2026-03-09 19:33:02.273766338 +0000 UTC m=+0.070354060 container attach d5b62899dd38e719348623068ca83f10ffe2c7c2b4c1a0b317c88e0f5ec9b8a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, ceph=True) 2026-03-09T19:33:02.525 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124091]: 2026-03-09 19:33:02.214059572 +0000 UTC m=+0.010647305 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:33:02.525 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124091]: 2026-03-09 19:33:02.40226193 +0000 UTC m=+0.198849652 container died d5b62899dd38e719348623068ca83f10ffe2c7c2b4c1a0b317c88e0f5ec9b8a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default) 2026-03-09T19:33:02.525 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124091]: 2026-03-09 19:33:02.425058447 +0000 UTC m=+0.221646169 container remove d5b62899dd38e719348623068ca83f10ffe2c7c2b4c1a0b317c88e0f5ec9b8a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T19:33:02.525 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.5.service: Deactivated successfully. 2026-03-09T19:33:02.525 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local systemd[1]: Stopped Ceph osd.5 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 
2026-03-09T19:33:02.525 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.5.service: Consumed 48.330s CPU time. 2026-03-09T19:33:02.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local systemd[1]: Starting Ceph osd.5 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:33:02.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124193]: 2026-03-09 19:33:02.728677591 +0000 UTC m=+0.019134342 container create 0b21ab7e8b6ecc08123cf3e408a02a429780a93c53473f4213d1d7ae8e8ced8d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True) 2026-03-09T19:33:02.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124193]: 2026-03-09 19:33:02.773137092 +0000 UTC m=+0.063593843 container init 0b21ab7e8b6ecc08123cf3e408a02a429780a93c53473f4213d1d7ae8e8ced8d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, 
org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0) 2026-03-09T19:33:02.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124193]: 2026-03-09 19:33:02.776421942 +0000 UTC m=+0.066878693 container start 0b21ab7e8b6ecc08123cf3e408a02a429780a93c53473f4213d1d7ae8e8ced8d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223) 2026-03-09T19:33:02.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124193]: 2026-03-09 19:33:02.785749604 +0000 UTC m=+0.076206355 container attach 0b21ab7e8b6ecc08123cf3e408a02a429780a93c53473f4213d1d7ae8e8ced8d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid) 2026-03-09T19:33:02.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local podman[124193]: 2026-03-09 19:33:02.72092042 +0000 UTC m=+0.011377171 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:33:02.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:33:02.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local bash[124193]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:33:02.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:33:02.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:02 vm08.local bash[124193]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:33:03.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:02 vm07.local ceph-mon[111841]: pgmap v184: 65 pgs: 65 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:03.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:02 vm07.local ceph-mon[111841]: Health check failed: 1 osds down (OSD_DOWN) 
2026-03-09T19:33:03.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:02 vm07.local ceph-mon[111841]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-09T19:33:03.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:02 vm07.local ceph-mon[111841]: osdmap e94: 6 total, 5 up, 6 in 2026-03-09T19:33:03.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:02 vm08.local ceph-mon[103420]: pgmap v184: 65 pgs: 65 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:03.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:02 vm08.local ceph-mon[103420]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T19:33:03.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:02 vm08.local ceph-mon[103420]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-09T19:33:03.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:02 vm08.local ceph-mon[103420]: osdmap e94: 6 total, 5 up, 6 in 2026-03-09T19:33:03.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T19:33:03.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:33:03.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local bash[124193]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T19:33:03.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local bash[124193]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:33:03.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local 
ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:33:03.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local bash[124193]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T19:33:03.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T19:33:03.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local bash[124193]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T19:33:03.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-ad218d84-ec26-4cff-9330-dba6f56e9ac2/osd-block-44d4390e-f9a3-490b-9d44-f60b53e3d568 --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-09T19:33:03.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local bash[124193]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-ad218d84-ec26-4cff-9330-dba6f56e9ac2/osd-block-44d4390e-f9a3-490b-9d44-f60b53e3d568 --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: Running command: /usr/bin/ln -snf /dev/ceph-ad218d84-ec26-4cff-9330-dba6f56e9ac2/osd-block-44d4390e-f9a3-490b-9d44-f60b53e3d568 /var/lib/ceph/osd/ceph-5/block 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local bash[124193]: Running command: /usr/bin/ln -snf /dev/ceph-ad218d84-ec26-4cff-9330-dba6f56e9ac2/osd-block-44d4390e-f9a3-490b-9d44-f60b53e3d568 /var/lib/ceph/osd/ceph-5/block 2026-03-09T19:33:03.946 
INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local bash[124193]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local bash[124193]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local bash[124193]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate[124204]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local bash[124193]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local conmon[124204]: conmon 0b21ab7e8b6ecc08123c : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-0b21ab7e8b6ecc08123cf3e408a02a429780a93c53473f4213d1d7ae8e8ced8d.scope/container/memory.events 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local podman[124193]: 2026-03-09 19:33:03.722991153 +0000 UTC m=+1.013447904 container died 
0b21ab7e8b6ecc08123cf3e408a02a429780a93c53473f4213d1d7ae8e8ced8d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local podman[124193]: 2026-03-09 19:33:03.757638405 +0000 UTC m=+1.048095156 container remove 0b21ab7e8b6ecc08123cf3e408a02a429780a93c53473f4213d1d7ae8e8ced8d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-activate, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS) 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 
vm08.local podman[124459]: 2026-03-09 19:33:03.889140515 +0000 UTC m=+0.022861989 container create 7f9a10e5f49da9f931793b109206913b1f1a5415b83c86bee46c3bf709ea8a55 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid) 2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local podman[124459]: 2026-03-09 19:33:03.925163293 +0000 UTC m=+0.058884777 container init 7f9a10e5f49da9f931793b109206913b1f1a5415b83c86bee46c3bf709ea8a55 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 
2026-03-09T19:33:03.946 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local podman[124459]: 2026-03-09 19:33:03.947982282 +0000 UTC m=+0.081703756 container start 7f9a10e5f49da9f931793b109206913b1f1a5415b83c86bee46c3bf709ea8a55 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_REF=squid) 2026-03-09T19:33:04.301 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local bash[124459]: 7f9a10e5f49da9f931793b109206913b1f1a5415b83c86bee46c3bf709ea8a55 2026-03-09T19:33:04.302 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local podman[124459]: 2026-03-09 19:33:03.876351825 +0000 UTC m=+0.010073309 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:33:04.302 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:03 vm08.local systemd[1]: Started Ceph osd.5 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 
2026-03-09T19:33:04.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-mon[103420]: osdmap e95: 6 total, 5 up, 6 in 2026-03-09T19:33:04.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:04.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:33:04.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:03 vm07.local ceph-mon[111841]: osdmap e95: 6 total, 5 up, 6 in 2026-03-09T19:33:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:33:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:04.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:04 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:33:04.301+0000 7fe5502c3740 -1 Falling back to public interface 2026-03-09T19:33:05.055 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:05 vm08.local ceph-mon[103420]: pgmap v187: 65 pgs: 15 peering, 7 stale+active+clean, 43 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:05.055 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:05 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:05.055 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:05 vm08.local ceph-mon[103420]: Health check failed: Reduced data availability: 6 pgs peering (PG_AVAILABILITY) 2026-03-09T19:33:05.055 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:05 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:05.055 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:05 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:05.055 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:05 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:05.055 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:05 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:05.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:05 vm07.local ceph-mon[111841]: pgmap v187: 65 pgs: 15 peering, 7 stale+active+clean, 43 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:05.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:05 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:05.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:05 vm07.local ceph-mon[111841]: Health check failed: Reduced data availability: 6 pgs peering (PG_AVAILABILITY) 2026-03-09T19:33:05.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:05 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:05.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:33:05 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:05.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:05 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:05.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:05 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:06.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:06 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:06.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:06 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:06.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:06 vm08.local ceph-mon[103420]: pgmap v188: 65 pgs: 5 active+undersized, 15 peering, 2 stale+active+clean, 5 active+undersized+degraded, 38 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 13/291 objects degraded (4.467%) 2026-03-09T19:33:06.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:06 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:06.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:06 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:06.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:06 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:06.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:06 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:06.978 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:06 vm07.local ceph-mon[111841]: pgmap v188: 65 pgs: 5 active+undersized, 15 peering, 2 stale+active+clean, 5 active+undersized+degraded, 38 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 13/291 objects degraded (4.467%)
2026-03-09T19:33:06.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:06 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:06.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:06 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: Health check failed: Degraded data redundancy: 13/291 objects degraded (4.467%), 5 pgs degraded (PG_DEGRADED)
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished
2026-03-09T19:33:07.955 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch
2026-03-09T19:33:07.956 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished
2026-03-09T19:33:07.956 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch
2026-03-09T19:33:07.956 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished
2026-03-09T19:33:07.956 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch
2026-03-09T19:33:07.956 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished
2026-03-09T19:33:07.956 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: Health check failed: Degraded data redundancy: 13/291 objects degraded (4.467%), 5 pgs degraded (PG_DEGRADED)
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished
2026-03-09T19:33:08.068 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch
2026-03-09T19:33:08.345 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:33:08.069+0000 7fe5502c3740 -1 osd.5 93 log_to_monitors true
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: Upgrade: Setting container_image for all osd
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: Upgrade: Setting require_osd_release to 19 squid
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: pgmap v189: 65 pgs: 14 active+undersized, 14 active+undersized+degraded, 37 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 46/291 objects degraded (15.808%)
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: from='osd.5 [v2:192.168.123.108:6816/1168359833,v1:192.168.123.108:6817/1168359833]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid)
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: osdmap e96: 6 total, 5 up, 6 in
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: from='osd.5 [v2:192.168.123.108:6816/1168359833,v1:192.168.123.108:6817/1168359833]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch
2026-03-09T19:33:08.845 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:33:08 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:33:08.431+0000 7fe54785c640 -1 osd.5 93 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-09T19:33:08.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: Upgrade: Setting container_image for all osd
2026-03-09T19:33:08.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: Upgrade: Setting require_osd_release to 19 squid
2026-03-09T19:33:08.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: pgmap v189: 65 pgs: 14 active+undersized, 14 active+undersized+degraded, 37 active+clean; 218 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 46/291 objects degraded (15.808%)
2026-03-09T19:33:08.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: from='osd.5 [v2:192.168.123.108:6816/1168359833,v1:192.168.123.108:6817/1168359833]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-09T19:33:08.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-09T19:33:08.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid)
2026-03-09T19:33:08.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished
2026-03-09T19:33:08.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished
2026-03-09T19:33:08.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: osdmap e96: 6 total, 5 up, 6 in
2026-03-09T19:33:08.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: from='osd.5 [v2:192.168.123.108:6816/1168359833,v1:192.168.123.108:6817/1168359833]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch
2026-03-09T19:33:08.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch
2026-03-09T19:33:08.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:08.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch
2026-03-09T19:33:09.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: Upgrade: Scaling down filesystem cephfs
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 6 pgs peering)
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: stopping daemon mds.cephfs.vm08.jwsqrf
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:stopping} 2 up:standby
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: osd.5 [v2:192.168.123.108:6816/1168359833,v1:192.168.123.108:6817/1168359833] boot
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: osdmap e97: 6 total, 6 up, 6 in
2026-03-09T19:33:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: Upgrade: Scaling down filesystem cephfs
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 6 pgs peering)
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: stopping daemon mds.cephfs.vm08.jwsqrf
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: fsmap cephfs:2 {0=cephfs.vm08.zcaqju=up:active,1=cephfs.vm08.jwsqrf=up:stopping} 2 up:standby
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: osd.5 [v2:192.168.123.108:6816/1168359833,v1:192.168.123.108:6817/1168359833] boot
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: osdmap e97: 6 total, 6 up, 6 in
2026-03-09T19:33:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-09T19:33:10.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:10 vm07.local ceph-mon[111841]: pgmap v192: 65 pgs: 14 active+undersized, 14 active+undersized+degraded, 37 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 46/291 objects degraded (15.808%)
2026-03-09T19:33:10.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:10 vm07.local ceph-mon[111841]: osdmap e98: 6 total, 6 up, 6 in
2026-03-09T19:33:11.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:10 vm08.local ceph-mon[103420]: pgmap v192: 65 pgs: 14 active+undersized, 14 active+undersized+degraded, 37 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 46/291 objects degraded (15.808%)
2026-03-09T19:33:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:10 vm08.local ceph-mon[103420]: osdmap e98: 6 total, 6 up, 6 in
2026-03-09T19:33:11.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:11 vm07.local ceph-mon[111841]: Health check update: Degraded data redundancy: 44/291 objects degraded (15.120%), 13 pgs degraded (PG_DEGRADED)
2026-03-09T19:33:12.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:11 vm08.local ceph-mon[103420]: Health check update: Degraded data redundancy: 44/291 objects degraded (15.120%), 13 pgs degraded (PG_DEGRADED)
2026-03-09T19:33:12.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:12 vm07.local ceph-mon[111841]: pgmap v194: 65 pgs: 2 peering, 12 active+undersized, 13 active+undersized+degraded, 38 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 44/291 objects degraded (15.120%)
2026-03-09T19:33:13.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:12 vm08.local ceph-mon[103420]: pgmap v194: 65 pgs: 2 peering, 12 active+undersized, 13 active+undersized+degraded, 38 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 44/291 objects degraded (15.120%)
2026-03-09T19:33:14.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:14 vm07.local ceph-mon[111841]: pgmap v195: 65 pgs: 2 peering, 1 active+undersized, 2 active+undersized+degraded, 60 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 170 B/s wr, 0 op/s; 5/291 objects degraded (1.718%)
2026-03-09T19:33:15.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:14 vm08.local ceph-mon[103420]: pgmap v195: 65 pgs: 2 peering, 1 active+undersized, 2 active+undersized+degraded, 60 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 170 B/s wr, 0 op/s; 5/291 objects degraded (1.718%)
2026-03-09T19:33:15.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:15 vm07.local ceph-mon[111841]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 5/291 objects degraded (1.718%), 2 pgs degraded)
2026-03-09T19:33:15.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:15 vm07.local ceph-mon[111841]: Cluster is now healthy
2026-03-09T19:33:16.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:15 vm08.local ceph-mon[103420]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 5/291 objects degraded (1.718%), 2 pgs degraded)
2026-03-09T19:33:16.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:15 vm08.local ceph-mon[103420]: Cluster is now healthy
2026-03-09T19:33:16.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:16 vm07.local ceph-mon[111841]: pgmap v196: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 427 B/s wr, 0 op/s
2026-03-09T19:33:16.996 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:16 vm08.local ceph-mon[103420]: pgmap v196: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 427 B/s wr, 0 op/s
2026-03-09T19:33:17.314 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.314+0000 7fd19ea91640 1 -- 192.168.123.107:0/1047342199 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd198076df0 msgr2=0x7fd198077250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:33:17.314 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.314+0000 7fd19ea91640 1 --2- 192.168.123.107:0/1047342199 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd198076df0 0x7fd198077250 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7fd1800099b0 tx=0x7fd18002f260 comp rx=0 tx=0).stop
2026-03-09T19:33:17.314 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.315+0000 7fd19ea91640 1 -- 192.168.123.107:0/1047342199 shutdown_connections
2026-03-09T19:33:17.314 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.315+0000 7fd19ea91640 1 --2- 192.168.123.107:0/1047342199 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd198076df0 0x7fd198077250 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:33:17.314 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.315+0000 7fd19ea91640 1 --2- 192.168.123.107:0/1047342199 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd198075ba0 0x7fd198075fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:33:17.314 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.315+0000 7fd19ea91640 1 -- 192.168.123.107:0/1047342199 >> 192.168.123.107:0/1047342199 conn(0x7fd1980fe250 msgr2=0x7fd198100670 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T19:33:17.314 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.315+0000 7fd19ea91640 1 -- 192.168.123.107:0/1047342199 shutdown_connections
2026-03-09T19:33:17.314 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.316+0000 7fd19ea91640 1 -- 192.168.123.107:0/1047342199 wait complete.
2026-03-09T19:33:17.315 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.316+0000 7fd19ea91640 1 Processor -- start
2026-03-09T19:33:17.315 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.316+0000 7fd19ea91640 1 -- start start
2026-03-09T19:33:17.315 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.316+0000 7fd19ea91640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd198075ba0 0x7fd19819e920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:33:17.315 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.316+0000 7fd19ea91640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd198076df0 0x7fd19819ee60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:33:17.316 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.317+0000 7fd19ea91640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd19819f430 con 0x7fd198076df0
2026-03-09T19:33:17.316 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.317+0000 7fd19ea91640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd19819f5a0 con 0x7fd198075ba0
2026-03-09T19:33:17.316 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.317+0000 7fd19c806640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd198075ba0 0x7fd19819e920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:33:17.316 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.317+0000 7fd19c806640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd198075ba0 0x7fd19819e920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:51654/0 (socket says 192.168.123.107:51654)
2026-03-09T19:33:17.316 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.317+0000 7fd19c806640 1 -- 192.168.123.107:0/2555973006 learned_addr learned my addr 192.168.123.107:0/2555973006 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-09T19:33:17.316 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.317+0000 7fd18ffff640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd198076df0 0x7fd19819ee60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:33:17.316 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.317+0000 7fd19c806640 1 -- 192.168.123.107:0/2555973006 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd198076df0 msgr2=0x7fd19819ee60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T19:33:17.316 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.317+0000 7fd19c806640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd198076df0 0x7fd19819ee60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T19:33:17.316 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.317+0000 7fd19c806640 1 -- 192.168.123.107:0/2555973006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd180009660 con 0x7fd198075ba0
2026-03-09T19:33:17.316 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.317+0000 7fd18ffff640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd198076df0 0x7fd19819ee60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-09T19:33:17.316 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.318+0000 7fd19c806640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd198075ba0 0x7fd19819e920 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fd18800d8d0 tx=0x7fd18800dda0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:33:17.317 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.318+0000 7fd18dffb640 1 -- 192.168.123.107:0/2555973006 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd188004490 con 0x7fd198075ba0
2026-03-09T19:33:17.317 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.318+0000 7fd19ea91640 1 -- 192.168.123.107:0/2555973006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd1981a3fe0 con 0x7fd198075ba0
2026-03-09T19:33:17.318 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.318+0000 7fd19ea91640 1 -- 192.168.123.107:0/2555973006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd1981a4500 con 0x7fd198075ba0
2026-03-09T19:33:17.318 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.318+0000 7fd18dffb640 1 -- 192.168.123.107:0/2555973006 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fd1880076c0 con 0x7fd198075ba0
2026-03-09T19:33:17.318 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.318+0000 7fd18dffb640 1 -- 192.168.123.107:0/2555973006 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd188010460 con 0x7fd198075ba0
2026-03-09T19:33:17.318 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.319+0000 7fd19ea91640 1 -- 192.168.123.107:0/2555973006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd164005350 con 0x7fd198075ba0
2026-03-09T19:33:17.319 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.320+0000 7fd18dffb640 1 -- 192.168.123.107:0/2555973006 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd1880026e0 con 0x7fd198075ba0
2026-03-09T19:33:17.319 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.320+0000 7fd18dffb640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd174077890 0x7fd174079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T19:33:17.319 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.320+0000 7fd18dffb640 1 -- 192.168.123.107:0/2555973006 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(98..98 src has 1..98) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fd188099df0 con 0x7fd198075ba0
2026-03-09T19:33:17.320 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.321+0000 7fd18ffff640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd174077890 0x7fd174079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T19:33:17.320 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.321+0000 7fd18ffff640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd174077890 0x7fd174079d50 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fd19819fe40 tx=0x7fd18003a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T19:33:17.322 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.323+0000 7fd18dffb640 1 -- 192.168.123.107:0/2555973006 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd188062620 con 0x7fd198075ba0 2026-03-09T19:33:17.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.429+0000 7fd19ea91640 1 -- 192.168.123.107:0/2555973006 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd164002bf0 con 0x7fd174077890 2026-03-09T19:33:17.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.431+0000 7fd18dffb640 1 -- 192.168.123.107:0/2555973006 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7fd164002bf0 con 0x7fd174077890 2026-03-09T19:33:17.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.433+0000 7fd19ea91640 1 -- 192.168.123.107:0/2555973006 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd174077890 msgr2=0x7fd174079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:17.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.433+0000 7fd19ea91640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd174077890 0x7fd174079d50 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fd19819fe40 tx=0x7fd18003a040 comp rx=0 tx=0).stop 2026-03-09T19:33:17.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.433+0000 7fd19ea91640 1 -- 192.168.123.107:0/2555973006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd198075ba0 msgr2=0x7fd19819e920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:17.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.433+0000 7fd19ea91640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd198075ba0 0x7fd19819e920 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto 
rx=0x7fd18800d8d0 tx=0x7fd18800dda0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.433+0000 7fd19ea91640 1 -- 192.168.123.107:0/2555973006 shutdown_connections 2026-03-09T19:33:17.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.433+0000 7fd19ea91640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fd174077890 0x7fd174079d50 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.433+0000 7fd19ea91640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd198076df0 0x7fd19819ee60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.433+0000 7fd19ea91640 1 --2- 192.168.123.107:0/2555973006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd198075ba0 0x7fd19819e920 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.433+0000 7fd19ea91640 1 -- 192.168.123.107:0/2555973006 >> 192.168.123.107:0/2555973006 conn(0x7fd1980fe250 msgr2=0x7fd1980ffda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:17.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.434+0000 7fd19ea91640 1 -- 192.168.123.107:0/2555973006 shutdown_connections 2026-03-09T19:33:17.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.434+0000 7fd19ea91640 1 -- 192.168.123.107:0/2555973006 wait complete. 
2026-03-09T19:33:17.442 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:33:17.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.495+0000 7f3ca79ef640 1 -- 192.168.123.107:0/4225720314 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3ca0102a60 msgr2=0x7f3ca0102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:17.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.495+0000 7f3ca79ef640 1 --2- 192.168.123.107:0/4225720314 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3ca0102a60 0x7f3ca0102e60 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f3c880099b0 tx=0x7f3c8802f260 comp rx=0 tx=0).stop 2026-03-09T19:33:17.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.495+0000 7f3ca79ef640 1 -- 192.168.123.107:0/4225720314 shutdown_connections 2026-03-09T19:33:17.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.495+0000 7f3ca79ef640 1 --2- 192.168.123.107:0/4225720314 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ca0103c60 0x7f3ca01040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.495+0000 7f3ca79ef640 1 --2- 192.168.123.107:0/4225720314 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3ca0102a60 0x7f3ca0102e60 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.495+0000 7f3ca79ef640 1 -- 192.168.123.107:0/4225720314 >> 192.168.123.107:0/4225720314 conn(0x7f3ca00fe250 msgr2=0x7f3ca0100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:17.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.496+0000 7f3ca79ef640 1 -- 192.168.123.107:0/4225720314 shutdown_connections 2026-03-09T19:33:17.495 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.496+0000 7f3ca79ef640 1 -- 192.168.123.107:0/4225720314 wait complete. 2026-03-09T19:33:17.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.496+0000 7f3ca79ef640 1 Processor -- start 2026-03-09T19:33:17.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.496+0000 7f3ca79ef640 1 -- start start 2026-03-09T19:33:17.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.497+0000 7f3ca79ef640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ca0102a60 0x7f3ca019a4e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:17.495 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.497+0000 7f3ca79ef640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3ca0103c60 0x7f3ca019aa20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:17.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.497+0000 7f3ca4f63640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3ca0103c60 0x7f3ca019aa20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:17.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.497+0000 7f3ca4f63640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3ca0103c60 0x7f3ca019aa20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:60748/0 (socket says 192.168.123.107:60748) 2026-03-09T19:33:17.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.497+0000 7f3ca4f63640 1 -- 192.168.123.107:0/116841223 learned_addr learned my addr 192.168.123.107:0/116841223 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:17.496 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.497+0000 7f3ca79ef640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3ca019aff0 con 0x7f3ca0103c60 2026-03-09T19:33:17.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.497+0000 7f3ca79ef640 1 -- 192.168.123.107:0/116841223 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3ca019b160 con 0x7f3ca0102a60 2026-03-09T19:33:17.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.497+0000 7f3ca5764640 1 --2- 192.168.123.107:0/116841223 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ca0102a60 0x7f3ca019a4e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:17.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.497+0000 7f3ca4f63640 1 -- 192.168.123.107:0/116841223 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ca0102a60 msgr2=0x7f3ca019a4e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:17.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.497+0000 7f3ca4f63640 1 --2- 192.168.123.107:0/116841223 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ca0102a60 0x7f3ca019a4e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.498+0000 7f3ca4f63640 1 -- 192.168.123.107:0/116841223 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3c88009660 con 0x7f3ca0103c60 2026-03-09T19:33:17.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.498+0000 7f3ca4f63640 1 --2- 192.168.123.107:0/116841223 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3ca0103c60 0x7f3ca019aa20 secure :-1 s=READY 
pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f3c9000b730 tx=0x7f3c9000bc00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:17.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.498+0000 7f3c967fc640 1 -- 192.168.123.107:0/116841223 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3c90004280 con 0x7f3ca0103c60 2026-03-09T19:33:17.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.498+0000 7f3c967fc640 1 -- 192.168.123.107:0/116841223 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3c900043e0 con 0x7f3ca0103c60 2026-03-09T19:33:17.497 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.498+0000 7f3ca79ef640 1 -- 192.168.123.107:0/116841223 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3ca019fc00 con 0x7f3ca0103c60 2026-03-09T19:33:17.498 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.498+0000 7f3ca79ef640 1 -- 192.168.123.107:0/116841223 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3ca01a01d0 con 0x7f3ca0103c60 2026-03-09T19:33:17.498 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.500+0000 7f3c967fc640 1 -- 192.168.123.107:0/116841223 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3c9000ca90 con 0x7f3ca0103c60 2026-03-09T19:33:17.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.500+0000 7f3ca79ef640 1 -- 192.168.123.107:0/116841223 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3c6c005350 con 0x7f3ca0103c60 2026-03-09T19:33:17.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.501+0000 7f3c967fc640 1 -- 192.168.123.107:0/116841223 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 
100115+0+0 (secure 0 0 0) 0x7f3c9000ccb0 con 0x7f3ca0103c60 2026-03-09T19:33:17.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.501+0000 7f3c967fc640 1 --2- 192.168.123.107:0/116841223 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3c7c077750 0x7f3c7c079c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:17.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.501+0000 7f3ca5764640 1 --2- 192.168.123.107:0/116841223 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3c7c077750 0x7f3c7c079c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:17.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.501+0000 7f3c967fc640 1 -- 192.168.123.107:0/116841223 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(98..98 src has 1..98) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f3c90099ec0 con 0x7f3ca0103c60 2026-03-09T19:33:17.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.501+0000 7f3ca5764640 1 --2- 192.168.123.107:0/116841223 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3c7c077750 0x7f3c7c079c10 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f3c88002c20 tx=0x7f3c8803a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:17.502 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.503+0000 7f3c967fc640 1 -- 192.168.123.107:0/116841223 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3c90062750 con 0x7f3ca0103c60 2026-03-09T19:33:17.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.612+0000 7f3ca79ef640 1 -- 192.168.123.107:0/116841223 --> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3c6c002bf0 con 0x7f3c7c077750 2026-03-09T19:33:17.616 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.617+0000 7f3c967fc640 1 -- 192.168.123.107:0/116841223 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f3c6c002bf0 con 0x7f3c7c077750 2026-03-09T19:33:17.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.620+0000 7f3ca79ef640 1 -- 192.168.123.107:0/116841223 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3c7c077750 msgr2=0x7f3c7c079c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:17.619 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.620+0000 7f3ca79ef640 1 --2- 192.168.123.107:0/116841223 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3c7c077750 0x7f3c7c079c10 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f3c88002c20 tx=0x7f3c8803a040 comp rx=0 tx=0).stop 2026-03-09T19:33:17.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.621+0000 7f3ca79ef640 1 -- 192.168.123.107:0/116841223 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3ca0103c60 msgr2=0x7f3ca019aa20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:17.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.621+0000 7f3ca79ef640 1 --2- 192.168.123.107:0/116841223 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3ca0103c60 0x7f3ca019aa20 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f3c9000b730 tx=0x7f3c9000bc00 comp rx=0 tx=0).stop 2026-03-09T19:33:17.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.621+0000 7f3ca79ef640 1 -- 192.168.123.107:0/116841223 shutdown_connections 2026-03-09T19:33:17.620 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.621+0000 7f3ca79ef640 1 --2- 192.168.123.107:0/116841223 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3c7c077750 0x7f3c7c079c10 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.621+0000 7f3ca79ef640 1 --2- 192.168.123.107:0/116841223 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3ca0103c60 0x7f3ca019aa20 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.621+0000 7f3ca79ef640 1 --2- 192.168.123.107:0/116841223 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ca0102a60 0x7f3ca019a4e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.621+0000 7f3ca79ef640 1 -- 192.168.123.107:0/116841223 >> 192.168.123.107:0/116841223 conn(0x7f3ca00fe250 msgr2=0x7f3ca00ffd60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:17.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.622+0000 7f3ca79ef640 1 -- 192.168.123.107:0/116841223 shutdown_connections 2026-03-09T19:33:17.620 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.622+0000 7f3ca79ef640 1 -- 192.168.123.107:0/116841223 wait complete. 
2026-03-09T19:33:17.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.682+0000 7fdf0369d640 1 -- 192.168.123.107:0/506024534 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdefc102a60 msgr2=0x7fdefc102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:17.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.682+0000 7fdf0369d640 1 --2- 192.168.123.107:0/506024534 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdefc102a60 0x7fdefc102e60 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fdef00099b0 tx=0x7fdef002f220 comp rx=0 tx=0).stop 2026-03-09T19:33:17.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.683+0000 7fdf0369d640 1 -- 192.168.123.107:0/506024534 shutdown_connections 2026-03-09T19:33:17.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.683+0000 7fdf0369d640 1 --2- 192.168.123.107:0/506024534 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdefc103c60 0x7fdefc1040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.683+0000 7fdf0369d640 1 --2- 192.168.123.107:0/506024534 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdefc102a60 0x7fdefc102e60 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.683+0000 7fdf0369d640 1 -- 192.168.123.107:0/506024534 >> 192.168.123.107:0/506024534 conn(0x7fdefc0fe250 msgr2=0x7fdefc100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:17.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.683+0000 7fdf0369d640 1 -- 192.168.123.107:0/506024534 shutdown_connections 2026-03-09T19:33:17.682 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.683+0000 7fdf0369d640 1 -- 192.168.123.107:0/506024534 wait 
complete. 2026-03-09T19:33:17.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.684+0000 7fdf0369d640 1 Processor -- start 2026-03-09T19:33:17.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.684+0000 7fdf0369d640 1 -- start start 2026-03-09T19:33:17.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.684+0000 7fdf0369d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdefc102a60 0x7fdefc078fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:17.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.684+0000 7fdf0369d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdefc103c60 0x7fdefc0794e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:17.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.684+0000 7fdf0369d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdefc075a00 con 0x7fdefc102a60 2026-03-09T19:33:17.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.684+0000 7fdf0369d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdefc075b70 con 0x7fdefc103c60 2026-03-09T19:33:17.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.684+0000 7fdf01412640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdefc102a60 0x7fdefc078fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:17.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.685+0000 7fdf01412640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdefc102a60 0x7fdefc078fa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I 
am v2:192.168.123.107:60766/0 (socket says 192.168.123.107:60766) 2026-03-09T19:33:17.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.685+0000 7fdf01412640 1 -- 192.168.123.107:0/1589144355 learned_addr learned my addr 192.168.123.107:0/1589144355 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:17.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.685+0000 7fdf01412640 1 -- 192.168.123.107:0/1589144355 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdefc103c60 msgr2=0x7fdefc0794e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:33:17.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.685+0000 7fdf00c11640 1 --2- 192.168.123.107:0/1589144355 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdefc103c60 0x7fdefc0794e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:17.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.685+0000 7fdf01412640 1 --2- 192.168.123.107:0/1589144355 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdefc103c60 0x7fdefc0794e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.685+0000 7fdf01412640 1 -- 192.168.123.107:0/1589144355 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdef0009660 con 0x7fdefc102a60 2026-03-09T19:33:17.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.686+0000 7fdf01412640 1 --2- 192.168.123.107:0/1589144355 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdefc102a60 0x7fdefc078fa0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fdef0002910 tx=0x7fdef0002940 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:33:17.687 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.686+0000 7fdeea7fc640 1 -- 192.168.123.107:0/1589144355 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdef003d070 con 0x7fdefc102a60 2026-03-09T19:33:17.687 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.686+0000 7fdf0369d640 1 -- 192.168.123.107:0/1589144355 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdefc075df0 con 0x7fdefc102a60 2026-03-09T19:33:17.687 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.686+0000 7fdf0369d640 1 -- 192.168.123.107:0/1589144355 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdefc076360 con 0x7fdefc102a60 2026-03-09T19:33:17.687 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.686+0000 7fdeea7fc640 1 -- 192.168.123.107:0/1589144355 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fdef002fd50 con 0x7fdefc102a60 2026-03-09T19:33:17.687 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.686+0000 7fdeea7fc640 1 -- 192.168.123.107:0/1589144355 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdef0041aa0 con 0x7fdefc102a60 2026-03-09T19:33:17.687 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.688+0000 7fdeea7fc640 1 -- 192.168.123.107:0/1589144355 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fdef004b430 con 0x7fdefc102a60 2026-03-09T19:33:17.688 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.689+0000 7fdeea7fc640 1 --2- 192.168.123.107:0/1589144355 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fded00779b0 0x7fded0079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:17.688 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.689+0000 7fdeea7fc640 1 -- 192.168.123.107:0/1589144355 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(98..98 src has 1..98) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fdef00bfb10 con 0x7fdefc102a60 2026-03-09T19:33:17.688 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.689+0000 7fdf00c11640 1 --2- 192.168.123.107:0/1589144355 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fded00779b0 0x7fded0079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:17.689 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.690+0000 7fdf0369d640 1 -- 192.168.123.107:0/1589144355 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdec4005350 con 0x7fdefc102a60 2026-03-09T19:33:17.689 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.690+0000 7fdf00c11640 1 --2- 192.168.123.107:0/1589144355 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fded00779b0 0x7fded0079e70 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fdefc076f20 tx=0x7fdeec008040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:17.694 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.695+0000 7fdeea7fc640 1 -- 192.168.123.107:0/1589144355 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdef0088230 con 0x7fdefc102a60 2026-03-09T19:33:17.792 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.793+0000 7fdf0369d640 1 -- 192.168.123.107:0/1589144355 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fdec4002bf0 
con 0x7fded00779b0 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.798+0000 7fdeea7fc640 1 -- 192.168.123.107:0/1589144355 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7fdec4002bf0 con 0x7fded00779b0 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (10m) 78s ago 10m 23.2M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (11m) 78s ago 11m 9773k - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (10m) 12s ago 10m 10.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (4m) 78s ago 11m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (4m) 12s ago 10m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (10m) 78s ago 10m 89.5M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (8m) 78s ago 8m 17.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 57e019a15225 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (8m) 78s ago 8m 19.5M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 724cdeb4189c 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (8m) 12s ago 8m 28.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:33:17.797 
INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (8m) 12s ago 8m 179M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (5m) 78s ago 11m 602M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (4m) 12s ago 10m 493M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (4m) 78s ago 11m 62.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (4m) 12s ago 10m 56.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (10m) 78s ago 10m 15.1M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (10m) 12s ago 10m 16.3M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (4m) 78s ago 10m 226M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (103s) 78s ago 9m 108M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 36b65f1069e5 2026-03-09T19:33:17.797 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (80s) 78s ago 9m 13.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 1b8bc1f96eb7 2026-03-09T19:33:17.798 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (56s) 12s ago 9m 181M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bde783ff786f 2026-03-09T19:33:17.798 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (35s) 12s ago 9m 125M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 588104e3b774 2026-03-09T19:33:17.798 
INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (13s) 12s ago 9m 46.3M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7f9a10e5f49d 2026-03-09T19:33:17.798 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (4m) 78s ago 10m 59.1M - 2.43.0 a07b618ecd1d c09450c20f5f 2026-03-09T19:33:17.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.801+0000 7fdf0369d640 1 -- 192.168.123.107:0/1589144355 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fded00779b0 msgr2=0x7fded0079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:17.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.801+0000 7fdf0369d640 1 --2- 192.168.123.107:0/1589144355 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fded00779b0 0x7fded0079e70 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fdefc076f20 tx=0x7fdeec008040 comp rx=0 tx=0).stop 2026-03-09T19:33:17.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.801+0000 7fdf0369d640 1 -- 192.168.123.107:0/1589144355 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdefc102a60 msgr2=0x7fdefc078fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:17.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.801+0000 7fdf0369d640 1 --2- 192.168.123.107:0/1589144355 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdefc102a60 0x7fdefc078fa0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fdef0002910 tx=0x7fdef0002940 comp rx=0 tx=0).stop 2026-03-09T19:33:17.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.801+0000 7fdf0369d640 1 -- 192.168.123.107:0/1589144355 shutdown_connections 2026-03-09T19:33:17.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.801+0000 7fdf0369d640 1 --2- 192.168.123.107:0/1589144355 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] 
conn(0x7fded00779b0 0x7fded0079e70 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.801+0000 7fdf0369d640 1 --2- 192.168.123.107:0/1589144355 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdefc103c60 0x7fdefc0794e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.801+0000 7fdf0369d640 1 --2- 192.168.123.107:0/1589144355 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdefc102a60 0x7fdefc078fa0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.801+0000 7fdf0369d640 1 -- 192.168.123.107:0/1589144355 >> 192.168.123.107:0/1589144355 conn(0x7fdefc0fe250 msgr2=0x7fdefc0ffd60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:17.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.801+0000 7fdf0369d640 1 -- 192.168.123.107:0/1589144355 shutdown_connections 2026-03-09T19:33:17.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.801+0000 7fdf0369d640 1 -- 192.168.123.107:0/1589144355 wait complete. 
2026-03-09T19:33:17.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.860+0000 7ff913450640 1 -- 192.168.123.107:0/282691357 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff90c103c80 msgr2=0x7ff90c104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:17.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.860+0000 7ff913450640 1 --2- 192.168.123.107:0/282691357 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff90c103c80 0x7ff90c104100 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7ff8fc0099b0 tx=0x7ff8fc02f220 comp rx=0 tx=0).stop 2026-03-09T19:33:17.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.860+0000 7ff913450640 1 -- 192.168.123.107:0/282691357 shutdown_connections 2026-03-09T19:33:17.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.860+0000 7ff913450640 1 --2- 192.168.123.107:0/282691357 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff90c103c80 0x7ff90c104100 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.860+0000 7ff913450640 1 --2- 192.168.123.107:0/282691357 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff90c102a80 0x7ff90c102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.860+0000 7ff913450640 1 -- 192.168.123.107:0/282691357 >> 192.168.123.107:0/282691357 conn(0x7ff90c0fe250 msgr2=0x7ff90c100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:17.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.860+0000 7ff913450640 1 -- 192.168.123.107:0/282691357 shutdown_connections 2026-03-09T19:33:17.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.861+0000 7ff913450640 1 -- 192.168.123.107:0/282691357 wait 
complete. 2026-03-09T19:33:17.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.861+0000 7ff913450640 1 Processor -- start 2026-03-09T19:33:17.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.861+0000 7ff913450640 1 -- start start 2026-03-09T19:33:17.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.861+0000 7ff913450640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff90c102a80 0x7ff90c19a460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:17.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.861+0000 7ff913450640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff90c103c80 0x7ff90c19a9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:17.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.861+0000 7ff913450640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff90c19af70 con 0x7ff90c103c80 2026-03-09T19:33:17.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.861+0000 7ff913450640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff90c19b0e0 con 0x7ff90c102a80 2026-03-09T19:33:17.861 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.862+0000 7ff9111c5640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff90c102a80 0x7ff90c19a460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.862+0000 7ff9109c4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff90c103c80 0x7ff90c19a9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.862+0000 7ff9109c4640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff90c103c80 0x7ff90c19a9a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:60786/0 (socket says 192.168.123.107:60786) 2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.862+0000 7ff9109c4640 1 -- 192.168.123.107:0/2334933776 learned_addr learned my addr 192.168.123.107:0/2334933776 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.862+0000 7ff9109c4640 1 -- 192.168.123.107:0/2334933776 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff90c102a80 msgr2=0x7ff90c19a460 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.862+0000 7ff9109c4640 1 --2- 192.168.123.107:0/2334933776 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff90c102a80 0x7ff90c19a460 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.862+0000 7ff9109c4640 1 -- 192.168.123.107:0/2334933776 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff8fc009660 con 0x7ff90c103c80 2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.862+0000 7ff9111c5640 1 --2- 192.168.123.107:0/2334933776 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff90c102a80 0x7ff90c19a460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.862+0000 7ff9109c4640 1 --2- 192.168.123.107:0/2334933776 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff90c103c80 0x7ff90c19a9a0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7ff8fc02f730 tx=0x7ff8fc002910 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.865+0000 7ff8fa7fc640 1 -- 192.168.123.107:0/2334933776 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff8fc03d070 con 0x7ff90c103c80 2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.865+0000 7ff913450640 1 -- 192.168.123.107:0/2334933776 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff90c075630 con 0x7ff90c103c80 2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.865+0000 7ff913450640 1 -- 192.168.123.107:0/2334933776 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff90c075b20 con 0x7ff90c103c80 2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.866+0000 7ff8fa7fc640 1 -- 192.168.123.107:0/2334933776 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff8fc002e20 con 0x7ff90c103c80 2026-03-09T19:33:17.865 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.866+0000 7ff8fa7fc640 1 -- 192.168.123.107:0/2334933776 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff8fc0416d0 con 0x7ff90c103c80 2026-03-09T19:33:17.866 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.866+0000 7ff8fa7fc640 1 -- 192.168.123.107:0/2334933776 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff8fc049050 con 
0x7ff90c103c80 2026-03-09T19:33:17.866 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.867+0000 7ff8fa7fc640 1 --2- 192.168.123.107:0/2334933776 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff8dc077a80 0x7ff8dc079f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:17.866 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.867+0000 7ff9111c5640 1 --2- 192.168.123.107:0/2334933776 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff8dc077a80 0x7ff8dc079f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:17.866 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.867+0000 7ff9111c5640 1 --2- 192.168.123.107:0/2334933776 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff8dc077a80 0x7ff8dc079f40 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7ff90c103ae0 tx=0x7ff900018040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:17.866 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.867+0000 7ff8fa7fc640 1 -- 192.168.123.107:0/2334933776 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(98..98 src has 1..98) v4 ==== 6480+0+0 (secure 0 0 0) 0x7ff8fc0bf340 con 0x7ff90c103c80 2026-03-09T19:33:17.867 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.868+0000 7ff913450640 1 -- 192.168.123.107:0/2334933776 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff8d8005350 con 0x7ff90c103c80 2026-03-09T19:33:17.870 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.871+0000 7ff8fa7fc640 1 -- 192.168.123.107:0/2334933776 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7ff8fc087ac0 con 0x7ff90c103c80 2026-03-09T19:33:17.998 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.998+0000 7ff913450640 1 -- 192.168.123.107:0/2334933776 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7ff8d8005e10 con 0x7ff90c103c80 2026-03-09T19:33:17.998 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:17.999+0000 7ff8fa7fc640 1 -- 192.168.123.107:0/2334933776 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+739 (secure 0 0 0) 0x7ff8fc087210 con 0x7ff90c103c80 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:33:17.999 
INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 4, 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 10 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:33:17.999 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:33:18.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.002+0000 7ff913450640 1 -- 192.168.123.107:0/2334933776 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff8dc077a80 msgr2=0x7ff8dc079f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.002+0000 7ff913450640 1 --2- 192.168.123.107:0/2334933776 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff8dc077a80 0x7ff8dc079f40 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7ff90c103ae0 tx=0x7ff900018040 comp rx=0 tx=0).stop 2026-03-09T19:33:18.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.002+0000 7ff913450640 1 -- 192.168.123.107:0/2334933776 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff90c103c80 msgr2=0x7ff90c19a9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.002+0000 7ff913450640 1 --2- 192.168.123.107:0/2334933776 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff90c103c80 0x7ff90c19a9a0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7ff8fc02f730 tx=0x7ff8fc002910 comp rx=0 tx=0).stop 2026-03-09T19:33:18.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.003+0000 7ff913450640 1 -- 192.168.123.107:0/2334933776 shutdown_connections 
2026-03-09T19:33:18.002 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.003+0000 7ff913450640 1 --2- 192.168.123.107:0/2334933776 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff8dc077a80 0x7ff8dc079f40 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.002 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.003+0000 7ff913450640 1 --2- 192.168.123.107:0/2334933776 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff90c103c80 0x7ff90c19a9a0 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.002 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.003+0000 7ff913450640 1 --2- 192.168.123.107:0/2334933776 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff90c102a80 0x7ff90c19a460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.002 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.003+0000 7ff913450640 1 -- 192.168.123.107:0/2334933776 >> 192.168.123.107:0/2334933776 conn(0x7ff90c0fe250 msgr2=0x7ff90c0ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:18.002 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.003+0000 7ff913450640 1 -- 192.168.123.107:0/2334933776 shutdown_connections 2026-03-09T19:33:18.002 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.003+0000 7ff913450640 1 -- 192.168.123.107:0/2334933776 wait complete. 
2026-03-09T19:33:18.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.060+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3718951587 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a78103c60 msgr2=0x7f3a781040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.060+0000 7f3a7cd7d640 1 --2- 192.168.123.107:0/3718951587 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a78103c60 0x7f3a781040e0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f3a6c0099b0 tx=0x7f3a6c02f220 comp rx=0 tx=0).stop 2026-03-09T19:33:18.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.061+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3718951587 shutdown_connections 2026-03-09T19:33:18.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.061+0000 7f3a7cd7d640 1 --2- 192.168.123.107:0/3718951587 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a78103c60 0x7f3a781040e0 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.061+0000 7f3a7cd7d640 1 --2- 192.168.123.107:0/3718951587 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a78102a60 0x7f3a78102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.061+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3718951587 >> 192.168.123.107:0/3718951587 conn(0x7f3a780fe250 msgr2=0x7f3a78100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:18.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.061+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3718951587 shutdown_connections 2026-03-09T19:33:18.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.061+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3718951587 
wait complete. 2026-03-09T19:33:18.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a7cd7d640 1 Processor -- start 2026-03-09T19:33:18.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a7cd7d640 1 -- start start 2026-03-09T19:33:18.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a7cd7d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a78102a60 0x7f3a7819a3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:18.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a7cd7d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a78103c60 0x7f3a7819a930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:18.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a7cd7d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a7819af00 con 0x7f3a78102a60 2026-03-09T19:33:18.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a7cd7d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a7819b070 con 0x7f3a78103c60 2026-03-09T19:33:18.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a76575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a78102a60 0x7f3a7819a3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:18.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a76575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a78102a60 0x7f3a7819a3f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.107:60800/0 (socket says 192.168.123.107:60800) 2026-03-09T19:33:18.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a76575640 1 -- 192.168.123.107:0/3612895676 learned_addr learned my addr 192.168.123.107:0/3612895676 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:18.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a75d74640 1 --2- 192.168.123.107:0/3612895676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a78103c60 0x7f3a7819a930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:18.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a76575640 1 -- 192.168.123.107:0/3612895676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a78103c60 msgr2=0x7f3a7819a930 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.062 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a76575640 1 --2- 192.168.123.107:0/3612895676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a78103c60 0x7f3a7819a930 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.062 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.062+0000 7f3a76575640 1 -- 192.168.123.107:0/3612895676 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3a6c009660 con 0x7f3a78102a60 2026-03-09T19:33:18.062 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.063+0000 7f3a76575640 1 --2- 192.168.123.107:0/3612895676 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a78102a60 0x7f3a7819a3f0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f3a6000cc90 tx=0x7f3a60007590 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-09T19:33:18.062 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.063+0000 7f3a5f7fe640 1 -- 192.168.123.107:0/3612895676 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a60007cb0 con 0x7f3a78102a60 2026-03-09T19:33:18.063 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.063+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3612895676 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3a7819fb10 con 0x7f3a78102a60 2026-03-09T19:33:18.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.068+0000 7f3a5f7fe640 1 -- 192.168.123.107:0/3612895676 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3a60007e10 con 0x7f3a78102a60 2026-03-09T19:33:18.067 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.068+0000 7f3a5f7fe640 1 -- 192.168.123.107:0/3612895676 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a60005230 con 0x7f3a78102a60 2026-03-09T19:33:18.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.070+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3612895676 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3a781a0060 con 0x7f3a78102a60 2026-03-09T19:33:18.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.071+0000 7f3a5f7fe640 1 -- 192.168.123.107:0/3612895676 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3a60016030 con 0x7f3a78102a60 2026-03-09T19:33:18.073 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.072+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3612895676 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3a78102e60 con 0x7f3a78102a60 2026-03-09T19:33:18.073 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.073+0000 7f3a5f7fe640 1 --2- 192.168.123.107:0/3612895676 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3a480776d0 0x7f3a48079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:18.073 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.073+0000 7f3a5f7fe640 1 -- 192.168.123.107:0/3612895676 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(98..98 src has 1..98) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f3a60098b40 con 0x7f3a78102a60 2026-03-09T19:33:18.073 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.075+0000 7f3a75d74640 1 --2- 192.168.123.107:0/3612895676 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3a480776d0 0x7f3a48079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:18.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.075+0000 7f3a75d74640 1 --2- 192.168.123.107:0/3612895676 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3a480776d0 0x7f3a48079b90 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f3a7819b910 tx=0x7f3a6c03a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:18.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.080+0000 7f3a5f7fe640 1 -- 192.168.123.107:0/3612895676 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3a600612b0 con 0x7f3a78102a60 2026-03-09T19:33:18.201 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.202+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3612895676 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f3a7810b940 con 
0x7f3a78102a60 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.205+0000 7f3a5f7fe640 1 -- 192.168.123.107:0/3612895676 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 ==== 76+0+1937 (secure 0 0 0) 0x7f3a60060a00 con 0x7f3a78102a60 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout:e14 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout:btime 2026-03-09T19:33:08:696919+0000 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout:epoch 14 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:33:18.204 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:33:08.696919+0000 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 
300 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 1 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279,1=24285} 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 24279 members: 24279 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr 
[v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{1:24285} state up:stopping seq 5 join_fscid=1 addr [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:14500} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6828/969322030,v1:192.168.123.107:6829/969322030] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:14510} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/1298912984,v1:192.168.123.107:6827/1298912984] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:33:18.205 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 14 2026-03-09T19:33:18.207 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.208+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3612895676 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3a480776d0 msgr2=0x7f3a48079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.207 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.208+0000 7f3a7cd7d640 1 --2- 192.168.123.107:0/3612895676 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3a480776d0 0x7f3a48079b90 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f3a7819b910 tx=0x7f3a6c03a040 comp rx=0 tx=0).stop 2026-03-09T19:33:18.207 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.208+0000 7f3a7cd7d640 1 -- 
192.168.123.107:0/3612895676 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a78102a60 msgr2=0x7f3a7819a3f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.207 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.208+0000 7f3a7cd7d640 1 --2- 192.168.123.107:0/3612895676 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a78102a60 0x7f3a7819a3f0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f3a6000cc90 tx=0x7f3a60007590 comp rx=0 tx=0).stop 2026-03-09T19:33:18.207 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.208+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3612895676 shutdown_connections 2026-03-09T19:33:18.207 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.208+0000 7f3a7cd7d640 1 --2- 192.168.123.107:0/3612895676 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f3a480776d0 0x7f3a48079b90 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.207 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.208+0000 7f3a7cd7d640 1 --2- 192.168.123.107:0/3612895676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a78103c60 0x7f3a7819a930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.207 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.208+0000 7f3a7cd7d640 1 --2- 192.168.123.107:0/3612895676 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a78102a60 0x7f3a7819a3f0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.207 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.208+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3612895676 >> 192.168.123.107:0/3612895676 conn(0x7f3a780fe250 msgr2=0x7f3a780ffd10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:18.207 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.208+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3612895676 shutdown_connections 2026-03-09T19:33:18.207 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.208+0000 7f3a7cd7d640 1 -- 192.168.123.107:0/3612895676 wait complete. 2026-03-09T19:33:18.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.265+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2427587861 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d74100800 msgr2=0x7f4d74100c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.265+0000 7f4d7b20c640 1 --2- 192.168.123.107:0/2427587861 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d74100800 0x7f4d74100c00 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f4d5c0099b0 tx=0x7f4d5c02f240 comp rx=0 tx=0).stop 2026-03-09T19:33:18.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.265+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2427587861 shutdown_connections 2026-03-09T19:33:18.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.265+0000 7f4d7b20c640 1 --2- 192.168.123.107:0/2427587861 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d74101a00 0x7f4d74101e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.265+0000 7f4d7b20c640 1 --2- 192.168.123.107:0/2427587861 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d74100800 0x7f4d74100c00 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.265+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2427587861 >> 192.168.123.107:0/2427587861 conn(0x7f4d740fbfb0 msgr2=0x7f4d740fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T19:33:18.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.265+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2427587861 shutdown_connections 2026-03-09T19:33:18.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.266+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2427587861 wait complete. 2026-03-09T19:33:18.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.266+0000 7f4d7b20c640 1 Processor -- start 2026-03-09T19:33:18.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.266+0000 7f4d7b20c640 1 -- start start 2026-03-09T19:33:18.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.266+0000 7f4d7b20c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d74100800 0x7f4d7410f2f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:18.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.266+0000 7f4d7b20c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d74101a00 0x7f4d7410f830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:18.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.266+0000 7f4d7b20c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d7410fe00 con 0x7f4d74101a00 2026-03-09T19:33:18.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.266+0000 7f4d7b20c640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d7410ff70 con 0x7f4d74100800 2026-03-09T19:33:18.265 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.267+0000 7f4d6bfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d74101a00 0x7f4d7410f830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:18.266 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.267+0000 7f4d6bfff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d74101a00 0x7f4d7410f830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:60828/0 (socket says 192.168.123.107:60828) 2026-03-09T19:33:18.266 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.267+0000 7f4d6bfff640 1 -- 192.168.123.107:0/2045724461 learned_addr learned my addr 192.168.123.107:0/2045724461 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:18.266 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.267+0000 7f4d6bfff640 1 -- 192.168.123.107:0/2045724461 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d74100800 msgr2=0x7f4d7410f2f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:33:18.266 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.267+0000 7f4d6bfff640 1 --2- 192.168.123.107:0/2045724461 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d74100800 0x7f4d7410f2f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.266 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.267+0000 7f4d6bfff640 1 -- 192.168.123.107:0/2045724461 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4d5c009660 con 0x7f4d74101a00 2026-03-09T19:33:18.266 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.268+0000 7f4d6bfff640 1 --2- 192.168.123.107:0/2045724461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d74101a00 0x7f4d7410f830 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f4d6400e990 tx=0x7f4d6400ee60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:18.267 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.268+0000 7f4d69ffb640 1 -- 192.168.123.107:0/2045724461 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d6400cd30 con 0x7f4d74101a00 2026-03-09T19:33:18.267 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.268+0000 7f4d69ffb640 1 -- 192.168.123.107:0/2045724461 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4d6400ce90 con 0x7f4d74101a00 2026-03-09T19:33:18.267 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.268+0000 7f4d69ffb640 1 -- 192.168.123.107:0/2045724461 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d64010640 con 0x7f4d74101a00 2026-03-09T19:33:18.267 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.268+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2045724461 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4d74112a30 con 0x7f4d74101a00 2026-03-09T19:33:18.267 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.268+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2045724461 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4d74112f80 con 0x7f4d74101a00 2026-03-09T19:33:18.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.270+0000 7f4d69ffb640 1 -- 192.168.123.107:0/2045724461 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4d640107a0 con 0x7f4d74101a00 2026-03-09T19:33:18.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.270+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2045724461 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4d3c005350 con 0x7f4d74101a00 2026-03-09T19:33:18.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.273+0000 7f4d69ffb640 1 --2- 
192.168.123.107:0/2045724461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4d50077890 0x7f4d50079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:18.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.273+0000 7f4d69ffb640 1 -- 192.168.123.107:0/2045724461 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(98..98 src has 1..98) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f4d64014070 con 0x7f4d74101a00 2026-03-09T19:33:18.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.273+0000 7f4d78f81640 1 --2- 192.168.123.107:0/2045724461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4d50077890 0x7f4d50079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:18.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.274+0000 7f4d78f81640 1 --2- 192.168.123.107:0/2045724461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4d50077890 0x7f4d50079d50 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f4d5c02f750 tx=0x7f4d5c0023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:18.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.274+0000 7f4d69ffb640 1 -- 192.168.123.107:0/2045724461 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4d64063700 con 0x7f4d74101a00 2026-03-09T19:33:18.376 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.376+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2045724461 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4d3c002bf0 con 0x7f4d50077890 2026-03-09T19:33:18.376 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.377+0000 7f4d69ffb640 1 -- 192.168.123.107:0/2045724461 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f4d3c002bf0 con 0x7f4d50077890 2026-03-09T19:33:18.376 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [ 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: "mgr", 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: "crash", 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: "osd", 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: "mon" 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: ], 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "12/23 daemons upgraded", 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:33:18.377 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:33:18.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.380+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2045724461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4d50077890 msgr2=0x7f4d50079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.380+0000 7f4d7b20c640 1 --2- 
192.168.123.107:0/2045724461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4d50077890 0x7f4d50079d50 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f4d5c02f750 tx=0x7f4d5c0023d0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.380+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2045724461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d74101a00 msgr2=0x7f4d7410f830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.380+0000 7f4d7b20c640 1 --2- 192.168.123.107:0/2045724461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d74101a00 0x7f4d7410f830 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f4d6400e990 tx=0x7f4d6400ee60 comp rx=0 tx=0).stop 2026-03-09T19:33:18.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.380+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2045724461 shutdown_connections 2026-03-09T19:33:18.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.380+0000 7f4d7b20c640 1 --2- 192.168.123.107:0/2045724461 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4d50077890 0x7f4d50079d50 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.380+0000 7f4d7b20c640 1 --2- 192.168.123.107:0/2045724461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d74101a00 0x7f4d7410f830 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.380+0000 7f4d7b20c640 1 --2- 192.168.123.107:0/2045724461 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d74100800 0x7f4d7410f2f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-09T19:33:18.379 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.381+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2045724461 >> 192.168.123.107:0/2045724461 conn(0x7f4d740fbfb0 msgr2=0x7f4d740fd7f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:18.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.381+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2045724461 shutdown_connections 2026-03-09T19:33:18.380 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.381+0000 7f4d7b20c640 1 -- 192.168.123.107:0/2045724461 wait complete. 2026-03-09T19:33:18.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.439+0000 7efe57ae7640 1 -- 192.168.123.107:0/3547823902 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efe50102a80 msgr2=0x7efe50102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.439+0000 7efe57ae7640 1 --2- 192.168.123.107:0/3547823902 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efe50102a80 0x7efe50102e80 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7efe440098e0 tx=0x7efe4402f190 comp rx=0 tx=0).stop 2026-03-09T19:33:18.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.439+0000 7efe57ae7640 1 -- 192.168.123.107:0/3547823902 shutdown_connections 2026-03-09T19:33:18.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.439+0000 7efe57ae7640 1 --2- 192.168.123.107:0/3547823902 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efe50103c80 0x7efe50104100 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.439+0000 7efe57ae7640 1 --2- 192.168.123.107:0/3547823902 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efe50102a80 0x7efe50102e80 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.439+0000 7efe57ae7640 1 -- 192.168.123.107:0/3547823902 >> 192.168.123.107:0/3547823902 conn(0x7efe500fe250 msgr2=0x7efe50100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:18.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.439+0000 7efe57ae7640 1 -- 192.168.123.107:0/3547823902 shutdown_connections 2026-03-09T19:33:18.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.439+0000 7efe57ae7640 1 -- 192.168.123.107:0/3547823902 wait complete. 2026-03-09T19:33:18.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.440+0000 7efe57ae7640 1 Processor -- start 2026-03-09T19:33:18.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.440+0000 7efe57ae7640 1 -- start start 2026-03-09T19:33:18.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.440+0000 7efe57ae7640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efe50102a80 0x7efe5019a450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:18.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.440+0000 7efe57ae7640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efe50103c80 0x7efe5019a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:18.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.440+0000 7efe57ae7640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efe5019af60 con 0x7efe50102a80 2026-03-09T19:33:18.440 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.441+0000 7efe5585c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efe50102a80 0x7efe5019a450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-09T19:33:18.440 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.441+0000 7efe5505b640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efe50103c80 0x7efe5019a990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:18.440 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.441+0000 7efe5505b640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efe50103c80 0x7efe5019a990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:51746/0 (socket says 192.168.123.107:51746) 2026-03-09T19:33:18.440 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.441+0000 7efe5505b640 1 -- 192.168.123.107:0/3805515844 learned_addr learned my addr 192.168.123.107:0/3805515844 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:18.440 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.440+0000 7efe57ae7640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efe5019b0d0 con 0x7efe50103c80 2026-03-09T19:33:18.440 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.441+0000 7efe5585c640 1 -- 192.168.123.107:0/3805515844 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efe50103c80 msgr2=0x7efe5019a990 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.440 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.441+0000 7efe5585c640 1 --2- 192.168.123.107:0/3805515844 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efe50103c80 0x7efe5019a990 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.440 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.441+0000 7efe5585c640 1 -- 
192.168.123.107:0/3805515844 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efe44009590 con 0x7efe50102a80 2026-03-09T19:33:18.441 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.442+0000 7efe5585c640 1 --2- 192.168.123.107:0/3805515844 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efe50102a80 0x7efe5019a450 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7efe50103ae0 tx=0x7efe44031c00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:18.441 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.442+0000 7efe3effd640 1 -- 192.168.123.107:0/3805515844 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efe4403d070 con 0x7efe50102a80 2026-03-09T19:33:18.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.442+0000 7efe3effd640 1 -- 192.168.123.107:0/3805515844 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7efe4402fe70 con 0x7efe50102a80 2026-03-09T19:33:18.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.442+0000 7efe57ae7640 1 -- 192.168.123.107:0/3805515844 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efe5019fb10 con 0x7efe50102a80 2026-03-09T19:33:18.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.442+0000 7efe3effd640 1 -- 192.168.123.107:0/3805515844 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efe44031750 con 0x7efe50102a80 2026-03-09T19:33:18.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.442+0000 7efe57ae7640 1 -- 192.168.123.107:0/3805515844 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efe501a0000 con 0x7efe50102a80 2026-03-09T19:33:18.446 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.444+0000 7efe3effd640 1 -- 192.168.123.107:0/3805515844 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7efe44049050 con 0x7efe50102a80 2026-03-09T19:33:18.446 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.444+0000 7efe57ae7640 1 -- 192.168.123.107:0/3805515844 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efe18005350 con 0x7efe50102a80 2026-03-09T19:33:18.446 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.447+0000 7efe3effd640 1 --2- 192.168.123.107:0/3805515844 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efe2c077890 0x7efe2c079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:18.447 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.447+0000 7efe3effd640 1 -- 192.168.123.107:0/3805515844 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(98..98 src has 1..98) v4 ==== 6480+0+0 (secure 0 0 0) 0x7efe440be2e0 con 0x7efe50102a80 2026-03-09T19:33:18.447 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.447+0000 7efe5505b640 1 --2- 192.168.123.107:0/3805515844 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efe2c077890 0x7efe2c079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:18.447 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.448+0000 7efe3effd640 1 -- 192.168.123.107:0/3805515844 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7efe44087950 con 0x7efe50102a80 2026-03-09T19:33:18.447 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.448+0000 7efe5505b640 1 --2- 
192.168.123.107:0/3805515844 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efe2c077890 0x7efe2c079d50 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7efe5019b970 tx=0x7efe40009340 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:18.596 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.596+0000 7efe57ae7640 1 -- 192.168.123.107:0/3805515844 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7efe180051c0 con 0x7efe50102a80 2026-03-09T19:33:18.596 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.597+0000 7efe3effd640 1 -- 192.168.123.107:0/3805515844 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7efe440870a0 con 0x7efe50102a80 2026-03-09T19:33:18.596 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:33:18.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.600+0000 7efe57ae7640 1 -- 192.168.123.107:0/3805515844 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efe2c077890 msgr2=0x7efe2c079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.600+0000 7efe57ae7640 1 --2- 192.168.123.107:0/3805515844 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efe2c077890 0x7efe2c079d50 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7efe5019b970 tx=0x7efe40009340 comp rx=0 tx=0).stop 2026-03-09T19:33:18.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.600+0000 7efe57ae7640 1 -- 192.168.123.107:0/3805515844 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efe50102a80 msgr2=0x7efe5019a450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:18.599 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.600+0000 7efe57ae7640 1 --2- 192.168.123.107:0/3805515844 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efe50102a80 0x7efe5019a450 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7efe50103ae0 tx=0x7efe44031c00 comp rx=0 tx=0).stop 2026-03-09T19:33:18.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.600+0000 7efe57ae7640 1 -- 192.168.123.107:0/3805515844 shutdown_connections 2026-03-09T19:33:18.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.600+0000 7efe57ae7640 1 --2- 192.168.123.107:0/3805515844 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efe2c077890 0x7efe2c079d50 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.600+0000 7efe57ae7640 1 --2- 192.168.123.107:0/3805515844 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efe50103c80 0x7efe5019a990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.600+0000 7efe57ae7640 1 --2- 192.168.123.107:0/3805515844 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efe50102a80 0x7efe5019a450 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:18.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.600+0000 7efe57ae7640 1 -- 192.168.123.107:0/3805515844 >> 192.168.123.107:0/3805515844 conn(0x7efe500fe250 msgr2=0x7efe500ffa00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:18.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.600+0000 7efe57ae7640 1 -- 192.168.123.107:0/3805515844 shutdown_connections 2026-03-09T19:33:18.599 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:18.600+0000 7efe57ae7640 1 -- 
192.168.123.107:0/3805515844 wait complete. 2026-03-09T19:33:18.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:18 vm07.local ceph-mon[111841]: from='client.44289 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:18.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:18 vm07.local ceph-mon[111841]: pgmap v197: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 375 B/s wr, 0 op/s 2026-03-09T19:33:18.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:18 vm07.local ceph-mon[111841]: from='client.34354 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:18.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:18 vm07.local ceph-mon[111841]: from='client.34358 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:18.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:18 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/2334933776' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:18.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:18 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/3612895676' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:33:18.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:18 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/3805515844' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:33:18.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:18.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:33:19.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:18 vm08.local ceph-mon[103420]: from='client.44289 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:19.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:18 vm08.local ceph-mon[103420]: pgmap v197: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 375 B/s wr, 0 op/s 2026-03-09T19:33:19.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:18 vm08.local ceph-mon[103420]: from='client.34354 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:19.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:18 vm08.local ceph-mon[103420]: from='client.34358 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:19.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:18 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/2334933776' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:19.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:18 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/3612895676' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:33:19.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:18 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/3805515844' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:33:19.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:19.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:33:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:19 vm08.local ceph-mon[103420]: from='client.34370 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:33:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:19 vm08.local 
ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:20.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:19 vm08.local ceph-mon[103420]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-09T19:33:20.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:19 vm07.local ceph-mon[111841]: from='client.34370 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:20.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:20.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:20.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:33:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:33:20.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:20.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:20.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:20.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:20.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:20.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:20.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:19 vm07.local ceph-mon[111841]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-09T19:33:21.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:20 vm08.local ceph-mon[103420]: pgmap v198: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 307 B/s wr, 0 op/s 2026-03-09T19:33:21.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:20 vm07.local ceph-mon[111841]: pgmap v198: 65 
pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 307 B/s wr, 0 op/s 2026-03-09T19:33:23.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:22 vm08.local ceph-mon[103420]: pgmap v199: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 366 B/s wr, 0 op/s 2026-03-09T19:33:23.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:22 vm07.local ceph-mon[111841]: pgmap v199: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 366 B/s wr, 0 op/s 2026-03-09T19:33:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:24 vm07.local ceph-mon[111841]: pgmap v200: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 341 B/s wr, 0 op/s 2026-03-09T19:33:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:25.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:24 vm08.local ceph-mon[103420]: pgmap v200: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 341 B/s wr, 0 op/s 2026-03-09T19:33:25.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:26 vm07.local ceph-mon[111841]: pgmap v201: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 255 B/s wr, 1 op/s 2026-03-09T19:33:27.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:26 vm08.local ceph-mon[103420]: pgmap v201: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 255 B/s wr, 1 op/s 2026-03-09T19:33:29.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:28 vm07.local ceph-mon[111841]: pgmap v202: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 
GiB / 120 GiB avail; 0 B/s rd, 85 B/s wr, 2 op/s 2026-03-09T19:33:29.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:28 vm08.local ceph-mon[103420]: pgmap v202: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 85 B/s wr, 2 op/s 2026-03-09T19:33:30.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:29 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:30.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:29 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:30.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:29 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:33:30.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:29 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:30.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:29 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:30.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:29 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:30.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:29 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:30.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:29 vm07.local ceph-mon[111841]: 
from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:30.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:29 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:30.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:29 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:30.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:29 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:30.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:29 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:33:30.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:29 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:30.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:29 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:30.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:29 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:30.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:29 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:30.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:33:29 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:30.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:29 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:31.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:30 vm08.local ceph-mon[103420]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-09T19:33:31.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:30 vm08.local ceph-mon[103420]: pgmap v203: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 341 B/s wr, 2 op/s 2026-03-09T19:33:31.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:30 vm07.local ceph-mon[111841]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-09T19:33:31.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:30 vm07.local ceph-mon[111841]: pgmap v203: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 341 B/s wr, 2 op/s 2026-03-09T19:33:33.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:33 vm08.local ceph-mon[103420]: pgmap v204: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 341 B/s wr, 2 op/s 2026-03-09T19:33:33.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:33 vm08.local ceph-mon[103420]: daemon mds.cephfs.vm08.jwsqrf finished stopping rank 1 in filesystem cephfs (now has 1 ranks) 2026-03-09T19:33:33.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:33 vm07.local ceph-mon[111841]: pgmap v204: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 341 B/s wr, 2 op/s 2026-03-09T19:33:33.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:33 vm07.local ceph-mon[111841]: daemon mds.cephfs.vm08.jwsqrf finished stopping rank 
1 in filesystem cephfs (now has 1 ranks) 2026-03-09T19:33:34.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:34 vm08.local ceph-mon[103420]: mds.1 [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] down:stopped 2026-03-09T19:33:34.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:34 vm08.local ceph-mon[103420]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 2 up:standby 2026-03-09T19:33:34.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:34 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:33:34.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:34 vm07.local ceph-mon[111841]: mds.1 [v2:192.168.123.108:6826/3082155067,v1:192.168.123.108:6827/3082155067] down:stopped 2026-03-09T19:33:34.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:34 vm07.local ceph-mon[111841]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 2 up:standby 2026-03-09T19:33:34.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:33:35.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:35 vm08.local ceph-mon[103420]: pgmap v205: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 255 B/s wr, 2 op/s 2026-03-09T19:33:35.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:35 vm08.local ceph-mon[103420]: mds.? 
[v2:192.168.123.108:6826/2328013860,v1:192.168.123.108:6827/2328013860] up:boot 2026-03-09T19:33:35.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:35 vm08.local ceph-mon[103420]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 3 up:standby 2026-03-09T19:33:35.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:35 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:33:35.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:35 vm07.local ceph-mon[111841]: pgmap v205: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 255 B/s wr, 2 op/s 2026-03-09T19:33:35.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:35 vm07.local ceph-mon[111841]: mds.? [v2:192.168.123.108:6826/2328013860,v1:192.168.123.108:6827/2328013860] up:boot 2026-03-09T19:33:35.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:35 vm07.local ceph-mon[111841]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 3 up:standby 2026-03-09T19:33:35.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:35 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:33:36.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:36 vm08.local ceph-mon[103420]: pgmap v206: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 255 B/s wr, 1 op/s 2026-03-09T19:33:36.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:36 vm07.local ceph-mon[111841]: pgmap v206: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 255 B/s wr, 1 op/s 2026-03-09T19:33:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:38 vm07.local ceph-mon[111841]: pgmap v207: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 
GiB / 120 GiB avail; 0 B/s rd, 255 B/s wr, 1 op/s 2026-03-09T19:33:39.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:38 vm08.local ceph-mon[103420]: pgmap v207: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 255 B/s wr, 1 op/s 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: 
from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm07.uizncw"]}]: dispatch 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.uizncw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:33:39.669 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm07.uizncw"]}]: dispatch 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.uizncw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:33:40.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:41.256 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:41 vm07.local ceph-mon[111841]: Upgrade: It appears safe to stop mds.cephfs.vm07.uizncw 2026-03-09T19:33:41.256 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:41 vm07.local ceph-mon[111841]: Upgrade: Updating mds.cephfs.vm07.uizncw 2026-03-09T19:33:41.256 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:41 vm07.local ceph-mon[111841]: Deploying daemon mds.cephfs.vm07.uizncw on vm07 2026-03-09T19:33:41.256 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:41 vm07.local ceph-mon[111841]: pgmap v208: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 255 B/s wr, 0 op/s 2026-03-09T19:33:41.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:41 vm08.local ceph-mon[103420]: Upgrade: It appears safe to stop mds.cephfs.vm07.uizncw 2026-03-09T19:33:41.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:41 vm08.local ceph-mon[103420]: Upgrade: Updating mds.cephfs.vm07.uizncw 2026-03-09T19:33:41.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:41 vm08.local ceph-mon[103420]: Deploying daemon mds.cephfs.vm07.uizncw on vm07 2026-03-09T19:33:41.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:41 vm08.local ceph-mon[103420]: pgmap v208: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 255 B/s wr, 0 op/s 2026-03-09T19:33:42.326 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:42 vm07.local ceph-mon[111841]: osdmap e99: 6 total, 6 up, 6 
in 2026-03-09T19:33:42.326 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:42 vm07.local ceph-mon[111841]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 2 up:standby 2026-03-09T19:33:42.326 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:42 vm07.local ceph-mon[111841]: pgmap v210: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:42.326 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:42 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:42.326 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:42 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:42.326 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:42 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:42 vm08.local ceph-mon[103420]: osdmap e99: 6 total, 6 up, 6 in 2026-03-09T19:33:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:42 vm08.local ceph-mon[103420]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 2 up:standby 2026-03-09T19:33:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:42 vm08.local ceph-mon[103420]: pgmap v210: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:42 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:42 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:42.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:42 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:43.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:43 vm07.local ceph-mon[111841]: mds.? [v2:192.168.123.107:6826/2856024060,v1:192.168.123.107:6827/2856024060] up:boot 2026-03-09T19:33:43.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:43 vm07.local ceph-mon[111841]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 3 up:standby 2026-03-09T19:33:43.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:43 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:33:43.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:43 vm08.local ceph-mon[103420]: mds.? [v2:192.168.123.107:6826/2856024060,v1:192.168.123.107:6827/2856024060] up:boot 2026-03-09T19:33:43.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:43 vm08.local ceph-mon[103420]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 3 up:standby 2026-03-09T19:33:43.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:43 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:33:44.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:44 vm07.local ceph-mon[111841]: pgmap v211: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:44.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:44.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:44.853 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:33:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:45.192 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:44 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:44 vm08.local ceph-mon[103420]: pgmap v211: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:44 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:46.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: Detected new or changed devices on vm07 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:46.479 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm07.zkmcyw"]}]: dispatch 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: Upgrade: It appears safe to stop mds.cephfs.vm07.zkmcyw 2026-03-09T19:33:46.479 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: pgmap v212: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: Upgrade: Updating mds.cephfs.vm07.zkmcyw 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.zkmcyw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:46.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:46 vm07.local ceph-mon[111841]: Deploying daemon mds.cephfs.vm07.zkmcyw on vm07 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: Detected new or changed devices on vm07 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config 
generate-minimal-conf"}]: dispatch 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm07.zkmcyw"]}]: dispatch 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: Upgrade: It appears safe to stop 
mds.cephfs.vm07.zkmcyw 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: pgmap v212: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: Upgrade: Updating mds.cephfs.vm07.zkmcyw 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.zkmcyw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:46.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:46 vm08.local ceph-mon[103420]: Deploying daemon mds.cephfs.vm07.zkmcyw on vm07 2026-03-09T19:33:47.415 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:47 vm07.local ceph-mon[111841]: osdmap e100: 6 total, 6 up, 6 in 2026-03-09T19:33:47.415 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:47 vm07.local ceph-mon[111841]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 2 up:standby 2026-03-09T19:33:47.589 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:47 vm08.local ceph-mon[103420]: osdmap e100: 6 total, 6 up, 6 in 2026-03-09T19:33:47.589 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:47 vm08.local ceph-mon[103420]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 2 up:standby 2026-03-09T19:33:48.679 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.680+0000 7fcc7f577640 1 -- 192.168.123.107:0/2603803325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc780a5920 msgr2=0x7fcc780a5d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:48.679 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.680+0000 7fcc7d573640 1 -- 192.168.123.107:0/2603803325 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc80069070 con 0x7fcc780a5920 2026-03-09T19:33:48.679 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.680+0000 7fcc7f577640 1 --2- 192.168.123.107:0/2603803325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc780a5920 0x7fcc780a5d80 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7fcc80066a00 tx=0x7fcc80091cf0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.679 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.680+0000 7fcc7f577640 1 -- 192.168.123.107:0/2603803325 shutdown_connections 2026-03-09T19:33:48.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.680+0000 7fcc7f577640 1 --2- 192.168.123.107:0/2603803325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc780a5920 0x7fcc780a5d80 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.680+0000 7fcc7f577640 1 --2- 192.168.123.107:0/2603803325 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcc780a4830 0x7fcc780a4c30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.680+0000 7fcc7f577640 1 -- 192.168.123.107:0/2603803325 >> 192.168.123.107:0/2603803325 conn(0x7fcc7809fe80 msgr2=0x7fcc780a22e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:48.680 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.680+0000 7fcc7f577640 1 -- 192.168.123.107:0/2603803325 shutdown_connections 2026-03-09T19:33:48.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.680+0000 7fcc7f577640 1 -- 192.168.123.107:0/2603803325 wait complete. 2026-03-09T19:33:48.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.681+0000 7fcc7f577640 1 Processor -- start 2026-03-09T19:33:48.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.681+0000 7fcc7f577640 1 -- start start 2026-03-09T19:33:48.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.681+0000 7fcc7f577640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcc780a4830 0x7fcc78144ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:48.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.681+0000 7fcc7f577640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc780a5920 0x7fcc78145000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:48.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.681+0000 7fcc7f577640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcc781455d0 con 0x7fcc780a5920 2026-03-09T19:33:48.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.681+0000 7fcc7f577640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcc78145740 con 0x7fcc780a4830 2026-03-09T19:33:48.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.682+0000 7fcc7e575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcc780a4830 0x7fcc78144ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:48.681 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.682+0000 7fcc7e575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcc780a4830 0x7fcc78144ac0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:52976/0 (socket says 192.168.123.107:52976) 2026-03-09T19:33:48.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.682+0000 7fcc7e575640 1 -- 192.168.123.107:0/1734980334 learned_addr learned my addr 192.168.123.107:0/1734980334 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:48.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.682+0000 7fcc7e575640 1 -- 192.168.123.107:0/1734980334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc780a5920 msgr2=0x7fcc78145000 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:48.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.682+0000 7fcc7e575640 1 --2- 192.168.123.107:0/1734980334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc780a5920 0x7fcc78145000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.682+0000 7fcc7e575640 1 -- 192.168.123.107:0/1734980334 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcc8004f090 con 0x7fcc780a4830 2026-03-09T19:33:48.681 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.682+0000 7fcc7e575640 1 --2- 192.168.123.107:0/1734980334 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcc780a4830 0x7fcc78144ac0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fcc7000d8d0 tx=0x7fcc7000dda0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:48.683 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.683+0000 7fcc677fe640 1 -- 192.168.123.107:0/1734980334 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc70004490 con 0x7fcc780a4830 2026-03-09T19:33:48.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.683+0000 7fcc7f577640 1 -- 192.168.123.107:0/1734980334 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcc780ac790 con 0x7fcc780a4830 2026-03-09T19:33:48.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.684+0000 7fcc7f577640 1 -- 192.168.123.107:0/1734980334 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcc780acce0 con 0x7fcc780a4830 2026-03-09T19:33:48.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.684+0000 7fcc677fe640 1 -- 192.168.123.107:0/1734980334 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fcc7000bd00 con 0x7fcc780a4830 2026-03-09T19:33:48.683 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.684+0000 7fcc677fe640 1 -- 192.168.123.107:0/1734980334 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc70010460 con 0x7fcc780a4830 2026-03-09T19:33:48.684 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.685+0000 7fcc7f577640 1 -- 192.168.123.107:0/1734980334 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcc780b5d50 con 0x7fcc780a4830 2026-03-09T19:33:48.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.686+0000 7fcc677fe640 1 -- 192.168.123.107:0/1734980334 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcc700027e0 con 0x7fcc780a4830 2026-03-09T19:33:48.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.686+0000 7fcc677fe640 1 --2- 
192.168.123.107:0/1734980334 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcc5c077910 0x7fcc5c079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:48.685 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.687+0000 7fcc677fe640 1 -- 192.168.123.107:0/1734980334 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(100..100 src has 1..100) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fcc70099fd0 con 0x7fcc780a4830 2026-03-09T19:33:48.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.687+0000 7fcc7dd74640 1 --2- 192.168.123.107:0/1734980334 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcc5c077910 0x7fcc5c079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:48.686 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.687+0000 7fcc7dd74640 1 --2- 192.168.123.107:0/1734980334 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcc5c077910 0x7fcc5c079dd0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fcc80092200 tx=0x7fcc80063ff0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:48.689 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.690+0000 7fcc677fe640 1 -- 192.168.123.107:0/1734980334 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcc70062750 con 0x7fcc780a4830 2026-03-09T19:33:48.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:48.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' 2026-03-09T19:33:48.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:48 vm07.local ceph-mon[111841]: pgmap v214: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:48.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:48.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:48 vm07.local ceph-mon[111841]: mds.? [v2:192.168.123.107:6828/670620212,v1:192.168.123.107:6829/670620212] up:boot 2026-03-09T19:33:48.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:48 vm07.local ceph-mon[111841]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 3 up:standby 2026-03-09T19:33:48.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.zkmcyw"}]: dispatch 2026-03-09T19:33:48.811 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.811+0000 7fcc7f577640 1 -- 192.168.123.107:0/1734980334 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fcc780a9d80 con 0x7fcc5c077910 2026-03-09T19:33:48.815 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.814+0000 7fcc677fe640 1 -- 192.168.123.107:0/1734980334 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7fcc780a9d80 con 0x7fcc5c077910 2026-03-09T19:33:48.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.819+0000 7fcc657fa640 1 -- 192.168.123.107:0/1734980334 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcc5c077910 msgr2=0x7fcc5c079dd0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:48.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.819+0000 7fcc657fa640 1 --2- 192.168.123.107:0/1734980334 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcc5c077910 0x7fcc5c079dd0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fcc80092200 tx=0x7fcc80063ff0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.819+0000 7fcc657fa640 1 -- 192.168.123.107:0/1734980334 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcc780a4830 msgr2=0x7fcc78144ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:48.818 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.819+0000 7fcc657fa640 1 --2- 192.168.123.107:0/1734980334 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcc780a4830 0x7fcc78144ac0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fcc7000d8d0 tx=0x7fcc7000dda0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.821+0000 7fcc657fa640 1 -- 192.168.123.107:0/1734980334 shutdown_connections 2026-03-09T19:33:48.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.821+0000 7fcc657fa640 1 --2- 192.168.123.107:0/1734980334 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcc5c077910 0x7fcc5c079dd0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.821+0000 7fcc657fa640 1 --2- 192.168.123.107:0/1734980334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc780a5920 0x7fcc78145000 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.821+0000 7fcc657fa640 1 --2- 
192.168.123.107:0/1734980334 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcc780a4830 0x7fcc78144ac0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.820 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.821+0000 7fcc657fa640 1 -- 192.168.123.107:0/1734980334 >> 192.168.123.107:0/1734980334 conn(0x7fcc7809fe80 msgr2=0x7fcc780a1710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:48.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.823+0000 7fcc657fa640 1 -- 192.168.123.107:0/1734980334 shutdown_connections 2026-03-09T19:33:48.822 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.823+0000 7fcc657fa640 1 -- 192.168.123.107:0/1734980334 wait complete. 2026-03-09T19:33:48.838 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:33:48.887 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.888+0000 7f810baaf640 1 -- 192.168.123.107:0/909037458 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8104076df0 msgr2=0x7f8104077250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:48.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.888+0000 7f810baaf640 1 --2- 192.168.123.107:0/909037458 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8104076df0 0x7f8104077250 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f80f80099b0 tx=0x7f80f802f220 comp rx=0 tx=0).stop 2026-03-09T19:33:48.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.888+0000 7f810baaf640 1 -- 192.168.123.107:0/909037458 shutdown_connections 2026-03-09T19:33:48.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.888+0000 7f810baaf640 1 --2- 192.168.123.107:0/909037458 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8104076df0 0x7f8104077250 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.891 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.888+0000 7f810baaf640 1 --2- 192.168.123.107:0/909037458 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8104075ba0 0x7f8104075fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.888+0000 7f810baaf640 1 -- 192.168.123.107:0/909037458 >> 192.168.123.107:0/909037458 conn(0x7f81040fe250 msgr2=0x7f8104100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:48.891 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.888+0000 7f810baaf640 1 -- 192.168.123.107:0/909037458 shutdown_connections 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.889+0000 7f810baaf640 1 -- 192.168.123.107:0/909037458 wait complete. 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.889+0000 7f810baaf640 1 Processor -- start 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.889+0000 7f810baaf640 1 -- start start 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.889+0000 7f810baaf640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8104075ba0 0x7f810419e900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.889+0000 7f810baaf640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8104076df0 0x7f810419ee40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.889+0000 7f810baaf640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f810419f410 con 0x7f8104076df0 2026-03-09T19:33:48.892 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.889+0000 7f810baaf640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f810419f580 con 0x7f8104075ba0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.890+0000 7f8109824640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8104075ba0 0x7f810419e900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.890+0000 7f8109824640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8104075ba0 0x7f810419e900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:53002/0 (socket says 192.168.123.107:53002) 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.890+0000 7f8109824640 1 -- 192.168.123.107:0/2303451143 learned_addr learned my addr 192.168.123.107:0/2303451143 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.890+0000 7f8109023640 1 --2- 192.168.123.107:0/2303451143 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8104076df0 0x7f810419ee40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.890+0000 7f8109824640 1 -- 192.168.123.107:0/2303451143 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8104076df0 msgr2=0x7f810419ee40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.890+0000 7f8109824640 1 --2- 
192.168.123.107:0/2303451143 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8104076df0 0x7f810419ee40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.890+0000 7f8109824640 1 -- 192.168.123.107:0/2303451143 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80f8009660 con 0x7f8104075ba0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.890+0000 7f8109023640 1 --2- 192.168.123.107:0/2303451143 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8104076df0 0x7f810419ee40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.890+0000 7f8109824640 1 --2- 192.168.123.107:0/2303451143 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8104075ba0 0x7f810419e900 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f80f400d900 tx=0x7f80f400ddd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.891+0000 7f80f2ffd640 1 -- 192.168.123.107:0/2303451143 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80f4004490 con 0x7f8104075ba0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.891+0000 7f810baaf640 1 -- 192.168.123.107:0/2303451143 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f81041a3fb0 con 0x7f8104075ba0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.891+0000 7f810baaf640 1 -- 192.168.123.107:0/2303451143 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7f81041a44a0 con 0x7f8104075ba0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.891+0000 7f80f2ffd640 1 -- 192.168.123.107:0/2303451143 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f80f4007610 con 0x7f8104075ba0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.891+0000 7f80f2ffd640 1 -- 192.168.123.107:0/2303451143 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80f4005230 con 0x7f8104075ba0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.892+0000 7f80f2ffd640 1 -- 192.168.123.107:0/2303451143 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f80f400b840 con 0x7f8104075ba0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.892+0000 7f80f2ffd640 1 --2- 192.168.123.107:0/2303451143 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f80e00778e0 0x7f80e0079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.893+0000 7f80f2ffd640 1 -- 192.168.123.107:0/2303451143 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(100..100 src has 1..100) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f80f40998b0 con 0x7f8104075ba0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.893+0000 7f8109023640 1 --2- 192.168.123.107:0/2303451143 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f80e00778e0 0x7f80e0079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.893+0000 7f810baaf640 1 -- 
192.168.123.107:0/2303451143 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f80cc005350 con 0x7f8104075ba0 2026-03-09T19:33:48.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.893+0000 7f8109023640 1 --2- 192.168.123.107:0/2303451143 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f80e00778e0 0x7f80e0079da0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f80f8002c20 tx=0x7f80f80047c0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:48.897 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:48.897+0000 7f80f2ffd640 1 -- 192.168.123.107:0/2303451143 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f80f4061fb0 con 0x7f8104075ba0 2026-03-09T19:33:49.000 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.000+0000 7f810baaf640 1 -- 192.168.123.107:0/2303451143 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f80cc002bf0 con 0x7f80e00778e0 2026-03-09T19:33:49.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.002+0000 7f80f2ffd640 1 -- 192.168.123.107:0/2303451143 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f80cc002bf0 con 0x7f80e00778e0 2026-03-09T19:33:49.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.005+0000 7f80f0f79640 1 -- 192.168.123.107:0/2303451143 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f80e00778e0 msgr2=0x7f80e0079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.005+0000 7f80f0f79640 1 --2- 
192.168.123.107:0/2303451143 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f80e00778e0 0x7f80e0079da0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f80f8002c20 tx=0x7f80f80047c0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.005+0000 7f80f0f79640 1 -- 192.168.123.107:0/2303451143 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8104075ba0 msgr2=0x7f810419e900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.005+0000 7f80f0f79640 1 --2- 192.168.123.107:0/2303451143 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8104075ba0 0x7f810419e900 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f80f400d900 tx=0x7f80f400ddd0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.005+0000 7f80f0f79640 1 -- 192.168.123.107:0/2303451143 shutdown_connections 2026-03-09T19:33:49.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.005+0000 7f80f0f79640 1 --2- 192.168.123.107:0/2303451143 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f80e00778e0 0x7f80e0079da0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.005+0000 7f80f0f79640 1 --2- 192.168.123.107:0/2303451143 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8104076df0 0x7f810419ee40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.005+0000 7f80f0f79640 1 --2- 192.168.123.107:0/2303451143 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8104075ba0 0x7f810419e900 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-09T19:33:49.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.005+0000 7f80f0f79640 1 -- 192.168.123.107:0/2303451143 >> 192.168.123.107:0/2303451143 conn(0x7f81040fe250 msgr2=0x7f81040ffa70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:49.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.005+0000 7f80f0f79640 1 -- 192.168.123.107:0/2303451143 shutdown_connections 2026-03-09T19:33:49.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.006+0000 7f80f0f79640 1 -- 192.168.123.107:0/2303451143 wait complete. 2026-03-09T19:33:49.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.069+0000 7f73a34ca640 1 -- 192.168.123.107:0/2901081700 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f739c103c90 msgr2=0x7f739c106080 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.069+0000 7f73a34ca640 1 --2- 192.168.123.107:0/2901081700 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f739c103c90 0x7f739c106080 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f73900099b0 tx=0x7f739002f220 comp rx=0 tx=0).stop 2026-03-09T19:33:49.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.070+0000 7f73a34ca640 1 -- 192.168.123.107:0/2901081700 shutdown_connections 2026-03-09T19:33:49.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.070+0000 7f73a34ca640 1 --2- 192.168.123.107:0/2901081700 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f739c103c90 0x7f739c106080 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.070+0000 7f73a34ca640 1 --2- 192.168.123.107:0/2901081700 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f739c101360 0x7f739c103750 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.070+0000 7f73a34ca640 1 -- 192.168.123.107:0/2901081700 >> 192.168.123.107:0/2901081700 conn(0x7f739c0fae20 msgr2=0x7f739c0fd280 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:49.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.070+0000 7f73a34ca640 1 -- 192.168.123.107:0/2901081700 shutdown_connections 2026-03-09T19:33:49.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.070+0000 7f73a34ca640 1 -- 192.168.123.107:0/2901081700 wait complete. 2026-03-09T19:33:49.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.071+0000 7f73a34ca640 1 Processor -- start 2026-03-09T19:33:49.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.071+0000 7f73a34ca640 1 -- start start 2026-03-09T19:33:49.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.071+0000 7f73a34ca640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f739c101360 0x7f739c19a510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.071+0000 7f73a34ca640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f739c103c90 0x7f739c19aa50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.071+0000 7f73a34ca640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f739c19b020 con 0x7f739c101360 2026-03-09T19:33:49.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.071+0000 7f73a34ca640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f739c19b190 con 0x7f739c103c90 2026-03-09T19:33:49.071 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.072+0000 7f73a123f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f739c101360 0x7f739c19a510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.072+0000 7f73a123f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f739c101360 0x7f739c19a510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:55712/0 (socket says 192.168.123.107:55712) 2026-03-09T19:33:49.071 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.072+0000 7f73a123f640 1 -- 192.168.123.107:0/282188123 learned_addr learned my addr 192.168.123.107:0/282188123 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:49.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.072+0000 7f73a0a3e640 1 --2- 192.168.123.107:0/282188123 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f739c103c90 0x7f739c19aa50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.073+0000 7f73a123f640 1 -- 192.168.123.107:0/282188123 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f739c103c90 msgr2=0x7f739c19aa50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.073+0000 7f73a123f640 1 --2- 192.168.123.107:0/282188123 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f739c103c90 0x7f739c19aa50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.072 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.073+0000 7f73a123f640 1 -- 192.168.123.107:0/282188123 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7390009660 con 0x7f739c101360 2026-03-09T19:33:49.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.073+0000 7f73a123f640 1 --2- 192.168.123.107:0/282188123 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f739c101360 0x7f739c19a510 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f738c00b730 tx=0x7f738c00bc00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:49.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.073+0000 7f738a7fc640 1 -- 192.168.123.107:0/282188123 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f738c004280 con 0x7f739c101360 2026-03-09T19:33:49.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.073+0000 7f738a7fc640 1 -- 192.168.123.107:0/282188123 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f738c0043e0 con 0x7f739c101360 2026-03-09T19:33:49.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.073+0000 7f73a34ca640 1 -- 192.168.123.107:0/282188123 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f739c19fc30 con 0x7f739c101360 2026-03-09T19:33:49.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.074+0000 7f73a34ca640 1 -- 192.168.123.107:0/282188123 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f739c1a0150 con 0x7f739c101360 2026-03-09T19:33:49.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.076+0000 7f738a7fc640 1 -- 192.168.123.107:0/282188123 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f738c00ca90 con 0x7f739c101360 
2026-03-09T19:33:49.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.076+0000 7f73a34ca640 1 -- 192.168.123.107:0/282188123 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7364005350 con 0x7f739c101360 2026-03-09T19:33:49.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.076+0000 7f738a7fc640 1 -- 192.168.123.107:0/282188123 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f738c00ccb0 con 0x7f739c101360 2026-03-09T19:33:49.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.076+0000 7f738a7fc640 1 --2- 192.168.123.107:0/282188123 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7378077820 0x7f7378079ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.077+0000 7f738a7fc640 1 -- 192.168.123.107:0/282188123 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(100..100 src has 1..100) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f738c099ef0 con 0x7f739c101360 2026-03-09T19:33:49.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.078+0000 7f73a0a3e640 1 --2- 192.168.123.107:0/282188123 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7378077820 0x7f7378079ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.078+0000 7f73a0a3e640 1 --2- 192.168.123.107:0/282188123 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7378077820 0x7f7378079ce0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f7390002c20 tx=0x7f739003a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:33:49.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.079+0000 7f738a7fc640 1 -- 192.168.123.107:0/282188123 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f738c062780 con 0x7f739c101360 2026-03-09T19:33:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:48 vm08.local ceph-mon[103420]: pgmap v214: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:48 vm08.local ceph-mon[103420]: mds.? 
[v2:192.168.123.107:6828/670620212,v1:192.168.123.107:6829/670620212] up:boot 2026-03-09T19:33:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:48 vm08.local ceph-mon[103420]: fsmap cephfs:1 {0=cephfs.vm08.zcaqju=up:active} 3 up:standby 2026-03-09T19:33:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.zkmcyw"}]: dispatch 2026-03-09T19:33:49.189 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.189+0000 7f73a34ca640 1 -- 192.168.123.107:0/282188123 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f7364002bf0 con 0x7f7378077820 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (10m) 5s ago 11m 23.2M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (11m) 5s ago 11m 9.86M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 65624baaa996 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (10m) 43s ago 10m 10.8M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 15bd7844f6a6 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (4m) 5s ago 11m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (4m) 43s ago 10m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (10m) 5s ago 11m 89.7M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:33:49.195 
INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (7s) 5s ago 9m 18.0M - 19.2.3-678-ge911bdeb 654f31e6858e ccdf01cf9a0b 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 starting - - - - 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (9m) 43s ago 9m 28.7M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae67d5e4f4a4 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (9m) 43s ago 9m 179M - 18.2.7-1055-gab47f43c b6fe7eb6a9d0 ae6ef2cf1874 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (5m) 5s ago 12m 605M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (5m) 43s ago 10m 493M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (5m) 5s ago 12m 67.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (5m) 43s ago 10m 56.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (11m) 5s ago 11m 14.7M - 1.5.0 0da6a335fe13 80bb004b27b8 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (10m) 43s ago 10m 16.3M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (4m) 5s ago 10m 226M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (2m) 5s ago 10m 111M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 36b65f1069e5 2026-03-09T19:33:49.195 
INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (112s) 5s ago 10m 95.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 1b8bc1f96eb7 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (87s) 43s ago 10m 181M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bde783ff786f 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (66s) 43s ago 9m 125M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 588104e3b774 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (45s) 43s ago 9m 46.3M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7f9a10e5f49d 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (5m) 5s ago 11m 59.5M - 2.43.0 a07b618ecd1d c09450c20f5f 2026-03-09T19:33:49.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.195+0000 7f738a7fc640 1 -- 192.168.123.107:0/282188123 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7f7364002bf0 con 0x7f7378077820 2026-03-09T19:33:49.198 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.199+0000 7f736bfff640 1 -- 192.168.123.107:0/282188123 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7378077820 msgr2=0x7f7378079ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.198 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.199+0000 7f736bfff640 1 --2- 192.168.123.107:0/282188123 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7378077820 0x7f7378079ce0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f7390002c20 tx=0x7f739003a040 comp rx=0 tx=0).stop 2026-03-09T19:33:49.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.199+0000 7f736bfff640 1 -- 192.168.123.107:0/282188123 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f739c101360 msgr2=0x7f739c19a510 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.200+0000 7f736bfff640 1 --2- 192.168.123.107:0/282188123 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f739c101360 0x7f739c19a510 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f738c00b730 tx=0x7f738c00bc00 comp rx=0 tx=0).stop 2026-03-09T19:33:49.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.200+0000 7f736bfff640 1 -- 192.168.123.107:0/282188123 shutdown_connections 2026-03-09T19:33:49.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.200+0000 7f736bfff640 1 --2- 192.168.123.107:0/282188123 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f7378077820 0x7f7378079ce0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.200+0000 7f736bfff640 1 --2- 192.168.123.107:0/282188123 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f739c103c90 0x7f739c19aa50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.200+0000 7f736bfff640 1 --2- 192.168.123.107:0/282188123 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f739c101360 0x7f739c19a510 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.200+0000 7f736bfff640 1 -- 192.168.123.107:0/282188123 >> 192.168.123.107:0/282188123 conn(0x7f739c0fae20 msgr2=0x7f739c0ff8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:49.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.200+0000 7f736bfff640 1 -- 192.168.123.107:0/282188123 shutdown_connections 2026-03-09T19:33:49.199 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.200+0000 7f736bfff640 1 -- 192.168.123.107:0/282188123 wait complete. 2026-03-09T19:33:49.264 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.265+0000 7f4ac9f89640 1 -- 192.168.123.107:0/1089658090 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ac4072370 msgr2=0x7f4ac410c590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.268 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.265+0000 7f4ac9f89640 1 --2- 192.168.123.107:0/1089658090 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ac4072370 0x7f4ac410c590 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f4ab80099b0 tx=0x7f4ab802f220 comp rx=0 tx=0).stop 2026-03-09T19:33:49.268 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.269+0000 7f4ac9f89640 1 -- 192.168.123.107:0/1089658090 shutdown_connections 2026-03-09T19:33:49.268 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.269+0000 7f4ac9f89640 1 --2- 192.168.123.107:0/1089658090 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ac4072370 0x7f4ac410c590 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.268 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.269+0000 7f4ac9f89640 1 --2- 192.168.123.107:0/1089658090 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4ac40719a0 0x7f4ac4071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.268 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.269+0000 7f4ac9f89640 1 -- 192.168.123.107:0/1089658090 >> 192.168.123.107:0/1089658090 conn(0x7f4ac406d4f0 msgr2=0x7f4ac406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:49.268 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.269+0000 7f4ac9f89640 1 -- 192.168.123.107:0/1089658090 shutdown_connections 
2026-03-09T19:33:49.268 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.269+0000 7f4ac9f89640 1 -- 192.168.123.107:0/1089658090 wait complete. 2026-03-09T19:33:49.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.270+0000 7f4ac9f89640 1 Processor -- start 2026-03-09T19:33:49.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.270+0000 7f4ac9f89640 1 -- start start 2026-03-09T19:33:49.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.270+0000 7f4ac9f89640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ac40719a0 0x7f4ac41a7390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.269 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.270+0000 7f4ac9f89640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4ac4072370 0x7f4ac41a78d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.270+0000 7f4ac9f89640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4ac41a7ea0 con 0x7f4ac40719a0 2026-03-09T19:33:49.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.270+0000 7f4ac9f89640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4ac41a8010 con 0x7f4ac4072370 2026-03-09T19:33:49.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.271+0000 7f4ac3fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4ac4072370 0x7f4ac41a78d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.271+0000 7f4ac3fff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4ac4072370 
0x7f4ac41a78d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:53034/0 (socket says 192.168.123.107:53034) 2026-03-09T19:33:49.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.271+0000 7f4ac3fff640 1 -- 192.168.123.107:0/1429502289 learned_addr learned my addr 192.168.123.107:0/1429502289 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:49.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.271+0000 7f4ac8f87640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ac40719a0 0x7f4ac41a7390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.271+0000 7f4ac3fff640 1 -- 192.168.123.107:0/1429502289 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ac40719a0 msgr2=0x7f4ac41a7390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.271+0000 7f4ac3fff640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ac40719a0 0x7f4ac41a7390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.271+0000 7f4ac3fff640 1 -- 192.168.123.107:0/1429502289 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4ab8009660 con 0x7f4ac4072370 2026-03-09T19:33:49.270 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.271+0000 7f4ac8f87640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ac40719a0 0x7f4ac41a7390 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T19:33:49.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.272+0000 7f4ac3fff640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4ac4072370 0x7f4ac41a78d0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f4ab802f730 tx=0x7f4ab8004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:49.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.272+0000 7f4ac1ffb640 1 -- 192.168.123.107:0/1429502289 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ab803d070 con 0x7f4ac4072370 2026-03-09T19:33:49.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.272+0000 7f4ac9f89640 1 -- 192.168.123.107:0/1429502289 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4ac410ee50 con 0x7f4ac4072370 2026-03-09T19:33:49.271 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.272+0000 7f4ac1ffb640 1 -- 192.168.123.107:0/1429502289 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4ab8038730 con 0x7f4ac4072370 2026-03-09T19:33:49.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.272+0000 7f4ac9f89640 1 -- 192.168.123.107:0/1429502289 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4ac410f340 con 0x7f4ac4072370 2026-03-09T19:33:49.272 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.273+0000 7f4ac1ffb640 1 -- 192.168.123.107:0/1429502289 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ab8031280 con 0x7f4ac4072370 2026-03-09T19:33:49.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.274+0000 7f4ac9f89640 1 -- 192.168.123.107:0/1429502289 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4ac4071da0 con 0x7f4ac4072370 2026-03-09T19:33:49.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.274+0000 7f4ac1ffb640 1 -- 192.168.123.107:0/1429502289 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4ab8049050 con 0x7f4ac4072370 2026-03-09T19:33:49.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.275+0000 7f4ac1ffb640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4aa00776d0 0x7f4aa0079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.273 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.275+0000 7f4ac1ffb640 1 -- 192.168.123.107:0/1429502289 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(100..100 src has 1..100) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f4ab80be500 con 0x7f4ac4072370 2026-03-09T19:33:49.274 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.275+0000 7f4ac8f87640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4aa00776d0 0x7f4aa0079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.274 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.275+0000 7f4ac8f87640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4aa00776d0 0x7f4aa0079b90 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f4aac005eb0 tx=0x7f4aac005e40 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:49.276 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.277+0000 7f4ac1ffb640 1 -- 192.168.123.107:0/1429502289 <== 
mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4ab8086c00 con 0x7f4ac4072370 2026-03-09T19:33:49.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.439+0000 7f4ac9f89640 1 -- 192.168.123.107:0/1429502289 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f4ac410f610 con 0x7f4ac4072370 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.440+0000 7f4ac1ffb640 1 -- 192.168.123.107:0/1429502289 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+845 (secure 0 0 0) 0x7f4ab8086350 con 0x7f4ac4072370 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2, 
2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 18.2.7-1055-gab47f43c (ab47f43c099b2cbae6e21342fe673ce251da54d6) reef (stable)": 2, 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 12 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:33:49.439 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:33:49.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.443+0000 7f4a937fe640 1 -- 192.168.123.107:0/1429502289 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4aa00776d0 msgr2=0x7f4aa0079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.443+0000 7f4a937fe640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4aa00776d0 0x7f4aa0079b90 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f4aac005eb0 tx=0x7f4aac005e40 comp rx=0 tx=0).stop 2026-03-09T19:33:49.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.443+0000 7f4a937fe640 1 -- 192.168.123.107:0/1429502289 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4ac4072370 msgr2=0x7f4ac41a78d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.443+0000 7f4a937fe640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4ac4072370 0x7f4ac41a78d0 secure :-1 
s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f4ab802f730 tx=0x7f4ab8004290 comp rx=0 tx=0).stop 2026-03-09T19:33:49.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.443+0000 7f4a937fe640 1 -- 192.168.123.107:0/1429502289 shutdown_connections 2026-03-09T19:33:49.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.443+0000 7f4a937fe640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4aa00776d0 0x7f4aa0079b90 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.443+0000 7f4a937fe640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4ac4072370 0x7f4ac41a78d0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.443+0000 7f4a937fe640 1 --2- 192.168.123.107:0/1429502289 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ac40719a0 0x7f4ac41a7390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.443+0000 7f4a937fe640 1 -- 192.168.123.107:0/1429502289 >> 192.168.123.107:0/1429502289 conn(0x7f4ac406d4f0 msgr2=0x7f4ac40707b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:49.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.444+0000 7f4a937fe640 1 -- 192.168.123.107:0/1429502289 shutdown_connections 2026-03-09T19:33:49.443 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.444+0000 7f4a937fe640 1 -- 192.168.123.107:0/1429502289 wait complete. 
2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.515+0000 7f961b50f640 1 -- 192.168.123.107:0/3632784635 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96140719c0 msgr2=0x7f9614071dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.515+0000 7f961b50f640 1 --2- 192.168.123.107:0/3632784635 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96140719c0 0x7f9614071dc0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f96100099b0 tx=0x7f961002f240 comp rx=0 tx=0).stop 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.515+0000 7f961b50f640 1 -- 192.168.123.107:0/3632784635 shutdown_connections 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.515+0000 7f961b50f640 1 --2- 192.168.123.107:0/3632784635 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9614072390 0x7f961410c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.515+0000 7f961b50f640 1 --2- 192.168.123.107:0/3632784635 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96140719c0 0x7f9614071dc0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.515+0000 7f961b50f640 1 -- 192.168.123.107:0/3632784635 >> 192.168.123.107:0/3632784635 conn(0x7f961406d4f0 msgr2=0x7f961406f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.516+0000 7f961b50f640 1 -- 192.168.123.107:0/3632784635 shutdown_connections 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.516+0000 7f961b50f640 1 -- 192.168.123.107:0/3632784635 
wait complete. 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.516+0000 7f961b50f640 1 Processor -- start 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.516+0000 7f961b50f640 1 -- start start 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.516+0000 7f961b50f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96140719c0 0x7f96141159b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.516+0000 7f961b50f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9614072390 0x7f9614115ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.516+0000 7f961b50f640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96141173f0 con 0x7f96140719c0 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.516+0000 7f961b50f640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9614117560 con 0x7f9614072390 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.516+0000 7f9618a83640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9614072390 0x7f9614115ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.516+0000 7f9618a83640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9614072390 0x7f9614115ef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.107:53050/0 (socket says 192.168.123.107:53050) 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.516+0000 7f9618a83640 1 -- 192.168.123.107:0/1083740578 learned_addr learned my addr 192.168.123.107:0/1083740578 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.517+0000 7f9618a83640 1 -- 192.168.123.107:0/1083740578 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96140719c0 msgr2=0x7f96141159b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.517+0000 7f9618a83640 1 --2- 192.168.123.107:0/1083740578 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96140719c0 0x7f96141159b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.517+0000 7f9618a83640 1 -- 192.168.123.107:0/1083740578 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9610009660 con 0x7f9614072390 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.517+0000 7f9618a83640 1 --2- 192.168.123.107:0/1083740578 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9614072390 0x7f9614115ef0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f960c0049b0 tx=0x7f960c00d4a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.517+0000 7f96027fc640 1 -- 192.168.123.107:0/1083740578 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f960c00dbb0 con 0x7f9614072390 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.517+0000 
7f961b50f640 1 -- 192.168.123.107:0/1083740578 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9614116550 con 0x7f9614072390 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.517+0000 7f961b50f640 1 -- 192.168.123.107:0/1083740578 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f961411d870 con 0x7f9614072390 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.518+0000 7f96027fc640 1 -- 192.168.123.107:0/1083740578 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f960c00f040 con 0x7f9614072390 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.518+0000 7f96027fc640 1 -- 192.168.123.107:0/1083740578 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f960c013600 con 0x7f9614072390 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.518+0000 7f961b50f640 1 -- 192.168.123.107:0/1083740578 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9614071dc0 con 0x7f9614072390 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.520+0000 7f96027fc640 1 -- 192.168.123.107:0/1083740578 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f960c007500 con 0x7f9614072390 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.520+0000 7f96027fc640 1 --2- 192.168.123.107:0/1083740578 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95f8077990 0x7f95f8079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.520+0000 7f96027fc640 1 -- 
192.168.123.107:0/1083740578 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(100..100 src has 1..100) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f960c0991b0 con 0x7f9614072390 2026-03-09T19:33:49.520 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.520+0000 7f9619284640 1 --2- 192.168.123.107:0/1083740578 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95f8077990 0x7f95f8079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.521 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.522+0000 7f9619284640 1 --2- 192.168.123.107:0/1083740578 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95f8077990 0x7f95f8079e50 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f961002f750 tx=0x7f96100047c0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:49.522 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.523+0000 7f96027fc640 1 -- 192.168.123.107:0/1083740578 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f960c061960 con 0x7f9614072390 2026-03-09T19:33:49.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.645+0000 7f961b50f640 1 -- 192.168.123.107:0/1083740578 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f96141185d0 con 0x7f9614072390 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.647+0000 7f96027fc640 1 -- 192.168.123.107:0/1083740578 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 20 v20) v1 ==== 76+0+1930 (secure 0 0 0) 0x7f960c0187a0 con 0x7f9614072390 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:e20 
2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:btime 2026-03-09T19:33:48:296741+0000 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:epoch 15 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:33:32.839429+0000 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:33:49.646 
INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 41 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 1 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:in 0 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:up {0=24279} 2026-03-09T19:33:49.646 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:stopped 1 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 24279 members: 24279 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{0:24279} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.108:6824/1034426472,v1:192.168.123.108:6825/1034426472] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout: 
2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{-1:34378} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6826/2328013860,v1:192.168.123.108:6827/2328013860] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{-1:34382} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/2856024060,v1:192.168.123.107:6827/2856024060] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{-1:34386} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6828/670620212,v1:192.168.123.107:6829/670620212] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T19:33:49.647 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 20 2026-03-09T19:33:49.661 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.662+0000 7f95e3fff640 1 -- 192.168.123.107:0/1083740578 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95f8077990 msgr2=0x7f95f8079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.661 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.662+0000 7f95e3fff640 1 --2- 192.168.123.107:0/1083740578 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95f8077990 0x7f95f8079e50 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f961002f750 tx=0x7f96100047c0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.661 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.662+0000 7f95e3fff640 1 -- 192.168.123.107:0/1083740578 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9614072390 msgr2=0x7f9614115ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.661 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.662+0000 7f95e3fff640 1 --2- 192.168.123.107:0/1083740578 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f9614072390 0x7f9614115ef0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f960c0049b0 tx=0x7f960c00d4a0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.662+0000 7f95e3fff640 1 -- 192.168.123.107:0/1083740578 shutdown_connections 2026-03-09T19:33:49.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.662+0000 7f95e3fff640 1 --2- 192.168.123.107:0/1083740578 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95f8077990 0x7f95f8079e50 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.662+0000 7f95e3fff640 1 --2- 192.168.123.107:0/1083740578 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9614072390 0x7f9614115ef0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.662+0000 7f95e3fff640 1 --2- 192.168.123.107:0/1083740578 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f96140719c0 0x7f96141159b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.662+0000 7f95e3fff640 1 -- 192.168.123.107:0/1083740578 >> 192.168.123.107:0/1083740578 conn(0x7f961406d4f0 msgr2=0x7f9614070760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:49.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.663+0000 7f95e3fff640 1 -- 192.168.123.107:0/1083740578 shutdown_connections 2026-03-09T19:33:49.662 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.663+0000 7f95e3fff640 1 -- 192.168.123.107:0/1083740578 wait complete. 
2026-03-09T19:33:49.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.729+0000 7f8e11344640 1 -- 192.168.123.107:0/2331126177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e0c0719a0 msgr2=0x7f8e0c071da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.728 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.729+0000 7f8e11344640 1 --2- 192.168.123.107:0/2331126177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e0c0719a0 0x7f8e0c071da0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f8df40099b0 tx=0x7f8df402f240 comp rx=0 tx=0).stop 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.729+0000 7f8e11344640 1 -- 192.168.123.107:0/2331126177 shutdown_connections 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.729+0000 7f8e11344640 1 --2- 192.168.123.107:0/2331126177 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e0c072370 0x7f8e0c10c590 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.729+0000 7f8e11344640 1 --2- 192.168.123.107:0/2331126177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e0c0719a0 0x7f8e0c071da0 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.729+0000 7f8e11344640 1 -- 192.168.123.107:0/2331126177 >> 192.168.123.107:0/2331126177 conn(0x7f8e0c06d4f0 msgr2=0x7f8e0c06f930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.731+0000 7f8e11344640 1 -- 192.168.123.107:0/2331126177 shutdown_connections 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.731+0000 7f8e11344640 1 -- 192.168.123.107:0/2331126177 
wait complete. 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.731+0000 7f8e11344640 1 Processor -- start 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.731+0000 7f8e11344640 1 -- start start 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.731+0000 7f8e11344640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e0c0719a0 0x7f8e0c115970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.732+0000 7f8e11344640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e0c072370 0x7f8e0c115eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.732+0000 7f8e11344640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e0c1173b0 con 0x7f8e0c0719a0 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.732+0000 7f8e11344640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e0c117520 con 0x7f8e0c072370 2026-03-09T19:33:49.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.732+0000 7f8e0b7fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e0c072370 0x7f8e0c115eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.732 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.732+0000 7f8e0b7fe640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e0c072370 0x7f8e0c115eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.107:53080/0 (socket says 192.168.123.107:53080) 2026-03-09T19:33:49.732 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.732+0000 7f8e0b7fe640 1 -- 192.168.123.107:0/1536054742 learned_addr learned my addr 192.168.123.107:0/1536054742 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:49.732 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.733+0000 7f8e0b7fe640 1 -- 192.168.123.107:0/1536054742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e0c0719a0 msgr2=0x7f8e0c115970 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:33:49.732 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.733+0000 7f8e0bfff640 1 --2- 192.168.123.107:0/1536054742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e0c0719a0 0x7f8e0c115970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.732 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.733+0000 7f8e0b7fe640 1 --2- 192.168.123.107:0/1536054742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e0c0719a0 0x7f8e0c115970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.732 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.733+0000 7f8e0b7fe640 1 -- 192.168.123.107:0/1536054742 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8df4009660 con 0x7f8e0c072370 2026-03-09T19:33:49.732 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.733+0000 7f8e0bfff640 1 --2- 192.168.123.107:0/1536054742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e0c0719a0 0x7f8e0c115970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:33:49.732 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.733+0000 7f8e0b7fe640 1 --2- 192.168.123.107:0/1536054742 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e0c072370 0x7f8e0c115eb0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f8e0400ba50 tx=0x7f8e0400bf20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:49.732 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.734+0000 7f8e097fa640 1 -- 192.168.123.107:0/1536054742 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e04002d10 con 0x7f8e0c072370 2026-03-09T19:33:49.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.734+0000 7f8e11344640 1 -- 192.168.123.107:0/1536054742 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8e0c116510 con 0x7f8e0c072370 2026-03-09T19:33:49.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.734+0000 7f8e11344640 1 -- 192.168.123.107:0/1536054742 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8e0c1b5990 con 0x7f8e0c072370 2026-03-09T19:33:49.733 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.735+0000 7f8e097fa640 1 -- 192.168.123.107:0/1536054742 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8e04002e70 con 0x7f8e0c072370 2026-03-09T19:33:49.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.735+0000 7f8e097fa640 1 -- 192.168.123.107:0/1536054742 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e04015470 con 0x7f8e0c072370 2026-03-09T19:33:49.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.736+0000 7f8e097fa640 1 -- 192.168.123.107:0/1536054742 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8e040155d0 con 
0x7f8e0c072370 2026-03-09T19:33:49.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.737+0000 7f8e097fa640 1 --2- 192.168.123.107:0/1536054742 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8ddc0778e0 0x7f8ddc079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.737+0000 7f8e0bfff640 1 --2- 192.168.123.107:0/1536054742 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8ddc0778e0 0x7f8ddc079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.736 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.737+0000 7f8e097fa640 1 -- 192.168.123.107:0/1536054742 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(100..100 src has 1..100) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f8e0409a650 con 0x7f8e0c072370 2026-03-09T19:33:49.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.738+0000 7f8e11344640 1 -- 192.168.123.107:0/1536054742 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8dd8005350 con 0x7f8e0c072370 2026-03-09T19:33:49.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.738+0000 7f8e0bfff640 1 --2- 192.168.123.107:0/1536054742 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8ddc0778e0 0x7f8ddc079da0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f8df4002bf0 tx=0x7f8df40023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:49.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.742+0000 7f8e097fa640 1 -- 192.168.123.107:0/1536054742 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f8e04062d50 con 0x7f8e0c072370 2026-03-09T19:33:49.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.855+0000 7f8e11344640 1 -- 192.168.123.107:0/1536054742 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8dd8002bf0 con 0x7f8ddc0778e0 2026-03-09T19:33:49.859 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:33:49.859 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:49 vm07.local ceph-mon[111841]: from='client.44315 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:49.859 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:49 vm07.local ceph-mon[111841]: from='client.44319 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:49.859 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:49.859 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:49.859 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:49 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/1429502289' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:49.859 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.857+0000 7f8e097fa640 1 -- 192.168.123.107:0/1536054742 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f8dd8002bf0 con 0x7f8ddc0778e0 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [ 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: "mgr", 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: "crash", 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: "osd", 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: "mon" 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: ], 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "14/23 daemons upgraded", 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Currently upgrading mds daemons", 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:33:49.860 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:33:49.862 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.863+0000 7f8e11344640 1 -- 192.168.123.107:0/1536054742 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8ddc0778e0 msgr2=0x7f8ddc079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T19:33:49.862 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.863+0000 7f8e11344640 1 --2- 192.168.123.107:0/1536054742 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8ddc0778e0 0x7f8ddc079da0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f8df4002bf0 tx=0x7f8df40023d0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.863+0000 7f8e11344640 1 -- 192.168.123.107:0/1536054742 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e0c072370 msgr2=0x7f8e0c115eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.864+0000 7f8e11344640 1 --2- 192.168.123.107:0/1536054742 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e0c072370 0x7f8e0c115eb0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f8e0400ba50 tx=0x7f8e0400bf20 comp rx=0 tx=0).stop 2026-03-09T19:33:49.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.864+0000 7f8e11344640 1 -- 192.168.123.107:0/1536054742 shutdown_connections 2026-03-09T19:33:49.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.864+0000 7f8e11344640 1 --2- 192.168.123.107:0/1536054742 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8ddc0778e0 0x7f8ddc079da0 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.864+0000 7f8e11344640 1 --2- 192.168.123.107:0/1536054742 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e0c072370 0x7f8e0c115eb0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.864+0000 7f8e11344640 1 --2- 192.168.123.107:0/1536054742 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e0c0719a0 0x7f8e0c115970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.864+0000 7f8e11344640 1 -- 192.168.123.107:0/1536054742 >> 192.168.123.107:0/1536054742 conn(0x7f8e0c06d4f0 msgr2=0x7f8e0c070730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:49.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.864+0000 7f8e11344640 1 -- 192.168.123.107:0/1536054742 shutdown_connections 2026-03-09T19:33:49.863 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.864+0000 7f8e11344640 1 -- 192.168.123.107:0/1536054742 wait complete. 2026-03-09T19:33:49.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.944+0000 7f65a4da1640 1 -- 192.168.123.107:0/707118423 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a00ff9f0 msgr2=0x7f65a00ffe70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.944+0000 7f65a4da1640 1 --2- 192.168.123.107:0/707118423 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a00ff9f0 0x7f65a00ffe70 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f65900099b0 tx=0x7f659002f220 comp rx=0 tx=0).stop 2026-03-09T19:33:49.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.944+0000 7f65a4da1640 1 -- 192.168.123.107:0/707118423 shutdown_connections 2026-03-09T19:33:49.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.944+0000 7f65a4da1640 1 --2- 192.168.123.107:0/707118423 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a00ff9f0 0x7f65a00ffe70 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.944+0000 7f65a4da1640 1 --2- 
192.168.123.107:0/707118423 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a00fe7f0 0x7f65a00febf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.944 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.944+0000 7f65a4da1640 1 -- 192.168.123.107:0/707118423 >> 192.168.123.107:0/707118423 conn(0x7f65a00f9f80 msgr2=0x7f65a00fc3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:49.946 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.946+0000 7f65a4da1640 1 -- 192.168.123.107:0/707118423 shutdown_connections 2026-03-09T19:33:49.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.951+0000 7f65a4da1640 1 -- 192.168.123.107:0/707118423 wait complete. 2026-03-09T19:33:49.952 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.953+0000 7f65a4da1640 1 Processor -- start 2026-03-09T19:33:49.952 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.954+0000 7f65a4da1640 1 -- start start 2026-03-09T19:33:49.953 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.954+0000 7f65a4da1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a00fe7f0 0x7f65a019a290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.953 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.954+0000 7f65a4da1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a00ff9f0 0x7f65a019a7d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.953 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.954+0000 7f659f7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a00fe7f0 0x7f65a019a290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.953 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.954+0000 7f659f7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a00fe7f0 0x7f65a019a290 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:55776/0 (socket says 192.168.123.107:55776) 2026-03-09T19:33:49.953 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.954+0000 7f659f7fe640 1 -- 192.168.123.107:0/2234480142 learned_addr learned my addr 192.168.123.107:0/2234480142 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:33:49.953 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.955+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f65a019ada0 con 0x7f65a00fe7f0 2026-03-09T19:33:49.953 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.955+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f65a019af10 con 0x7f65a00ff9f0 2026-03-09T19:33:49.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.955+0000 7f659effd640 1 --2- 192.168.123.107:0/2234480142 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a00ff9f0 0x7f65a019a7d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.955+0000 7f659f7fe640 1 -- 192.168.123.107:0/2234480142 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a00ff9f0 msgr2=0x7f65a019a7d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:49.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.955+0000 7f659f7fe640 1 --2- 192.168.123.107:0/2234480142 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a00ff9f0 0x7f65a019a7d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:49.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.955+0000 7f659f7fe640 1 -- 192.168.123.107:0/2234480142 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6590009660 con 0x7f65a00fe7f0 2026-03-09T19:33:49.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.955+0000 7f659f7fe640 1 --2- 192.168.123.107:0/2234480142 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a00fe7f0 0x7f65a019a290 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f658c00d930 tx=0x7f658c00de00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:49.958 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.959+0000 7f659cff9640 1 -- 192.168.123.107:0/2234480142 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f658c004490 con 0x7f65a00fe7f0 2026-03-09T19:33:49.958 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.959+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f65a019f9b0 con 0x7f65a00fe7f0 2026-03-09T19:33:49.958 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.959+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f65a019fed0 con 0x7f65a00fe7f0 2026-03-09T19:33:49.959 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.960+0000 7f659cff9640 1 -- 192.168.123.107:0/2234480142 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f658c007620 con 0x7f65a00fe7f0 2026-03-09T19:33:49.959 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.961+0000 7f659cff9640 1 -- 192.168.123.107:0/2234480142 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f658c002d80 con 0x7f65a00fe7f0 2026-03-09T19:33:49.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.961+0000 7f659cff9640 1 -- 192.168.123.107:0/2234480142 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f658c010480 con 0x7f65a00fe7f0 2026-03-09T19:33:49.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.961+0000 7f659cff9640 1 --2- 192.168.123.107:0/2234480142 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65740777a0 0x7f6574079c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:33:49.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.961+0000 7f659effd640 1 --2- 192.168.123.107:0/2234480142 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65740777a0 0x7f6574079c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:33:49.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.961+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6564005350 con 0x7f65a00fe7f0 2026-03-09T19:33:49.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.965+0000 7f659effd640 1 --2- 192.168.123.107:0/2234480142 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65740777a0 0x7f6574079c60 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f6590002410 tx=0x7f659003a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:33:49.965 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.965+0000 7f659cff9640 1 -- 192.168.123.107:0/2234480142 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(100..100 src has 1..100) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f658c09ab30 con 0x7f65a00fe7f0 2026-03-09T19:33:49.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:49.966+0000 7f659cff9640 1 -- 192.168.123.107:0/2234480142 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f658c05ead0 con 0x7f65a00fe7f0 2026-03-09T19:33:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:33:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:49 vm08.local ceph-mon[103420]: from='client.44315 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:49 vm08.local ceph-mon[103420]: from='client.44319 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:50.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:49 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/1429502289' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:50.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.120+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f6564005600 con 0x7f65a00fe7f0 2026-03-09T19:33:50.124 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.125+0000 7f659cff9640 1 -- 192.168.123.107:0/2234480142 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f658c062ab0 con 0x7f65a00fe7f0 2026-03-09T19:33:50.127 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:33:50.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.128+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65740777a0 msgr2=0x7f6574079c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:50.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.128+0000 7f65a4da1640 1 --2- 192.168.123.107:0/2234480142 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65740777a0 0x7f6574079c60 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f6590002410 tx=0x7f659003a040 comp rx=0 tx=0).stop 2026-03-09T19:33:50.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.128+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a00fe7f0 msgr2=0x7f65a019a290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:33:50.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.128+0000 7f65a4da1640 1 --2- 192.168.123.107:0/2234480142 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a00fe7f0 0x7f65a019a290 secure :-1 
s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f658c00d930 tx=0x7f658c00de00 comp rx=0 tx=0).stop 2026-03-09T19:33:50.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.129+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 shutdown_connections 2026-03-09T19:33:50.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.129+0000 7f65a4da1640 1 --2- 192.168.123.107:0/2234480142 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f65740777a0 0x7f6574079c60 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:50.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.129+0000 7f65a4da1640 1 --2- 192.168.123.107:0/2234480142 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f65a00ff9f0 0x7f65a019a7d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:50.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.129+0000 7f65a4da1640 1 --2- 192.168.123.107:0/2234480142 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65a00fe7f0 0x7f65a019a290 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:33:50.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.129+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 >> 192.168.123.107:0/2234480142 conn(0x7f65a00f9f80 msgr2=0x7f65a01066e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:33:50.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.129+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 shutdown_connections 2026-03-09T19:33:50.128 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:33:50.129+0000 7f65a4da1640 1 -- 192.168.123.107:0/2234480142 wait complete. 
2026-03-09T19:33:50.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:50 vm07.local ceph-mon[111841]: from='client.34394 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:50.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:50 vm07.local ceph-mon[111841]: pgmap v215: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:50.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:50 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1083740578' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:33:50.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:50 vm07.local ceph-mon[111841]: from='client.44329 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:50.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:50.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:50.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:50 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/2234480142' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:33:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:50 vm08.local ceph-mon[103420]: from='client.34394 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:50 vm08.local ceph-mon[103420]: pgmap v215: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:50 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/1083740578' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:33:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:50 vm08.local ceph-mon[103420]: from='client.44329 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:33:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:50 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/2234480142' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm08.zcaqju"]}]: dispatch 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.zcaqju", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: osdmap e101: 6 total, 6 up, 6 in 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: Standby daemon mds.cephfs.vm08.jwsqrf assigned to filesystem cephfs as rank 0 
2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T19:33:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:51 vm08.local ceph-mon[103420]: fsmap cephfs:1/1 {0=cephfs.vm08.jwsqrf=up:replay} 2 up:standby 2026-03-09T19:33:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07[111836]: 2026-03-09T19:33:51.904+0000 7f8a8e586640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm08.zcaqju"]}]: dispatch 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.zcaqju", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: osdmap e101: 6 total, 6 up, 6 in 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: Standby daemon mds.cephfs.vm08.jwsqrf assigned to filesystem cephfs as rank 0 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T19:33:52.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:51 vm07.local ceph-mon[111841]: fsmap cephfs:1/1 {0=cephfs.vm08.jwsqrf=up:replay} 2 up:standby 2026-03-09T19:33:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:52 vm07.local ceph-mon[111841]: Upgrade: It appears safe to stop mds.cephfs.vm08.zcaqju 2026-03-09T19:33:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:52 vm07.local ceph-mon[111841]: Upgrade: Updating mds.cephfs.vm08.zcaqju 2026-03-09T19:33:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:52 vm07.local ceph-mon[111841]: Deploying daemon mds.cephfs.vm08.zcaqju on vm08 2026-03-09T19:33:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:52 vm07.local ceph-mon[111841]: pgmap v216: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:52 vm08.local ceph-mon[103420]: Upgrade: It appears safe to stop 
mds.cephfs.vm08.zcaqju 2026-03-09T19:33:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:52 vm08.local ceph-mon[103420]: Upgrade: Updating mds.cephfs.vm08.zcaqju 2026-03-09T19:33:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:52 vm08.local ceph-mon[103420]: Deploying daemon mds.cephfs.vm08.zcaqju on vm08 2026-03-09T19:33:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:52 vm08.local ceph-mon[103420]: pgmap v216: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:33:55.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:55 vm07.local ceph-mon[111841]: pgmap v218: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 8.5 MiB/s rd, 2 op/s 2026-03-09T19:33:55.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:55 vm08.local ceph-mon[103420]: pgmap v218: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 8.5 MiB/s rd, 2 op/s 2026-03-09T19:33:57.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:56 vm07.local ceph-mon[111841]: pgmap v219: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 4 op/s 2026-03-09T19:33:57.238 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:56 vm08.local ceph-mon[103420]: pgmap v219: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 4 op/s 2026-03-09T19:33:58.162 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:57 vm08.local ceph-mon[103420]: reconnect by client.24305 192.168.144.1:0/3035669578 after 0.00600001 2026-03-09T19:33:58.162 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:57 vm08.local ceph-mon[103420]: reconnect by client.14532 192.168.144.1:0/1264854118 after 0.00600001 2026-03-09T19:33:58.162 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:57 vm08.local ceph-mon[103420]: mds.? 
[v2:192.168.123.108:6826/2328013860,v1:192.168.123.108:6827/2328013860] up:reconnect 2026-03-09T19:33:58.162 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:57 vm08.local ceph-mon[103420]: fsmap cephfs:1/1 {0=cephfs.vm08.jwsqrf=up:reconnect} 2 up:standby 2026-03-09T19:33:58.162 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:57 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:58.162 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:57 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:58.162 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:57 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:58.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:57 vm07.local ceph-mon[111841]: reconnect by client.24305 192.168.144.1:0/3035669578 after 0.00600001 2026-03-09T19:33:58.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:57 vm07.local ceph-mon[111841]: reconnect by client.14532 192.168.144.1:0/1264854118 after 0.00600001 2026-03-09T19:33:58.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:57 vm07.local ceph-mon[111841]: mds.? 
[v2:192.168.123.108:6826/2328013860,v1:192.168.123.108:6827/2328013860] up:reconnect 2026-03-09T19:33:58.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:57 vm07.local ceph-mon[111841]: fsmap cephfs:1/1 {0=cephfs.vm08.jwsqrf=up:reconnect} 2 up:standby 2026-03-09T19:33:58.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:57 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:58.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:57 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:58.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:57 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:33:59.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:58 vm08.local ceph-mon[103420]: pgmap v220: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 5 op/s 2026-03-09T19:33:59.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:58 vm08.local ceph-mon[103420]: mds.? [v2:192.168.123.108:6826/2328013860,v1:192.168.123.108:6827/2328013860] up:rejoin 2026-03-09T19:33:59.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:58 vm08.local ceph-mon[103420]: mds.? 
[v2:192.168.123.108:6824/3904856878,v1:192.168.123.108:6825/3904856878] up:boot 2026-03-09T19:33:59.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:58 vm08.local ceph-mon[103420]: fsmap cephfs:1/1 {0=cephfs.vm08.jwsqrf=up:rejoin} 3 up:standby 2026-03-09T19:33:59.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:58 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.zcaqju"}]: dispatch 2026-03-09T19:33:59.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:58 vm08.local ceph-mon[103420]: daemon mds.cephfs.vm08.jwsqrf is now active in filesystem cephfs as rank 0 2026-03-09T19:33:59.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:58 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:59.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:58 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:58 vm07.local ceph-mon[111841]: pgmap v220: 65 pgs: 65 active+clean; 218 MiB data, 950 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 5 op/s 2026-03-09T19:33:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:58 vm07.local ceph-mon[111841]: mds.? [v2:192.168.123.108:6826/2328013860,v1:192.168.123.108:6827/2328013860] up:rejoin 2026-03-09T19:33:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:58 vm07.local ceph-mon[111841]: mds.? 
[v2:192.168.123.108:6824/3904856878,v1:192.168.123.108:6825/3904856878] up:boot 2026-03-09T19:33:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:58 vm07.local ceph-mon[111841]: fsmap cephfs:1/1 {0=cephfs.vm08.jwsqrf=up:rejoin} 3 up:standby 2026-03-09T19:33:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:58 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.zcaqju"}]: dispatch 2026-03-09T19:33:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:58 vm07.local ceph-mon[111841]: daemon mds.cephfs.vm08.jwsqrf is now active in filesystem cephfs as rank 0 2026-03-09T19:33:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:58 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:33:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:58 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:00.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:59 vm07.local ceph-mon[111841]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T19:34:00.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:59 vm07.local ceph-mon[111841]: Cluster is now healthy 2026-03-09T19:34:00.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:59 vm07.local ceph-mon[111841]: mds.? 
[v2:192.168.123.108:6826/2328013860,v1:192.168.123.108:6827/2328013860] up:active 2026-03-09T19:34:00.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:59 vm07.local ceph-mon[111841]: fsmap cephfs:1 {0=cephfs.vm08.jwsqrf=up:active} 3 up:standby 2026-03-09T19:34:00.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:59 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:00.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:33:59 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:59 vm08.local ceph-mon[103420]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T19:34:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:59 vm08.local ceph-mon[103420]: Cluster is now healthy 2026-03-09T19:34:00.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:59 vm08.local ceph-mon[103420]: mds.? 
[v2:192.168.123.108:6826/2328013860,v1:192.168.123.108:6827/2328013860] up:active 2026-03-09T19:34:00.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:59 vm08.local ceph-mon[103420]: fsmap cephfs:1 {0=cephfs.vm08.jwsqrf=up:active} 3 up:standby 2026-03-09T19:34:00.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:59 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:00.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:33:59 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:01.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: pgmap v221: 65 pgs: 65 active+clean; 218 MiB data, 946 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 102 B/s wr, 6 op/s 2026-03-09T19:34:01.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: Detected new or changed devices on vm08 2026-03-09T19:34:01.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:01.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:01.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:01.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:01.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:01.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:01.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:01.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:01.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:01.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:01.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm08.jwsqrf"]}]: dispatch 2026-03-09T19:34:01.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:01.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.jwsqrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:34:01.729 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:01.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:01 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm07[111836]: 2026-03-09T19:34:01.466+0000 7f8a8e586640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T19:34:01.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: pgmap v221: 65 pgs: 65 active+clean; 218 MiB data, 946 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 102 B/s wr, 6 op/s 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: Detected new or changed devices on vm08 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm08.jwsqrf"]}]: dispatch 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.jwsqrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T19:34:01.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:01 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:02.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:02 vm07.local ceph-mon[111841]: Upgrade: It appears safe to stop mds.cephfs.vm08.jwsqrf 2026-03-09T19:34:02.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:02 vm07.local ceph-mon[111841]: Upgrade: Updating mds.cephfs.vm08.jwsqrf 2026-03-09T19:34:02.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:02 vm07.local ceph-mon[111841]: Deploying daemon mds.cephfs.vm08.jwsqrf on vm08 2026-03-09T19:34:02.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:02 vm07.local ceph-mon[111841]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T19:34:02.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:02 vm07.local ceph-mon[111841]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T19:34:02.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:02 vm07.local ceph-mon[111841]: pgmap v222: 65 pgs: 65 active+clean; 214 MiB data, 942 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 307 B/s wr, 7 op/s 2026-03-09T19:34:02.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:02 vm07.local ceph-mon[111841]: osdmap e102: 6 total, 6 up, 6 in 2026-03-09T19:34:02.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:02 vm07.local ceph-mon[111841]: Standby daemon mds.cephfs.vm07.uizncw assigned to filesystem cephfs as rank 0 2026-03-09T19:34:02.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:02 vm07.local ceph-mon[111841]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T19:34:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:02 vm07.local ceph-mon[111841]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T19:34:02.729 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:02 vm07.local ceph-mon[111841]: fsmap cephfs:1/1 {0=cephfs.vm07.uizncw=up:replay} 2 up:standby 2026-03-09T19:34:02.767 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:02 vm08.local ceph-mon[103420]: Upgrade: It appears safe to stop mds.cephfs.vm08.jwsqrf 2026-03-09T19:34:02.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:02 vm08.local ceph-mon[103420]: Upgrade: Updating mds.cephfs.vm08.jwsqrf 2026-03-09T19:34:02.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:02 vm08.local ceph-mon[103420]: Deploying daemon mds.cephfs.vm08.jwsqrf on vm08 2026-03-09T19:34:02.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:02 vm08.local ceph-mon[103420]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T19:34:02.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:02 vm08.local ceph-mon[103420]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T19:34:02.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:02 vm08.local ceph-mon[103420]: pgmap v222: 65 pgs: 65 active+clean; 214 MiB data, 942 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 307 B/s wr, 7 op/s 2026-03-09T19:34:02.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:02 vm08.local ceph-mon[103420]: osdmap e102: 6 total, 6 up, 6 in 2026-03-09T19:34:02.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:02 vm08.local ceph-mon[103420]: Standby daemon mds.cephfs.vm07.uizncw assigned to filesystem cephfs as rank 0 2026-03-09T19:34:02.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:02 vm08.local ceph-mon[103420]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T19:34:02.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:02 vm08.local ceph-mon[103420]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T19:34:02.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:02 vm08.local ceph-mon[103420]: fsmap cephfs:1/1 {0=cephfs.vm07.uizncw=up:replay} 2 up:standby 2026-03-09T19:34:04.629 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:04 vm08.local ceph-mon[103420]: pgmap v224: 65 pgs: 65 
active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 511 B/s wr, 10 op/s 2026-03-09T19:34:04.629 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:04 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:04.629 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:04 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:34:04.629 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:04 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:04.629 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:04 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:04.629 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:04 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:04 vm07.local ceph-mon[111841]: pgmap v224: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 511 B/s wr, 10 op/s 2026-03-09T19:34:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:04 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:04 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:34:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:04 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 
2026-03-09T19:34:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:04 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:04.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:04 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:05.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:05 vm07.local ceph-mon[111841]: mds.? [v2:192.168.123.108:6826/2173796097,v1:192.168.123.108:6827/2173796097] up:boot 2026-03-09T19:34:05.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:05 vm07.local ceph-mon[111841]: fsmap cephfs:1/1 {0=cephfs.vm07.uizncw=up:replay} 3 up:standby 2026-03-09T19:34:05.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:05 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:34:05.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:05 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:05.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:05 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:06.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:05 vm08.local ceph-mon[103420]: mds.? 
[v2:192.168.123.108:6826/2173796097,v1:192.168.123.108:6827/2173796097] up:boot 2026-03-09T19:34:06.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:05 vm08.local ceph-mon[103420]: fsmap cephfs:1/1 {0=cephfs.vm07.uizncw=up:replay} 3 up:standby 2026-03-09T19:34:06.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:05 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:34:06.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:05 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:06.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:05 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:07.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:07 vm07.local ceph-mon[111841]: pgmap v225: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 511 B/s wr, 10 op/s 2026-03-09T19:34:07.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:07.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:07 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:07.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:07 vm07.local ceph-mon[111841]: reconnect by client.14532 192.168.144.1:0/1264854118 after 0 2026-03-09T19:34:07.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:07 vm07.local ceph-mon[111841]: mds.? 
[v2:192.168.123.107:6826/2856024060,v1:192.168.123.107:6827/2856024060] up:reconnect 2026-03-09T19:34:07.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:07 vm07.local ceph-mon[111841]: fsmap cephfs:1/1 {0=cephfs.vm07.uizncw=up:reconnect} 3 up:standby 2026-03-09T19:34:07.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:07 vm07.local ceph-mon[111841]: reconnect by client.24305 192.168.144.1:0/3035669578 after 0.001 2026-03-09T19:34:07.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:07 vm08.local ceph-mon[103420]: pgmap v225: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 511 B/s wr, 10 op/s 2026-03-09T19:34:07.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:07.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:07 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:07.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:07 vm08.local ceph-mon[103420]: reconnect by client.14532 192.168.144.1:0/1264854118 after 0 2026-03-09T19:34:07.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:07 vm08.local ceph-mon[103420]: mds.? 
[v2:192.168.123.107:6826/2856024060,v1:192.168.123.107:6827/2856024060] up:reconnect 2026-03-09T19:34:07.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:07 vm08.local ceph-mon[103420]: fsmap cephfs:1/1 {0=cephfs.vm07.uizncw=up:reconnect} 3 up:standby 2026-03-09T19:34:07.824 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:07 vm08.local ceph-mon[103420]: reconnect by client.24305 192.168.144.1:0/3035669578 after 0.001 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.536 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: Upgrade: Setting container_image for all mds 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.uizncw"}]': finished 2026-03-09T19:34:08.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": 
"container_image", "who": "mds.cephfs.vm07.zkmcyw"}]: dispatch 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.zkmcyw"}]': finished 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.jwsqrf"}]': finished 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.zcaqju"}]: dispatch 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.zcaqju"}]': finished 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: Upgrade: Scaling up filesystem cephfs max_mds to 2 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 
vm07.local ceph-mon[111841]: pgmap v226: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 511 B/s wr, 10 op/s 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: mds.? [v2:192.168.123.107:6826/2856024060,v1:192.168.123.107:6827/2856024060] up:rejoin 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: fsmap cephfs:1/1 {0=cephfs.vm07.uizncw=up:rejoin} 3 up:standby 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: Upgrade: Setting container_image for all rgw 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.537 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: Upgrade: Setting container_image for all rbd-mirror 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: daemon mds.cephfs.vm07.uizncw is now active in filesystem cephfs as rank 0 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:34:08.537 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:08 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: 
from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:08.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local 
ceph-mon[103420]: Upgrade: Setting container_image for all mds 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.uizncw"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.uizncw"}]': finished 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.zkmcyw"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.zkmcyw"}]': finished 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.jwsqrf"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.jwsqrf"}]': finished 2026-03-09T19:34:08.846 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.zcaqju"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.zcaqju"}]': finished 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: Upgrade: Scaling up filesystem cephfs max_mds to 2 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: pgmap v226: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 511 B/s wr, 10 op/s 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: mds.? 
[v2:192.168.123.107:6826/2856024060,v1:192.168.123.107:6827/2856024060] up:rejoin 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: fsmap cephfs:1/1 {0=cephfs.vm07.uizncw=up:rejoin} 3 up:standby 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: Upgrade: Setting container_image for all rgw 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: Upgrade: Setting container_image for all rbd-mirror 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: daemon mds.cephfs.vm07.uizncw is now active in filesystem cephfs as rank 0 2026-03-09T19:34:08.846 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:34:08.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:08 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:09.691 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: Upgrade: Updating ceph-exporter.vm07 (1/2) 2026-03-09T19:34:09.691 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: Deploying daemon ceph-exporter.vm07 on vm07 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: mds.? 
[v2:192.168.123.107:6826/2856024060,v1:192.168.123.107:6827/2856024060] up:active 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: daemon mds.cephfs.vm07.zkmcyw assigned to filesystem cephfs as rank 1 (now has 2 ranks) 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: Cluster is now healthy 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: fsmap cephfs:1 {0=cephfs.vm07.uizncw=up:active} 3 up:standby 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: fsmap cephfs:2 {0=cephfs.vm07.uizncw=up:active,1=cephfs.vm07.zkmcyw=up:starting} 2 up:standby 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: daemon mds.cephfs.vm07.zkmcyw is now active in filesystem cephfs as rank 1 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", 
"caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:34:09.692 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:09 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:09.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: Upgrade: Updating ceph-exporter.vm07 (1/2) 2026-03-09T19:34:09.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: Deploying daemon ceph-exporter.vm07 on vm07 2026-03-09T19:34:09.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T19:34:09.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: mds.? [v2:192.168.123.107:6826/2856024060,v1:192.168.123.107:6827/2856024060] up:active 2026-03-09T19:34:09.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: daemon mds.cephfs.vm07.zkmcyw assigned to filesystem cephfs as rank 1 (now has 2 ranks) 2026-03-09T19:34:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-09T19:34:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: Cluster is now healthy 2026-03-09T19:34:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: fsmap cephfs:1 {0=cephfs.vm07.uizncw=up:active} 3 up:standby 2026-03-09T19:34:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: fsmap cephfs:2 {0=cephfs.vm07.uizncw=up:active,1=cephfs.vm07.zkmcyw=up:starting} 2 up:standby 2026-03-09T19:34:09.979 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: daemon mds.cephfs.vm07.zkmcyw is now active in filesystem cephfs as rank 1 2026-03-09T19:34:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T19:34:09.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:09 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:10 vm08.local ceph-mon[103420]: Upgrade: Updating ceph-exporter.vm08 (2/2) 2026-03-09T19:34:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:10 vm08.local ceph-mon[103420]: Deploying daemon ceph-exporter.vm08 on vm08 2026-03-09T19:34:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:10 vm08.local ceph-mon[103420]: pgmap v227: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 511 B/s wr, 10 op/s 2026-03-09T19:34:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:10 vm08.local ceph-mon[103420]: mds.? 
[v2:192.168.123.107:6828/670620212,v1:192.168.123.107:6829/670620212] up:active 2026-03-09T19:34:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:10 vm08.local ceph-mon[103420]: fsmap cephfs:2 {0=cephfs.vm07.uizncw=up:active,1=cephfs.vm07.zkmcyw=up:active} 2 up:standby 2026-03-09T19:34:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:10 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:10 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:11.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:10 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:11.187 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:10 vm07.local ceph-mon[111841]: Upgrade: Updating ceph-exporter.vm08 (2/2) 2026-03-09T19:34:11.187 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:10 vm07.local ceph-mon[111841]: Deploying daemon ceph-exporter.vm08 on vm08 2026-03-09T19:34:11.187 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:10 vm07.local ceph-mon[111841]: pgmap v227: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 511 B/s wr, 10 op/s 2026-03-09T19:34:11.187 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:10 vm07.local ceph-mon[111841]: mds.? 
[v2:192.168.123.107:6828/670620212,v1:192.168.123.107:6829/670620212] up:active 2026-03-09T19:34:11.187 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:10 vm07.local ceph-mon[111841]: fsmap cephfs:2 {0=cephfs.vm07.uizncw=up:active,1=cephfs.vm07.zkmcyw=up:active} 2 up:standby 2026-03-09T19:34:11.187 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:10 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:11.187 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:10 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:11.187 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:10 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:12.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:12 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:12 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:12 vm07.local ceph-mon[111841]: pgmap v228: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 614 B/s wr, 11 op/s 2026-03-09T19:34:12.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:12 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:12 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:12 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:12 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:12 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.442 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:12 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:12 vm08.local ceph-mon[103420]: pgmap v228: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 614 B/s wr, 11 op/s 2026-03-09T19:34:12.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 
19:34:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:12.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:12 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: pgmap v229: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 5.1 KiB/s wr, 19 op/s 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm07"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm07"}]': finished 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm08"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm08"}]': finished 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 
19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.529 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:14 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: pgmap v229: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 5.1 KiB/s wr, 19 op/s 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:14.595 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.595 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm07"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm07"}]': finished 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm08"}]: dispatch 2026-03-09T19:34:14.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm08"}]': finished 2026-03-09T19:34:14.596 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:14.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:14.596 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:14 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:15.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:15 vm07.local ceph-mon[111841]: Upgrade: Setting container_image for all ceph-exporter 2026-03-09T19:34:15.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:15 vm07.local ceph-mon[111841]: Upgrade: Setting container_image for all iscsi 2026-03-09T19:34:15.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:15 vm07.local ceph-mon[111841]: Upgrade: Setting container_image for all nfs 2026-03-09T19:34:15.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:15 vm07.local ceph-mon[111841]: Upgrade: Setting container_image for all nvmeof 2026-03-09T19:34:15.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:15 vm07.local ceph-mon[111841]: Upgrade: Updating node-exporter.vm07 (1/2) 
2026-03-09T19:34:15.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:15 vm07.local ceph-mon[111841]: Deploying daemon node-exporter.vm07 on vm07 2026-03-09T19:34:15.587 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:15 vm08.local ceph-mon[103420]: Upgrade: Setting container_image for all ceph-exporter 2026-03-09T19:34:15.587 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:15 vm08.local ceph-mon[103420]: Upgrade: Setting container_image for all iscsi 2026-03-09T19:34:15.587 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:15 vm08.local ceph-mon[103420]: Upgrade: Setting container_image for all nfs 2026-03-09T19:34:15.587 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:15 vm08.local ceph-mon[103420]: Upgrade: Setting container_image for all nvmeof 2026-03-09T19:34:15.587 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:15 vm08.local ceph-mon[103420]: Upgrade: Updating node-exporter.vm07 (1/2) 2026-03-09T19:34:15.588 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:15 vm08.local ceph-mon[103420]: Deploying daemon node-exporter.vm07 on vm07 2026-03-09T19:34:16.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:16 vm08.local ceph-mon[103420]: pgmap v230: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 4.8 KiB/s wr, 15 op/s 2026-03-09T19:34:16.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:16 vm07.local ceph-mon[111841]: pgmap v230: 65 pgs: 65 active+clean; 214 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 4.8 KiB/s wr, 15 op/s 2026-03-09T19:34:19.043 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:18 vm08.local ceph-mon[103420]: pgmap v231: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail; 8.3 MiB/s rd, 4.8 KiB/s wr, 14 op/s 2026-03-09T19:34:19.043 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 
2026-03-09T19:34:19.043 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:19.043 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:19.043 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:34:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:18 vm07.local ceph-mon[111841]: pgmap v231: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail; 8.3 MiB/s rd, 4.8 KiB/s wr, 14 op/s 2026-03-09T19:34:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:34:20.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.191+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1856067463 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4c44103c60 msgr2=0x7f4c441040e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.192 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.191+0000 7f4c4b21b640 1 --2- 192.168.123.107:0/1856067463 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4c44103c60 0x7f4c441040e0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f4c34009a00 tx=0x7f4c3402f290 comp rx=0 tx=0).stop 2026-03-09T19:34:20.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.193+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1856067463 shutdown_connections 2026-03-09T19:34:20.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.193+0000 7f4c4b21b640 1 --2- 192.168.123.107:0/1856067463 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4c44103c60 0x7f4c441040e0 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.193+0000 7f4c4b21b640 1 --2- 192.168.123.107:0/1856067463 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c44102a60 0x7f4c44102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.193+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1856067463 >> 192.168.123.107:0/1856067463 conn(0x7f4c440fe250 msgr2=0x7f4c44100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:20.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.194+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1856067463 shutdown_connections 2026-03-09T19:34:20.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.194+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1856067463 wait complete. 
2026-03-09T19:34:20.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.194+0000 7f4c4b21b640 1 Processor -- start 2026-03-09T19:34:20.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.194+0000 7f4c4b21b640 1 -- start start 2026-03-09T19:34:20.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.195+0000 7f4c4b21b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4c44102a60 0x7f4c4406d350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.195+0000 7f4c4b21b640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c44103c60 0x7f4c4406d890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.195+0000 7f4c4b21b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c4406de60 con 0x7f4c44102a60 2026-03-09T19:34:20.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.195+0000 7f4c4b21b640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c4406dfd0 con 0x7f4c44103c60 2026-03-09T19:34:20.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.195+0000 7f4c3bfff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c44103c60 0x7f4c4406d890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.195+0000 7f4c3bfff640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c44103c60 0x7f4c4406d890 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.107:44694/0 (socket says 192.168.123.107:44694) 2026-03-09T19:34:20.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.195+0000 7f4c3bfff640 1 -- 192.168.123.107:0/1278579587 learned_addr learned my addr 192.168.123.107:0/1278579587 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:20.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.195+0000 7f4c3bfff640 1 -- 192.168.123.107:0/1278579587 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4c44102a60 msgr2=0x7f4c4406d350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.195+0000 7f4c48f90640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4c44102a60 0x7f4c4406d350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.196+0000 7f4c3bfff640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4c44102a60 0x7f4c4406d350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.196+0000 7f4c3bfff640 1 -- 192.168.123.107:0/1278579587 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4c34009660 con 0x7f4c44103c60 2026-03-09T19:34:20.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.196+0000 7f4c48f90640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4c44102a60 0x7f4c4406d350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:34:20.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.196+0000 7f4c3bfff640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c44103c60 0x7f4c4406d890 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f4c3402f7a0 tx=0x7f4c340043d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:20.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.196+0000 7f4c39ffb640 1 -- 192.168.123.107:0/1278579587 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c3402fd00 con 0x7f4c44103c60 2026-03-09T19:34:20.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.196+0000 7f4c39ffb640 1 -- 192.168.123.107:0/1278579587 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4c3402fe60 con 0x7f4c44103c60 2026-03-09T19:34:20.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.196+0000 7f4c39ffb640 1 -- 192.168.123.107:0/1278579587 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c34041b90 con 0x7f4c44103c60 2026-03-09T19:34:20.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.197+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1278579587 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4c44072a40 con 0x7f4c44103c60 2026-03-09T19:34:20.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.197+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1278579587 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4c44072fb0 con 0x7f4c44103c60 2026-03-09T19:34:20.197 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.198+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1278579587 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f4c0c005350 con 0x7f4c44103c60 2026-03-09T19:34:20.197 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.199+0000 7f4c39ffb640 1 -- 192.168.123.107:0/1278579587 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4c34048020 con 0x7f4c44103c60 2026-03-09T19:34:20.198 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.199+0000 7f4c39ffb640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c1c0778e0 0x7f4c1c079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.198 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.199+0000 7f4c39ffb640 1 -- 192.168.123.107:0/1278579587 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f4c340bf210 con 0x7f4c44103c60 2026-03-09T19:34:20.198 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.199+0000 7f4c48f90640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c1c0778e0 0x7f4c1c079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.200+0000 7f4c48f90640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c1c0778e0 0x7f4c1c079da0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f4c44103ac0 tx=0x7f4c2c008040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:20.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.201+0000 7f4c39ffb640 1 -- 192.168.123.107:0/1278579587 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4c340878c0 con 0x7f4c44103c60 2026-03-09T19:34:20.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:19 vm07.local ceph-mon[111841]: Upgrade: Updating node-exporter.vm08 (2/2) 2026-03-09T19:34:20.229 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:19 vm07.local ceph-mon[111841]: Deploying daemon node-exporter.vm08 on vm08 2026-03-09T19:34:20.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.306+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1278579587 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4c0c002bf0 con 0x7f4c1c0778e0 2026-03-09T19:34:20.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.307+0000 7f4c39ffb640 1 -- 192.168.123.107:0/1278579587 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+466 (secure 0 0 0) 0x7f4c0c002bf0 con 0x7f4c1c0778e0 2026-03-09T19:34:20.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.310+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1278579587 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c1c0778e0 msgr2=0x7f4c1c079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.310+0000 7f4c4b21b640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c1c0778e0 0x7f4c1c079da0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f4c44103ac0 tx=0x7f4c2c008040 comp rx=0 tx=0).stop 2026-03-09T19:34:20.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.310+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1278579587 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c44103c60 msgr2=0x7f4c4406d890 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-09T19:34:20.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.310+0000 7f4c4b21b640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c44103c60 0x7f4c4406d890 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f4c3402f7a0 tx=0x7f4c340043d0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.310+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1278579587 shutdown_connections 2026-03-09T19:34:20.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.310+0000 7f4c4b21b640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c1c0778e0 0x7f4c1c079da0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.310+0000 7f4c4b21b640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c44103c60 0x7f4c4406d890 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.311+0000 7f4c4b21b640 1 --2- 192.168.123.107:0/1278579587 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4c44102a60 0x7f4c4406d350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.311+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1278579587 >> 192.168.123.107:0/1278579587 conn(0x7f4c440fe250 msgr2=0x7f4c440ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:20.309 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.311+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1278579587 shutdown_connections 2026-03-09T19:34:20.310 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.311+0000 7f4c4b21b640 1 -- 192.168.123.107:0/1278579587 wait complete. 2026-03-09T19:34:20.318 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:34:20.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:19 vm08.local ceph-mon[103420]: Upgrade: Updating node-exporter.vm08 (2/2) 2026-03-09T19:34:20.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:19 vm08.local ceph-mon[103420]: Deploying daemon node-exporter.vm08 on vm08 2026-03-09T19:34:20.368 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.370+0000 7f10698eb640 1 -- 192.168.123.107:0/3540796599 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1064103c80 msgr2=0x7f1064104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.368 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.370+0000 7f10698eb640 1 --2- 192.168.123.107:0/3540796599 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1064103c80 0x7f1064104100 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f10500099b0 tx=0x7f105002f220 comp rx=0 tx=0).stop 2026-03-09T19:34:20.368 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.370+0000 7f10698eb640 1 -- 192.168.123.107:0/3540796599 shutdown_connections 2026-03-09T19:34:20.368 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.370+0000 7f10698eb640 1 --2- 192.168.123.107:0/3540796599 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1064103c80 0x7f1064104100 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.368 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.370+0000 7f10698eb640 1 --2- 192.168.123.107:0/3540796599 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1064102a80 0x7f1064102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.368 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.370+0000 7f10698eb640 1 -- 192.168.123.107:0/3540796599 >> 192.168.123.107:0/3540796599 conn(0x7f10640fe250 msgr2=0x7f1064100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:20.368 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.370+0000 7f10698eb640 1 -- 192.168.123.107:0/3540796599 shutdown_connections 2026-03-09T19:34:20.368 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.370+0000 7f10698eb640 1 -- 192.168.123.107:0/3540796599 wait complete. 2026-03-09T19:34:20.369 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.371+0000 7f10698eb640 1 Processor -- start 2026-03-09T19:34:20.369 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.371+0000 7f10698eb640 1 -- start start 2026-03-09T19:34:20.369 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.371+0000 7f10698eb640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1064102a80 0x7f106419a430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.369 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.371+0000 7f10698eb640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1064103c80 0x7f106419a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.369 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.371+0000 7f10698eb640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f106419aeb0 con 0x7f1064103c80 2026-03-09T19:34:20.369 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.371+0000 7f10698eb640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f106419b020 con 0x7f1064102a80 2026-03-09T19:34:20.369 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.371+0000 7f1062ffd640 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1064102a80 0x7f106419a430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.372+0000 7f10627fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1064103c80 0x7f106419a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.372+0000 7f10627fc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1064103c80 0x7f106419a970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:32872/0 (socket says 192.168.123.107:32872) 2026-03-09T19:34:20.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.372+0000 7f10627fc640 1 -- 192.168.123.107:0/2130414548 learned_addr learned my addr 192.168.123.107:0/2130414548 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:20.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.372+0000 7f10627fc640 1 -- 192.168.123.107:0/2130414548 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1064102a80 msgr2=0x7f106419a430 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.371+0000 7f1062ffd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1064102a80 0x7f106419a430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:44716/0 (socket says 192.168.123.107:44716) 2026-03-09T19:34:20.370 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.372+0000 7f10627fc640 1 --2- 192.168.123.107:0/2130414548 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1064102a80 0x7f106419a430 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.372+0000 7f10627fc640 1 -- 192.168.123.107:0/2130414548 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1050009660 con 0x7f1064103c80 2026-03-09T19:34:20.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.372+0000 7f1062ffd640 1 --2- 192.168.123.107:0/2130414548 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1064102a80 0x7f106419a430 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:34:20.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.372+0000 7f10627fc640 1 --2- 192.168.123.107:0/2130414548 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1064103c80 0x7f106419a970 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f1050002410 tx=0x7f1050002980 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:20.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.372+0000 7f10688e9640 1 -- 192.168.123.107:0/2130414548 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f105003d070 con 0x7f1064103c80 2026-03-09T19:34:20.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.373+0000 7f10688e9640 1 -- 192.168.123.107:0/2130414548 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f105002fd50 con 0x7f1064103c80 2026-03-09T19:34:20.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.373+0000 7f10698eb640 1 -- 
192.168.123.107:0/2130414548 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f106419faa0 con 0x7f1064103c80 2026-03-09T19:34:20.371 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.373+0000 7f10688e9640 1 -- 192.168.123.107:0/2130414548 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1050041a60 con 0x7f1064103c80 2026-03-09T19:34:20.371 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.373+0000 7f10698eb640 1 -- 192.168.123.107:0/2130414548 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f106419ff40 con 0x7f1064103c80 2026-03-09T19:34:20.376 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.375+0000 7f10688e9640 1 -- 192.168.123.107:0/2130414548 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1050038730 con 0x7f1064103c80 2026-03-09T19:34:20.376 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.375+0000 7f10698eb640 1 -- 192.168.123.107:0/2130414548 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1030005350 con 0x7f1064103c80 2026-03-09T19:34:20.376 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.375+0000 7f10688e9640 1 --2- 192.168.123.107:0/2130414548 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f103c077890 0x7f103c079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.376 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.376+0000 7f10688e9640 1 -- 192.168.123.107:0/2130414548 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f10500be890 con 0x7f1064103c80 2026-03-09T19:34:20.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.379+0000 7f1062ffd640 1 --2- 
192.168.123.107:0/2130414548 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f103c077890 0x7f103c079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.379+0000 7f10688e9640 1 -- 192.168.123.107:0/2130414548 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1050086e40 con 0x7f1064103c80 2026-03-09T19:34:20.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.379+0000 7f1062ffd640 1 --2- 192.168.123.107:0/2130414548 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f103c077890 0x7f103c079d50 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f1064103ae0 tx=0x7f1058005eb0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:20.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.490+0000 7f10698eb640 1 -- 192.168.123.107:0/2130414548 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1030002bf0 con 0x7f103c077890 2026-03-09T19:34:20.489 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.491+0000 7f10688e9640 1 -- 192.168.123.107:0/2130414548 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+466 (secure 0 0 0) 0x7f1030002bf0 con 0x7f103c077890 2026-03-09T19:34:20.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.494+0000 7f10698eb640 1 -- 192.168.123.107:0/2130414548 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f103c077890 msgr2=0x7f103c079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.492 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.494+0000 7f10698eb640 1 --2- 192.168.123.107:0/2130414548 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f103c077890 0x7f103c079d50 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f1064103ae0 tx=0x7f1058005eb0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.494+0000 7f10698eb640 1 -- 192.168.123.107:0/2130414548 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1064103c80 msgr2=0x7f106419a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.494+0000 7f10698eb640 1 --2- 192.168.123.107:0/2130414548 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1064103c80 0x7f106419a970 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f1050002410 tx=0x7f1050002980 comp rx=0 tx=0).stop 2026-03-09T19:34:20.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.494+0000 7f10698eb640 1 -- 192.168.123.107:0/2130414548 shutdown_connections 2026-03-09T19:34:20.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.494+0000 7f10698eb640 1 --2- 192.168.123.107:0/2130414548 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f103c077890 0x7f103c079d50 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.494+0000 7f10698eb640 1 --2- 192.168.123.107:0/2130414548 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1064103c80 0x7f106419a970 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.494+0000 7f10698eb640 1 --2- 192.168.123.107:0/2130414548 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f1064102a80 0x7f106419a430 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.494+0000 7f10698eb640 1 -- 192.168.123.107:0/2130414548 >> 192.168.123.107:0/2130414548 conn(0x7f10640fe250 msgr2=0x7f10640ffd50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:20.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.494+0000 7f10698eb640 1 -- 192.168.123.107:0/2130414548 shutdown_connections 2026-03-09T19:34:20.492 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.494+0000 7f10698eb640 1 -- 192.168.123.107:0/2130414548 wait complete. 2026-03-09T19:34:20.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.553+0000 7f8c8d63d640 1 -- 192.168.123.107:0/3404039188 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c88102a80 msgr2=0x7f8c88102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.553+0000 7f8c8d63d640 1 --2- 192.168.123.107:0/3404039188 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c88102a80 0x7f8c88102e80 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f8c7c0099b0 tx=0x7f8c7c02f220 comp rx=0 tx=0).stop 2026-03-09T19:34:20.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.553+0000 7f8c8d63d640 1 -- 192.168.123.107:0/3404039188 shutdown_connections 2026-03-09T19:34:20.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.553+0000 7f8c8d63d640 1 --2- 192.168.123.107:0/3404039188 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c88103c80 0x7f8c88104100 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.553+0000 7f8c8d63d640 1 --2- 192.168.123.107:0/3404039188 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c88102a80 0x7f8c88102e80 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.553+0000 7f8c8d63d640 1 -- 192.168.123.107:0/3404039188 >> 192.168.123.107:0/3404039188 conn(0x7f8c880fe250 msgr2=0x7f8c88100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:20.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.554+0000 7f8c8d63d640 1 -- 192.168.123.107:0/3404039188 shutdown_connections 2026-03-09T19:34:20.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.554+0000 7f8c8d63d640 1 -- 192.168.123.107:0/3404039188 wait complete. 2026-03-09T19:34:20.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.554+0000 7f8c8d63d640 1 Processor -- start 2026-03-09T19:34:20.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.554+0000 7f8c8d63d640 1 -- start start 2026-03-09T19:34:20.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.555+0000 7f8c8d63d640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c88102a80 0x7f8c8819a470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.555+0000 7f8c8d63d640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c88103c80 0x7f8c8819a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.555+0000 7f8c8d63d640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8c8819af80 con 0x7f8c88102a80 2026-03-09T19:34:20.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.555+0000 7f8c8d63d640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f8c8819b0f0 con 0x7f8c88103c80 2026-03-09T19:34:20.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.555+0000 7f8c86ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c88102a80 0x7f8c8819a470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.555+0000 7f8c86ffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c88102a80 0x7f8c8819a470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:32882/0 (socket says 192.168.123.107:32882) 2026-03-09T19:34:20.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.555+0000 7f8c86ffd640 1 -- 192.168.123.107:0/2461968292 learned_addr learned my addr 192.168.123.107:0/2461968292 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:20.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.555+0000 7f8c86ffd640 1 -- 192.168.123.107:0/2461968292 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c88103c80 msgr2=0x7f8c8819a9b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.555+0000 7f8c867fc640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c88103c80 0x7f8c8819a9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.556+0000 7f8c86ffd640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c88103c80 0x7f8c8819a9b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.556+0000 7f8c86ffd640 1 -- 192.168.123.107:0/2461968292 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8c7c009660 con 0x7f8c88102a80 2026-03-09T19:34:20.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.556+0000 7f8c867fc640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c88103c80 0x7f8c8819a9b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:34:20.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.556+0000 7f8c86ffd640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c88102a80 0x7f8c8819a470 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f8c7c002980 tx=0x7f8c7c0029b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:20.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.556+0000 7f8c6bfff640 1 -- 192.168.123.107:0/2461968292 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8c7c03d070 con 0x7f8c88102a80 2026-03-09T19:34:20.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.556+0000 7f8c6bfff640 1 -- 192.168.123.107:0/2461968292 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8c7c02fd50 con 0x7f8c88102a80 2026-03-09T19:34:20.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.556+0000 7f8c6bfff640 1 -- 192.168.123.107:0/2461968292 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8c7c041a10 con 0x7f8c88102a80 2026-03-09T19:34:20.560 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.557+0000 7f8c8d63d640 1 -- 
192.168.123.107:0/2461968292 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8c8819fb30 con 0x7f8c88102a80 2026-03-09T19:34:20.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.557+0000 7f8c8d63d640 1 -- 192.168.123.107:0/2461968292 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8c8819ffa0 con 0x7f8c88102a80 2026-03-09T19:34:20.561 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.563+0000 7f8c6bfff640 1 -- 192.168.123.107:0/2461968292 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8c7c041b70 con 0x7f8c88102a80 2026-03-09T19:34:20.562 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.564+0000 7f8c6bfff640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c600776d0 0x7f8c60079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.562 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.564+0000 7f8c867fc640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c600776d0 0x7f8c60079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.562 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.564+0000 7f8c6bfff640 1 -- 192.168.123.107:0/2461968292 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f8c7c0be9a0 con 0x7f8c88102a80 2026-03-09T19:34:20.562 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.565+0000 7f8c867fc640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c600776d0 0x7f8c60079b90 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 
crypto rx=0x7f8c8819b990 tx=0x7f8c74005e30 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:20.563 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.565+0000 7f8c8d63d640 1 -- 192.168.123.107:0/2461968292 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8c54005350 con 0x7f8c88102a80 2026-03-09T19:34:20.566 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.568+0000 7f8c6bfff640 1 -- 192.168.123.107:0/2461968292 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8c7c087050 con 0x7f8c88102a80 2026-03-09T19:34:20.666 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.668+0000 7f8c8d63d640 1 -- 192.168.123.107:0/2461968292 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f8c54002bf0 con 0x7f8c600776d0 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.674+0000 7f8c6bfff640 1 -- 192.168.123.107:0/2461968292 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f8c54002bf0 con 0x7f8c600776d0 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (11m) 8s ago 11m 23.2M - 0.25.0 c8568f914cd2 976a1914d389 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (11s) 8s ago 12m 9911k - 19.2.3-678-ge911bdeb 654f31e6858e 143b4d120468 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (10s) 9s ago 11m 9638k - 19.2.3-678-ge911bdeb 
654f31e6858e bb4c6a92b1d5 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (5m) 8s ago 12m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (5m) 9s ago 11m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (11m) 8s ago 11m 90.2M - 9.4.7 954c08fa6188 34d173509259 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (39s) 8s ago 9m 169M - 19.2.3-678-ge911bdeb 654f31e6858e ccdf01cf9a0b 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (33s) 8s ago 9m 20.2M - 19.2.3-678-ge911bdeb 654f31e6858e 15f71649e080 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (16s) 9s ago 9m 15.4M - 19.2.3-678-ge911bdeb 654f31e6858e b1b66121ee0c 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (23s) 9s ago 9m 20.6M - 19.2.3-678-ge911bdeb 654f31e6858e 6c9f1adefb0b 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (6m) 8s ago 12m 607M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (5m) 9s ago 11m 494M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (5m) 8s ago 12m 68.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (5m) 9s ago 11m 56.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:34:20.672 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 starting - - - - 
2026-03-09T19:34:20.673 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (11m) 9s ago 11m 16.8M - 1.5.0 0da6a335fe13 fbd23e6c240f 2026-03-09T19:34:20.673 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (5m) 8s ago 11m 229M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656 2026-03-09T19:34:20.673 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (2m) 8s ago 10m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 36b65f1069e5 2026-03-09T19:34:20.673 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (2m) 8s ago 10m 102M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 1b8bc1f96eb7 2026-03-09T19:34:20.673 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (119s) 9s ago 10m 192M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bde783ff786f 2026-03-09T19:34:20.673 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (98s) 9s ago 10m 132M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 588104e3b774 2026-03-09T19:34:20.673 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (76s) 9s ago 10m 119M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7f9a10e5f49d 2026-03-09T19:34:20.673 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (6m) 8s ago 11m 60.0M - 2.43.0 a07b618ecd1d c09450c20f5f 2026-03-09T19:34:20.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.677+0000 7f8c8d63d640 1 -- 192.168.123.107:0/2461968292 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c600776d0 msgr2=0x7f8c60079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.677+0000 7f8c8d63d640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c600776d0 0x7f8c60079b90 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f8c8819b990 tx=0x7f8c74005e30 comp rx=0 tx=0).stop 2026-03-09T19:34:20.675 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.677+0000 7f8c8d63d640 1 -- 192.168.123.107:0/2461968292 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c88102a80 msgr2=0x7f8c8819a470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.677+0000 7f8c8d63d640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c88102a80 0x7f8c8819a470 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f8c7c002980 tx=0x7f8c7c0029b0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.678+0000 7f8c8d63d640 1 -- 192.168.123.107:0/2461968292 shutdown_connections 2026-03-09T19:34:20.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.678+0000 7f8c8d63d640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f8c600776d0 0x7f8c60079b90 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.678+0000 7f8c8d63d640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c88103c80 0x7f8c8819a9b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.678+0000 7f8c8d63d640 1 --2- 192.168.123.107:0/2461968292 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8c88102a80 0x7f8c8819a470 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.678+0000 7f8c8d63d640 1 -- 192.168.123.107:0/2461968292 >> 192.168.123.107:0/2461968292 conn(0x7f8c880fe250 msgr2=0x7f8c880ffa10 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T19:34:20.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.678+0000 7f8c8d63d640 1 -- 192.168.123.107:0/2461968292 shutdown_connections 2026-03-09T19:34:20.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.678+0000 7f8c8d63d640 1 -- 192.168.123.107:0/2461968292 wait complete. 2026-03-09T19:34:20.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.740+0000 7f1f5c98c640 1 -- 192.168.123.107:0/2779207469 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1f54102a60 msgr2=0x7f1f54102e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.740+0000 7f1f5c98c640 1 --2- 192.168.123.107:0/2779207469 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1f54102a60 0x7f1f54102e60 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f1f400099b0 tx=0x7f1f4002f220 comp rx=0 tx=0).stop 2026-03-09T19:34:20.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.740+0000 7f1f5c98c640 1 -- 192.168.123.107:0/2779207469 shutdown_connections 2026-03-09T19:34:20.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.740+0000 7f1f5c98c640 1 --2- 192.168.123.107:0/2779207469 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1f54103c60 0x7f1f541040e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.740+0000 7f1f5c98c640 1 --2- 192.168.123.107:0/2779207469 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1f54102a60 0x7f1f54102e60 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.740+0000 7f1f5c98c640 1 -- 192.168.123.107:0/2779207469 >> 192.168.123.107:0/2779207469 conn(0x7f1f540fe250 msgr2=0x7f1f54100670 unknown 
:-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:20.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.741+0000 7f1f5c98c640 1 -- 192.168.123.107:0/2779207469 shutdown_connections 2026-03-09T19:34:20.738 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.741+0000 7f1f5c98c640 1 -- 192.168.123.107:0/2779207469 wait complete. 2026-03-09T19:34:20.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.741+0000 7f1f5c98c640 1 Processor -- start 2026-03-09T19:34:20.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.741+0000 7f1f5c98c640 1 -- start start 2026-03-09T19:34:20.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.741+0000 7f1f5c98c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1f54102a60 0x7f1f5419a480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.742+0000 7f1f5c98c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1f54103c60 0x7f1f5419a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.742+0000 7f1f5c98c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1f5419af90 con 0x7f1f54102a60 2026-03-09T19:34:20.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.742+0000 7f1f5c98c640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1f5419b100 con 0x7f1f54103c60 2026-03-09T19:34:20.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.742+0000 7f1f5a701640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1f54102a60 0x7f1f5419a480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T19:34:20.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.742+0000 7f1f5a701640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1f54102a60 0x7f1f5419a480 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:32888/0 (socket says 192.168.123.107:32888) 2026-03-09T19:34:20.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.742+0000 7f1f5a701640 1 -- 192.168.123.107:0/3852095728 learned_addr learned my addr 192.168.123.107:0/3852095728 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:20.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.742+0000 7f1f5a701640 1 -- 192.168.123.107:0/3852095728 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1f54103c60 msgr2=0x7f1f5419a9c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:34:20.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.742+0000 7f1f59f00640 1 --2- 192.168.123.107:0/3852095728 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1f54103c60 0x7f1f5419a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.742+0000 7f1f5a701640 1 --2- 192.168.123.107:0/3852095728 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1f54103c60 0x7f1f5419a9c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.742+0000 7f1f5a701640 1 -- 192.168.123.107:0/3852095728 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1f40009660 con 0x7f1f54102a60 2026-03-09T19:34:20.740 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.742+0000 7f1f5a701640 1 --2- 192.168.123.107:0/3852095728 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1f54102a60 0x7f1f5419a480 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f1f40002940 tx=0x7f1f40002970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:20.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.743+0000 7f1f3f7fe640 1 -- 192.168.123.107:0/3852095728 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1f4003d070 con 0x7f1f54102a60 2026-03-09T19:34:20.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.743+0000 7f1f3f7fe640 1 -- 192.168.123.107:0/3852095728 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f1f4002fd50 con 0x7f1f54102a60 2026-03-09T19:34:20.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.743+0000 7f1f5c98c640 1 -- 192.168.123.107:0/3852095728 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1f5419fb40 con 0x7f1f54102a60 2026-03-09T19:34:20.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.743+0000 7f1f5c98c640 1 -- 192.168.123.107:0/3852095728 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1f541a0030 con 0x7f1f54102a60 2026-03-09T19:34:20.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.743+0000 7f1f3f7fe640 1 -- 192.168.123.107:0/3852095728 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1f40041a50 con 0x7f1f54102a60 2026-03-09T19:34:20.743 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.744+0000 7f1f3f7fe640 1 -- 192.168.123.107:0/3852095728 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1f40049050 con 0x7f1f54102a60 
2026-03-09T19:34:20.743 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.745+0000 7f1f3f7fe640 1 --2- 192.168.123.107:0/3852095728 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1f280778e0 0x7f1f28079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.743 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.745+0000 7f1f3f7fe640 1 -- 192.168.123.107:0/3852095728 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f1f400bf730 con 0x7f1f54102a60 2026-03-09T19:34:20.743 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.745+0000 7f1f59f00640 1 --2- 192.168.123.107:0/3852095728 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1f280778e0 0x7f1f28079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.743 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.745+0000 7f1f5c98c640 1 -- 192.168.123.107:0/3852095728 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1f24005350 con 0x7f1f54102a60 2026-03-09T19:34:20.746 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.746+0000 7f1f59f00640 1 --2- 192.168.123.107:0/3852095728 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1f280778e0 0x7f1f28079da0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f1f4c005fd0 tx=0x7f1f4c005d60 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:20.746 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.748+0000 7f1f3f7fe640 1 -- 192.168.123.107:0/3852095728 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f1f40087dc0 con 0x7f1f54102a60 2026-03-09T19:34:20.892 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.894+0000 7f1f5c98c640 1 -- 192.168.123.107:0/3852095728 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f1f24005e10 con 0x7f1f54102a60 2026-03-09T19:34:20.897 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.899+0000 7f1f3f7fe640 1 -- 192.168.123.107:0/3852095728 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f1f40087510 con 0x7f1f54102a60 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:34:20.898 
INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:34:20.898 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:34:20.900 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.902+0000 7f1f5c98c640 1 -- 192.168.123.107:0/3852095728 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1f280778e0 msgr2=0x7f1f28079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.900 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.902+0000 7f1f5c98c640 1 --2- 192.168.123.107:0/3852095728 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1f280778e0 0x7f1f28079da0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f1f4c005fd0 tx=0x7f1f4c005d60 comp rx=0 tx=0).stop 2026-03-09T19:34:20.900 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.902+0000 7f1f5c98c640 1 -- 192.168.123.107:0/3852095728 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1f54102a60 msgr2=0x7f1f5419a480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.900 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.902+0000 7f1f5c98c640 1 --2- 192.168.123.107:0/3852095728 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1f54102a60 0x7f1f5419a480 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f1f40002940 tx=0x7f1f40002970 comp rx=0 tx=0).stop 2026-03-09T19:34:20.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.903+0000 7f1f5c98c640 1 -- 192.168.123.107:0/3852095728 shutdown_connections 2026-03-09T19:34:20.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.903+0000 7f1f5c98c640 1 --2- 192.168.123.107:0/3852095728 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f1f280778e0 0x7f1f28079da0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.903+0000 7f1f5c98c640 1 --2- 192.168.123.107:0/3852095728 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1f54103c60 0x7f1f5419a9c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.903+0000 7f1f5c98c640 1 --2- 192.168.123.107:0/3852095728 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1f54102a60 0x7f1f5419a480 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.903+0000 7f1f5c98c640 1 -- 192.168.123.107:0/3852095728 >> 192.168.123.107:0/3852095728 conn(0x7f1f540fe250 msgr2=0x7f1f540ffd90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:20.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.903+0000 7f1f5c98c640 1 -- 192.168.123.107:0/3852095728 shutdown_connections 2026-03-09T19:34:20.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.903+0000 7f1f5c98c640 1 -- 192.168.123.107:0/3852095728 wait complete. 
2026-03-09T19:34:20.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.965+0000 7f0d190bf640 1 -- 192.168.123.107:0/2348856308 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d14103c80 msgr2=0x7f0d14104100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.965+0000 7f0d190bf640 1 --2- 192.168.123.107:0/2348856308 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d14103c80 0x7f0d14104100 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f0d080099b0 tx=0x7f0d0802f240 comp rx=0 tx=0).stop 2026-03-09T19:34:20.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.965+0000 7f0d190bf640 1 -- 192.168.123.107:0/2348856308 shutdown_connections 2026-03-09T19:34:20.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.965+0000 7f0d190bf640 1 --2- 192.168.123.107:0/2348856308 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d14103c80 0x7f0d14104100 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.965+0000 7f0d190bf640 1 --2- 192.168.123.107:0/2348856308 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d14102a80 0x7f0d14102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.965+0000 7f0d190bf640 1 -- 192.168.123.107:0/2348856308 >> 192.168.123.107:0/2348856308 conn(0x7f0d140fe250 msgr2=0x7f0d14100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:20.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.965+0000 7f0d190bf640 1 -- 192.168.123.107:0/2348856308 shutdown_connections 2026-03-09T19:34:20.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.965+0000 7f0d190bf640 1 -- 192.168.123.107:0/2348856308 
wait complete. 2026-03-09T19:34:20.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.966+0000 7f0d190bf640 1 Processor -- start 2026-03-09T19:34:20.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.966+0000 7f0d190bf640 1 -- start start 2026-03-09T19:34:20.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.966+0000 7f0d190bf640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d14102a80 0x7f0d1419a460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.966+0000 7f0d190bf640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d14103c80 0x7f0d1419a9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.966+0000 7f0d190bf640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d1419af70 con 0x7f0d14102a80 2026-03-09T19:34:20.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.966+0000 7f0d190bf640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d1419b0e0 con 0x7f0d14103c80 2026-03-09T19:34:20.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.966+0000 7f0d12d76640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d14102a80 0x7f0d1419a460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.966+0000 7f0d12d76640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d14102a80 0x7f0d1419a460 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.107:32902/0 (socket says 192.168.123.107:32902) 2026-03-09T19:34:20.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.966+0000 7f0d12d76640 1 -- 192.168.123.107:0/1900449070 learned_addr learned my addr 192.168.123.107:0/1900449070 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:20.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.967+0000 7f0d12575640 1 --2- 192.168.123.107:0/1900449070 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d14103c80 0x7f0d1419a9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.967+0000 7f0d12d76640 1 -- 192.168.123.107:0/1900449070 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d14103c80 msgr2=0x7f0d1419a9a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:20.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.967+0000 7f0d12d76640 1 --2- 192.168.123.107:0/1900449070 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d14103c80 0x7f0d1419a9a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:20.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.967+0000 7f0d12d76640 1 -- 192.168.123.107:0/1900449070 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0d08009660 con 0x7f0d14102a80 2026-03-09T19:34:20.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.967+0000 7f0d12d76640 1 --2- 192.168.123.107:0/1900449070 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d14102a80 0x7f0d1419a460 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f0cfc00e990 tx=0x7f0cfc00ee60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-09T19:34:20.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.967+0000 7f0cf3fff640 1 -- 192.168.123.107:0/1900449070 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0cfc009800 con 0x7f0d14102a80 2026-03-09T19:34:20.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.967+0000 7f0cf3fff640 1 -- 192.168.123.107:0/1900449070 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0cfc004590 con 0x7f0d14102a80 2026-03-09T19:34:20.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.967+0000 7f0d190bf640 1 -- 192.168.123.107:0/1900449070 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0d1419fb80 con 0x7f0d14102a80 2026-03-09T19:34:20.967 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.968+0000 7f0cf3fff640 1 -- 192.168.123.107:0/1900449070 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0cfc010640 con 0x7f0d14102a80 2026-03-09T19:34:20.967 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.968+0000 7f0d190bf640 1 -- 192.168.123.107:0/1900449070 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0d141a00d0 con 0x7f0d14102a80 2026-03-09T19:34:20.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.969+0000 7f0cf3fff640 1 -- 192.168.123.107:0/1900449070 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0cfc0107a0 con 0x7f0d14102a80 2026-03-09T19:34:20.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.969+0000 7f0d190bf640 1 -- 192.168.123.107:0/1900449070 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0ce0005350 con 0x7f0d14102a80 2026-03-09T19:34:20.970 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.970+0000 7f0cf3fff640 1 --2- 192.168.123.107:0/1900449070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0cec0778e0 0x7f0cec079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:20.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.970+0000 7f0cf3fff640 1 -- 192.168.123.107:0/1900449070 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f0cfc014070 con 0x7f0d14102a80 2026-03-09T19:34:20.971 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.973+0000 7f0d12575640 1 --2- 192.168.123.107:0/1900449070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0cec0778e0 0x7f0cec079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:20.971 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.973+0000 7f0d12575640 1 --2- 192.168.123.107:0/1900449070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0cec0778e0 0x7f0cec079da0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f0d1419b980 tx=0x7f0d0803a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:20.971 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:20.973+0000 7f0cf3fff640 1 -- 192.168.123.107:0/1900449070 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0cfc0639a0 con 0x7f0d14102a80 2026-03-09T19:34:21.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.093+0000 7f0d190bf640 1 -- 192.168.123.107:0/1900449070 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f0ce0005e10 con 
0x7f0d14102a80 2026-03-09T19:34:21.092 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.094+0000 7f0cf3fff640 1 -- 192.168.123.107:0/1900449070 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 33 v33) v1 ==== 76+0+1990 (secure 0 0 0) 0x7f0cfc0630f0 con 0x7f0d14102a80 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:e33 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:btime 2026-03-09T19:34:09:773803+0000 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:epoch 33 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:34:09.773799+0000 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 
300 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 102 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:up {0=34382,1=34386} 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 34382 members: 34382,34386 2026-03-09T19:34:21.093 
INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{0:34382} state up:active seq 9 join_fscid=1 addr [v2:192.168.123.107:6826/2856024060,v1:192.168.123.107:6827/2856024060] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{1:34386} state up:active seq 7 join_fscid=1 addr [v2:192.168.123.107:6828/670620212,v1:192.168.123.107:6829/670620212] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{-1:34410} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6826/2173796097,v1:192.168.123.108:6827/2173796097] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{-1:44337} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3904856878,v1:192.168.123.108:6825/3904856878] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T19:34:21.093 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 33 2026-03-09T19:34:21.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.097+0000 7f0d190bf640 1 -- 192.168.123.107:0/1900449070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0cec0778e0 msgr2=0x7f0cec079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:21.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.097+0000 7f0d190bf640 1 --2- 192.168.123.107:0/1900449070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0cec0778e0 0x7f0cec079da0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f0d1419b980 tx=0x7f0d0803a040 comp rx=0 tx=0).stop 
2026-03-09T19:34:21.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.097+0000 7f0d190bf640 1 -- 192.168.123.107:0/1900449070 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d14102a80 msgr2=0x7f0d1419a460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:21.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.097+0000 7f0d190bf640 1 --2- 192.168.123.107:0/1900449070 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d14102a80 0x7f0d1419a460 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f0cfc00e990 tx=0x7f0cfc00ee60 comp rx=0 tx=0).stop 2026-03-09T19:34:21.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.098+0000 7f0d190bf640 1 -- 192.168.123.107:0/1900449070 shutdown_connections 2026-03-09T19:34:21.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.098+0000 7f0d190bf640 1 --2- 192.168.123.107:0/1900449070 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0cec0778e0 0x7f0cec079da0 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.098+0000 7f0d190bf640 1 --2- 192.168.123.107:0/1900449070 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d14103c80 0x7f0d1419a9a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.098+0000 7f0d190bf640 1 --2- 192.168.123.107:0/1900449070 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d14102a80 0x7f0d1419a460 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.098+0000 7f0d190bf640 1 -- 192.168.123.107:0/1900449070 >> 192.168.123.107:0/1900449070 conn(0x7f0d140fe250 msgr2=0x7f0d140ffd50 unknown 
:-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:21.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.098+0000 7f0d190bf640 1 -- 192.168.123.107:0/1900449070 shutdown_connections 2026-03-09T19:34:21.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.098+0000 7f0d190bf640 1 -- 192.168.123.107:0/1900449070 wait complete. 2026-03-09T19:34:21.156 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:20 vm07.local ceph-mon[111841]: pgmap v232: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail; 1.9 MiB/s rd, 4.8 KiB/s wr, 12 op/s 2026-03-09T19:34:21.157 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:20 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/3852095728' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:21.157 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.158+0000 7f0d779f1640 1 -- 192.168.123.107:0/2104845513 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d7010b920 msgr2=0x7f0d7010bd20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:21.157 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.158+0000 7f0d779f1640 1 --2- 192.168.123.107:0/2104845513 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d7010b920 0x7f0d7010bd20 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f0d640099b0 tx=0x7f0d6402f240 comp rx=0 tx=0).stop 2026-03-09T19:34:21.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.160+0000 7f0d779f1640 1 -- 192.168.123.107:0/2104845513 shutdown_connections 2026-03-09T19:34:21.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.161+0000 7f0d779f1640 1 --2- 192.168.123.107:0/2104845513 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d7010c6a0 0x7f0d7010cb00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.160 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.161+0000 7f0d779f1640 1 --2- 192.168.123.107:0/2104845513 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d7010b920 0x7f0d7010bd20 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.160 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.162+0000 7f0d779f1640 1 -- 192.168.123.107:0/2104845513 >> 192.168.123.107:0/2104845513 conn(0x7f0d7006a890 msgr2=0x7f0d7006acc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:21.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.163+0000 7f0d779f1640 1 -- 192.168.123.107:0/2104845513 shutdown_connections 2026-03-09T19:34:21.161 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.163+0000 7f0d779f1640 1 -- 192.168.123.107:0/2104845513 wait complete. 2026-03-09T19:34:21.162 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.164+0000 7f0d779f1640 1 Processor -- start 2026-03-09T19:34:21.163 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.165+0000 7f0d779f1640 1 -- start start 2026-03-09T19:34:21.163 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.165+0000 7f0d779f1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d7010b920 0x7f0d701a2ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:21.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.166+0000 7f0d779f1640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d7010c6a0 0x7f0d701a33e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:21.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.166+0000 7f0d779f1640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d701a39b0 con 0x7f0d7010b920 2026-03-09T19:34:21.164 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.166+0000 7f0d75766640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d7010b920 0x7f0d701a2ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:21.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.166+0000 7f0d75766640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d7010b920 0x7f0d701a2ea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:32928/0 (socket says 192.168.123.107:32928) 2026-03-09T19:34:21.164 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.166+0000 7f0d75766640 1 -- 192.168.123.107:0/882700506 learned_addr learned my addr 192.168.123.107:0/882700506 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:21.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.167+0000 7f0d74f65640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d7010c6a0 0x7f0d701a33e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:21.165 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.167+0000 7f0d779f1640 1 -- 192.168.123.107:0/882700506 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d701a3b20 con 0x7f0d7010c6a0 2026-03-09T19:34:21.166 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.168+0000 7f0d74f65640 1 -- 192.168.123.107:0/882700506 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d7010b920 msgr2=0x7f0d701a2ea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:21.166 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.169+0000 
7f0d74f65640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d7010b920 0x7f0d701a2ea0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.167 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.169+0000 7f0d74f65640 1 -- 192.168.123.107:0/882700506 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0d64009660 con 0x7f0d7010c6a0 2026-03-09T19:34:21.167 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.169+0000 7f0d75766640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d7010b920 0x7f0d701a2ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:34:21.168 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.171+0000 7f0d74f65640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d7010c6a0 0x7f0d701a33e0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f0d5c00b4d0 tx=0x7f0d5c00b9a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:21.169 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.171+0000 7f0d627fc640 1 -- 192.168.123.107:0/882700506 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0d5c004280 con 0x7f0d7010c6a0 2026-03-09T19:34:21.169 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.172+0000 7f0d627fc640 1 -- 192.168.123.107:0/882700506 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0d5c0043e0 con 0x7f0d7010c6a0 2026-03-09T19:34:21.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.172+0000 7f0d779f1640 1 -- 192.168.123.107:0/882700506 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7f0d701a85c0 con 0x7f0d7010c6a0 2026-03-09T19:34:21.170 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.172+0000 7f0d627fc640 1 -- 192.168.123.107:0/882700506 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0d5c010b50 con 0x7f0d7010c6a0 2026-03-09T19:34:21.171 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.172+0000 7f0d779f1640 1 -- 192.168.123.107:0/882700506 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0d701a8b90 con 0x7f0d7010c6a0 2026-03-09T19:34:21.172 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.174+0000 7f0d779f1640 1 -- 192.168.123.107:0/882700506 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0d7010bd20 con 0x7f0d7010c6a0 2026-03-09T19:34:21.172 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.174+0000 7f0d627fc640 1 -- 192.168.123.107:0/882700506 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0d5c0026e0 con 0x7f0d7010c6a0 2026-03-09T19:34:21.175 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.177+0000 7f0d627fc640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0d4c0777a0 0x7f0d4c079c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:21.175 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.177+0000 7f0d627fc640 1 -- 192.168.123.107:0/882700506 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f0d5c098e60 con 0x7f0d7010c6a0 2026-03-09T19:34:21.176 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.178+0000 7f0d75766640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] 
conn(0x7f0d4c0777a0 0x7f0d4c079c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:21.176 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.178+0000 7f0d75766640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0d4c0777a0 0x7f0d4c079c60 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f0d64002c30 tx=0x7f0d6403a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:21.178 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.180+0000 7f0d627fc640 1 -- 192.168.123.107:0/882700506 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0d5c061510 con 0x7f0d7010c6a0 2026-03-09T19:34:21.299 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.299+0000 7f0d779f1640 1 -- 192.168.123.107:0/882700506 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0d701108e0 con 0x7f0d4c0777a0 2026-03-09T19:34:21.299 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:34:21.299 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T19:34:21.299 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": true, 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [ 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout: "crash", 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout: "osd", 2026-03-09T19:34:21.300 
INFO:teuthology.orchestra.run.vm07.stdout: "ceph-exporter", 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout: "mds", 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout: "mon", 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout: "mgr" 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout: ], 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout: "progress": "18/23 daemons upgraded", 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout: "message": "Currently upgrading node-exporter daemons", 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:34:21.300 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.301+0000 7f0d627fc640 1 -- 192.168.123.107:0/882700506 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+466 (secure 0 0 0) 0x7f0d701108e0 con 0x7f0d4c0777a0 2026-03-09T19:34:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.307+0000 7f0d3ffff640 1 -- 192.168.123.107:0/882700506 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0d4c0777a0 msgr2=0x7f0d4c079c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.307+0000 7f0d3ffff640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0d4c0777a0 0x7f0d4c079c60 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f0d64002c30 tx=0x7f0d6403a040 comp rx=0 tx=0).stop 2026-03-09T19:34:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.307+0000 7f0d3ffff640 1 -- 192.168.123.107:0/882700506 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d7010c6a0 msgr2=0x7f0d701a33e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T19:34:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.307+0000 7f0d3ffff640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d7010c6a0 0x7f0d701a33e0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f0d5c00b4d0 tx=0x7f0d5c00b9a0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.307+0000 7f0d3ffff640 1 -- 192.168.123.107:0/882700506 shutdown_connections 2026-03-09T19:34:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.307+0000 7f0d3ffff640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0d4c0777a0 0x7f0d4c079c60 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.307+0000 7f0d3ffff640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d7010c6a0 0x7f0d701a33e0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.307+0000 7f0d3ffff640 1 --2- 192.168.123.107:0/882700506 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d7010b920 0x7f0d701a2ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.307+0000 7f0d3ffff640 1 -- 192.168.123.107:0/882700506 >> 192.168.123.107:0/882700506 conn(0x7f0d7006a890 msgr2=0x7f0d7010a650 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:21.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.307+0000 7f0d3ffff640 1 -- 192.168.123.107:0/882700506 shutdown_connections 2026-03-09T19:34:21.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.307+0000 7f0d3ffff640 1 -- 
192.168.123.107:0/882700506 wait complete. 2026-03-09T19:34:21.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:20 vm08.local ceph-mon[103420]: pgmap v232: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail; 1.9 MiB/s rd, 4.8 KiB/s wr, 12 op/s 2026-03-09T19:34:21.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:20 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/3852095728' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:21.371 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.372+0000 7f6ce2d66640 1 -- 192.168.123.107:0/2946202575 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6cd40a5d10 msgr2=0x7f6cd40a6110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:21.371 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.372+0000 7f6ce2d66640 1 --2- 192.168.123.107:0/2946202575 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6cd40a5d10 0x7f6cd40a6110 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f6cd8009a00 tx=0x7f6cd802f290 comp rx=0 tx=0).stop 2026-03-09T19:34:21.372 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.373+0000 7f6ce2d66640 1 -- 192.168.123.107:0/2946202575 shutdown_connections 2026-03-09T19:34:21.372 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.373+0000 7f6ce2d66640 1 --2- 192.168.123.107:0/2946202575 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6cd40a4350 0x7f6cd40a47d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.372 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.373+0000 7f6ce2d66640 1 --2- 192.168.123.107:0/2946202575 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6cd40a5d10 0x7f6cd40a6110 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.372 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.373+0000 7f6ce2d66640 1 -- 192.168.123.107:0/2946202575 >> 192.168.123.107:0/2946202575 conn(0x7f6cd409fea0 msgr2=0x7f6cd40a2300 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:21.374 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.374+0000 7f6ce2d66640 1 -- 192.168.123.107:0/2946202575 shutdown_connections 2026-03-09T19:34:21.374 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.374+0000 7f6ce2d66640 1 -- 192.168.123.107:0/2946202575 wait complete. 2026-03-09T19:34:21.374 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce2d66640 1 Processor -- start 2026-03-09T19:34:21.374 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce2d66640 1 -- start start 2026-03-09T19:34:21.374 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce2d66640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6cd40a4350 0x7f6cd40cff40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:21.374 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce2d66640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6cd40d18f0 0x7f6cd40d0480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:21.374 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce2d66640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6cd40d09c0 con 0x7f6cd40d18f0 2026-03-09T19:34:21.374 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce2d66640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6cd40d0b30 con 0x7f6cd40a4350 2026-03-09T19:34:21.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce1563640 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6cd40d18f0 0x7f6cd40d0480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:21.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce1563640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6cd40d18f0 0x7f6cd40d0480 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:32944/0 (socket says 192.168.123.107:32944) 2026-03-09T19:34:21.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce1563640 1 -- 192.168.123.107:0/1289846788 learned_addr learned my addr 192.168.123.107:0/1289846788 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:21.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce1563640 1 -- 192.168.123.107:0/1289846788 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6cd40a4350 msgr2=0x7f6cd40cff40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:34:21.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce1563640 1 --2- 192.168.123.107:0/1289846788 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6cd40a4350 0x7f6cd40cff40 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce1563640 1 -- 192.168.123.107:0/1289846788 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6cd8009660 con 0x7f6cd40d18f0 2026-03-09T19:34:21.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.375+0000 7f6ce1563640 1 --2- 192.168.123.107:0/1289846788 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6cd40d18f0 
0x7f6cd40d0480 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f6cdc066a00 tx=0x7f6cdc070830 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:21.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.376+0000 7f6cd2ffd640 1 -- 192.168.123.107:0/1289846788 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6cdc079070 con 0x7f6cd40d18f0 2026-03-09T19:34:21.375 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.376+0000 7f6cd2ffd640 1 -- 192.168.123.107:0/1289846788 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6cdc075a90 con 0x7f6cd40d18f0 2026-03-09T19:34:21.376 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.376+0000 7f6cd2ffd640 1 -- 192.168.123.107:0/1289846788 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6cdc074b60 con 0x7f6cd40d18f0 2026-03-09T19:34:21.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.376+0000 7f6ce2d66640 1 -- 192.168.123.107:0/1289846788 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6cd4010fd0 con 0x7f6cd40d18f0 2026-03-09T19:34:21.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.376+0000 7f6ce2d66640 1 -- 192.168.123.107:0/1289846788 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6cd40114d0 con 0x7f6cd40d18f0 2026-03-09T19:34:21.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.378+0000 7f6cd2ffd640 1 -- 192.168.123.107:0/1289846788 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6cdc074cc0 con 0x7f6cd40d18f0 2026-03-09T19:34:21.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.378+0000 7f6cd2ffd640 1 --2- 192.168.123.107:0/1289846788 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6cbc077680 0x7f6cbc079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:21.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.378+0000 7f6cd2ffd640 1 -- 192.168.123.107:0/1289846788 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f6cdc0fbbc0 con 0x7f6cd40d18f0 2026-03-09T19:34:21.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.379+0000 7f6ce1d64640 1 --2- 192.168.123.107:0/1289846788 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6cbc077680 0x7f6cbc079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:21.377 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.379+0000 7f6ce2d66640 1 -- 192.168.123.107:0/1289846788 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6ca8005350 con 0x7f6cd40d18f0 2026-03-09T19:34:21.387 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.385+0000 7f6ce1d64640 1 --2- 192.168.123.107:0/1289846788 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6cbc077680 0x7f6cbc079b40 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f6cd8009a00 tx=0x7f6cd80023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:21.387 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.385+0000 7f6cd2ffd640 1 -- 192.168.123.107:0/1289846788 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6cdc0c41e0 con 0x7f6cd40d18f0 2026-03-09T19:34:21.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.533+0000 
7f6ce2d66640 1 -- 192.168.123.107:0/1289846788 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f6ca8005600 con 0x7f6cd40d18f0 2026-03-09T19:34:21.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.534+0000 7f6cd2ffd640 1 -- 192.168.123.107:0/1289846788 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f6cdc0c3930 con 0x7f6cd40d18f0 2026-03-09T19:34:21.532 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:34:21.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.537+0000 7f6cd0ff9640 1 -- 192.168.123.107:0/1289846788 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6cbc077680 msgr2=0x7f6cbc079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:21.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.537+0000 7f6cd0ff9640 1 --2- 192.168.123.107:0/1289846788 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6cbc077680 0x7f6cbc079b40 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f6cd8009a00 tx=0x7f6cd80023d0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.536 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.537+0000 7f6cd0ff9640 1 -- 192.168.123.107:0/1289846788 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6cd40d18f0 msgr2=0x7f6cd40d0480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:21.536 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.537+0000 7f6cd0ff9640 1 --2- 192.168.123.107:0/1289846788 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6cd40d18f0 0x7f6cd40d0480 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f6cdc066a00 tx=0x7f6cdc070830 comp rx=0 tx=0).stop 2026-03-09T19:34:21.536 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.538+0000 7f6cd0ff9640 1 -- 192.168.123.107:0/1289846788 shutdown_connections 2026-03-09T19:34:21.536 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.538+0000 7f6cd0ff9640 1 --2- 192.168.123.107:0/1289846788 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f6cbc077680 0x7f6cbc079b40 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.536 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.538+0000 7f6cd0ff9640 1 --2- 192.168.123.107:0/1289846788 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6cd40d18f0 0x7f6cd40d0480 secure :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f6cdc066a00 tx=0x7f6cdc070830 comp rx=0 tx=0).stop 2026-03-09T19:34:21.536 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.538+0000 7f6cd0ff9640 1 --2- 192.168.123.107:0/1289846788 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6cd40a4350 0x7f6cd40cff40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:21.536 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.538+0000 7f6cd0ff9640 1 -- 192.168.123.107:0/1289846788 >> 192.168.123.107:0/1289846788 conn(0x7f6cd409fea0 msgr2=0x7f6cd40067b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:21.536 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.538+0000 7f6cd0ff9640 1 -- 192.168.123.107:0/1289846788 shutdown_connections 2026-03-09T19:34:21.537 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:21.538+0000 7f6cd0ff9640 1 -- 192.168.123.107:0/1289846788 wait complete. 
2026-03-09T19:34:22.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:21 vm07.local ceph-mon[111841]: from='client.44345 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:22.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:21 vm07.local ceph-mon[111841]: from='client.34418 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:22.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:21 vm07.local ceph-mon[111841]: from='client.34422 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:22.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:21 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1900449070' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:34:22.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:21 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:22.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:21 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:22.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:21 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:22.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:21 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/1289846788' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:34:22.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:21 vm08.local ceph-mon[103420]: from='client.44345 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:22.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:21 vm08.local ceph-mon[103420]: from='client.34418 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:22.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:21 vm08.local ceph-mon[103420]: from='client.34422 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:22.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:21 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/1900449070' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:34:22.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:21 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:22.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:21 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:22.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:21 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:22.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:21 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/1289846788' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:34:23.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:22 vm08.local ceph-mon[103420]: from='client.44351 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:23.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:22 vm08.local ceph-mon[103420]: pgmap v233: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail; 351 KiB/s rd, 4.7 KiB/s wr, 11 op/s 2026-03-09T19:34:23.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:22 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:23.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:22 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:23.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:22 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:23.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:22 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:23.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:22 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:23.097 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:22 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:23.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:22 vm07.local ceph-mon[111841]: from='client.44351 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:23.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:22 vm07.local 
ceph-mon[111841]: pgmap v233: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail; 351 KiB/s rd, 4.7 KiB/s wr, 11 op/s 2026-03-09T19:34:23.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:22 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:23.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:22 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:23.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:22 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:23.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:22 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:23.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:22 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:23.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:22 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:24.417 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:24.417 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:24.417 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:24.417 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:24 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:24.417 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:24 vm07.local ceph-mon[111841]: pgmap v234: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail; 7.7 KiB/s rd, 4.5 KiB/s wr, 9 op/s 2026-03-09T19:34:24.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:24.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:24.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:24.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:24 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:24.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:24 vm08.local ceph-mon[103420]: pgmap v234: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail; 7.7 KiB/s rd, 4.5 KiB/s wr, 9 op/s 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", 
"entity": "client.admin"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: 
dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: Upgrade: Updating prometheus.vm07 2026-03-09T19:34:25.479 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:25 vm07.local ceph-mon[111841]: Deploying daemon prometheus.vm07 on vm07 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 
2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:25.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: Upgrade: Updating prometheus.vm07 2026-03-09T19:34:25.595 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:25 vm08.local ceph-mon[103420]: Deploying daemon prometheus.vm07 on vm07 2026-03-09T19:34:26.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:26 vm07.local ceph-mon[111841]: pgmap v235: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail; 511 B/s rd, 0 B/s wr, 0 op/s 2026-03-09T19:34:26.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:26 vm08.local ceph-mon[103420]: pgmap v235: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail; 511 B/s rd, 0 B/s wr, 0 op/s 2026-03-09T19:34:28.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:28 vm07.local ceph-mon[111841]: pgmap v236: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:29.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:28 vm08.local ceph-mon[103420]: pgmap v236: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:30.694 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:30 vm07.local ceph-mon[111841]: pgmap v237: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:30.694 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:30 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:30.694 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:30 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:30.694 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:30 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:30 vm08.local ceph-mon[103420]: pgmap v237: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 
2026-03-09T19:34:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:30 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:30 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:30 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:32.971 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:32 vm07.local ceph-mon[111841]: pgmap v238: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:32.971 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:32 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:32.971 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:32 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:32.971 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:32 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:32.971 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:32 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:33.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:32 vm08.local ceph-mon[103420]: pgmap v238: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:33.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:32 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:33.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:32 
vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:33.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:32 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:33.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:32 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 
2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:33.939 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:34:34.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:34.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:34.094 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:34.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:34.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-09T19:34:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:34.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:34 vm07.local ceph-mon[111841]: pgmap v239: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:34.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:34 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T19:34:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:34 vm08.local ceph-mon[103420]: pgmap v239: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:34 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T19:34:35.882 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:35 vm07.local ceph-mon[111841]: Upgrade: Updating alertmanager.vm07 2026-03-09T19:34:35.882 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:35 vm07.local ceph-mon[111841]: Deploying daemon alertmanager.vm07 on vm07 2026-03-09T19:34:35.882 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:35 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:35.882 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:35 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:35.882 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:35 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:36.095 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:35 vm08.local ceph-mon[103420]: Upgrade: Updating alertmanager.vm07 2026-03-09T19:34:36.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:35 vm08.local ceph-mon[103420]: Deploying daemon alertmanager.vm07 on vm07 2026-03-09T19:34:36.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:35 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:36.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:35 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:36.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:35 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:37.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:36 vm08.local ceph-mon[103420]: pgmap v240: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:36 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:36 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:37.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:36 vm07.local ceph-mon[111841]: pgmap v240: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:37.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:37.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:36 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' 2026-03-09T19:34:38.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:38.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:38 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:38.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:38 vm07.local ceph-mon[111841]: pgmap v241: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:38.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:38.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:38 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:38.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:38 vm08.local ceph-mon[103420]: pgmap v241: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: 
dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-09T19:34:39.625 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:39 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:39.845 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:39 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:40.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:40 vm07.local ceph-mon[111841]: Upgrade: Updating grafana.vm07 2026-03-09T19:34:40.728 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:40 vm07.local ceph-mon[111841]: Deploying daemon grafana.vm07 on vm07 2026-03-09T19:34:40.728 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:40 vm07.local ceph-mon[111841]: pgmap v242: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:40.843 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:40 vm08.local ceph-mon[103420]: Upgrade: Updating grafana.vm07 2026-03-09T19:34:40.843 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:40 vm08.local ceph-mon[103420]: Deploying daemon grafana.vm07 on vm07 2026-03-09T19:34:40.843 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:40 vm08.local ceph-mon[103420]: pgmap v242: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:42.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:42 vm07.local ceph-mon[111841]: pgmap v243: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:43.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:42 vm08.local ceph-mon[103420]: pgmap v243: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:44.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:44 vm07.local ceph-mon[111841]: pgmap v244: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:44 vm08.local ceph-mon[103420]: pgmap v244: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:47.227 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:46 vm07.local ceph-mon[111841]: pgmap v245: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:47.227 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:47.227 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:46 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:47.227 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:46 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:47.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:46 vm08.local ceph-mon[103420]: pgmap v245: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:47.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:47.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:47.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:46 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:48 vm07.local ceph-mon[111841]: pgmap v246: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:48 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:48.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:34:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:48 vm08.local ceph-mon[103420]: pgmap v246: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' 
entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: Upgrade: Finalizing container_image settings 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": 
"config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-09T19:34:50.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-09T19:34:50.980 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: 
from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 
cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: Upgrade: Complete! 
2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: pgmap v247: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: 
from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:50.980 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: Upgrade: Finalizing container_image settings 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 
2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 
192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": 
"config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:51.096 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: Upgrade: Complete! 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: 
from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: pgmap v247: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:34:51.096 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:51.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.610+0000 7f32780f5640 1 -- 192.168.123.107:0/2894330837 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f32700751a0 msgr2=0x7f3270073600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:51.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.610+0000 7f32780f5640 1 --2- 192.168.123.107:0/2894330837 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f32700751a0 0x7f3270073600 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f3264009930 tx=0x7f326402f1d0 comp rx=0 tx=0).stop 2026-03-09T19:34:51.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.610+0000 7f32780f5640 1 -- 192.168.123.107:0/2894330837 shutdown_connections 
2026-03-09T19:34:51.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.610+0000 7f32780f5640 1 --2- 192.168.123.107:0/2894330837 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3270073b40 0x7f3270073fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:51.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.610+0000 7f32780f5640 1 --2- 192.168.123.107:0/2894330837 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f32700751a0 0x7f3270073600 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:51.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.610+0000 7f32780f5640 1 -- 192.168.123.107:0/2894330837 >> 192.168.123.107:0/2894330837 conn(0x7f32700fbf80 msgr2=0x7f32700fe3c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:51.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.610+0000 7f32780f5640 1 -- 192.168.123.107:0/2894330837 shutdown_connections 2026-03-09T19:34:51.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.611+0000 7f32780f5640 1 -- 192.168.123.107:0/2894330837 wait complete. 
2026-03-09T19:34:51.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.611+0000 7f32780f5640 1 Processor -- start 2026-03-09T19:34:51.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.611+0000 7f32780f5640 1 -- start start 2026-03-09T19:34:51.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.611+0000 7f32780f5640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3270073b40 0x7f327019a430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:51.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.611+0000 7f32780f5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f32700751a0 0x7f327019a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:51.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.611+0000 7f32780f5640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f327019af40 con 0x7f32700751a0 2026-03-09T19:34:51.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.611+0000 7f32780f5640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f327019b0b0 con 0x7f3270073b40 2026-03-09T19:34:51.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.612+0000 7f3275669640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f32700751a0 0x7f327019a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:51.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.612+0000 7f3275669640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f32700751a0 0x7f327019a970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:43406/0 (socket says 192.168.123.107:43406) 2026-03-09T19:34:51.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.612+0000 7f3275669640 1 -- 192.168.123.107:0/2563736618 learned_addr learned my addr 192.168.123.107:0/2563736618 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:51.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.612+0000 7f3275e6a640 1 --2- 192.168.123.107:0/2563736618 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3270073b40 0x7f327019a430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:51.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.612+0000 7f3275669640 1 -- 192.168.123.107:0/2563736618 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3270073b40 msgr2=0x7f327019a430 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:51.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.612+0000 7f3275669640 1 --2- 192.168.123.107:0/2563736618 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3270073b40 0x7f327019a430 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:51.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.612+0000 7f3275669640 1 -- 192.168.123.107:0/2563736618 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3264009590 con 0x7f32700751a0 2026-03-09T19:34:51.610 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.612+0000 7f3275669640 1 --2- 192.168.123.107:0/2563736618 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f32700751a0 0x7f327019a970 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f326000ed10 tx=0x7f326000c6a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T19:34:51.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.613+0000 7f325effd640 1 -- 192.168.123.107:0/2563736618 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f326000eed0 con 0x7f32700751a0 2026-03-09T19:34:51.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.613+0000 7f325effd640 1 -- 192.168.123.107:0/2563736618 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f3260004590 con 0x7f32700751a0 2026-03-09T19:34:51.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.613+0000 7f325effd640 1 -- 192.168.123.107:0/2563736618 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3260010640 con 0x7f32700751a0 2026-03-09T19:34:51.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.613+0000 7f32780f5640 1 -- 192.168.123.107:0/2563736618 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f327019fb50 con 0x7f32700751a0 2026-03-09T19:34:51.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.613+0000 7f32780f5640 1 -- 192.168.123.107:0/2563736618 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3270100510 con 0x7f32700751a0 2026-03-09T19:34:51.615 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.614+0000 7f325effd640 1 -- 192.168.123.107:0/2563736618 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f32600107a0 con 0x7f32700751a0 2026-03-09T19:34:51.615 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.614+0000 7f32780f5640 1 -- 192.168.123.107:0/2563736618 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3238005350 con 0x7f32700751a0 2026-03-09T19:34:51.615 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.615+0000 
7f325effd640 1 --2- 192.168.123.107:0/2563736618 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f324c0778e0 0x7f324c079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:51.615 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.615+0000 7f325effd640 1 -- 192.168.123.107:0/2563736618 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f3260014070 con 0x7f32700751a0 2026-03-09T19:34:51.615 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.615+0000 7f3275e6a640 1 --2- 192.168.123.107:0/2563736618 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f324c0778e0 0x7f324c079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:51.615 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.616+0000 7f3275e6a640 1 --2- 192.168.123.107:0/2563736618 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f324c0778e0 0x7f324c079da0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f326402f6e0 tx=0x7f32640023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:51.616 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.618+0000 7f325effd640 1 -- 192.168.123.107:0/2563736618 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3260062a60 con 0x7f32700751a0 2026-03-09T19:34:51.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.724+0000 7f32780f5640 1 -- 192.168.123.107:0/2563736618 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3238002bf0 con 0x7f324c0778e0 
2026-03-09T19:34:51.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.725+0000 7f325effd640 1 -- 192.168.123.107:0/2563736618 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f3238002bf0 con 0x7f324c0778e0 2026-03-09T19:34:51.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.727+0000 7f32780f5640 1 -- 192.168.123.107:0/2563736618 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f324c0778e0 msgr2=0x7f324c079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:51.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.727+0000 7f32780f5640 1 --2- 192.168.123.107:0/2563736618 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f324c0778e0 0x7f324c079da0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f326402f6e0 tx=0x7f32640023d0 comp rx=0 tx=0).stop 2026-03-09T19:34:51.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.727+0000 7f32780f5640 1 -- 192.168.123.107:0/2563736618 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f32700751a0 msgr2=0x7f327019a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:51.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.727+0000 7f32780f5640 1 --2- 192.168.123.107:0/2563736618 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f32700751a0 0x7f327019a970 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f326000ed10 tx=0x7f326000c6a0 comp rx=0 tx=0).stop 2026-03-09T19:34:51.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.728+0000 7f32780f5640 1 -- 192.168.123.107:0/2563736618 shutdown_connections 2026-03-09T19:34:51.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.728+0000 7f32780f5640 1 --2- 192.168.123.107:0/2563736618 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] 
conn(0x7f324c0778e0 0x7f324c079da0 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:51.725 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.728+0000 7f32780f5640 1 --2- 192.168.123.107:0/2563736618 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f32700751a0 0x7f327019a970 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:51.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.728+0000 7f32780f5640 1 --2- 192.168.123.107:0/2563736618 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3270073b40 0x7f327019a430 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:51.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.728+0000 7f32780f5640 1 -- 192.168.123.107:0/2563736618 >> 192.168.123.107:0/2563736618 conn(0x7f32700fbf80 msgr2=0x7f32700fdab0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:51.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.728+0000 7f32780f5640 1 -- 192.168.123.107:0/2563736618 shutdown_connections 2026-03-09T19:34:51.726 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:51.728+0000 7f32780f5640 1 -- 192.168.123.107:0/2563736618 wait complete. 
2026-03-09T19:34:51.797 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-09T19:34:51.988 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:34:52.301 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.303+0000 7f4cc3322640 1 -- 192.168.123.107:0/2563533415 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cbc1089d0 msgr2=0x7f4cbc108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:52.301 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.303+0000 7f4cc3322640 1 --2- 192.168.123.107:0/2563533415 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cbc1089d0 0x7f4cbc108db0 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f4cac009a00 tx=0x7f4cac02f290 comp rx=0 tx=0).stop 2026-03-09T19:34:52.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.304+0000 7f4cc3322640 1 -- 192.168.123.107:0/2563533415 shutdown_connections 2026-03-09T19:34:52.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.304+0000 7f4cc3322640 1 --2- 192.168.123.107:0/2563533415 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cbc1029d0 0x7f4cbc102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:52.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.304+0000 7f4cc3322640 1 --2- 192.168.123.107:0/2563533415 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cbc1089d0 0x7f4cbc108db0 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:52.302 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.304+0000 7f4cc3322640 1 -- 192.168.123.107:0/2563533415 >> 192.168.123.107:0/2563533415 conn(0x7f4cbc0fe710 msgr2=0x7f4cbc100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:52.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.304+0000 7f4cc3322640 1 -- 192.168.123.107:0/2563533415 shutdown_connections 2026-03-09T19:34:52.302 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.304+0000 7f4cc3322640 1 -- 192.168.123.107:0/2563533415 wait complete. 2026-03-09T19:34:52.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.305+0000 7f4cc3322640 1 Processor -- start 2026-03-09T19:34:52.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.305+0000 7f4cc3322640 1 -- start start 2026-03-09T19:34:52.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.305+0000 7f4cc3322640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cbc1029d0 0x7f4cbc1a0680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:52.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.306+0000 7f4cc3322640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cbc1089d0 0x7f4cbc1a0bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:52.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.306+0000 7f4cc3322640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cbc19a770 con 0x7f4cbc1089d0 2026-03-09T19:34:52.303 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.306+0000 7f4cc3322640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cbc19a8e0 con 0x7f4cbc1029d0 2026-03-09T19:34:52.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.306+0000 7f4cc0896640 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cbc1089d0 0x7f4cbc1a0bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:52.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.306+0000 7f4cc0896640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cbc1089d0 0x7f4cbc1a0bc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43434/0 (socket says 192.168.123.107:43434) 2026-03-09T19:34:52.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.306+0000 7f4cc0896640 1 -- 192.168.123.107:0/3719426344 learned_addr learned my addr 192.168.123.107:0/3719426344 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:52.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.306+0000 7f4cc0896640 1 -- 192.168.123.107:0/3719426344 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cbc1029d0 msgr2=0x7f4cbc1a0680 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:34:52.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.306+0000 7f4cc0896640 1 --2- 192.168.123.107:0/3719426344 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cbc1029d0 0x7f4cbc1a0680 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:52.304 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.307+0000 7f4cc0896640 1 -- 192.168.123.107:0/3719426344 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4cac009660 con 0x7f4cbc1089d0 2026-03-09T19:34:52.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.307+0000 7f4cc0896640 1 --2- 192.168.123.107:0/3719426344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cbc1089d0 
0x7f4cbc1a0bc0 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f4cb800c370 tx=0x7f4cb800c840 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:52.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.307+0000 7f4caa7fc640 1 -- 192.168.123.107:0/3719426344 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4cb8009280 con 0x7f4cbc1089d0 2026-03-09T19:34:52.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.307+0000 7f4caa7fc640 1 -- 192.168.123.107:0/3719426344 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4cb800f040 con 0x7f4cbc1089d0 2026-03-09T19:34:52.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.307+0000 7f4caa7fc640 1 -- 192.168.123.107:0/3719426344 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4cb8007500 con 0x7f4cbc1089d0 2026-03-09T19:34:52.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.307+0000 7f4cc3322640 1 -- 192.168.123.107:0/3719426344 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4cbc19ab60 con 0x7f4cbc1089d0 2026-03-09T19:34:52.306 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.307+0000 7f4cc3322640 1 -- 192.168.123.107:0/3719426344 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4cbc19b0b0 con 0x7f4cbc1089d0 2026-03-09T19:34:52.310 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.309+0000 7f4caa7fc640 1 -- 192.168.123.107:0/3719426344 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4cb8007660 con 0x7f4cbc1089d0 2026-03-09T19:34:52.310 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.309+0000 7f4cc3322640 1 -- 192.168.123.107:0/3719426344 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4cbc104110 con 0x7f4cbc1089d0 2026-03-09T19:34:52.310 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.309+0000 7f4caa7fc640 1 --2- 192.168.123.107:0/3719426344 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c980778e0 0x7f4c98079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:52.310 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.310+0000 7f4caa7fc640 1 -- 192.168.123.107:0/3719426344 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f4cb8099870 con 0x7f4cbc1089d0 2026-03-09T19:34:52.310 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.310+0000 7f4cc1097640 1 --2- 192.168.123.107:0/3719426344 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c980778e0 0x7f4c98079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:52.310 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.313+0000 7f4caa7fc640 1 -- 192.168.123.107:0/3719426344 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4cb8061f20 con 0x7f4cbc1089d0 2026-03-09T19:34:52.311 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.313+0000 7f4cc1097640 1 --2- 192.168.123.107:0/3719426344 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c980778e0 0x7f4c98079da0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f4cac009a00 tx=0x7f4cac0023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:52.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.421+0000 7f4cc3322640 1 -- 192.168.123.107:0/3719426344 --> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f4cbc19bc80 con 0x7f4c980778e0 2026-03-09T19:34:52.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.430+0000 7f4caa7fc640 1 -- 192.168.123.107:0/3719426344 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f4cbc19bc80 con 0x7f4c980778e0 2026-03-09T19:34:52.429 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:34:52.429 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (17s) 4s ago 12m 16.7M - 0.25.0 c8568f914cd2 900f5bd509c9 2026-03-09T19:34:52.429 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (43s) 4s ago 12m 10.5M - 19.2.3-678-ge911bdeb 654f31e6858e 143b4d120468 2026-03-09T19:34:52.429 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (41s) 30s ago 11m 10.2M - 19.2.3-678-ge911bdeb 654f31e6858e bb4c6a92b1d5 2026-03-09T19:34:52.429 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (5m) 4s ago 12m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:34:52.429 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (5m) 30s ago 11m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:34:52.429 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (6s) 4s ago 12m 42.8M - 10.4.0 c8b91775d855 a4189864679c 2026-03-09T19:34:52.429 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (70s) 4s ago 10m 101M - 19.2.3-678-ge911bdeb 654f31e6858e ccdf01cf9a0b 2026-03-09T19:34:52.429 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (64s) 4s ago 10m 21.3M - 19.2.3-678-ge911bdeb 654f31e6858e 15f71649e080 2026-03-09T19:34:52.429 
INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (47s) 30s ago 10m 15.5M - 19.2.3-678-ge911bdeb 654f31e6858e b1b66121ee0c 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (55s) 30s ago 10m 20.7M - 19.2.3-678-ge911bdeb 654f31e6858e 6c9f1adefb0b 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (6m) 4s ago 13m 618M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw vm08 *:8443,9283,8765 running (6m) 30s ago 11m 494M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (6m) 4s ago 13m 70.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (6m) 30s ago 11m 58.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (34s) 4s ago 12m 8992k - 1.7.0 72c9c2088986 b64ceec40b43 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (31s) 30s ago 11m 5758k - 1.7.0 72c9c2088986 d2e9f420f638 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (5m) 4s ago 11m 229M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (3m) 4s ago 11m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 36b65f1069e5 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (2m) 4s ago 11m 102M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 1b8bc1f96eb7 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (2m) 30s ago 11m 192M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bde783ff786f 
2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (2m) 30s ago 10m 133M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 588104e3b774 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (108s) 30s ago 10m 117M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7f9a10e5f49d 2026-03-09T19:34:52.430 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (21s) 4s ago 12m 45.8M - 2.51.0 1d3b7f56885b e0028c6b96d6 2026-03-09T19:34:52.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.434+0000 7f4cc3322640 1 -- 192.168.123.107:0/3719426344 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c980778e0 msgr2=0x7f4c98079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:52.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.434+0000 7f4cc3322640 1 --2- 192.168.123.107:0/3719426344 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c980778e0 0x7f4c98079da0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f4cac009a00 tx=0x7f4cac0023d0 comp rx=0 tx=0).stop 2026-03-09T19:34:52.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.434+0000 7f4cc3322640 1 -- 192.168.123.107:0/3719426344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cbc1089d0 msgr2=0x7f4cbc1a0bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:52.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.434+0000 7f4cc3322640 1 --2- 192.168.123.107:0/3719426344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cbc1089d0 0x7f4cbc1a0bc0 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f4cb800c370 tx=0x7f4cb800c840 comp rx=0 tx=0).stop 2026-03-09T19:34:52.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.434+0000 7f4cc3322640 1 -- 192.168.123.107:0/3719426344 shutdown_connections 2026-03-09T19:34:52.432 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.434+0000 7f4cc3322640 1 --2- 192.168.123.107:0/3719426344 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c980778e0 0x7f4c98079da0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:52.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.434+0000 7f4cc3322640 1 --2- 192.168.123.107:0/3719426344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cbc1089d0 0x7f4cbc1a0bc0 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:52.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.434+0000 7f4cc3322640 1 --2- 192.168.123.107:0/3719426344 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cbc1029d0 0x7f4cbc1a0680 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:52.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.434+0000 7f4cc3322640 1 -- 192.168.123.107:0/3719426344 >> 192.168.123.107:0/3719426344 conn(0x7f4cbc0fe710 msgr2=0x7f4cbc106550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:52.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.435+0000 7f4cc3322640 1 -- 192.168.123.107:0/3719426344 shutdown_connections 2026-03-09T19:34:52.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.435+0000 7f4cc3322640 1 -- 192.168.123.107:0/3719426344 wait complete. 
2026-03-09T19:34:52.486 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade status' 2026-03-09T19:34:52.673 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:34:52.959 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:52 vm07.local ceph-mon[111841]: pgmap v248: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:52.959 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:52 vm07.local ceph-mon[111841]: from='client.34442 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:52.959 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.959+0000 7fc6e6536640 1 -- 192.168.123.107:0/3567134204 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6e0102970 msgr2=0x7fc6e0102dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:52.959 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.959+0000 7fc6e6536640 1 --2- 192.168.123.107:0/3567134204 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6e0102970 0x7fc6e0102dd0 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7fc6cc0099b0 tx=0x7fc6cc02f220 comp rx=0 tx=0).stop 2026-03-09T19:34:52.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.962+0000 7fc6e6536640 1 -- 192.168.123.107:0/3567134204 shutdown_connections 2026-03-09T19:34:52.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.962+0000 7fc6e6536640 1 --2- 192.168.123.107:0/3567134204 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6e0102970 0x7fc6e0102dd0 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:52.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.962+0000 7fc6e6536640 1 --2- 192.168.123.107:0/3567134204 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6e0108970 0x7fc6e0108d50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:52.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.962+0000 7fc6e6536640 1 -- 192.168.123.107:0/3567134204 >> 192.168.123.107:0/3567134204 conn(0x7fc6e00fe670 msgr2=0x7fc6e0100a90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:52.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.962+0000 7fc6e6536640 1 -- 192.168.123.107:0/3567134204 shutdown_connections 2026-03-09T19:34:52.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.962+0000 7fc6e6536640 1 -- 192.168.123.107:0/3567134204 wait complete. 2026-03-09T19:34:52.960 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.963+0000 7fc6e6536640 1 Processor -- start 2026-03-09T19:34:52.961 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.963+0000 7fc6e6536640 1 -- start start 2026-03-09T19:34:52.961 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.963+0000 7fc6e6536640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6e0102970 0x7fc6e01a0620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:52.961 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.963+0000 7fc6dffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6e0102970 0x7fc6e01a0620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:52.961 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.963+0000 7fc6dffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fc6e0102970 0x7fc6e01a0620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43444/0 (socket says 192.168.123.107:43444) 2026-03-09T19:34:52.961 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.964+0000 7fc6e6536640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6e0108970 0x7fc6e01a0b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:52.961 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.964+0000 7fc6dffff640 1 -- 192.168.123.107:0/1034015116 learned_addr learned my addr 192.168.123.107:0/1034015116 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:52.962 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.964+0000 7fc6e6536640 1 -- 192.168.123.107:0/1034015116 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6e01a10f0 con 0x7fc6e0102970 2026-03-09T19:34:52.962 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.964+0000 7fc6e6536640 1 -- 192.168.123.107:0/1034015116 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6e019a710 con 0x7fc6e0108970 2026-03-09T19:34:52.962 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.964+0000 7fc6df7fe640 1 --2- 192.168.123.107:0/1034015116 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6e0108970 0x7fc6e01a0b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:52.962 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.964+0000 7fc6df7fe640 1 -- 192.168.123.107:0/1034015116 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6e0102970 msgr2=0x7fc6e01a0620 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:52.962 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.964+0000 7fc6df7fe640 1 --2- 192.168.123.107:0/1034015116 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6e0102970 0x7fc6e01a0620 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:52.962 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.964+0000 7fc6df7fe640 1 -- 192.168.123.107:0/1034015116 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc6cc009660 con 0x7fc6e0108970 2026-03-09T19:34:52.962 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.964+0000 7fc6dffff640 1 --2- 192.168.123.107:0/1034015116 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6e0102970 0x7fc6e01a0620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:34:52.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.965+0000 7fc6df7fe640 1 --2- 192.168.123.107:0/1034015116 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6e0108970 0x7fc6e01a0b60 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fc6cc009980 tx=0x7fc6cc0043d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:52.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.965+0000 7fc6dd7fa640 1 -- 192.168.123.107:0/1034015116 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc6cc03d070 con 0x7fc6e0108970 2026-03-09T19:34:52.963 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.965+0000 7fc6dd7fa640 1 -- 192.168.123.107:0/1034015116 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fc6cc02fc90 con 0x7fc6e0108970 2026-03-09T19:34:52.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.966+0000 7fc6e6536640 1 -- 
192.168.123.107:0/1034015116 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc6e019a990 con 0x7fc6e0108970 2026-03-09T19:34:52.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.966+0000 7fc6e6536640 1 -- 192.168.123.107:0/1034015116 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc6e019ae80 con 0x7fc6e0108970 2026-03-09T19:34:52.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.966+0000 7fc6dd7fa640 1 -- 192.168.123.107:0/1034015116 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc6cc041840 con 0x7fc6e0108970 2026-03-09T19:34:52.964 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.966+0000 7fc6e6536640 1 -- 192.168.123.107:0/1034015116 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc6e01040b0 con 0x7fc6e0108970 2026-03-09T19:34:52.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.967+0000 7fc6dd7fa640 1 -- 192.168.123.107:0/1034015116 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc6cc038730 con 0x7fc6e0108970 2026-03-09T19:34:52.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.967+0000 7fc6dd7fa640 1 --2- 192.168.123.107:0/1034015116 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc6b40776d0 0x7fc6b4079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:52.965 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.968+0000 7fc6dd7fa640 1 -- 192.168.123.107:0/1034015116 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fc6cc0be600 con 0x7fc6e0108970 2026-03-09T19:34:52.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.968+0000 7fc6dffff640 1 --2- 
192.168.123.107:0/1034015116 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc6b40776d0 0x7fc6b4079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:52.966 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.968+0000 7fc6dffff640 1 --2- 192.168.123.107:0/1034015116 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc6b40776d0 0x7fc6b4079b90 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fc6d0004240 tx=0x7fc6d000a480 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:52.968 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:52.970+0000 7fc6dd7fa640 1 -- 192.168.123.107:0/1034015116 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc6cc086d60 con 0x7fc6e0108970 2026-03-09T19:34:53.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.079+0000 7fc6e6536640 1 -- 192.168.123.107:0/1034015116 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc6e019bc20 con 0x7fc6b40776d0 2026-03-09T19:34:53.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.081+0000 7fc6dd7fa640 1 -- 192.168.123.107:0/1034015116 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7fc6e019bc20 con 0x7fc6b40776d0 2026-03-09T19:34:53.079 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:34:53.079 INFO:teuthology.orchestra.run.vm07.stdout: "target_image": null, 2026-03-09T19:34:53.079 INFO:teuthology.orchestra.run.vm07.stdout: "in_progress": false, 2026-03-09T19:34:53.079 INFO:teuthology.orchestra.run.vm07.stdout: "which": "", 
2026-03-09T19:34:53.079 INFO:teuthology.orchestra.run.vm07.stdout: "services_complete": [], 2026-03-09T19:34:53.079 INFO:teuthology.orchestra.run.vm07.stdout: "progress": null, 2026-03-09T19:34:53.079 INFO:teuthology.orchestra.run.vm07.stdout: "message": "", 2026-03-09T19:34:53.079 INFO:teuthology.orchestra.run.vm07.stdout: "is_paused": false 2026-03-09T19:34:53.079 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:34:53.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.083+0000 7fc6e6536640 1 -- 192.168.123.107:0/1034015116 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc6b40776d0 msgr2=0x7fc6b4079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:53.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.083+0000 7fc6e6536640 1 --2- 192.168.123.107:0/1034015116 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc6b40776d0 0x7fc6b4079b90 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fc6d0004240 tx=0x7fc6d000a480 comp rx=0 tx=0).stop 2026-03-09T19:34:53.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.083+0000 7fc6e6536640 1 -- 192.168.123.107:0/1034015116 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6e0108970 msgr2=0x7fc6e01a0b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:53.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.083+0000 7fc6e6536640 1 --2- 192.168.123.107:0/1034015116 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6e0108970 0x7fc6e01a0b60 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fc6cc009980 tx=0x7fc6cc0043d0 comp rx=0 tx=0).stop 2026-03-09T19:34:53.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.084+0000 7fc6e6536640 1 -- 192.168.123.107:0/1034015116 shutdown_connections 2026-03-09T19:34:53.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.084+0000 7fc6e6536640 1 
--2- 192.168.123.107:0/1034015116 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fc6b40776d0 0x7fc6b4079b90 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:53.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.084+0000 7fc6e6536640 1 --2- 192.168.123.107:0/1034015116 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc6e0108970 0x7fc6e01a0b60 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:53.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.084+0000 7fc6e6536640 1 --2- 192.168.123.107:0/1034015116 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc6e0102970 0x7fc6e01a0620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:53.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.084+0000 7fc6e6536640 1 -- 192.168.123.107:0/1034015116 >> 192.168.123.107:0/1034015116 conn(0x7fc6e00fe670 msgr2=0x7fc6e010c930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:53.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.084+0000 7fc6e6536640 1 -- 192.168.123.107:0/1034015116 shutdown_connections 2026-03-09T19:34:53.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.084+0000 7fc6e6536640 1 -- 192.168.123.107:0/1034015116 wait complete. 
2026-03-09T19:34:53.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:52 vm08.local ceph-mon[103420]: pgmap v248: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:53.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:52 vm08.local ceph-mon[103420]: from='client.34442 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:53.155 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph health detail' 2026-03-09T19:34:53.314 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:34:53.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.583+0000 7ff5d754f640 1 -- 192.168.123.107:0/2184988812 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5d00ffe80 msgr2=0x7ff5d010cd50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:53.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.583+0000 7ff5d754f640 1 --2- 192.168.123.107:0/2184988812 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5d00ffe80 0x7ff5d010cd50 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7ff5c00099b0 tx=0x7ff5c002f220 comp rx=0 tx=0).stop 2026-03-09T19:34:53.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.584+0000 7ff5d754f640 1 -- 192.168.123.107:0/2184988812 shutdown_connections 2026-03-09T19:34:53.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.584+0000 7ff5d754f640 1 --2- 192.168.123.107:0/2184988812 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5d00ffe80 0x7ff5d010cd50 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:53.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.584+0000 7ff5d754f640 1 --2- 192.168.123.107:0/2184988812 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5d00ff560 0x7ff5d00ff940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:53.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.584+0000 7ff5d754f640 1 -- 192.168.123.107:0/2184988812 >> 192.168.123.107:0/2184988812 conn(0x7ff5d00fb3d0 msgr2=0x7ff5d00fd7f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:53.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.584+0000 7ff5d754f640 1 -- 192.168.123.107:0/2184988812 shutdown_connections 2026-03-09T19:34:53.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.584+0000 7ff5d754f640 1 -- 192.168.123.107:0/2184988812 wait complete. 2026-03-09T19:34:53.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.584+0000 7ff5d754f640 1 Processor -- start 2026-03-09T19:34:53.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.585+0000 7ff5d754f640 1 -- start start 2026-03-09T19:34:53.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.585+0000 7ff5d754f640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5d00ff560 0x7ff5d01a0650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:53.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.585+0000 7ff5d754f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5d00ffe80 0x7ff5d01a0b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:53.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.585+0000 7ff5d754f640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff5d019a740 con 0x7ff5d00ff560 
2026-03-09T19:34:53.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.585+0000 7ff5d754f640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff5d019a8b0 con 0x7ff5d00ffe80 2026-03-09T19:34:53.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.585+0000 7ff5d4ac3640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5d00ffe80 0x7ff5d01a0b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:53.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.585+0000 7ff5d4ac3640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5d00ffe80 0x7ff5d01a0b90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:59910/0 (socket says 192.168.123.107:59910) 2026-03-09T19:34:53.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.585+0000 7ff5d4ac3640 1 -- 192.168.123.107:0/2183962775 learned_addr learned my addr 192.168.123.107:0/2183962775 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:53.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.585+0000 7ff5d52c4640 1 --2- 192.168.123.107:0/2183962775 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5d00ff560 0x7ff5d01a0650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:53.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.585+0000 7ff5d4ac3640 1 -- 192.168.123.107:0/2183962775 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5d00ff560 msgr2=0x7ff5d01a0650 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:53.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.586+0000 
7ff5d4ac3640 1 --2- 192.168.123.107:0/2183962775 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5d00ff560 0x7ff5d01a0650 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:53.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.586+0000 7ff5d4ac3640 1 -- 192.168.123.107:0/2183962775 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff5c0009660 con 0x7ff5d00ffe80 2026-03-09T19:34:53.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.586+0000 7ff5d52c4640 1 --2- 192.168.123.107:0/2183962775 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5d00ff560 0x7ff5d01a0650 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T19:34:53.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.586+0000 7ff5d4ac3640 1 --2- 192.168.123.107:0/2183962775 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5d00ffe80 0x7ff5d01a0b90 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7ff5c0009980 tx=0x7ff5c0004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:53.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.586+0000 7ff5c67fc640 1 -- 192.168.123.107:0/2183962775 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff5c003d070 con 0x7ff5d00ffe80 2026-03-09T19:34:53.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.586+0000 7ff5d754f640 1 -- 192.168.123.107:0/2183962775 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff5d019ab30 con 0x7ff5d00ffe80 2026-03-09T19:34:53.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.586+0000 7ff5c67fc640 1 -- 192.168.123.107:0/2183962775 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 
keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff5c0004400 con 0x7ff5d00ffe80 2026-03-09T19:34:53.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.586+0000 7ff5c67fc640 1 -- 192.168.123.107:0/2183962775 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff5c0041740 con 0x7ff5d00ffe80 2026-03-09T19:34:53.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.587+0000 7ff5d754f640 1 -- 192.168.123.107:0/2183962775 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff5d019b020 con 0x7ff5d00ffe80 2026-03-09T19:34:53.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.587+0000 7ff5d754f640 1 -- 192.168.123.107:0/2183962775 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff5d00696a0 con 0x7ff5d00ffe80 2026-03-09T19:34:53.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.588+0000 7ff5c67fc640 1 -- 192.168.123.107:0/2183962775 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff5c0038730 con 0x7ff5d00ffe80 2026-03-09T19:34:53.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.588+0000 7ff5c67fc640 1 --2- 192.168.123.107:0/2183962775 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5ac0778e0 0x7ff5ac079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:53.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.589+0000 7ff5c67fc640 1 -- 192.168.123.107:0/2183962775 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7ff5c00be570 con 0x7ff5d00ffe80 2026-03-09T19:34:53.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.589+0000 7ff5d52c4640 1 --2- 192.168.123.107:0/2183962775 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5ac0778e0 0x7ff5ac079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:53.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.590+0000 7ff5d52c4640 1 --2- 192.168.123.107:0/2183962775 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5ac0778e0 0x7ff5ac079da0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7ff5b80046d0 tx=0x7ff5b8004420 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:53.589 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.591+0000 7ff5c67fc640 1 -- 192.168.123.107:0/2183962775 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff5c0086ba0 con 0x7ff5d00ffe80 2026-03-09T19:34:53.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.733+0000 7ff5d754f640 1 -- 192.168.123.107:0/2183962775 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7ff5d019bc50 con 0x7ff5d00ffe80 2026-03-09T19:34:53.731 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.733+0000 7ff5c67fc640 1 -- 192.168.123.107:0/2183962775 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7ff5c00862f0 con 0x7ff5d00ffe80 2026-03-09T19:34:53.731 INFO:teuthology.orchestra.run.vm07.stdout:HEALTH_OK 2026-03-09T19:34:53.734 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.736+0000 7ff5d754f640 1 -- 192.168.123.107:0/2183962775 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5ac0778e0 msgr2=0x7ff5ac079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T19:34:53.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.736+0000 7ff5d754f640 1 --2- 192.168.123.107:0/2183962775 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5ac0778e0 0x7ff5ac079da0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7ff5b80046d0 tx=0x7ff5b8004420 comp rx=0 tx=0).stop 2026-03-09T19:34:53.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.736+0000 7ff5d754f640 1 -- 192.168.123.107:0/2183962775 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5d00ffe80 msgr2=0x7ff5d01a0b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:53.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.736+0000 7ff5d754f640 1 --2- 192.168.123.107:0/2183962775 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5d00ffe80 0x7ff5d01a0b90 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7ff5c0009980 tx=0x7ff5c0004290 comp rx=0 tx=0).stop 2026-03-09T19:34:53.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.736+0000 7ff5d754f640 1 -- 192.168.123.107:0/2183962775 shutdown_connections 2026-03-09T19:34:53.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.736+0000 7ff5d754f640 1 --2- 192.168.123.107:0/2183962775 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff5ac0778e0 0x7ff5ac079da0 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:53.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.736+0000 7ff5d754f640 1 --2- 192.168.123.107:0/2183962775 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5d00ffe80 0x7ff5d01a0b90 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:53.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.736+0000 7ff5d754f640 1 --2- 192.168.123.107:0/2183962775 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5d00ff560 0x7ff5d01a0650 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:53.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.736+0000 7ff5d754f640 1 -- 192.168.123.107:0/2183962775 >> 192.168.123.107:0/2183962775 conn(0x7ff5d00fb3d0 msgr2=0x7ff5d00fbb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:53.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.737+0000 7ff5d754f640 1 -- 192.168.123.107:0/2183962775 shutdown_connections 2026-03-09T19:34:53.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:53.737+0000 7ff5d754f640 1 -- 192.168.123.107:0/2183962775 wait complete. 2026-03-09T19:34:53.816 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-09T19:34:53.976 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:34:54.003 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:53 vm07.local ceph-mon[111841]: from='client.34446 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:54.003 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:53 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:54.003 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:53 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/2183962775' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:34:54.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:53 vm08.local ceph-mon[103420]: from='client.34446 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:54.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:53 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:34:54.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:53 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/2183962775' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T19:34:54.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.229+0000 7efc93271640 1 -- 192.168.123.107:0/751960 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc8c1089d0 msgr2=0x7efc8c108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:54.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.229+0000 7efc93271640 1 --2- 192.168.123.107:0/751960 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc8c1089d0 0x7efc8c108db0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7efc740098e0 tx=0x7efc7402f190 comp rx=0 tx=0).stop 2026-03-09T19:34:54.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.230+0000 7efc93271640 1 -- 192.168.123.107:0/751960 shutdown_connections 2026-03-09T19:34:54.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.230+0000 7efc93271640 1 --2- 192.168.123.107:0/751960 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efc8c1029d0 0x7efc8c102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:54.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.230+0000 7efc93271640 1 --2- 192.168.123.107:0/751960 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc8c1089d0 0x7efc8c108db0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:54.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.230+0000 7efc93271640 1 -- 192.168.123.107:0/751960 >> 192.168.123.107:0/751960 conn(0x7efc8c0fe710 msgr2=0x7efc8c100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:54.228 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.231+0000 7efc93271640 1 -- 192.168.123.107:0/751960 shutdown_connections 2026-03-09T19:34:54.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.231+0000 7efc93271640 1 -- 192.168.123.107:0/751960 wait complete. 2026-03-09T19:34:54.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.231+0000 7efc93271640 1 Processor -- start 2026-03-09T19:34:54.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.231+0000 7efc93271640 1 -- start start 2026-03-09T19:34:54.229 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.232+0000 7efc93271640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efc8c1029d0 0x7efc8c1a0620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:54.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.232+0000 7efc93271640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc8c1089d0 0x7efc8c1a0b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:54.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.232+0000 7efc93271640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efc8c1a1180 con 0x7efc8c1029d0 2026-03-09T19:34:54.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.232+0000 7efc90fe6640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efc8c1029d0 
0x7efc8c1a0620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:54.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.232+0000 7efc90fe6640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efc8c1029d0 0x7efc8c1a0620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:43474/0 (socket says 192.168.123.107:43474) 2026-03-09T19:34:54.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.232+0000 7efc90fe6640 1 -- 192.168.123.107:0/2528940918 learned_addr learned my addr 192.168.123.107:0/2528940918 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:34:54.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.232+0000 7efc93271640 1 -- 192.168.123.107:0/2528940918 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efc8c19a710 con 0x7efc8c1089d0 2026-03-09T19:34:54.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.232+0000 7efc83fff640 1 --2- 192.168.123.107:0/2528940918 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc8c1089d0 0x7efc8c1a0b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:54.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.233+0000 7efc90fe6640 1 -- 192.168.123.107:0/2528940918 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc8c1089d0 msgr2=0x7efc8c1a0b60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:54.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.233+0000 7efc90fe6640 1 --2- 192.168.123.107:0/2528940918 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc8c1089d0 0x7efc8c1a0b60 unknown :-1 
s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:54.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.233+0000 7efc90fe6640 1 -- 192.168.123.107:0/2528940918 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efc7c009660 con 0x7efc8c1029d0 2026-03-09T19:34:54.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.233+0000 7efc90fe6640 1 --2- 192.168.123.107:0/2528940918 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efc8c1029d0 0x7efc8c1a0620 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7efc74005f00 tx=0x7efc74004480 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:54.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.233+0000 7efc81ffb640 1 -- 192.168.123.107:0/2528940918 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efc7403d070 con 0x7efc8c1029d0 2026-03-09T19:34:54.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.233+0000 7efc81ffb640 1 -- 192.168.123.107:0/2528940918 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7efc740045a0 con 0x7efc8c1029d0 2026-03-09T19:34:54.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.233+0000 7efc93271640 1 -- 192.168.123.107:0/2528940918 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efc74009590 con 0x7efc8c1029d0 2026-03-09T19:34:54.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.234+0000 7efc93271640 1 -- 192.168.123.107:0/2528940918 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efc8c19acf0 con 0x7efc8c1029d0 2026-03-09T19:34:54.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.235+0000 7efc81ffb640 1 -- 192.168.123.107:0/2528940918 <== mon.0 
v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efc74041680 con 0x7efc8c1029d0 2026-03-09T19:34:54.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.235+0000 7efc93271640 1 -- 192.168.123.107:0/2528940918 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efc8c104110 con 0x7efc8c1029d0 2026-03-09T19:34:54.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.237+0000 7efc81ffb640 1 -- 192.168.123.107:0/2528940918 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7efc74049050 con 0x7efc8c1029d0 2026-03-09T19:34:54.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.237+0000 7efc81ffb640 1 --2- 192.168.123.107:0/2528940918 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efc5c077820 0x7efc5c079ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:34:54.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.237+0000 7efc81ffb640 1 -- 192.168.123.107:0/2528940918 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7efc740bf3b0 con 0x7efc8c1029d0 2026-03-09T19:34:54.237 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.239+0000 7efc83fff640 1 --2- 192.168.123.107:0/2528940918 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efc5c077820 0x7efc5c079ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:34:54.237 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.239+0000 7efc83fff640 1 --2- 192.168.123.107:0/2528940918 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efc5c077820 0x7efc5c079ce0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto 
rx=0x7efc8c19bb80 tx=0x7efc7c009340 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:34:54.237 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.239+0000 7efc81ffb640 1 -- 192.168.123.107:0/2528940918 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7efc74087a60 con 0x7efc8c1029d0 2026-03-09T19:34:54.381 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.383+0000 7efc93271640 1 -- 192.168.123.107:0/2528940918 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7efc8c10fc80 con 0x7efc8c1029d0 2026-03-09T19:34:54.382 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.385+0000 7efc81ffb640 1 -- 192.168.123.107:0/2528940918 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7efc740871b0 con 0x7efc8c1029d0 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T19:34:54.384 
INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:34:54.384 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:34:54.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.388+0000 7efc93271640 1 -- 192.168.123.107:0/2528940918 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efc5c077820 msgr2=0x7efc5c079ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:54.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.389+0000 7efc93271640 1 --2- 192.168.123.107:0/2528940918 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efc5c077820 0x7efc5c079ce0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7efc8c19bb80 tx=0x7efc7c009340 comp rx=0 tx=0).stop 2026-03-09T19:34:54.387 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.389+0000 7efc93271640 1 -- 192.168.123.107:0/2528940918 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efc8c1029d0 msgr2=0x7efc8c1a0620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:34:54.387 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.389+0000 7efc93271640 1 --2- 192.168.123.107:0/2528940918 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efc8c1029d0 0x7efc8c1a0620 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto 
rx=0x7efc74005f00 tx=0x7efc74004480 comp rx=0 tx=0).stop 2026-03-09T19:34:54.387 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.389+0000 7efc93271640 1 -- 192.168.123.107:0/2528940918 shutdown_connections 2026-03-09T19:34:54.387 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.390+0000 7efc93271640 1 --2- 192.168.123.107:0/2528940918 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7efc5c077820 0x7efc5c079ce0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:54.388 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.390+0000 7efc93271640 1 --2- 192.168.123.107:0/2528940918 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc8c1089d0 0x7efc8c1a0b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:54.388 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.390+0000 7efc93271640 1 --2- 192.168.123.107:0/2528940918 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efc8c1029d0 0x7efc8c1a0620 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:34:54.388 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.390+0000 7efc93271640 1 -- 192.168.123.107:0/2528940918 >> 192.168.123.107:0/2528940918 conn(0x7efc8c0fe710 msgr2=0x7efc8c10c9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:34:54.388 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.390+0000 7efc93271640 1 -- 192.168.123.107:0/2528940918 shutdown_connections 2026-03-09T19:34:54.388 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:34:54.390+0000 7efc93271640 1 -- 192.168.123.107:0/2528940918 wait complete. 
2026-03-09T19:34:54.458 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'echo "wait for servicemap items w/ changing names to refresh"' 2026-03-09T19:34:54.624 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:34:54.847 INFO:teuthology.orchestra.run.vm07.stdout:wait for servicemap items w/ changing names to refresh 2026-03-09T19:34:54.890 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'sleep 60' 2026-03-09T19:34:54.920 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:54 vm07.local ceph-mon[111841]: from='client.44359 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:54.920 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:54 vm07.local ceph-mon[111841]: pgmap v249: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:54.920 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:54 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/2528940918' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:55.054 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:34:55.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:54 vm08.local ceph-mon[103420]: from='client.44359 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:34:55.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:54 vm08.local ceph-mon[103420]: pgmap v249: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:55.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:54 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/2528940918' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:34:57.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:56 vm08.local ceph-mon[103420]: pgmap v250: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:57.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:56 vm07.local ceph-mon[111841]: pgmap v250: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:59.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:34:58 vm08.local ceph-mon[103420]: pgmap v251: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:34:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:34:58 vm07.local ceph-mon[111841]: pgmap v251: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:01.044 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:00 vm08.local ceph-mon[103420]: pgmap v252: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:00 vm07.local 
ceph-mon[111841]: pgmap v252: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:03.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:02 vm08.local ceph-mon[103420]: pgmap v253: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:03.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:02 vm07.local ceph-mon[111841]: pgmap v253: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:04.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:35:04.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:35:05.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:04 vm08.local ceph-mon[103420]: pgmap v254: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:05.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:04 vm07.local ceph-mon[111841]: pgmap v254: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:07.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:06 vm08.local ceph-mon[103420]: pgmap v255: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:07.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:06 vm07.local ceph-mon[111841]: pgmap v255: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:09.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:08 vm08.local ceph-mon[103420]: pgmap v256: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB 
avail 2026-03-09T19:35:09.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:08 vm07.local ceph-mon[111841]: pgmap v256: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:11.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:10 vm08.local ceph-mon[103420]: pgmap v257: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:11.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:10 vm07.local ceph-mon[111841]: pgmap v257: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:13.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:12 vm08.local ceph-mon[103420]: pgmap v258: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:13.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:12 vm07.local ceph-mon[111841]: pgmap v258: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:15.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:14 vm08.local ceph-mon[103420]: pgmap v259: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:14 vm07.local ceph-mon[111841]: pgmap v259: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:17.318 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:16 vm07.local ceph-mon[111841]: pgmap v260: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:17.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:16 vm08.local ceph-mon[103420]: pgmap v260: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:19 vm08.local ceph-mon[103420]: pgmap v261: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB 
avail 2026-03-09T19:35:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:35:19.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:19 vm07.local ceph-mon[111841]: pgmap v261: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:19.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:35:21.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:21 vm08.local ceph-mon[103420]: pgmap v262: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:21.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:21 vm07.local ceph-mon[111841]: pgmap v262: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:22.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:22 vm07.local ceph-mon[111841]: pgmap v263: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:22.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:22 vm08.local ceph-mon[103420]: pgmap v263: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:24.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:24 vm07.local ceph-mon[111841]: pgmap v264: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:25.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:24 vm08.local ceph-mon[103420]: pgmap v264: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:26 vm07.local 
ceph-mon[111841]: pgmap v265: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:26 vm08.local ceph-mon[103420]: pgmap v265: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:28.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:28 vm07.local ceph-mon[111841]: pgmap v266: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:29.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:28 vm08.local ceph-mon[103420]: pgmap v266: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:30.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:30 vm07.local ceph-mon[111841]: pgmap v267: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:31.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:30 vm08.local ceph-mon[103420]: pgmap v267: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:32.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:32 vm07.local ceph-mon[111841]: pgmap v268: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:33.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:32 vm08.local ceph-mon[103420]: pgmap v268: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:33.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:35:34.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: 
dispatch 2026-03-09T19:35:34.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:34 vm07.local ceph-mon[111841]: pgmap v269: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:35.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:34 vm08.local ceph-mon[103420]: pgmap v269: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:36.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:36 vm07.local ceph-mon[111841]: pgmap v270: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:37.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:36 vm08.local ceph-mon[103420]: pgmap v270: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:38 vm07.local ceph-mon[111841]: pgmap v271: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:39.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:38 vm08.local ceph-mon[103420]: pgmap v271: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:41.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:40 vm08.local ceph-mon[103420]: pgmap v272: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:41.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:40 vm07.local ceph-mon[111841]: pgmap v272: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:43.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:42 vm08.local ceph-mon[103420]: pgmap v273: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:43.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:42 vm07.local ceph-mon[111841]: pgmap v273: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 
GiB avail 2026-03-09T19:35:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:44 vm07.local ceph-mon[111841]: pgmap v274: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:45.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:44 vm08.local ceph-mon[103420]: pgmap v274: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:47.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:46 vm07.local ceph-mon[111841]: pgmap v275: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:47.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:46 vm08.local ceph-mon[103420]: pgmap v275: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:48 vm07.local ceph-mon[111841]: pgmap v276: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:35:49.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:48 vm08.local ceph-mon[103420]: pgmap v276: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:49.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:35:50.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:49 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:35:50.344 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:49 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:35:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:50 vm07.local ceph-mon[111841]: pgmap v277: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:35:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:35:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:35:51.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:50 vm08.local ceph-mon[103420]: pgmap v277: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:51.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:35:51.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:35:51.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:35:53.478 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:53 vm07.local ceph-mon[111841]: pgmap v278: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:53.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:53 vm08.local ceph-mon[103420]: pgmap v278: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:54.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:54 vm08.local ceph-mon[103420]: pgmap v279: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:54.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:54 vm07.local ceph-mon[111841]: pgmap v279: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:55.292 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-09T19:35:55.475 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:35:55.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.741+0000 7f0170ece640 1 -- 192.168.123.107:0/4144200872 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f016c072340 msgr2=0x7f016c072720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:55.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.741+0000 7f0170ece640 1 --2- 192.168.123.107:0/4144200872 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f016c072340 0x7f016c072720 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f01540098e0 tx=0x7f015402f190 comp rx=0 tx=0).stop 2026-03-09T19:35:55.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.742+0000 7f0170ece640 1 
-- 192.168.123.107:0/4144200872 shutdown_connections 2026-03-09T19:35:55.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.742+0000 7f0170ece640 1 --2- 192.168.123.107:0/4144200872 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f016c072cf0 0x7f016c10cd90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:55.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.742+0000 7f0170ece640 1 --2- 192.168.123.107:0/4144200872 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f016c072340 0x7f016c072720 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:55.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.742+0000 7f0170ece640 1 -- 192.168.123.107:0/4144200872 >> 192.168.123.107:0/4144200872 conn(0x7f016c06b7f0 msgr2=0x7f016c06bc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:55.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.742+0000 7f0170ece640 1 -- 192.168.123.107:0/4144200872 shutdown_connections 2026-03-09T19:35:55.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.742+0000 7f0170ece640 1 -- 192.168.123.107:0/4144200872 wait complete. 
2026-03-09T19:35:55.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f0170ece640 1 Processor -- start 2026-03-09T19:35:55.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f0170ece640 1 -- start start 2026-03-09T19:35:55.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f0170ece640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f016c072340 0x7f016c114d60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:55.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f0170ece640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f016c072cf0 0x7f016c1152a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:55.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f0170ece640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f016c115930 con 0x7f016c072cf0 2026-03-09T19:35:55.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f0170ece640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f016c1b5cd0 con 0x7f016c072340 2026-03-09T19:35:55.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f016a575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f016c072340 0x7f016c114d60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:55.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f016a575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f016c072340 0x7f016c114d60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.107:35252/0 (socket says 192.168.123.107:35252) 2026-03-09T19:35:55.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f016a575640 1 -- 192.168.123.107:0/179647910 learned_addr learned my addr 192.168.123.107:0/179647910 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:35:55.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f016a575640 1 -- 192.168.123.107:0/179647910 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f016c072cf0 msgr2=0x7f016c1152a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:35:55.741 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f016a575640 1 --2- 192.168.123.107:0/179647910 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f016c072cf0 0x7f016c1152a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:55.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.743+0000 7f016a575640 1 -- 192.168.123.107:0/179647910 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f014c009660 con 0x7f016c072340 2026-03-09T19:35:55.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.744+0000 7f016a575640 1 --2- 192.168.123.107:0/179647910 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f016c072340 0x7f016c114d60 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f0154005f00 tx=0x7f0154004480 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:55.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.744+0000 7f01637fe640 1 -- 192.168.123.107:0/179647910 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f015403d070 con 0x7f016c072340 2026-03-09T19:35:55.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.744+0000 7f01637fe640 1 -- 
192.168.123.107:0/179647910 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f01540045a0 con 0x7f016c072340 2026-03-09T19:35:55.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.744+0000 7f01637fe640 1 -- 192.168.123.107:0/179647910 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0154041620 con 0x7f016c072340 2026-03-09T19:35:55.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.744+0000 7f0170ece640 1 -- 192.168.123.107:0/179647910 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0154009590 con 0x7f016c072340 2026-03-09T19:35:55.742 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.744+0000 7f0170ece640 1 -- 192.168.123.107:0/179647910 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f016c1b6280 con 0x7f016c072340 2026-03-09T19:35:55.743 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.745+0000 7f0170ece640 1 -- 192.168.123.107:0/179647910 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f016c108780 con 0x7f016c072340 2026-03-09T19:35:55.743 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.745+0000 7f01637fe640 1 -- 192.168.123.107:0/179647910 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f015402fbc0 con 0x7f016c072340 2026-03-09T19:35:55.744 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.746+0000 7f01637fe640 1 --2- 192.168.123.107:0/179647910 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f01400776d0 0x7f0140079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:55.744 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.746+0000 7f01637fe640 1 -- 192.168.123.107:0/179647910 <== 
mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f01540be5c0 con 0x7f016c072340 2026-03-09T19:35:55.744 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.746+0000 7f0163fff640 1 --2- 192.168.123.107:0/179647910 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f01400776d0 0x7f0140079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:55.744 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.747+0000 7f0163fff640 1 --2- 192.168.123.107:0/179647910 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f01400776d0 0x7f0140079b90 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f016c116310 tx=0x7f014c009340 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:55.746 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.748+0000 7f01637fe640 1 -- 192.168.123.107:0/179647910 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0154086c70 con 0x7f016c072340 2026-03-09T19:35:55.841 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.842+0000 7f0170ece640 1 -- 192.168.123.107:0/179647910 --> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f016c11cbe0 con 0x7f01400776d0 2026-03-09T19:35:55.845 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.848+0000 7f01637fe640 1 -- 192.168.123.107:0/179647910 <== mgr.24557 v2:192.168.123.107:6800/418954333 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f016c11cbe0 con 0x7f01400776d0 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE 
MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:alertmanager.vm07 vm07 *:9093,9094 running (80s) 68s ago 13m 16.7M - 0.25.0 c8568f914cd2 900f5bd509c9 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm07 vm07 running (106s) 68s ago 13m 10.5M - 19.2.3-678-ge911bdeb 654f31e6858e 143b4d120468 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:ceph-exporter.vm08 vm08 running (105s) 93s ago 13m 10.2M - 19.2.3-678-ge911bdeb 654f31e6858e bb4c6a92b1d5 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm07 vm07 running (6m) 68s ago 13m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 91ed5a6dbf3f 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:crash.vm08 vm08 running (6m) 93s ago 13m 8321k - 19.2.3-678-ge911bdeb 654f31e6858e b2465a9d2305 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:grafana.vm07 vm07 *:3000 running (70s) 68s ago 13m 42.8M - 10.4.0 c8b91775d855 a4189864679c 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.uizncw vm07 running (2m) 68s ago 11m 101M - 19.2.3-678-ge911bdeb 654f31e6858e ccdf01cf9a0b 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm07.zkmcyw vm07 running (2m) 68s ago 11m 21.3M - 19.2.3-678-ge911bdeb 654f31e6858e 15f71649e080 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.jwsqrf vm08 running (111s) 93s ago 11m 15.5M - 19.2.3-678-ge911bdeb 654f31e6858e b1b66121ee0c 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:mds.cephfs.vm08.zcaqju vm08 running (118s) 93s ago 11m 20.7M - 19.2.3-678-ge911bdeb 654f31e6858e 6c9f1adefb0b 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm07.xacuym vm07 *:8443,9283,8765 running (7m) 68s ago 14m 618M - 19.2.3-678-ge911bdeb 654f31e6858e 6c1350e70bfa 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:mgr.vm08.mxylvw 
vm08 *:8443,9283,8765 running (7m) 93s ago 12m 494M - 19.2.3-678-ge911bdeb 654f31e6858e c4c36685d8dc 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm07 vm07 running (7m) 68s ago 14m 70.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e ad39140965d8 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:mon.vm08 vm08 running (7m) 93s ago 12m 58.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e b4a58927ebfd 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm07 vm07 *:9100 running (97s) 68s ago 13m 8992k - 1.7.0 72c9c2088986 b64ceec40b43 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:node-exporter.vm08 vm08 *:9100 running (94s) 93s ago 12m 5758k - 1.7.0 72c9c2088986 d2e9f420f638 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:osd.0 vm07 running (6m) 68s ago 12m 229M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a203aa241656 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:osd.1 vm07 running (4m) 68s ago 12m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 36b65f1069e5 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:osd.2 vm07 running (3m) 68s ago 12m 102M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 1b8bc1f96eb7 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:osd.3 vm08 running (3m) 93s ago 12m 192M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bde783ff786f 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:osd.4 vm08 running (3m) 93s ago 12m 133M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 588104e3b774 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:osd.5 vm08 running (2m) 93s ago 11m 117M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7f9a10e5f49d 2026-03-09T19:35:55.846 INFO:teuthology.orchestra.run.vm07.stdout:prometheus.vm07 vm07 *:9095 running (85s) 68s ago 13m 45.8M - 2.51.0 1d3b7f56885b e0028c6b96d6 2026-03-09T19:35:55.847 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.850+0000 7f0170ece640 1 -- 
192.168.123.107:0/179647910 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f01400776d0 msgr2=0x7f0140079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:55.847 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.850+0000 7f0170ece640 1 --2- 192.168.123.107:0/179647910 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f01400776d0 0x7f0140079b90 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f016c116310 tx=0x7f014c009340 comp rx=0 tx=0).stop 2026-03-09T19:35:55.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.850+0000 7f0170ece640 1 -- 192.168.123.107:0/179647910 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f016c072340 msgr2=0x7f016c114d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:55.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.850+0000 7f0170ece640 1 --2- 192.168.123.107:0/179647910 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f016c072340 0x7f016c114d60 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f0154005f00 tx=0x7f0154004480 comp rx=0 tx=0).stop 2026-03-09T19:35:55.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.850+0000 7f0170ece640 1 -- 192.168.123.107:0/179647910 shutdown_connections 2026-03-09T19:35:55.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.850+0000 7f0170ece640 1 --2- 192.168.123.107:0/179647910 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f01400776d0 0x7f0140079b90 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:55.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.850+0000 7f0170ece640 1 --2- 192.168.123.107:0/179647910 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f016c072cf0 0x7f016c1152a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T19:35:55.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.851+0000 7f0170ece640 1 --2- 192.168.123.107:0/179647910 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f016c072340 0x7f016c114d60 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:55.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.851+0000 7f0170ece640 1 -- 192.168.123.107:0/179647910 >> 192.168.123.107:0/179647910 conn(0x7f016c06b7f0 msgr2=0x7f016c10dda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:55.849 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.851+0000 7f0170ece640 1 -- 192.168.123.107:0/179647910 shutdown_connections 2026-03-09T19:35:55.849 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:55.851+0000 7f0170ece640 1 -- 192.168.123.107:0/179647910 wait complete. 2026-03-09T19:35:55.903 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-09T19:35:56.036 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:35:56.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.253+0000 7f4cc60ec640 1 -- 192.168.123.107:0/1488995172 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cc0075720 msgr2=0x7f4cc0075b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:56.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.253+0000 7f4cc60ec640 1 --2- 192.168.123.107:0/1488995172 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cc0075720 0x7f4cc0075b00 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f4cb0009a00 tx=0x7f4cb002f270 comp 
rx=0 tx=0).stop 2026-03-09T19:35:56.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.253+0000 7f4cc60ec640 1 -- 192.168.123.107:0/1488995172 shutdown_connections 2026-03-09T19:35:56.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.253+0000 7f4cc60ec640 1 --2- 192.168.123.107:0/1488995172 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cc0076040 0x7f4cc0111330 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:56.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.253+0000 7f4cc60ec640 1 --2- 192.168.123.107:0/1488995172 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cc0075720 0x7f4cc0075b00 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:56.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.253+0000 7f4cc60ec640 1 -- 192.168.123.107:0/1488995172 >> 192.168.123.107:0/1488995172 conn(0x7f4cc00fe710 msgr2=0x7f4cc0100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:56.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.254+0000 7f4cc60ec640 1 -- 192.168.123.107:0/1488995172 shutdown_connections 2026-03-09T19:35:56.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.254+0000 7f4cc60ec640 1 -- 192.168.123.107:0/1488995172 wait complete. 
2026-03-09T19:35:56.252 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.254+0000 7f4cc60ec640 1 Processor -- start 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cc60ec640 1 -- start start 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cc60ec640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cc0075720 0x7f4cc0072600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cc60ec640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cc0076040 0x7f4cc0072b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cc60ec640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cc006d720 con 0x7f4cc0075720 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cc60ec640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cc006d890 con 0x7f4cc0076040 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cbeffd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cc0076040 0x7f4cc0072b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cbf7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cc0075720 0x7f4cc0072600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cbf7fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cc0075720 0x7f4cc0072600 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:57512/0 (socket says 192.168.123.107:57512) 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cbf7fe640 1 -- 192.168.123.107:0/803696571 learned_addr learned my addr 192.168.123.107:0/803696571 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cbf7fe640 1 -- 192.168.123.107:0/803696571 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cc0076040 msgr2=0x7f4cc0072b40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cbf7fe640 1 --2- 192.168.123.107:0/803696571 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cc0076040 0x7f4cc0072b40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.255+0000 7f4cbf7fe640 1 -- 192.168.123.107:0/803696571 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4cb0009660 con 0x7f4cc0075720 2026-03-09T19:35:56.253 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.256+0000 7f4cbf7fe640 1 --2- 192.168.123.107:0/803696571 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cc0075720 0x7f4cc0072600 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f4cb00040c0 tx=0x7f4cb0031d20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:56.254 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.256+0000 7f4cbcff9640 1 -- 192.168.123.107:0/803696571 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4cb0031e90 con 0x7f4cc0075720 2026-03-09T19:35:56.254 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.256+0000 7f4cbcff9640 1 -- 192.168.123.107:0/803696571 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f4cb0002a00 con 0x7f4cc0075720 2026-03-09T19:35:56.254 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.256+0000 7f4cbcff9640 1 -- 192.168.123.107:0/803696571 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4cb0038470 con 0x7f4cc0075720 2026-03-09T19:35:56.254 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.256+0000 7f4cc60ec640 1 -- 192.168.123.107:0/803696571 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4cc006da90 con 0x7f4cc0075720 2026-03-09T19:35:56.254 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.256+0000 7f4cc60ec640 1 -- 192.168.123.107:0/803696571 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4cc006def0 con 0x7f4cc0075720 2026-03-09T19:35:56.256 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.258+0000 7f4cbcff9640 1 -- 192.168.123.107:0/803696571 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4cb003f030 con 0x7f4cc0075720 2026-03-09T19:35:56.256 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.258+0000 7f4cc60ec640 1 -- 192.168.123.107:0/803696571 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4cc0076e60 con 0x7f4cc0075720 2026-03-09T19:35:56.258 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.258+0000 7f4cbcff9640 1 --2- 
192.168.123.107:0/803696571 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c94077680 0x7f4c94079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:56.258 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.258+0000 7f4cbcff9640 1 -- 192.168.123.107:0/803696571 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f4cb00be1e0 con 0x7f4cc0075720 2026-03-09T19:35:56.258 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.259+0000 7f4cbeffd640 1 --2- 192.168.123.107:0/803696571 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c94077680 0x7f4c94079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:56.259 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.260+0000 7f4cbeffd640 1 --2- 192.168.123.107:0/803696571 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c94077680 0x7f4c94079b40 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f4cc006eea0 tx=0x7f4cac009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:56.259 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.261+0000 7f4cbcff9640 1 -- 192.168.123.107:0/803696571 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4cb0086810 con 0x7f4cc0075720 2026-03-09T19:35:56.381 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.383+0000 7f4cc60ec640 1 -- 192.168.123.107:0/803696571 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f4cc006ec50 con 0x7f4cc0075720 2026-03-09T19:35:56.384 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.386+0000 7f4cbcff9640 1 -- 192.168.123.107:0/803696571 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f4cb0086810 con 0x7f4cc0075720 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout:{ 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: "mon": { 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: "mgr": { 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: "osd": { 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: "mds": { 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: "overall": { 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-09T19:35:56.384 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-09T19:35:56.386 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.388+0000 7f4cc60ec640 1 -- 192.168.123.107:0/803696571 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c94077680 msgr2=0x7f4c94079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:56.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.388+0000 7f4cc60ec640 1 --2- 192.168.123.107:0/803696571 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c94077680 0x7f4c94079b40 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f4cc006eea0 tx=0x7f4cac009290 comp rx=0 tx=0).stop 2026-03-09T19:35:56.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.388+0000 7f4cc60ec640 1 -- 192.168.123.107:0/803696571 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cc0075720 msgr2=0x7f4cc0072600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:56.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.388+0000 7f4cc60ec640 1 --2- 192.168.123.107:0/803696571 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cc0075720 0x7f4cc0072600 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f4cb00040c0 tx=0x7f4cb0031d20 comp rx=0 tx=0).stop 2026-03-09T19:35:56.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.388+0000 7f4cc60ec640 1 -- 192.168.123.107:0/803696571 shutdown_connections 2026-03-09T19:35:56.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.388+0000 7f4cc60ec640 1 --2- 192.168.123.107:0/803696571 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f4c94077680 0x7f4c94079b40 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:56.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.388+0000 7f4cc60ec640 1 --2- 192.168.123.107:0/803696571 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4cc0076040 
0x7f4cc0072b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:56.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.388+0000 7f4cc60ec640 1 --2- 192.168.123.107:0/803696571 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4cc0075720 0x7f4cc0072600 unknown :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:56.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.388+0000 7f4cc60ec640 1 -- 192.168.123.107:0/803696571 >> 192.168.123.107:0/803696571 conn(0x7f4cc00fe710 msgr2=0x7f4cc00ffe60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:56.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.389+0000 7f4cc60ec640 1 -- 192.168.123.107:0/803696571 shutdown_connections 2026-03-09T19:35:56.386 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.389+0000 7f4cc60ec640 1 -- 192.168.123.107:0/803696571 wait complete. 2026-03-09T19:35:56.423 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 1'"'"'' 2026-03-09T19:35:56.560 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:35:56.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.799+0000 7f405e573640 1 -- 192.168.123.107:0/104013100 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40581029d0 msgr2=0x7f4058102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:56.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.799+0000 7f405e573640 1 --2- 192.168.123.107:0/104013100 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f40581029d0 0x7f4058102e30 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f40440099b0 tx=0x7f404402f220 comp rx=0 tx=0).stop 2026-03-09T19:35:56.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.800+0000 7f405e573640 1 -- 192.168.123.107:0/104013100 shutdown_connections 2026-03-09T19:35:56.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.800+0000 7f405e573640 1 --2- 192.168.123.107:0/104013100 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40581029d0 0x7f4058102e30 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:56.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.800+0000 7f405e573640 1 --2- 192.168.123.107:0/104013100 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f40581089d0 0x7f4058108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:56.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.800+0000 7f405e573640 1 -- 192.168.123.107:0/104013100 >> 192.168.123.107:0/104013100 conn(0x7f40580fe710 msgr2=0x7f4058100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:56.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.800+0000 7f405e573640 1 -- 192.168.123.107:0/104013100 shutdown_connections 2026-03-09T19:35:56.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.801+0000 7f405e573640 1 -- 192.168.123.107:0/104013100 wait complete. 
2026-03-09T19:35:56.799 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.801+0000 7f405e573640 1 Processor -- start 2026-03-09T19:35:56.799 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.801+0000 7f405e573640 1 -- start start 2026-03-09T19:35:56.799 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.801+0000 7f405e573640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f40581029d0 0x7f40581a0640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:56.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.802+0000 7f405e573640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40581089d0 0x7f40581a0b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:56.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.802+0000 7f405e573640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40581a11d0 con 0x7f40581089d0 2026-03-09T19:35:56.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.802+0000 7f405e573640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f405819a730 con 0x7f40581029d0 2026-03-09T19:35:56.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.802+0000 7f40577fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40581089d0 0x7f40581a0b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:56.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.802+0000 7f40577fe640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40581089d0 0x7f40581a0b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:57528/0 (socket says 192.168.123.107:57528) 2026-03-09T19:35:56.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.802+0000 7f40577fe640 1 -- 192.168.123.107:0/4181961174 learned_addr learned my addr 192.168.123.107:0/4181961174 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:35:56.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.802+0000 7f40577fe640 1 -- 192.168.123.107:0/4181961174 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f40581029d0 msgr2=0x7f40581a0640 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:56.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.802+0000 7f4057fff640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f40581029d0 0x7f40581a0640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:56.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.803+0000 7f40577fe640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f40581029d0 0x7f40581a0640 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:56.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.803+0000 7f40577fe640 1 -- 192.168.123.107:0/4181961174 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4044009660 con 0x7f40581089d0 2026-03-09T19:35:56.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.803+0000 7f4057fff640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f40581029d0 0x7f40581a0640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:35:56.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.803+0000 7f40577fe640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40581089d0 0x7f40581a0b80 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f404402f730 tx=0x7f40440043d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:56.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.803+0000 7f40557fa640 1 -- 192.168.123.107:0/4181961174 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f404403d070 con 0x7f40581089d0 2026-03-09T19:35:56.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.803+0000 7f40557fa640 1 -- 192.168.123.107:0/4181961174 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f404402fc90 con 0x7f40581089d0 2026-03-09T19:35:56.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.803+0000 7f40557fa640 1 -- 192.168.123.107:0/4181961174 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f40440417b0 con 0x7f40581089d0 2026-03-09T19:35:56.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.803+0000 7f405e573640 1 -- 192.168.123.107:0/4181961174 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f405819a9b0 con 0x7f40581089d0 2026-03-09T19:35:56.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.803+0000 7f405e573640 1 -- 192.168.123.107:0/4181961174 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f405819adc0 con 0x7f40581089d0 2026-03-09T19:35:56.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.804+0000 7f405e573640 1 -- 192.168.123.107:0/4181961174 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f4024005350 con 0x7f40581089d0 2026-03-09T19:35:56.805 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.807+0000 7f40557fa640 1 -- 192.168.123.107:0/4181961174 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4044038730 con 0x7f40581089d0 2026-03-09T19:35:56.806 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.807+0000 7f40557fa640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f402c0779b0 0x7f402c079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:56.806 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.807+0000 7f40557fa640 1 -- 192.168.123.107:0/4181961174 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f40440becf0 con 0x7f40581089d0 2026-03-09T19:35:56.806 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.807+0000 7f4057fff640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f402c0779b0 0x7f402c079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:56.806 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.808+0000 7f4057fff640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f402c0779b0 0x7f402c079e70 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f4048008bc0 tx=0x7f4048005e90 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:56.806 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.808+0000 7f40557fa640 1 -- 192.168.123.107:0/4181961174 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f40440873a0 con 0x7f40581089d0 2026-03-09T19:35:56.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:56 vm08.local ceph-mon[103420]: pgmap v280: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:56.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:56 vm08.local ceph-mon[103420]: from='client.44371 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:35:56.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:56 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/803696571' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:35:56.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:56 vm07.local ceph-mon[111841]: pgmap v280: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:56.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:56 vm07.local ceph-mon[111841]: from='client.44371 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T19:35:56.931 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:56 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/803696571' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:35:56.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.933+0000 7f405e573640 1 -- 192.168.123.107:0/4181961174 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f40240058d0 con 0x7f40581089d0 2026-03-09T19:35:56.932 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.934+0000 7f40557fa640 1 -- 192.168.123.107:0/4181961174 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f4044086af0 con 0x7f40581089d0 2026-03-09T19:35:56.935 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.937+0000 7f405e573640 1 -- 192.168.123.107:0/4181961174 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f402c0779b0 msgr2=0x7f402c079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:56.935 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.937+0000 7f405e573640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f402c0779b0 0x7f402c079e70 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f4048008bc0 tx=0x7f4048005e90 comp rx=0 tx=0).stop 2026-03-09T19:35:56.935 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.937+0000 7f405e573640 1 -- 192.168.123.107:0/4181961174 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40581089d0 msgr2=0x7f40581a0b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:56.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.937+0000 7f405e573640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40581089d0 0x7f40581a0b80 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f404402f730 tx=0x7f40440043d0 comp rx=0 tx=0).stop 
2026-03-09T19:35:56.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.937+0000 7f405e573640 1 -- 192.168.123.107:0/4181961174 shutdown_connections 2026-03-09T19:35:56.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.937+0000 7f405e573640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f402c0779b0 0x7f402c079e70 secure :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f4048008bc0 tx=0x7f4048005e90 comp rx=0 tx=0).stop 2026-03-09T19:35:56.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.937+0000 7f405e573640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40581089d0 0x7f40581a0b80 secure :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f404402f730 tx=0x7f40440043d0 comp rx=0 tx=0).stop 2026-03-09T19:35:56.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.937+0000 7f405e573640 1 --2- 192.168.123.107:0/4181961174 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f40581029d0 0x7f40581a0640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:56.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.937+0000 7f405e573640 1 -- 192.168.123.107:0/4181961174 >> 192.168.123.107:0/4181961174 conn(0x7f40580fe710 msgr2=0x7f405810c990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:56.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.938+0000 7f405e573640 1 -- 192.168.123.107:0/4181961174 shutdown_connections 2026-03-09T19:35:56.936 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:56.938+0000 7f405e573640 1 -- 192.168.123.107:0/4181961174 wait complete. 
2026-03-09T19:35:56.943 INFO:teuthology.orchestra.run.vm07.stdout:true 2026-03-09T19:35:56.997 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | keys'"'"' | grep $sha1' 2026-03-09T19:35:57.159 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:35:57.410 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.411+0000 7f45d15f5640 1 -- 192.168.123.107:0/3830367568 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45cc108820 msgr2=0x7f45cc108c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:57.410 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.411+0000 7f45d15f5640 1 --2- 192.168.123.107:0/3830367568 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45cc108820 0x7f45cc108c00 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f45c00099b0 tx=0x7f45c002f240 comp rx=0 tx=0).stop 2026-03-09T19:35:57.410 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.412+0000 7f45d15f5640 1 -- 192.168.123.107:0/3830367568 shutdown_connections 2026-03-09T19:35:57.410 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.412+0000 7f45d15f5640 1 --2- 192.168.123.107:0/3830367568 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f45cc102820 0x7f45cc102c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:57.410 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.412+0000 7f45d15f5640 1 --2- 192.168.123.107:0/3830367568 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45cc108820 0x7f45cc108c00 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:57.410 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.412+0000 7f45d15f5640 1 -- 192.168.123.107:0/3830367568 >> 192.168.123.107:0/3830367568 conn(0x7f45cc0fe580 msgr2=0x7f45cc1009a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:57.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.412+0000 7f45d15f5640 1 -- 192.168.123.107:0/3830367568 shutdown_connections 2026-03-09T19:35:57.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.413+0000 7f45d15f5640 1 -- 192.168.123.107:0/3830367568 wait complete. 2026-03-09T19:35:57.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.413+0000 7f45d15f5640 1 Processor -- start 2026-03-09T19:35:57.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.413+0000 7f45d15f5640 1 -- start start 2026-03-09T19:35:57.411 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.413+0000 7f45d15f5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f45cc102820 0x7f45cc079600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:57.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.413+0000 7f45d15f5640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45cc075660 0x7f45cc075ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:57.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.413+0000 7f45d15f5640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f45cc076090 con 0x7f45cc102820 2026-03-09T19:35:57.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.413+0000 7f45d15f5640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f45cc076200 con 0x7f45cc075660 2026-03-09T19:35:57.412 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.414+0000 7f45ca7fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45cc075660 0x7f45cc075ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:57.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.414+0000 7f45caffd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f45cc102820 0x7f45cc079600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:57.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.414+0000 7f45ca7fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45cc075660 0x7f45cc075ac0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:35310/0 (socket says 192.168.123.107:35310) 2026-03-09T19:35:57.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.414+0000 7f45ca7fc640 1 -- 192.168.123.107:0/21401851 learned_addr learned my addr 192.168.123.107:0/21401851 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:35:57.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.415+0000 7f45caffd640 1 -- 192.168.123.107:0/21401851 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45cc075660 msgr2=0x7f45cc075ac0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:57.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.415+0000 7f45caffd640 1 --2- 192.168.123.107:0/21401851 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45cc075660 0x7f45cc075ac0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:57.413 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.415+0000 7f45caffd640 1 -- 192.168.123.107:0/21401851 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f45b4009590 con 0x7f45cc102820 2026-03-09T19:35:57.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.415+0000 7f45caffd640 1 --2- 192.168.123.107:0/21401851 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f45cc102820 0x7f45cc079600 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f45c0002410 tx=0x7f45c0004830 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:57.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.416+0000 7f45abfff640 1 -- 192.168.123.107:0/21401851 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f45c003d070 con 0x7f45cc102820 2026-03-09T19:35:57.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.416+0000 7f45abfff640 1 -- 192.168.123.107:0/21401851 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f45c0002bd0 con 0x7f45cc102820 2026-03-09T19:35:57.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.416+0000 7f45abfff640 1 -- 192.168.123.107:0/21401851 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f45c0038d00 con 0x7f45cc102820 2026-03-09T19:35:57.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.416+0000 7f45d15f5640 1 -- 192.168.123.107:0/21401851 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f45c0009660 con 0x7f45cc102820 2026-03-09T19:35:57.415 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.416+0000 7f45d15f5640 1 -- 192.168.123.107:0/21401851 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f45cc071790 con 0x7f45cc102820 
2026-03-09T19:35:57.415 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.417+0000 7f45d15f5640 1 -- 192.168.123.107:0/21401851 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4598005350 con 0x7f45cc102820 2026-03-09T19:35:57.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.422+0000 7f45abfff640 1 -- 192.168.123.107:0/21401851 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f45c0004360 con 0x7f45cc102820 2026-03-09T19:35:57.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.422+0000 7f45abfff640 1 --2- 192.168.123.107:0/21401851 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f45ac0777c0 0x7f45ac079c80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:57.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.422+0000 7f45abfff640 1 -- 192.168.123.107:0/21401851 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f45c00be670 con 0x7f45cc102820 2026-03-09T19:35:57.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.422+0000 7f45abfff640 1 -- 192.168.123.107:0/21401851 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f45c00beaf0 con 0x7f45cc102820 2026-03-09T19:35:57.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.422+0000 7f45ca7fc640 1 --2- 192.168.123.107:0/21401851 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f45ac0777c0 0x7f45ac079c80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:57.424 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.426+0000 7f45ca7fc640 1 --2- 
192.168.123.107:0/21401851 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f45ac0777c0 0x7f45ac079c80 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f45b40098d0 tx=0x7f45b4009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:57.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.553+0000 7f45d15f5640 1 -- 192.168.123.107:0/21401851 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f45980058d0 con 0x7f45cc102820 2026-03-09T19:35:57.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.554+0000 7f45abfff640 1 -- 192.168.123.107:0/21401851 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f45c0086dd0 con 0x7f45cc102820 2026-03-09T19:35:57.557 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.560+0000 7f45a9ffb640 1 -- 192.168.123.107:0/21401851 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f45ac0777c0 msgr2=0x7f45ac079c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:57.557 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.560+0000 7f45a9ffb640 1 --2- 192.168.123.107:0/21401851 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f45ac0777c0 0x7f45ac079c80 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f45b40098d0 tx=0x7f45b4009290 comp rx=0 tx=0).stop 2026-03-09T19:35:57.558 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.560+0000 7f45a9ffb640 1 -- 192.168.123.107:0/21401851 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f45cc102820 msgr2=0x7f45cc079600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:57.558 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.560+0000 7f45a9ffb640 1 --2- 192.168.123.107:0/21401851 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f45cc102820 0x7f45cc079600 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f45c0002410 tx=0x7f45c0004830 comp rx=0 tx=0).stop 2026-03-09T19:35:57.558 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.561+0000 7f45a9ffb640 1 -- 192.168.123.107:0/21401851 shutdown_connections 2026-03-09T19:35:57.558 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.561+0000 7f45a9ffb640 1 --2- 192.168.123.107:0/21401851 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f45ac0777c0 0x7f45ac079c80 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:57.559 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.561+0000 7f45a9ffb640 1 --2- 192.168.123.107:0/21401851 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45cc075660 0x7f45cc075ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:57.559 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.561+0000 7f45a9ffb640 1 --2- 192.168.123.107:0/21401851 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f45cc102820 0x7f45cc079600 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:57.559 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.561+0000 7f45a9ffb640 1 -- 192.168.123.107:0/21401851 >> 192.168.123.107:0/21401851 conn(0x7f45cc0fe580 msgr2=0x7f45cc0ff5a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:57.559 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.561+0000 7f45a9ffb640 1 -- 192.168.123.107:0/21401851 shutdown_connections 2026-03-09T19:35:57.559 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.561+0000 7f45a9ffb640 1 -- 192.168.123.107:0/21401851 wait complete. 
2026-03-09T19:35:57.566 INFO:teuthology.orchestra.run.vm07.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-09T19:35:57.610 DEBUG:teuthology.parallel:result is None 2026-03-09T19:35:57.610 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T19:35:57.613 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm07.local 2026-03-09T19:35:57.613 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- bash -c 'ceph fs dump' 2026-03-09T19:35:57.762 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:35:57.802 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:57 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/4181961174' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:35:57.802 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:57 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/21401851' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:35:57.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.990+0000 7f0d41e24640 1 -- 192.168.123.107:0/1181313196 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d3c076040 msgr2=0x7f0d3c111330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:57.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.990+0000 7f0d41e24640 1 --2- 192.168.123.107:0/1181313196 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d3c076040 0x7f0d3c111330 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f0d280099b0 tx=0x7f0d2802f220 comp rx=0 tx=0).stop 2026-03-09T19:35:57.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.991+0000 7f0d41e24640 1 -- 192.168.123.107:0/1181313196 shutdown_connections 2026-03-09T19:35:57.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.991+0000 7f0d41e24640 1 --2- 192.168.123.107:0/1181313196 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d3c076040 0x7f0d3c111330 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:57.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.991+0000 7f0d41e24640 1 --2- 192.168.123.107:0/1181313196 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d3c075720 0x7f0d3c075b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:57.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.991+0000 7f0d41e24640 1 -- 192.168.123.107:0/1181313196 >> 192.168.123.107:0/1181313196 conn(0x7f0d3c0fe710 msgr2=0x7f0d3c100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:57.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.991+0000 7f0d41e24640 1 -- 192.168.123.107:0/1181313196 shutdown_connections 2026-03-09T19:35:57.989 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.991+0000 7f0d41e24640 1 -- 192.168.123.107:0/1181313196 wait complete. 2026-03-09T19:35:57.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.991+0000 7f0d41e24640 1 Processor -- start 2026-03-09T19:35:57.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.991+0000 7f0d41e24640 1 -- start start 2026-03-09T19:35:57.989 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.992+0000 7f0d41e24640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d3c075720 0x7f0d3c10bd80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:57.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.992+0000 7f0d41e24640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d3c076040 0x7f0d3c10c2c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:57.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.992+0000 7f0d41e24640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d3c10c8c0 con 0x7f0d3c075720 2026-03-09T19:35:57.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.992+0000 7f0d41e24640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d3c10ca30 con 0x7f0d3c076040 2026-03-09T19:35:57.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.992+0000 7f0d3affd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d3c076040 0x7f0d3c10c2c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:57.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.992+0000 7f0d3affd640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d3c076040 0x7f0d3c10c2c0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:35332/0 (socket says 192.168.123.107:35332) 2026-03-09T19:35:57.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.992+0000 7f0d3affd640 1 -- 192.168.123.107:0/1118347671 learned_addr learned my addr 192.168.123.107:0/1118347671 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:35:57.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.992+0000 7f0d3affd640 1 -- 192.168.123.107:0/1118347671 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d3c075720 msgr2=0x7f0d3c10bd80 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:35:57.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.992+0000 7f0d3b7fe640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d3c075720 0x7f0d3c10bd80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:57.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.993+0000 7f0d3affd640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d3c075720 0x7f0d3c10bd80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:57.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.993+0000 7f0d3affd640 1 -- 192.168.123.107:0/1118347671 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0d28009660 con 0x7f0d3c076040 2026-03-09T19:35:57.990 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.993+0000 7f0d3b7fe640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d3c075720 0x7f0d3c10bd80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:35:57.991 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.993+0000 7f0d3affd640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d3c076040 0x7f0d3c10c2c0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f0d28002410 tx=0x7f0d28031cd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:57.991 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.993+0000 7f0d38ff9640 1 -- 192.168.123.107:0/1118347671 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0d2803d070 con 0x7f0d3c076040 2026-03-09T19:35:57.991 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.993+0000 7f0d41e24640 1 -- 192.168.123.107:0/1118347671 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0d3c1ad430 con 0x7f0d3c076040 2026-03-09T19:35:57.991 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.993+0000 7f0d41e24640 1 -- 192.168.123.107:0/1118347671 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0d3c1ad920 con 0x7f0d3c076040 2026-03-09T19:35:57.991 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.993+0000 7f0d41e24640 1 -- 192.168.123.107:0/1118347671 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0d08005350 con 0x7f0d3c076040 2026-03-09T19:35:57.991 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.994+0000 7f0d38ff9640 1 -- 192.168.123.107:0/1118347671 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f0d28004440 con 0x7f0d3c076040 2026-03-09T19:35:57.991 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.994+0000 7f0d38ff9640 1 -- 192.168.123.107:0/1118347671 <== mon.1 
v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0d28038e50 con 0x7f0d3c076040 2026-03-09T19:35:57.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.995+0000 7f0d38ff9640 1 -- 192.168.123.107:0/1118347671 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0d2802fa80 con 0x7f0d3c076040 2026-03-09T19:35:57.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.995+0000 7f0d38ff9640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0d100778e0 0x7f0d10079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:57.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.995+0000 7f0d38ff9640 1 -- 192.168.123.107:0/1118347671 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f0d280bede0 con 0x7f0d3c076040 2026-03-09T19:35:57.993 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.995+0000 7f0d3b7fe640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0d100778e0 0x7f0d10079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:57.994 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.996+0000 7f0d3b7fe640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0d100778e0 0x7f0d10079da0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f0d20008bc0 tx=0x7f0d20005e90 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:57.995 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:57.997+0000 7f0d38ff9640 1 -- 192.168.123.107:0/1118347671 <== mon.1 v2:192.168.123.108:3300/0 6 
==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0d28087540 con 0x7f0d3c076040 2026-03-09T19:35:58.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:57 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/4181961174' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:35:58.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:57 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/21401851' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T19:35:58.106 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.107+0000 7f0d41e24640 1 -- 192.168.123.107:0/1118347671 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f0d080058d0 con 0x7f0d3c076040 2026-03-09T19:35:58.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.109+0000 7f0d38ff9640 1 -- 192.168.123.107:0/1118347671 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 33 v33) v1 ==== 76+0+1990 (secure 0 0 0) 0x7f0d28086c90 con 0x7f0d3c076040 2026-03-09T19:35:58.109 INFO:teuthology.orchestra.run.vm07.stdout:e33 2026-03-09T19:35:58.109 INFO:teuthology.orchestra.run.vm07.stdout:btime 2026-03-09T19:34:09:773803+0000 2026-03-09T19:35:58.109 INFO:teuthology.orchestra.run.vm07.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T19:35:58.109 INFO:teuthology.orchestra.run.vm07.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T19:35:58.109 INFO:teuthology.orchestra.run.vm07.stdout:legacy client fscid: 1 2026-03-09T19:35:58.109 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:35:58.109 
INFO:teuthology.orchestra.run.vm07.stdout:Filesystem 'cephfs' (1) 2026-03-09T19:35:58.109 INFO:teuthology.orchestra.run.vm07.stdout:fs_name cephfs 2026-03-09T19:35:58.109 INFO:teuthology.orchestra.run.vm07.stdout:epoch 33 2026-03-09T19:35:58.109 INFO:teuthology.orchestra.run.vm07.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T19:35:58.109 INFO:teuthology.orchestra.run.vm07.stdout:created 2026-03-09T19:24:23.601314+0000 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:modified 2026-03-09T19:34:09.773799+0000 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:tableserver 0 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:root 0 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:session_timeout 60 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:session_autoclose 300 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:max_file_size 1099511627776 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:max_xattr_size 65536 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:required_client_features {} 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:last_failure 0 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:last_failure_osd_epoch 102 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:max_mds 2 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:in 0,1 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:up {0=34382,1=34386} 2026-03-09T19:35:58.110 
INFO:teuthology.orchestra.run.vm07.stdout:failed 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:damaged 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:stopped 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:data_pools [3] 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:metadata_pool 2 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:inline_data disabled 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:balancer 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:bal_rank_mask -1 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:standby_count_wanted 1 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:qdb_cluster leader: 34382 members: 34382,34386 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.uizncw{0:34382} state up:active seq 9 join_fscid=1 addr [v2:192.168.123.107:6826/2856024060,v1:192.168.123.107:6827/2856024060] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm07.zkmcyw{1:34386} state up:active seq 7 join_fscid=1 addr [v2:192.168.123.107:6828/670620212,v1:192.168.123.107:6829/670620212] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:Standby daemons: 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.jwsqrf{-1:34410} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6826/2173796097,v1:192.168.123.108:6827/2173796097] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stdout:[mds.cephfs.vm08.zcaqju{-1:44337} state up:standby seq 1 join_fscid=1 addr 
[v2:192.168.123.108:6824/3904856878,v1:192.168.123.108:6825/3904856878] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 33 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.112+0000 7f0d41e24640 1 -- 192.168.123.107:0/1118347671 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0d100778e0 msgr2=0x7f0d10079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:58.110 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.112+0000 7f0d41e24640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0d100778e0 0x7f0d10079da0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f0d20008bc0 tx=0x7f0d20005e90 comp rx=0 tx=0).stop 2026-03-09T19:35:58.111 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.113+0000 7f0d41e24640 1 -- 192.168.123.107:0/1118347671 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d3c076040 msgr2=0x7f0d3c10c2c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:58.111 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.113+0000 7f0d41e24640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d3c076040 0x7f0d3c10c2c0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f0d28002410 tx=0x7f0d28031cd0 comp rx=0 tx=0).stop 2026-03-09T19:35:58.111 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.113+0000 7f0d41e24640 1 -- 192.168.123.107:0/1118347671 shutdown_connections 2026-03-09T19:35:58.111 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.113+0000 7f0d41e24640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f0d100778e0 0x7f0d10079da0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T19:35:58.111 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.113+0000 7f0d41e24640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d3c076040 0x7f0d3c10c2c0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:58.111 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.113+0000 7f0d41e24640 1 --2- 192.168.123.107:0/1118347671 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0d3c075720 0x7f0d3c10bd80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:58.111 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.114+0000 7f0d41e24640 1 -- 192.168.123.107:0/1118347671 >> 192.168.123.107:0/1118347671 conn(0x7f0d3c0fe710 msgr2=0x7f0d3c0ffce0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:58.112 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.114+0000 7f0d41e24640 1 -- 192.168.123.107:0/1118347671 shutdown_connections 2026-03-09T19:35:58.112 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.114+0000 7f0d41e24640 1 -- 192.168.123.107:0/1118347671 wait complete. 2026-03-09T19:35:58.170 INFO:teuthology.run_tasks:Running task fs.post_upgrade_checks... 
2026-03-09T19:35:58.172 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 2026-03-09T19:35:58.323 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:35:58.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.547+0000 7f774cc88640 1 -- 192.168.123.107:0/3776309592 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7748072340 msgr2=0x7f7748072720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:58.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.547+0000 7f774cc88640 1 --2- 192.168.123.107:0/3776309592 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7748072340 0x7f7748072720 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f77380099b0 tx=0x7f773802f240 comp rx=0 tx=0).stop 2026-03-09T19:35:58.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.548+0000 7f774cc88640 1 -- 192.168.123.107:0/3776309592 shutdown_connections 2026-03-09T19:35:58.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.548+0000 7f774cc88640 1 --2- 192.168.123.107:0/3776309592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7748072cf0 0x7f774810cd90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:58.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.548+0000 7f774cc88640 1 --2- 192.168.123.107:0/3776309592 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7748072340 0x7f7748072720 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:58.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.548+0000 7f774cc88640 1 -- 192.168.123.107:0/3776309592 >> 192.168.123.107:0/3776309592 conn(0x7f774806b7f0 
msgr2=0x7f774806bc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:58.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.548+0000 7f774cc88640 1 -- 192.168.123.107:0/3776309592 shutdown_connections 2026-03-09T19:35:58.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.548+0000 7f774cc88640 1 -- 192.168.123.107:0/3776309592 wait complete. 2026-03-09T19:35:58.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.548+0000 7f774cc88640 1 Processor -- start 2026-03-09T19:35:58.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.548+0000 7f774cc88640 1 -- start start 2026-03-09T19:35:58.546 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f774cc88640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7748072340 0x7f77481ad500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f774cc88640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7748072cf0 0x7f77481ada40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f774cc88640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77481a75f0 con 0x7f7748072340 2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f774cc88640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77481a7760 con 0x7f7748072cf0 2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f7746575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7748072340 0x7f77481ad500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f7746575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7748072340 0x7f77481ad500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:57592/0 (socket says 192.168.123.107:57592) 2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f7746575640 1 -- 192.168.123.107:0/1989669145 learned_addr learned my addr 192.168.123.107:0/1989669145 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f773ffff640 1 --2- 192.168.123.107:0/1989669145 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7748072cf0 0x7f77481ada40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f773ffff640 1 -- 192.168.123.107:0/1989669145 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7748072340 msgr2=0x7f77481ad500 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f773ffff640 1 --2- 192.168.123.107:0/1989669145 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7748072340 0x7f77481ad500 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f773ffff640 1 -- 192.168.123.107:0/1989669145 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7738009660 con 0x7f7748072cf0 
2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.549+0000 7f773ffff640 1 --2- 192.168.123.107:0/1989669145 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7748072cf0 0x7f77481ada40 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f7748069a50 tx=0x7f772800ede0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:58.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.550+0000 7f773f7fe640 1 -- 192.168.123.107:0/1989669145 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f772800cc70 con 0x7f7748072cf0 2026-03-09T19:35:58.548 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.550+0000 7f773f7fe640 1 -- 192.168.123.107:0/1989669145 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f7728004590 con 0x7f7748072cf0 2026-03-09T19:35:58.548 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.550+0000 7f774cc88640 1 -- 192.168.123.107:0/1989669145 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f77481a7a40 con 0x7f7748072cf0 2026-03-09T19:35:58.548 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.550+0000 7f773f7fe640 1 -- 192.168.123.107:0/1989669145 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7728010430 con 0x7f7748072cf0 2026-03-09T19:35:58.548 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.550+0000 7f774cc88640 1 -- 192.168.123.107:0/1989669145 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f77481a7f90 con 0x7f7748072cf0 2026-03-09T19:35:58.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.551+0000 7f773f7fe640 1 -- 192.168.123.107:0/1989669145 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f77280026e0 con 
0x7f7748072cf0 2026-03-09T19:35:58.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.551+0000 7f773f7fe640 1 --2- 192.168.123.107:0/1989669145 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f77140776d0 0x7f7714079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:58.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.552+0000 7f7746575640 1 --2- 192.168.123.107:0/1989669145 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f77140776d0 0x7f7714079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:58.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.552+0000 7f773f7fe640 1 -- 192.168.123.107:0/1989669145 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f7728014070 con 0x7f7748072cf0 2026-03-09T19:35:58.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.552+0000 7f7746575640 1 --2- 192.168.123.107:0/1989669145 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f77140776d0 0x7f7714079b90 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f7738002410 tx=0x7f773803a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:58.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.552+0000 7f774cc88640 1 -- 192.168.123.107:0/1989669145 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7748108780 con 0x7f7748072cf0 2026-03-09T19:35:58.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.555+0000 7f773f7fe640 1 -- 192.168.123.107:0/1989669145 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) 
v1 ==== 72+0+195034 (secure 0 0 0) 0x7f772809e050 con 0x7f7748072cf0 2026-03-09T19:35:58.672 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.673+0000 7f774cc88640 1 -- 192.168.123.107:0/1989669145 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f77481a8b00 con 0x7f7748072cf0 2026-03-09T19:35:58.672 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.675+0000 7f773f7fe640 1 -- 192.168.123.107:0/1989669145 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 33 v33) v1 ==== 94+0+5284 (secure 0 0 0) 0x7f7728061ac0 con 0x7f7748072cf0 2026-03-09T19:35:58.673 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:35:58.673 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":33,"btime":"2026-03-09T19:34:09:773803+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34410,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2173796097","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2173796097},{"type":"v1","addr":"192.168.123.108:6827","nonce":2173796097}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is 
stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":28},{"gid":44337,"name":"cephfs.vm08.zcaqju","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904856878","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904856878},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904856878}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":33,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:34:09.773799+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":102,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no 
anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":2,"in":[0,1],"up":{"mds_0":34382,"mds_1":34386},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34382":{"gid":34382,"name":"cephfs.vm07.uizncw","rank":0,"incarnation":27,"state":"up:active","state_seq":9,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}},"gid_34386":{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":1,"incarnation":32,"state":"up:active","state_seq":7,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34382,"qdb_cluster":[34382,34386]},"id":1}]} 2026-03-09T19:35:58.673 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 33 2026-03-09T19:35:58.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.677+0000 7f774cc88640 1 -- 192.168.123.107:0/1989669145 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f77140776d0 msgr2=0x7f7714079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:58.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.677+0000 7f774cc88640 1 --2- 192.168.123.107:0/1989669145 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f77140776d0 0x7f7714079b90 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f7738002410 tx=0x7f773803a040 comp rx=0 tx=0).stop 2026-03-09T19:35:58.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.677+0000 7f774cc88640 1 -- 192.168.123.107:0/1989669145 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7748072cf0 msgr2=0x7f77481ada40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:58.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.677+0000 7f774cc88640 1 --2- 192.168.123.107:0/1989669145 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7748072cf0 0x7f77481ada40 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f7748069a50 tx=0x7f772800ede0 comp rx=0 tx=0).stop 2026-03-09T19:35:58.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.677+0000 7f774cc88640 1 -- 192.168.123.107:0/1989669145 shutdown_connections 2026-03-09T19:35:58.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.677+0000 7f774cc88640 1 --2- 192.168.123.107:0/1989669145 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] 
conn(0x7f77140776d0 0x7f7714079b90 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:58.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.677+0000 7f774cc88640 1 --2- 192.168.123.107:0/1989669145 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7748072cf0 0x7f77481ada40 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:58.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.677+0000 7f774cc88640 1 --2- 192.168.123.107:0/1989669145 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7748072340 0x7f77481ad500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:58.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.677+0000 7f774cc88640 1 -- 192.168.123.107:0/1989669145 >> 192.168.123.107:0/1989669145 conn(0x7f774806b7f0 msgr2=0x7f774810df80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:58.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.677+0000 7f774cc88640 1 -- 192.168.123.107:0/1989669145 shutdown_connections 2026-03-09T19:35:58.676 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:58.677+0000 7f774cc88640 1 -- 192.168.123.107:0/1989669145 wait complete. 
2026-03-09T19:35:58.735 DEBUG:tasks.fs:checking fs fscid=1,name=cephfs state = {'epoch': 11, 'max_mds': 2, 'flags': 18} 2026-03-09T19:35:58.735 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 12 2026-03-09T19:35:58.873 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:35:58.923 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:58 vm07.local ceph-mon[111841]: pgmap v281: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:58.923 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:58 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1118347671' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:35:58.923 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:58 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/1989669145' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T19:35:59.088 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.090+0000 7f95a372e640 1 -- 192.168.123.107:0/3741287316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f959c1029d0 msgr2=0x7f959c102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:59.089 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.090+0000 7f95a372e640 1 --2- 192.168.123.107:0/3741287316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f959c1029d0 0x7f959c102e30 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f958c0099b0 tx=0x7f958c02f220 comp rx=0 tx=0).stop 2026-03-09T19:35:59.089 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.091+0000 7f95a372e640 1 -- 192.168.123.107:0/3741287316 shutdown_connections 2026-03-09T19:35:59.089 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.091+0000 7f95a372e640 1 --2- 192.168.123.107:0/3741287316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f959c1029d0 0x7f959c102e30 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.089 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.091+0000 7f95a372e640 1 --2- 192.168.123.107:0/3741287316 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f959c1089d0 0x7f959c108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.089 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.091+0000 7f95a372e640 1 -- 192.168.123.107:0/3741287316 >> 192.168.123.107:0/3741287316 conn(0x7f959c0fe710 msgr2=0x7f959c100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:59.089 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.091+0000 7f95a372e640 1 -- 192.168.123.107:0/3741287316 shutdown_connections 2026-03-09T19:35:59.089 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.091+0000 7f95a372e640 1 -- 192.168.123.107:0/3741287316 wait complete. 2026-03-09T19:35:59.089 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.091+0000 7f95a372e640 1 Processor -- start 2026-03-09T19:35:59.089 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.091+0000 7f95a372e640 1 -- start start 2026-03-09T19:35:59.089 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.092+0000 7f95a372e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f959c1029d0 0x7f959c075700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:59.090 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.092+0000 7f95a372e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f959c1089d0 0x7f959c075c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:59.090 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.092+0000 7f95a372e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f959c079780 con 0x7f959c1089d0 2026-03-09T19:35:59.090 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.092+0000 7f95a372e640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f959c0798f0 con 0x7f959c1029d0 2026-03-09T19:35:59.090 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.092+0000 7f95a0ca2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f959c1089d0 0x7f959c075c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:59.090 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.092+0000 7f95a0ca2640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f959c1089d0 0x7f959c075c40 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:33670/0 (socket says 192.168.123.107:33670) 2026-03-09T19:35:59.090 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.092+0000 7f95a0ca2640 1 -- 192.168.123.107:0/1162076068 learned_addr learned my addr 192.168.123.107:0/1162076068 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:35:59.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.092+0000 7f95a0ca2640 1 -- 192.168.123.107:0/1162076068 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f959c1029d0 msgr2=0x7f959c075700 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T19:35:59.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.092+0000 7f95a0ca2640 1 --2- 192.168.123.107:0/1162076068 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f959c1029d0 0x7f959c075700 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.092+0000 7f95a0ca2640 1 -- 192.168.123.107:0/1162076068 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9584009590 con 0x7f959c1089d0 2026-03-09T19:35:59.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.092+0000 7f95a0ca2640 1 --2- 192.168.123.107:0/1162076068 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f959c1089d0 0x7f959c075c40 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f958c002410 tx=0x7f958c004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:59.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.093+0000 7f95927fc640 1 -- 192.168.123.107:0/1162076068 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f958c03d070 con 
0x7f959c1089d0 2026-03-09T19:35:59.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.093+0000 7f95927fc640 1 -- 192.168.123.107:0/1162076068 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f958c038520 con 0x7f959c1089d0 2026-03-09T19:35:59.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.093+0000 7f95927fc640 1 -- 192.168.123.107:0/1162076068 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f958c041690 con 0x7f959c1089d0 2026-03-09T19:35:59.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.093+0000 7f95a372e640 1 -- 192.168.123.107:0/1162076068 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f958c009660 con 0x7f959c1089d0 2026-03-09T19:35:59.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.093+0000 7f95a372e640 1 -- 192.168.123.107:0/1162076068 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f959c076510 con 0x7f959c1089d0 2026-03-09T19:35:59.092 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.094+0000 7f95a372e640 1 -- 192.168.123.107:0/1162076068 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9564005350 con 0x7f959c1089d0 2026-03-09T19:35:59.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:58 vm08.local ceph-mon[103420]: pgmap v281: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:35:59.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:58 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/1118347671' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T19:35:59.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:58 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/1989669145' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T19:35:59.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.095+0000 7f95927fc640 1 -- 192.168.123.107:0/1162076068 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f958c038730 con 0x7f959c1089d0 2026-03-09T19:35:59.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.095+0000 7f95927fc640 1 --2- 192.168.123.107:0/1162076068 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95780779b0 0x7f9578079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:59.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.095+0000 7f95927fc640 1 -- 192.168.123.107:0/1162076068 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f958c0be380 con 0x7f959c1089d0 2026-03-09T19:35:59.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.097+0000 7f95927fc640 1 -- 192.168.123.107:0/1162076068 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f958c086a30 con 0x7f959c1089d0 2026-03-09T19:35:59.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.098+0000 7f95a14a3640 1 --2- 192.168.123.107:0/1162076068 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95780779b0 0x7f9578079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:59.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.098+0000 7f95a14a3640 1 --2- 192.168.123.107:0/1162076068 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95780779b0 0x7f9578079e70 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 
crypto rx=0x7f9584004610 tx=0x7f9584009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:59.206 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.207+0000 7f95a372e640 1 -- 192.168.123.107:0/1162076068 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 12, "format": "json"} v 0) v1 -- 0x7f95640051c0 con 0x7f959c1089d0 2026-03-09T19:35:59.206 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.208+0000 7f95927fc640 1 -- 192.168.123.107:0/1162076068 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 12, "format": "json"}]=0 dumped fsmap epoch 12 v33) v1 ==== 107+0+4926 (secure 0 0 0) 0x7f958c086180 con 0x7f959c1089d0 2026-03-09T19:35:59.207 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:35:59.207 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":12,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6829/969322030","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":969322030},{"type":"v1","addr":"192.168.123.107:6829","nonce":969322030}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in 
separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":14510,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/1298912984","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":1298912984},{"type":"v1","addr":"192.168.123.107:6827","nonce":1298912984}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":12,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:24:31.864653+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24279,"mds_1":24285},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24279":{"gid":24279,"name":"cephfs.vm08.zcaqju","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.108:6825/1034426472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1034426472},{"type":"v1","addr":"192.168.123.108:6825","nonce":1034426472}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24285":{"gid":24285,"name":"cephfs.vm08.jwsqrf","rank":1,"incarnation":9,"state":"up:rejoin","state_seq":4,"addr":"192.168.123.108:6827/3082155067","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":3082155067},{"type":"v1","addr":"192.168.123.108:6827","nonce":3082155067}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T19:35:59.208 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 12 2026-03-09T19:35:59.209 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.211+0000 7f95a372e640 1 -- 192.168.123.107:0/1162076068 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95780779b0 msgr2=0x7f9578079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:59.209 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.211+0000 7f95a372e640 1 --2- 192.168.123.107:0/1162076068 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95780779b0 0x7f9578079e70 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f9584004610 tx=0x7f9584009290 comp rx=0 tx=0).stop 2026-03-09T19:35:59.209 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.212+0000 7f95a372e640 1 -- 192.168.123.107:0/1162076068 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f959c1089d0 msgr2=0x7f959c075c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:59.210 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.212+0000 7f95a372e640 1 --2- 192.168.123.107:0/1162076068 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f959c1089d0 0x7f959c075c40 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f958c002410 tx=0x7f958c004290 comp rx=0 tx=0).stop 2026-03-09T19:35:59.210 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.212+0000 7f95a372e640 1 -- 192.168.123.107:0/1162076068 shutdown_connections 2026-03-09T19:35:59.210 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.212+0000 7f95a372e640 1 --2- 192.168.123.107:0/1162076068 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f95780779b0 0x7f9578079e70 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.210 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.212+0000 7f95a372e640 1 --2- 192.168.123.107:0/1162076068 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f959c1089d0 0x7f959c075c40 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.210 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.212+0000 7f95a372e640 1 --2- 192.168.123.107:0/1162076068 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f959c1029d0 0x7f959c075700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.210 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.213+0000 7f95a372e640 1 -- 192.168.123.107:0/1162076068 >> 192.168.123.107:0/1162076068 conn(0x7f959c0fe710 msgr2=0x7f959c10ca00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:59.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.213+0000 7f95a372e640 1 -- 192.168.123.107:0/1162076068 shutdown_connections 2026-03-09T19:35:59.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.213+0000 7f95a372e640 1 -- 192.168.123.107:0/1162076068 wait complete. 
2026-03-09T19:35:59.270 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 13 2026-03-09T19:35:59.413 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:35:59.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.644+0000 7feaa82eb640 1 -- 192.168.123.107:0/265245325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feaa01029d0 msgr2=0x7feaa0102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:59.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.644+0000 7feaa82eb640 1 --2- 192.168.123.107:0/265245325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feaa01029d0 0x7feaa0102e30 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7fea88009a00 tx=0x7fea8802f270 comp rx=0 tx=0).stop 2026-03-09T19:35:59.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.645+0000 7feaa82eb640 1 -- 192.168.123.107:0/265245325 shutdown_connections 2026-03-09T19:35:59.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.645+0000 7feaa82eb640 1 --2- 192.168.123.107:0/265245325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feaa01029d0 0x7feaa0102e30 unknown :-1 s=CLOSED pgs=167 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.645+0000 7feaa82eb640 1 --2- 192.168.123.107:0/265245325 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feaa01089d0 0x7feaa0108db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.645+0000 7feaa82eb640 1 -- 192.168.123.107:0/265245325 >> 192.168.123.107:0/265245325 conn(0x7feaa00fe710 
msgr2=0x7feaa0100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:59.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.645+0000 7feaa82eb640 1 -- 192.168.123.107:0/265245325 shutdown_connections 2026-03-09T19:35:59.643 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.645+0000 7feaa82eb640 1 -- 192.168.123.107:0/265245325 wait complete. 2026-03-09T19:35:59.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.646+0000 7feaa82eb640 1 Processor -- start 2026-03-09T19:35:59.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.646+0000 7feaa82eb640 1 -- start start 2026-03-09T19:35:59.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.646+0000 7feaa82eb640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feaa01089d0 0x7feaa01a08a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:59.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.646+0000 7feaa82eb640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feaa01a0de0 0x7feaa019a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:59.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.646+0000 7feaa82eb640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feaa01a1260 con 0x7feaa01089d0 2026-03-09T19:35:59.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.646+0000 7feaa82eb640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feaa01a13a0 con 0x7feaa01a0de0 2026-03-09T19:35:59.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.646+0000 7feaa6060640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feaa01089d0 0x7feaa01a08a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:59.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.647+0000 7feaa6060640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feaa01089d0 0x7feaa01a08a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:33696/0 (socket says 192.168.123.107:33696) 2026-03-09T19:35:59.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.647+0000 7feaa6060640 1 -- 192.168.123.107:0/1506026169 learned_addr learned my addr 192.168.123.107:0/1506026169 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:35:59.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.647+0000 7feaa585f640 1 --2- 192.168.123.107:0/1506026169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feaa01a0de0 0x7feaa019a990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:59.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.647+0000 7feaa6060640 1 -- 192.168.123.107:0/1506026169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feaa01a0de0 msgr2=0x7feaa019a990 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:59.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.647+0000 7feaa6060640 1 --2- 192.168.123.107:0/1506026169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feaa01a0de0 0x7feaa019a990 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.647+0000 7feaa6060640 1 -- 192.168.123.107:0/1506026169 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fea88009660 con 0x7feaa01089d0 
2026-03-09T19:35:59.645 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.647+0000 7feaa6060640 1 --2- 192.168.123.107:0/1506026169 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feaa01089d0 0x7feaa01a08a0 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7fea9000ca30 tx=0x7fea9000cf00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:59.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.648+0000 7fea977fe640 1 -- 192.168.123.107:0/1506026169 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fea90004430 con 0x7feaa01089d0 2026-03-09T19:35:59.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.648+0000 7fea977fe640 1 -- 192.168.123.107:0/1506026169 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fea90004590 con 0x7feaa01089d0 2026-03-09T19:35:59.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.648+0000 7fea977fe640 1 -- 192.168.123.107:0/1506026169 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fea9000f660 con 0x7feaa01089d0 2026-03-09T19:35:59.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.648+0000 7feaa82eb640 1 -- 192.168.123.107:0/1506026169 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feaa019af90 con 0x7feaa01089d0 2026-03-09T19:35:59.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.648+0000 7feaa82eb640 1 -- 192.168.123.107:0/1506026169 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feaa019b400 con 0x7feaa01089d0 2026-03-09T19:35:59.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.649+0000 7fea977fe640 1 -- 192.168.123.107:0/1506026169 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fea9000f7c0 con 
0x7feaa01089d0 2026-03-09T19:35:59.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.649+0000 7fea977fe640 1 --2- 192.168.123.107:0/1506026169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fea7c0778e0 0x7fea7c079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:35:59.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.649+0000 7fea977fe640 1 -- 192.168.123.107:0/1506026169 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fea9009ac80 con 0x7feaa01089d0 2026-03-09T19:35:59.648 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.650+0000 7feaa82eb640 1 -- 192.168.123.107:0/1506026169 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feaa0104110 con 0x7feaa01089d0 2026-03-09T19:35:59.648 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.650+0000 7feaa585f640 1 --2- 192.168.123.107:0/1506026169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fea7c0778e0 0x7fea7c079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:35:59.651 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.653+0000 7feaa585f640 1 --2- 192.168.123.107:0/1506026169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fea7c0778e0 0x7fea7c079da0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7fea88002410 tx=0x7fea88002c80 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:35:59.651 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.654+0000 7fea977fe640 1 -- 192.168.123.107:0/1506026169 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) 
v1 ==== 72+0+195034 (secure 0 0 0) 0x7fea900632b0 con 0x7feaa01089d0 2026-03-09T19:35:59.709 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:35:59 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1162076068' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-09T19:35:59.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.765+0000 7feaa82eb640 1 -- 192.168.123.107:0/1506026169 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 13, "format": "json"} v 0) v1 -- 0x7feaa019b6e0 con 0x7feaa01089d0 2026-03-09T19:35:59.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.765+0000 7fea977fe640 1 -- 192.168.123.107:0/1506026169 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 13, "format": "json"}]=0 dumped fsmap epoch 13 v33) v1 ==== 107+0+4926 (secure 0 0 0) 0x7fea9001d090 con 0x7feaa01089d0 2026-03-09T19:35:59.763 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:35:59.764 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":13,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6829/969322030","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":969322030},{"type":"v1","addr":"192.168.123.107:6829","nonce":969322030}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":14510,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/1298912984","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":1298912984},{"type":"v1","addr":"192.168.123.107:6827","nonce":1298912984}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":13,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:24:32.867256+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24279,"mds_1":24285},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24279":{"gid":24279,"name":"cephfs.vm08.zcaqju","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.108:6825/1034426472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1034426472},{"type":"v1","addr":"192.168.123.108:6825","nonce":1034426472}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24285":{"gid":24285,"name":"cephfs.vm08.jwsqrf","rank":1,"incarnation":9,"state":"up:active","state_seq":5,"addr":"192.168.123.108:6827/3082155067","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":3082155067},{"type":"v1","addr":"192.168.123.108:6827","nonce":3082155067}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T19:35:59.764 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 13 2026-03-09T19:35:59.765 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.768+0000 7feaa82eb640 1 -- 192.168.123.107:0/1506026169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fea7c0778e0 msgr2=0x7fea7c079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:59.765 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.768+0000 7feaa82eb640 1 --2- 192.168.123.107:0/1506026169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fea7c0778e0 0x7fea7c079da0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7fea88002410 tx=0x7fea88002c80 comp rx=0 tx=0).stop 2026-03-09T19:35:59.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.768+0000 7feaa82eb640 1 -- 192.168.123.107:0/1506026169 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feaa01089d0 msgr2=0x7feaa01a08a0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:35:59.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.768+0000 7feaa82eb640 1 --2- 192.168.123.107:0/1506026169 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feaa01089d0 0x7feaa01a08a0 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7fea9000ca30 tx=0x7fea9000cf00 comp rx=0 tx=0).stop 2026-03-09T19:35:59.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.768+0000 7feaa82eb640 1 -- 192.168.123.107:0/1506026169 shutdown_connections 2026-03-09T19:35:59.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.768+0000 7feaa82eb640 1 --2- 192.168.123.107:0/1506026169 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fea7c0778e0 0x7fea7c079da0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.768+0000 7feaa82eb640 1 --2- 192.168.123.107:0/1506026169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feaa01a0de0 0x7feaa019a990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.768+0000 7feaa82eb640 1 --2- 192.168.123.107:0/1506026169 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feaa01089d0 0x7feaa01a08a0 unknown :-1 s=CLOSED pgs=168 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:35:59.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.768+0000 7feaa82eb640 1 -- 192.168.123.107:0/1506026169 >> 192.168.123.107:0/1506026169 conn(0x7feaa00fe710 msgr2=0x7feaa00feaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:35:59.766 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.768+0000 7feaa82eb640 1 -- 192.168.123.107:0/1506026169 shutdown_connections 2026-03-09T19:35:59.766 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:35:59.768+0000 7feaa82eb640 1 -- 192.168.123.107:0/1506026169 wait complete. 2026-03-09T19:35:59.832 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 14 2026-03-09T19:35:59.987 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:00.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:35:59 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/1162076068' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-09T19:36:00.580 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.581+0000 7f683652e640 1 -- 192.168.123.107:0/1968997740 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68301022a0 msgr2=0x7f683010a790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:00.580 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.581+0000 7f683652e640 1 --2- 192.168.123.107:0/1968997740 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68301022a0 0x7f683010a790 secure :-1 s=READY pgs=169 cs=0 l=1 rev1=1 crypto rx=0x7f68200099e0 tx=0x7f682002f2f0 comp rx=0 tx=0).stop 2026-03-09T19:36:00.580 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.582+0000 7f683652e640 1 -- 192.168.123.107:0/1968997740 shutdown_connections 2026-03-09T19:36:00.580 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.582+0000 7f683652e640 1 --2- 192.168.123.107:0/1968997740 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68301022a0 0x7f683010a790 unknown :-1 s=CLOSED pgs=169 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:00.580 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.582+0000 7f683652e640 1 --2- 
192.168.123.107:0/1968997740 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6830101980 0x7f6830101d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:00.580 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.582+0000 7f683652e640 1 -- 192.168.123.107:0/1968997740 >> 192.168.123.107:0/1968997740 conn(0x7f68300fb340 msgr2=0x7f68300fd760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:00.580 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.583+0000 7f683652e640 1 -- 192.168.123.107:0/1968997740 shutdown_connections 2026-03-09T19:36:00.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.583+0000 7f683652e640 1 -- 192.168.123.107:0/1968997740 wait complete. 2026-03-09T19:36:00.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.583+0000 7f683652e640 1 Processor -- start 2026-03-09T19:36:00.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.583+0000 7f683652e640 1 -- start start 2026-03-09T19:36:00.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.584+0000 7f683652e640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6830101980 0x7f68301a0660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:00.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.584+0000 7f683652e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f68301022a0 0x7f68301a0ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:00.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.584+0000 7f683652e640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f68301a11a0 con 0x7f6830101980 2026-03-09T19:36:00.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.584+0000 7f683652e640 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f683019a770 con 0x7f68301022a0 2026-03-09T19:36:00.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.584+0000 7f682ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6830101980 0x7f68301a0660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:00.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.584+0000 7f682ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6830101980 0x7f68301a0660 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:33706/0 (socket says 192.168.123.107:33706) 2026-03-09T19:36:00.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.584+0000 7f682ffff640 1 -- 192.168.123.107:0/955104120 learned_addr learned my addr 192.168.123.107:0/955104120 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:00.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.584+0000 7f682f7fe640 1 --2- 192.168.123.107:0/955104120 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f68301022a0 0x7f68301a0ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:00.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.584+0000 7f682ffff640 1 -- 192.168.123.107:0/955104120 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f68301022a0 msgr2=0x7f68301a0ba0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:00.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.584+0000 7f682ffff640 1 --2- 192.168.123.107:0/955104120 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f68301022a0 0x7f68301a0ba0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:00.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.585+0000 7f682ffff640 1 -- 192.168.123.107:0/955104120 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6820009660 con 0x7f6830101980 2026-03-09T19:36:00.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.585+0000 7f682ffff640 1 --2- 192.168.123.107:0/955104120 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6830101980 0x7f68301a0660 secure :-1 s=READY pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7f681c00ca30 tx=0x7f681c00cf00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:00.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.585+0000 7f682d7fa640 1 -- 192.168.123.107:0/955104120 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f681c004430 con 0x7f6830101980 2026-03-09T19:36:00.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.585+0000 7f682d7fa640 1 -- 192.168.123.107:0/955104120 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f681c004590 con 0x7f6830101980 2026-03-09T19:36:00.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.585+0000 7f682d7fa640 1 -- 192.168.123.107:0/955104120 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f681c00f660 con 0x7f6830101980 2026-03-09T19:36:00.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.585+0000 7f683652e640 1 -- 192.168.123.107:0/955104120 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f683019aa50 con 0x7f6830101980 2026-03-09T19:36:00.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.585+0000 7f683652e640 1 -- 
192.168.123.107:0/955104120 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f683019afa0 con 0x7f6830101980 2026-03-09T19:36:00.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.586+0000 7f683652e640 1 -- 192.168.123.107:0/955104120 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f67f4005350 con 0x7f6830101980 2026-03-09T19:36:00.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.587+0000 7f682d7fa640 1 -- 192.168.123.107:0/955104120 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f681c00f7c0 con 0x7f6830101980 2026-03-09T19:36:00.588 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.587+0000 7f682d7fa640 1 --2- 192.168.123.107:0/955104120 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f68040778e0 0x7f6804079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:00.588 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.587+0000 7f682d7fa640 1 -- 192.168.123.107:0/955104120 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f681c099e50 con 0x7f6830101980 2026-03-09T19:36:00.588 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.590+0000 7f682f7fe640 1 --2- 192.168.123.107:0/955104120 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f68040778e0 0x7f6804079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:00.588 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.590+0000 7f682f7fe640 1 --2- 192.168.123.107:0/955104120 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f68040778e0 0x7f6804079da0 secure :-1 s=READY 
pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f683019bef0 tx=0x7f682003a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:00.588 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.590+0000 7f682d7fa640 1 -- 192.168.123.107:0/955104120 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f681c062540 con 0x7f6830101980 2026-03-09T19:36:00.720 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.722+0000 7f683652e640 1 -- 192.168.123.107:0/955104120 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 14, "format": "json"} v 0) v1 -- 0x7f67f40051c0 con 0x7f6830101980 2026-03-09T19:36:00.778 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.780+0000 7f682d7fa640 1 -- 192.168.123.107:0/955104120 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 14, "format": "json"}]=0 dumped fsmap epoch 14 v33) v1 ==== 107+0+4937 (secure 0 0 0) 0x7f681c061c90 con 0x7f6830101980 2026-03-09T19:36:00.793 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:00.793 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":14,"btime":"2026-03-09T19:33:08:696919+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6829/969322030","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":969322030},{"type":"v1","addr":"192.168.123.107:6829","nonce":969322030}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":14510,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/1298912984","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":1298912984},{"type":"v1","addr":"192.168.123.107:6827","nonce":1298912984}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":14,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:08.696919+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0,1],"up":{"mds_0":24279,"mds_1":24285},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24279":{"gid":24279,"name":"cephfs.vm08.zcaqju","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.108:6825/1034426472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1034426472},{"type":"v1","addr":"192.168.123.108:6825","nonce":1034426472}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24285":{"gid":24285,"name":"cephfs.vm08.jwsqrf","rank":1,"incarnation":9,"state":"up:stopping","state_seq":5,"addr":"192.168.123.108:6827/3082155067","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":3082155067},{"type":"v1","addr":"192.168.123.108:6827","nonce":3082155067}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24279,"qdb_cluster":[24279]},"id":1}]} 2026-03-09T19:36:00.794 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 14 2026-03-09T19:36:00.794 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.781+0000 7f683652e640 1 -- 192.168.123.107:0/955104120 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f68040778e0 msgr2=0x7f6804079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:00.794 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.781+0000 7f683652e640 1 --2- 192.168.123.107:0/955104120 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f68040778e0 0x7f6804079da0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f683019bef0 tx=0x7f682003a040 comp rx=0 tx=0).stop 2026-03-09T19:36:00.794 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.781+0000 7f683652e640 1 -- 192.168.123.107:0/955104120 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6830101980 msgr2=0x7f68301a0660 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:00.794 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.781+0000 7f683652e640 1 --2- 192.168.123.107:0/955104120 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6830101980 0x7f68301a0660 secure :-1 s=READY pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7f681c00ca30 tx=0x7f681c00cf00 comp rx=0 tx=0).stop 2026-03-09T19:36:00.794 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.782+0000 7f683652e640 1 -- 192.168.123.107:0/955104120 shutdown_connections 2026-03-09T19:36:00.794 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.782+0000 7f683652e640 1 --2- 192.168.123.107:0/955104120 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f68040778e0 0x7f6804079da0 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:00.794 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.782+0000 7f683652e640 1 --2- 192.168.123.107:0/955104120 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f68301022a0 0x7f68301a0ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:00.794 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.782+0000 7f683652e640 1 --2- 192.168.123.107:0/955104120 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6830101980 0x7f68301a0660 unknown :-1 s=CLOSED pgs=170 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:00.794 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.782+0000 7f683652e640 1 -- 192.168.123.107:0/955104120 >> 192.168.123.107:0/955104120 conn(0x7f68300fb340 msgr2=0x7f68300ff960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:00.794 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.782+0000 7f683652e640 1 -- 192.168.123.107:0/955104120 shutdown_connections 2026-03-09T19:36:00.794 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:00.782+0000 7f683652e640 1 -- 192.168.123.107:0/955104120 wait complete. 2026-03-09T19:36:00.824 DEBUG:tasks.fs:max_mds reduced in epoch 14 2026-03-09T19:36:00.824 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 15 2026-03-09T19:36:00.966 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:00.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:00 vm07.local ceph-mon[111841]: pgmap v282: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:00.979 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:00 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1506026169' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-09T19:36:01.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:00 vm08.local ceph-mon[103420]: pgmap v282: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:01.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:00 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/1506026169' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-09T19:36:01.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.193+0000 7f54c6d34640 1 -- 192.168.123.107:0/1612109703 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f54c01089d0 msgr2=0x7f54c0108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:01.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.193+0000 7f54c6d34640 1 --2- 192.168.123.107:0/1612109703 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f54c01089d0 0x7f54c0108db0 secure :-1 s=READY pgs=171 cs=0 l=1 rev1=1 crypto rx=0x7f54a80099b0 tx=0x7f54a802f220 comp rx=0 tx=0).stop 2026-03-09T19:36:01.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.194+0000 7f54c6d34640 1 -- 192.168.123.107:0/1612109703 shutdown_connections 2026-03-09T19:36:01.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.194+0000 7f54c6d34640 1 --2- 192.168.123.107:0/1612109703 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f54c01029d0 0x7f54c0102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.194+0000 7f54c6d34640 1 --2- 192.168.123.107:0/1612109703 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f54c01089d0 0x7f54c0108db0 unknown :-1 s=CLOSED pgs=171 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.194+0000 7f54c6d34640 1 -- 192.168.123.107:0/1612109703 >> 192.168.123.107:0/1612109703 conn(0x7f54c00fe710 msgr2=0x7f54c0100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:01.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.194+0000 7f54c6d34640 1 -- 192.168.123.107:0/1612109703 shutdown_connections 
2026-03-09T19:36:01.192 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.194+0000 7f54c6d34640 1 -- 192.168.123.107:0/1612109703 wait complete. 2026-03-09T19:36:01.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.195+0000 7f54c6d34640 1 Processor -- start 2026-03-09T19:36:01.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.195+0000 7f54c6d34640 1 -- start start 2026-03-09T19:36:01.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.195+0000 7f54c6d34640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f54c01029d0 0x7f54c01a0680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:01.193 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.195+0000 7f54c6d34640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f54c01089d0 0x7f54c01a0bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:01.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.195+0000 7f54c6d34640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f54c019a770 con 0x7f54c01089d0 2026-03-09T19:36:01.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.195+0000 7f54c6d34640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f54c019a8e0 con 0x7f54c01029d0 2026-03-09T19:36:01.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.196+0000 7f54b7fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f54c01089d0 0x7f54c01a0bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:01.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.196+0000 7f54b7fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f54c01089d0 
0x7f54c01a0bc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:33732/0 (socket says 192.168.123.107:33732) 2026-03-09T19:36:01.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.196+0000 7f54b7fff640 1 -- 192.168.123.107:0/81268684 learned_addr learned my addr 192.168.123.107:0/81268684 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:01.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.196+0000 7f54c4aa9640 1 --2- 192.168.123.107:0/81268684 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f54c01029d0 0x7f54c01a0680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:01.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.196+0000 7f54b7fff640 1 -- 192.168.123.107:0/81268684 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f54c01029d0 msgr2=0x7f54c01a0680 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:01.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.196+0000 7f54b7fff640 1 --2- 192.168.123.107:0/81268684 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f54c01029d0 0x7f54c01a0680 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.196+0000 7f54b7fff640 1 -- 192.168.123.107:0/81268684 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f54a8009660 con 0x7f54c01089d0 2026-03-09T19:36:01.194 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.196+0000 7f54b7fff640 1 --2- 192.168.123.107:0/81268684 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f54c01089d0 0x7f54c01a0bc0 secure :-1 s=READY pgs=172 cs=0 l=1 
rev1=1 crypto rx=0x7f54b000da30 tx=0x7f54b000df00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:01.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.197+0000 7f54b5ffb640 1 -- 192.168.123.107:0/81268684 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f54b000bb80 con 0x7f54c01089d0 2026-03-09T19:36:01.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.197+0000 7f54b5ffb640 1 -- 192.168.123.107:0/81268684 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f54b0004590 con 0x7f54c01089d0 2026-03-09T19:36:01.195 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.197+0000 7f54b5ffb640 1 -- 192.168.123.107:0/81268684 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f54b0010460 con 0x7f54c01089d0 2026-03-09T19:36:01.196 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.197+0000 7f54c6d34640 1 -- 192.168.123.107:0/81268684 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f54c019abc0 con 0x7f54c01089d0 2026-03-09T19:36:01.196 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.198+0000 7f54c6d34640 1 -- 192.168.123.107:0/81268684 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f54c019b090 con 0x7f54c01089d0 2026-03-09T19:36:01.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.199+0000 7f54c6d34640 1 -- 192.168.123.107:0/81268684 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f548c005350 con 0x7f54c01089d0 2026-03-09T19:36:01.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.199+0000 7f54b5ffb640 1 -- 192.168.123.107:0/81268684 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 
0x7f54b000bce0 con 0x7f54c01089d0 2026-03-09T19:36:01.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.200+0000 7f54b5ffb640 1 --2- 192.168.123.107:0/81268684 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f54900777a0 0x7f5490079c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:01.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.200+0000 7f54b5ffb640 1 -- 192.168.123.107:0/81268684 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f54b00991e0 con 0x7f54c01089d0 2026-03-09T19:36:01.200 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.202+0000 7f54c4aa9640 1 --2- 192.168.123.107:0/81268684 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f54900777a0 0x7f5490079c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:01.201 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.203+0000 7f54c4aa9640 1 --2- 192.168.123.107:0/81268684 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f54900777a0 0x7f5490079c60 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f54a8002410 tx=0x7f54a803a040 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:01.201 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.203+0000 7f54b5ffb640 1 -- 192.168.123.107:0/81268684 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f54b009e050 con 0x7f54c01089d0 2026-03-09T19:36:01.321 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.323+0000 7f54c6d34640 1 -- 192.168.123.107:0/81268684 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_command({"prefix": "fs dump", "epoch": 15, "format": "json"} v 0) v1 -- 0x7f548c0058d0 con 0x7f54c01089d0 2026-03-09T19:36:01.323 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:01.323 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":15,"btime":"2026-03-09T19:33:33:006858+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6829/969322030","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":969322030},{"type":"v1","addr":"192.168.123.107:6829","nonce":969322030}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":14510,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/1298912984","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":1298912984},{"type":"v1","addr":"192.168.123.107:6827","nonce":1298912984}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base 
v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:32.839429+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24279},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24279":{"gid":24279,"name":"cephfs.vm08.zcaqju","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.108:6825/1034426472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1034426472},{"type":"v1","addr":"192.168.123.108:6825","nonce":1034426472}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in 
separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24279,"qdb_cluster":[24279]},"id":1}]} 2026-03-09T19:36:01.323 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.324+0000 7f54b5ffb640 1 -- 192.168.123.107:0/81268684 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 15, "format": "json"}]=0 dumped fsmap epoch 15 v33) v1 ==== 107+0+4138 (secure 0 0 0) 0x7f54b006a020 con 0x7f54c01089d0 2026-03-09T19:36:01.323 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 15 2026-03-09T19:36:01.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.326+0000 7f54c6d34640 1 -- 192.168.123.107:0/81268684 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f54900777a0 msgr2=0x7f5490079c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:01.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.326+0000 7f54c6d34640 1 --2- 192.168.123.107:0/81268684 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f54900777a0 0x7f5490079c60 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f54a8002410 tx=0x7f54a803a040 comp rx=0 tx=0).stop 2026-03-09T19:36:01.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.326+0000 7f54c6d34640 1 -- 192.168.123.107:0/81268684 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f54c01089d0 msgr2=0x7f54c01a0bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:01.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.326+0000 7f54c6d34640 1 --2- 192.168.123.107:0/81268684 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f54c01089d0 0x7f54c01a0bc0 secure :-1 s=READY pgs=172 cs=0 l=1 rev1=1 crypto rx=0x7f54b000da30 tx=0x7f54b000df00 comp rx=0 tx=0).stop 2026-03-09T19:36:01.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.326+0000 7f54c6d34640 1 -- 192.168.123.107:0/81268684 shutdown_connections 2026-03-09T19:36:01.324 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.327+0000 7f54c6d34640 1 --2- 192.168.123.107:0/81268684 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f54900777a0 0x7f5490079c60 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.327+0000 7f54c6d34640 1 --2- 192.168.123.107:0/81268684 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f54c01089d0 0x7f54c01a0bc0 unknown :-1 s=CLOSED pgs=172 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.327+0000 7f54c6d34640 1 --2- 192.168.123.107:0/81268684 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f54c01029d0 0x7f54c01a0680 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.327+0000 7f54c6d34640 1 -- 192.168.123.107:0/81268684 >> 192.168.123.107:0/81268684 conn(0x7f54c00fe710 msgr2=0x7f54c0106550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:01.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.327+0000 7f54c6d34640 1 -- 192.168.123.107:0/81268684 shutdown_connections 2026-03-09T19:36:01.325 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.327+0000 7f54c6d34640 1 -- 192.168.123.107:0/81268684 wait complete. 
2026-03-09T19:36:01.380 DEBUG:tasks.fs:max_mds reduced in epoch 15 2026-03-09T19:36:01.380 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 16 2026-03-09T19:36:01.519 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:01.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.751+0000 7f680434c640 1 -- 192.168.123.107:0/2523287074 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67fc1089d0 msgr2=0x7f67fc108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:01.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.751+0000 7f680434c640 1 --2- 192.168.123.107:0/2523287074 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67fc1089d0 0x7f67fc108db0 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7f67e40099b0 tx=0x7f67e402f220 comp rx=0 tx=0).stop 2026-03-09T19:36:01.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.752+0000 7f680434c640 1 -- 192.168.123.107:0/2523287074 shutdown_connections 2026-03-09T19:36:01.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.752+0000 7f680434c640 1 --2- 192.168.123.107:0/2523287074 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67fc1029d0 0x7f67fc102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.752+0000 7f680434c640 1 --2- 192.168.123.107:0/2523287074 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67fc1089d0 0x7f67fc108db0 unknown :-1 s=CLOSED pgs=173 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.752+0000 7f680434c640 1 -- 
192.168.123.107:0/2523287074 >> 192.168.123.107:0/2523287074 conn(0x7f67fc0fe710 msgr2=0x7f67fc100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:01.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.752+0000 7f680434c640 1 -- 192.168.123.107:0/2523287074 shutdown_connections 2026-03-09T19:36:01.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.752+0000 7f680434c640 1 -- 192.168.123.107:0/2523287074 wait complete. 2026-03-09T19:36:01.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.753+0000 7f680434c640 1 Processor -- start 2026-03-09T19:36:01.751 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.753+0000 7f680434c640 1 -- start start 2026-03-09T19:36:01.751 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.753+0000 7f680434c640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67fc1029d0 0x7f67fc1a06a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:01.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.753+0000 7f680434c640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67fc1089d0 0x7f67fc1a0be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:01.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.753+0000 7f680434c640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67fc19a790 con 0x7f67fc1029d0 2026-03-09T19:36:01.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.753+0000 7f680434c640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67fc19a900 con 0x7f67fc1089d0 2026-03-09T19:36:01.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.753+0000 7f68020c1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67fc1029d0 0x7f67fc1a06a0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:01.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.753+0000 7f68020c1640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67fc1029d0 0x7f67fc1a06a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:33756/0 (socket says 192.168.123.107:33756) 2026-03-09T19:36:01.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.753+0000 7f68020c1640 1 -- 192.168.123.107:0/1112206183 learned_addr learned my addr 192.168.123.107:0/1112206183 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:01.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.753+0000 7f68020c1640 1 -- 192.168.123.107:0/1112206183 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67fc1089d0 msgr2=0x7f67fc1a0be0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:36:01.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.756+0000 7f68018c0640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67fc1089d0 0x7f67fc1a0be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:01.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.757+0000 7f68020c1640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67fc1089d0 0x7f67fc1a0be0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.757+0000 7f68020c1640 1 -- 192.168.123.107:0/1112206183 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f67e4009660 con 0x7f67fc1029d0 2026-03-09T19:36:01.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.757+0000 7f68020c1640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67fc1029d0 0x7f67fc1a06a0 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f67e4005ec0 tx=0x7f67e4004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:01.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.757+0000 7f67f37fe640 1 -- 192.168.123.107:0/1112206183 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f67e403d070 con 0x7f67fc1029d0 2026-03-09T19:36:01.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.757+0000 7f67f37fe640 1 -- 192.168.123.107:0/1112206183 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f67e4038730 con 0x7f67fc1029d0 2026-03-09T19:36:01.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.757+0000 7f67f37fe640 1 -- 192.168.123.107:0/1112206183 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f67e4041620 con 0x7f67fc1029d0 2026-03-09T19:36:01.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.757+0000 7f680434c640 1 -- 192.168.123.107:0/1112206183 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f67fc19ab80 con 0x7f67fc1029d0 2026-03-09T19:36:01.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.758+0000 7f680434c640 1 -- 192.168.123.107:0/1112206183 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f67fc19aee0 con 0x7f67fc1029d0 2026-03-09T19:36:01.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.759+0000 7f67f37fe640 1 -- 192.168.123.107:0/1112206183 <== mon.0 
v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f67e404b430 con 0x7f67fc1029d0 2026-03-09T19:36:01.762 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.759+0000 7f67f37fe640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f67d80779b0 0x7f67d8079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:01.762 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.759+0000 7f67f37fe640 1 -- 192.168.123.107:0/1112206183 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f67e40bf4c0 con 0x7f67fc1029d0 2026-03-09T19:36:01.762 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.760+0000 7f680434c640 1 -- 192.168.123.107:0/1112206183 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f67fc104110 con 0x7f67fc1029d0 2026-03-09T19:36:01.762 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.760+0000 7f68018c0640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67fc1089d0 0x7f67fc1a0be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:36:01.762 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.761+0000 7f68018c0640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f67d80779b0 0x7f67d8079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:01.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.761+0000 7f68018c0640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f67d80779b0 0x7f67d8079e70 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f67fc19bc00 tx=0x7f67ec005f50 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:01.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.764+0000 7f67f37fe640 1 -- 192.168.123.107:0/1112206183 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f67e4087b70 con 0x7f67fc1029d0 2026-03-09T19:36:01.884 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:01 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/955104120' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-09T19:36:01.884 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:01 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/81268684' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-09T19:36:01.885 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.886+0000 7f680434c640 1 -- 192.168.123.107:0/1112206183 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 16, "format": "json"} v 0) v1 -- 0x7f67fc102e30 con 0x7f67fc1029d0 2026-03-09T19:36:01.886 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.888+0000 7f67f37fe640 1 -- 192.168.123.107:0/1112206183 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 16, "format": "json"}]=0 dumped fsmap epoch 16 v33) v1 ==== 107+0+4921 (secure 0 0 0) 0x7f67e4046090 con 0x7f67fc1029d0 2026-03-09T19:36:01.886 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:01.886 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":16,"btime":"2026-03-09T19:33:34:014117+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6829/969322030","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":969322030},{"type":"v1","addr":"192.168.123.107:6829","nonce":969322030}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":14510,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/1298912984","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":1298912984},{"type":"v1","addr":"192.168.123.107:6827","nonce":1298912984}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":34378,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2328013860","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2328013860},{"type":"v1","addr":"192.168.123.108:6827","nonce":2328013860}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":16}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:32.839429+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24279},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24279":{"gid":24279,"name":"cephfs.vm08.zcaqju","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.108:6825/1034426472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1034426472},{"type":"v1","addr":"192.168.123.108:6825","nonce":1034426472}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24279,"qdb_cluster":[24279]},"id":1}]} 2026-03-09T19:36:01.887 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 16 2026-03-09T19:36:01.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.891+0000 7f680434c640 1 -- 192.168.123.107:0/1112206183 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f67d80779b0 msgr2=0x7f67d8079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:01.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.891+0000 7f680434c640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f67d80779b0 0x7f67d8079e70 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f67fc19bc00 tx=0x7f67ec005f50 comp rx=0 tx=0).stop 2026-03-09T19:36:01.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.891+0000 7f680434c640 1 -- 192.168.123.107:0/1112206183 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67fc1029d0 msgr2=0x7f67fc1a06a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:01.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.891+0000 7f680434c640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67fc1029d0 0x7f67fc1a06a0 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f67e4005ec0 tx=0x7f67e4004290 comp rx=0 tx=0).stop 2026-03-09T19:36:01.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.891+0000 7f680434c640 1 -- 192.168.123.107:0/1112206183 shutdown_connections 2026-03-09T19:36:01.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.891+0000 7f680434c640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f67d80779b0 
0x7f67d8079e70 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.891+0000 7f680434c640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67fc1089d0 0x7f67fc1a0be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.891+0000 7f680434c640 1 --2- 192.168.123.107:0/1112206183 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67fc1029d0 0x7f67fc1a06a0 unknown :-1 s=CLOSED pgs=174 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:01.889 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.891+0000 7f680434c640 1 -- 192.168.123.107:0/1112206183 >> 192.168.123.107:0/1112206183 conn(0x7f67fc0fe710 msgr2=0x7f67fc1001d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:01.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.891+0000 7f680434c640 1 -- 192.168.123.107:0/1112206183 shutdown_connections 2026-03-09T19:36:01.890 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:01.891+0000 7f680434c640 1 -- 192.168.123.107:0/1112206183 wait complete. 2026-03-09T19:36:02.048 DEBUG:tasks.fs:max_mds reduced in epoch 16 2026-03-09T19:36:02.048 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 17 2026-03-09T19:36:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:01 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/955104120' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-09T19:36:02.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:01 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/81268684' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-09T19:36:02.193 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:02.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.419+0000 7ffa04e9a640 1 -- 192.168.123.107:0/337366705 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa000ffe40 msgr2=0x7ffa0010cd10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:02.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.419+0000 7ffa04e9a640 1 --2- 192.168.123.107:0/337366705 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa000ffe40 0x7ffa0010cd10 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7ff9f40099b0 tx=0x7ff9f402f220 comp rx=0 tx=0).stop 2026-03-09T19:36:02.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.420+0000 7ffa04e9a640 1 -- 192.168.123.107:0/337366705 shutdown_connections 2026-03-09T19:36:02.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.420+0000 7ffa04e9a640 1 --2- 192.168.123.107:0/337366705 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa000ffe40 0x7ffa0010cd10 unknown :-1 s=CLOSED pgs=175 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:02.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.420+0000 7ffa04e9a640 1 --2- 192.168.123.107:0/337366705 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa000ff520 0x7ffa000ff900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:02.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.420+0000 7ffa04e9a640 1 -- 192.168.123.107:0/337366705 >> 192.168.123.107:0/337366705 conn(0x7ffa000fb330 msgr2=0x7ffa000fd750 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:02.418 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.420+0000 7ffa04e9a640 1 -- 192.168.123.107:0/337366705 shutdown_connections 2026-03-09T19:36:02.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.420+0000 7ffa04e9a640 1 -- 192.168.123.107:0/337366705 wait complete. 2026-03-09T19:36:02.418 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.420+0000 7ffa04e9a640 1 Processor -- start 2026-03-09T19:36:02.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.420+0000 7ffa04e9a640 1 -- start start 2026-03-09T19:36:02.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ffa04e9a640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa000ff520 0x7ffa00101880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:02.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ffa04e9a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa000ffe40 0x7ffa00101dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:02.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ffa04e9a640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffa00105900 con 0x7ffa000ffe40 2026-03-09T19:36:02.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ffa04e9a640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffa00105a70 con 0x7ffa000ff520 2026-03-09T19:36:02.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ff9fe575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa000ff520 0x7ffa00101880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:02.419 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ff9fe575640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa000ff520 0x7ffa00101880 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:58026/0 (socket says 192.168.123.107:58026) 2026-03-09T19:36:02.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ff9fe575640 1 -- 192.168.123.107:0/2257810872 learned_addr learned my addr 192.168.123.107:0/2257810872 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:02.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ff9fdd74640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa000ffe40 0x7ffa00101dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:02.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ff9fe575640 1 -- 192.168.123.107:0/2257810872 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa000ffe40 msgr2=0x7ffa00101dc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:02.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ff9fe575640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa000ffe40 0x7ffa00101dc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:02.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ff9fe575640 1 -- 192.168.123.107:0/2257810872 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff9f4009660 con 0x7ffa000ff520 2026-03-09T19:36:02.420 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.421+0000 7ff9fdd74640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa000ffe40 0x7ffa00101dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:36:02.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.422+0000 7ff9fe575640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa000ff520 0x7ffa00101880 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7ff9e800b700 tx=0x7ff9e800bbd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:02.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.422+0000 7ff9e77fe640 1 -- 192.168.123.107:0/2257810872 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff9e800be90 con 0x7ffa000ff520 2026-03-09T19:36:02.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.422+0000 7ffa04e9a640 1 -- 192.168.123.107:0/2257810872 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ffa001023c0 con 0x7ffa000ff520 2026-03-09T19:36:02.420 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.422+0000 7ffa04e9a640 1 -- 192.168.123.107:0/2257810872 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ffa00071810 con 0x7ffa000ff520 2026-03-09T19:36:02.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.423+0000 7ff9e77fe640 1 -- 192.168.123.107:0/2257810872 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff9e8002ba0 con 0x7ffa000ff520 2026-03-09T19:36:02.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.423+0000 7ff9e77fe640 1 -- 192.168.123.107:0/2257810872 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff9e800cab0 con 0x7ffa000ff520 2026-03-09T19:36:02.421 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.423+0000 7ffa04e9a640 1 -- 192.168.123.107:0/2257810872 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff9cc005350 con 0x7ffa000ff520 2026-03-09T19:36:02.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.425+0000 7ff9e77fe640 1 -- 192.168.123.107:0/2257810872 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff9e8004380 con 0x7ffa000ff520 2026-03-09T19:36:02.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.425+0000 7ff9e77fe640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff9d0077890 0x7ff9d0079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:02.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.425+0000 7ff9e77fe640 1 -- 192.168.123.107:0/2257810872 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7ff9e8099260 con 0x7ffa000ff520 2026-03-09T19:36:02.423 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.426+0000 7ff9fdd74640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff9d0077890 0x7ff9d0079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:02.424 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.426+0000 7ff9fdd74640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff9d0077890 0x7ff9d0079d50 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7ffa00103000 tx=0x7ff9f4002cf0 comp rx=0 
tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:02.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.427+0000 7ff9e77fe640 1 -- 192.168.123.107:0/2257810872 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff9e8061a20 con 0x7ffa000ff520 2026-03-09T19:36:02.543 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.545+0000 7ffa04e9a640 1 -- 192.168.123.107:0/2257810872 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 17, "format": "json"} v 0) v1 -- 0x7ff9cc0051c0 con 0x7ffa000ff520 2026-03-09T19:36:02.544 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.546+0000 7ff9e77fe640 1 -- 192.168.123.107:0/2257810872 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 17, "format": "json"}]=0 dumped fsmap epoch 17 v33) v1 ==== 107+0+4138 (secure 0 0 0) 0x7ff9e8061170 con 0x7ffa000ff520 2026-03-09T19:36:02.545 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:02.545 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":17,"btime":"2026-03-09T19:33:40:679213+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6829/969322030","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":969322030},{"type":"v1","addr":"192.168.123.107:6829","nonce":969322030}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":34378,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2328013860","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2328013860},{"type":"v1","addr":"192.168.123.108:6827","nonce":2328013860}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":16}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:32.839429+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24279},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24279":{"gid":24279,"name":"cephfs.vm08.zcaqju","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.108:6825/1034426472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1034426472},{"type":"v1","addr":"192.168.123.108:6825","nonce":1034426472}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24279,"qdb_cluster":[24279]},"id":1}]} 2026-03-09T19:36:02.545 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 17 2026-03-09T19:36:02.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.549+0000 7ffa04e9a640 1 -- 192.168.123.107:0/2257810872 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff9d0077890 msgr2=0x7ff9d0079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:02.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.549+0000 7ffa04e9a640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff9d0077890 0x7ff9d0079d50 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7ffa00103000 tx=0x7ff9f4002cf0 comp rx=0 tx=0).stop 2026-03-09T19:36:02.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.549+0000 7ffa04e9a640 1 -- 192.168.123.107:0/2257810872 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa000ff520 msgr2=0x7ffa00101880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:02.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.549+0000 7ffa04e9a640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa000ff520 0x7ffa00101880 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7ff9e800b700 tx=0x7ff9e800bbd0 comp rx=0 tx=0).stop 2026-03-09T19:36:02.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.549+0000 7ffa04e9a640 1 -- 192.168.123.107:0/2257810872 shutdown_connections 2026-03-09T19:36:02.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.549+0000 7ffa04e9a640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff9d0077890 
0x7ff9d0079d50 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:02.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.549+0000 7ffa04e9a640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ffa000ffe40 0x7ffa00101dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:02.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.549+0000 7ffa04e9a640 1 --2- 192.168.123.107:0/2257810872 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa000ff520 0x7ffa00101880 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:02.547 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.549+0000 7ffa04e9a640 1 -- 192.168.123.107:0/2257810872 >> 192.168.123.107:0/2257810872 conn(0x7ffa000fb330 msgr2=0x7ffa000fbac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:02.548 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.549+0000 7ffa04e9a640 1 -- 192.168.123.107:0/2257810872 shutdown_connections 2026-03-09T19:36:02.548 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:02.549+0000 7ffa04e9a640 1 -- 192.168.123.107:0/2257810872 wait complete. 
2026-03-09T19:36:02.606 DEBUG:tasks.fs:max_mds reduced in epoch 17 2026-03-09T19:36:02.606 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 18 2026-03-09T19:36:02.779 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:02.806 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:02 vm07.local ceph-mon[111841]: pgmap v283: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:02.806 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:02 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1112206183' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-09T19:36:02.806 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:02 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/2257810872' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-09T19:36:03.022 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.024+0000 7ff3faa85640 1 -- 192.168.123.107:0/2335106959 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3f407ace0 msgr2=0x7ff3f407b0c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:03.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.024+0000 7ff3faa85640 1 --2- 192.168.123.107:0/2335106959 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3f407ace0 0x7ff3f407b0c0 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7ff3e40099b0 tx=0x7ff3e402f220 comp rx=0 tx=0).stop 2026-03-09T19:36:03.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.025+0000 7ff3faa85640 1 -- 192.168.123.107:0/2335106959 shutdown_connections 2026-03-09T19:36:03.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.025+0000 7ff3faa85640 1 --2- 192.168.123.107:0/2335106959 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3f407b690 0x7ff3f410bba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.025+0000 7ff3faa85640 1 --2- 192.168.123.107:0/2335106959 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3f407ace0 0x7ff3f407b0c0 unknown :-1 s=CLOSED pgs=176 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.025+0000 7ff3faa85640 1 -- 192.168.123.107:0/2335106959 >> 192.168.123.107:0/2335106959 conn(0x7ff3f40768b0 msgr2=0x7ff3f4078cd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:03.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.025+0000 7ff3faa85640 1 -- 192.168.123.107:0/2335106959 shutdown_connections 
2026-03-09T19:36:03.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.025+0000 7ff3faa85640 1 -- 192.168.123.107:0/2335106959 wait complete. 2026-03-09T19:36:03.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.025+0000 7ff3faa85640 1 Processor -- start 2026-03-09T19:36:03.023 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.026+0000 7ff3faa85640 1 -- start start 2026-03-09T19:36:03.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.026+0000 7ff3faa85640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3f407ace0 0x7ff3f419eda0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:03.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.026+0000 7ff3faa85640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3f407b690 0x7ff3f419f2e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:03.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.026+0000 7ff3faa85640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3f419f970 con 0x7ff3f407ace0 2026-03-09T19:36:03.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.026+0000 7ff3faa85640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3f41a36e0 con 0x7ff3f407b690 2026-03-09T19:36:03.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.026+0000 7ff3f3fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3f407ace0 0x7ff3f419eda0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:03.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.026+0000 7ff3f3fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3f407ace0 
0x7ff3f419eda0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:33786/0 (socket says 192.168.123.107:33786) 2026-03-09T19:36:03.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.026+0000 7ff3f3fff640 1 -- 192.168.123.107:0/1296755003 learned_addr learned my addr 192.168.123.107:0/1296755003 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:03.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.026+0000 7ff3f3fff640 1 -- 192.168.123.107:0/1296755003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3f407b690 msgr2=0x7ff3f419f2e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:36:03.024 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.026+0000 7ff3f37fe640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3f407b690 0x7ff3f419f2e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:03.025 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.027+0000 7ff3f3fff640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3f407b690 0x7ff3f419f2e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.025 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.027+0000 7ff3f3fff640 1 -- 192.168.123.107:0/1296755003 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff3e4009660 con 0x7ff3f407ace0 2026-03-09T19:36:03.025 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.027+0000 7ff3f37fe640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3f407b690 0x7ff3f419f2e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:36:03.025 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.027+0000 7ff3f3fff640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3f407ace0 0x7ff3f419eda0 secure :-1 s=READY pgs=177 cs=0 l=1 rev1=1 crypto rx=0x7ff3e4009980 tx=0x7ff3e4004360 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:03.025 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.027+0000 7ff3f17fa640 1 -- 192.168.123.107:0/1296755003 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff3e403d070 con 0x7ff3f407ace0 2026-03-09T19:36:03.025 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.027+0000 7ff3f17fa640 1 -- 192.168.123.107:0/1296755003 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff3e402fc90 con 0x7ff3f407ace0 2026-03-09T19:36:03.025 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.027+0000 7ff3faa85640 1 -- 192.168.123.107:0/1296755003 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff3f41a3960 con 0x7ff3f407ace0 2026-03-09T19:36:03.026 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.028+0000 7ff3faa85640 1 -- 192.168.123.107:0/1296755003 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff3f41a3e50 con 0x7ff3f407ace0 2026-03-09T19:36:03.027 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.028+0000 7ff3f17fa640 1 -- 192.168.123.107:0/1296755003 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff3e4041920 con 0x7ff3f407ace0 2026-03-09T19:36:03.030 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.029+0000 7ff3faa85640 1 -- 192.168.123.107:0/1296755003 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff3c0005350 con 0x7ff3f407ace0 2026-03-09T19:36:03.030 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.030+0000 7ff3f17fa640 1 -- 192.168.123.107:0/1296755003 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff3e4038cf0 con 0x7ff3f407ace0 2026-03-09T19:36:03.030 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.031+0000 7ff3f17fa640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff3d4077890 0x7ff3d4079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:03.030 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.031+0000 7ff3f17fa640 1 -- 192.168.123.107:0/1296755003 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7ff3e40be800 con 0x7ff3f407ace0 2026-03-09T19:36:03.031 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.033+0000 7ff3f17fa640 1 -- 192.168.123.107:0/1296755003 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff3e4086780 con 0x7ff3f407ace0 2026-03-09T19:36:03.031 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.033+0000 7ff3f37fe640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff3d4077890 0x7ff3d4079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:03.031 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.034+0000 7ff3f37fe640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff3d4077890 0x7ff3d4079d50 
secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7ff3f41a0350 tx=0x7ff3e0005e30 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:03.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:02 vm08.local ceph-mon[103420]: pgmap v283: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:03.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:02 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/1112206183' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-09T19:36:03.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:02 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/2257810872' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-09T19:36:03.145 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.147+0000 7ff3faa85640 1 -- 192.168.123.107:0/1296755003 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 18, "format": "json"} v 0) v1 -- 0x7ff3c00051c0 con 0x7ff3f407ace0 2026-03-09T19:36:03.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.148+0000 7ff3f17fa640 1 -- 192.168.123.107:0/1296755003 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 18, "format": "json"}]=0 dumped fsmap epoch 18 v33) v1 ==== 107+0+4989 (secure 0 0 0) 0x7ff3e40865a0 con 0x7ff3f407ace0 2026-03-09T19:36:03.149 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:03.149 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":18,"btime":"2026-03-09T19:33:42:050554+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6829/969322030","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":969322030},{"type":"v1","addr":"192.168.123.107:6829","nonce":969322030}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":34378,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2328013860","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2328013860},{"type":"v1","addr":"192.168.123.108:6827","nonce":2328013860}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":16},{"gid":34382,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:32.839429+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24279},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24279":{"gid":24279,"name":"cephfs.vm08.zcaqju","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.108:6825/1034426472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1034426472},{"type":"v1","addr":"192.168.123.108:6825","nonce":1034426472}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24279,"qdb_cluster":[24279]},"id":1}]} 2026-03-09T19:36:03.149 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 18 2026-03-09T19:36:03.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.152+0000 7ff3faa85640 1 -- 192.168.123.107:0/1296755003 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff3d4077890 msgr2=0x7ff3d4079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:03.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.152+0000 7ff3faa85640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff3d4077890 0x7ff3d4079d50 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7ff3f41a0350 tx=0x7ff3e0005e30 comp rx=0 tx=0).stop 2026-03-09T19:36:03.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.152+0000 7ff3faa85640 1 -- 192.168.123.107:0/1296755003 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3f407ace0 msgr2=0x7ff3f419eda0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:03.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.152+0000 7ff3faa85640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3f407ace0 0x7ff3f419eda0 secure :-1 s=READY pgs=177 cs=0 l=1 rev1=1 crypto rx=0x7ff3e4009980 tx=0x7ff3e4004360 comp rx=0 tx=0).stop 2026-03-09T19:36:03.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.152+0000 7ff3faa85640 1 -- 192.168.123.107:0/1296755003 shutdown_connections 2026-03-09T19:36:03.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.152+0000 7ff3faa85640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff3d4077890 0x7ff3d4079d50 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.152+0000 7ff3faa85640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3f407b690 0x7ff3f419f2e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.152+0000 7ff3faa85640 1 --2- 192.168.123.107:0/1296755003 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff3f407ace0 0x7ff3f419eda0 unknown :-1 s=CLOSED pgs=177 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.151 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.152+0000 7ff3faa85640 1 -- 192.168.123.107:0/1296755003 >> 192.168.123.107:0/1296755003 conn(0x7ff3f40768b0 msgr2=0x7ff3f4109df0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:03.151 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.153+0000 7ff3faa85640 1 -- 
192.168.123.107:0/1296755003 shutdown_connections 2026-03-09T19:36:03.151 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.153+0000 7ff3faa85640 1 -- 192.168.123.107:0/1296755003 wait complete. 2026-03-09T19:36:03.196 DEBUG:tasks.fs:max_mds reduced in epoch 18 2026-03-09T19:36:03.196 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 19 2026-03-09T19:36:03.347 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:03.604 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.606+0000 7fabd85b0640 1 -- 192.168.123.107:0/1903719358 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fabd01089d0 msgr2=0x7fabd0108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:03.605 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.606+0000 7fabd85b0640 1 --2- 192.168.123.107:0/1903719358 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fabd01089d0 0x7fabd0108db0 secure :-1 s=READY pgs=178 cs=0 l=1 rev1=1 crypto rx=0x7fabb8009a00 tx=0x7fabb802f280 comp rx=0 tx=0).stop 2026-03-09T19:36:03.605 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.607+0000 7fabd85b0640 1 -- 192.168.123.107:0/1903719358 shutdown_connections 2026-03-09T19:36:03.605 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.607+0000 7fabd85b0640 1 --2- 192.168.123.107:0/1903719358 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fabd01029d0 0x7fabd0102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.606 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.607+0000 7fabd85b0640 1 --2- 192.168.123.107:0/1903719358 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fabd01089d0 0x7fabd0108db0 
unknown :-1 s=CLOSED pgs=178 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.606 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.607+0000 7fabd85b0640 1 -- 192.168.123.107:0/1903719358 >> 192.168.123.107:0/1903719358 conn(0x7fabd00fe710 msgr2=0x7fabd0100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:03.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.609+0000 7fabd85b0640 1 -- 192.168.123.107:0/1903719358 shutdown_connections 2026-03-09T19:36:03.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.610+0000 7fabd85b0640 1 -- 192.168.123.107:0/1903719358 wait complete. 2026-03-09T19:36:03.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.610+0000 7fabd85b0640 1 Processor -- start 2026-03-09T19:36:03.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.610+0000 7fabd85b0640 1 -- start start 2026-03-09T19:36:03.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.610+0000 7fabd85b0640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fabd01029d0 0x7fabd006b670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:03.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.610+0000 7fabd85b0640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fabd01089d0 0x7fabd006bbb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:03.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.610+0000 7fabd85b0640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fabd006c180 con 0x7fabd01029d0 2026-03-09T19:36:03.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.610+0000 7fabd85b0640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fabd006c2f0 con 0x7fabd01089d0 2026-03-09T19:36:03.611 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.611+0000 7fabd5b24640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fabd01089d0 0x7fabd006bbb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:03.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.611+0000 7fabd5b24640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fabd01089d0 0x7fabd006bbb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:58062/0 (socket says 192.168.123.107:58062) 2026-03-09T19:36:03.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.611+0000 7fabd5b24640 1 -- 192.168.123.107:0/936440469 learned_addr learned my addr 192.168.123.107:0/936440469 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:03.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.612+0000 7fabd5b24640 1 -- 192.168.123.107:0/936440469 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fabd01029d0 msgr2=0x7fabd006b670 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:36:03.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.612+0000 7fabd5b24640 1 --2- 192.168.123.107:0/936440469 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fabd01029d0 0x7fabd006b670 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.612+0000 7fabd5b24640 1 -- 192.168.123.107:0/936440469 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fabb8009660 con 0x7fabd01089d0 2026-03-09T19:36:03.611 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.612+0000 7fabd5b24640 1 --2- 
192.168.123.107:0/936440469 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fabd01089d0 0x7fabd006bbb0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fabc000d8d0 tx=0x7fabc000dda0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:03.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.612+0000 7fabc77fe640 1 -- 192.168.123.107:0/936440469 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fabc0004490 con 0x7fabd01089d0 2026-03-09T19:36:03.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.613+0000 7fabd85b0640 1 -- 192.168.123.107:0/936440469 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fabd010ebd0 con 0x7fabd01089d0 2026-03-09T19:36:03.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.613+0000 7fabc77fe640 1 -- 192.168.123.107:0/936440469 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fabc0004d60 con 0x7fabd01089d0 2026-03-09T19:36:03.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.613+0000 7fabc77fe640 1 -- 192.168.123.107:0/936440469 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fabc0005230 con 0x7fabd01089d0 2026-03-09T19:36:03.612 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.614+0000 7fabd85b0640 1 -- 192.168.123.107:0/936440469 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fabd010f120 con 0x7fabd01089d0 2026-03-09T19:36:03.613 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.615+0000 7fabc77fe640 1 -- 192.168.123.107:0/936440469 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fabc000b9d0 con 0x7fabd01089d0 2026-03-09T19:36:03.614 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.616+0000 7fabc77fe640 
1 --2- 192.168.123.107:0/936440469 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7faba8077890 0x7faba8079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:03.614 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.616+0000 7fabc77fe640 1 -- 192.168.123.107:0/936440469 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fabc0098e40 con 0x7fabd01089d0 2026-03-09T19:36:03.614 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.616+0000 7fabd85b0640 1 -- 192.168.123.107:0/936440469 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fabd0104110 con 0x7fabd01089d0 2026-03-09T19:36:03.616 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.618+0000 7fabd6325640 1 --2- 192.168.123.107:0/936440469 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7faba8077890 0x7faba8079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:03.616 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.618+0000 7fabd6325640 1 --2- 192.168.123.107:0/936440469 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7faba8077890 0x7faba8079d50 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fabb802f790 tx=0x7fabb80023d0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:03.618 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.620+0000 7fabc77fe640 1 -- 192.168.123.107:0/936440469 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fabc009e050 con 0x7fabd01089d0 2026-03-09T19:36:03.734 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.736+0000 7fabd85b0640 1 -- 192.168.123.107:0/936440469 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 19, "format": "json"} v 0) v1 -- 0x7fabd00767a0 con 0x7fabd01089d0 2026-03-09T19:36:03.735 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.737+0000 7fabc77fe640 1 -- 192.168.123.107:0/936440469 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 19, "format": "json"}]=0 dumped fsmap epoch 19 v33) v1 ==== 107+0+4209 (secure 0 0 0) 0x7fabc0014090 con 0x7fabd01089d0 2026-03-09T19:36:03.736 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:03.736 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":19,"btime":"2026-03-09T19:33:46:714266+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34378,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2328013860","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2328013860},{"type":"v1","addr":"192.168.123.108:6827","nonce":2328013860}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no 
anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":16},{"gid":34382,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:32.839429+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24279},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24279":{"gid":24279,"name":"cephfs.vm08.zcaqju","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.108:6825/1034426472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1034426472},{"type":"v1","addr":"192.168.123.108:6825","nonce":1034426472}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24279,"qdb_cluster":[24279]},"id":1}]} 2026-03-09T19:36:03.736 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 19 2026-03-09T19:36:03.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.741+0000 7fabd85b0640 1 -- 192.168.123.107:0/936440469 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7faba8077890 msgr2=0x7faba8079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:03.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.741+0000 7fabd85b0640 1 --2- 192.168.123.107:0/936440469 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7faba8077890 0x7faba8079d50 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fabb802f790 tx=0x7fabb80023d0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.741+0000 7fabd85b0640 1 -- 192.168.123.107:0/936440469 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fabd01089d0 msgr2=0x7fabd006bbb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:03.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.741+0000 7fabd85b0640 1 --2- 192.168.123.107:0/936440469 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fabd01089d0 0x7fabd006bbb0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fabc000d8d0 tx=0x7fabc000dda0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.739 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.741+0000 7fabd85b0640 1 -- 192.168.123.107:0/936440469 shutdown_connections 2026-03-09T19:36:03.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.741+0000 7fabd85b0640 1 --2- 192.168.123.107:0/936440469 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7faba8077890 0x7faba8079d50 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.741+0000 7fabd85b0640 1 --2- 192.168.123.107:0/936440469 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fabd01089d0 0x7fabd006bbb0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.741+0000 7fabd85b0640 1 --2- 192.168.123.107:0/936440469 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fabd01029d0 0x7fabd006b670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:03.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.741+0000 7fabd85b0640 1 -- 192.168.123.107:0/936440469 >> 192.168.123.107:0/936440469 conn(0x7fabd00fe710 msgr2=0x7fabd0109830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:03.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.741+0000 7fabd85b0640 1 -- 
192.168.123.107:0/936440469 shutdown_connections 2026-03-09T19:36:03.740 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:03.742+0000 7fabd85b0640 1 -- 192.168.123.107:0/936440469 wait complete. 2026-03-09T19:36:03.817 DEBUG:tasks.fs:max_mds reduced in epoch 19 2026-03-09T19:36:03.817 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 20 2026-03-09T19:36:03.976 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:04.005 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:03 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1296755003' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-09T19:36:04.005 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:36:04.005 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:03 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/936440469' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-09T19:36:04.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:03 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/1296755003' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-09T19:36:04.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:36:04.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:03 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/936440469' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-09T19:36:04.238 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.240+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/1597811211 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cd40ffea0 msgr2=0x7f9cd410cd70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:04.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.240+0000 7f9cdc2d5640 1 --2- 192.168.123.107:0/1597811211 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cd40ffea0 0x7f9cd410cd70 secure :-1 s=READY pgs=179 cs=0 l=1 rev1=1 crypto rx=0x7f9cc40099b0 tx=0x7f9cc402f220 comp rx=0 tx=0).stop 2026-03-09T19:36:04.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.240+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/1597811211 shutdown_connections 2026-03-09T19:36:04.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.241+0000 7f9cdc2d5640 1 --2- 192.168.123.107:0/1597811211 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cd40ffea0 0x7f9cd410cd70 unknown :-1 s=CLOSED pgs=179 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.241+0000 7f9cdc2d5640 1 --2- 192.168.123.107:0/1597811211 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cd40ff580 0x7f9cd40ff960 unknown :-1 s=CLOSED pgs=0 
cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.241+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/1597811211 >> 192.168.123.107:0/1597811211 conn(0x7f9cd40fb3d0 msgr2=0x7f9cd40fd7f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:04.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.241+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/1597811211 shutdown_connections 2026-03-09T19:36:04.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.241+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/1597811211 wait complete. 2026-03-09T19:36:04.239 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.242+0000 7f9cdc2d5640 1 Processor -- start 2026-03-09T19:36:04.240 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.242+0000 7f9cdc2d5640 1 -- start start 2026-03-09T19:36:04.240 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.242+0000 7f9cdc2d5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cd40ff580 0x7f9cd41018e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:04.240 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.242+0000 7f9cda04a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cd40ff580 0x7f9cd41018e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:04.240 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.243+0000 7f9cda04a640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cd40ff580 0x7f9cd41018e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:33804/0 (socket says 192.168.123.107:33804) 2026-03-09T19:36:04.241 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.243+0000 7f9cdc2d5640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cd40ffea0 0x7f9cd4101e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:04.241 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.243+0000 7f9cda04a640 1 -- 192.168.123.107:0/3863653907 learned_addr learned my addr 192.168.123.107:0/3863653907 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:04.241 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.243+0000 7f9cdc2d5640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cd4105960 con 0x7f9cd40ff580 2026-03-09T19:36:04.241 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.243+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/3863653907 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cd4105ad0 con 0x7f9cd40ffea0 2026-03-09T19:36:04.241 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.243+0000 7f9cd9849640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cd40ffea0 0x7f9cd4101e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:04.241 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.243+0000 7f9cd9849640 1 -- 192.168.123.107:0/3863653907 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cd40ff580 msgr2=0x7f9cd41018e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:04.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.244+0000 7f9cd9849640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cd40ff580 0x7f9cd41018e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T19:36:04.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.244+0000 7f9cd9849640 1 -- 192.168.123.107:0/3863653907 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9cc4009660 con 0x7f9cd40ffea0 2026-03-09T19:36:04.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.244+0000 7f9cda04a640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cd40ff580 0x7f9cd41018e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:36:04.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.244+0000 7f9cd9849640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cd40ffea0 0x7f9cd4101e20 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f9cc4009980 tx=0x7f9cc40043d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:04.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.244+0000 7f9ccb7fe640 1 -- 192.168.123.107:0/3863653907 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9cc403d070 con 0x7f9cd40ffea0 2026-03-09T19:36:04.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.244+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/3863653907 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9cd41023c0 con 0x7f9cd40ffea0 2026-03-09T19:36:04.242 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.245+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/3863653907 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9cd41a9140 con 0x7f9cd40ffea0 2026-03-09T19:36:04.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.245+0000 7f9ccb7fe640 1 -- 192.168.123.107:0/3863653907 <== mon.1 
v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9cc402fc90 con 0x7f9cd40ffea0 2026-03-09T19:36:04.243 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.245+0000 7f9ccb7fe640 1 -- 192.168.123.107:0/3863653907 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9cc4041840 con 0x7f9cd40ffea0 2026-03-09T19:36:04.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.246+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/3863653907 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9cd4069680 con 0x7f9cd40ffea0 2026-03-09T19:36:04.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.246+0000 7f9ccb7fe640 1 -- 192.168.123.107:0/3863653907 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9cc4038730 con 0x7f9cd40ffea0 2026-03-09T19:36:04.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.247+0000 7f9ccb7fe640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ca80778e0 0x7f9ca8079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:04.245 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.247+0000 7f9cda04a640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ca80778e0 0x7f9ca8079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:04.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.248+0000 7f9cda04a640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ca80778e0 0x7f9ca8079da0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f9cbc0046b0 
tx=0x7f9cbc0092c0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:04.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.248+0000 7f9ccb7fe640 1 -- 192.168.123.107:0/3863653907 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f9cc40be540 con 0x7f9cd40ffea0 2026-03-09T19:36:04.247 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.249+0000 7f9ccb7fe640 1 -- 192.168.123.107:0/3863653907 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9cc4086ca0 con 0x7f9cd40ffea0 2026-03-09T19:36:04.369 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.370+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/3863653907 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 20, "format": "json"} v 0) v1 -- 0x7f9cd410fde0 con 0x7f9cd40ffea0 2026-03-09T19:36:04.370 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.371+0000 7f9ccb7fe640 1 -- 192.168.123.107:0/3863653907 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 20, "format": "json"}]=0 dumped fsmap epoch 20 v33) v1 ==== 107+0+5057 (secure 0 0 0) 0x7f9cc40863f0 con 0x7f9cd40ffea0 2026-03-09T19:36:04.370 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:04.370 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":20,"btime":"2026-03-09T19:33:48:296741+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34378,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2328013860","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2328013860},{"type":"v1","addr":"192.168.123.108:6827","nonce":2328013860}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":16},{"gid":34382,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":18},{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:32.839429+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":41,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24279},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24279":{"gid":24279,"name":"cephfs.vm08.zcaqju","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.108:6825/1034426472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1034426472},{"type":"v1","addr":"192.168.123.108:6825","nonce":1034426472}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24279,"qdb_cluster":[24279]},"id":1}]} 2026-03-09T19:36:04.370 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 20 2026-03-09T19:36:04.372 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.374+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/3863653907 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ca80778e0 msgr2=0x7f9ca8079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:04.373 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.374+0000 7f9cdc2d5640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ca80778e0 0x7f9ca8079da0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f9cbc0046b0 tx=0x7f9cbc0092c0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.373 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.374+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/3863653907 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cd40ffea0 msgr2=0x7f9cd4101e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:04.373 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.374+0000 7f9cdc2d5640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cd40ffea0 0x7f9cd4101e20 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f9cc4009980 tx=0x7f9cc40043d0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.373 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.375+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/3863653907 shutdown_connections 2026-03-09T19:36:04.373 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.375+0000 7f9cdc2d5640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9ca80778e0 0x7f9ca8079da0 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.373 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.375+0000 7f9cdc2d5640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cd40ffea0 0x7f9cd4101e20 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.373 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.375+0000 7f9cdc2d5640 1 --2- 192.168.123.107:0/3863653907 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cd40ff580 0x7f9cd41018e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.373 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.375+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/3863653907 >> 192.168.123.107:0/3863653907 conn(0x7f9cd40fb3d0 msgr2=0x7f9cd40fbb80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:04.373 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.375+0000 7f9cdc2d5640 1 -- 
192.168.123.107:0/3863653907 shutdown_connections 2026-03-09T19:36:04.374 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.375+0000 7f9cdc2d5640 1 -- 192.168.123.107:0/3863653907 wait complete. 2026-03-09T19:36:04.436 DEBUG:tasks.fs:max_mds reduced in epoch 20 2026-03-09T19:36:04.436 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 21 2026-03-09T19:36:04.595 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:04.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.849+0000 7f31ac8eb640 1 -- 192.168.123.107:0/452192712 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31a41029d0 msgr2=0x7f31a4102e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:04.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.849+0000 7f31ac8eb640 1 --2- 192.168.123.107:0/452192712 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31a41029d0 0x7f31a4102e30 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f319c0099b0 tx=0x7f319c02f220 comp rx=0 tx=0).stop 2026-03-09T19:36:04.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.850+0000 7f31ac8eb640 1 -- 192.168.123.107:0/452192712 shutdown_connections 2026-03-09T19:36:04.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.850+0000 7f31ac8eb640 1 --2- 192.168.123.107:0/452192712 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31a41029d0 0x7f31a4102e30 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.850+0000 7f31ac8eb640 1 --2- 192.168.123.107:0/452192712 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31a41089d0 0x7f31a4108db0 
unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.850+0000 7f31ac8eb640 1 -- 192.168.123.107:0/452192712 >> 192.168.123.107:0/452192712 conn(0x7f31a40fe710 msgr2=0x7f31a4100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:04.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.850+0000 7f31ac8eb640 1 -- 192.168.123.107:0/452192712 shutdown_connections 2026-03-09T19:36:04.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.851+0000 7f31ac8eb640 1 -- 192.168.123.107:0/452192712 wait complete. 2026-03-09T19:36:04.849 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.851+0000 7f31ac8eb640 1 Processor -- start 2026-03-09T19:36:04.849 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.851+0000 7f31ac8eb640 1 -- start start 2026-03-09T19:36:04.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.852+0000 7f31ac8eb640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31a41029d0 0x7f31a41a0680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:04.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.852+0000 7f31aa660640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31a41029d0 0x7f31a41a0680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:04.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.852+0000 7f31aa660640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31a41029d0 0x7f31a41a0680 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:33826/0 (socket says 192.168.123.107:33826) 2026-03-09T19:36:04.850 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.852+0000 7f31ac8eb640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31a41089d0 0x7f31a41a0bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:04.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.852+0000 7f31ac8eb640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f31a419a770 con 0x7f31a41029d0 2026-03-09T19:36:04.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.852+0000 7f31ac8eb640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f31a419a8e0 con 0x7f31a41089d0 2026-03-09T19:36:04.850 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.852+0000 7f31aa660640 1 -- 192.168.123.107:0/716819798 learned_addr learned my addr 192.168.123.107:0/716819798 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:04.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.853+0000 7f31a9e5f640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31a41089d0 0x7f31a41a0bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:04.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.853+0000 7f31a9e5f640 1 -- 192.168.123.107:0/716819798 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31a41029d0 msgr2=0x7f31a41a0680 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:04.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.853+0000 7f31a9e5f640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31a41029d0 0x7f31a41a0680 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.851 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.853+0000 7f31a9e5f640 1 -- 192.168.123.107:0/716819798 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3194009590 con 0x7f31a41089d0 2026-03-09T19:36:04.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.853+0000 7f31aa660640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31a41029d0 0x7f31a41a0680 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T19:36:04.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.853+0000 7f31a9e5f640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31a41089d0 0x7f31a41a0bc0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f319c02f730 tx=0x7f319c004290 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:04.851 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.853+0000 7f31937fe640 1 -- 192.168.123.107:0/716819798 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f319c03d070 con 0x7f31a41089d0 2026-03-09T19:36:04.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.853+0000 7f31937fe640 1 -- 192.168.123.107:0/716819798 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f319c0384e0 con 0x7f31a41089d0 2026-03-09T19:36:04.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.853+0000 7f31937fe640 1 -- 192.168.123.107:0/716819798 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f319c041620 con 0x7f31a41089d0 2026-03-09T19:36:04.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.854+0000 7f31ac8eb640 1 -- 192.168.123.107:0/716819798 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f319c009660 con 0x7f31a41089d0 2026-03-09T19:36:04.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.854+0000 7f31ac8eb640 1 -- 192.168.123.107:0/716819798 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f31a419aec0 con 0x7f31a41089d0 2026-03-09T19:36:04.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.855+0000 7f31ac8eb640 1 -- 192.168.123.107:0/716819798 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3174005350 con 0x7f31a41089d0 2026-03-09T19:36:04.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.856+0000 7f31937fe640 1 -- 192.168.123.107:0/716819798 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f319c02fa80 con 0x7f31a41089d0 2026-03-09T19:36:04.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.856+0000 7f31937fe640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f31780778e0 0x7f3178079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:04.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.856+0000 7f31937fe640 1 -- 192.168.123.107:0/716819798 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f319c0be920 con 0x7f31a41089d0 2026-03-09T19:36:04.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.856+0000 7f31aa660640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f31780778e0 0x7f3178079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:04.855 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.857+0000 7f31aa660640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f31780778e0 0x7f3178079da0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f3194009920 tx=0x7f3194009290 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:04.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.858+0000 7f31937fe640 1 -- 192.168.123.107:0/716819798 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f319c086fd0 con 0x7f31a41089d0 2026-03-09T19:36:04.969 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:04 vm07.local ceph-mon[111841]: pgmap v284: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:04.969 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:04 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/3863653907' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-09T19:36:04.969 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.971+0000 7f31ac8eb640 1 -- 192.168.123.107:0/716819798 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 21, "format": "json"} v 0) v1 -- 0x7f31740051c0 con 0x7f31a41089d0 2026-03-09T19:36:04.970 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.972+0000 7f31937fe640 1 -- 192.168.123.107:0/716819798 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 21, "format": "json"}]=0 dumped fsmap epoch 21 v33) v1 ==== 107+0+4256 (secure 0 0 0) 0x7f319c086720 con 0x7f31a41089d0 2026-03-09T19:36:04.972 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:04.972 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":21,"btime":"2026-03-09T19:33:51:906102+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34378,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2328013860","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2328013860},{"type":"v1","addr":"192.168.123.108:6827","nonce":2328013860}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":16},{"gid":34382,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":20}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:51.906100+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":101,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[1],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T19:36:04.972 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 21 2026-03-09T19:36:04.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.976+0000 7f31ac8eb640 1 -- 192.168.123.107:0/716819798 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f31780778e0 msgr2=0x7f3178079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:04.974 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.977+0000 7f31ac8eb640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f31780778e0 0x7f3178079da0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f3194009920 
tx=0x7f3194009290 comp rx=0 tx=0).stop 2026-03-09T19:36:04.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.977+0000 7f31ac8eb640 1 -- 192.168.123.107:0/716819798 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31a41089d0 msgr2=0x7f31a41a0bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:04.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.977+0000 7f31ac8eb640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31a41089d0 0x7f31a41a0bc0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f319c02f730 tx=0x7f319c004290 comp rx=0 tx=0).stop 2026-03-09T19:36:04.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.978+0000 7f31ac8eb640 1 -- 192.168.123.107:0/716819798 shutdown_connections 2026-03-09T19:36:04.975 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.978+0000 7f31ac8eb640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f31780778e0 0x7f3178079da0 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.978+0000 7f31ac8eb640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31a41089d0 0x7f31a41a0bc0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.978+0000 7f31ac8eb640 1 --2- 192.168.123.107:0/716819798 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f31a41029d0 0x7f31a41a0680 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:04.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.978+0000 7f31ac8eb640 1 -- 192.168.123.107:0/716819798 >> 192.168.123.107:0/716819798 conn(0x7f31a40fe710 
msgr2=0x7f31a410ca00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:04.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.978+0000 7f31ac8eb640 1 -- 192.168.123.107:0/716819798 shutdown_connections 2026-03-09T19:36:04.976 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:04.978+0000 7f31ac8eb640 1 -- 192.168.123.107:0/716819798 wait complete. 2026-03-09T19:36:05.050 DEBUG:tasks.fs:max_mds reduced in epoch 21 2026-03-09T19:36:05.050 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 22 2026-03-09T19:36:05.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:04 vm08.local ceph-mon[103420]: pgmap v284: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:05.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:04 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/3863653907' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-09T19:36:05.207 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:05.475 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.476+0000 7f976c5fa640 1 -- 192.168.123.107:0/4116966850 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97641089d0 msgr2=0x7f9764108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:05.475 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.476+0000 7f976c5fa640 1 --2- 192.168.123.107:0/4116966850 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97641089d0 0x7f9764108db0 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7f974c009a00 tx=0x7f974c02f270 comp rx=0 tx=0).stop 2026-03-09T19:36:05.475 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.477+0000 
7f976c5fa640 1 -- 192.168.123.107:0/4116966850 shutdown_connections 2026-03-09T19:36:05.475 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.477+0000 7f976c5fa640 1 --2- 192.168.123.107:0/4116966850 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97641029d0 0x7f9764102e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:05.475 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.477+0000 7f976c5fa640 1 --2- 192.168.123.107:0/4116966850 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97641089d0 0x7f9764108db0 secure :-1 s=CLOSED pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7f974c009a00 tx=0x7f974c02f270 comp rx=0 tx=0).stop 2026-03-09T19:36:05.475 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.477+0000 7f976c5fa640 1 -- 192.168.123.107:0/4116966850 >> 192.168.123.107:0/4116966850 conn(0x7f97640fe710 msgr2=0x7f9764100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:05.475 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.477+0000 7f976c5fa640 1 -- 192.168.123.107:0/4116966850 shutdown_connections 2026-03-09T19:36:05.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.478+0000 7f976c5fa640 1 -- 192.168.123.107:0/4116966850 wait complete. 
2026-03-09T19:36:05.476 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.478+0000 7f976c5fa640 1 Processor -- start 2026-03-09T19:36:05.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.478+0000 7f976c5fa640 1 -- start start 2026-03-09T19:36:05.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.478+0000 7f976c5fa640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97641029d0 0x7f9764075700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:05.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.478+0000 7f976c5fa640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9764079680 0x7f9764075c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:05.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.478+0000 7f976c5fa640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9764076210 con 0x7f97641029d0 2026-03-09T19:36:05.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.478+0000 7f976c5fa640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9764076380 con 0x7f9764079680 2026-03-09T19:36:05.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.479+0000 7f9769b6e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9764079680 0x7f9764075c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:05.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.479+0000 7f9769b6e640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9764079680 0x7f9764075c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.107:58100/0 (socket says 192.168.123.107:58100) 2026-03-09T19:36:05.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.479+0000 7f9769b6e640 1 -- 192.168.123.107:0/2444050592 learned_addr learned my addr 192.168.123.107:0/2444050592 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:05.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.479+0000 7f9769b6e640 1 -- 192.168.123.107:0/2444050592 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97641029d0 msgr2=0x7f9764075700 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:36:05.477 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.479+0000 7f976a36f640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97641029d0 0x7f9764075700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:05.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.479+0000 7f9769b6e640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97641029d0 0x7f9764075700 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:05.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.480+0000 7f9769b6e640 1 -- 192.168.123.107:0/2444050592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f974c009660 con 0x7f9764079680 2026-03-09T19:36:05.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.480+0000 7f976a36f640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97641029d0 0x7f9764075700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:36:05.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.480+0000 7f9769b6e640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9764079680 0x7f9764075c40 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f975400d900 tx=0x7f975400ddd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:05.478 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.480+0000 7f975b7fe640 1 -- 192.168.123.107:0/2444050592 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9754004490 con 0x7f9764079680 2026-03-09T19:36:05.479 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.480+0000 7f975b7fe640 1 -- 192.168.123.107:0/2444050592 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9754004d60 con 0x7f9764079680 2026-03-09T19:36:05.479 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.480+0000 7f976c5fa640 1 -- 192.168.123.107:0/2444050592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f97640719c0 con 0x7f9764079680 2026-03-09T19:36:05.479 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.480+0000 7f975b7fe640 1 -- 192.168.123.107:0/2444050592 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9754005230 con 0x7f9764079680 2026-03-09T19:36:05.479 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.480+0000 7f976c5fa640 1 -- 192.168.123.107:0/2444050592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9764071f10 con 0x7f9764079680 2026-03-09T19:36:05.479 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.481+0000 7f976c5fa640 1 -- 192.168.123.107:0/2444050592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f972c005350 con 0x7f9764079680 2026-03-09T19:36:05.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.482+0000 7f975b7fe640 1 -- 192.168.123.107:0/2444050592 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f975400b8d0 con 0x7f9764079680 2026-03-09T19:36:05.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.483+0000 7f975b7fe640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9740077890 0x7f9740079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:05.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.483+0000 7f976a36f640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9740077890 0x7f9740079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:05.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.484+0000 7f975b7fe640 1 -- 192.168.123.107:0/2444050592 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f975409a1a0 con 0x7f9764079680 2026-03-09T19:36:05.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.484+0000 7f976a36f640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9740077890 0x7f9740079d50 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f974c0040c0 tx=0x7f974c002d80 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:05.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.484+0000 7f975b7fe640 1 -- 192.168.123.107:0/2444050592 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9754062850 con 0x7f9764079680 2026-03-09T19:36:05.601 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.603+0000 7f976c5fa640 1 -- 192.168.123.107:0/2444050592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 22, "format": "json"} v 0) v1 -- 0x7f972c0058d0 con 0x7f9764079680 2026-03-09T19:36:05.603 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.604+0000 7f975b7fe640 1 -- 192.168.123.107:0/2444050592 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 22, "format": "json"}]=0 dumped fsmap epoch 22 v33) v1 ==== 107+0+4267 (secure 0 0 0) 0x7f9754061fa0 con 0x7f9764079680 2026-03-09T19:36:05.604 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:05.605 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":22,"btime":"2026-03-09T19:33:51:912843+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34382,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20}],"filesystems":[{"mdsmap":{"epoch":22,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:51.912836+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":101,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag 
is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":34378},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34378":{"gid":34378,"name":"cephfs.vm08.jwsqrf","rank":0,"incarnation":22,"state":"up:replay","state_seq":1,"addr":"192.168.123.108:6827/2328013860","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2328013860},{"type":"v1","addr":"192.168.123.108:6827","nonce":2328013860}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T19:36:05.605 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 22 2026-03-09T19:36:05.607 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.609+0000 7f976c5fa640 1 -- 192.168.123.107:0/2444050592 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9740077890 msgr2=0x7f9740079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:05.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.609+0000 7f976c5fa640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9740077890 0x7f9740079d50 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f974c0040c0 tx=0x7f974c002d80 comp rx=0 tx=0).stop 2026-03-09T19:36:05.608 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.609+0000 7f976c5fa640 1 -- 192.168.123.107:0/2444050592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9764079680 msgr2=0x7f9764075c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:05.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.609+0000 7f976c5fa640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9764079680 0x7f9764075c40 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f975400d900 tx=0x7f975400ddd0 comp rx=0 tx=0).stop 2026-03-09T19:36:05.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.610+0000 7f976c5fa640 1 -- 192.168.123.107:0/2444050592 shutdown_connections 2026-03-09T19:36:05.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.610+0000 7f976c5fa640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9740077890 0x7f9740079d50 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:05.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.610+0000 7f976c5fa640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9764079680 0x7f9764075c40 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:05.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.610+0000 7f976c5fa640 1 --2- 192.168.123.107:0/2444050592 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97641029d0 0x7f9764075700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:05.608 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.610+0000 7f976c5fa640 1 -- 192.168.123.107:0/2444050592 >> 192.168.123.107:0/2444050592 conn(0x7f97640fe710 msgr2=0x7f97640feaf0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T19:36:05.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.610+0000 7f976c5fa640 1 -- 192.168.123.107:0/2444050592 shutdown_connections 2026-03-09T19:36:05.609 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:05.610+0000 7f976c5fa640 1 -- 192.168.123.107:0/2444050592 wait complete. 2026-03-09T19:36:05.688 DEBUG:tasks.fs:max_mds reduced in epoch 22 2026-03-09T19:36:05.688 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 23 2026-03-09T19:36:05.873 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:05.903 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:05 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/716819798' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-09T19:36:05.903 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:05 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/2444050592' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-09T19:36:06.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:05 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/716819798' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-09T19:36:06.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:05 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/2444050592' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-09T19:36:06.145 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.146+0000 7fcb57390640 1 -- 192.168.123.107:0/1179821689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb50100770 msgr2=0x7fcb50100bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:06.145 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.146+0000 7fcb57390640 1 --2- 192.168.123.107:0/1179821689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb50100770 0x7fcb50100bd0 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7fcb440099b0 tx=0x7fcb4402f220 comp rx=0 tx=0).stop 2026-03-09T19:36:06.145 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.147+0000 7fcb57390640 1 -- 192.168.123.107:0/1179821689 shutdown_connections 2026-03-09T19:36:06.145 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.147+0000 7fcb57390640 1 --2- 192.168.123.107:0/1179821689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb50100770 0x7fcb50100bd0 unknown :-1 s=CLOSED pgs=181 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.145 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.147+0000 7fcb57390640 1 --2- 192.168.123.107:0/1179821689 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb50106770 0x7fcb50106b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.145 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.147+0000 7fcb57390640 1 -- 192.168.123.107:0/1179821689 >> 192.168.123.107:0/1179821689 conn(0x7fcb500fc470 msgr2=0x7fcb500fe890 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:06.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.148+0000 7fcb57390640 1 -- 192.168.123.107:0/1179821689 shutdown_connections 
2026-03-09T19:36:06.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.148+0000 7fcb57390640 1 -- 192.168.123.107:0/1179821689 wait complete. 2026-03-09T19:36:06.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.148+0000 7fcb57390640 1 Processor -- start 2026-03-09T19:36:06.146 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.149+0000 7fcb57390640 1 -- start start 2026-03-09T19:36:06.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.149+0000 7fcb57390640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb50100770 0x7fcb5006d930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:06.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.149+0000 7fcb57390640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb50106770 0x7fcb5006de70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:06.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.149+0000 7fcb57390640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb5006e3b0 con 0x7fcb50100770 2026-03-09T19:36:06.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.149+0000 7fcb57390640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb5006e520 con 0x7fcb50106770 2026-03-09T19:36:06.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.149+0000 7fcb54904640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb50106770 0x7fcb5006de70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:06.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.149+0000 7fcb54904640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb50106770 
0x7fcb5006de70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:58122/0 (socket says 192.168.123.107:58122) 2026-03-09T19:36:06.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.149+0000 7fcb54904640 1 -- 192.168.123.107:0/1870584494 learned_addr learned my addr 192.168.123.107:0/1870584494 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:06.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.149+0000 7fcb54904640 1 -- 192.168.123.107:0/1870584494 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb50100770 msgr2=0x7fcb5006d930 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:06.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.149+0000 7fcb55105640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb50100770 0x7fcb5006d930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:06.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.150+0000 7fcb54904640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb50100770 0x7fcb5006d930 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.147 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.150+0000 7fcb54904640 1 -- 192.168.123.107:0/1870584494 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcb44009660 con 0x7fcb50106770 2026-03-09T19:36:06.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.150+0000 7fcb55105640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb50100770 0x7fcb5006d930 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:36:06.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.150+0000 7fcb54904640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb50106770 0x7fcb5006de70 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fcb4402f730 tx=0x7fcb440043d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:06.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.150+0000 7fcb3e7fc640 1 -- 192.168.123.107:0/1870584494 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb4403d070 con 0x7fcb50106770 2026-03-09T19:36:06.148 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.151+0000 7fcb3e7fc640 1 -- 192.168.123.107:0/1870584494 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fcb4402fc90 con 0x7fcb50106770 2026-03-09T19:36:06.149 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.151+0000 7fcb3e7fc640 1 -- 192.168.123.107:0/1870584494 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb440417b0 con 0x7fcb50106770 2026-03-09T19:36:06.149 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.151+0000 7fcb57390640 1 -- 192.168.123.107:0/1870584494 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcb501b1530 con 0x7fcb50106770 2026-03-09T19:36:06.149 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.151+0000 7fcb57390640 1 -- 192.168.123.107:0/1870584494 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcb501b19a0 con 0x7fcb50106770 2026-03-09T19:36:06.150 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.152+0000 7fcb1ffff640 1 -- 192.168.123.107:0/1870584494 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcb50101eb0 con 0x7fcb50106770 2026-03-09T19:36:06.152 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.154+0000 7fcb3e7fc640 1 -- 192.168.123.107:0/1870584494 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcb44038730 con 0x7fcb50106770 2026-03-09T19:36:06.152 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.154+0000 7fcb3e7fc640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcb300776d0 0x7fcb30079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:06.153 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.155+0000 7fcb55105640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcb300776d0 0x7fcb30079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:06.155 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.156+0000 7fcb55105640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcb300776d0 0x7fcb30079b90 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fcb40008bc0 tx=0x7fcb40005e90 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:06.155 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.156+0000 7fcb3e7fc640 1 -- 192.168.123.107:0/1870584494 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fcb440becb0 con 0x7fcb50106770 2026-03-09T19:36:06.156 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.158+0000 7fcb3e7fc640 1 -- 192.168.123.107:0/1870584494 
<== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcb44087360 con 0x7fcb50106770 2026-03-09T19:36:06.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.281+0000 7fcb1ffff640 1 -- 192.168.123.107:0/1870584494 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 23, "format": "json"} v 0) v1 -- 0x7fcb50100bd0 con 0x7fcb50106770 2026-03-09T19:36:06.280 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.282+0000 7fcb3e7fc640 1 -- 192.168.123.107:0/1870584494 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 23, "format": "json"}]=0 dumped fsmap epoch 23 v33) v1 ==== 107+0+4270 (secure 0 0 0) 0x7fcb44086ab0 con 0x7fcb50106770 2026-03-09T19:36:06.281 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:06.281 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":23,"btime":"2026-03-09T19:33:56:353933+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34382,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default 
file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:55.856997+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":101,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":34378},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34378":{"gid":34378,"name":"cephfs.vm08.jwsqrf","rank":0,"incarnation":22,"state":"up:reconnect","state_seq":7,"addr":"192.168.123.108:6827/2328013860","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2328013860},{"type":"v1","addr":"192.168.123.108:6827","nonce":2328013860}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T19:36:06.281 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 23 2026-03-09T19:36:06.283 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.285+0000 7fcb1ffff640 1 -- 192.168.123.107:0/1870584494 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcb300776d0 msgr2=0x7fcb30079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:06.283 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.285+0000 7fcb1ffff640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcb300776d0 0x7fcb30079b90 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fcb40008bc0 
tx=0x7fcb40005e90 comp rx=0 tx=0).stop 2026-03-09T19:36:06.283 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.285+0000 7fcb1ffff640 1 -- 192.168.123.107:0/1870584494 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb50106770 msgr2=0x7fcb5006de70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:06.283 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.285+0000 7fcb1ffff640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb50106770 0x7fcb5006de70 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fcb4402f730 tx=0x7fcb440043d0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.283 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.286+0000 7fcb1ffff640 1 -- 192.168.123.107:0/1870584494 shutdown_connections 2026-03-09T19:36:06.283 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.286+0000 7fcb1ffff640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fcb300776d0 0x7fcb30079b90 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.286+0000 7fcb1ffff640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb50106770 0x7fcb5006de70 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.286+0000 7fcb1ffff640 1 --2- 192.168.123.107:0/1870584494 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb50100770 0x7fcb5006d930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.286+0000 7fcb1ffff640 1 -- 192.168.123.107:0/1870584494 >> 192.168.123.107:0/1870584494 
conn(0x7fcb500fc470 msgr2=0x7fcb5010a730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:06.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.286+0000 7fcb1ffff640 1 -- 192.168.123.107:0/1870584494 shutdown_connections 2026-03-09T19:36:06.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.286+0000 7fcb1ffff640 1 -- 192.168.123.107:0/1870584494 wait complete. 2026-03-09T19:36:06.337 DEBUG:tasks.fs:max_mds reduced in epoch 23 2026-03-09T19:36:06.337 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 24 2026-03-09T19:36:06.506 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:06.768 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.769+0000 7f2db7730640 1 -- 192.168.123.107:0/2156545273 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2db0108a00 msgr2=0x7f2db0108de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:06.768 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.769+0000 7f2db7730640 1 --2- 192.168.123.107:0/2156545273 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2db0108a00 0x7f2db0108de0 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7f2da00099b0 tx=0x7f2da002f220 comp rx=0 tx=0).stop 2026-03-09T19:36:06.768 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.770+0000 7f2db7730640 1 -- 192.168.123.107:0/2156545273 shutdown_connections 2026-03-09T19:36:06.768 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.770+0000 7f2db7730640 1 --2- 192.168.123.107:0/2156545273 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2db0102a00 0x7f2db0102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.768 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.770+0000 7f2db7730640 1 --2- 192.168.123.107:0/2156545273 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2db0108a00 0x7f2db0108de0 unknown :-1 s=CLOSED pgs=182 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.768 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.770+0000 7f2db7730640 1 -- 192.168.123.107:0/2156545273 >> 192.168.123.107:0/2156545273 conn(0x7f2db00fe700 msgr2=0x7f2db0100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:06.768 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.770+0000 7f2db7730640 1 -- 192.168.123.107:0/2156545273 shutdown_connections 2026-03-09T19:36:06.768 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.770+0000 7f2db7730640 1 -- 192.168.123.107:0/2156545273 wait complete. 2026-03-09T19:36:06.769 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.771+0000 7f2db7730640 1 Processor -- start 2026-03-09T19:36:06.769 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.771+0000 7f2db7730640 1 -- start start 2026-03-09T19:36:06.769 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.771+0000 7f2db7730640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2db0102a00 0x7f2db0075700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:06.769 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.771+0000 7f2db7730640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2db0108a00 0x7f2db0075c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:06.769 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.771+0000 7f2db7730640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2db00796f0 con 0x7f2db0102a00 2026-03-09T19:36:06.769 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.771+0000 7f2db7730640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2db0079860 con 0x7f2db0108a00 2026-03-09T19:36:06.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.771+0000 7f2db54a5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2db0102a00 0x7f2db0075700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:06.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.771+0000 7f2db54a5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2db0102a00 0x7f2db0075700 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:33866/0 (socket says 192.168.123.107:33866) 2026-03-09T19:36:06.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.771+0000 7f2db54a5640 1 -- 192.168.123.107:0/2742954387 learned_addr learned my addr 192.168.123.107:0/2742954387 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:06.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.772+0000 7f2db4ca4640 1 --2- 192.168.123.107:0/2742954387 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2db0108a00 0x7f2db0075c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:06.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.772+0000 7f2db54a5640 1 -- 192.168.123.107:0/2742954387 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2db0108a00 msgr2=0x7f2db0075c40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:06.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.772+0000 7f2db54a5640 1 --2- 
192.168.123.107:0/2742954387 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2db0108a00 0x7f2db0075c40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.772+0000 7f2db54a5640 1 -- 192.168.123.107:0/2742954387 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2da0009660 con 0x7f2db0102a00 2026-03-09T19:36:06.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.772+0000 7f2db4ca4640 1 --2- 192.168.123.107:0/2742954387 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2db0108a00 0x7f2db0075c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T19:36:06.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.772+0000 7f2db54a5640 1 --2- 192.168.123.107:0/2742954387 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2db0102a00 0x7f2db0075700 secure :-1 s=READY pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7f2da0009980 tx=0x7f2da0031d20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:06.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.773+0000 7f2d9e7fc640 1 -- 192.168.123.107:0/2742954387 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2da003d070 con 0x7f2db0102a00 2026-03-09T19:36:06.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.773+0000 7f2d9e7fc640 1 -- 192.168.123.107:0/2742954387 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2da00043d0 con 0x7f2db0102a00 2026-03-09T19:36:06.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.773+0000 7f2d9e7fc640 1 -- 192.168.123.107:0/2742954387 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 
327+0+0 (secure 0 0 0) 0x7f2da0031070 con 0x7f2db0102a00 2026-03-09T19:36:06.772 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.773+0000 7f2db7730640 1 -- 192.168.123.107:0/2742954387 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2db0076180 con 0x7f2db0102a00 2026-03-09T19:36:06.772 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.773+0000 7f2db7730640 1 -- 192.168.123.107:0/2742954387 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2db01a9010 con 0x7f2db0102a00 2026-03-09T19:36:06.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.775+0000 7f2db7730640 1 -- 192.168.123.107:0/2742954387 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2d78005350 con 0x7f2db0102a00 2026-03-09T19:36:06.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.775+0000 7f2d9e7fc640 1 -- 192.168.123.107:0/2742954387 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2da002fa80 con 0x7f2db0102a00 2026-03-09T19:36:06.774 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.776+0000 7f2d9e7fc640 1 --2- 192.168.123.107:0/2742954387 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2d80077680 0x7f2d80079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:06.774 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.777+0000 7f2db4ca4640 1 --2- 192.168.123.107:0/2742954387 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2d80077680 0x7f2d80079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:06.775 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.777+0000 7f2db4ca4640 1 --2- 
192.168.123.107:0/2742954387 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2d80077680 0x7f2d80079b40 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f2da4005f10 tx=0x7f2da4005ea0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:06.775 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.777+0000 7f2d9e7fc640 1 -- 192.168.123.107:0/2742954387 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f2da0031e90 con 0x7f2db0102a00 2026-03-09T19:36:06.778 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.781+0000 7f2d9e7fc640 1 -- 192.168.123.107:0/2742954387 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2da00836a0 con 0x7f2db0102a00 2026-03-09T19:36:06.823 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:06 vm07.local ceph-mon[111841]: pgmap v285: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:06.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.900+0000 7f2db7730640 1 -- 192.168.123.107:0/2742954387 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 24, "format": "json"} v 0) v1 -- 0x7f2d780058d0 con 0x7f2db0102a00 2026-03-09T19:36:06.899 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.902+0000 7f2d9e7fc640 1 -- 192.168.123.107:0/2742954387 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 24, "format": "json"}]=0 dumped fsmap epoch 24 v33) v1 ==== 107+0+5118 (secure 0 0 0) 0x7f2da00836a0 con 0x7f2db0102a00 2026-03-09T19:36:06.900 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:06.900 
INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":24,"btime":"2026-03-09T19:33:57:888255+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34382,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses 
versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44337,"name":"cephfs.vm08.zcaqju","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904856878","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904856878},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904856878}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":24,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:56.919449+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":101,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in 
omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":34378},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34378":{"gid":34378,"name":"cephfs.vm08.jwsqrf","rank":0,"incarnation":22,"state":"up:rejoin","state_seq":8,"addr":"192.168.123.108:6827/2328013860","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2328013860},{"type":"v1","addr":"192.168.123.108:6827","nonce":2328013860}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T19:36:06.900 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 24 2026-03-09T19:36:06.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.904+0000 7f2db7730640 1 -- 192.168.123.107:0/2742954387 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2d80077680 msgr2=0x7f2d80079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:06.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.904+0000 7f2db7730640 1 --2- 192.168.123.107:0/2742954387 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2d80077680 0x7f2d80079b40 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f2da4005f10 tx=0x7f2da4005ea0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.903 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.905+0000 7f2db7730640 1 -- 192.168.123.107:0/2742954387 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2db0102a00 msgr2=0x7f2db0075700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:06.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.905+0000 7f2db7730640 1 --2- 192.168.123.107:0/2742954387 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2db0102a00 0x7f2db0075700 secure :-1 s=READY pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7f2da0009980 tx=0x7f2da0031d20 comp rx=0 tx=0).stop 2026-03-09T19:36:06.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.905+0000 7f2db7730640 1 -- 192.168.123.107:0/2742954387 shutdown_connections 2026-03-09T19:36:06.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.905+0000 7f2db7730640 1 --2- 192.168.123.107:0/2742954387 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2d80077680 0x7f2d80079b40 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.905+0000 7f2db7730640 1 --2- 192.168.123.107:0/2742954387 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2db0108a00 0x7f2db0075c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.905+0000 7f2db7730640 1 --2- 192.168.123.107:0/2742954387 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2db0102a00 0x7f2db0075700 unknown :-1 s=CLOSED pgs=183 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:06.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.905+0000 7f2db7730640 1 -- 192.168.123.107:0/2742954387 >> 192.168.123.107:0/2742954387 conn(0x7f2db00fe700 msgr2=0x7f2db0107640 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T19:36:06.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.905+0000 7f2db7730640 1 -- 192.168.123.107:0/2742954387 shutdown_connections 2026-03-09T19:36:06.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:06.905+0000 7f2db7730640 1 -- 192.168.123.107:0/2742954387 wait complete. 2026-03-09T19:36:06.954 DEBUG:tasks.fs:max_mds reduced in epoch 24 2026-03-09T19:36:06.954 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 25 2026-03-09T19:36:07.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:06 vm08.local ceph-mon[103420]: pgmap v285: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:07.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:06 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/1870584494' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-09T19:36:07.132 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:07.161 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:06 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/1870584494' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-09T19:36:07.398 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.399+0000 7f643ec9b640 1 -- 192.168.123.107:0/173625170 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6438106780 msgr2=0x7f6438106b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:07.399 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.399+0000 7f643ec9b640 1 --2- 192.168.123.107:0/173625170 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6438106780 0x7f6438106b60 secure :-1 s=READY pgs=184 cs=0 l=1 rev1=1 crypto rx=0x7f6428009a00 tx=0x7f642802f280 comp rx=0 tx=0).stop 2026-03-09T19:36:07.399 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.401+0000 7f643ec9b640 1 -- 192.168.123.107:0/173625170 shutdown_connections 2026-03-09T19:36:07.399 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.401+0000 7f643ec9b640 1 --2- 192.168.123.107:0/173625170 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6438100780 0x7f6438100be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:07.399 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.401+0000 7f643ec9b640 1 --2- 192.168.123.107:0/173625170 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6438106780 0x7f6438106b60 unknown :-1 s=CLOSED pgs=184 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:07.399 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.401+0000 7f643ec9b640 1 -- 192.168.123.107:0/173625170 >> 192.168.123.107:0/173625170 conn(0x7f64380fc460 msgr2=0x7f64380fe880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:07.399 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.401+0000 7f643ec9b640 1 -- 192.168.123.107:0/173625170 shutdown_connections 2026-03-09T19:36:07.399 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.401+0000 7f643ec9b640 1 -- 192.168.123.107:0/173625170 wait complete. 2026-03-09T19:36:07.399 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.401+0000 7f643ec9b640 1 Processor -- start 2026-03-09T19:36:07.399 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.402+0000 7f643ec9b640 1 -- start start 2026-03-09T19:36:07.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.402+0000 7f643ec9b640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6438100780 0x7f643806d930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:07.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.402+0000 7f643ec9b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6438106780 0x7f643806de70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:07.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.402+0000 7f643ec9b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f643806e440 con 0x7f6438106780 2026-03-09T19:36:07.400 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.402+0000 7f643ec9b640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f643806e5b0 con 0x7f6438100780 2026-03-09T19:36:07.402 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.402+0000 7f642ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6438106780 0x7f643806de70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.402+0000 7f642ffff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6438106780 0x7f643806de70 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:33886/0 (socket says 192.168.123.107:33886) 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.402+0000 7f642ffff640 1 -- 192.168.123.107:0/16764919 learned_addr learned my addr 192.168.123.107:0/16764919 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.402+0000 7f643ca10640 1 --2- 192.168.123.107:0/16764919 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6438100780 0x7f643806d930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.403+0000 7f643ca10640 1 -- 192.168.123.107:0/16764919 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6438106780 msgr2=0x7f643806de70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.403+0000 7f643ca10640 1 --2- 192.168.123.107:0/16764919 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6438106780 0x7f643806de70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.403+0000 7f643ca10640 1 -- 192.168.123.107:0/16764919 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6428009660 con 0x7f6438100780 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.403+0000 7f643ca10640 1 --2- 192.168.123.107:0/16764919 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6438100780 0x7f643806d930 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f6428009a00 
tx=0x7f6428031d30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.403+0000 7f642dffb640 1 -- 192.168.123.107:0/16764919 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6428002a50 con 0x7f6438100780 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.404+0000 7f643ec9b640 1 -- 192.168.123.107:0/16764919 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f64381b1680 con 0x7f6438100780 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.404+0000 7f643ec9b640 1 -- 192.168.123.107:0/16764919 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f64381b1b70 con 0x7f6438100780 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.404+0000 7f642dffb640 1 -- 192.168.123.107:0/16764919 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f6428031070 con 0x7f6438100780 2026-03-09T19:36:07.403 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.404+0000 7f642dffb640 1 -- 192.168.123.107:0/16764919 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6428038680 con 0x7f6438100780 2026-03-09T19:36:07.404 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.407+0000 7f642dffb640 1 -- 192.168.123.107:0/16764919 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f642803f070 con 0x7f6438100780 2026-03-09T19:36:07.405 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.407+0000 7f642dffb640 1 --2- 192.168.123.107:0/16764919 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f640c0778e0 0x7f640c079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:07.405 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.407+0000 7f642dffb640 1 -- 192.168.123.107:0/16764919 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f64280bf500 con 0x7f6438100780 2026-03-09T19:36:07.410 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.412+0000 7f643ec9b640 1 -- 192.168.123.107:0/16764919 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6400005350 con 0x7f6438100780 2026-03-09T19:36:07.410 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.412+0000 7f642ffff640 1 --2- 192.168.123.107:0/16764919 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f640c0778e0 0x7f640c079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:07.410 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.413+0000 7f642ffff640 1 --2- 192.168.123.107:0/16764919 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f640c0778e0 0x7f640c079da0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f6438075600 tx=0x7f6420004390 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:07.419 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.417+0000 7f642dffb640 1 -- 192.168.123.107:0/16764919 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6428087b30 con 0x7f6438100780 2026-03-09T19:36:07.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.535+0000 7f643ec9b640 1 -- 192.168.123.107:0/16764919 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 25, "format": 
"json"} v 0) v1 -- 0x7f64000051c0 con 0x7f6438100780 2026-03-09T19:36:07.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.537+0000 7f642dffb640 1 -- 192.168.123.107:0/16764919 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 25, "format": "json"}]=0 dumped fsmap epoch 25 v33) v1 ==== 107+0+5127 (secure 0 0 0) 0x7f6428087280 con 0x7f6438100780 2026-03-09T19:36:07.535 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:07.535 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":25,"btime":"2026-03-09T19:33:58:894762+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34382,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":18},{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44337,"name":"cephfs.vm08.zcaqju","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904856878","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904856878},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904856878}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":25,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:33:58.894755+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":101,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":34378},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34378":{"gid":34378,"name":"cephfs.vm08.jwsqrf","rank":0,"incarnation":22,"state":"up:active","state_seq":9,"addr":"192.168.123.108:6827/2328013860","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2328013860},{"type":"v1","addr":"192.168.123.108:6827","nonce":2328013860}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34378,"qdb_cluster":[34378]},"id":1}]} 2026-03-09T19:36:07.535 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 25 2026-03-09T19:36:07.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.540+0000 7f643ec9b640 1 -- 192.168.123.107:0/16764919 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f640c0778e0 msgr2=0x7f640c079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:07.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.540+0000 7f643ec9b640 1 --2- 192.168.123.107:0/16764919 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f640c0778e0 0x7f640c079da0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f6438075600 tx=0x7f6420004390 comp rx=0 tx=0).stop 2026-03-09T19:36:07.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.540+0000 7f643ec9b640 1 -- 192.168.123.107:0/16764919 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6438100780 msgr2=0x7f643806d930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:07.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.540+0000 7f643ec9b640 1 --2- 192.168.123.107:0/16764919 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6438100780 0x7f643806d930 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f6428009a00 tx=0x7f6428031d30 comp rx=0 tx=0).stop 2026-03-09T19:36:07.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.540+0000 7f643ec9b640 1 -- 192.168.123.107:0/16764919 shutdown_connections 2026-03-09T19:36:07.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.540+0000 7f643ec9b640 1 --2- 192.168.123.107:0/16764919 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f640c0778e0 0x7f640c079da0 unknown 
:-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:07.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.540+0000 7f643ec9b640 1 --2- 192.168.123.107:0/16764919 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6438106780 0x7f643806de70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:07.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.540+0000 7f643ec9b640 1 --2- 192.168.123.107:0/16764919 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6438100780 0x7f643806d930 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:07.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.540+0000 7f643ec9b640 1 -- 192.168.123.107:0/16764919 >> 192.168.123.107:0/16764919 conn(0x7f64380fc460 msgr2=0x7f64380fde80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:07.539 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.541+0000 7f643ec9b640 1 -- 192.168.123.107:0/16764919 shutdown_connections 2026-03-09T19:36:07.539 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:07.541+0000 7f643ec9b640 1 -- 192.168.123.107:0/16764919 wait complete. 2026-03-09T19:36:07.595 DEBUG:tasks.fs:max_mds reduced in epoch 25 2026-03-09T19:36:07.595 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 26 2026-03-09T19:36:07.792 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:08.053 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:07 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/2742954387' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-09T19:36:08.053 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:07 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/16764919' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-09T19:36:08.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.075+0000 7f8a194e9640 1 -- 192.168.123.107:0/194770610 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8a14102270 msgr2=0x7f8a1410a760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:08.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.075+0000 7f8a194e9640 1 --2- 192.168.123.107:0/194770610 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8a14102270 0x7f8a1410a760 secure :-1 s=READY pgs=185 cs=0 l=1 rev1=1 crypto rx=0x7f8a080099b0 tx=0x7f8a0802f220 comp rx=0 tx=0).stop 2026-03-09T19:36:08.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.076+0000 7f8a194e9640 1 -- 192.168.123.107:0/194770610 shutdown_connections 2026-03-09T19:36:08.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.076+0000 7f8a194e9640 1 --2- 192.168.123.107:0/194770610 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8a14102270 0x7f8a1410a760 unknown :-1 s=CLOSED pgs=185 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.076+0000 7f8a194e9640 1 --2- 192.168.123.107:0/194770610 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a14101950 0x7f8a14101d30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.076+0000 7f8a194e9640 1 -- 192.168.123.107:0/194770610 >> 192.168.123.107:0/194770610 conn(0x7f8a140fb2f0 
msgr2=0x7f8a140fd710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:08.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.076+0000 7f8a194e9640 1 -- 192.168.123.107:0/194770610 shutdown_connections 2026-03-09T19:36:08.074 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.076+0000 7f8a194e9640 1 -- 192.168.123.107:0/194770610 wait complete. 2026-03-09T19:36:08.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.077+0000 7f8a194e9640 1 Processor -- start 2026-03-09T19:36:08.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.077+0000 7f8a194e9640 1 -- start start 2026-03-09T19:36:08.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.077+0000 7f8a194e9640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8a14101950 0x7f8a14198670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:08.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.078+0000 7f8a194e9640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a14102270 0x7f8a14198bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:08.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.078+0000 7f8a194e9640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a14199240 con 0x7f8a14101950 2026-03-09T19:36:08.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.078+0000 7f8a194e9640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a1419cfb0 con 0x7f8a14102270 2026-03-09T19:36:08.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.078+0000 7f8a127fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a14102270 0x7f8a14198bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:08.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.078+0000 7f8a127fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a14102270 0x7f8a14198bb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:58174/0 (socket says 192.168.123.107:58174) 2026-03-09T19:36:08.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.078+0000 7f8a127fc640 1 -- 192.168.123.107:0/2125023606 learned_addr learned my addr 192.168.123.107:0/2125023606 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:08.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.078+0000 7f8a12ffd640 1 --2- 192.168.123.107:0/2125023606 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8a14101950 0x7f8a14198670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:08.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.079+0000 7f8a127fc640 1 -- 192.168.123.107:0/2125023606 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8a14101950 msgr2=0x7f8a14198670 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:08.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.079+0000 7f8a127fc640 1 --2- 192.168.123.107:0/2125023606 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8a14101950 0x7f8a14198670 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.079+0000 7f8a127fc640 1 -- 192.168.123.107:0/2125023606 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8a08009660 con 0x7f8a14102270 
2026-03-09T19:36:08.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.079+0000 7f8a12ffd640 1 --2- 192.168.123.107:0/2125023606 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8a14101950 0x7f8a14198670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T19:36:08.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.080+0000 7f8a127fc640 1 --2- 192.168.123.107:0/2125023606 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a14102270 0x7f8a14198bb0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f8a08009980 tx=0x7f8a080043d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:08.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.081+0000 7f89f7fff640 1 -- 192.168.123.107:0/2125023606 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8a0803d070 con 0x7f8a14102270 2026-03-09T19:36:08.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.081+0000 7f8a194e9640 1 -- 192.168.123.107:0/2125023606 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8a1419d230 con 0x7f8a14102270 2026-03-09T19:36:08.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.081+0000 7f8a194e9640 1 -- 192.168.123.107:0/2125023606 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8a1419d720 con 0x7f8a14102270 2026-03-09T19:36:08.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.082+0000 7f89f7fff640 1 -- 192.168.123.107:0/2125023606 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f8a0802fc90 con 0x7f8a14102270 2026-03-09T19:36:08.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.082+0000 7f89f7fff640 1 -- 192.168.123.107:0/2125023606 <== mon.1 
v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8a080418c0 con 0x7f8a14102270 2026-03-09T19:36:08.080 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.083+0000 7f89f7fff640 1 -- 192.168.123.107:0/2125023606 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8a0804b430 con 0x7f8a14102270 2026-03-09T19:36:08.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.083+0000 7f89f7fff640 1 --2- 192.168.123.107:0/2125023606 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f89dc0778e0 0x7f89dc079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:08.081 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.083+0000 7f8a12ffd640 1 --2- 192.168.123.107:0/2125023606 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f89dc0778e0 0x7f89dc079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:08.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.084+0000 7f89f7fff640 1 -- 192.168.123.107:0/2125023606 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f8a080be730 con 0x7f8a14102270 2026-03-09T19:36:08.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.084+0000 7f8a12ffd640 1 --2- 192.168.123.107:0/2125023606 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f89dc0778e0 0x7f89dc079da0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f8a000046b0 tx=0x7f8a00009210 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:08.082 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.084+0000 7f8a194e9640 1 -- 192.168.123.107:0/2125023606 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8a14069700 con 0x7f8a14102270 2026-03-09T19:36:08.085 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.087+0000 7f89f7fff640 1 -- 192.168.123.107:0/2125023606 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8a0802fe00 con 0x7f8a14102270 2026-03-09T19:36:08.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:07 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/2742954387' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-09T19:36:08.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:07 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/16764919' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-09T19:36:08.208 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.210+0000 7f8a194e9640 1 -- 192.168.123.107:0/2125023606 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 26, "format": "json"} v 0) v1 -- 0x7f8a14199a00 con 0x7f8a14102270 2026-03-09T19:36:08.210 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.211+0000 7f89f7fff640 1 -- 192.168.123.107:0/2125023606 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 26, "format": "json"}]=0 dumped fsmap epoch 26 v33) v1 ==== 107+0+4324 (secure 0 0 0) 0x7f8a0802fe00 con 0x7f8a14102270 2026-03-09T19:36:08.210 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:08.210 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":26,"btime":"2026-03-09T19:34:01:467500+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on 
dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34382,"name":"cephfs.vm07.uizncw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":20},{"gid":44337,"name":"cephfs.vm08.zcaqju","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904856878","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904856878},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904856878}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":26,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:34:01.467499+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":102,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[1],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T19:36:08.210 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 26 2026-03-09T19:36:08.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.215+0000 7f8a194e9640 1 -- 192.168.123.107:0/2125023606 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f89dc0778e0 msgr2=0x7f89dc079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:08.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.215+0000 7f8a194e9640 1 --2- 192.168.123.107:0/2125023606 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f89dc0778e0 0x7f89dc079da0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f8a000046b0 tx=0x7f8a00009210 comp rx=0 tx=0).stop 2026-03-09T19:36:08.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.215+0000 7f8a194e9640 1 -- 192.168.123.107:0/2125023606 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a14102270 msgr2=0x7f8a14198bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:08.213 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.215+0000 7f8a194e9640 1 --2- 192.168.123.107:0/2125023606 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a14102270 0x7f8a14198bb0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f8a08009980 tx=0x7f8a080043d0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.214 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.216+0000 7f8a194e9640 1 -- 192.168.123.107:0/2125023606 shutdown_connections 2026-03-09T19:36:08.214 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.216+0000 7f8a194e9640 1 --2- 192.168.123.107:0/2125023606 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f89dc0778e0 0x7f89dc079da0 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.214 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.216+0000 7f8a194e9640 1 --2- 192.168.123.107:0/2125023606 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a14102270 0x7f8a14198bb0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.214 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.216+0000 7f8a194e9640 1 --2- 192.168.123.107:0/2125023606 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8a14101950 0x7f8a14198670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.214 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.216+0000 7f8a194e9640 1 -- 192.168.123.107:0/2125023606 >> 192.168.123.107:0/2125023606 conn(0x7f8a140fb2f0 msgr2=0x7f8a14109d50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:08.214 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.216+0000 7f8a194e9640 1 -- 192.168.123.107:0/2125023606 shutdown_connections 2026-03-09T19:36:08.214 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.217+0000 7f8a194e9640 1 -- 192.168.123.107:0/2125023606 wait complete. 
2026-03-09T19:36:08.289 DEBUG:tasks.fs:max_mds reduced in epoch 26 2026-03-09T19:36:08.289 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 27 2026-03-09T19:36:08.464 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:08.753 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.754+0000 7f57c4458640 1 -- 192.168.123.107:0/1465363669 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57bc0ffea0 msgr2=0x7f57bc10cd70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:08.753 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.754+0000 7f57c4458640 1 --2- 192.168.123.107:0/1465363669 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57bc0ffea0 0x7f57bc10cd70 secure :-1 s=READY pgs=186 cs=0 l=1 rev1=1 crypto rx=0x7f57ac0099b0 tx=0x7f57ac02f220 comp rx=0 tx=0).stop 2026-03-09T19:36:08.753 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.755+0000 7f57c4458640 1 -- 192.168.123.107:0/1465363669 shutdown_connections 2026-03-09T19:36:08.753 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.755+0000 7f57c4458640 1 --2- 192.168.123.107:0/1465363669 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57bc0ffea0 0x7f57bc10cd70 unknown :-1 s=CLOSED pgs=186 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.753 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.755+0000 7f57c4458640 1 --2- 192.168.123.107:0/1465363669 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f57bc0ff580 0x7f57bc0ff960 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.753 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.755+0000 7f57c4458640 1 -- 
192.168.123.107:0/1465363669 >> 192.168.123.107:0/1465363669 conn(0x7f57bc0fb3d0 msgr2=0x7f57bc0fd7f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:08.753 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.755+0000 7f57c4458640 1 -- 192.168.123.107:0/1465363669 shutdown_connections 2026-03-09T19:36:08.753 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.755+0000 7f57c4458640 1 -- 192.168.123.107:0/1465363669 wait complete. 2026-03-09T19:36:08.754 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.756+0000 7f57c4458640 1 Processor -- start 2026-03-09T19:36:08.754 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.756+0000 7f57c4458640 1 -- start start 2026-03-09T19:36:08.754 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.757+0000 7f57c4458640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f57bc0ff580 0x7f57bc1a06d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:08.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.757+0000 7f57c4458640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57bc0ffea0 0x7f57bc1a0c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:08.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.757+0000 7f57c19cc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57bc0ffea0 0x7f57bc1a0c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:08.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.757+0000 7f57c19cc640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57bc0ffea0 0x7f57bc1a0c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:55072/0 (socket says 192.168.123.107:55072) 2026-03-09T19:36:08.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.757+0000 7f57c19cc640 1 -- 192.168.123.107:0/4253789420 learned_addr learned my addr 192.168.123.107:0/4253789420 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:08.755 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.757+0000 7f57c21cd640 1 --2- 192.168.123.107:0/4253789420 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f57bc0ff580 0x7f57bc1a06d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:08.756 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.758+0000 7f57c4458640 1 -- 192.168.123.107:0/4253789420 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57bc1a1260 con 0x7f57bc0ffea0 2026-03-09T19:36:08.756 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.758+0000 7f57c4458640 1 -- 192.168.123.107:0/4253789420 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57bc19a7c0 con 0x7f57bc0ff580 2026-03-09T19:36:08.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.758+0000 7f57c19cc640 1 -- 192.168.123.107:0/4253789420 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f57bc0ff580 msgr2=0x7f57bc1a06d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:08.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.758+0000 7f57c19cc640 1 --2- 192.168.123.107:0/4253789420 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f57bc0ff580 0x7f57bc1a06d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.758+0000 7f57c19cc640 1 -- 192.168.123.107:0/4253789420 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f57ac009660 con 0x7f57bc0ffea0 2026-03-09T19:36:08.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.758+0000 7f57c19cc640 1 --2- 192.168.123.107:0/4253789420 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57bc0ffea0 0x7f57bc1a0c10 secure :-1 s=READY pgs=187 cs=0 l=1 rev1=1 crypto rx=0x7f57ac002410 tx=0x7f57ac004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:08.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.759+0000 7f57ab7fe640 1 -- 192.168.123.107:0/4253789420 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f57ac03d070 con 0x7f57bc0ffea0 2026-03-09T19:36:08.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.759+0000 7f57ab7fe640 1 -- 192.168.123.107:0/4253789420 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f57ac0043b0 con 0x7f57bc0ffea0 2026-03-09T19:36:08.757 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.759+0000 7f57c4458640 1 -- 192.168.123.107:0/4253789420 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f57bc19aa40 con 0x7f57bc0ffea0 2026-03-09T19:36:08.758 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.759+0000 7f57c4458640 1 -- 192.168.123.107:0/4253789420 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f57bc19af30 con 0x7f57bc0ffea0 2026-03-09T19:36:08.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.760+0000 7f57ab7fe640 1 -- 192.168.123.107:0/4253789420 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f57ac0388e0 con 0x7f57bc0ffea0 2026-03-09T19:36:08.759 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.761+0000 7f57c4458640 1 -- 
192.168.123.107:0/4253789420 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f578c005350 con 0x7f57bc0ffea0 2026-03-09T19:36:08.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.762+0000 7f57ab7fe640 1 -- 192.168.123.107:0/4253789420 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f57ac048020 con 0x7f57bc0ffea0 2026-03-09T19:36:08.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.762+0000 7f57ab7fe640 1 --2- 192.168.123.107:0/4253789420 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f57900778e0 0x7f5790079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:08.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.762+0000 7f57ab7fe640 1 -- 192.168.123.107:0/4253789420 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f57ac0be360 con 0x7f57bc0ffea0 2026-03-09T19:36:08.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.765+0000 7f57ab7fe640 1 -- 192.168.123.107:0/4253789420 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f57ac086910 con 0x7f57bc0ffea0 2026-03-09T19:36:08.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.765+0000 7f57c21cd640 1 --2- 192.168.123.107:0/4253789420 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f57900778e0 0x7f5790079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:08.763 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.765+0000 7f57c21cd640 1 --2- 192.168.123.107:0/4253789420 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] 
conn(0x7f57900778e0 0x7f5790079da0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f57b0005f10 tx=0x7f57b0005ea0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:08.836 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:08 vm07.local ceph-mon[111841]: pgmap v286: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:08.894 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.895+0000 7f57c4458640 1 -- 192.168.123.107:0/4253789420 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 27, "format": "json"} v 0) v1 -- 0x7f578c0051c0 con 0x7f57bc0ffea0 2026-03-09T19:36:08.894 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.896+0000 7f57ab7fe640 1 -- 192.168.123.107:0/4253789420 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 27, "format": "json"}]=0 dumped fsmap epoch 27 v33) v1 ==== 107+0+4403 (secure 0 0 0) 0x7f57ac086060 con 0x7f57bc0ffea0 2026-03-09T19:36:08.895 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:08.895 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":27,"btime":"2026-03-09T19:34:01:739554+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44337,"name":"cephfs.vm08.zcaqju","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904856878","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904856878},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904856878}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":27,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:34:01.739525+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":102,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34382},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34382":{"gid":34382,"name":"cephfs.vm07.uizncw","rank":0,"incarnation":27,"state":"up:replay","state_seq":1,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T19:36:08.895 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 27 2026-03-09T19:36:08.897 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.899+0000 7f57c4458640 1 -- 192.168.123.107:0/4253789420 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f57900778e0 msgr2=0x7f5790079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:08.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.899+0000 7f57c4458640 1 --2- 192.168.123.107:0/4253789420 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f57900778e0 0x7f5790079da0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f57b0005f10 tx=0x7f57b0005ea0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.899+0000 7f57c4458640 1 -- 192.168.123.107:0/4253789420 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57bc0ffea0 msgr2=0x7f57bc1a0c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:08.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.899+0000 7f57c4458640 1 --2- 192.168.123.107:0/4253789420 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57bc0ffea0 0x7f57bc1a0c10 secure :-1 s=READY pgs=187 cs=0 l=1 rev1=1 crypto rx=0x7f57ac002410 tx=0x7f57ac004290 comp rx=0 tx=0).stop 2026-03-09T19:36:08.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.900+0000 7f57c4458640 1 -- 192.168.123.107:0/4253789420 shutdown_connections 2026-03-09T19:36:08.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.900+0000 7f57c4458640 1 --2- 192.168.123.107:0/4253789420 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] 
conn(0x7f57900778e0 0x7f5790079da0 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.900+0000 7f57c4458640 1 --2- 192.168.123.107:0/4253789420 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57bc0ffea0 0x7f57bc1a0c10 unknown :-1 s=CLOSED pgs=187 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.900+0000 7f57c4458640 1 --2- 192.168.123.107:0/4253789420 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f57bc0ff580 0x7f57bc1a06d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:08.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.900+0000 7f57c4458640 1 -- 192.168.123.107:0/4253789420 >> 192.168.123.107:0/4253789420 conn(0x7f57bc0fb3d0 msgr2=0x7f57bc0fbbd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:08.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.900+0000 7f57c4458640 1 -- 192.168.123.107:0/4253789420 shutdown_connections 2026-03-09T19:36:08.898 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:08.900+0000 7f57c4458640 1 -- 192.168.123.107:0/4253789420 wait complete. 2026-03-09T19:36:08.969 DEBUG:tasks.fs:max_mds reduced in epoch 27 2026-03-09T19:36:08.969 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 28 2026-03-09T19:36:09.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:08 vm08.local ceph-mon[103420]: pgmap v286: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:09.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:08 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/2125023606' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-09T19:36:09.146 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:09.176 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:08 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/2125023606' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-09T19:36:09.425 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.426+0000 7f9de8868640 1 -- 192.168.123.107:0/274612907 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9de0108a00 msgr2=0x7f9de0108de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:09.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.426+0000 7f9de8868640 1 --2- 192.168.123.107:0/274612907 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9de0108a00 0x7f9de0108de0 secure :-1 s=READY pgs=188 cs=0 l=1 rev1=1 crypto rx=0x7f9dd80099b0 tx=0x7f9dd802f260 comp rx=0 tx=0).stop 2026-03-09T19:36:09.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.428+0000 7f9de8868640 1 -- 192.168.123.107:0/274612907 shutdown_connections 2026-03-09T19:36:09.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.428+0000 7f9de8868640 1 --2- 192.168.123.107:0/274612907 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9de0102a00 0x7f9de0102e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:09.426 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.428+0000 7f9de8868640 1 --2- 192.168.123.107:0/274612907 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9de0108a00 0x7f9de0108de0 unknown :-1 s=CLOSED pgs=188 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:09.426 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.428+0000 7f9de8868640 1 -- 192.168.123.107:0/274612907 >> 192.168.123.107:0/274612907 conn(0x7f9de00fe700 msgr2=0x7f9de0100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:09.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.430+0000 7f9de8868640 1 -- 192.168.123.107:0/274612907 shutdown_connections 2026-03-09T19:36:09.427 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.430+0000 7f9de8868640 1 -- 192.168.123.107:0/274612907 wait complete. 2026-03-09T19:36:09.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.430+0000 7f9de8868640 1 Processor -- start 2026-03-09T19:36:09.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.430+0000 7f9de8868640 1 -- start start 2026-03-09T19:36:09.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.431+0000 7f9de8868640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9de0102a00 0x7f9de01a0670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:09.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.431+0000 7f9de65dd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9de0102a00 0x7f9de01a0670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:09.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.431+0000 7f9de65dd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9de0102a00 0x7f9de01a0670 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:55084/0 (socket says 192.168.123.107:55084) 2026-03-09T19:36:09.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.431+0000 7f9de8868640 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9de0108a00 0x7f9de01a0bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:09.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.431+0000 7f9de8868640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9de019a760 con 0x7f9de0102a00 2026-03-09T19:36:09.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.431+0000 7f9de8868640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9de019a8d0 con 0x7f9de0108a00 2026-03-09T19:36:09.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.431+0000 7f9de65dd640 1 -- 192.168.123.107:0/3256866141 learned_addr learned my addr 192.168.123.107:0/3256866141 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:09.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.431+0000 7f9de65dd640 1 -- 192.168.123.107:0/3256866141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9de0108a00 msgr2=0x7f9de01a0bb0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:36:09.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.432+0000 7f9de65dd640 1 --2- 192.168.123.107:0/3256866141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9de0108a00 0x7f9de01a0bb0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:09.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.432+0000 7f9de65dd640 1 -- 192.168.123.107:0/3256866141 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9dd8009660 con 0x7f9de0102a00 2026-03-09T19:36:09.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.432+0000 7f9de65dd640 1 --2- 192.168.123.107:0/3256866141 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9de0102a00 
0x7f9de01a0670 secure :-1 s=READY pgs=189 cs=0 l=1 rev1=1 crypto rx=0x7f9dd802f770 tx=0x7f9dd80043d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:09.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.432+0000 7f9dcb7fe640 1 -- 192.168.123.107:0/3256866141 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9dd803d070 con 0x7f9de0102a00 2026-03-09T19:36:09.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.432+0000 7f9dcb7fe640 1 -- 192.168.123.107:0/3256866141 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f9dd802fc90 con 0x7f9de0102a00 2026-03-09T19:36:09.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.432+0000 7f9de8868640 1 -- 192.168.123.107:0/3256866141 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9de019ab50 con 0x7f9de0102a00 2026-03-09T19:36:09.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.433+0000 7f9de8868640 1 -- 192.168.123.107:0/3256866141 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9de019b040 con 0x7f9de0102a00 2026-03-09T19:36:09.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.434+0000 7f9de8868640 1 -- 192.168.123.107:0/3256866141 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9de0104140 con 0x7f9de0102a00 2026-03-09T19:36:09.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.436+0000 7f9dcb7fe640 1 -- 192.168.123.107:0/3256866141 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9dd80417d0 con 0x7f9de0102a00 2026-03-09T19:36:09.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.436+0000 7f9dcb7fe640 1 -- 192.168.123.107:0/3256866141 <== mon.0 
v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9dd804b430 con 0x7f9de0102a00 2026-03-09T19:36:09.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.436+0000 7f9dcb7fe640 1 --2- 192.168.123.107:0/3256866141 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9dbc077a80 0x7f9dbc079f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:09.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.436+0000 7f9dcb7fe640 1 -- 192.168.123.107:0/3256866141 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f9dd80bf710 con 0x7f9de0102a00 2026-03-09T19:36:09.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.437+0000 7f9dcb7fe640 1 -- 192.168.123.107:0/3256866141 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9dd80876f0 con 0x7f9de0102a00 2026-03-09T19:36:09.436 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.438+0000 7f9de5ddc640 1 --2- 192.168.123.107:0/3256866141 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9dbc077a80 0x7f9dbc079f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:09.436 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.438+0000 7f9de5ddc640 1 --2- 192.168.123.107:0/3256866141 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9dbc077a80 0x7f9dbc079f40 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f9de019bbd0 tx=0x7f9dcc006d20 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:09.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.552+0000 7f9de8868640 1 -- 
192.168.123.107:0/3256866141 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 28, "format": "json"} v 0) v1 -- 0x7f9de019bd10 con 0x7f9de0102a00 2026-03-09T19:36:09.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.552+0000 7f9dcb7fe640 1 -- 192.168.123.107:0/3256866141 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 28, "format": "json"}]=0 dumped fsmap epoch 28 v33) v1 ==== 107+0+5254 (secure 0 0 0) 0x7f9dd8087510 con 0x7f9de0102a00 2026-03-09T19:36:09.551 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:09.551 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":28,"btime":"2026-03-09T19:34:04:710094+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":34410,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2173796097","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2173796097},{"type":"v1","addr":"192.168.123.108:6827","nonce":2173796097}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":28},{"gid":44337,"name":"cephfs.vm08.zcaqju","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904856878","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904856878},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904856878}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":27,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:34:01.739525+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":102,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34382},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34382":{"gid":34382,"name":"cephfs.vm07.uizncw","rank":0,"incarnation":27,"state":"up:replay","state_seq":1,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T19:36:09.551 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 28 2026-03-09T19:36:09.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.555+0000 7f9de8868640 1 -- 192.168.123.107:0/3256866141 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9dbc077a80 msgr2=0x7f9dbc079f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:09.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.555+0000 7f9de8868640 1 --2- 192.168.123.107:0/3256866141 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f9dbc077a80 0x7f9dbc079f40 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f9de019bbd0 tx=0x7f9dcc006d20 comp rx=0 tx=0).stop 2026-03-09T19:36:09.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.555+0000 7f9de8868640 1 -- 192.168.123.107:0/3256866141 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9de0102a00 msgr2=0x7f9de01a0670 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:09.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.555+0000 7f9de8868640 1 --2- 192.168.123.107:0/3256866141 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9de0102a00 0x7f9de01a0670 secure :-1 s=READY pgs=189 cs=0 l=1 rev1=1 crypto rx=0x7f9dd802f770 tx=0x7f9dd80043d0 comp rx=0 tx=0).stop 2026-03-09T19:36:09.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.556+0000 7f9de8868640 1 -- 192.168.123.107:0/3256866141 shutdown_connections 2026-03-09T19:36:09.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.556+0000 7f9de8868640 1 --2- 192.168.123.107:0/3256866141 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] 
conn(0x7f9dbc077a80 0x7f9dbc079f40 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:09.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.556+0000 7f9de8868640 1 --2- 192.168.123.107:0/3256866141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9de0108a00 0x7f9de01a0bb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:09.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.556+0000 7f9de8868640 1 --2- 192.168.123.107:0/3256866141 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9de0102a00 0x7f9de01a0670 unknown :-1 s=CLOSED pgs=189 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:09.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.556+0000 7f9de8868640 1 -- 192.168.123.107:0/3256866141 >> 192.168.123.107:0/3256866141 conn(0x7f9de00fe700 msgr2=0x7f9de00feae0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:09.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.556+0000 7f9de8868640 1 -- 192.168.123.107:0/3256866141 shutdown_connections 2026-03-09T19:36:09.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:09.557+0000 7f9de8868640 1 -- 192.168.123.107:0/3256866141 wait complete. 2026-03-09T19:36:09.632 DEBUG:tasks.fs:max_mds reduced in epoch 28 2026-03-09T19:36:09.632 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 29 2026-03-09T19:36:09.801 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:10.094 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:09 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/4253789420' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-09T19:36:10.094 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:09 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/3256866141' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-09T19:36:10.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.096+0000 7fb97a7ac640 1 -- 192.168.123.107:0/3587147639 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb974075720 msgr2=0x7fb974075b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:10.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.096+0000 7fb97a7ac640 1 --2- 192.168.123.107:0/3587147639 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb974075720 0x7fb974075b00 secure :-1 s=READY pgs=190 cs=0 l=1 rev1=1 crypto rx=0x7fb9600099b0 tx=0x7fb96002f220 comp rx=0 tx=0).stop 2026-03-09T19:36:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:09 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/4253789420' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-09T19:36:10.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:09 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/3256866141' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-09T19:36:10.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.098+0000 7fb97a7ac640 1 -- 192.168.123.107:0/3587147639 shutdown_connections 2026-03-09T19:36:10.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.098+0000 7fb97a7ac640 1 --2- 192.168.123.107:0/3587147639 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb974076040 0x7fb974111330 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.098+0000 7fb97a7ac640 1 --2- 192.168.123.107:0/3587147639 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb974075720 0x7fb974075b00 unknown :-1 s=CLOSED pgs=190 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.098+0000 7fb97a7ac640 1 -- 192.168.123.107:0/3587147639 >> 192.168.123.107:0/3587147639 conn(0x7fb9740fe710 msgr2=0x7fb974100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:10.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.098+0000 7fb97a7ac640 1 -- 192.168.123.107:0/3587147639 shutdown_connections 2026-03-09T19:36:10.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.098+0000 7fb97a7ac640 1 -- 192.168.123.107:0/3587147639 wait complete. 
2026-03-09T19:36:10.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.099+0000 7fb97a7ac640 1 Processor -- start 2026-03-09T19:36:10.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.099+0000 7fb97a7ac640 1 -- start start 2026-03-09T19:36:10.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.099+0000 7fb97a7ac640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb974075720 0x7fb97419ee10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:10.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.099+0000 7fb97a7ac640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb974076040 0x7fb97419f350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:10.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.099+0000 7fb97a7ac640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb97419f9e0 con 0x7fb974075720 2026-03-09T19:36:10.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.099+0000 7fb97a7ac640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9741a3750 con 0x7fb974076040 2026-03-09T19:36:10.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.100+0000 7fb973fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb974075720 0x7fb97419ee10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:10.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.100+0000 7fb973fff640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb974075720 0x7fb97419ee10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:55102/0 (socket says 192.168.123.107:55102) 2026-03-09T19:36:10.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.100+0000 7fb973fff640 1 -- 192.168.123.107:0/2922454590 learned_addr learned my addr 192.168.123.107:0/2922454590 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:10.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.100+0000 7fb973fff640 1 -- 192.168.123.107:0/2922454590 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb974076040 msgr2=0x7fb97419f350 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:36:10.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.100+0000 7fb9737fe640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb974076040 0x7fb97419f350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:10.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.100+0000 7fb973fff640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb974076040 0x7fb97419f350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.100+0000 7fb973fff640 1 -- 192.168.123.107:0/2922454590 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb960009660 con 0x7fb974075720 2026-03-09T19:36:10.098 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.101+0000 7fb9737fe640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb974076040 0x7fb97419f350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T19:36:10.099 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.101+0000 7fb973fff640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb974075720 0x7fb97419ee10 secure :-1 s=READY pgs=191 cs=0 l=1 rev1=1 crypto rx=0x7fb960002410 tx=0x7fb960004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:10.099 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.101+0000 7fb9717fa640 1 -- 192.168.123.107:0/2922454590 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb96003d070 con 0x7fb974075720 2026-03-09T19:36:10.099 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.101+0000 7fb9717fa640 1 -- 192.168.123.107:0/2922454590 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fb9600043b0 con 0x7fb974075720 2026-03-09T19:36:10.099 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.101+0000 7fb9717fa640 1 -- 192.168.123.107:0/2922454590 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb960041880 con 0x7fb974075720 2026-03-09T19:36:10.099 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.102+0000 7fb97a7ac640 1 -- 192.168.123.107:0/2922454590 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb9741a39d0 con 0x7fb974075720 2026-03-09T19:36:10.100 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.102+0000 7fb97a7ac640 1 -- 192.168.123.107:0/2922454590 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb9741a3ec0 con 0x7fb974075720 2026-03-09T19:36:10.102 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.104+0000 7fb9717fa640 1 -- 192.168.123.107:0/2922454590 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb96004b430 con 
0x7fb974075720 2026-03-09T19:36:10.103 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.104+0000 7fb9717fa640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb95c077890 0x7fb95c079d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:10.103 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.104+0000 7fb9717fa640 1 -- 192.168.123.107:0/2922454590 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fb9600bf5f0 con 0x7fb974075720 2026-03-09T19:36:10.103 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.105+0000 7fb9737fe640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb95c077890 0x7fb95c079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:10.103 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.105+0000 7fb97a7ac640 1 -- 192.168.123.107:0/2922454590 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb974076e60 con 0x7fb974075720 2026-03-09T19:36:10.103 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.105+0000 7fb9737fe640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb95c077890 0x7fb95c079d50 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fb9741a03c0 tx=0x7fb964005e30 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:10.107 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.109+0000 7fb9717fa640 1 -- 192.168.123.107:0/2922454590 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) 
v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb960087cc0 con 0x7fb974075720 2026-03-09T19:36:10.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.232+0000 7fb97a7ac640 1 -- 192.168.123.107:0/2922454590 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 29, "format": "json"} v 0) v1 -- 0x7fb974075b00 con 0x7fb974075720 2026-03-09T19:36:10.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.234+0000 7fb9717fa640 1 -- 192.168.123.107:0/2922454590 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 29, "format": "json"}]=0 dumped fsmap epoch 29 v33) v1 ==== 107+0+5257 (secure 0 0 0) 0x7fb960046090 con 0x7fb974075720 2026-03-09T19:36:10.232 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:10.232 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":29,"btime":"2026-03-09T19:34:06:718790+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":34410,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2173796097","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2173796097},{"type":"v1","addr":"192.168.123.108:6827","nonce":2173796097}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":28},{"gid":44337,"name":"cephfs.vm08.zcaqju","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904856878","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904856878},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904856878}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":29,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:34:05.892012+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":102,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34382},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34382":{"gid":34382,"name":"cephfs.vm07.uizncw","rank":0,"incarnation":27,"state":"up:reconnect","state_seq":7,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T19:36:10.232 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 29 2026-03-09T19:36:10.235 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.237+0000 7fb97a7ac640 1 -- 192.168.123.107:0/2922454590 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb95c077890 msgr2=0x7fb95c079d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:10.235 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.237+0000 7fb97a7ac640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fb95c077890 0x7fb95c079d50 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fb9741a03c0 tx=0x7fb964005e30 comp rx=0 tx=0).stop 2026-03-09T19:36:10.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.238+0000 7fb97a7ac640 1 -- 192.168.123.107:0/2922454590 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb974075720 msgr2=0x7fb97419ee10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:10.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.238+0000 7fb97a7ac640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb974075720 0x7fb97419ee10 secure :-1 s=READY pgs=191 cs=0 l=1 rev1=1 crypto rx=0x7fb960002410 tx=0x7fb960004290 comp rx=0 tx=0).stop 2026-03-09T19:36:10.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.238+0000 7fb97a7ac640 1 -- 192.168.123.107:0/2922454590 shutdown_connections 2026-03-09T19:36:10.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.238+0000 7fb97a7ac640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] 
conn(0x7fb95c077890 0x7fb95c079d50 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.238+0000 7fb97a7ac640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb974076040 0x7fb97419f350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.238+0000 7fb97a7ac640 1 --2- 192.168.123.107:0/2922454590 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb974075720 0x7fb97419ee10 unknown :-1 s=CLOSED pgs=191 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.236 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.238+0000 7fb97a7ac640 1 -- 192.168.123.107:0/2922454590 >> 192.168.123.107:0/2922454590 conn(0x7fb9740fe710 msgr2=0x7fb9740ffe60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:10.237 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.239+0000 7fb97a7ac640 1 -- 192.168.123.107:0/2922454590 shutdown_connections 2026-03-09T19:36:10.237 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.239+0000 7fb97a7ac640 1 -- 192.168.123.107:0/2922454590 wait complete. 
2026-03-09T19:36:10.317 DEBUG:tasks.fs:max_mds reduced in epoch 29 2026-03-09T19:36:10.317 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 30 2026-03-09T19:36:10.490 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:10.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.788+0000 7f25a71ee640 1 -- 192.168.123.107:0/4181448738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f25a0072ad0 msgr2=0x7f25a010b9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:10.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.788+0000 7f25a71ee640 1 --2- 192.168.123.107:0/4181448738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f25a0072ad0 0x7f25a010b9a0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f259800b0a0 tx=0x7f259802f4c0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.788 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.790+0000 7f25a71ee640 1 -- 192.168.123.107:0/4181448738 shutdown_connections 2026-03-09T19:36:10.788 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.790+0000 7f25a71ee640 1 --2- 192.168.123.107:0/4181448738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f25a0072ad0 0x7f25a010b9a0 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.788 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.790+0000 7f25a71ee640 1 --2- 192.168.123.107:0/4181448738 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a0072120 0x7f25a0072500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.788 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.790+0000 7f25a71ee640 1 -- 
192.168.123.107:0/4181448738 >> 192.168.123.107:0/4181448738 conn(0x7f25a006c7d0 msgr2=0x7f25a006cbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:10.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.790+0000 7f25a71ee640 1 -- 192.168.123.107:0/4181448738 shutdown_connections 2026-03-09T19:36:10.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.790+0000 7f25a71ee640 1 -- 192.168.123.107:0/4181448738 wait complete. 2026-03-09T19:36:10.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a71ee640 1 Processor -- start 2026-03-09T19:36:10.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a71ee640 1 -- start start 2026-03-09T19:36:10.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a71ee640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a0072120 0x7f25a007d4c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:10.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a71ee640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f25a007da00 0x7f25a007de60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:10.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a71ee640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25a0084470 con 0x7f25a0072120 2026-03-09T19:36:10.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a71ee640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25a00845e0 con 0x7f25a007da00 2026-03-09T19:36:10.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a61ec640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a0072120 0x7f25a007d4c0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:10.790 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a61ec640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a0072120 0x7f25a007d4c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:55120/0 (socket says 192.168.123.107:55120) 2026-03-09T19:36:10.791 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a61ec640 1 -- 192.168.123.107:0/1924325286 learned_addr learned my addr 192.168.123.107:0/1924325286 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:10.791 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a61ec640 1 -- 192.168.123.107:0/1924325286 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f25a007da00 msgr2=0x7f25a007de60 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T19:36:10.791 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a61ec640 1 --2- 192.168.123.107:0/1924325286 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f25a007da00 0x7f25a007de60 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.791 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a61ec640 1 -- 192.168.123.107:0/1924325286 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f259c009590 con 0x7f25a0072120 2026-03-09T19:36:10.791 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.791+0000 7f25a61ec640 1 --2- 192.168.123.107:0/1924325286 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a0072120 0x7f25a007d4c0 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7f259c0029c0 tx=0x7f259c002e90 comp rx=0 
tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:10.791 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.793+0000 7f25977fe640 1 -- 192.168.123.107:0/1924325286 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f259c00ebd0 con 0x7f25a0072120 2026-03-09T19:36:10.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.793+0000 7f25a71ee640 1 -- 192.168.123.107:0/1924325286 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2598009d00 con 0x7f25a0072120 2026-03-09T19:36:10.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.793+0000 7f25a71ee640 1 -- 192.168.123.107:0/1924325286 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f25a0082400 con 0x7f25a0072120 2026-03-09T19:36:10.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.795+0000 7f25a71ee640 1 -- 192.168.123.107:0/1924325286 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2578009860 con 0x7f25a0072120 2026-03-09T19:36:10.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.800+0000 7f25977fe640 1 -- 192.168.123.107:0/1924325286 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f259c00ed30 con 0x7f25a0072120 2026-03-09T19:36:10.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.800+0000 7f25977fe640 1 -- 192.168.123.107:0/1924325286 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f259c0187b0 con 0x7f25a0072120 2026-03-09T19:36:10.798 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.800+0000 7f25977fe640 1 -- 192.168.123.107:0/1924325286 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f259c016020 con 0x7f25a0072120 
2026-03-09T19:36:10.799 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.802+0000 7f25977fe640 1 --2- 192.168.123.107:0/1924325286 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2574077a80 0x7f2574079f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:10.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.802+0000 7f25977fe640 1 -- 192.168.123.107:0/1924325286 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f259c09b7b0 con 0x7f25a0072120 2026-03-09T19:36:10.800 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.802+0000 7f25a59eb640 1 --2- 192.168.123.107:0/1924325286 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2574077a80 0x7f2574079f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:10.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.803+0000 7f25a59eb640 1 --2- 192.168.123.107:0/1924325286 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2574077a80 0x7f2574079f40 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f25a010db60 tx=0x7f2598002750 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:10.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.803+0000 7f25977fe640 1 -- 192.168.123.107:0/1924325286 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f259c063e60 con 0x7f25a0072120 2026-03-09T19:36:10.924 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:10 vm07.local ceph-mon[111841]: pgmap v287: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:10.924 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:10 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/2922454590' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-09T19:36:10.924 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.925+0000 7f25a71ee640 1 -- 192.168.123.107:0/1924325286 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 30, "format": "json"} v 0) v1 -- 0x7f2578009de0 con 0x7f25a0072120 2026-03-09T19:36:10.925 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.927+0000 7f25977fe640 1 -- 192.168.123.107:0/1924325286 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 30, "format": "json"}]=0 dumped fsmap epoch 30 v33) v1 ==== 107+0+5254 (secure 0 0 0) 0x7f259c0635b0 con 0x7f25a0072120 2026-03-09T19:36:10.925 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:10.926 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":30,"btime":"2026-03-09T19:34:07:721454+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base 
v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":34410,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2173796097","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2173796097},{"type":"v1","addr":"192.168.123.108:6827","nonce":2173796097}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":28},{"gid":44337,"name":"cephfs.vm08.zcaqju","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904856878","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904856878},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904856878}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file 
layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":30,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:34:07.543631+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":102,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":2,"in":[0],"up":{"mds_0":34382},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34382":{"gid":34382,"name":"cephfs.vm07.uizncw","rank":0,"incarnation":27,"state":"up:rejoin","state_seq":8,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor 
table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T19:36:10.926 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 30 2026-03-09T19:36:10.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.931+0000 7f25a71ee640 1 -- 192.168.123.107:0/1924325286 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2574077a80 msgr2=0x7f2574079f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:10.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.931+0000 7f25a71ee640 1 --2- 192.168.123.107:0/1924325286 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2574077a80 0x7f2574079f40 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f25a010db60 tx=0x7f2598002750 comp rx=0 tx=0).stop 2026-03-09T19:36:10.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.931+0000 7f25a71ee640 1 -- 192.168.123.107:0/1924325286 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a0072120 msgr2=0x7f25a007d4c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:10.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.931+0000 7f25a71ee640 1 --2- 192.168.123.107:0/1924325286 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a0072120 0x7f25a007d4c0 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7f259c0029c0 tx=0x7f259c002e90 comp rx=0 tx=0).stop 2026-03-09T19:36:10.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.932+0000 7f25a71ee640 1 -- 192.168.123.107:0/1924325286 shutdown_connections 2026-03-09T19:36:10.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.932+0000 7f25a71ee640 1 --2- 
192.168.123.107:0/1924325286 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2574077a80 0x7f2574079f40 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.932+0000 7f25a71ee640 1 --2- 192.168.123.107:0/1924325286 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f25a007da00 0x7f25a007de60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.932+0000 7f25a71ee640 1 --2- 192.168.123.107:0/1924325286 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a0072120 0x7f25a007d4c0 unknown :-1 s=CLOSED pgs=192 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:10.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.932+0000 7f25a71ee640 1 -- 192.168.123.107:0/1924325286 >> 192.168.123.107:0/1924325286 conn(0x7f25a006c7d0 msgr2=0x7f25a006fd30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:10.930 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.932+0000 7f25a71ee640 1 -- 192.168.123.107:0/1924325286 shutdown_connections 2026-03-09T19:36:10.931 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:10.932+0000 7f25a71ee640 1 -- 192.168.123.107:0/1924325286 wait complete. 
2026-03-09T19:36:10.984 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 31 2026-03-09T19:36:11.163 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:11.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:10 vm08.local ceph-mon[103420]: pgmap v287: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:11.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:10 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/2922454590' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-09T19:36:11.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.439+0000 7f2e9401b640 1 -- 192.168.123.107:0/931208593 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2e8c076040 msgr2=0x7f2e8c111330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:11.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.439+0000 7f2e9401b640 1 --2- 192.168.123.107:0/931208593 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2e8c076040 0x7f2e8c111330 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto rx=0x7f2e740099b0 tx=0x7f2e7402f220 comp rx=0 tx=0).stop 2026-03-09T19:36:11.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.440+0000 7f2e9401b640 1 -- 192.168.123.107:0/931208593 shutdown_connections 2026-03-09T19:36:11.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.440+0000 7f2e9401b640 1 --2- 192.168.123.107:0/931208593 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2e8c076040 0x7f2e8c111330 unknown :-1 s=CLOSED pgs=193 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:11.439 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.440+0000 7f2e9401b640 1 --2- 192.168.123.107:0/931208593 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e8c075720 0x7f2e8c075b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:11.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.440+0000 7f2e9401b640 1 -- 192.168.123.107:0/931208593 >> 192.168.123.107:0/931208593 conn(0x7f2e8c0fe710 msgr2=0x7f2e8c100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:11.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.441+0000 7f2e9401b640 1 -- 192.168.123.107:0/931208593 shutdown_connections 2026-03-09T19:36:11.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.441+0000 7f2e9401b640 1 -- 192.168.123.107:0/931208593 wait complete. 2026-03-09T19:36:11.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.441+0000 7f2e9401b640 1 Processor -- start 2026-03-09T19:36:11.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.443+0000 7f2e9401b640 1 -- start start 2026-03-09T19:36:11.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.443+0000 7f2e9401b640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2e8c075720 0x7f2e8c19ee80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:11.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.443+0000 7f2e9401b640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e8c076040 0x7f2e8c19f3c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:11.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.443+0000 7f2e9401b640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2e8c19f9c0 con 0x7f2e8c075720 2026-03-09T19:36:11.442 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.443+0000 7f2e9401b640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2e8c19fb30 con 0x7f2e8c076040 2026-03-09T19:36:11.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.445+0000 7f2e9158f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e8c076040 0x7f2e8c19f3c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:11.442 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.445+0000 7f2e9158f640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e8c076040 0x7f2e8c19f3c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:34242/0 (socket says 192.168.123.107:34242) 2026-03-09T19:36:11.443 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.445+0000 7f2e9158f640 1 -- 192.168.123.107:0/3029920404 learned_addr learned my addr 192.168.123.107:0/3029920404 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:11.443 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.445+0000 7f2e91d90640 1 --2- 192.168.123.107:0/3029920404 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2e8c075720 0x7f2e8c19ee80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:11.443 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.445+0000 7f2e9158f640 1 -- 192.168.123.107:0/3029920404 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2e8c075720 msgr2=0x7f2e8c19ee80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:11.443 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.445+0000 7f2e9158f640 1 --2- 
192.168.123.107:0/3029920404 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2e8c075720 0x7f2e8c19ee80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:11.443 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.445+0000 7f2e9158f640 1 -- 192.168.123.107:0/3029920404 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2e74009660 con 0x7f2e8c076040 2026-03-09T19:36:11.443 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.445+0000 7f2e91d90640 1 --2- 192.168.123.107:0/3029920404 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2e8c075720 0x7f2e8c19ee80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T19:36:11.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.446+0000 7f2e9158f640 1 --2- 192.168.123.107:0/3029920404 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e8c076040 0x7f2e8c19f3c0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f2e74002fd0 tx=0x7f2e7402fcb0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:11.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.446+0000 7f2e82ffd640 1 -- 192.168.123.107:0/3029920404 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2e7403c040 con 0x7f2e8c076040 2026-03-09T19:36:11.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.446+0000 7f2e9401b640 1 -- 192.168.123.107:0/3029920404 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2e8c1a38b0 con 0x7f2e8c076040 2026-03-09T19:36:11.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.446+0000 7f2e9401b640 1 -- 192.168.123.107:0/3029920404 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7f2e8c1a3da0 con 0x7f2e8c076040 2026-03-09T19:36:11.445 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.447+0000 7f2e82ffd640 1 -- 192.168.123.107:0/3029920404 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f2e74004440 con 0x7f2e8c076040 2026-03-09T19:36:11.446 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.448+0000 7f2e82ffd640 1 -- 192.168.123.107:0/3029920404 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2e74040880 con 0x7f2e8c076040 2026-03-09T19:36:11.446 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.448+0000 7f2e9401b640 1 -- 192.168.123.107:0/3029920404 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2e8c076e60 con 0x7f2e8c076040 2026-03-09T19:36:11.446 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.449+0000 7f2e82ffd640 1 -- 192.168.123.107:0/3029920404 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2e74040a60 con 0x7f2e8c076040 2026-03-09T19:36:11.447 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.449+0000 7f2e82ffd640 1 --2- 192.168.123.107:0/3029920404 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2e5c0776d0 0x7f2e5c079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:11.447 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.449+0000 7f2e91d90640 1 --2- 192.168.123.107:0/3029920404 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2e5c0776d0 0x7f2e5c079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:11.448 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.449+0000 7f2e82ffd640 1 
-- 192.168.123.107:0/3029920404 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f2e740bfd80 con 0x7f2e8c076040 2026-03-09T19:36:11.448 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.450+0000 7f2e91d90640 1 --2- 192.168.123.107:0/3029920404 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2e5c0776d0 0x7f2e5c079b90 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f2e7c005f10 tx=0x7f2e7c005ea0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:11.450 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.452+0000 7f2e82ffd640 1 -- 192.168.123.107:0/3029920404 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2e740c6080 con 0x7f2e8c076040 2026-03-09T19:36:11.577 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.579+0000 7f2e9401b640 1 -- 192.168.123.107:0/3029920404 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 31, "format": "json"} v 0) v1 -- 0x7f2e8c075b00 con 0x7f2e8c076040 2026-03-09T19:36:11.578 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.580+0000 7f2e82ffd640 1 -- 192.168.123.107:0/3029920404 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 31, "format": "json"}]=0 dumped fsmap epoch 31 v33) v1 ==== 107+0+5263 (secure 0 0 0) 0x7f2e74088430 con 0x7f2e8c076040 2026-03-09T19:36:11.578 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:11.578 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":31,"btime":"2026-03-09T19:34:08:746910+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":34410,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2173796097","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2173796097},{"type":"v1","addr":"192.168.123.108:6827","nonce":2173796097}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":28},{"gid":44337,"name":"cephfs.vm08.zcaqju","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904856878","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904856878},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904856878}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":31,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:34:08.746908+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":102,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"max_mds":2,"in":[0],"up":{"mds_0":34382},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34382":{"gid":34382,"name":"cephfs.vm07.uizncw","rank":0,"incarnation":27,"state":"up:active","state_seq":9,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34382,"qdb_cluster":[34382]},"id":1}]} 2026-03-09T19:36:11.579 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 31 2026-03-09T19:36:11.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.584+0000 7f2e9401b640 1 -- 192.168.123.107:0/3029920404 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2e5c0776d0 msgr2=0x7f2e5c079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:11.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.584+0000 7f2e9401b640 1 --2- 192.168.123.107:0/3029920404 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2e5c0776d0 0x7f2e5c079b90 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f2e7c005f10 tx=0x7f2e7c005ea0 comp rx=0 tx=0).stop 2026-03-09T19:36:11.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.584+0000 
7f2e9401b640 1 -- 192.168.123.107:0/3029920404 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e8c076040 msgr2=0x7f2e8c19f3c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:11.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.584+0000 7f2e9401b640 1 --2- 192.168.123.107:0/3029920404 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e8c076040 0x7f2e8c19f3c0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f2e74002fd0 tx=0x7f2e7402fcb0 comp rx=0 tx=0).stop 2026-03-09T19:36:11.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.584+0000 7f2e9401b640 1 -- 192.168.123.107:0/3029920404 shutdown_connections 2026-03-09T19:36:11.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.584+0000 7f2e9401b640 1 --2- 192.168.123.107:0/3029920404 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f2e5c0776d0 0x7f2e5c079b90 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:11.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.584+0000 7f2e9401b640 1 --2- 192.168.123.107:0/3029920404 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e8c076040 0x7f2e8c19f3c0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:11.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.584+0000 7f2e9401b640 1 --2- 192.168.123.107:0/3029920404 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2e8c075720 0x7f2e8c19ee80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:11.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.584+0000 7f2e9401b640 1 -- 192.168.123.107:0/3029920404 >> 192.168.123.107:0/3029920404 conn(0x7f2e8c0fe710 msgr2=0x7f2e8c0ffdc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:11.583 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.584+0000 7f2e9401b640 1 -- 192.168.123.107:0/3029920404 shutdown_connections 2026-03-09T19:36:11.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:11.584+0000 7f2e9401b640 1 -- 192.168.123.107:0/3029920404 wait complete. 2026-03-09T19:36:11.632 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph fs dump --format=json 32 2026-03-09T19:36:11.801 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:12.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.057+0000 7ff4b3e60640 1 -- 192.168.123.107:0/4214595893 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4ac072340 msgr2=0x7ff4ac072720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:12.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.057+0000 7ff4b3e60640 1 --2- 192.168.123.107:0/4214595893 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4ac072340 0x7ff4ac072720 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7ff49c0099b0 tx=0x7ff49c02f220 comp rx=0 tx=0).stop 2026-03-09T19:36:12.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.058+0000 7ff4b3e60640 1 -- 192.168.123.107:0/4214595893 shutdown_connections 2026-03-09T19:36:12.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.058+0000 7ff4b3e60640 1 --2- 192.168.123.107:0/4214595893 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4ac072cf0 0x7ff4ac10cd90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:12.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.058+0000 7ff4b3e60640 1 --2- 192.168.123.107:0/4214595893 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7ff4ac072340 0x7ff4ac072720 unknown :-1 s=CLOSED pgs=194 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:12.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.058+0000 7ff4b3e60640 1 -- 192.168.123.107:0/4214595893 >> 192.168.123.107:0/4214595893 conn(0x7ff4ac06b7f0 msgr2=0x7ff4ac06bc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:12.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.059+0000 7ff4b3e60640 1 -- 192.168.123.107:0/4214595893 shutdown_connections 2026-03-09T19:36:12.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.059+0000 7ff4b3e60640 1 -- 192.168.123.107:0/4214595893 wait complete. 2026-03-09T19:36:12.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.059+0000 7ff4b3e60640 1 Processor -- start 2026-03-09T19:36:12.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.059+0000 7ff4b3e60640 1 -- start start 2026-03-09T19:36:12.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.059+0000 7ff4b3e60640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4ac072340 0x7ff4ac114d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:12.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.059+0000 7ff4b3e60640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4ac072cf0 0x7ff4ac1152b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:12.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.059+0000 7ff4b3e60640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff4ac115990 con 0x7ff4ac072340 2026-03-09T19:36:12.057 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.059+0000 7ff4b3e60640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff4ac1b5d40 con 0x7ff4ac072cf0 
2026-03-09T19:36:12.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.060+0000 7ff4b13d4640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4ac072cf0 0x7ff4ac1152b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:12.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.060+0000 7ff4b1bd5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4ac072340 0x7ff4ac114d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:12.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.060+0000 7ff4b1bd5640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4ac072340 0x7ff4ac114d70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:55164/0 (socket says 192.168.123.107:55164) 2026-03-09T19:36:12.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.060+0000 7ff4b1bd5640 1 -- 192.168.123.107:0/731154279 learned_addr learned my addr 192.168.123.107:0/731154279 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:12.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.060+0000 7ff4b1bd5640 1 -- 192.168.123.107:0/731154279 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4ac072cf0 msgr2=0x7ff4ac1152b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:12.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.060+0000 7ff4b1bd5640 1 --2- 192.168.123.107:0/731154279 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4ac072cf0 0x7ff4ac1152b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:12.058 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.060+0000 7ff4b1bd5640 1 -- 192.168.123.107:0/731154279 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff49c009660 con 0x7ff4ac072340 2026-03-09T19:36:12.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.060+0000 7ff4b13d4640 1 --2- 192.168.123.107:0/731154279 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4ac072cf0 0x7ff4ac1152b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T19:36:12.058 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.061+0000 7ff4b1bd5640 1 --2- 192.168.123.107:0/731154279 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4ac072340 0x7ff4ac114d70 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7ff49c005ec0 tx=0x7ff49c004290 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:12.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.061+0000 7ff49affd640 1 -- 192.168.123.107:0/731154279 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff49c03d070 con 0x7ff4ac072340 2026-03-09T19:36:12.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.061+0000 7ff49affd640 1 -- 192.168.123.107:0/731154279 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7ff49c038730 con 0x7ff4ac072340 2026-03-09T19:36:12.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.061+0000 7ff4b3e60640 1 -- 192.168.123.107:0/731154279 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff4ac1b5fc0 con 0x7ff4ac072340 2026-03-09T19:36:12.060 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.061+0000 7ff4b3e60640 1 -- 192.168.123.107:0/731154279 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff4ac1b64b0 con 0x7ff4ac072340 2026-03-09T19:36:12.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.064+0000 7ff49affd640 1 -- 192.168.123.107:0/731154279 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff49c041600 con 0x7ff4ac072340 2026-03-09T19:36:12.061 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.064+0000 7ff498ff9640 1 -- 192.168.123.107:0/731154279 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff474005350 con 0x7ff4ac072340 2026-03-09T19:36:12.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.064+0000 7ff49affd640 1 -- 192.168.123.107:0/731154279 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff49c04b430 con 0x7ff4ac072340 2026-03-09T19:36:12.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.065+0000 7ff49affd640 1 --2- 192.168.123.107:0/731154279 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff4880776d0 0x7ff488079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:12.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.065+0000 7ff49affd640 1 -- 192.168.123.107:0/731154279 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7ff49c0be4b0 con 0x7ff4ac072340 2026-03-09T19:36:12.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.065+0000 7ff4b13d4640 1 --2- 192.168.123.107:0/731154279 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff4880776d0 0x7ff488079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:12.066 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.065+0000 7ff4b13d4640 1 --2- 192.168.123.107:0/731154279 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff4880776d0 0x7ff488079b90 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7ff4ac116320 tx=0x7ff4a0005f50 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:12.066 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.068+0000 7ff49affd640 1 -- 192.168.123.107:0/731154279 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff49c086b60 con 0x7ff4ac072340 2026-03-09T19:36:12.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:11 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/1924325286' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-09T19:36:12.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:11 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/3029920404' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-09T19:36:12.181 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:11 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1924325286' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-09T19:36:12.181 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:11 vm07.local ceph-mon[111841]: from='client.? 
192.168.123.107:0/3029920404' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-09T19:36:12.182 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.183+0000 7ff498ff9640 1 -- 192.168.123.107:0/731154279 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 32, "format": "json"} v 0) v1 -- 0x7ff474005600 con 0x7ff4ac072340 2026-03-09T19:36:12.184 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.186+0000 7ff49affd640 1 -- 192.168.123.107:0/731154279 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 32, "format": "json"}]=0 dumped fsmap epoch 32 v33) v1 ==== 107+0+5280 (secure 0 0 0) 0x7ff49c0388a0 con 0x7ff4ac072340 2026-03-09T19:36:12.185 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:36:12.185 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":32,"btime":"2026-03-09T19:34:08:763299+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34410,"name":"cephfs.vm08.jwsqrf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2173796097","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2173796097},{"type":"v1","addr":"192.168.123.108:6827","nonce":2173796097}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":28},{"gid":44337,"name":"cephfs.vm08.zcaqju","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904856878","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904856878},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904856878}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":32,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T19:24:23.601314+0000","modified":"2026-03-09T19:34:08.763275+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":102,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":2,"in":[0,1],"up":{"mds_0":34382,"mds_1":34386},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34382":{"gid":34382,"name":"cephfs.vm07.uizncw","rank":0,"incarnation":27,"state":"up:active","state_seq":9,"addr":"192.168.123.107:6827/2856024060","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":2856024060},{"type":"v1","addr":"192.168.123.107:6827","nonce":2856024060}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}},"gid_34386":{"gid":34386,"name":"cephfs.vm07.zkmcyw","rank":1,"incarnation":32,"state":"up:starting","state_seq":1,"addr":"192.168.123.107:6829/670620212","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6828","nonce":670620212},{"type":"v1","addr":"192.168.123.107:6829","nonce":670620212}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34382,"qdb_cluster":[34382]},"id":1}]} 2026-03-09T19:36:12.185 INFO:teuthology.orchestra.run.vm07.stderr:dumped fsmap epoch 32 2026-03-09T19:36:12.187 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.189+0000 7ff498ff9640 1 -- 192.168.123.107:0/731154279 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff4880776d0 msgr2=0x7ff488079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:12.187 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.189+0000 7ff498ff9640 1 --2- 192.168.123.107:0/731154279 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff4880776d0 0x7ff488079b90 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7ff4ac116320 tx=0x7ff4a0005f50 comp rx=0 tx=0).stop 2026-03-09T19:36:12.188 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.190+0000 7ff498ff9640 1 -- 192.168.123.107:0/731154279 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4ac072340 msgr2=0x7ff4ac114d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:12.188 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.190+0000 7ff498ff9640 1 --2- 192.168.123.107:0/731154279 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4ac072340 0x7ff4ac114d70 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7ff49c005ec0 tx=0x7ff49c004290 comp rx=0 tx=0).stop 2026-03-09T19:36:12.188 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.190+0000 7ff498ff9640 1 -- 192.168.123.107:0/731154279 shutdown_connections 2026-03-09T19:36:12.188 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.190+0000 7ff498ff9640 1 --2- 192.168.123.107:0/731154279 >> 
[v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7ff4880776d0 0x7ff488079b90 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:12.188 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.190+0000 7ff498ff9640 1 --2- 192.168.123.107:0/731154279 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4ac072cf0 0x7ff4ac1152b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:12.188 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.190+0000 7ff498ff9640 1 --2- 192.168.123.107:0/731154279 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4ac072340 0x7ff4ac114d70 unknown :-1 s=CLOSED pgs=195 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:12.188 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.190+0000 7ff498ff9640 1 -- 192.168.123.107:0/731154279 >> 192.168.123.107:0/731154279 conn(0x7ff4ac06b7f0 msgr2=0x7ff4ac10de10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:12.188 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.191+0000 7ff498ff9640 1 -- 192.168.123.107:0/731154279 shutdown_connections 2026-03-09T19:36:12.188 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.191+0000 7ff498ff9640 1 -- 192.168.123.107:0/731154279 wait complete. 2026-03-09T19:36:12.261 DEBUG:teuthology.run_tasks:Unwinding manager ceph-fuse 2026-03-09T19:36:12.265 INFO:tasks.ceph_fuse:Unmounting ceph-fuse clients... 
2026-03-09T19:36:12.265 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:36:12.265 DEBUG:teuthology.orchestra.run.vm07:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-09T19:36:12.289 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:36:12.290 DEBUG:teuthology.orchestra.run.vm07:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-09T19:36:12.348 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd blocklist ls 2026-03-09T19:36:12.559 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:12.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.835+0000 7fe615432640 1 -- 192.168.123.107:0/1131710080 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe610106780 msgr2=0x7fe610106b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:12.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.835+0000 7fe615432640 1 --2- 192.168.123.107:0/1131710080 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe610106780 0x7fe610106b60 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7fe5fc009a00 tx=0x7fe5fc02f270 comp rx=0 tx=0).stop 2026-03-09T19:36:12.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.835+0000 7fe615432640 1 -- 192.168.123.107:0/1131710080 shutdown_connections 2026-03-09T19:36:12.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.835+0000 7fe615432640 1 --2- 192.168.123.107:0/1131710080 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe610100780 0x7fe610100be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:12.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.835+0000 7fe615432640 1 --2- 192.168.123.107:0/1131710080 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe610106780 0x7fe610106b60 unknown :-1 s=CLOSED pgs=196 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:12.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.835+0000 7fe615432640 1 -- 192.168.123.107:0/1131710080 >> 192.168.123.107:0/1131710080 conn(0x7fe6100fc460 msgr2=0x7fe6100fe880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:12.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.836+0000 7fe615432640 1 -- 192.168.123.107:0/1131710080 shutdown_connections 2026-03-09T19:36:12.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.836+0000 7fe615432640 1 -- 192.168.123.107:0/1131710080 wait complete. 2026-03-09T19:36:12.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.836+0000 7fe615432640 1 Processor -- start 2026-03-09T19:36:12.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.836+0000 7fe615432640 1 -- start start 2026-03-09T19:36:12.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.836+0000 7fe615432640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe610100780 0x7fe61006ab60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:12.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.836+0000 7fe615432640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe610106780 0x7fe61006b0a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:12.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.836+0000 7fe615432640 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe61006b6a0 con 0x7fe610100780 2026-03-09T19:36:12.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.836+0000 7fe615432640 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7fe61006b810 con 0x7fe610106780 2026-03-09T19:36:12.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.837+0000 7fe60e7fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe610106780 0x7fe61006b0a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:12.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.837+0000 7fe60e7fc640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe610106780 0x7fe61006b0a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.107:34284/0 (socket says 192.168.123.107:34284) 2026-03-09T19:36:12.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.837+0000 7fe60e7fc640 1 -- 192.168.123.107:0/158298780 learned_addr learned my addr 192.168.123.107:0/158298780 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:12.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.837+0000 7fe60effd640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe610100780 0x7fe61006ab60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:12.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.837+0000 7fe60effd640 1 -- 192.168.123.107:0/158298780 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe610106780 msgr2=0x7fe61006b0a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:12.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.837+0000 7fe60effd640 1 --2- 192.168.123.107:0/158298780 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe610106780 0x7fe61006b0a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T19:36:12.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.837+0000 7fe60effd640 1 -- 192.168.123.107:0/158298780 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe5fc009660 con 0x7fe610100780 2026-03-09T19:36:12.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.837+0000 7fe60effd640 1 --2- 192.168.123.107:0/158298780 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe610100780 0x7fe61006ab60 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7fe5fc009a00 tx=0x7fe5fc031d20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:12.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.838+0000 7fe5ebfff640 1 -- 192.168.123.107:0/158298780 <== mon.0 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe5fc031e90 con 0x7fe610100780 2026-03-09T19:36:12.836 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.838+0000 7fe615432640 1 -- 192.168.123.107:0/158298780 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe610071ab0 con 0x7fe610100780 2026-03-09T19:36:12.836 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.838+0000 7fe615432640 1 -- 192.168.123.107:0/158298780 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe610071f40 con 0x7fe610100780 2026-03-09T19:36:12.836 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.838+0000 7fe5ebfff640 1 -- 192.168.123.107:0/158298780 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7fe5fc031110 con 0x7fe610100780 2026-03-09T19:36:12.836 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.838+0000 7fe5ebfff640 1 -- 192.168.123.107:0/158298780 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7fe5fc038680 con 0x7fe610100780 2026-03-09T19:36:12.838 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.840+0000 7fe615432640 1 -- 192.168.123.107:0/158298780 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe610101ec0 con 0x7fe610100780 2026-03-09T19:36:12.838 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.840+0000 7fe5ebfff640 1 -- 192.168.123.107:0/158298780 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe5fc048020 con 0x7fe610100780 2026-03-09T19:36:12.838 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.840+0000 7fe5ebfff640 1 --2- 192.168.123.107:0/158298780 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fe5e00779b0 0x7fe5e0079e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:12.838 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.840+0000 7fe5ebfff640 1 -- 192.168.123.107:0/158298780 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7fe5fc0be710 con 0x7fe610100780 2026-03-09T19:36:12.839 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.841+0000 7fe60e7fc640 1 --2- 192.168.123.107:0/158298780 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fe5e00779b0 0x7fe5e0079e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:12.839 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.841+0000 7fe60e7fc640 1 --2- 192.168.123.107:0/158298780 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fe5e00779b0 0x7fe5e0079e70 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7fe604004620 tx=0x7fe60400a480 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:12.841 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.844+0000 7fe5ebfff640 1 -- 192.168.123.107:0/158298780 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe5fc086dc0 con 0x7fe610100780 2026-03-09T19:36:12.895 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:12 vm07.local ceph-mon[111841]: pgmap v288: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:12.950 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.952+0000 7fe615432640 1 -- 192.168.123.107:0/158298780 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7fe610100be0 con 0x7fe610100780 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.953+0000 7fe5ebfff640 1 -- 192.168.123.107:0/158298780 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 43 entries v102) v1 ==== 81+0+2669 (secure 0 0 0) 0x7fe5fc086510 con 0x7fe610100780 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6827/2328013860 2026-03-10T19:34:01.467325+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6826/2328013860 2026-03-10T19:34:01.467325+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6825/1034426472 2026-03-10T19:33:51.905906+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/257370646 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6801/781232384 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6800/781232384 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:12.951 
INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/1004147687 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2125348601 2026-03-10T19:21:57.086091+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/3701716623 2026-03-10T19:21:57.086091+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6826/1147625344 2026-03-10T19:24:28.828509+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2365276262 2026-03-10T19:22:09.108986+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/4096740693 2026-03-10T19:22:48.312502+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/702157851 2026-03-10T19:22:09.108986+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2507114958 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2738169749 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/1578432790 2026-03-10T19:22:09.108986+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6800/1318262611 2026-03-10T19:22:09.108986+0000 2026-03-09T19:36:12.951 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/1792148298 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/1009274776 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6827/1147625344 2026-03-10T19:24:28.828509+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/910287422 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/3350280612 
2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6829/502005203 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2360800775 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2275130412 2026-03-10T19:22:48.312502+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2908784426 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6828/502005203 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6801/1318262611 2026-03-10T19:22:09.108986+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6801/2885771920 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/1945955998 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6800/1021580706 2026-03-10T19:22:48.312502+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/3933824698 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6800/2885771920 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6801/1021580706 2026-03-10T19:22:48.312502+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/1222239637 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6800/1456056000 2026-03-10T19:21:57.086091+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/2135388892 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:12.952 
INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2208266146 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/961195961 2026-03-10T19:21:57.086091+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6801/1456056000 2026-03-10T19:21:57.086091+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6824/1034426472 2026-03-10T19:33:51.905906+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/1848167886 2026-03-10T19:22:48.312502+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/3787716570 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:12.952 INFO:teuthology.orchestra.run.vm07.stderr:listed 43 entries 2026-03-09T19:36:12.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.956+0000 7fe615432640 1 -- 192.168.123.107:0/158298780 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fe5e00779b0 msgr2=0x7fe5e0079e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:12.954 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.957+0000 7fe615432640 1 --2- 192.168.123.107:0/158298780 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fe5e00779b0 0x7fe5e0079e70 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7fe604004620 tx=0x7fe60400a480 comp rx=0 tx=0).stop 2026-03-09T19:36:12.955 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.957+0000 7fe615432640 1 -- 192.168.123.107:0/158298780 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe610100780 msgr2=0x7fe61006ab60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:12.955 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.957+0000 7fe615432640 1 --2- 192.168.123.107:0/158298780 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fe610100780 0x7fe61006ab60 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7fe5fc009a00 tx=0x7fe5fc031d20 comp rx=0 tx=0).stop 2026-03-09T19:36:12.955 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.957+0000 7fe615432640 1 -- 192.168.123.107:0/158298780 shutdown_connections 2026-03-09T19:36:12.955 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.957+0000 7fe615432640 1 --2- 192.168.123.107:0/158298780 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7fe5e00779b0 0x7fe5e0079e70 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:12.955 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.957+0000 7fe615432640 1 --2- 192.168.123.107:0/158298780 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe610106780 0x7fe61006b0a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:12.955 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.958+0000 7fe615432640 1 --2- 192.168.123.107:0/158298780 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe610100780 0x7fe61006ab60 unknown :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:12.955 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.958+0000 7fe615432640 1 -- 192.168.123.107:0/158298780 >> 192.168.123.107:0/158298780 conn(0x7fe6100fc460 msgr2=0x7fe6100fd8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:12.956 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.958+0000 7fe615432640 1 -- 192.168.123.107:0/158298780 shutdown_connections 2026-03-09T19:36:12.956 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:12.958+0000 7fe615432640 1 -- 192.168.123.107:0/158298780 wait complete. 
2026-03-09T19:36:13.027 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-09T19:36:13.027 DEBUG:teuthology.orchestra.run.vm07:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-09T19:36:13.044 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph osd blocklist ls 2026-03-09T19:36:13.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:12 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/731154279' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-09T19:36:13.262 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config 2026-03-09T19:36:13.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:12 vm08.local ceph-mon[103420]: pgmap v288: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:13.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:12 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/731154279' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-09T19:36:13.548 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.548+0000 7f58a7577640 1 -- 192.168.123.107:0/3144604051 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f58a80ff4a0 msgr2=0x7f58a80ff900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:13.548 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.548+0000 7f58a7577640 1 --2- 192.168.123.107:0/3144604051 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f58a80ff4a0 0x7f58a80ff900 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7f5890009a00 tx=0x7f589002f280 comp rx=0 tx=0).stop 2026-03-09T19:36:13.548 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.550+0000 7f58a7577640 1 -- 192.168.123.107:0/3144604051 shutdown_connections 2026-03-09T19:36:13.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.550+0000 7f58a7577640 1 --2- 192.168.123.107:0/3144604051 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f58a80ff4a0 0x7f58a80ff900 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:13.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.550+0000 7f58a7577640 1 --2- 192.168.123.107:0/3144604051 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58a8105550 0x7f58a8105930 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:13.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.550+0000 7f58a7577640 1 -- 192.168.123.107:0/3144604051 >> 192.168.123.107:0/3144604051 conn(0x7f58a80fb180 msgr2=0x7f58a80fd5a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:13.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.551+0000 7f58a7577640 1 -- 192.168.123.107:0/3144604051 shutdown_connections 
2026-03-09T19:36:13.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.551+0000 7f58a7577640 1 -- 192.168.123.107:0/3144604051 wait complete. 2026-03-09T19:36:13.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.551+0000 7f58a7577640 1 Processor -- start 2026-03-09T19:36:13.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.552+0000 7f58a7577640 1 -- start start 2026-03-09T19:36:13.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.552+0000 7f58a7577640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f58a80ff4a0 0x7f58a819a590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:13.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.552+0000 7f58a6575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f58a80ff4a0 0x7f58a819a590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:13.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.552+0000 7f58a6575640 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f58a80ff4a0 0x7f58a819a590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:55200/0 (socket says 192.168.123.107:55200) 2026-03-09T19:36:13.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.552+0000 7f58a7577640 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58a8105550 0x7f58a819aad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:13.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.552+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58a819b160 con 
0x7f58a80ff4a0 2026-03-09T19:36:13.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.552+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58a819eed0 con 0x7f58a8105550 2026-03-09T19:36:13.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.552+0000 7f58a6575640 1 -- 192.168.123.107:0/1857242076 learned_addr learned my addr 192.168.123.107:0/1857242076 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-09T19:36:13.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.553+0000 7f58a6575640 1 -- 192.168.123.107:0/1857242076 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58a8105550 msgr2=0x7f58a819aad0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T19:36:13.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.553+0000 7f58a6575640 1 --2- 192.168.123.107:0/1857242076 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58a8105550 0x7f58a819aad0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:13.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.553+0000 7f58a6575640 1 -- 192.168.123.107:0/1857242076 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5890009660 con 0x7f58a80ff4a0 2026-03-09T19:36:13.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.553+0000 7f58a6575640 1 --2- 192.168.123.107:0/1857242076 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f58a80ff4a0 0x7f58a819a590 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7f589c00e990 tx=0x7f589c00ee60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:13.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.553+0000 7f588f7fe640 1 -- 192.168.123.107:0/1857242076 <== mon.0 v2:192.168.123.107:3300/0 
1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f589c00cd30 con 0x7f58a80ff4a0 2026-03-09T19:36:13.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.553+0000 7f588f7fe640 1 -- 192.168.123.107:0/1857242076 <== mon.0 v2:192.168.123.107:3300/0 2 ==== config(27 keys) v1 ==== 1108+0+0 (secure 0 0 0) 0x7f589c00ce90 con 0x7f58a80ff4a0 2026-03-09T19:36:13.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.553+0000 7f588f7fe640 1 -- 192.168.123.107:0/1857242076 <== mon.0 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f589c010640 con 0x7f58a80ff4a0 2026-03-09T19:36:13.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.554+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f58a819f1b0 con 0x7f58a80ff4a0 2026-03-09T19:36:13.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.554+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f58a819f6d0 con 0x7f58a80ff4a0 2026-03-09T19:36:13.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.555+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f586c005350 con 0x7f58a80ff4a0 2026-03-09T19:36:13.556 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.555+0000 7f588f7fe640 1 -- 192.168.123.107:0/1857242076 <== mon.0 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f589c0040d0 con 0x7f58a80ff4a0 2026-03-09T19:36:13.556 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.556+0000 7f588f7fe640 1 --2- 192.168.123.107:0/1857242076 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f5880077890 0x7f5880079d50 unknown :-1 s=NONE 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T19:36:13.556 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.556+0000 7f588f7fe640 1 -- 192.168.123.107:0/1857242076 <== mon.0 v2:192.168.123.107:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6652+0+0 (secure 0 0 0) 0x7f589c014070 con 0x7f58a80ff4a0 2026-03-09T19:36:13.556 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.558+0000 7f588f7fe640 1 -- 192.168.123.107:0/1857242076 <== mon.0 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f589c09e050 con 0x7f58a80ff4a0 2026-03-09T19:36:13.557 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.559+0000 7f58a5d74640 1 --2- 192.168.123.107:0/1857242076 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f5880077890 0x7f5880079d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T19:36:13.557 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.559+0000 7f58a5d74640 1 --2- 192.168.123.107:0/1857242076 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f5880077890 0x7f5880079d50 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f5890004870 tx=0x7f58900047e0 comp rx=0 tx=0).ready entity=mgr.24557 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T19:36:13.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.677+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f586c0051c0 con 0x7f58a80ff4a0 2026-03-09T19:36:13.675 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.677+0000 7f588f7fe640 1 -- 192.168.123.107:0/1857242076 <== mon.0 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd 
blocklist ls"}]=0 listed 43 entries v102) v1 ==== 81+0+2669 (secure 0 0 0) 0x7f589c0616a0 con 0x7f58a80ff4a0 2026-03-09T19:36:13.676 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6827/2328013860 2026-03-10T19:34:01.467325+0000 2026-03-09T19:36:13.676 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6826/2328013860 2026-03-10T19:34:01.467325+0000 2026-03-09T19:36:13.676 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6825/1034426472 2026-03-10T19:33:51.905906+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/257370646 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6801/781232384 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6800/781232384 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/1004147687 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2125348601 2026-03-10T19:21:57.086091+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/3701716623 2026-03-10T19:21:57.086091+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6826/1147625344 2026-03-10T19:24:28.828509+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2365276262 2026-03-10T19:22:09.108986+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/4096740693 2026-03-10T19:22:48.312502+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/702157851 2026-03-10T19:22:09.108986+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2507114958 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2738169749 
2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/1578432790 2026-03-10T19:22:09.108986+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6800/1318262611 2026-03-10T19:22:09.108986+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/1792148298 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/1009274776 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6827/1147625344 2026-03-10T19:24:28.828509+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/910287422 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/3350280612 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6829/502005203 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2360800775 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2275130412 2026-03-10T19:22:48.312502+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2908784426 2026-03-10T19:28:33.084780+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6828/502005203 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6801/1318262611 2026-03-10T19:22:09.108986+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6801/2885771920 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/1945955998 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:13.677 
INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6800/1021580706 2026-03-10T19:22:48.312502+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/3933824698 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6800/2885771920 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6801/1021580706 2026-03-10T19:22:48.312502+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/1222239637 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6800/1456056000 2026-03-10T19:21:57.086091+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:0/2135388892 2026-03-10T19:28:11.339662+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/2208266146 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/961195961 2026-03-10T19:21:57.086091+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:6801/1456056000 2026-03-10T19:21:57.086091+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.108:6824/1034426472 2026-03-10T19:33:51.905906+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/1848167886 2026-03-10T19:22:48.312502+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.107:0/3787716570 2026-03-10T19:27:41.886216+0000 2026-03-09T19:36:13.677 INFO:teuthology.orchestra.run.vm07.stderr:listed 43 entries 2026-03-09T19:36:13.679 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.681+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f5880077890 msgr2=0x7f5880079d50 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:13.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.681+0000 7f58a7577640 1 --2- 192.168.123.107:0/1857242076 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f5880077890 0x7f5880079d50 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f5890004870 tx=0x7f58900047e0 comp rx=0 tx=0).stop 2026-03-09T19:36:13.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.681+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f58a80ff4a0 msgr2=0x7f58a819a590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T19:36:13.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.682+0000 7f58a7577640 1 --2- 192.168.123.107:0/1857242076 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f58a80ff4a0 0x7f58a819a590 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7f589c00e990 tx=0x7f589c00ee60 comp rx=0 tx=0).stop 2026-03-09T19:36:13.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.682+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 shutdown_connections 2026-03-09T19:36:13.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.682+0000 7f58a7577640 1 --2- 192.168.123.107:0/1857242076 >> [v2:192.168.123.107:6800/418954333,v1:192.168.123.107:6801/418954333] conn(0x7f5880077890 0x7f5880079d50 secure :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f5890004870 tx=0x7f58900047e0 comp rx=0 tx=0).stop 2026-03-09T19:36:13.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.682+0000 7f58a7577640 1 --2- 192.168.123.107:0/1857242076 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58a8105550 0x7f58a819aad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:13.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.682+0000 7f58a7577640 1 --2- 
192.168.123.107:0/1857242076 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f58a80ff4a0 0x7f58a819a590 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T19:36:13.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.682+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 >> 192.168.123.107:0/1857242076 conn(0x7f58a80fb180 msgr2=0x7f58a8068e20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T19:36:13.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.682+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 shutdown_connections 2026-03-09T19:36:13.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-09T19:36:13.682+0000 7f58a7577640 1 -- 192.168.123.107:0/1857242076 wait complete. 2026-03-09T19:36:13.723 INFO:tasks.cephfs.fuse_mount:Running fusermount -u on ubuntu@vm07.local... 2026-03-09T19:36:13.723 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T19:36:13.723 DEBUG:teuthology.orchestra.run.vm07:> sudo fusermount -u /home/ubuntu/cephtest/mnt.0 2026-03-09T19:36:13.756 INFO:teuthology.orchestra.run:waiting for 300 2026-03-09T19:36:14.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:13 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/158298780' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T19:36:14.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:13 vm07.local ceph-mon[111841]: from='client.? 192.168.123.107:0/1857242076' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T19:36:14.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:13 vm08.local ceph-mon[103420]: from='client.? 192.168.123.107:0/158298780' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T19:36:14.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:13 vm08.local ceph-mon[103420]: from='client.? 
192.168.123.107:0/1857242076' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T19:36:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:14 vm07.local ceph-mon[111841]: pgmap v289: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:15.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:14 vm08.local ceph-mon[103420]: pgmap v289: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:17.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:16 vm08.local ceph-mon[103420]: pgmap v290: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:17.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:16 vm07.local ceph-mon[111841]: pgmap v290: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:18 vm07.local ceph-mon[111841]: pgmap v291: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:36:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:18 vm08.local ceph-mon[103420]: pgmap v291: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:36:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:36:21.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:20 vm07.local ceph-mon[111841]: pgmap v292: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 
2026-03-09T19:36:21.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:20 vm08.local ceph-mon[103420]: pgmap v292: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:23.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:22 vm07.local ceph-mon[111841]: pgmap v293: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:23.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:22 vm08.local ceph-mon[103420]: pgmap v293: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:25.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:24 vm07.local ceph-mon[111841]: pgmap v294: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:25.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:24 vm08.local ceph-mon[103420]: pgmap v294: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:27.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:26 vm08.local ceph-mon[103420]: pgmap v295: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:27.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:26 vm07.local ceph-mon[111841]: pgmap v295: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:29.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:28 vm08.local ceph-mon[103420]: pgmap v296: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:29.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:28 vm07.local ceph-mon[111841]: pgmap v296: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:31.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:30 vm07.local ceph-mon[111841]: pgmap v297: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:31.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:30 vm08.local ceph-mon[103420]: pgmap v297: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:33.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:32 vm08.local ceph-mon[103420]: pgmap v298: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:33.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:32 vm07.local ceph-mon[111841]: pgmap v298: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:34.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:34 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:36:34.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:36:35.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:35 vm08.local ceph-mon[103420]: pgmap v299: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:35.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:35 vm07.local ceph-mon[111841]: pgmap v299: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:37.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:37 vm08.local ceph-mon[103420]: pgmap v300: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:37.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:37 vm07.local ceph-mon[111841]: pgmap v300: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:39.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:39 vm08.local ceph-mon[103420]: pgmap v301: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:39.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:39 vm07.local ceph-mon[111841]: pgmap v301: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:40.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:40 vm07.local ceph-mon[111841]: pgmap v302: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:40.595 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:40 vm08.local ceph-mon[103420]: pgmap v302: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:42.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:42 vm07.local ceph-mon[111841]: pgmap v303: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:43.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:42 vm08.local ceph-mon[103420]: pgmap v303: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:44.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:44 vm07.local ceph-mon[111841]: pgmap v304: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:45.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:44 vm08.local ceph-mon[103420]: pgmap v304: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:47.032 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:46 vm08.local ceph-mon[103420]: pgmap v305: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:47.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:46 vm07.local ceph-mon[111841]: pgmap v305: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:49.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:48 vm08.local ceph-mon[103420]: pgmap v306: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:36:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:48 vm07.local ceph-mon[111841]: pgmap v306: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:36:51.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:50 vm08.local ceph-mon[103420]: pgmap v307: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:36:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:36:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:36:51.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:36:51.218 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:50 vm07.local ceph-mon[111841]: pgmap v307: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:51.218 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:36:51.218 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:36:51.218 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:36:51.218 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:36:53.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:52 vm08.local ceph-mon[103420]: pgmap v308: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:52 vm07.local ceph-mon[111841]: pgmap v308: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:55.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:54 vm08.local ceph-mon[103420]: pgmap v309: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:55.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:54 vm07.local ceph-mon[111841]: pgmap v309: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:57.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:56 vm08.local ceph-mon[103420]: pgmap v310: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:57.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:56 vm07.local ceph-mon[111841]: pgmap v310: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:59.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:36:58 vm08.local ceph-mon[103420]: pgmap v311: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:36:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:36:58 vm07.local ceph-mon[111841]: pgmap v311: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:01.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:00 vm08.local ceph-mon[103420]: pgmap v312: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:00 vm07.local ceph-mon[111841]: pgmap v312: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:03.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:02 vm08.local ceph-mon[103420]: pgmap v313: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:03.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:02 vm07.local ceph-mon[111841]: pgmap v313: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:04 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:37:04.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:04 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:37:05.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:05 vm07.local ceph-mon[111841]: pgmap v314: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:05.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:05 vm08.local ceph-mon[103420]: pgmap v314: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:06.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:06 vm07.local ceph-mon[111841]: pgmap v315: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:06.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:06 vm08.local ceph-mon[103420]: pgmap v315: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:08.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:08 vm07.local ceph-mon[111841]: pgmap v316: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:09.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:08 vm08.local ceph-mon[103420]: pgmap v316: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:10.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:10 vm07.local ceph-mon[111841]: pgmap v317: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:11.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:10 vm08.local ceph-mon[103420]: pgmap v317: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:12.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:12 vm07.local ceph-mon[111841]: pgmap v318: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:13.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:12 vm08.local ceph-mon[103420]: pgmap v318: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:14.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:14 vm07.local ceph-mon[111841]: pgmap v319: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:15.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:14 vm08.local ceph-mon[103420]: pgmap v319: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:16.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:16 vm07.local ceph-mon[111841]: pgmap v320: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:17.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:16 vm08.local ceph-mon[103420]: pgmap v320: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:18.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:18 vm07.local ceph-mon[111841]: pgmap v321: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:18.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:37:19.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:18 vm08.local ceph-mon[103420]: pgmap v321: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:19.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:37:21.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:20 vm08.local ceph-mon[103420]: pgmap v322: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:21.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:20 vm07.local ceph-mon[111841]: pgmap v322: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:23.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:22 vm08.local ceph-mon[103420]: pgmap v323: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:23.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:22 vm07.local ceph-mon[111841]: pgmap v323: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:25.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:24 vm08.local ceph-mon[103420]: pgmap v324: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:25.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:24 vm07.local ceph-mon[111841]: pgmap v324: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:26.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:26 vm07.local ceph-mon[111841]: pgmap v325: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:27.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:26 vm08.local ceph-mon[103420]: pgmap v325: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:29.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:28 vm08.local ceph-mon[103420]: pgmap v326: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:29.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:28 vm07.local ceph-mon[111841]: pgmap v326: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:31.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:30 vm07.local ceph-mon[111841]: pgmap v327: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:31.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:30 vm08.local ceph-mon[103420]: pgmap v327: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:33.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:32 vm07.local ceph-mon[111841]: pgmap v328: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:33.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:32 vm08.local ceph-mon[103420]: pgmap v328: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:34.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:37:34.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:37:35.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:34 vm07.local ceph-mon[111841]: pgmap v329: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:35.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:34 vm08.local ceph-mon[103420]: pgmap v329: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:37.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:36 vm07.local ceph-mon[111841]: pgmap v330: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:37.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:36 vm08.local ceph-mon[103420]: pgmap v330: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:38 vm07.local ceph-mon[111841]: pgmap v331: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:39.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:38 vm08.local ceph-mon[103420]: pgmap v331: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:41.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:40 vm07.local ceph-mon[111841]: pgmap v332: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:41.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:40 vm08.local ceph-mon[103420]: pgmap v332: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:43.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:42 vm07.local ceph-mon[111841]: pgmap v333: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:43.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:42 vm08.local ceph-mon[103420]: pgmap v333: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:44 vm07.local ceph-mon[111841]: pgmap v334: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:45.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:44 vm08.local ceph-mon[103420]: pgmap v334: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:47.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:46 vm07.local ceph-mon[111841]: pgmap v335: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:47.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:46 vm08.local ceph-mon[103420]: pgmap v335: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:48 vm07.local ceph-mon[111841]: pgmap v336: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:37:49.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:48 vm08.local ceph-mon[103420]: pgmap v336: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:49.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:37:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:50 vm07.local ceph-mon[111841]: pgmap v337: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:37:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:37:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:37:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:50 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:37:51.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:50 vm08.local ceph-mon[103420]: pgmap v337: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:51.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:37:51.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:37:51.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:37:51.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:50 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:37:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:52 vm07.local ceph-mon[111841]: pgmap v338: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:53.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:52 vm08.local ceph-mon[103420]: pgmap v338: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:55.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:54 vm07.local ceph-mon[111841]: pgmap v339: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:55.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:54 vm08.local ceph-mon[103420]: pgmap v339: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:57.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:56 vm07.local ceph-mon[111841]: pgmap v340: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:57.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:56 vm08.local ceph-mon[103420]: pgmap v340: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:37:58 vm07.local ceph-mon[111841]: pgmap v341: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:37:59.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:37:58 vm08.local ceph-mon[103420]: pgmap v341: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:00 vm07.local ceph-mon[111841]: pgmap v342: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:01.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:00 vm08.local ceph-mon[103420]: pgmap v342: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:03.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:02 vm08.local ceph-mon[103420]: pgmap v343: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:03.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:02 vm07.local ceph-mon[111841]: pgmap v343: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:04.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:38:04.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:38:05.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:04 vm07.local ceph-mon[111841]: pgmap v344: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:04 vm08.local ceph-mon[103420]: pgmap v344: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:07.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:06 vm07.local ceph-mon[111841]: pgmap v345: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:07.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:06 vm08.local ceph-mon[103420]: pgmap v345: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:09.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:08 vm07.local ceph-mon[111841]: pgmap v346: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:09.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:08 vm08.local ceph-mon[103420]: pgmap v346: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:11.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:10 vm08.local ceph-mon[103420]: pgmap v347: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:11.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:10 vm07.local ceph-mon[111841]: pgmap v347: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:13.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:12 vm08.local ceph-mon[103420]: pgmap v348: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:13.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:12 vm07.local ceph-mon[111841]: pgmap v348: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:14 vm08.local ceph-mon[103420]: pgmap v349: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:15.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:14 vm07.local ceph-mon[111841]: pgmap v349: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:17.280 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:17 vm07.local ceph-mon[111841]: pgmap v350: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:17.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:17 vm08.local ceph-mon[103420]: pgmap v350: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:19 vm08.local ceph-mon[103420]: pgmap v351: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:38:19.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:19 vm07.local ceph-mon[111841]: pgmap v351: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:19.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:38:21.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:21 vm08.local ceph-mon[103420]: pgmap v352: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:21.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:21 vm07.local ceph-mon[111841]: pgmap v352: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:23.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:23 vm08.local ceph-mon[103420]: pgmap v353: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:23.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:23 vm07.local ceph-mon[111841]: pgmap v353: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:25.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:25 vm08.local ceph-mon[103420]: pgmap v354: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:25.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:25 vm07.local ceph-mon[111841]: pgmap v354: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:27.343 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:27 vm07.local ceph-mon[111841]: pgmap v355: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:27.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:27 vm08.local ceph-mon[103420]: pgmap v355: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:29.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:29 vm08.local ceph-mon[103420]: pgmap v356: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:29.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:29 vm07.local ceph-mon[111841]: pgmap v356: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:31.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:31 vm08.local ceph-mon[103420]: pgmap v357: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:31.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:31 vm07.local ceph-mon[111841]: pgmap v357: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:33.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:33 vm08.local ceph-mon[103420]: pgmap v358: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:33.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:33 vm07.local ceph-mon[111841]: pgmap v358: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:34.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:34 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:38:34.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:38:35.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:35 vm08.local ceph-mon[103420]: pgmap v359: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:35.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:35 vm07.local ceph-mon[111841]: pgmap v359: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:36.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:36 vm07.local ceph-mon[111841]: pgmap v360: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:36.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:36 vm08.local ceph-mon[103420]: pgmap v360: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:38.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:38 vm07.local ceph-mon[111841]: pgmap v361: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:39.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:38 vm08.local ceph-mon[103420]: pgmap v361: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:40.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:40 vm07.local ceph-mon[111841]: pgmap v362: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:41.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:40 vm08.local ceph-mon[103420]: pgmap v362: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:42.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:42 vm07.local ceph-mon[111841]: pgmap v363: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:43.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:42 vm08.local ceph-mon[103420]: pgmap v363: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:45.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:44 vm08.local ceph-mon[103420]: pgmap v364: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:44 vm07.local ceph-mon[111841]: pgmap v364: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:47.030 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:46 vm07.local ceph-mon[111841]: pgmap v365: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:47.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:46 vm08.local ceph-mon[103420]: pgmap v365: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:49.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:48 vm08.local ceph-mon[103420]: pgmap v366: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:49.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:38:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:48 vm07.local ceph-mon[111841]: pgmap v366: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T19:38:51.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:50 vm08.local ceph-mon[103420]: pgmap v367: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:50 vm07.local ceph-mon[111841]: pgmap v367: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:38:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:38:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:38:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:38:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:38:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:38:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:38:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T19:38:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:38:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch
2026-03-09T19:38:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T19:38:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T19:38:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym'
2026-03-09T19:38:53.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:53 vm08.local ceph-mon[103420]: pgmap v368: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120
GiB avail 2026-03-09T19:38:53.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:53 vm07.local ceph-mon[111841]: pgmap v368: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:38:55.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:55 vm08.local ceph-mon[103420]: pgmap v369: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:38:55.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:55 vm07.local ceph-mon[111841]: pgmap v369: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:38:57.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:57 vm08.local ceph-mon[103420]: pgmap v370: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:38:57.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:57 vm07.local ceph-mon[111841]: pgmap v370: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:38:59.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:38:59 vm08.local ceph-mon[103420]: pgmap v371: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:38:59.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:38:59 vm07.local ceph-mon[111841]: pgmap v371: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:01.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:01 vm08.local ceph-mon[103420]: pgmap v372: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:01.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:01 vm07.local ceph-mon[111841]: pgmap v372: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:03.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:03 vm08.local ceph-mon[103420]: pgmap v373: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 
GiB avail 2026-03-09T19:39:03.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:03 vm07.local ceph-mon[111841]: pgmap v373: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:04.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:04 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:39:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:04 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:39:05.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:05 vm08.local ceph-mon[103420]: pgmap v374: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:05.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:05 vm07.local ceph-mon[111841]: pgmap v374: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:07.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:07 vm08.local ceph-mon[103420]: pgmap v375: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:07 vm07.local ceph-mon[111841]: pgmap v375: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:08.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:08 vm07.local ceph-mon[111841]: pgmap v376: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:08.594 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:08 vm08.local ceph-mon[103420]: pgmap v376: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:10.978 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:10 vm07.local 
ceph-mon[111841]: pgmap v377: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:11.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:10 vm08.local ceph-mon[103420]: pgmap v377: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:12.844 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:12 vm07.local ceph-mon[111841]: pgmap v378: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:13.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:12 vm08.local ceph-mon[103420]: pgmap v378: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:15.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:14 vm08.local ceph-mon[103420]: pgmap v379: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:15.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:14 vm07.local ceph-mon[111841]: pgmap v379: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:17.030 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:16 vm07.local ceph-mon[111841]: pgmap v380: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:17.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:16 vm08.local ceph-mon[103420]: pgmap v380: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:19.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:18 vm08.local ceph-mon[103420]: pgmap v381: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:19.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:18 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:39:19.228 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:18 vm07.local ceph-mon[111841]: pgmap v381: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:19.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:18 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:39:21.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:20 vm08.local ceph-mon[103420]: pgmap v382: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:21.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:20 vm07.local ceph-mon[111841]: pgmap v382: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:23.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:22 vm08.local ceph-mon[103420]: pgmap v383: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:23.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:22 vm07.local ceph-mon[111841]: pgmap v383: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:25.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:24 vm08.local ceph-mon[103420]: pgmap v384: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:25.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:24 vm07.local ceph-mon[111841]: pgmap v384: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:27.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:26 vm08.local ceph-mon[103420]: pgmap v385: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:27.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:26 vm07.local ceph-mon[111841]: pgmap v385: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 
119 GiB / 120 GiB avail 2026-03-09T19:39:29.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:28 vm08.local ceph-mon[103420]: pgmap v386: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:29.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:28 vm07.local ceph-mon[111841]: pgmap v386: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:31.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:30 vm08.local ceph-mon[103420]: pgmap v387: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:31.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:30 vm07.local ceph-mon[111841]: pgmap v387: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:33.064 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:32 vm07.local ceph-mon[111841]: pgmap v388: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:33.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:32 vm08.local ceph-mon[103420]: pgmap v388: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:33.844 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:33 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:39:34.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:33 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:39:35.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:34 vm08.local ceph-mon[103420]: pgmap v389: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:35.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:34 
vm07.local ceph-mon[111841]: pgmap v389: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:37.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:36 vm08.local ceph-mon[103420]: pgmap v390: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:37.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:36 vm07.local ceph-mon[111841]: pgmap v390: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:39.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:38 vm08.local ceph-mon[103420]: pgmap v391: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:39.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:38 vm07.local ceph-mon[111841]: pgmap v391: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:41.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:40 vm08.local ceph-mon[103420]: pgmap v392: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:41.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:40 vm07.local ceph-mon[111841]: pgmap v392: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:43.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:42 vm08.local ceph-mon[103420]: pgmap v393: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:43.173 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:42 vm07.local ceph-mon[111841]: pgmap v393: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:45.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:44 vm08.local ceph-mon[103420]: pgmap v394: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:44 
vm07.local ceph-mon[111841]: pgmap v394: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:47.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:46 vm08.local ceph-mon[103420]: pgmap v395: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:47.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:46 vm07.local ceph-mon[111841]: pgmap v395: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:48 vm08.local ceph-mon[103420]: pgmap v396: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:39:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:48 vm07.local ceph-mon[111841]: pgmap v396: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:39:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:50 vm07.local ceph-mon[111841]: pgmap v397: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:51.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:50 vm08.local ceph-mon[103420]: pgmap v397: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 
cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:39:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:39:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:39:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:39:52.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:39:52.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:39:52.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:39:52.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:39:53.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:52 vm07.local ceph-mon[111841]: pgmap v398: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:53.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:52 vm08.local ceph-mon[103420]: pgmap v398: 65 pgs: 65 active+clean; 214 MiB 
data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:55.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:54 vm07.local ceph-mon[111841]: pgmap v399: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:55.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:54 vm08.local ceph-mon[103420]: pgmap v399: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:57.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:56 vm07.local ceph-mon[111841]: pgmap v400: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:57.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:56 vm08.local ceph-mon[103420]: pgmap v400: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:59.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:39:58 vm08.local ceph-mon[103420]: pgmap v401: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:39:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:39:58 vm07.local ceph-mon[111841]: pgmap v401: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:00 vm07.local ceph-mon[111841]: pgmap v402: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:00 vm07.local ceph-mon[111841]: overall HEALTH_OK 2026-03-09T19:40:01.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:00 vm08.local ceph-mon[103420]: pgmap v402: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:01.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:00 vm08.local ceph-mon[103420]: overall HEALTH_OK 2026-03-09T19:40:03.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:02 vm07.local 
ceph-mon[111841]: pgmap v403: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:03.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:02 vm08.local ceph-mon[103420]: pgmap v403: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:04.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:40:04.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:40:05.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:04 vm08.local ceph-mon[103420]: pgmap v404: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:05.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:04 vm07.local ceph-mon[111841]: pgmap v404: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:07.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:06 vm08.local ceph-mon[103420]: pgmap v405: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:07.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:06 vm07.local ceph-mon[111841]: pgmap v405: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:09.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:09 vm08.local ceph-mon[103420]: pgmap v406: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:09.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:09 vm07.local ceph-mon[111841]: pgmap v406: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB 
avail 2026-03-09T19:40:11.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:11 vm08.local ceph-mon[103420]: pgmap v407: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:11.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:11 vm07.local ceph-mon[111841]: pgmap v407: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:13.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:13 vm08.local ceph-mon[103420]: pgmap v408: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:13.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:13 vm07.local ceph-mon[111841]: pgmap v408: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:15.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:15 vm08.local ceph-mon[103420]: pgmap v409: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:15.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:15 vm07.local ceph-mon[111841]: pgmap v409: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:17.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:17 vm07.local ceph-mon[111841]: pgmap v410: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:17.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:17 vm08.local ceph-mon[103420]: pgmap v410: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:19 vm08.local ceph-mon[103420]: pgmap v411: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:19.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:19 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:40:19.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:19 vm07.local ceph-mon[111841]: pgmap v411: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:19.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:19 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:40:21.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:21 vm08.local ceph-mon[103420]: pgmap v412: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:21.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:21 vm07.local ceph-mon[111841]: pgmap v412: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:23.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:23 vm08.local ceph-mon[103420]: pgmap v413: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:23.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:23 vm07.local ceph-mon[111841]: pgmap v413: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:25.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:25 vm08.local ceph-mon[103420]: pgmap v414: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:25.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:25 vm07.local ceph-mon[111841]: pgmap v414: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:27.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:27 vm08.local ceph-mon[103420]: pgmap v415: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:27.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:27 
vm07.local ceph-mon[111841]: pgmap v415: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:29.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:29 vm08.local ceph-mon[103420]: pgmap v416: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:29.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:29 vm07.local ceph-mon[111841]: pgmap v416: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:31.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:31 vm08.local ceph-mon[103420]: pgmap v417: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:31.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:31 vm07.local ceph-mon[111841]: pgmap v417: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:33.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:33 vm08.local ceph-mon[103420]: pgmap v418: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:33.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:33 vm07.local ceph-mon[111841]: pgmap v418: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:34.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:34 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:40:34.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:34 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:40:35.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:35 vm08.local ceph-mon[103420]: pgmap v419: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 
120 GiB avail 2026-03-09T19:40:35.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:35 vm07.local ceph-mon[111841]: pgmap v419: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:37.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:37 vm08.local ceph-mon[103420]: pgmap v420: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:37.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:37 vm07.local ceph-mon[111841]: pgmap v420: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:38.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:38 vm08.local ceph-mon[103420]: pgmap v421: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:38.478 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:38 vm07.local ceph-mon[111841]: pgmap v421: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:41.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:40 vm08.local ceph-mon[103420]: pgmap v422: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:41.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:40 vm07.local ceph-mon[111841]: pgmap v422: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:43.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:42 vm08.local ceph-mon[103420]: pgmap v423: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:43.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:42 vm07.local ceph-mon[111841]: pgmap v423: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:45.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:44 vm08.local ceph-mon[103420]: pgmap v424: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 
120 GiB avail 2026-03-09T19:40:45.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:44 vm07.local ceph-mon[111841]: pgmap v424: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:47.031 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:46 vm07.local ceph-mon[111841]: pgmap v425: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:47.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:46 vm08.local ceph-mon[103420]: pgmap v425: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:49.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:48 vm08.local ceph-mon[103420]: pgmap v426: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:49.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:48 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:40:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:48 vm07.local ceph-mon[111841]: pgmap v426: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:49.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:48 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:40:51.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:50 vm08.local ceph-mon[103420]: pgmap v427: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:51.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:50 vm07.local ceph-mon[111841]: pgmap v427: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:52.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:51 
vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:40:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:40:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:40:52.095 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:51 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:40:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T19:40:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T19:40:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T19:40:52.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:51 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' 2026-03-09T19:40:53.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:52 vm08.local ceph-mon[103420]: pgmap v428: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:53.228 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:52 vm07.local ceph-mon[111841]: pgmap v428: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:55.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:54 vm08.local ceph-mon[103420]: pgmap v429: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:55.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:54 vm07.local ceph-mon[111841]: pgmap v429: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:57.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:56 vm08.local ceph-mon[103420]: pgmap v430: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:57.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:56 vm07.local ceph-mon[111841]: pgmap v430: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:59.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:40:58 vm08.local ceph-mon[103420]: pgmap v431: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:40:59.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:40:58 vm07.local ceph-mon[111841]: pgmap v431: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:01.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:41:00 vm08.local ceph-mon[103420]: pgmap v432: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:01.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:41:00 vm07.local ceph-mon[111841]: pgmap v432: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:03.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:41:02 vm08.local ceph-mon[103420]: pgmap v433: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:03.228 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:41:02 vm07.local ceph-mon[111841]: pgmap v433: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:04.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:41:03 vm07.local ceph-mon[111841]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:41:04.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:41:03 vm08.local ceph-mon[103420]: from='mgr.24557 192.168.123.107:0/2339048624' entity='mgr.vm07.xacuym' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T19:41:05.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:41:04 vm07.local ceph-mon[111841]: pgmap v434: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:05.345 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:41:04 vm08.local ceph-mon[103420]: pgmap v434: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:07.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:41:06 vm07.local ceph-mon[111841]: pgmap v435: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:07.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:41:06 vm08.local ceph-mon[103420]: pgmap v435: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:09.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:41:08 vm07.local ceph-mon[111841]: pgmap v436: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:09.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:41:08 vm08.local ceph-mon[103420]: pgmap v436: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:11.228 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:41:10 vm07.local ceph-mon[111841]: pgmap v437: 65 
pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:41:11.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:41:10 vm08.local ceph-mon[103420]: pgmap v437: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:41:12.806 ERROR:tasks.cephfs.fuse_mount:process failed to terminate after unmount. This probably indicates a bug within ceph-fuse.
2026-03-09T19:41:12.806 ERROR:teuthology.run_tasks:Manager failed: ceph-fuse
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T19:41:12.807 DEBUG:teuthology.run_tasks:Unwinding manager cephadm
2026-03-09T19:41:12.810 INFO:tasks.cephadm:Teardown begin
2026-03-09T19:41:12.810 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephadm.py", line 2252, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T19:41:12.810 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-09T19:41:12.836 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-09T19:41:12.865 INFO:tasks.cephadm:Disabling cephadm mgr module
2026-03-09T19:41:12.865 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.ceph.io/ceph-ci/ceph:reef shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 -- ceph mgr module disable cephadm
2026-03-09T19:41:13.026 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/mon.vm07/config
2026-03-09T19:41:13.084 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:41:12 vm07.local ceph-mon[111841]: pgmap v438: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail
2026-03-09T19:41:13.197 INFO:teuthology.orchestra.run.vm07.stderr:Error: statfs /etc/ceph/ceph.client.admin.keyring: no such file or directory
2026-03-09T19:41:13.212 DEBUG:teuthology.orchestra.run:got remote process result: 125
2026-03-09T19:41:13.212 INFO:tasks.cephadm:Cleaning up testdir ceph.* files... 2026-03-09T19:41:13.212 DEBUG:teuthology.orchestra.run.vm07:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub 2026-03-09T19:41:13.227 DEBUG:teuthology.orchestra.run.vm08:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub 2026-03-09T19:41:13.243 INFO:tasks.cephadm:Stopping all daemons... 2026-03-09T19:41:13.243 INFO:tasks.cephadm.mon.vm07:Stopping mon.vm07... 2026-03-09T19:41:13.243 DEBUG:teuthology.orchestra.run.vm07:> sudo systemctl stop ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm07 2026-03-09T19:41:13.344 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:41:12 vm08.local ceph-mon[103420]: pgmap v438: 65 pgs: 65 active+clean; 214 MiB data, 939 MiB used, 119 GiB / 120 GiB avail 2026-03-09T19:41:13.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 09 19:41:13 vm07.local systemd[1]: Stopping Ceph mon.vm07 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:41:13.501 DEBUG:teuthology.orchestra.run.vm07:> sudo pkill -f 'journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm07.service' 2026-03-09T19:41:13.534 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T19:41:13.534 INFO:tasks.cephadm.mon.vm07:Stopped mon.vm07 2026-03-09T19:41:13.534 INFO:tasks.cephadm.mon.vm08:Stopping mon.vm08... 2026-03-09T19:41:13.534 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm08 2026-03-09T19:41:13.819 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:41:13 vm08.local systemd[1]: Stopping Ceph mon.vm08 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 
2026-03-09T19:41:13.819 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:41:13 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm08[103416]: 2026-03-09T19:41:13.636+0000 7fb2362ae640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm08 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:41:13.819 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 09 19:41:13 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-mon-vm08[103416]: 2026-03-09T19:41:13.636+0000 7fb2362ae640 -1 mon.vm08@1(peon) e3 *** Got Signal Terminated *** 2026-03-09T19:41:13.907 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@mon.vm08.service' 2026-03-09T19:41:13.943 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T19:41:13.943 INFO:tasks.cephadm.mon.vm08:Stopped mon.vm08 2026-03-09T19:41:13.943 INFO:tasks.cephadm.osd.0:Stopping osd.0... 2026-03-09T19:41:13.943 DEBUG:teuthology.orchestra.run.vm07:> sudo systemctl stop ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.0 2026-03-09T19:41:14.228 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:41:13 vm07.local systemd[1]: Stopping Ceph osd.0 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 
2026-03-09T19:41:14.228 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:41:14 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[119655]: 2026-03-09T19:41:14.051+0000 7f6b1380d640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:41:14.228 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:41:14 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[119655]: 2026-03-09T19:41:14.051+0000 7f6b1380d640 -1 osd.0 102 *** Got signal Terminated *** 2026-03-09T19:41:14.228 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:41:14 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0[119655]: 2026-03-09T19:41:14.051+0000 7f6b1380d640 -1 osd.0 102 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:41:19.358 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:41:19 vm07.local podman[168549]: 2026-03-09 19:41:19.108124818 +0000 UTC m=+5.070206166 container died a203aa2416565f2bebebbedece3a291fe2772c83ac9d0f5d18fed2fc70ffd81e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T19:41:19.358 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 
09 19:41:19 vm07.local podman[168549]: 2026-03-09 19:41:19.137396451 +0000 UTC m=+5.099477799 container remove a203aa2416565f2bebebbedece3a291fe2772c83ac9d0f5d18fed2fc70ffd81e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:41:19.358 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:41:19 vm07.local bash[168549]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0 2026-03-09T19:41:19.358 INFO:journalctl@ceph.osd.0.vm07.stdout:Mar 09 19:41:19 vm07.local podman[168631]: 2026-03-09 19:41:19.321615401 +0000 UTC m=+0.016999633 container create e90d0b5d7454d61093ee5c297e40abca7b73ba57b045a21c39d646ff7f64c8d6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-0-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, 
org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default) 2026-03-09T19:41:19.566 DEBUG:teuthology.orchestra.run.vm07:> sudo pkill -f 'journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.0.service' 2026-03-09T19:41:19.601 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T19:41:19.601 INFO:tasks.cephadm.osd.0:Stopped osd.0 2026-03-09T19:41:19.601 INFO:tasks.cephadm.osd.1:Stopping osd.1... 2026-03-09T19:41:19.601 DEBUG:teuthology.orchestra.run.vm07:> sudo systemctl stop ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.1 2026-03-09T19:41:19.978 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:41:19 vm07.local systemd[1]: Stopping Ceph osd.1 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:41:19.978 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:41:19 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[126243]: 2026-03-09T19:41:19.756+0000 7ff21fcc3640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:41:19.978 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:41:19 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[126243]: 2026-03-09T19:41:19.757+0000 7ff21fcc3640 -1 osd.1 102 *** Got signal Terminated *** 2026-03-09T19:41:19.978 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:41:19 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1[126243]: 2026-03-09T19:41:19.757+0000 7ff21fcc3640 -1 osd.1 102 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:41:25.061 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:41:24 vm07.local podman[168731]: 2026-03-09 19:41:24.792019316 +0000 UTC m=+5.048988441 container died 36b65f1069e55fbe78505c35f7e379b4cd8768055022b8ec0c7e226e3b04371c 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid) 2026-03-09T19:41:25.061 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:41:24 vm07.local podman[168731]: 2026-03-09 19:41:24.821692321 +0000 UTC m=+5.078661445 container remove 36b65f1069e55fbe78505c35f7e379b4cd8768055022b8ec0c7e226e3b04371c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T19:41:25.061 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:41:24 vm07.local bash[168731]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1 
2026-03-09T19:41:25.061 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:41:24 vm07.local podman[168800]: 2026-03-09 19:41:24.974272756 +0000 UTC m=+0.021622245 container create 0005eea79b3467f91a3ec8064d15ce728982b04327c11155c324b3b9fedfc52f (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-deactivate, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T19:41:25.061 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:41:25 vm07.local podman[168800]: 2026-03-09 19:41:25.027458454 +0000 UTC m=+0.074807954 container init 0005eea79b3467f91a3ec8064d15ce728982b04327c11155c324b3b9fedfc52f (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True) 2026-03-09T19:41:25.061 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:41:25 vm07.local podman[168800]: 2026-03-09 19:41:25.030668394 +0000 UTC m=+0.078017883 container start 0005eea79b3467f91a3ec8064d15ce728982b04327c11155c324b3b9fedfc52f (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-deactivate, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid) 2026-03-09T19:41:25.061 INFO:journalctl@ceph.osd.1.vm07.stdout:Mar 09 19:41:25 vm07.local podman[168800]: 2026-03-09 19:41:25.03345503 +0000 UTC m=+0.080804519 container attach 0005eea79b3467f91a3ec8064d15ce728982b04327c11155c324b3b9fedfc52f (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-1-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T19:41:25.183 DEBUG:teuthology.orchestra.run.vm07:> sudo pkill -f 'journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.1.service' 2026-03-09T19:41:25.217 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T19:41:25.218 INFO:tasks.cephadm.osd.1:Stopped osd.1 2026-03-09T19:41:25.218 INFO:tasks.cephadm.osd.2:Stopping osd.2... 2026-03-09T19:41:25.218 DEBUG:teuthology.orchestra.run.vm07:> sudo systemctl stop ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.2 2026-03-09T19:41:25.372 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:25 vm07.local systemd[1]: Stopping Ceph osd.2 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:41:25.728 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:25 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[132445]: 2026-03-09T19:41:25.372+0000 7fcd27c80640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:41:25.728 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:25 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[132445]: 2026-03-09T19:41:25.372+0000 7fcd27c80640 -1 osd.2 102 *** Got signal Terminated *** 2026-03-09T19:41:25.728 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:25 vm07.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2[132445]: 2026-03-09T19:41:25.372+0000 7fcd27c80640 -1 osd.2 102 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:41:30.686 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:30 vm07.local podman[168894]: 2026-03-09 19:41:30.423628943 +0000 UTC m=+5.066003009 container died 
1b8bc1f96eb7dc5de3d08e7e26cff9d5eb54199781ba7059f7faf425b02c5d60 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T19:41:30.686 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:30 vm07.local podman[168894]: 2026-03-09 19:41:30.451119202 +0000 UTC m=+5.093493268 container remove 1b8bc1f96eb7dc5de3d08e7e26cff9d5eb54199781ba7059f7faf425b02c5d60 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T19:41:30.686 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:30 vm07.local bash[168894]: 
ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2 2026-03-09T19:41:30.686 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:30 vm07.local podman[168960]: 2026-03-09 19:41:30.595190309 +0000 UTC m=+0.016409799 container create 1cd41ee66f23d26293dc7519304acfe07612094f21985be876e880d99937e778 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-deactivate, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_REF=squid) 2026-03-09T19:41:30.686 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:30 vm07.local podman[168960]: 2026-03-09 19:41:30.635626722 +0000 UTC m=+0.056846222 container init 1cd41ee66f23d26293dc7519304acfe07612094f21985be876e880d99937e778 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-deactivate, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, 
ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T19:41:30.686 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:30 vm07.local podman[168960]: 2026-03-09 19:41:30.639876167 +0000 UTC m=+0.061095657 container start 1cd41ee66f23d26293dc7519304acfe07612094f21985be876e880d99937e778 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-deactivate, ceph=True, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T19:41:30.686 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:30 vm07.local podman[168960]: 2026-03-09 19:41:30.64158217 +0000 UTC m=+0.062801670 container attach 1cd41ee66f23d26293dc7519304acfe07612094f21985be876e880d99937e778 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-2-deactivate, ceph=True, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:41:30.686 INFO:journalctl@ceph.osd.2.vm07.stdout:Mar 09 19:41:30 vm07.local podman[168960]: 2026-03-09 19:41:30.588865008 +0000 UTC m=+0.010084509 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:41:30.819 DEBUG:teuthology.orchestra.run.vm07:> sudo pkill -f 'journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.2.service' 2026-03-09T19:41:30.858 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T19:41:30.858 INFO:tasks.cephadm.osd.2:Stopped osd.2 2026-03-09T19:41:30.858 INFO:tasks.cephadm.osd.3:Stopping osd.3... 2026-03-09T19:41:30.858 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.3 2026-03-09T19:41:31.273 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:41:30 vm08.local systemd[1]: Stopping Ceph osd.3 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 
2026-03-09T19:41:31.273 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:41:30 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[112967]: 2026-03-09T19:41:30.971+0000 7f76666ea640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:41:31.273 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:41:30 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[112967]: 2026-03-09T19:41:30.971+0000 7f76666ea640 -1 osd.3 102 *** Got signal Terminated *** 2026-03-09T19:41:31.273 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:41:30 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3[112967]: 2026-03-09T19:41:30.971+0000 7f76666ea640 -1 osd.3 102 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:41:36.278 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:41:36 vm08.local podman[143554]: 2026-03-09 19:41:36.005467881 +0000 UTC m=+5.048367342 container died bde783ff786f7d70024e8d83200249af84676e292292afdc9a39c62370406acc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS) 2026-03-09T19:41:36.279 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 
09 19:41:36 vm08.local podman[143554]: 2026-03-09 19:41:36.040048513 +0000 UTC m=+5.082947974 container remove bde783ff786f7d70024e8d83200249af84676e292292afdc9a39c62370406acc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T19:41:36.279 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:41:36 vm08.local bash[143554]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3 2026-03-09T19:41:36.279 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:41:36 vm08.local podman[143622]: 2026-03-09 19:41:36.188216732 +0000 UTC m=+0.018722667 container create f6a0ae2fae35561de137228d5f6445846c6a589b518bb49869aeadda4fe3c557 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2) 2026-03-09T19:41:36.279 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:41:36 vm08.local podman[143622]: 2026-03-09 19:41:36.238148051 +0000 UTC m=+0.068653986 container init f6a0ae2fae35561de137228d5f6445846c6a589b518bb49869aeadda4fe3c557 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-deactivate, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2) 2026-03-09T19:41:36.279 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:41:36 vm08.local podman[143622]: 2026-03-09 19:41:36.241228037 +0000 UTC m=+0.071733972 container start f6a0ae2fae35561de137228d5f6445846c6a589b518bb49869aeadda4fe3c557 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default) 2026-03-09T19:41:36.279 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 09 19:41:36 vm08.local podman[143622]: 2026-03-09 19:41:36.24276841 +0000 UTC m=+0.073274345 container attach f6a0ae2fae35561de137228d5f6445846c6a589b518bb49869aeadda4fe3c557 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-3-deactivate, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.build-date=20260223) 2026-03-09T19:41:36.421 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.3.service' 2026-03-09T19:41:36.459 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T19:41:36.459 INFO:tasks.cephadm.osd.3:Stopped osd.3 2026-03-09T19:41:36.459 INFO:tasks.cephadm.osd.4:Stopping osd.4... 
2026-03-09T19:41:36.459 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.4 2026-03-09T19:41:36.845 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:41:36 vm08.local systemd[1]: Stopping Ceph osd.4 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:41:36.845 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:41:36 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[118709]: 2026-03-09T19:41:36.616+0000 7fb8501ba640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:41:36.845 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:41:36 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[118709]: 2026-03-09T19:41:36.616+0000 7fb8501ba640 -1 osd.4 102 *** Got signal Terminated *** 2026-03-09T19:41:36.845 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:41:36 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4[118709]: 2026-03-09T19:41:36.616+0000 7fb8501ba640 -1 osd.4 102 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:41:40.594 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:40 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:40.146+0000 7fe54905f640 -1 osd.5 102 heartbeat_check: no reply from 192.168.123.107:6806 osd.0 since back 2026-03-09T19:41:17.813777+0000 front 2026-03-09T19:41:17.813704+0000 (oldest deadline 2026-03-09T19:41:39.513548+0000) 2026-03-09T19:41:41.594 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:41 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:41.180+0000 7fe54905f640 -1 osd.5 102 heartbeat_check: no reply from 192.168.123.107:6806 osd.0 since back 2026-03-09T19:41:17.813777+0000 front 2026-03-09T19:41:17.813704+0000 (oldest deadline 2026-03-09T19:41:39.513548+0000) 2026-03-09T19:41:41.925 
INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:41:41 vm08.local podman[143718]: 2026-03-09 19:41:41.652388909 +0000 UTC m=+5.048353116 container died 588104e3b774b502dcbbb887e9f5d42504fb76d3bf824b6c834e6e5bcb7360aa (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS) 2026-03-09T19:41:41.925 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:41:41 vm08.local podman[143718]: 2026-03-09 19:41:41.676298355 +0000 UTC m=+5.072262562 container remove 588104e3b774b502dcbbb887e9f5d42504fb76d3bf824b6c834e6e5bcb7360aa (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T19:41:41.925 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:41:41 vm08.local bash[143718]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4 2026-03-09T19:41:41.925 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:41:41 vm08.local podman[143797]: 2026-03-09 19:41:41.8366759 +0000 UTC m=+0.017521219 container create 81d5b0fa5455d3ab9c4824fe752421bdceb698e216ec52c6aad339f76d82bf15 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-deactivate, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) 2026-03-09T19:41:41.925 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:41:41 vm08.local podman[143797]: 2026-03-09 19:41:41.877980228 +0000 UTC m=+0.058825567 container init 81d5b0fa5455d3ab9c4824fe752421bdceb698e216ec52c6aad339f76d82bf15 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2) 2026-03-09T19:41:41.925 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:41:41 vm08.local podman[143797]: 2026-03-09 19:41:41.881428454 +0000 UTC m=+0.062273773 container start 81d5b0fa5455d3ab9c4824fe752421bdceb698e216ec52c6aad339f76d82bf15 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-4-deactivate, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T19:41:41.925 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 09 19:41:41 vm08.local podman[143797]: 2026-03-09 19:41:41.828582771 +0000 UTC m=+0.009428090 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:41:42.087 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.4.service' 2026-03-09T19:41:42.122 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T19:41:42.122 INFO:tasks.cephadm.osd.4:Stopped osd.4 
2026-03-09T19:41:42.122 INFO:tasks.cephadm.osd.5:Stopping osd.5... 2026-03-09T19:41:42.122 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.5 2026-03-09T19:41:42.187 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:42.148+0000 7fe54905f640 -1 osd.5 102 heartbeat_check: no reply from 192.168.123.107:6806 osd.0 since back 2026-03-09T19:41:17.813777+0000 front 2026-03-09T19:41:17.813704+0000 (oldest deadline 2026-03-09T19:41:39.513548+0000) 2026-03-09T19:41:42.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:42 vm08.local systemd[1]: Stopping Ceph osd.5 for 17715774-1bed-11f1-9ad8-1bc9d74ff594... 2026-03-09T19:41:42.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:42.277+0000 7fe54d258640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T19:41:42.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:42.277+0000 7fe54d258640 -1 osd.5 102 *** Got signal Terminated *** 2026-03-09T19:41:42.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:42 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:42.277+0000 7fe54d258640 -1 osd.5 102 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T19:41:43.595 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:43 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:43.189+0000 7fe54905f640 -1 osd.5 102 heartbeat_check: no reply from 192.168.123.107:6806 osd.0 since back 2026-03-09T19:41:17.813777+0000 front 2026-03-09T19:41:17.813704+0000 (oldest 
deadline 2026-03-09T19:41:39.513548+0000) 2026-03-09T19:41:44.594 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:44 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:44.168+0000 7fe54905f640 -1 osd.5 102 heartbeat_check: no reply from 192.168.123.107:6806 osd.0 since back 2026-03-09T19:41:17.813777+0000 front 2026-03-09T19:41:17.813704+0000 (oldest deadline 2026-03-09T19:41:39.513548+0000) 2026-03-09T19:41:45.594 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:45 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:45.120+0000 7fe54905f640 -1 osd.5 102 heartbeat_check: no reply from 192.168.123.107:6806 osd.0 since back 2026-03-09T19:41:17.813777+0000 front 2026-03-09T19:41:17.813704+0000 (oldest deadline 2026-03-09T19:41:39.513548+0000) 2026-03-09T19:41:46.594 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:46 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:46.142+0000 7fe54905f640 -1 osd.5 102 heartbeat_check: no reply from 192.168.123.107:6806 osd.0 since back 2026-03-09T19:41:17.813777+0000 front 2026-03-09T19:41:17.813704+0000 (oldest deadline 2026-03-09T19:41:39.513548+0000) 2026-03-09T19:41:47.492 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:47.109+0000 7fe54905f640 -1 osd.5 102 heartbeat_check: no reply from 192.168.123.107:6806 osd.0 since back 2026-03-09T19:41:17.813777+0000 front 2026-03-09T19:41:17.813704+0000 (oldest deadline 2026-03-09T19:41:39.513548+0000) 2026-03-09T19:41:47.492 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5[124469]: 2026-03-09T19:41:47.109+0000 7fe54905f640 -1 osd.5 102 heartbeat_check: no reply from 192.168.123.107:6814 osd.1 since back 2026-03-09T19:41:24.714605+0000 front 2026-03-09T19:41:24.714591+0000 (oldest deadline 2026-03-09T19:41:46.414452+0000) 
2026-03-09T19:41:47.492 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local podman[143893]: 2026-03-09 19:41:47.309045461 +0000 UTC m=+5.045541353 container died 7f9a10e5f49da9f931793b109206913b1f1a5415b83c86bee46c3bf709ea8a55 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T19:41:47.492 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local podman[143893]: 2026-03-09 19:41:47.337023785 +0000 UTC m=+5.073519677 container remove 7f9a10e5f49da9f931793b109206913b1f1a5415b83c86bee46c3bf709ea8a55 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, OSD_FLAVOR=default) 2026-03-09T19:41:47.492 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local bash[143893]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5 2026-03-09T19:41:47.719 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.5.service' 2026-03-09T19:41:47.786 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local podman[143960]: 2026-03-09 19:41:47.493756887 +0000 UTC m=+0.020175600 container create 676b8d9f232437056023e02b8ad48fe479e0ecf1bfcb9a82efa25bf984bff421 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2) 2026-03-09T19:41:47.786 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local podman[143960]: 2026-03-09 19:41:47.537107516 +0000 UTC m=+0.063526238 container init 676b8d9f232437056023e02b8ad48fe479e0ecf1bfcb9a82efa25bf984bff421 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default) 2026-03-09T19:41:47.786 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local podman[143960]: 2026-03-09 19:41:47.540765634 +0000 UTC m=+0.067184356 container start 676b8d9f232437056023e02b8ad48fe479e0ecf1bfcb9a82efa25bf984bff421 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T19:41:47.786 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local podman[143960]: 2026-03-09 19:41:47.543770829 +0000 UTC m=+0.070189561 container attach 676b8d9f232437056023e02b8ad48fe479e0ecf1bfcb9a82efa25bf984bff421 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223) 2026-03-09T19:41:47.786 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local podman[143960]: 2026-03-09 19:41:47.485884971 +0000 UTC m=+0.012303704 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T19:41:47.786 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local podman[143960]: 2026-03-09 19:41:47.676643046 +0000 UTC m=+0.203061768 container died 676b8d9f232437056023e02b8ad48fe479e0ecf1bfcb9a82efa25bf984bff421 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , 
org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T19:41:47.786 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local podman[143960]: 2026-03-09 19:41:47.699640345 +0000 UTC m=+0.226059067 container remove 676b8d9f232437056023e02b8ad48fe479e0ecf1bfcb9a82efa25bf984bff421 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594-osd-5-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T19:41:47.786 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.5.service: Deactivated successfully. 2026-03-09T19:41:47.786 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local systemd[1]: Stopped Ceph osd.5 for 17715774-1bed-11f1-9ad8-1bc9d74ff594. 2026-03-09T19:41:47.786 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 09 19:41:47 vm08.local systemd[1]: ceph-17715774-1bed-11f1-9ad8-1bc9d74ff594@osd.5.service: Consumed 5.666s CPU time. 
2026-03-09T19:41:47.795 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T19:41:47.795 INFO:tasks.cephadm.osd.5:Stopped osd.5 2026-03-09T19:41:47.795 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 --force --keep-logs 2026-03-09T19:41:47.908 INFO:teuthology.orchestra.run.vm07.stdout:Deleting cluster with fsid: 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:41:49.453 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm07.stderr:ceph-fuse[96354]: fuse finished with error 0 and tester_r 0 2026-03-09T19:41:56.647 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 --force --keep-logs 2026-03-09T19:41:56.762 INFO:teuthology.orchestra.run.vm08.stdout:Deleting cluster with fsid: 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:42:01.974 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-09T19:42:02.003 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-09T19:42:02.030 INFO:tasks.cephadm:Archiving crash dumps... 2026-03-09T19:42:02.031 DEBUG:teuthology.misc:Transferring archived files from vm07:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/crash to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/617/remote/vm07/crash 2026-03-09T19:42:02.031 DEBUG:teuthology.orchestra.run.vm07:> sudo tar c -f - -C /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/crash -- . 
2026-03-09T19:42:02.068 INFO:teuthology.orchestra.run.vm07.stderr:tar: /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/crash: Cannot open: No such file or directory 2026-03-09T19:42:02.068 INFO:teuthology.orchestra.run.vm07.stderr:tar: Error is not recoverable: exiting now 2026-03-09T19:42:02.069 DEBUG:teuthology.misc:Transferring archived files from vm08:/var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/crash to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/617/remote/vm08/crash 2026-03-09T19:42:02.070 DEBUG:teuthology.orchestra.run.vm08:> sudo tar c -f - -C /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/crash -- . 2026-03-09T19:42:02.101 INFO:teuthology.orchestra.run.vm08.stderr:tar: /var/lib/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/crash: Cannot open: No such file or directory 2026-03-09T19:42:02.101 INFO:teuthology.orchestra.run.vm08.stderr:tar: Error is not recoverable: exiting now 2026-03-09T19:42:02.102 INFO:tasks.cephadm:Checking cluster log for badness... 
2026-03-09T19:42:02.103 DEBUG:teuthology.orchestra.run.vm07:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1 2026-03-09T19:42:02.171 INFO:tasks.cephadm:Compressing logs... 
2026-03-09T19:42:02.171 DEBUG:teuthology.orchestra.run.vm07:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-09T19:42:02.172 DEBUG:teuthology.orchestra.run.vm08:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-09T19:42:02.194 INFO:teuthology.orchestra.run.vm07.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory 2026-03-09T19:42:02.194 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-09T19:42:02.195 INFO:teuthology.orchestra.run.vm08.stderr:find: gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-09T19:42:02.195 INFO:teuthology.orchestra.run.vm08.stderr:‘/var/log/rbd-target-api’: No such file or directory 2026-03-09T19:42:02.195 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mon.vm07.log 2026-03-09T19:42:02.195 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.log 2026-03-09T19:42:02.196 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-volume.log 2026-03-09T19:42:02.197 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-client.ceph-exporter.vm08.log 2026-03-09T19:42:02.198 INFO:teuthology.orchestra.run.vm08.stderr: 92.7% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-09T19:42:02.198 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mgr.vm08.mxylvw.log 2026-03-09T19:42:02.198 
INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mon.vm08.log 2026-03-09T19:42:02.207 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mgr.vm08.mxylvw.log: /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-client.ceph-exporter.vm08.log: 94.0% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-client.ceph-exporter.vm08.log.gz 2026-03-09T19:42:02.208 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mon.vm07.log: 90.7% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-09T19:42:02.208 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mgr.vm07.xacuym.log 2026-03-09T19:42:02.210 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.log: 87.7% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.log.gz 2026-03-09T19:42:02.210 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.audit.log 2026-03-09T19:42:02.210 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.audit.log 2026-03-09T19:42:02.210 INFO:teuthology.orchestra.run.vm08.stderr: 92.4% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-volume.log.gz 2026-03-09T19:42:02.211 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mgr.vm07.xacuym.log: gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.cephadm.log 2026-03-09T19:42:02.211 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mon.vm08.log: gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.log 2026-03-09T19:42:02.219 
INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.audit.log: 89.2% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mgr.vm08.mxylvw.log.gz 2026-03-09T19:42:02.220 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.cephadm.log 2026-03-09T19:42:02.220 INFO:teuthology.orchestra.run.vm08.stderr: 91.7% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.audit.log.gz 2026-03-09T19:42:02.221 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.audit.log: 91.6% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.audit.log.gz 2026-03-09T19:42:02.222 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.log: 87.8% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.log.gz 2026-03-09T19:42:02.222 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.3.log 2026-03-09T19:42:02.222 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-volume.log 2026-03-09T19:42:02.222 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.cephadm.log: 85.4% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.cephadm.log.gz 2026-03-09T19:42:02.222 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.4.log 2026-03-09T19:42:02.223 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.3.log: gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.5.log 2026-03-09T19:42:02.224 
INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.cephadm.log: 85.7% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph.cephadm.log.gz 2026-03-09T19:42:02.228 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-client.ceph-exporter.vm07.log 2026-03-09T19:42:02.237 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.0.log 2026-03-09T19:42:02.239 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.4.log: gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm08.zcaqju.log 2026-03-09T19:42:02.241 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-client.ceph-exporter.vm07.log: 94.0% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-client.ceph-exporter.vm07.log.gz 2026-03-09T19:42:02.241 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.1.log 2026-03-09T19:42:02.249 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.0.log: 92.7% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-volume.log.gz 2026-03-09T19:42:02.250 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.2.log 2026-03-09T19:42:02.252 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.5.log: gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm08.jwsqrf.log 2026-03-09T19:42:02.253 
INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.1.log: gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm07.uizncw.log 2026-03-09T19:42:02.259 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.2.log: gzip -5 --verbose -- /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm07.zkmcyw.log 2026-03-09T19:42:02.267 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm08.zcaqju.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.1.log 2026-03-09T19:42:02.274 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm07.uizncw.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.0.log 2026-03-09T19:42:02.308 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm07.zkmcyw.log: /var/log/ceph/ceph-client.0.log: 94.5% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm07.zkmcyw.log.gz 2026-03-09T19:42:02.745 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm08.jwsqrf.log: /var/log/ceph/ceph-client.1.log: 92.2% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mon.vm08.log.gz 2026-03-09T19:42:02.805 INFO:teuthology.orchestra.run.vm07.stderr: 89.5% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mgr.vm07.xacuym.log.gz 2026-03-09T19:42:03.651 INFO:teuthology.orchestra.run.vm07.stderr: 90.5% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mon.vm07.log.gz 2026-03-09T19:42:10.228 INFO:teuthology.orchestra.run.vm07.stderr: 93.9% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.0.log.gz 2026-03-09T19:42:11.235 INFO:teuthology.orchestra.run.vm07.stderr: 93.6% -- 
replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.2.log.gz 2026-03-09T19:42:11.870 INFO:teuthology.orchestra.run.vm07.stderr: 94.8% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm07.uizncw.log.gz 2026-03-09T19:42:12.055 INFO:teuthology.orchestra.run.vm07.stderr: 93.6% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.1.log.gz 2026-03-09T19:42:12.128 INFO:teuthology.orchestra.run.vm08.stderr: 93.6% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.4.log.gz 2026-03-09T19:42:13.885 INFO:teuthology.orchestra.run.vm08.stderr: 94.8% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm08.jwsqrf.log.gz 2026-03-09T19:42:14.408 INFO:teuthology.orchestra.run.vm08.stderr: 93.7% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.5.log.gz 2026-03-09T19:42:15.761 INFO:teuthology.orchestra.run.vm08.stderr: 93.7% -- replaced with /var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-osd.3.log.gz 2026-03-09T19:42:16.493 INFO:teuthology.orchestra.run.vm07.stderr:gzip: /var/log/ceph/ceph-client.0.log: file size changed while zipping 2026-03-09T19:42:16.494 INFO:teuthology.orchestra.run.vm07.stderr: 93.2% -- replaced with /var/log/ceph/ceph-client.0.log.gz 2026-03-09T19:42:16.495 INFO:teuthology.orchestra.run.vm07.stderr: 2026-03-09T19:42:16.495 INFO:teuthology.orchestra.run.vm07.stderr:real 0m14.311s 2026-03-09T19:42:16.495 INFO:teuthology.orchestra.run.vm07.stderr:user 0m23.002s 2026-03-09T19:42:16.495 INFO:teuthology.orchestra.run.vm07.stderr:sys 0m1.109s 2026-03-09T19:42:19.443 INFO:teuthology.orchestra.run.vm08.stderr:gzip: /var/log/ceph/ceph-client.1.log: file size changed while zipping 2026-03-09T19:42:19.443 INFO:teuthology.orchestra.run.vm08.stderr: 93.3% -- replaced with /var/log/ceph/ceph-client.1.log.gz 2026-03-09T19:43:14.957 INFO:teuthology.orchestra.run.vm08.stderr: 92.9% -- replaced with 
/var/log/ceph/17715774-1bed-11f1-9ad8-1bc9d74ff594/ceph-mds.cephfs.vm08.zcaqju.log.gz 2026-03-09T19:43:14.960 INFO:teuthology.orchestra.run.vm08.stderr: 2026-03-09T19:43:14.960 INFO:teuthology.orchestra.run.vm08.stderr:real 1m12.774s 2026-03-09T19:43:14.960 INFO:teuthology.orchestra.run.vm08.stderr:user 1m23.419s 2026-03-09T19:43:14.960 INFO:teuthology.orchestra.run.vm08.stderr:sys 0m5.449s 2026-03-09T19:43:14.961 INFO:tasks.cephadm:Archiving logs... 2026-03-09T19:43:14.961 DEBUG:teuthology.misc:Transferring archived files from vm07:/var/log/ceph to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/617/remote/vm07/log 2026-03-09T19:43:14.961 DEBUG:teuthology.orchestra.run.vm07:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-09T19:43:16.170 DEBUG:teuthology.misc:Transferring archived files from vm08:/var/log/ceph to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/617/remote/vm08/log 2026-03-09T19:43:16.171 DEBUG:teuthology.orchestra.run.vm08:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-09T19:43:21.013 INFO:tasks.cephadm:Removing cluster... 2026-03-09T19:43:21.013 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 --force 2026-03-09T19:43:21.120 INFO:teuthology.orchestra.run.vm07.stdout:Deleting cluster with fsid: 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:43:21.384 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 17715774-1bed-11f1-9ad8-1bc9d74ff594 --force 2026-03-09T19:43:21.511 INFO:teuthology.orchestra.run.vm08.stdout:Deleting cluster with fsid: 17715774-1bed-11f1-9ad8-1bc9d74ff594 2026-03-09T19:43:21.846 INFO:tasks.cephadm:Removing cephadm ... 
2026-03-09T19:43:21.846 DEBUG:teuthology.orchestra.run.vm07:> rm -rf /home/ubuntu/cephtest/cephadm 2026-03-09T19:43:21.862 DEBUG:teuthology.orchestra.run.vm08:> rm -rf /home/ubuntu/cephtest/cephadm 2026-03-09T19:43:21.878 INFO:tasks.cephadm:Teardown complete 2026-03-09T19:43:21.878 DEBUG:teuthology.run_tasks:Unwinding manager install 2026-03-09T19:43:21.881 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/teuthology/teuthology/task/install/__init__.py", line 644, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T19:43:21.881 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer... 
2026-03-09T19:43:21.881 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-09T19:43:21.904 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-09T19:43:21.952 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 2026-03-09T19:43:21.952 DEBUG:teuthology.orchestra.run.vm07:> 2026-03-09T19:43:21.952 DEBUG:teuthology.orchestra.run.vm07:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-09T19:43:21.952 DEBUG:teuthology.orchestra.run.vm07:> sudo yum -y remove $d || true 2026-03-09T19:43:21.952 DEBUG:teuthology.orchestra.run.vm07:> done 2026-03-09T19:43:21.958 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 
2026-03-09T19:43:21.958 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-09T19:43:21.958 DEBUG:teuthology.orchestra.run.vm08:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-09T19:43:21.958 DEBUG:teuthology.orchestra.run.vm08:> sudo yum -y remove $d || true 2026-03-09T19:43:21.958 DEBUG:teuthology.orchestra.run.vm08:> done 2026-03-09T19:43:22.202 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout:Removing: 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 31 M 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies: 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout:Remove 2 Packages 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:22.203 
INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 31 M 2026-03-09T19:43:22.203 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check 2026-03-09T19:43:22.207 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded. 2026-03-09T19:43:22.207 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test 2026-03-09T19:43:22.225 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded. 2026-03-09T19:43:22.225 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction 2026-03-09T19:43:22.257 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1 2026-03-09T19:43:22.285 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T19:43:22.285 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:43:22.285 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-09T19:43:22.285 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-09T19:43:22.285 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-09T19:43:22.285 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:22.287 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T19:43:22.293 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 
2026-03-09T19:43:22.294 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:22.294 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size 2026-03-09T19:43:22.294 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:22.294 INFO:teuthology.orchestra.run.vm08.stdout:Removing: 2026-03-09T19:43:22.294 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 31 M 2026-03-09T19:43:22.294 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies: 2026-03-09T19:43:22.294 INFO:teuthology.orchestra.run.vm08.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-09T19:43:22.294 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:22.295 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary 2026-03-09T19:43:22.295 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:22.295 INFO:teuthology.orchestra.run.vm08.stdout:Remove 2 Packages 2026-03-09T19:43:22.295 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:22.295 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 31 M 2026-03-09T19:43:22.295 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check 2026-03-09T19:43:22.296 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T19:43:22.299 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded. 2026-03-09T19:43:22.299 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test 2026-03-09T19:43:22.311 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T19:43:22.313 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded. 
2026-03-09T19:43:22.313 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction 2026-03-09T19:43:22.344 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1 2026-03-09T19:43:22.368 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T19:43:22.368 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T19:43:22.368 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-09T19:43:22.368 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-09T19:43:22.368 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-09T19:43:22.368 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:22.371 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T19:43:22.385 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T19:43:22.387 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T19:43:22.388 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T19:43:22.400 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T19:43:22.439 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T19:43:22.439 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:22.439 INFO:teuthology.orchestra.run.vm07.stdout:Removed: 2026-03-09T19:43:22.439 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-09T19:43:22.439 
INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:22.439 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-09T19:43:22.489 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T19:43:22.489 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T19:43:22.534 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T19:43:22.534 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:22.534 INFO:teuthology.orchestra.run.vm08.stdout:Removed: 2026-03-09T19:43:22.534 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw-2:18.2.7-1055.gab47f43c.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-09T19:43:22.534 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:22.534 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-09T19:43:22.663 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 2026-03-09T19:43:22.663 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:22.663 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size 2026-03-09T19:43:22.663 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:22.663 INFO:teuthology.orchestra.run.vm07.stdout:Removing: 2026-03-09T19:43:22.663 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 149 M 2026-03-09T19:43:22.664 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies: 2026-03-09T19:43:22.664 INFO:teuthology.orchestra.run.vm07.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-09T19:43:22.664 INFO:teuthology.orchestra.run.vm07.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-09T19:43:22.664 INFO:teuthology.orchestra.run.vm07.stdout: xmlstarlet x86_64 
1.6.1-20.el9 @appstream 195 k 2026-03-09T19:43:22.664 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:22.664 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary 2026-03-09T19:43:22.664 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:22.664 INFO:teuthology.orchestra.run.vm07.stdout:Remove 4 Packages 2026-03-09T19:43:22.664 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:22.664 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 151 M 2026-03-09T19:43:22.664 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check 2026-03-09T19:43:22.667 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded. 2026-03-09T19:43:22.667 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test 2026-03-09T19:43:22.692 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded. 2026-03-09T19:43:22.693 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction 2026-03-09T19:43:22.740 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 
2026-03-09T19:43:22.740 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:22.740 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size 2026-03-09T19:43:22.740 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:22.740 INFO:teuthology.orchestra.run.vm08.stdout:Removing: 2026-03-09T19:43:22.740 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 149 M 2026-03-09T19:43:22.740 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies: 2026-03-09T19:43:22.741 INFO:teuthology.orchestra.run.vm08.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-09T19:43:22.741 INFO:teuthology.orchestra.run.vm08.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-09T19:43:22.741 INFO:teuthology.orchestra.run.vm08.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k 2026-03-09T19:43:22.741 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:22.741 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary 2026-03-09T19:43:22.741 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:22.741 INFO:teuthology.orchestra.run.vm08.stdout:Remove 4 Packages 2026-03-09T19:43:22.741 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:22.741 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 151 M 2026-03-09T19:43:22.741 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check 2026-03-09T19:43:22.743 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded. 
2026-03-09T19:43:22.743 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test 2026-03-09T19:43:22.746 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1 2026-03-09T19:43:22.754 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 1/4 2026-03-09T19:43:22.757 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4 2026-03-09T19:43:22.761 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4 2026-03-09T19:43:22.769 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded. 2026-03-09T19:43:22.769 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction 2026-03-09T19:43:22.777 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T19:43:22.824 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1 2026-03-09T19:43:22.831 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 1/4 2026-03-09T19:43:22.835 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4 2026-03-09T19:43:22.845 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T19:43:22.845 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 1/4 2026-03-09T19:43:22.845 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4 2026-03-09T19:43:22.845 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4 2026-03-09T19:43:22.853 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4 2026-03-09T19:43:22.869 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T19:43:22.906 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4 2026-03-09T19:43:22.906 INFO:teuthology.orchestra.run.vm07.stdout: 
2026-03-09T19:43:22.906 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-09T19:43:22.906 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-09T19:43:22.906 INFO:teuthology.orchestra.run.vm07.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-09T19:43:22.906 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:22.906 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:22.948 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-09T19:43:22.948 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 1/4
2026-03-09T19:43:22.948 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-09T19:43:22.948 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-09T19:43:23.006 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-09T19:43:23.006 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.006 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-09T19:43:23.006 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test-2:18.2.7-1055.gab47f43c.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-09T19:43:23.006 INFO:teuthology.orchestra.run.vm08.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-09T19:43:23.006 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.006 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:23.155 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout: ceph x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 0
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mds x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 6.4 M
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 18 M
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout: ceph-osd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 58 M
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout:Remove 8 Packages
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 84 M
2026-03-09T19:43:23.156 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-09T19:43:23.159 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-09T19:43:23.159 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-09T19:43:23.183 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-09T19:43:23.183 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-09T19:43:23.224 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-09T19:43:23.226 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/8
2026-03-09T19:43:23.248 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8
2026-03-09T19:43:23.248 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:23.248 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-09T19:43:23.248 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-09T19:43:23.248 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-09T19:43:23.248 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:23.251 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8
2026-03-09T19:43:23.261 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8
2026-03-09T19:43:23.263 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout: ceph x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 0
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mds x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 6.4 M
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 18 M
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout: ceph-osd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 58 M
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout:Remove 8 Packages
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 84 M
2026-03-09T19:43:23.264 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-09T19:43:23.268 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-09T19:43:23.268 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-09T19:43:23.278 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T19:43:23.278 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-09T19:43:23.278 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:23.279 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T19:43:23.293 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-09T19:43:23.293 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-09T19:43:23.299 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T19:43:23.302 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8
2026-03-09T19:43:23.304 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-09T19:43:23.306 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-09T19:43:23.329 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8
2026-03-09T19:43:23.329 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:23.329 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-09T19:43:23.329 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-09T19:43:23.329 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-09T19:43:23.329 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:23.329 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8
2026-03-09T19:43:23.330 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-09T19:43:23.332 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/8
2026-03-09T19:43:23.337 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8
2026-03-09T19:43:23.351 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8
2026-03-09T19:43:23.351 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:23.351 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-09T19:43:23.351 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-09T19:43:23.351 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-09T19:43:23.351 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.353 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8
2026-03-09T19:43:23.359 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8
2026-03-09T19:43:23.359 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:23.359 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-09T19:43:23.359 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-09T19:43:23.359 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-09T19:43:23.359 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:23.360 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8
2026-03-09T19:43:23.362 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8
2026-03-09T19:43:23.377 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T19:43:23.377 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-09T19:43:23.377 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.378 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T19:43:23.406 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T19:43:23.410 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8
2026-03-09T19:43:23.412 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-09T19:43:23.414 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-09T19:43:23.439 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8
2026-03-09T19:43:23.439 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:23.439 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-09T19:43:23.439 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-09T19:43:23.439 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-09T19:43:23.439 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.440 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8
2026-03-09T19:43:23.444 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8
2026-03-09T19:43:23.444 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/8
2026-03-09T19:43:23.444 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8
2026-03-09T19:43:23.444 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 3/8
2026-03-09T19:43:23.444 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 4/8
2026-03-09T19:43:23.444 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-09T19:43:23.444 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-09T19:43:23.444 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-09T19:43:23.449 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 7/8
2026-03-09T19:43:23.469 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8
2026-03-09T19:43:23.470 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:23.470 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-09T19:43:23.470 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-09T19:43:23.470 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-09T19:43:23.470 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.471 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout: ceph-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:23.488 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:23.566 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 8/8
2026-03-09T19:43:23.566 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-2:18.2.7-1055.gab47f43c.el9.x86_64 1/8
2026-03-09T19:43:23.567 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64 2/8
2026-03-09T19:43:23.567 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64 3/8
2026-03-09T19:43:23.567 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64 4/8
2026-03-09T19:43:23.567 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-09T19:43:23.567 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-09T19:43:23.567 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout: ceph-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mds-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout: ceph-osd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.619 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:23.722 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:23.727 INFO:teuthology.orchestra.run.vm07.stdout:============================================================================================
2026-03-09T19:43:23.727 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-09T19:43:23.727 INFO:teuthology.orchestra.run.vm07.stdout:============================================================================================
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 22 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages:
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 395 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 4.5 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 736 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 87 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 66 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 563 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 71 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 355 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 1.5 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 52 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 138 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 438 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.5 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 640 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-09T19:43:23.728 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:23.729 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-09T19:43:23.730 INFO:teuthology.orchestra.run.vm07.stdout:============================================================================================
2026-03-09T19:43:23.730 INFO:teuthology.orchestra.run.vm07.stdout:Remove 84 Packages
2026-03-09T19:43:23.730 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:23.730 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 515 M
2026-03-09T19:43:23.730 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-09T19:43:23.755 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-09T19:43:23.755 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-09T19:43:23.853 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout:============================================================================================
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout:============================================================================================
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 22 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages:
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 395 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 4.5 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 736 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 87 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 66 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 563 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 71 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 355 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 1.5 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 52 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 138 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 438 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.5 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 640 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-09T19:43:23.858 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.859 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-09T19:43:23.860 INFO:teuthology.orchestra.run.vm08.stdout:============================================================================================
2026-03-09T19:43:23.860 INFO:teuthology.orchestra.run.vm08.stdout:Remove 84 Packages
2026-03-09T19:43:23.860 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:23.860 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 515 M
2026-03-09T19:43:23.860 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-09T19:43:23.873 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-09T19:43:23.873 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-09T19:43:23.884 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-09T19:43:23.884 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-09T19:43:24.010 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-09T19:43:24.010 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-09T19:43:24.042 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-09T19:43:24.042 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 1/84
2026-03-09T19:43:24.050 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 1/84
2026-03-09T19:43:24.074 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84
2026-03-09T19:43:24.074 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:24.074 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-09T19:43:24.074 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-09T19:43:24.074 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-09T19:43:24.074 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:24.075 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84
2026-03-09T19:43:24.093 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84
2026-03-09T19:43:24.147 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9. 3/84
2026-03-09T19:43:24.158 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-09T19:43:24.158 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 1/84
2026-03-09T19:43:24.167 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 1/84
2026-03-09T19:43:24.171 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 4/84
2026-03-09T19:43:24.171 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 5/84
2026-03-09T19:43:24.186 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84
2026-03-09T19:43:24.186 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:24.186 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-09T19:43:24.186 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-09T19:43:24.186 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-09T19:43:24.186 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:24.186 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 5/84
2026-03-09T19:43:24.187 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84
2026-03-09T19:43:24.192 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-09T19:43:24.192 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 7/84
2026-03-09T19:43:24.206 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 7/84
2026-03-09T19:43:24.206 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84
2026-03-09T19:43:24.222 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-09T19:43:24.225 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-09T19:43:24.227 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-09T19:43:24.232 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-09T19:43:24.236 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-09T19:43:24.245 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-09T19:43:24.257 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-09T19:43:24.263 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-09T19:43:24.271 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9. 3/84
2026-03-09T19:43:24.273 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-09T19:43:24.279 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-09T19:43:24.299 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 4/84
2026-03-09T19:43:24.299 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 5/84
2026-03-09T19:43:24.311 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-09T19:43:24.312 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 5/84
2026-03-09T19:43:24.317 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-09T19:43:24.317 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 7/84
2026-03-09T19:43:24.317 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-09T19:43:24.320 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-09T19:43:24.329 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-09T19:43:24.330 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 7/84
2026-03-09T19:43:24.336 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-09T19:43:24.336 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 23/84
2026-03-09T19:43:24.340 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-09T19:43:24.344 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 23/84
2026-03-09T19:43:24.345 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-09T19:43:24.348 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-09T19:43:24.355 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-09T19:43:24.360 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-09T19:43:24.370 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-09T19:43:24.385 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-09T19:43:24.393 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-09T19:43:24.405 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-09T19:43:24.440 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-09T19:43:24.446 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-09T19:43:24.473 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-09T19:43:24.474 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-09T19:43:24.482 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-09T19:43:24.486 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-09T19:43:24.491 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-09T19:43:24.497 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 27/84
2026-03-09T19:43:24.498 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-09T19:43:24.500 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 28/84
2026-03-09T19:43:24.507 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-09T19:43:24.507 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 23/84
2026-03-09T19:43:24.517 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 23/84
2026-03-09T19:43:24.523 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84
2026-03-09T19:43:24.523 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:24.523 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-09T19:43:24.523 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-09T19:43:24.523 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-09T19:43:24.523 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:24.524 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84
2026-03-09T19:43:24.537 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84
2026-03-09T19:43:24.542 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 30/84
2026-03-09T19:43:24.545 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 31/84
2026-03-09T19:43:24.547 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 32/84
2026-03-09T19:43:24.551 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 33/84
2026-03-09T19:43:24.554 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 34/84
2026-03-09T19:43:24.559 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-09T19:43:24.563 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 36/84
2026-03-09T19:43:24.612 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 37/84
2026-03-09T19:43:24.617 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-09T19:43:24.629 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 38/84
2026-03-09T19:43:24.632 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 39/84
2026-03-09T19:43:24.635 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 40/84
2026-03-09T19:43:24.638 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 41/84
2026-03-09T19:43:24.640 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 42/84
2026-03-09T19:43:24.647 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-09T19:43:24.663 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84
2026-03-09T19:43:24.663 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:24.663 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-09T19:43:24.663 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:24.663 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84
2026-03-09T19:43:24.664 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-09T19:43:24.671 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 27/84
2026-03-09T19:43:24.673 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84
2026-03-09T19:43:24.674 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 28/84
2026-03-09T19:43:24.675 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 44/84
2026-03-09T19:43:24.677 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 45/84
2026-03-09T19:43:24.681 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-ply-3.11-14.el9.noarch 46/84
2026-03-09T19:43:24.684 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 47/84
2026-03-09T19:43:24.686 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 48/84
2026-03-09T19:43:24.689 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 49/84
2026-03-09T19:43:24.693 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 50/84
2026-03-09T19:43:24.695 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84
2026-03-09T19:43:24.695 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:24.695 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-09T19:43:24.695 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-09T19:43:24.695 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-09T19:43:24.695 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:24.696 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84
2026-03-09T19:43:24.696 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 51/84
2026-03-09T19:43:24.706 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 52/84
2026-03-09T19:43:24.710 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 29/84
2026-03-09T19:43:24.713 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 53/84
2026-03-09T19:43:24.715 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 30/84
2026-03-09T19:43:24.716 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 54/84
2026-03-09T19:43:24.718 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 31/84
2026-03-09T19:43:24.721 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 55/84
2026-03-09T19:43:24.721 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 32/84
2026-03-09T19:43:24.726 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 56/84
2026-03-09T19:43:24.726 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 33/84
2026-03-09T19:43:24.731 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 34/84
2026-03-09T19:43:24.733 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 57/84
2026-03-09T19:43:24.736 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-09T19:43:24.739 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 58/84
2026-03-09T19:43:24.742 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 36/84
2026-03-09T19:43:24.745 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 59/84
2026-03-09T19:43:24.749 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 60/84
2026-03-09T19:43:24.755 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 61/84
2026-03-09T19:43:24.759 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 62/84
2026-03-09T19:43:24.762 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 63/84
2026-03-09T19:43:24.767 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 64/84
2026-03-09T19:43:24.776 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 65/84
2026-03-09T19:43:24.780 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 66/84
2026-03-09T19:43:24.782 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el 67/84
2026-03-09T19:43:24.788 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9 68/84
2026-03-09T19:43:24.794 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 37/84
2026-03-09T19:43:24.795 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 69/84
2026-03-09T19:43:24.800 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 70/84
2026-03-09T19:43:24.804 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 71/84
2026-03-09T19:43:24.808 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 38/84
2026-03-09T19:43:24.811 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 39/84
2026-03-09T19:43:24.814 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 40/84
2026-03-09T19:43:24.817 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 41/84
2026-03-09T19:43:24.819 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 42/84
2026-03-09T19:43:24.826 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84
2026-03-09T19:43:24.826 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service".
2026-03-09T19:43:24.826 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:24.833 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84
2026-03-09T19:43:24.841 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84
2026-03-09T19:43:24.841 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T19:43:24.841 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-09T19:43:24.842 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:24.842 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84
2026-03-09T19:43:24.853 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 43/84
2026-03-09T19:43:24.854 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84
2026-03-09T19:43:24.854 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 73/84
2026-03-09T19:43:24.855 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 44/84
2026-03-09T19:43:24.859 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 45/84
2026-03-09T19:43:24.862 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-ply-3.11-14.el9.noarch 46/84
2026-03-09T19:43:24.865 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 47/84
2026-03-09T19:43:24.868 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 48/84
2026-03-09T19:43:24.871 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 49/84
2026-03-09T19:43:24.874 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 50/84
2026-03-09T19:43:24.878 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 51/84
2026-03-09T19:43:24.886 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 52/84
2026-03-09T19:43:24.892 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 53/84
2026-03-09T19:43:24.895 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 54/84
2026-03-09T19:43:24.897 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 55/84
2026-03-09T19:43:24.900 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 56/84
2026-03-09T19:43:24.906 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 57/84
2026-03-09T19:43:24.911 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 58/84
2026-03-09T19:43:24.917 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 59/84
2026-03-09T19:43:24.921 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 60/84
2026-03-09T19:43:24.928 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 61/84
2026-03-09T19:43:24.932 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 62/84
2026-03-09T19:43:24.935 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 63/84
2026-03-09T19:43:24.939 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 64/84
2026-03-09T19:43:24.947 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 65/84
2026-03-09T19:43:24.952 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 66/84
2026-03-09T19:43:24.954 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el 67/84
2026-03-09T19:43:24.961 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9 68/84
2026-03-09T19:43:24.968 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 69/84
2026-03-09T19:43:24.972 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 70/84
2026-03-09T19:43:24.974 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 71/84
2026-03-09T19:43:24.996 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84
2026-03-09T19:43:24.996 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service".
2026-03-09T19:43:24.996 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:25.003 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84
2026-03-09T19:43:25.026 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 72/84
2026-03-09T19:43:25.027 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 73/84
2026-03-09T19:43:30.697 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 73/84
2026-03-09T19:43:30.697 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /sys
2026-03-09T19:43:30.697 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /proc
2026-03-09T19:43:30.697 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /mnt
2026-03-09T19:43:30.697 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /var/tmp
2026-03-09T19:43:30.697 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /home
2026-03-09T19:43:30.697 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /root
2026-03-09T19:43:30.697 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /tmp
2026-03-09T19:43:30.697 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:30.708 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 74/84
2026-03-09T19:43:30.735 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 74/84
2026-03-09T19:43:30.739 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x8 75/84
2026-03-09T19:43:30.742 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 76/84
2026-03-09T19:43:30.746 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 77/84
2026-03-09T19:43:30.749 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 78/84
2026-03-09T19:43:30.751 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84
2026-03-09T19:43:30.752 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 80/84
2026-03-09T19:43:30.767 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 80/84
2026-03-09T19:43:30.770 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84
2026-03-09T19:43:30.773 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84
2026-03-09T19:43:30.777 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84
2026-03-09T19:43:30.777 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84
2026-03-09T19:43:30.929
INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 73/84 2026-03-09T19:43:30.929 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /sys 2026-03-09T19:43:30.929 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /proc 2026-03-09T19:43:30.929 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /mnt 2026-03-09T19:43:30.929 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /var/tmp 2026-03-09T19:43:30.929 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /home 2026-03-09T19:43:30.929 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /root 2026-03-09T19:43:30.929 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /tmp 2026-03-09T19:43:30.929 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:30.940 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 74/84 2026-03-09T19:43:30.946 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84 2026-03-09T19:43:30.946 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 1/84 2026-03-09T19:43:30.946 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84 2026-03-09T19:43:30.946 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el 3/84 2026-03-09T19:43:30.946 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 4/84 2026-03-09T19:43:30.946 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 5/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 6/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : 
ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 7/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 8/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9. 9/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 10/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9 11/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 12/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 17/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 18/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 19/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 20/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 21/84 2026-03-09T19:43:30.947 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 22/84 2026-03-09T19:43:30.947 
INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 23/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 24/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 25/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 26/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 27/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 28/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 29/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 30/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 31/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x8 32/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 33/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 34/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 35/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 36/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 37/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 38/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: 
Verifying : python3-devel-3.9.25-3.el9.x86_64 39/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 40/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 41/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 42/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 43/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 44/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 45/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 46/84 2026-03-09T19:43:30.948 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 47/84 2026-03-09T19:43:30.949 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 48/84 2026-03-09T19:43:30.949 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 49/84 2026-03-09T19:43:30.949 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 50/84 2026-03-09T19:43:30.949 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-09T19:43:30.949 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-09T19:43:30.949 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-09T19:43:30.949 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-09T19:43:30.949 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : 
python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-09T19:43:30.949 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-09T19:43:30.949 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 
2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-09T19:43:30.950 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-09T19:43:30.951 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-09T19:43:30.951 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-09T19:43:30.951 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-09T19:43:30.951 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-09T19:43:30.967 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 74/84 2026-03-09T19:43:30.970 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x8 75/84 2026-03-09T19:43:30.973 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 76/84 2026-03-09T19:43:30.975 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 77/84 
2026-03-09T19:43:30.978 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 78/84 2026-03-09T19:43:30.980 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-09T19:43:30.980 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 80/84 2026-03-09T19:43:30.994 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 80/84 2026-03-09T19:43:30.996 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-09T19:43:30.999 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-09T19:43:31.001 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-09T19:43:31.001 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout:Removed: 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.025 
INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-09T19:43:31.025 INFO:teuthology.orchestra.run.vm07.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-09T19:43:31.026 
INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-09T19:43:31.026 
INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-09T19:43:31.026 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply-3.11-14.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-09T19:43:31.027 
INFO:teuthology.orchestra.run.vm07.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-09T19:43:31.027 
INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:31.027 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 1/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el 3/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-immutable-object-cache-2:18.2.7-1055.gab47f43 4/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 5/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarc 6/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noa 7/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f 8/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9. 
9/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 10/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9 11/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 12/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 17/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 18/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 19/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 20/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 21/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_6 22/84 2026-03-09T19:43:31.106 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 23/84 2026-03-09T19:43:31.107 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 24/84 2026-03-09T19:43:31.107 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 25/84 2026-03-09T19:43:31.107 
INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 26/84 2026-03-09T19:43:31.107 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 27/84 2026-03-09T19:43:31.107 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 28/84 2026-03-09T19:43:31.107 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 29/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 30/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 31/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x8 32/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 33/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 34/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 35/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 36/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 37/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 38/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 39/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 40/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 41/84 2026-03-09T19:43:31.108 
INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 42/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 43/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 44/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 45/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 46/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 47/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 48/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 49/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 50/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-09T19:43:31.108 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-09T19:43:31.109 
INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: 
Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-09T19:43:31.109 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-09T19:43:31.110 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-09T19:43:31.110 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-09T19:43:31.110 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-09T19:43:31.110 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-09T19:43:31.110 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-09T19:43:31.110 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-09T19:43:31.110 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 84/84 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout:Removed: 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: 
ceph-mgr-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-09T19:43:31.184 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: 
libunwind-1.6.2-1.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: 
python3-jaraco-8.2.1-3.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-09T19:43:31.185 
INFO:teuthology.orchestra.run.vm08.stdout: python3-ply-3.11-14.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-09T19:43:31.185 
INFO:teuthology.orchestra.run.vm08.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:31.185 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:31.186 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout:Removing: 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout: cephadm noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 218 k 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout:Remove 1 Package 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 218 k 2026-03-09T19:43:31.246 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check 2026-03-09T19:43:31.248 
INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded. 2026-03-09T19:43:31.248 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test 2026-03-09T19:43:31.249 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded. 2026-03-09T19:43:31.249 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction 2026-03-09T19:43:31.265 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1 2026-03-09T19:43:31.265 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T19:43:31.372 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T19:43:31.407 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-09T19:43:31.407 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:31.407 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size 2026-03-09T19:43:31.407 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:31.407 INFO:teuthology.orchestra.run.vm08.stdout:Removing: 2026-03-09T19:43:31.407 INFO:teuthology.orchestra.run.vm08.stdout: cephadm noarch 2:18.2.7-1055.gab47f43c.el9 @ceph-noarch 218 k 2026-03-09T19:43:31.407 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:31.407 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary 2026-03-09T19:43:31.407 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:31.408 INFO:teuthology.orchestra.run.vm08.stdout:Remove 1 Package 2026-03-09T19:43:31.408 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:31.408 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 218 k 2026-03-09T19:43:31.408 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction 
check 2026-03-09T19:43:31.409 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded. 2026-03-09T19:43:31.409 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test 2026-03-09T19:43:31.411 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded. 2026-03-09T19:43:31.411 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction 2026-03-09T19:43:31.417 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T19:43:31.417 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:31.417 INFO:teuthology.orchestra.run.vm07.stdout:Removed: 2026-03-09T19:43:31.417 INFO:teuthology.orchestra.run.vm07.stdout: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.417 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:31.417 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-09T19:43:31.428 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1 2026-03-09T19:43:31.428 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T19:43:31.535 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T19:43:31.587 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 1/1 2026-03-09T19:43:31.587 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:31.587 INFO:teuthology.orchestra.run.vm08.stdout:Removed: 2026-03-09T19:43:31.587 INFO:teuthology.orchestra.run.vm08.stdout: cephadm-2:18.2.7-1055.gab47f43c.el9.noarch 2026-03-09T19:43:31.587 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:31.587 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-09T19:43:31.605 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-immutable-object-cache 2026-03-09T19:43:31.605 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal. 
2026-03-09T19:43:31.609 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 2026-03-09T19:43:31.609 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do. 2026-03-09T19:43:31.609 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-09T19:43:31.776 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-immutable-object-cache 2026-03-09T19:43:31.776 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-09T19:43:31.780 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-09T19:43:31.780 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-09T19:43:31.780 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-09T19:43:31.800 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr 2026-03-09T19:43:31.800 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal. 2026-03-09T19:43:31.803 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 2026-03-09T19:43:31.804 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do. 2026-03-09T19:43:31.804 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-09T19:43:31.961 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr 2026-03-09T19:43:31.961 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-09T19:43:31.964 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-09T19:43:31.964 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-09T19:43:31.964 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-09T19:43:31.974 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-dashboard 2026-03-09T19:43:31.974 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal. 2026-03-09T19:43:31.978 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 2026-03-09T19:43:31.978 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do. 
2026-03-09T19:43:31.978 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-09T19:43:32.134 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-dashboard 2026-03-09T19:43:32.134 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-09T19:43:32.137 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-09T19:43:32.138 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-09T19:43:32.138 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-09T19:43:32.148 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-diskprediction-local 2026-03-09T19:43:32.148 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal. 2026-03-09T19:43:32.150 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 2026-03-09T19:43:32.151 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do. 2026-03-09T19:43:32.151 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-09T19:43:32.301 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-diskprediction-local 2026-03-09T19:43:32.301 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-09T19:43:32.304 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-09T19:43:32.305 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-09T19:43:32.305 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-09T19:43:32.320 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-rook 2026-03-09T19:43:32.320 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal. 2026-03-09T19:43:32.322 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 2026-03-09T19:43:32.323 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do. 2026-03-09T19:43:32.323 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 
2026-03-09T19:43:32.471 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-rook 2026-03-09T19:43:32.471 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-09T19:43:32.474 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-09T19:43:32.475 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-09T19:43:32.475 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-09T19:43:32.491 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-cephadm 2026-03-09T19:43:32.491 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal. 2026-03-09T19:43:32.494 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 2026-03-09T19:43:32.495 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do. 2026-03-09T19:43:32.495 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-09T19:43:32.646 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-cephadm 2026-03-09T19:43:32.646 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-09T19:43:32.649 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-09T19:43:32.650 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-09T19:43:32.650 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-09T19:43:32.671 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 
2026-03-09T19:43:32.671 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:32.672 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size 2026-03-09T19:43:32.672 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:32.672 INFO:teuthology.orchestra.run.vm07.stdout:Removing: 2026-03-09T19:43:32.672 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 2.5 M 2026-03-09T19:43:32.672 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:32.672 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary 2026-03-09T19:43:32.672 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:32.672 INFO:teuthology.orchestra.run.vm07.stdout:Remove 1 Package 2026-03-09T19:43:32.672 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:32.672 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 2.5 M 2026-03-09T19:43:32.672 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check 2026-03-09T19:43:32.673 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded. 2026-03-09T19:43:32.673 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test 2026-03-09T19:43:32.683 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded. 
2026-03-09T19:43:32.683 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction 2026-03-09T19:43:32.707 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1 2026-03-09T19:43:32.720 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T19:43:32.789 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout:Removing: 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 2.5 M 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout:Remove 1 Package 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 2.5 M 2026-03-09T19:43:32.835 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check 2026-03-09T19:43:32.837 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T19:43:32.837 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:32.837 
INFO:teuthology.orchestra.run.vm07.stdout:Removed: 2026-03-09T19:43:32.837 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:32.837 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:32.837 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-09T19:43:32.837 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded. 2026-03-09T19:43:32.837 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test 2026-03-09T19:43:32.846 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded. 2026-03-09T19:43:32.846 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction 2026-03-09T19:43:32.871 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1 2026-03-09T19:43:32.885 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T19:43:32.950 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T19:43:32.991 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 1/1 2026-03-09T19:43:32.991 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:32.991 INFO:teuthology.orchestra.run.vm08.stdout:Removed: 2026-03-09T19:43:32.991 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:32.991 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:32.991 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 
2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repo Size 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout:Removing: 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout: librados-devel x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 456 k 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages: 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 139 k 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout:Remove 2 Packages 2026-03-09T19:43:33.027 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:33.028 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 595 k 2026-03-09T19:43:33.028 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check 2026-03-09T19:43:33.029 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded. 2026-03-09T19:43:33.029 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test 2026-03-09T19:43:33.040 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded. 
2026-03-09T19:43:33.040 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction 2026-03-09T19:43:33.065 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1 2026-03-09T19:43:33.067 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T19:43:33.079 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2 2026-03-09T19:43:33.143 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2 2026-03-09T19:43:33.143 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repo Size 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout:Removing: 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout: librados-devel x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 456 k 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages: 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs-devel x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 139 k 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout:Remove 2 
Packages 2026-03-09T19:43:33.182 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-09T19:43:33.183 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 595 k 2026-03-09T19:43:33.183 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check 2026-03-09T19:43:33.184 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded. 2026-03-09T19:43:33.184 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test 2026-03-09T19:43:33.193 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2 2026-03-09T19:43:33.193 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:33.193 INFO:teuthology.orchestra.run.vm07.stdout:Removed: 2026-03-09T19:43:33.193 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:33.193 INFO:teuthology.orchestra.run.vm07.stdout: librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2026-03-09T19:43:33.193 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-09T19:43:33.193 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-09T19:43:33.194 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded. 
2026-03-09T19:43:33.195 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-09T19:43:33.221 INFO:teuthology.orchestra.run.vm08.stdout:  Preparing        : 1/1
2026-03-09T19:43:33.223 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2
2026-03-09T19:43:33.237 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2
2026-03-09T19:43:33.301 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2
2026-03-09T19:43:33.301 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 1/2
2026-03-09T19:43:33.345 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64 2/2
2026-03-09T19:43:33.345 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:33.345 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-09T19:43:33.345 INFO:teuthology.orchestra.run.vm08.stdout:  libcephfs-devel-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:33.345 INFO:teuthology.orchestra.run.vm08.stdout:  librados-devel-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:33.345 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:33.345 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repo Size
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 2.0 M
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages:
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout: python3-cephfs x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 505 k
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-argparse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 186 k
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout:Remove 3 Packages
2026-03-09T19:43:33.391 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:33.392 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 2.6 M
2026-03-09T19:43:33.392 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-09T19:43:33.393 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-09T19:43:33.393 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-09T19:43:33.405 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-09T19:43:33.405 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-09T19:43:33.431 INFO:teuthology.orchestra.run.vm07.stdout:  Preparing        : 1/1
2026-03-09T19:43:33.434 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 1/3
2026-03-09T19:43:33.435 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x8 2/3
2026-03-09T19:43:33.435 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3
2026-03-09T19:43:33.496 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3
2026-03-09T19:43:33.496 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 1/3
2026-03-09T19:43:33.496 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x8 2/3
2026-03-09T19:43:33.533 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repo Size
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 2.0 M
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages:
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout: python3-cephfs x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 505 k
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-argparse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 186 k
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:Remove 3 Packages
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 2.6 M
2026-03-09T19:43:33.534 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-09T19:43:33.536 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3
2026-03-09T19:43:33.536 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:33.536 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-09T19:43:33.536 INFO:teuthology.orchestra.run.vm07.stdout:  libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:33.536 INFO:teuthology.orchestra.run.vm07.stdout:  python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:33.536 INFO:teuthology.orchestra.run.vm07.stdout:  python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:33.536 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:33.536 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:33.536 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-09T19:43:33.536 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-09T19:43:33.548 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-09T19:43:33.548 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-09T19:43:33.575 INFO:teuthology.orchestra.run.vm08.stdout:  Preparing        : 1/1
2026-03-09T19:43:33.577 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 1/3
2026-03-09T19:43:33.578 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x8 2/3
2026-03-09T19:43:33.578 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3
2026-03-09T19:43:33.639 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3
2026-03-09T19:43:33.639 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64 1/3
2026-03-09T19:43:33.639 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x8 2/3
2026-03-09T19:43:33.680 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64 3/3
2026-03-09T19:43:33.680 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:33.680 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-09T19:43:33.680 INFO:teuthology.orchestra.run.vm08.stdout:  libcephfs2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:33.680 INFO:teuthology.orchestra.run.vm08.stdout:  python3-ceph-argparse-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:33.680 INFO:teuthology.orchestra.run.vm08.stdout:  python3-cephfs-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:33.680 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:33.680 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:33.709 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: libcephfs-devel
2026-03-09T19:43:33.709 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-09T19:43:33.712 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:33.712 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-09T19:43:33.712 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:33.871 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: libcephfs-devel
2026-03-09T19:43:33.872 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-09T19:43:33.875 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:33.875 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-09T19:43:33.875 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:33.900 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: librados2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages:
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: python3-rados x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.1 M
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: python3-rbd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.1 M
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: python3-rgw x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 265 k
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: rbd-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 231 k
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: rbd-nbd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 490 k
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: librbd1 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: librgw2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 16 M
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:Remove 19 Packages
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 73 M
2026-03-09T19:43:33.902 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-09T19:43:33.906 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-09T19:43:33.906 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-09T19:43:33.927 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-09T19:43:33.927 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-09T19:43:33.966 INFO:teuthology.orchestra.run.vm07.stdout:  Preparing        : 1/1
2026-03-09T19:43:33.968 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 1/19
2026-03-09T19:43:33.970 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2/19
2026-03-09T19:43:33.973 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 3/19
2026-03-09T19:43:33.973 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/19
2026-03-09T19:43:33.987 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/19
2026-03-09T19:43:33.990 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : parquet-libs-9.0.0-15.el9.x86_64 5/19
2026-03-09T19:43:33.992 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 6/19
2026-03-09T19:43:33.994 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 7/19
2026-03-09T19:43:33.995 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/19
2026-03-09T19:43:33.997 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : libarrow-doc-9.0.0-15.el9.noarch 9/19
2026-03-09T19:43:33.997 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 10/19
2026-03-09T19:43:34.011 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 10/19
2026-03-09T19:43:34.012 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 11/19
2026-03-09T19:43:34.012 INFO:teuthology.orchestra.run.vm07.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-09T19:43:34.012 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:34.026 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 11/19
2026-03-09T19:43:34.029 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : libarrow-9.0.0-15.el9.x86_64 12/19
2026-03-09T19:43:34.033 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : re2-1:20211101-20.el9.x86_64 13/19
2026-03-09T19:43:34.037 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : lttng-ust-2.12.0-6.el9.x86_64 14/19
2026-03-09T19:43:34.040 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : thrift-0.15.0-4.el9.x86_64 15/19
2026-03-09T19:43:34.043 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : libpmemobj-1.12.1-1.el9.x86_64 16/19
2026-03-09T19:43:34.045 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : boost-program-options-1.75.0-13.el9.x86_64 17/19
2026-03-09T19:43:34.048 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : librabbitmq-0.11.0-7.el9.x86_64 18/19
2026-03-09T19:43:34.062 INFO:teuthology.orchestra.run.vm07.stdout:  Erasing          : librdkafka-1.6.1-102.el9.x86_64 19/19
2026-03-09T19:43:34.064 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: librados2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages:
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: python3-rados x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.1 M
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: python3-rbd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 1.1 M
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: python3-rgw x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 265 k
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: rbd-fuse x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 231 k
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: rbd-nbd x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 490 k
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: librbd1 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 12 M
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: librgw2 x86_64 2:18.2.7-1055.gab47f43c.el9 @ceph 16 M
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout:Remove 19 Packages
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:34.066 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 73 M
2026-03-09T19:43:34.067 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-09T19:43:34.070 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-09T19:43:34.070 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-09T19:43:34.095 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-09T19:43:34.095 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 19/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : boost-program-options-1.75.0-13.el9.x86_64 1/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : libarrow-9.0.0-15.el9.x86_64 2/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : libarrow-doc-9.0.0-15.el9.noarch 3/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : libpmemobj-1.12.1-1.el9.x86_64 4/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : librabbitmq-0.11.0-7.el9.x86_64 5/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 6/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 7/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : librdkafka-1.6.1-102.el9.x86_64 8/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 9/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : lttng-ust-2.12.0-6.el9.x86_64 10/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : parquet-libs-9.0.0-15.el9.x86_64 11/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 12/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 13/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 14/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 15/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 16/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 17/19
2026-03-09T19:43:34.119 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : re2-1:20211101-20.el9.x86_64 18/19
2026-03-09T19:43:34.134 INFO:teuthology.orchestra.run.vm08.stdout:  Preparing        : 1/1
2026-03-09T19:43:34.137 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 1/19
2026-03-09T19:43:34.139 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 2/19
2026-03-09T19:43:34.141 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 3/19
2026-03-09T19:43:34.141 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/19
2026-03-09T19:43:34.153 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 4/19
2026-03-09T19:43:34.155 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : parquet-libs-9.0.0-15.el9.x86_64 5/19
2026-03-09T19:43:34.158 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 6/19
2026-03-09T19:43:34.160 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 7/19
2026-03-09T19:43:34.162 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/19
2026-03-09T19:43:34.164 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : libarrow-doc-9.0.0-15.el9.noarch 9/19
2026-03-09T19:43:34.164 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 10/19
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  Verifying        : thrift-0.15.0-4.el9.x86_64 19/19
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  boost-program-options-1.75.0-13.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  libarrow-9.0.0-15.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  libarrow-doc-9.0.0-15.el9.noarch
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  libpmemobj-1.12.1-1.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  librabbitmq-0.11.0-7.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  librados2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  librdkafka-1.6.1-102.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  lttng-ust-2.12.0-6.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  parquet-libs-9.0.0-15.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  re2-1:20211101-20.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:  thrift-0.15.0-4.el9.x86_64
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-09T19:43:34.165 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:34.178 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 10/19
2026-03-09T19:43:34.178 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 11/19
2026-03-09T19:43:34.178 INFO:teuthology.orchestra.run.vm08.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-09T19:43:34.178 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:34.191 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 11/19
2026-03-09T19:43:34.193 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : libarrow-9.0.0-15.el9.x86_64 12/19
2026-03-09T19:43:34.196 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : re2-1:20211101-20.el9.x86_64 13/19
2026-03-09T19:43:34.199 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : lttng-ust-2.12.0-6.el9.x86_64 14/19
2026-03-09T19:43:34.202 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : thrift-0.15.0-4.el9.x86_64 15/19
2026-03-09T19:43:34.204 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : libpmemobj-1.12.1-1.el9.x86_64 16/19
2026-03-09T19:43:34.206 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : boost-program-options-1.75.0-13.el9.x86_64 17/19
2026-03-09T19:43:34.209 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : librabbitmq-0.11.0-7.el9.x86_64 18/19
2026-03-09T19:43:34.223 INFO:teuthology.orchestra.run.vm08.stdout:  Erasing          : librdkafka-1.6.1-102.el9.x86_64 19/19
2026-03-09T19:43:34.284 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 19/19
2026-03-09T19:43:34.284 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : boost-program-options-1.75.0-13.el9.x86_64 1/19
2026-03-09T19:43:34.284 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : libarrow-9.0.0-15.el9.x86_64 2/19
2026-03-09T19:43:34.284 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : libarrow-doc-9.0.0-15.el9.noarch 3/19
2026-03-09T19:43:34.284 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : libpmemobj-1.12.1-1.el9.x86_64 4/19
2026-03-09T19:43:34.284 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : librabbitmq-0.11.0-7.el9.x86_64 5/19
2026-03-09T19:43:34.284 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : librados2-2:18.2.7-1055.gab47f43c.el9.x86_64 6/19
2026-03-09T19:43:34.284 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64 7/19
2026-03-09T19:43:34.284 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : librdkafka-1.6.1-102.el9.x86_64 8/19
2026-03-09T19:43:34.284 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64 9/19
2026-03-09T19:43:34.284 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : lttng-ust-2.12.0-6.el9.x86_64 10/19
2026-03-09T19:43:34.285 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : parquet-libs-9.0.0-15.el9.x86_64 11/19
2026-03-09T19:43:34.285 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64 12/19
2026-03-09T19:43:34.285 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64 13/19
2026-03-09T19:43:34.285 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64 14/19
2026-03-09T19:43:34.285 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 15/19
2026-03-09T19:43:34.285 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64 16/19
2026-03-09T19:43:34.285 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64 17/19
2026-03-09T19:43:34.285 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : re2-1:20211101-20.el9.x86_64 18/19
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  Verifying        : thrift-0.15.0-4.el9.x86_64 19/19
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  boost-program-options-1.75.0-13.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  libarrow-9.0.0-15.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  libarrow-doc-9.0.0-15.el9.noarch
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  libpmemobj-1.12.1-1.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  librabbitmq-0.11.0-7.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  librados2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  librbd1-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  librdkafka-1.6.1-102.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  librgw2-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  lttng-ust-2.12.0-6.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  parquet-libs-9.0.0-15.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  python3-rados-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  python3-rbd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  python3-rgw-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  rbd-fuse-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  rbd-nbd-2:18.2.7-1055.gab47f43c.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  re2-1:20211101-20.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:  thrift-0.15.0-4.el9.x86_64
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-09T19:43:34.336 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:34.361 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: librbd1
2026-03-09T19:43:34.361 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-09T19:43:34.364 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:34.365 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-09T19:43:34.365 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:34.555 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-rados
2026-03-09T19:43:34.555 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-09T19:43:34.559 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:34.559 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-09T19:43:34.559 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:34.583 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: librbd1
2026-03-09T19:43:34.583 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-09T19:43:34.586 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:34.587 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-09T19:43:34.587 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:34.732 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-rgw
2026-03-09T19:43:34.732 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-09T19:43:34.735 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:34.736 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-09T19:43:34.736 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:34.779 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-rados
2026-03-09T19:43:34.779 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-09T19:43:34.782 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:34.782 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-09T19:43:34.782 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:34.900 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-cephfs
2026-03-09T19:43:34.900 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-09T19:43:34.903 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:34.903 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-09T19:43:34.903 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:34.950 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-rgw
2026-03-09T19:43:34.950 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-09T19:43:34.953 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:34.953 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-09T19:43:34.953 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:35.069 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-rbd
2026-03-09T19:43:35.070 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-09T19:43:35.072 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:35.073 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-09T19:43:35.073 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:35.119 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-cephfs
2026-03-09T19:43:35.119 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-09T19:43:35.122 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:35.123 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-09T19:43:35.123 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:35.234 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: rbd-fuse
2026-03-09T19:43:35.234 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-09T19:43:35.237 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:35.237 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-09T19:43:35.237 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:35.288 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-rbd
2026-03-09T19:43:35.288 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-09T19:43:35.291 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:35.291 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-09T19:43:35.291 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:35.397 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: rbd-mirror
2026-03-09T19:43:35.397 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-09T19:43:35.399 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:35.400 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-09T19:43:35.400 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:35.461 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: rbd-fuse
2026-03-09T19:43:35.461 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-09T19:43:35.464 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:35.464 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-09T19:43:35.464 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:35.558 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: rbd-nbd
2026-03-09T19:43:35.558 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-09T19:43:35.561 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-09T19:43:35.561 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-09T19:43:35.561 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-09T19:43:35.582 DEBUG:teuthology.orchestra.run.vm07:> sudo yum clean all
2026-03-09T19:43:35.624 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: rbd-mirror
2026-03-09T19:43:35.624 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-09T19:43:35.627 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:35.628 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-09T19:43:35.628 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:35.705 INFO:teuthology.orchestra.run.vm07.stdout:56 files removed
2026-03-09T19:43:35.731 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-09T19:43:35.758 DEBUG:teuthology.orchestra.run.vm07:> sudo yum clean expire-cache
2026-03-09T19:43:35.800 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: rbd-nbd
2026-03-09T19:43:35.800 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-09T19:43:35.803 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-09T19:43:35.803 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-09T19:43:35.803 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-09T19:43:35.831 DEBUG:teuthology.orchestra.run.vm08:> sudo yum clean all
2026-03-09T19:43:35.916 INFO:teuthology.orchestra.run.vm07.stdout:Cache was expired
2026-03-09T19:43:35.916 INFO:teuthology.orchestra.run.vm07.stdout:0 files removed
2026-03-09T19:43:35.937 DEBUG:teuthology.parallel:result is None
2026-03-09T19:43:35.955 INFO:teuthology.orchestra.run.vm08.stdout:56 files removed
2026-03-09T19:43:35.974 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-09T19:43:36.000 DEBUG:teuthology.orchestra.run.vm08:> sudo yum clean expire-cache
2026-03-09T19:43:36.149 INFO:teuthology.orchestra.run.vm08.stdout:Cache was expired
2026-03-09T19:43:36.150 INFO:teuthology.orchestra.run.vm08.stdout:0 files removed
2026-03-09T19:43:36.169 DEBUG:teuthology.parallel:result is None
2026-03-09T19:43:36.169 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm07.local
2026-03-09T19:43:36.169 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm08.local
2026-03-09T19:43:36.169 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-09T19:43:36.169 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-09T19:43:36.193 DEBUG:teuthology.orchestra.run.vm08:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-09T19:43:36.194 DEBUG:teuthology.orchestra.run.vm07:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-09T19:43:36.259 DEBUG:teuthology.parallel:result is None
2026-03-09T19:43:36.262 DEBUG:teuthology.parallel:result is None
2026-03-09T19:43:36.262 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-09T19:43:36.265 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-09T19:43:36.265 DEBUG:teuthology.orchestra.run.vm07:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T19:43:36.304 DEBUG:teuthology.orchestra.run.vm08:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T19:43:36.316 INFO:teuthology.orchestra.run.vm07.stderr:bash: line 1: ntpq: command not found
2026-03-09T19:43:36.319 INFO:teuthology.orchestra.run.vm08.stderr:bash: line 1: ntpq: command not found
2026-03-09T19:43:36.322 INFO:teuthology.orchestra.run.vm07.stdout:MS Name/IP address         Stratum Poll Reach LastRx Last sample
2026-03-09T19:43:36.322 INFO:teuthology.orchestra.run.vm07.stdout:===============================================================================
2026-03-09T19:43:36.322 INFO:teuthology.orchestra.run.vm07.stdout:^- funky.f5s.de                 2   7   377    67  +1542us[+1544us] +/-   33ms
2026-03-09T19:43:36.322 INFO:teuthology.orchestra.run.vm07.stdout:^- www.h4x-gamers.top           2   7   377    65  +1868us[+1870us] +/-   42ms
2026-03-09T19:43:36.322 INFO:teuthology.orchestra.run.vm07.stdout:^+ 193.158.22.13                1   7   377    67  -2212us[-2209us] +/-   17ms
2026-03-09T19:43:36.322 INFO:teuthology.orchestra.run.vm07.stdout:^* ntp2.noris.net               2   7   377    63  +1772us[+1774us] +/-   16ms
2026-03-09T19:43:36.326 INFO:teuthology.orchestra.run.vm08.stdout:MS Name/IP address         Stratum Poll Reach LastRx Last sample
2026-03-09T19:43:36.327 INFO:teuthology.orchestra.run.vm08.stdout:===============================================================================
2026-03-09T19:43:36.327 INFO:teuthology.orchestra.run.vm08.stdout:^* ntp2.noris.net               2   7   377    69  +1759us[+1742us] +/-   16ms
2026-03-09T19:43:36.327 INFO:teuthology.orchestra.run.vm08.stdout:^- www.h4x-gamers.top           2   7   377   128  +1962us[+1945us] +/-   41ms
2026-03-09T19:43:36.327 INFO:teuthology.orchestra.run.vm08.stdout:^- funky.f5s.de                 2   7   377    65  +1618us[+1618us] +/-   33ms
2026-03-09T19:43:36.327 INFO:teuthology.orchestra.run.vm08.stdout:^+ 193.158.22.13                1   7   377    62  -2175us[-2175us] +/-   17ms
2026-03-09T19:43:36.328 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-09T19:43:36.331 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-09T19:43:36.331 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-09T19:43:36.334 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-09T19:43:36.337 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-09T19:43:36.339 INFO:teuthology.task.internal:Duration was 1550.459407 seconds
2026-03-09T19:43:36.339 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-09T19:43:36.342 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-09T19:43:36.342 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-09T19:43:36.365 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-09T19:43:36.404 INFO:teuthology.orchestra.run.vm07.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-09T19:43:36.411 INFO:teuthology.orchestra.run.vm08.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-09T19:43:36.626 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-09T19:43:36.626 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm07.local
2026-03-09T19:43:36.626 DEBUG:teuthology.orchestra.run.vm07:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-09T19:43:36.652 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm08.local
2026-03-09T19:43:36.652 DEBUG:teuthology.orchestra.run.vm08:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-09T19:43:36.690 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-09T19:43:36.691 DEBUG:teuthology.orchestra.run.vm07:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T19:43:36.694 DEBUG:teuthology.orchestra.run.vm08:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T19:43:37.378 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-09T19:43:37.378 DEBUG:teuthology.orchestra.run.vm07:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-09T19:43:37.380 DEBUG:teuthology.orchestra.run.vm08:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-09T19:43:37.403 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T19:43:37.403 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T19:43:37.403 INFO:teuthology.orchestra.run.vm08.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: gzip -5 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-09T19:43:37.403 INFO:teuthology.orchestra.run.vm08.stderr: --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T19:43:37.404 INFO:teuthology.orchestra.run.vm08.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: /home/ubuntu/cephtest/archive/syslog/journalctl.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-09T19:43:37.405 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T19:43:37.406 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T19:43:37.406 INFO:teuthology.orchestra.run.vm07.stderr:gzip/home/ubuntu/cephtest/archive/syslog/kern.log: -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T19:43:37.406 INFO:teuthology.orchestra.run.vm07.stderr: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-09T19:43:37.407 INFO:teuthology.orchestra.run.vm07.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-09T19:43:37.544 INFO:teuthology.orchestra.run.vm08.stderr: 97.8% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-09T19:43:37.569 INFO:teuthology.orchestra.run.vm07.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 96.7% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-09T19:43:37.571 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-09T19:43:37.574 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-09T19:43:37.575 DEBUG:teuthology.orchestra.run.vm07:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-09T19:43:37.635 DEBUG:teuthology.orchestra.run.vm08:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-09T19:43:37.661 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-09T19:43:37.664 DEBUG:teuthology.orchestra.run.vm07:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-09T19:43:37.678 DEBUG:teuthology.orchestra.run.vm08:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-09T19:43:37.700 INFO:teuthology.orchestra.run.vm07.stdout:kernel.core_pattern = core
2026-03-09T19:43:37.729 INFO:teuthology.orchestra.run.vm08.stdout:kernel.core_pattern = core
2026-03-09T19:43:37.744 DEBUG:teuthology.orchestra.run.vm07:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-09T19:43:37.768 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T19:43:37.768 DEBUG:teuthology.orchestra.run.vm08:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-09T19:43:37.802 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T19:43:37.802 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-09T19:43:37.806 INFO:teuthology.task.internal:Transferring archived files...
2026-03-09T19:43:37.806 DEBUG:teuthology.misc:Transferring archived files from vm07:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/617/remote/vm07
2026-03-09T19:43:37.806 DEBUG:teuthology.orchestra.run.vm07:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-09T19:43:37.839 DEBUG:teuthology.misc:Transferring archived files from vm08:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/617/remote/vm08
2026-03-09T19:43:37.839 DEBUG:teuthology.orchestra.run.vm08:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-09T19:43:37.874 INFO:teuthology.task.internal:Removing archive directory...
2026-03-09T19:43:37.874 DEBUG:teuthology.orchestra.run.vm07:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-09T19:43:37.879 DEBUG:teuthology.orchestra.run.vm08:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-09T19:43:37.931 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-09T19:43:37.934 INFO:teuthology.task.internal:Not uploading archives.
2026-03-09T19:43:37.934 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-09T19:43:37.937 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-09T19:43:37.937 DEBUG:teuthology.orchestra.run.vm07:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-09T19:43:37.939 DEBUG:teuthology.orchestra.run.vm08:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-09T19:43:37.953 INFO:teuthology.orchestra.run.vm07.stdout:  8532144      0 drwxr-xr-x   3 ubuntu   ubuntu         19 Mar  9 19:43 /home/ubuntu/cephtest
2026-03-09T19:43:37.954 INFO:teuthology.orchestra.run.vm07.stdout: 33629068      0 d---------   2 ubuntu   ubuntu          6 Mar  9 19:24 /home/ubuntu/cephtest/mnt.0
2026-03-09T19:43:37.954 INFO:teuthology.orchestra.run.vm07.stderr:find: ‘/home/ubuntu/cephtest/mnt.0’: Permission denied
2026-03-09T19:43:37.954 INFO:teuthology.orchestra.run.vm07.stderr:rmdir: failed to remove '/home/ubuntu/cephtest': Directory not empty
2026-03-09T19:43:37.973 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T19:43:37.973 ERROR:teuthology.run_tasks:Manager failed: internal.base
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 48, in base
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 53, in base
    run.wait(
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 485, in wait
    proc.wait()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status
    raise CommandFailedError(
teuthology.exceptions.CommandFailedError: Command failed on vm07 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
2026-03-09T19:43:37.974 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-09T19:43:37.977 DEBUG:teuthology.run_tasks:Exception was not quenched, exiting: MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T19:43:37.978 INFO:teuthology.run:Summary data:
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/no kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{reef} 1-volume/{0-create 1-ranks/2 2-allow_standby_replay/no 3-inline/no 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
duration: 1550.4594066143036
failure_reason: reached maximum tries (50) after waiting for 300 seconds
flavor: default
owner: kyr
status: fail
success: false

2026-03-09T19:43:37.978 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-09T19:43:38.003 INFO:teuthology.run:FAIL